I've integrated the OpenTelemetry LangSmith exporter following the Next.js instructions. I used the registerOTel pathway, but found it very unstable: for one specific span it reports 4.2M tokens, which shouldn't be possible (it should be ~800 tokens).
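For context, here is a minimal sketch of the registerOTel pathway as I understand it from the LangSmith docs, assuming a Next.js app with `@vercel/otel` and `langsmith` installed and `LANGSMITH_API_KEY`/`LANGSMITH_TRACING` set in the environment (the service name is a placeholder):

```typescript
// instrumentation.ts — Next.js loads this once at startup.
import { registerOTel } from "@vercel/otel";
import { AISDKExporter } from "langsmith/vercel";

export function register() {
  registerOTel({
    serviceName: "my-next-app", // hypothetical name, replace with yours
    traceExporter: new AISDKExporter(), // ships AI SDK spans to LangSmith
  });
}
```

Each AI SDK call then opts into telemetry, e.g. by passing `experimental_telemetry: AISDKExporter.getSettings()` to `generateText`.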
I've tried a few different setups to get the Vercel AI SDK to export to LangSmith via OTel (Node.js SDK, client wrapper), but none of them worked well for me apart from the deprecated client wrapper.
Is LangSmith OTel support considered stable now, or is it still a beta feature?
A possibly related thing I've noticed in the OTel logs is that spans are attempted to be ended more than once:
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: fd24d00f5e622849}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: fd24d00f5e622849}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 7d716f1187f8c2ca}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 7d716f1187f8c2ca}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 7d716f1187f8c2ca}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 7d716f1187f8c2ca}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 7d716f1187f8c2ca}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 7d716f1187f8c2ca}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 7d716f1187f8c2ca}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 7d716f1187f8c2ca}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 7d716f1187f8c2ca}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 7d716f1187f8c2ca}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 7d716f1187f8c2ca}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 7d716f1187f8c2ca}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 7d716f1187f8c2ca}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 7d716f1187f8c2ca}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 7d716f1187f8c2ca}
ai.generateText.doGenerate cb07b36b15a40966f3ddc45af4f9c825-7d716f1187f8c2ca - You can only call end() on a span once.
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 8fc168090ba0fba9}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 8fc168090ba0fba9}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 8fc168090ba0fba9}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 8fc168090ba0fba9}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 8fc168090ba0fba9}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 8fc168090ba0fba9}
Can not execute the operation on ended Span {traceId: cb07b36b15a40966f3ddc45af4f9c825, spanId: 8fc168090ba0fba9}
ai.generateText cb07b36b15a40966f3ddc45af4f9c825-8fc168090ba0fba9 - You can only call end() on a span once.
If it's helpful, I get this in the logs when doing it the Node.js way:
{"stack":"OTLPExporterError: Not Found\n at IncomingMessage.eval (webpack-internal:///(instrument)/./node_modules/@opentelemetry/otlp-exporter-base/build/src/platform/node/http-transport-utils.js:62:31)\n at eval (webpack-internal:///(instrument)/./node_modules/@opentelemetry/context-async-hooks/build/src/AbstractAsyncHooksContextManager.js:50:55)\n at AsyncLocalStorage.run (node:async_hooks:335:14)\n at AsyncLocalStorageContextManager.with (webpack-internal:///(instrument)/./node_modules/@opentelemetry/context-async-hooks/build/src/AsyncLocalStorageContextManager.js:33:40)\n at IncomingMessage.contextWrapper (webpack-internal:///(instrument)/./node_modules/@opentelemetry/context-async-hooks/build/src/AbstractAsyncHooksContextManager.js:50:32)\n at IncomingMessage.emit (node:events:531:35)\n at endReadableNT (node:internal/streams/readable:1696:12)\n at process.processTicksAndRejections (node:internal/process/task_queues:82:21)","message":"Not Found","name":"OTLPExporterError","code":"404"}
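A 404 from `OTLPExporterError` usually means the OTLP endpoint path is wrong rather than an auth failure. Assuming a generic OTLP exporter pointed at LangSmith, the endpoint and headers would look roughly like this (the exact path and header names are taken from my reading of the LangSmith OTel docs and should be verified there):

```shell
# Sketch of OTLP env vars for LangSmith ingestion; a 404 often indicates
# a missing or wrong path suffix on the endpoint.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://api.smith.langchain.com/otel"
export OTEL_EXPORTER_OTLP_HEADERS="x-api-key=<your-langsmith-api-key>,Langsmith-Project=<project-name>"
```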
Hello @Averylamp! Coming from #1154, I assume this might be an issue with the Sentry integration. We've updated the documentation with a sample showing how to attach our AISDKExporter to an existing Sentry OTel setup.
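If others land here with the same symptom, the attachment sketched below is roughly what the documented approach looks like, assuming a Sentry Node v8 setup whose client exposes its OTel `traceProvider` (names and the exact wiring should be checked against the current LangSmith and Sentry docs, this is not authoritative):

```typescript
// Sketch: register the LangSmith exporter on Sentry's own tracer provider
// instead of creating a second OTel setup that double-ends spans.
import * as Sentry from "@sentry/node";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { AISDKExporter } from "langsmith/vercel";

const client = Sentry.init({
  dsn: process.env.SENTRY_DSN,
});

// Attach LangSmith as an additional span processor on Sentry's provider,
// so there is exactly one provider ending each span.
client?.traceProvider?.addSpanProcessor(
  new BatchSpanProcessor(new AISDKExporter())
);
```

The key point is that only one tracer provider should own the spans; the "Can not execute the operation on ended Span" errors are consistent with two setups both trying to end them.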