othmansamih changed the title from "Issue: Traces Not Showing for LLM Calls When Using CrewAI Agents" to "Issue: Traces not showing for LLM calls when using CrewAI" on Dec 22, 2024.
Issue you'd like to raise.
Description:
I am working on a project with CrewAI (built on top of LangChain) to implement agents. However, I am encountering an issue where the traces of LLM calls are not displayed in LangSmith.
When I test a simple script using ChatOpenAI directly, as per LangSmith's documentation example, it works fine (see the sketch below). However, when using CrewAI, which is built on top of the LangChain framework, no traces appear in LangSmith.
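The original snippet was not preserved in this issue; the following is a minimal sketch of the kind of standalone ChatOpenAI script the LangSmith quickstart shows, with an illustrative model name and prompt. With the LangSmith environment variables set, a call like this is traced automatically.

```python
from langchain_openai import ChatOpenAI

# With LANGCHAIN_TRACING_V2=true and LANGCHAIN_API_KEY set in the environment,
# calls made through a LangChain chat model are traced to LangSmith automatically.
llm = ChatOpenAI(model="gpt-4o-mini")  # model name is illustrative

response = llm.invoke("Hello, world!")
print(response.content)
```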
Here are the configurations in my .env file (see the sketch below):
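The exact values from the original report were not preserved; the sketch below lists the standard LangSmith tracing variables from the setup documentation, with placeholders in place of real keys and project names.

```
# Enable LangSmith tracing for LangChain-based code
LANGCHAIN_TRACING_V2=true
LANGCHAIN_ENDPOINT=https://api.smith.langchain.com
LANGCHAIN_API_KEY=<your-langsmith-api-key>
LANGCHAIN_PROJECT=<your-langsmith-project-name>

# OpenAI key used by ChatOpenAI and by the CrewAI agents
OPENAI_API_KEY=<your-openai-api-key>
```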
Environment Details:
Suggestion:
No response