-
Notifications
You must be signed in to change notification settings - Fork 13
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Support other Copilot LLMs than OpenAI #401
Comments
Anthropic mostly works with copilot, except some issues around retrieving malicious packages in natural language chat. |
I'll do this together with #415 because the free copilot also supports anthropic. |
With Copilot+Anthropic the user message can contain instructions that are passed from the chat client and really sound like a system message. For whatever reason, the chat client includes them with `role=user`. These instructions confuse the LLM when extracting the package names or ecosystem and it replies with a mix of JSON and natural language. To work around that, Pankaj suggested to tune the system prompt and tell the LLM to ignore the instructions. This seems to work and helps Anthropic extract the package names even with copilot user message. Co-authored-by: Pankaj Telang <[email protected]> Related: #401
PR #428 helps to extract package names and ecosystems with Copilot and Anthropic. There will be a follow up to match them in weaviate - we are discussing how to approach that with Pankaj. Once those are fixed, Anthropic will be supported with Copilot! |
I co-assigned @ptelang because during testing we saw issues with extracting package names from natural language prompts also with GPT and this needs an LLM expert :-) |
Our Copilot provider is fairly tied to it using OpenAI as LLM, but since recently, Copilot can use other LLMs as well. We should support that.
The text was updated successfully, but these errors were encountered: