Use local models like gpt4all #1306
Closed

prenesh0309 started this conversation in General

Replies: 2 comments · 3 replies
- If your model is served as a REST API, you can change the OpenAI call in llm_utils.py to make an HTTP request to your API endpoint instead.
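As a rough illustration of that suggestion, the snippet below sketches what a drop-in replacement for the OpenAI call in llm_utils.py could look like, assuming the local model (e.g. a gpt4all server) exposes an OpenAI-compatible chat-completions endpoint. The URL, port, model name, and response shape here are all assumptions, not confirmed details of any particular server:

```python
import json
import urllib.request

# ASSUMPTION: a local server exposing an OpenAI-compatible REST API.
# The URL, port, and model name below are hypothetical placeholders.
LOCAL_API_URL = "http://localhost:4891/v1/chat/completions"


def build_payload(messages, model="gpt4all-j", temperature=0.7):
    """Build an OpenAI-style chat payload for the local endpoint."""
    return {"model": model, "messages": messages, "temperature": temperature}


def create_chat_completion(messages, model="gpt4all-j", temperature=0.7):
    """Sketch of replacing the openai call in llm_utils.py:
    POST the same payload to the local endpoint instead."""
    data = json.dumps(build_payload(messages, model, temperature)).encode()
    req = urllib.request.Request(
        LOCAL_API_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Assumes the server mirrors OpenAI's response structure.
    return body["choices"][0]["message"]["content"]
```

If the local server does not speak the OpenAI format, only `build_payload` and the response parsing need to change; the rest of the calling code can stay the same.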
-
Closing as duplicate of #34
- Can this be configured to use a locally running GPT, like gpt4all? That would reduce the API costs significantly.