Let's bring it all home #1059
JamesStallings
started this conversation in
General
Replies: 1 comment 1 reply
-
Hello @JamesStallings! You can use phidata with Ollama without Docker: https://docs.phidata.com/llms/ollama. We do use a Postgres server running on Docker for the Assistant's memory and knowledge base, but that is optional and depends on your use case. So Docker is not needed to get an LLM working with phidata + Ollama.
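For reference, here is a minimal sketch of the fully local setup the linked docs describe. It assumes Ollama is installed and serving locally, and that a model has already been pulled; the `llama3` model name is just an example, swap in whatever you have pulled.

```python
# Minimal sketch: a local phidata Assistant backed by Ollama, no Docker.
# Assumes `ollama serve` is running and the model was pulled beforehand,
# e.g. `ollama pull llama3` (model name is an example, not a requirement).
from phi.assistant import Assistant
from phi.llm.ollama import Ollama

assistant = Assistant(
    llm=Ollama(model="llama3"),  # talks to the local Ollama server
    description="A fully local assistant: no cloud APIs, no Docker.",
)

# Stream the response to stdout
assistant.print_response("Summarize what phidata does in two sentences.")
```

No Postgres or Docker is involved here; those only come into play if you want persistent memory or a knowledge base.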
-
Is it not possible to use phidata with local LLMs running under e.g. Ollama, keeping everything local and open source? And please, without Docker? I have no interest in dodging the free token quotas of the cloud LLMs, and no interest in sending them money.
Docker has too much overhead, from a resource perspective and a complexity perspective. Too many moving parts, too much to go wrong, too much misdirection.
Give us the simple recipe, let the docker wizards do their thing with it.