Configure Hymalaia to use Ollama
⚠️ Note: While we support self-hosted LLMs, you will get significantly better responses with more powerful models like GPT-4.
📘 Note: For the API Base, when using Docker, point to `host.docker.internal` instead of `localhost` (e.g., `http://host.docker.internal:11434`).
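To confirm the API Base is reachable before saving the configuration, you can query Ollama's `/api/tags` endpoint (which lists the locally pulled models) from inside a Hymalaia container. This is a sketch: the container name `hymalaia-api-server` is an assumption, so substitute the name from your own `docker ps` output.

```shell
# Check that Ollama is reachable from inside a Hymalaia container.
# "hymalaia-api-server" is a placeholder container name; replace it
# with the actual name shown by `docker ps`.
docker exec hymalaia-api-server \
  curl -s http://host.docker.internal:11434/api/tags
```

If the command returns a JSON list of models, the API Base is correct; a connection error usually means Ollama is not running on the host or `host.docker.internal` is not resolvable in your Docker setup.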