Tag Archive for docker, docker-compose, ollama

Connection refused error when using Ollama in a Docker environment

I’m developing a FastAPI application that uses Langchain with Ollama. The application is containerized using Docker, and I’m trying to connect to a separate Ollama container. However, I’m encountering a connection refused error when attempting to use the Ollama service.
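A connection refused error in this situation is usually caused by the application pointing at localhost, which inside the FastAPI container refers only to that container; services in the same Compose network reach each other by service name instead. Below is a minimal docker-compose sketch under that assumption. The service names (ollama, api) and the OLLAMA_BASE_URL variable are hypothetical, and the application would have to read that variable when constructing its Ollama client.

```yaml
services:
  ollama:
    image: ollama/ollama
    # Ollama listens on port 11434 inside the compose network

  api:
    build: .
    environment:
      # Hypothetical variable the FastAPI app reads to build its Ollama base URL.
      # Key point: use the service name "ollama" as the host, not localhost.
      OLLAMA_BASE_URL: "http://ollama:11434"
    depends_on:
      - ollama
```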

Ollama + Docker compose: how to pull model automatically with container creation?

When trying to access the ollama container from another (node) service in my docker compose setup, I get the following error: ResponseError: model 'llama3' not found, try pulling it first. I want the container setup to be fully automatic, and I don’t want to connect to the containers and pull the models by hand.
Is there a way to load the model of my choice automatically when I create the ollama docker container?
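One common approach is to override the container's entrypoint so it starts the server and then pulls the model in the same command, keeping the server running afterwards. Below is a minimal sketch, assuming the official ollama/ollama image and the llama3 model mentioned in the question; the fixed sleep is an arbitrary wait for the server to become ready, and the named volume simply persists pulled models across container recreation.

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama   # persist pulled models across container recreation
    entrypoint: ["/bin/sh", "-c"]
    command:
      - |
        # start the server in the background, wait briefly, pull the model, then keep serving
        ollama serve &
        sleep 5
        ollama pull llama3
        wait

volumes:
  ollama_data:
```

With this in place, the node service can call the model as soon as the pull finishes; on later restarts the pull is a no-op because the model is already in the volume.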