Server Connectivity Issues
We're here to help you get everything set up and running smoothly. Below, you'll find step-by-step instructions tailored for different scenarios to solve common connection issues with Ollama and external servers like Hugging Face.
🔌 Connection to Ollama Server
🌐 Accessing Ollama from Open WebUI
Struggling to connect to Ollama from Open WebUI? It could be because Ollama isn't listening on a network interface that allows external connections. Let's sort that out:
- **Configure Ollama to Listen Broadly** 🔧: Set `OLLAMA_HOST` to `0.0.0.0` so Ollama listens on all network interfaces.
- **Update Environment Variables**: Ensure that `OLLAMA_HOST` is set correctly within your deployment environment.
- **Restart Ollama** 🔄: A restart is needed for the changes to take effect (see the sketch after this list).
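As a sketch of how these steps might look on a Linux host where Ollama runs as a systemd service (the default for the official install script — adjust if your setup differs):

```bash
# Tell systemd to pass OLLAMA_HOST=0.0.0.0 to the Ollama service
sudo systemctl edit ollama.service
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"

# Reload unit files and restart so the change takes effect
sudo systemctl daemon-reload
sudo systemctl restart ollama
```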
💡 After setting up, verify that Ollama is accessible by visiting the WebUI interface.
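You can also verify connectivity from another machine by querying Ollama's HTTP API directly; `192.168.1.10` below is a placeholder for your server's actual address:

```bash
# Should return a small JSON payload like {"version":"..."} if Ollama is reachable
curl http://192.168.1.10:11434/api/version
```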
For more detailed instructions on configuring Ollama, please refer to Ollama's official documentation.
🐳 Docker Connection Error
If you're seeing a connection error when trying to access Ollama, it might be because the WebUI Docker container can't talk to the Ollama server running on your host. Let's fix that:
- **Adjust the Network Settings** 🛠️: Use the `--network=host` flag in your Docker command. This binds your container directly to your host's network.
- **Change the Port**: Remember that the port you use changes from 3000 to 8080, because host networking bypasses the usual `-p 3000:8080` mapping.
Example Docker Command:
```bash
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
🚀 After running the above, your WebUI should be available at http://localhost:8080.
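To double-check the result, you can probe both services from the host; since `--network=host` puts the container on the host's network stack, both should answer locally (a sketch, assuming default ports):

```bash
# Open WebUI should respond on port 8080
curl -I http://localhost:8080

# Ollama should respond on its default port 11434
curl http://127.0.0.1:11434/api/version
```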