# Connect a Provider
Connect Open WebUI to any model provider and start chatting in minutes.
Open WebUI supports multiple connection protocols, including Ollama, OpenAI-compatible APIs, and Open Responses. Any cloud API or local server that speaks one of these protocols works out of the box. Just add a URL and API key, and your models appear in the dropdown.
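Concretely, "speaking one of these protocols" mostly means exposing standard HTTP endpoints such as `/models` and `/chat/completions`. As a minimal sketch (the base URL and API key below are placeholders, not real endpoints), this is how a client can discover which models an OpenAI-compatible provider offers:

```python
import json
import urllib.request

def fetch_models(base_url: str, api_key: str) -> dict:
    """GET /models from an OpenAI-compatible provider."""
    req = urllib.request.Request(
        f"{base_url.rstrip('/')}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def model_ids(models_response: dict) -> list:
    """OpenAI-compatible servers reply with {"data": [{"id": ...}, ...]}."""
    return [m["id"] for m in models_response["data"]]

# Example (requires a reachable provider):
# ids = model_ids(fetch_models("https://api.example.com/v1", "sk-..."))
```

This same model-listing call is what lets a UI populate its model dropdown automatically after you enter a URL and key.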
## How It Works
```
┌──────────────┐    HTTP     ┌──────────────────┐  Inference  ┌──────────────┐
│              │────────────▶│                  │────────────▶│              │
│  Open WebUI  │             │   Provider API   │             │    Model     │
│  (frontend)  │◀────────────│  (cloud/local)   │◀────────────│  (LLM/VLM)   │
│              │   Stream    │                  │   Tokens    │              │
└──────────────┘             └──────────────────┘             └──────────────┘
```
1. You type a message in Open WebUI
2. Open WebUI sends it to your provider's API endpoint
3. The provider runs inference on the selected model
4. Tokens stream back to Open WebUI in real time
5. You see the response in the chat interface
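Under the hood, the steps above boil down to one streamed HTTP request. A minimal sketch, assuming an OpenAI-compatible provider (URL, key, and model name are placeholders): the request sets `"stream": true`, and the provider answers with server-sent-event lines of the form `data: {...}`, each carrying a token delta.

```python
import json
import urllib.request

def build_chat_request(base_url, api_key, model, messages):
    """Build the streamed POST /chat/completions request (step 2 above)."""
    body = {"model": model, "messages": messages, "stream": True}
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

def token_from_sse_line(line):
    """Pull the token text out of one 'data: {...}' stream line (step 4),
    or return None for keep-alives and the final 'data: [DONE]' marker."""
    if not line.startswith("data: ") or line.strip() == "data: [DONE]":
        return None
    chunk = json.loads(line[len("data: "):])
    return chunk["choices"][0]["delta"].get("content")
```

Rendering each non-`None` token as it arrives is what produces the "typing" effect you see in the chat interface (step 5).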
Adding a provider is as simple as entering a URL and API key in Admin Settings → Connections. Open WebUI auto-detects available models from most providers.
## Cloud Providers

Hosted APIs that require an account and API key. No hardware needed. (Ollama, listed first, is the exception: it runs models on your own machine.)
| Provider | Models | Guide |
|---|---|---|
| Ollama | Llama, Mistral, Gemma, Phi, and thousands more (local) | Starting with Ollama → |
| OpenAI | GPT-4o, GPT-4.1, o3, o4-mini | Starting with OpenAI → |
| Anthropic | Claude Opus, Sonnet, Haiku | Starting with Anthropic → |
| OpenAI-Compatible | Google Gemini, DeepSeek, Mistral, Groq, OpenRouter, Amazon Bedrock, Azure, and more | OpenAI-Compatible Providers → |
## Local Servers
Run models on your own hardware. No API keys, no cloud dependency.
| Server | Description | Guide |
|---|---|---|
| llama.cpp | Efficient GGUF model inference with OpenAI-compatible API | Starting with llama.cpp → |
| vLLM | High-throughput inference engine for production workloads | Starting with vLLM → |
More local servers (LM Studio, LocalAI, Docker Model Runner, Lemonade) are covered in the OpenAI-Compatible Providers guide.
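Before adding a local server in Admin Settings → Connections, it can help to confirm it is actually reachable. A small sketch that probes the server's OpenAI-compatible `/models` endpoint (the default port `8080` below matches llama.cpp's `llama-server`; vLLM typically uses `8000` and LM Studio `1234`, so adjust for your setup):

```python
import urllib.request

def local_server_ready(base_url="http://localhost:8080/v1"):
    """Return True if an OpenAI-compatible server answers GET /models."""
    try:
        url = f"{base_url.rstrip('/')}/models"
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, timeout, or HTTP error: server not usable yet.
        return False
```

If this returns `False`, check that the server process is running and that the URL you enter in Open WebUI includes the `/v1` path prefix where the server expects it.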
## Other Connection Methods
| Feature | Description | Guide |
|---|---|---|
| Open Responses | Connect providers using the Open Responses specification | Starting with Open Responses → |
| Functions | Extend Open WebUI with custom pipe functions for any backend | Starting with Functions → |
## Looking for Agents?
If you want to connect an autonomous AI agent (with terminal access, file operations, web search, and more) instead of a plain model provider, see Connect an Agent.