# Starting With vLLM

## Overview
vLLM provides an OpenAI-compatible API, so Open WebUI can talk to it like any other OpenAI endpoint. This guide walks you through adding your vLLM server as a connection.
## Step 1: Set Up Your vLLM Server
Make sure your vLLM server is running and accessible. The default API base URL is typically:
```
http://localhost:8000/v1
```
For remote servers, use the appropriate hostname or IP address.
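If you want to confirm the server is reachable before configuring Open WebUI, you can query its OpenAI-compatible models endpoint directly. Below is a minimal sketch, assuming a local server on port 8000; the model ids it prints depend on what you launched vLLM with (e.g. a command like `vllm serve <model-name>`).

```python
# Minimal reachability check for a local vLLM server (assumes port 8000).
import json
from urllib.request import urlopen

# The OpenAI-compatible /v1/models endpoint lists what the server is serving.
with urlopen("http://localhost:8000/v1/models") as resp:
    data = json.load(resp)

for model in data["data"]:
    print(model["id"])  # e.g. the model name you passed to vLLM at launch
```

If this prints at least one model id, the server is up and the URL is correct.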
## Step 2: Add the API Connection in Open WebUI
- Go to ⚙️ Admin Settings.
- Navigate to Connections > OpenAI > Manage (look for the wrench icon).
- Click ➕ Add New Connection.
- Fill in the following:
  - API URL: `http://localhost:8000/v1` (or your vLLM server URL)
  - API Key: Leave empty (vLLM typically doesn't require an API key for local connections)
- Click Save.
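You can verify the same URL/key pair outside Open WebUI with the official `openai` Python client. A sketch, assuming the server was started without an API key (in that case vLLM ignores whatever key the client sends):

```python
# Verify the connection details you entered in Open WebUI, using the
# openai client pointed at the vLLM server instead of api.openai.com.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # same value as the API URL field
    api_key="EMPTY",  # placeholder; only checked if the server enforces a key
)

print([m.id for m in client.models.list().data])
```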
## Step 3: Start Using Models
Open WebUI automatically fetches the list of models available on your vLLM server. Select one from the Model Selector and start chatting.
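For reference, a chat request like the ones Open WebUI sends on your behalf can be reproduced with the same client. A sketch, where `your-model-name` is a placeholder for one of the model ids listed in the previous step:

```python
# A single chat completion against the vLLM server, mirroring what
# Open WebUI does when you send a message.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="your-model-name",  # replace with a model id from /v1/models
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```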