
⏱️ Quick Start

Important Note on User Roles and Privacy:
  • Admin Creation: The first account created on Open WebUI gains Administrator privileges, controlling user management and system settings.
  • User Registrations: Subsequent sign-ups start with Pending status, requiring Administrator approval for access.
  • Privacy and Data Security: All your data, including login details, is stored locally on your device. Open WebUI makes no external requests, ensuring strict confidentiality and enhanced privacy and security.
    • All models are private by default. Models must be explicitly shared via groups or by being made public. If a model is assigned to a group, only members of that group can see it. If a model is made public, anyone on the instance can see it.

Choose your preferred installation method below:

  • Docker: Officially supported and recommended for most users
  • Python: Suitable for low-resource environments or those wanting a manual setup (see the sketch after this list)
  • Kubernetes: Ideal for enterprise deployments that require scaling and orchestration
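
If you take the Python route, the typical flow (a minimal sketch, assuming Python 3.11 and the open-webui package from PyPI) looks like this:

# Install Open WebUI from PyPI
pip install open-webui

# Start the server; by default it listens on port 8080
open-webui serve

Docker remains the recommended path for most users, and the rest of this page focuses on it.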

Quick Start with Docker 🐳

Follow these steps to install Open WebUI with Docker.

Step 1: Pull the Open WebUI Image

Start by pulling the latest Open WebUI Docker image from the GitHub Container Registry.

docker pull ghcr.io/open-webui/open-webui:main

Step 2: Run the Container

Run the container with default settings. This command includes a volume mapping to ensure persistent data storage.

docker run -d -p 3000:8080 -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main

Important Flags

  • Volume Mapping (-v open-webui:/app/backend/data): Ensures persistent storage of your data. This prevents data loss between container restarts (you can verify the volume as shown below).
  • Port Mapping (-p 3000:8080): Exposes the WebUI on port 3000 of your local machine.
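
To confirm the named volume exists and see where Docker keeps it on the host (standard Docker commands; the mount path varies by system):

# List volumes and inspect the one used by Open WebUI
docker volume ls
docker volume inspect open-webui

To serve the UI on a different host port, change only the left-hand side of the mapping, for example -p 8081:8080.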

Using GPU Support

For NVIDIA GPU support, add --gpus all to the docker run command and use the :cuda image tag:

docker run -d -p 3000:8080 --gpus all -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:cuda
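
If the container cannot see the GPU, a quick sanity check is to run NVIDIA's standard test image directly (this assumes the NVIDIA Container Toolkit is installed; the image tag is only an example):

# Should print the same GPU table that nvidia-smi shows on the host
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi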

Single-User Mode (Disabling Login)

To bypass the login page for a single-user setup, set the WEBUI_AUTH environment variable to False:

docker run -d -p 3000:8080 -e WEBUI_AUTH=False -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main

Warning: You cannot switch between single-user mode and multi-account mode after this change.
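
If you later need multi-account mode back, the only clean path is to start over with a fresh data volume, which deletes all existing chats and settings (a sketch; only do this if losing that data is acceptable):

# Remove the container and its data volume, then recreate it without WEBUI_AUTH=False
docker rm -f open-webui
docker volume rm open-webui
docker run -d -p 3000:8080 -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main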

Advanced Configuration: Connecting to Ollama on a Different Server

To connect Open WebUI to an Ollama server located on another host, add the OLLAMA_BASE_URL environment variable:

docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
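
Before pointing Open WebUI at a remote Ollama server, it can help to confirm the URL is reachable from the Docker host (the URL below matches the example above; /api/tags is Ollama's model-listing endpoint):

# Should return a JSON list of the models available on the remote Ollama server
curl https://example.com/api/tags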

Access the WebUI

After the container is running, access Open WebUI at:

http://localhost:3000
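
If the page does not load, check that the container is up and watch its startup logs (standard Docker commands):

# Confirm the container is running and see which ports are published
docker ps --filter name=open-webui

# Follow the application logs while it starts up
docker logs -f open-webui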

For detailed help on each Docker flag, see Docker's documentation.

Updating

To update your local Docker installation to the latest version, you can either use Watchtower or manually update the container.

Option 1: Using Watchtower

With Watchtower, you can automate the update process:

docker run --rm --volume /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --run-once open-webui

(Replace open-webui with your container's name if it's different.)
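
To have Watchtower keep checking for new images on a schedule instead of running once, you can leave it running as its own container (the interval is an example, given in seconds):

# Check for and apply Open WebUI updates once a day
docker run -d --name watchtower -v /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --interval 86400 open-webui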

Option 2: Manual Update

  1. Stop and remove the current container:

    docker rm -f open-webui
  2. Pull the latest version:

    docker pull ghcr.io/open-webui/open-webui:main
  3. Start the container again:

    docker run -d -p 3000:8080 -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main

Both methods update your Docker instance to the latest build and leave it running.
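
Old image layers are not removed automatically after an update; if disk space matters, you can reclaim it with Docker's standard cleanup command (it removes all dangling images, not just Open WebUI's):

# Remove dangling image layers left behind by previous pulls
docker image prune -f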

Next Steps

After installing, you're ready to start using Open WebUI!

Using Open WebUI with Ollama

If you're using Open WebUI with Ollama, be sure to check out our Starting with Ollama Guide to learn how to manage your Ollama instances with Open WebUI.

Join the Community

Need help? Have questions? Join our community to stay updated with the latest features, troubleshooting tips, and announcements!