
How to Install ⏱️

Important Note on User Roles and Privacy:
  • Admin Creation: The first account created on Open WebUI gains Administrator privileges, controlling user management and system settings.
  • User Registrations: Subsequent sign-ups start with Pending status, requiring Administrator approval for access.
  • Privacy and Data Security: All your data, including login details, is stored locally on your device. Open WebUI makes no external requests, keeping your data private and secure.
    • All models are private by default. Models must be explicitly shared via groups or by being made public. If a model is assigned to a group, only members of that group can see it. If a model is made public, anyone on the instance can see it.

Choose your preferred installation method below:

  • Docker: Officially supported and recommended for most users
  • Python: Suitable for low-resource environments or those wanting a manual setup
  • Kubernetes: Ideal for enterprise deployments that require scaling and orchestration

Manual Docker Setup

If you prefer to set up Docker manually, follow these steps for Open WebUI.

Step 1: Pull the Open WebUI Image

Start by pulling the latest Open WebUI Docker image from the GitHub Container Registry.

docker pull ghcr.io/open-webui/open-webui:main

Step 2: Run the Container

Run the container with default settings. This command includes a volume mapping to ensure persistent data storage.

docker run -d -p 3000:8080 -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main

Important Flags

  • Volume Mapping (-v open-webui:/app/backend/data): Ensures persistent storage of your data. This prevents data loss between container restarts.
  • Port Mapping (-p 3000:8080): Exposes the WebUI on port 3000 of your local machine.
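
To confirm the container started correctly, you can check its status and follow its logs (a quick sanity check; the container name open-webui matches the command above):

# Confirm the open-webui container is listed as "Up"
docker ps --filter name=open-webui

# Follow the application logs; press Ctrl+C to stop
docker logs -f open-webui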

Using GPU Support

For Nvidia GPU support, add --gpus all to the docker run command and use the :cuda image tag:

docker run -d -p 3000:8080 --gpus all -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:cuda
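
As a quick check that the container can see the GPU (this assumes the NVIDIA Container Toolkit is installed on the host and that the runtime makes nvidia-smi available inside the container), you can run:

# Should list the host GPU(s); if nvidia-smi is not found, verify the NVIDIA Container Toolkit setup
docker exec open-webui nvidia-smi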


Single-User Mode (Disabling Login)

To bypass the login page for a single-user setup, set the WEBUI_AUTH environment variable to False:

docker run -d -p 3000:8080 -e WEBUI_AUTH=False -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main
Warning: You cannot switch between single-user mode and multi-account mode after this change.
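
If you want to double-check that the variable was applied to a running container, docker inspect can print its environment (an optional sanity check):

# Look for WEBUI_AUTH=False in the printed environment
docker inspect open-webui --format '{{.Config.Env}}'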

Quick Start with Docker 🐳

Essential Docker Command Options

For Docker installation, consider the following options:

  • Persistent Storage: Use the -v open-webui:/app/backend/data option to ensure all application data remains available across sessions.
  • Exposing Ports: Use the -p flag to map internal container ports to your system.

For example, if you're installing Open WebUI with data persistence and local-only port exposure, the command would be:

docker run -d -p 127.0.0.1:3000:8080 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
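
To confirm the port is published only on the loopback interface, you can ask Docker how port 8080 was mapped (expect 127.0.0.1:3000 in the output):

# Show how the container's port 8080 is published on the host
docker port open-webui 8080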

Advanced Configuration: Connecting to Ollama on a Different Server

To connect Open WebUI to an Ollama server located on another host, add the OLLAMA_BASE_URL environment variable:

docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
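
Before starting the container, it can help to confirm that the Ollama server is reachable from the Docker host (replace https://example.com with your server's address; /api/tags is Ollama's model-listing endpoint):

# Should return a JSON list of the models available on the Ollama server
curl https://example.com/api/tags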

GPU Support with Nvidia

For running Open WebUI with Nvidia GPU support, use:

docker run -d -p 3000:8080 --gpus all -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda

Access the WebUI

After the container is running, access Open WebUI at:

http://localhost:3000
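
If the page does not load right away, the container may still be starting; a quick check from the host is:

# Prints the HTTP status code; 200 means the WebUI is up
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3000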

For detailed help on each Docker flag, see Docker's documentation.

Data Storage and Bind Mounts

This project uses Docker named volumes to persist data. If needed, replace the volume name with a host directory:

Example:

-v /path/to/folder:/app/backend/data

Ensure the host folder has the correct permissions.
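
For example, a bind-mount variant of the run command could look like this (/path/to/folder is a placeholder; create the directory first):

# Create the host directory before starting the container
mkdir -p /path/to/folder

docker run -d -p 3000:8080 -v /path/to/folder:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main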

Updating

To update your local Open WebUI container to the latest version, you can either use Watchtower or update the container manually.

Option 1: Using Watchtower

With Watchtower, you can automate the update process:

docker run --rm --volume /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --run-once open-webui

(Replace open-webui with your container's name if it's different.)
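
If you prefer unattended updates instead of a one-off run, Watchtower can also run continuously and poll on an interval (a sketch; the 3600-second interval is just an example value):

# Check for a new open-webui image every hour and update the container automatically
docker run -d --name watchtower --volume /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --interval 3600 open-webui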

Option 2: Manual Update

  1. Stop and remove the current container:
    docker rm -f open-webui
  2. Pull the latest version:
    docker pull ghcr.io/open-webui/open-webui:main
  3. Start the container again:
    docker run -d -p 3000:8080 -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main

Either method will bring your Open WebUI instance up to date with the latest build.
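
Because your data lives in the open-webui named volume rather than inside the container, it survives the docker rm -f step. You can confirm the volume is still present before and after the update:

# The open-webui volume should be listed both before and after the update
docker volume ls --filter name=open-webui

# Optional: show where Docker stores the volume on disk
docker volume inspect open-webui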

Next Steps

After installing, you are ready to start using Open WebUI!

Join the Community

Need help? Have questions? Join our community to stay updated with the latest features, troubleshooting tips, and announcements!