# Migration Guide: Open WebUI 0.4 to 0.5
Welcome to the Open WebUI 0.5 migration guide! Whether you're maintaining existing projects or building new ones, this guide walks you through the key changes from version 0.4 to 0.5 and provides an easy-to-follow roadmap for upgrading your Functions. Let's make this transition as smooth as possible!
## What Has Changed and Why?
With Open WebUI 0.5, we've overhauled the architecture to make the project simpler, more unified, and scalable. Here's the big picture:
- **Old Architecture:** Previously, Open WebUI was built on a sub-app architecture where each app (e.g., `ollama`, `openai`) was a separate FastAPI application. This caused fragmentation and extra complexity when managing apps.
- **New Architecture:** With version 0.5, we have transitioned to a single FastAPI app with multiple routers. This means better organization, centralized flow, and reduced redundancy.
### Key Changes
Here's an overview of what changed:
- **Apps have been moved to routers.**
  - Previous: `open_webui.apps`
  - Now: `open_webui.routers`
- **Main app structure simplified.**
  - The old `open_webui.apps.webui` has been transformed into `open_webui.main`, making it the central entry point for the project.
- **Unified API endpoint.**
  - Open WebUI 0.5 introduces a unified function, `chat_completion`, in `open_webui.main`, replacing the separate functions for models like `ollama` and `openai`. This offers a consistent and streamlined API experience. However, the direct successor of those individual functions is `generate_chat_completion` from `open_webui.utils.chat`. If you prefer a lightweight POST request without additional parsing (e.g., files, tools, or miscellaneous handling), this utility function is likely what you want.
#### Example

```python
# Full API flow with parsing (new function):
from open_webui.main import chat_completion

# Lightweight, direct POST request (direct successor):
from open_webui.utils.chat import generate_chat_completion
```

Choose the approach that best fits your use case!
- **Updated function signatures.**
  - Function signatures now adhere to a new format that requires a `request` object.
  - The `request` object is available through the `__request__` parameter in the function signature. Below is an example:
```python
from fastapi import Request

class Pipe:
    def __init__(self):
        pass

    async def pipe(
        self,
        body: dict,
        __user__: dict,
        __request__: Request,  # New parameter
    ) -> str:
        # Write your function logic here
        ...
```
**Why did we make these changes?**
- To simplify the codebase, making it easier to extend and maintain.
- To unify APIs for a more streamlined developer experience.
- To enhance performance by consolidating redundant elements.
## Step-by-Step Migration Guide
Follow this guide to smoothly update your project.
### 1. Shifting from `apps` to `routers`
All apps have been renamed and relocated under `open_webui.routers`. This affects imports in your codebase.
Quick changes for import paths:
| Old Path                    | New Path                       |
|-----------------------------|--------------------------------|
| `open_webui.apps.ollama`    | `open_webui.routers.ollama`    |
| `open_webui.apps.openai`    | `open_webui.routers.openai`    |
| `open_webui.apps.audio`     | `open_webui.routers.audio`     |
| `open_webui.apps.retrieval` | `open_webui.routers.retrieval` |
| `open_webui.apps.webui`     | `open_webui.main`              |
#### An Important Example
To clarify the special case of the main app (`webui`), here's a simple rule of thumb:
- Was in `webui`? It's now in the project's root or `open_webui.main`. For example:
  - Before (0.4): `from open_webui.apps.webui.models import SomeModel`
  - After (0.5): `from open_webui.models import SomeModel`
In general, just replace `open_webui.apps` with `open_webui.routers`, except for `webui`, which is now `open_webui.main`!
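This rule of thumb is mechanical enough to script. Below is a minimal sketch of the mapping; the helper name `migrate_import_path` is hypothetical and not part of Open WebUI:

```python
import re

def migrate_import_path(path: str) -> str:
    """Map a 0.4 module path to its 0.5 equivalent.

    Hypothetical helper for illustration; not part of Open WebUI.
    """
    # Special case: the old webui sub-app became the main module.
    if path == "open_webui.apps.webui":
        return "open_webui.main"
    if path.startswith("open_webui.apps.webui."):
        # Contents of the webui app moved to the project's root package.
        return "open_webui." + path[len("open_webui.apps.webui."):]
    # General rule: apps -> routers.
    return re.sub(r"^open_webui\.apps\.", "open_webui.routers.", path)
```

For example, `migrate_import_path("open_webui.apps.webui.models")` yields `open_webui.models`, matching the Before/After pair above.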
### 2. Updating Import Statements
Let's look at what this update looks like in your code:
#### Before

```python
from open_webui.apps.ollama import main as ollama
from open_webui.apps.openai import main as openai
```
#### After

```python
# Separate router imports (aliased so the two names don't collide)
from open_webui.routers.ollama import generate_chat_completion as generate_ollama_chat_completion
from open_webui.routers.openai import generate_chat_completion as generate_openai_chat_completion

# Or use the unified endpoint
from open_webui.main import chat_completion
```
**Pro Tip:** Prioritize the unified endpoint (`chat_completion`) for simplicity and future compatibility.
#### Additional Note: Choosing Between `main.chat_completion` and `utils.chat.generate_chat_completion`
Depending on your use case, you can choose between:
- `open_webui.main.chat_completion`:
  - Simulates making a POST request to `/api/chat/completions`.
  - Processes files, tools, and other miscellaneous tasks.
  - Best when you want the complete API flow handled automatically.
- `open_webui.utils.chat.generate_chat_completion`:
  - Directly makes a POST request without handling extra parsing or tasks.
  - This is the direct successor to the previous `main.generate_chat_completions`, `ollama.generate_chat_completion`, and `openai.generate_chat_completion` functions in Open WebUI 0.4.
  - Best for simpler, more lightweight scenarios.
#### Example

```python
# Use this for the full API flow with parsing:
from open_webui.main import chat_completion

# Use this for a stripped-down, direct POST request:
from open_webui.utils.chat import generate_chat_completion
```
### 3. Adapting to Updated Function Signatures

We've updated the function signatures to better fit the new architecture. If you're looking for a direct replacement, start with the lightweight utility function `generate_chat_completion` from `open_webui.utils.chat`. For the full API flow, use the new unified `chat_completion` function in `open_webui.main`.
#### Function Signature Changes

| Old | Direct Successor (New) | Unified Option (New) |
|---|---|---|
| `openai.generate_chat_completion(form_data: dict, user: UserModel)` | `generate_chat_completion(request: Request, form_data: dict, user: UserModel)` | `chat_completion(request: Request, form_data: dict, user: UserModel)` |
- **Direct Successor (`generate_chat_completion`):** A lightweight, 1:1 replacement for the previous `ollama`/`openai` methods.
- **Unified Option (`chat_completion`):** Use this for the complete API flow, including file parsing and additional functionality.
#### Example

If you're migrating to the new signatures, the refactoring walkthrough below shows how your function should look now.
### How to Refactor Your Custom Function
Let's rewrite a sample function to match the new structure:
#### Before (0.4)

```python
from pydantic import BaseModel
from open_webui.apps.ollama import generate_chat_completion

class User(BaseModel):
    id: str
    email: str
    name: str
    role: str

class Pipe:
    def __init__(self):
        pass

    async def pipe(self, body: dict, __user__: dict) -> str:
        # Calls the Ollama endpoint (0.4-style signature)
        user = User(**__user__)
        body["model"] = "llama3.2:latest"
        return await generate_chat_completion(body, user)
```
#### After (0.5)

```python
from pydantic import BaseModel
from fastapi import Request
from open_webui.utils.chat import generate_chat_completion

class User(BaseModel):
    id: str
    email: str
    name: str
    role: str

class Pipe:
    def __init__(self):
        pass

    async def pipe(
        self,
        body: dict,
        __user__: dict,
        __request__: Request,
    ) -> str:
        # Uses the updated signature, passing the request object first
        user = User(**__user__)
        body["model"] = "llama3.2:latest"
        return await generate_chat_completion(__request__, body, user)
```
#### Important Notes

- You must pass the `Request` object (`__request__`) in the new function signature.
- Other optional parameters (like `__user__` and `__event_emitter__`) provide flexibility for more complex use cases.
### 4. Recap: Key Concepts in Simple Terms

Here's a quick cheat sheet to remember:
- **Apps to routers:** Update all imports from `open_webui.apps` to `open_webui.routers`.
- **Unified endpoint:** Use `open_webui.main.chat_completion` for simplicity if both `ollama` and `openai` are involved.
- **Adapt function signatures:** Ensure your functions pass the required `request` object.
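To double-check a migrated codebase, you can scan for leftover 0.4 paths. A minimal sketch (the `find_outdated_imports` helper is hypothetical, not part of Open WebUI):

```python
import re

# Any reference to the old sub-app package is a migration leftover.
OLD_IMPORT = re.compile(r"\bopen_webui\.apps\.(\w+)")

def find_outdated_imports(source: str) -> list:
    """Return the old sub-app names still referenced in the given source text."""
    return sorted(set(OLD_IMPORT.findall(source)))
```

Run it over each file's text; an empty result means no `open_webui.apps` references remain.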
## Hooray! You're Ready!
That's it! You've successfully migrated from Open WebUI 0.4 to 0.5. By refactoring your imports, using the unified endpoint, and updating function signatures, you'll be fully equipped to leverage the latest features and improvements in version 0.5.
**Questions or feedback?** If you run into any issues or have suggestions, feel free to open a GitHub issue or ask in the community forums!
Happy coding!