Langfuse Integration with OpenWebUI
Langfuse (GitHub) offers open source observability and evaluations for OpenWebUI. By enabling the Langfuse integration, you can trace your application data with Langfuse to develop, monitor, and improve the use of OpenWebUI, including:
- Application traces
- Usage patterns
- Cost data by user and model
- Replay sessions to debug issues
- Evaluations
How to integrate Langfuse with OpenWebUI
Langfuse integration steps
Pipelines in OpenWebUI is a UI-agnostic framework for OpenAI API plugins. It enables the injection of plugins that intercept, process, and forward user prompts to the final LLM, allowing for enhanced control and customization of prompt handling.
To trace your application data with Langfuse, you can use the Langfuse pipeline, which enables real-time monitoring and analysis of message interactions.
Quick Start Guide
Step 1: Set Up OpenWebUI
Make sure OpenWebUI is running. To set it up, see the OpenWebUI documentation.
Step 2: Set Up Pipelines
Launch Pipelines with Docker using the following command:
docker run -p 9099:9099 --add-host=host.docker.internal:host-gateway -v pipelines:/app/pipelines --name pipelines --restart always ghcr.io/open-webui/pipelines:main
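Once the container is up, you can check that the Pipelines server is reachable from the host. The helper below builds an authorized request using the default Pipelines API key; the `/models` path is an assumption based on the server exposing OpenAI-compatible routes at its root:

```python
import urllib.request

# Defaults match the docker run command above; the key is the default
# Pipelines password shown in Step 3.
PIPELINES_URL = "http://localhost:9099"
API_KEY = "0p3n-w3bu!"

def build_models_request(base_url: str = PIPELINES_URL,
                         api_key: str = API_KEY) -> urllib.request.Request:
    """Build an authorized GET request for the models endpoint."""
    return urllib.request.Request(
        f"{base_url}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

# With the container running, this should return HTTP 200 and a model list:
#   with urllib.request.urlopen(build_models_request(), timeout=5) as resp:
#       print(resp.status)
```

If the request fails, confirm the container is running (`docker ps`) and that port 9099 is not in use by another process.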
Step 3: Connect OpenWebUI to Pipelines
In the Admin Settings, create and save a new connection of type OpenAI API with the following details:
- URL: http://host.docker.internal:9099 (this is where the previously launched Docker container is running).
- Password: 0p3n-w3bu! (the default Pipelines password)
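Since the connection is of type OpenAI API, any OpenAI-style client can exercise it the same way OpenWebUI does. The sketch below builds a chat completion request against the Pipelines server from the host; the `/chat/completions` path and the model id are assumptions that depend on which pipelines your setup exposes:

```python
import json
import urllib.request

# Connection details from Step 3, addressed from the host machine.
BASE_URL = "http://localhost:9099"
API_KEY = "0p3n-w3bu!"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion POST for the Pipelines server."""
    payload = {
        "model": model,  # hypothetical id; use a model your pipelines expose
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Requests sent through this connection pass through any installed filter pipelines, which is what lets the Langfuse pipeline trace them.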