Open Responses

Overview

Open WebUI has experimental support for Open Responses, an open specification for multi-provider, interoperable LLM interfaces. This guide walks you through enabling the Open Responses API for a connection.

Experimental Feature

This feature is currently experimental and may not work as expected with all providers.


What is Open Responses?

Open Responses is an open-source specification that standardizes how LLM requests and responses are structured across providers. It provides:

  • One spec, many providers: Describe inputs/outputs once; run on OpenAI, Anthropic, Gemini, or local models.
  • Composable agentic loops: Unified streaming, tool invocation, and message orchestration.
  • Easier evaluation and routing: Compare providers, route requests, and log results through a shared schema.

Open Responses builds on the OpenAI Responses API format but is designed to work across any provider that implements the spec.
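As a rough illustration, a Responses-style request body can be sketched as below. The field names (`model`, `input`, `stream`) are assumed from the OpenAI Responses API format that Open Responses builds on; individual providers may vary.

```python
import json

# Hypothetical Responses-style request body. The schema is assumed from
# the OpenAI Responses API, which Open Responses builds on.
request_body = {
    "model": "gpt-4o-mini",   # any model ID your provider exposes
    "input": [                # unified input list, instead of "messages"
        {"role": "user", "content": "Summarize this paragraph in one sentence."}
    ],
    "stream": False,
}

payload = json.dumps(request_body)
print(payload)
```

The same body shape is meant to work against any provider that implements the spec, which is what makes the shared schema useful for evaluation and routing.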


Step 1: Add or Edit a Connection

  1. Go to the ⚙️ Admin Settings.
  2. Navigate to Connections > OpenAI > Manage (look for the wrench icon).
  3. Click ➕ Add New Connection or edit an existing connection.

Step 2: Select the API Type

In the connection modal, look for the API Type setting:

  • Chat Completions (default): Uses the standard OpenAI Chat Completions API format.
  • Responses (experimental): Uses the Open Responses API format.

Click the toggle to switch to Responses. You'll see an "Experimental" label indicating this feature is still in development.

*(Screenshot: the API Type toggle in the connection modal)*


Step 3: Configure Your Provider

Enter the connection details for a provider that supports the Open Responses format:

  • URL: Your provider's API endpoint
  • API Key: Your authentication credentials
  • Model IDs: (Optional) Limit the connection to particular models
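To sanity-check the details you entered, you can build the same request Open WebUI would send. This is a minimal sketch: the base URL, API key, and `/responses` endpoint path are placeholder assumptions, and the request is constructed but deliberately not sent so the sketch runs without network access.

```python
import json
import urllib.request

# All values below are placeholders -- substitute your own connection details.
BASE_URL = "https://api.example.com/v1"   # the URL field from the modal
API_KEY = "your-api-key"                  # the API Key field

req = urllib.request.Request(
    url=f"{BASE_URL}/responses",          # Responses-style endpoint (assumed path)
    data=json.dumps({"model": "my-model", "input": "Hello"}).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would actually send it; omitted here.
print(req.full_url, req.get_method())
```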

Supported Providers

Open Responses is a new specification, so provider support is still growing. Check the Open Responses website for the latest list of compliant providers and implementations.


Troubleshooting

Connection works with Chat Completions but not Responses

Not all providers support the Open Responses format yet. Try:

  1. Switching back to Chat Completions
  2. Checking if your provider explicitly supports Open Responses
  3. Using a middleware proxy that can translate to Open Responses format
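If no ready-made proxy fits your setup, the core of the translation is small in principle. The sketch below maps a Chat Completions-style request body to a Responses-style one; the field names on both sides are assumptions based on the OpenAI formats, not a guaranteed mapping for every provider.

```python
def chat_to_responses(chat_request: dict) -> dict:
    """Map a Chat Completions-style body to a Responses-style body.

    Field names are assumed from the OpenAI Chat Completions and
    Responses API formats; real providers may differ.
    """
    return {
        "model": chat_request["model"],
        # Chat Completions uses "messages"; Responses uses "input".
        "input": chat_request.get("messages", []),
        "stream": chat_request.get("stream", False),
    }

chat = {"model": "my-model", "messages": [{"role": "user", "content": "Hi"}]}
print(chat_to_responses(chat))
```

A real proxy would also have to translate the response and streaming events back the other way, which is where most of the per-provider work lives.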

Streaming or tool calls behave unexpectedly

This feature is experimental. If you encounter issues:

  1. Check the Open WebUI Discord for known issues
  2. Report bugs on GitHub

Learn More