Open Responses lets you run a fully self-hosted version of OpenAI's Responses API. It works seamlessly with any large language model (LLM) provider, whether that's Claude, Qwen, DeepSeek R1, Ollama, or others. It's a fully compatible drop-in replacement for the official API: swap out OpenAI without changing your existing Agents SDK code.
The Responses API is our newest core API and an agentic API primitive, combining the simplicity of Chat Completions with the ability to do more agentic tasks. As model capabilities evolve, the Responses API is a flexible foundation for building action-oriented applications, with built-in tools.
This project is developed by the team behind Julep AI, the open-source platform making it easy for data teams to build, deploy, and scale stateful AI agents and workflows. Check us out on GitHub:
Why use Open Responses?
Bring Your Own Model - Compatible with any LLM provider you prefer.
Privacy First - Fully self-hosted, giving you total control over your data.
Easy Switch - Drop-in replacement compatible with OpenAI's official Agents SDK.
Fast Setup - Get started quickly with Docker or our straightforward CLI.
Built-in Tools - Supports automatic tool calls like web searches using open-source alternatives (see the sketch after this list).
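As an illustration of the built-in tools, here is a minimal sketch of asking for a web search while creating a response. It assumes the self-hosted server accepts the same web_search_preview tool type as OpenAI's hosted Responses API; the exact tool name exposed by Open Responses may differ.

import os
from openai import OpenAI

client = OpenAI(
    base_url="https://localhost:8080/",
    api_key=os.getenv("RESPONSE_API_KEY")
)

# Assumption: the "web_search_preview" tool type mirrors OpenAI's hosted API;
# Open Responses may name its open-source search tool differently.
response = client.responses.create(
    model="gpt-4o-mini",
    tools=[{"type": "web_search_preview"}],
    input="What was the top science news story this week?"
)

# output_text aggregates the text from all output items, which is convenient
# when tool-call items appear in the output list before the final message.
print(response.output_text)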
from openai import AsyncOpenAI
from agents import Agent, Runner, set_default_openai_client

# Create and configure the OpenAI client
custom_client = AsyncOpenAI(base_url="https://localhost:8080/", api_key="YOUR_RESPONSES_API_KEY")
set_default_openai_client(custom_client)

agent = Agent(
    name="Test Agent",
    instructions="You are a helpful assistant that provides concise responses.",
    model="openrouter/deepseek/deepseek-r1",
)

result = await Runner.run(agent, "Hello! Are you working correctly?")
print(result.final_output)
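Note that Runner.run is a coroutine, so the snippet above assumes it runs inside an async context. A minimal wrapper for a standalone script might look like this:

import asyncio

async def main() -> None:
    # Reuses the agent configured above and awaits the run to completion.
    result = await Runner.run(agent, "Hello! Are you working correctly?")
    print(result.final_output)

asyncio.run(main())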
import { OpenAI } from 'openai';

const client = new OpenAI({
  baseURL: 'https://localhost:8080/',
  apiKey: "RESPONSE_API_KEY"
});

const response = await client.responses.create({
  model: "gpt-4o-mini",
  input: "What's the population of the world today?"
});

console.log(response.output[0].content[0].text);
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://localhost:8080/",
    api_key=os.getenv("RESPONSE_API_KEY")
)

response = client.responses.create(
    model="gpt-4o-mini",
    input="What's the population of the world today?"
)

print(response.output[0].content[0].text)
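If the self-hosted endpoint mirrors OpenAI's hosted Responses API, streaming should also work unchanged. A hedged sketch, assuming the standard stream=True flag and the response.output_text.delta event type from the official Python SDK:

import os
from openai import OpenAI

client = OpenAI(
    base_url="https://localhost:8080/",
    api_key=os.getenv("RESPONSE_API_KEY")
)

# Assumption: Open Responses emits the same streaming events as OpenAI's API.
stream = client.responses.create(
    model="gpt-4o-mini",
    input="Write one sentence about self-hosting.",
    stream=True
)

for event in stream:
    # Print text deltas as they arrive and skip other event types.
    if event.type == "response.output_text.delta":
        print(event.delta, end="", flush=True)
print()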
Open Responses is proudly built by Julep AI, the open-source platform empowering data and ML teams to rapidly create, deploy, and manage stateful AI workflows and intelligent agents at scale.
Contributing
We'd love your contributions! Open Responses is licensed under Apache-2.0.