OpenAI Responses API
The OpenAI Responses API provides methods to create, retrieve, stream, and delete OpenAI-compatible responses through Mastra agents.
These routes are agent-backed adapters over Mastra agents, memory, and storage. Use agent_id to select the Mastra agent that should handle the request. You can pass model to override the agent's configured model for a single request, or omit it to use the model already configured on the agent.
Stored responses also return conversation_id. In Mastra, this is the raw memory threadId.
This API is currently experimental.
Usage example
```typescript
import { MastraClient } from '@mastra/client-js'

const client = new MastraClient({
  baseUrl: 'http://localhost:4111',
})

const response = await client.responses.create({
  agent_id: 'support-agent',
  input: 'Summarize this ticket',
  store: true,
})

console.log(response.output_text)
```
Methods
Lifecycle
create(params)
Creates a response.
```typescript
const response = await client.responses.create({
  agent_id: 'support-agent',
  input: 'Summarize this ticket',
})
```
Returns: Promise<ResponsesResponse> when stream is omitted or false.
When stream: true, create() returns an async iterable of SSE-style event payloads:
```typescript
const stream = await client.responses.create({
  agent_id: 'support-agent',
  input: 'Summarize this ticket',
  stream: true,
})

for await (const event of stream) {
  if (event.type === 'response.output_text.delta') {
    process.stdout.write(event.delta)
  }
}
```
Returns: Promise<ResponsesStream>.
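The text deltas above can be folded into the final assistant text as they arrive. A minimal sketch, assuming only the `response.output_text.delta` event shape shown in the loop above; the event array here is a hypothetical sample, not real API output:

```typescript
// Hypothetical sample of SSE-style events, mirroring the shapes shown above.
const events: Array<{ type: string; delta?: string }> = [
  { type: 'response.created' },
  { type: 'response.output_text.delta', delta: 'Ticket ' },
  { type: 'response.output_text.delta', delta: 'summary.' },
  { type: 'response.completed' },
]

// Fold text deltas into the final assistant text, ignoring other event types.
let fullText = ''
for (const event of events) {
  if (event.type === 'response.output_text.delta' && event.delta) {
    fullText += event.delta
  }
}

console.log(fullText) // prints "Ticket summary."
```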
retrieve(responseId, requestContext?)
Retrieves a stored response.
```typescript
const response = await client.responses.retrieve('msg_123')
```
Returns: Promise<ResponsesResponse>.
delete(responseId, requestContext?)
Deletes a stored response.
```typescript
const deleted = await client.responses.delete('msg_123')
```
Returns: Promise<{ id: string; object: "response"; deleted: true }>.
stream(params)
Creates a streaming response.
```typescript
const stream = await client.responses.stream({
  agent_id: 'support-agent',
  input: 'Say hello',
})

for await (const event of stream) {
  console.log(event.type)
}
```
Returns: Promise<ResponsesStream>.
Stored responses and conversations
Stored responses include both response.id and conversation_id.
- `response.id` is the response ID. For stored agent-backed responses, this is the persisted assistant message ID.
- `conversation_id` is the raw Mastra thread ID.
Use previous_response_id when you want to continue from a previous stored response. Use conversation_id when you want to target a known thread directly.
```typescript
const first = await client.responses.create({
  agent_id: 'support-agent',
  input: 'Start a support thread',
  store: true,
})

const second = await client.responses.create({
  agent_id: 'support-agent',
  conversation_id: first.conversation_id!,
  input: 'Add a follow-up to the same thread',
  store: true,
})
```
Use client.conversations when you want to create, retrieve, delete, or inspect the underlying OpenAI Responses API conversation directly.
Function calling (tools)
response.tools contains the configured function definitions available for the request.
If the model calls a function, that activity appears in response.output as function_call and function_call_output items alongside the final assistant message.
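Tool activity can be separated from the final assistant message by filtering `response.output` on item type. A minimal sketch over a hypothetical `output` array; the item shapes are illustrative, not exact Mastra types:

```typescript
// Hypothetical response.output sample showing the three item kinds described above.
const output: Array<{ type: string; [key: string]: unknown }> = [
  { type: 'function_call', name: 'lookup_ticket', arguments: '{"id":"T-1"}' },
  { type: 'function_call_output', output: '{"status":"open"}' },
  { type: 'message', role: 'assistant', content: 'Ticket T-1 is open.' },
]

// Separate tool activity from the final assistant message.
const toolItems = output.filter(
  (item) => item.type === 'function_call' || item.type === 'function_call_output',
)
const messages = output.filter((item) => item.type === 'message')

console.log(toolItems.length) // prints 2
console.log(messages.length) // prints 1
```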
Structured output
Use text.format when you want JSON output.
- `json_object` enables JSON mode.
- `json_schema` enables schema-constrained structured output.
Mastra routes both through the agent's structured output path and still returns the JSON in the normal assistant message text output.
```typescript
const response = await client.responses.create({
  agent_id: 'support-agent',
  input: 'Return a structured support ticket summary.',
  text: {
    format: {
      type: 'json_schema',
      name: 'ticket_summary',
      schema: {
        type: 'object',
        properties: {
          summary: { type: 'string' },
          priority: { type: 'string' },
        },
        required: ['summary', 'priority'],
        additionalProperties: false,
      },
    },
  },
})
```
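Because the JSON comes back in the normal assistant text output, you can parse `response.output_text` directly. A minimal sketch, using a hypothetical `output_text` value that matches the `ticket_summary` schema above:

```typescript
// Hypothetical output_text from a schema-constrained response.
const outputText = '{"summary":"Login fails after password reset","priority":"high"}'

// The schema constrains both fields; a runtime check is still cheap insurance.
const ticket = JSON.parse(outputText) as { summary: string; priority: string }
if (typeof ticket.summary !== 'string' || typeof ticket.priority !== 'string') {
  throw new Error('Response did not match the ticket_summary schema')
}

console.log(ticket.priority) // prints "high"
```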
Provider-backed requests
Use providerOptions when you need provider-specific options that Mastra does not normalize at the Responses layer.
```typescript
const response = await client.responses.create({
  agent_id: 'support-agent',
  input: 'Continue this exchange',
  providerOptions: {
    openai: {
      previousResponseId: 'resp_123',
    },
  },
})
```
Response shape
The returned response object includes:
- `id`: The response ID
- `output`: Output items such as the assistant `message`, `function_call`, and `function_call_output`
- `output_text`: Convenience getter that joins assistant text output
- `tools`: Configured tool definitions for the request
- `conversation_id`: The raw thread ID for stored responses
- `text`: The requested text output format, when provided
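The fields above can be sketched as a TypeScript shape. This is an illustrative approximation for orientation, not the exact `ResponsesResponse` type exported by `@mastra/client-js`:

```typescript
// Illustrative approximation of the stored-response shape described above.
interface ResponseShapeSketch {
  id: string
  output: Array<{ type: 'message' | 'function_call' | 'function_call_output' }>
  output_text: string
  tools?: Array<{ type: string; name?: string }>
  conversation_id?: string
  text?: { format?: { type: string } }
}

const sample: ResponseShapeSketch = {
  id: 'msg_123',
  output: [{ type: 'message' }],
  output_text: 'Ticket T-1 is open.',
  conversation_id: 'thread_abc',
}

console.log(sample.output_text) // prints "Ticket T-1 is open."
```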