
AI SDK

Mastra leverages AI SDK's model routing (a unified interface on top of OpenAI, Anthropic, etc.), structured output, and tool calling.

We explain this in greater detail in this blog post.

Mastra + AI SDK

Mastra acts as a layer on top of AI SDK to help teams productionize their proof-of-concepts quickly and easily.

[Image: Agent interaction trace showing spans, LLM calls, and tool executions]

Model routing

When creating agents in Mastra, you can specify any AI SDK-supported model:

```typescript
import { openai } from "@ai-sdk/openai";
import { Agent } from "@mastra/core/agent";

const agent = new Agent({
  name: "WeatherAgent",
  instructions: "Instructions for the agent...",
  model: openai("gpt-4-turbo"), // Model comes directly from AI SDK
});

const result = await agent.generate("What is the weather like?");
```

AI SDK Hooks

Mastra is compatible with AI SDK’s hooks for seamless frontend integration:

useChat

The useChat hook enables real-time chat interactions in your frontend application.

  • Works with agent data streams, i.e. .toDataStreamResponse()
  • The useChat api option defaults to /api/chat
  • Works with the Mastra REST API agent stream endpoint {MASTRA_BASE_URL}/agents/:agentId/stream for data streams, i.e. when no structured output is defined.
app/api/chat/route.ts
```typescript
import { mastra } from "@/src/mastra";

export async function POST(req: Request) {
  const { messages } = await req.json();
  const myAgent = mastra.getAgent("weatherAgent");
  const stream = await myAgent.stream(messages);

  return stream.toDataStreamResponse();
}
```
```tsx
import { useChat } from '@ai-sdk/react';

export function ChatComponent() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/path-to-your-agent-stream-api-endpoint',
  });

  return (
    <div>
      {messages.map(m => (
        <div key={m.id}>
          {m.role}: {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Say something..."
        />
      </form>
    </div>
  );
}
```

Gotcha: When using useChat with agent memory functionality, make sure to check out the Agent Memory section for important implementation details.
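As a sketch of one such detail (the option names here are illustrative; consult the Agent Memory docs for the exact options your Mastra version expects), a route can forward thread and resource identifiers so the agent's memory supplies prior context instead of relying on the client's message list:

```typescript
import { mastra } from "@/src/mastra";

export async function POST(req: Request) {
  // threadId and resourceId are hypothetical body fields sent by the client
  const { messages, threadId, resourceId } = await req.json();
  const myAgent = mastra.getAgent("weatherAgent");

  // Passing thread context lets agent memory load earlier turns server-side
  const stream = await myAgent.stream(messages, { threadId, resourceId });

  return stream.toDataStreamResponse();
}
```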

useCompletion

For single-turn completions, use the useCompletion hook:

  • Works with agent data streams, i.e. .toDataStreamResponse()
  • The useCompletion api option defaults to /api/completion
  • Works with the Mastra REST API agent stream endpoint {MASTRA_BASE_URL}/agents/:agentId/stream for data streams, i.e. when no structured output is defined.
app/api/completion/route.ts
```typescript
import { mastra } from "@/src/mastra";

export async function POST(req: Request) {
  // useCompletion sends a single `prompt` field rather than a message list
  const { prompt } = await req.json();
  const myAgent = mastra.getAgent("weatherAgent");
  const stream = await myAgent.stream([{ role: "user", content: prompt }]);

  return stream.toDataStreamResponse();
}
```
```tsx
import { useCompletion } from "@ai-sdk/react";

export function CompletionComponent() {
  const { completion, input, handleInputChange, handleSubmit } = useCompletion({
    api: '/path-to-your-agent-stream-api-endpoint',
  });

  return (
    <div>
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Enter a prompt..."
        />
      </form>
      <p>Completion result: {completion}</p>
    </div>
  );
}
```

useObject

The useObject hook consumes text streams that represent JSON objects and parses them into a complete object based on a schema.

  • Works with agent text streams, i.e. .toTextStreamResponse()
  • The useObject api option defaults to /api/completion
  • Works with the Mastra REST API agent stream endpoint {MASTRA_BASE_URL}/agents/:agentId/stream for text streams, i.e. when structured output is defined.
app/api/use-object/route.ts
```typescript
import { z } from "zod";
import { mastra } from "@/src/mastra";

export async function POST(req: Request) {
  const { messages } = await req.json();
  const myAgent = mastra.getAgent("weatherAgent");
  const stream = await myAgent.stream(messages, {
    output: z.object({
      weather: z.string(),
    }),
  });

  return stream.toTextStreamResponse();
}
```
```tsx
import { experimental_useObject as useObject } from '@ai-sdk/react';
import { z } from 'zod';

export default function Page() {
  const { object, submit } = useObject({
    api: '/api/use-object',
    schema: z.object({
      weather: z.string(),
    }),
  });

  return (
    <div>
      <button onClick={() => submit('example input')}>Generate</button>
      {object?.weather && <p>{object.weather}</p>}
    </div>
  );
}
```

Tool Calling

AI SDK Tool Format

Mastra supports tools created using the AI SDK format, so you can use them directly with Mastra agents. See our tools doc on Vercel AI SDK Tool Format for more details.
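As an illustration of that compatibility (the tool name, schema, and hard-coded result below are placeholders, not a real weather source), a tool written with AI SDK's tool helper can be passed straight into a Mastra agent's tools map:

```typescript
import { tool } from "ai";
import { z } from "zod";
import { openai } from "@ai-sdk/openai";
import { Agent } from "@mastra/core/agent";

// A tool in the AI SDK format: description, zod parameters, execute function
const weatherTool = tool({
  description: "Fetch the current weather for a city",
  parameters: z.object({
    city: z.string().describe("The city to look up"),
  }),
  execute: async ({ city }) => {
    // Placeholder result; swap in a real weather API call here
    return { city, temperatureC: 21 };
  },
});

// The same tool object is accepted directly by a Mastra agent
export const weatherAgent = new Agent({
  name: "Weather Agent",
  instructions: "Use weatherTool to answer weather questions.",
  model: openai("gpt-4o"),
  tools: { weatherTool },
});
```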

Client-side tool calling

Mastra leverages AI SDK's tool calling, so what applies in AI SDK still applies here. Agent tools in Mastra are 100% compatible with AI SDK tools.

Mastra tools also expose an optional execute async function. It is optional because you might want to forward tool calls to the client or to a queue instead of executing them in the same process.

One way to leverage client-side tool calling is to use the @ai-sdk/react useChat hook's onToolCall property for client-side tool execution.
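A minimal sketch of that pattern, assuming the server registers a tool without an execute function so the call is forwarded to the browser (getLocation is a hypothetical tool name used for illustration):

```tsx
import { useChat } from '@ai-sdk/react';

export function ChatWithClientTools() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat',
    // Invoked when the agent calls a tool that has no server-side execute
    async onToolCall({ toolCall }) {
      if (toolCall.toolName === 'getLocation') {
        // The returned value is sent back to the model as the tool result
        return { city: 'Berlin' };
      }
    },
  });

  return (
    <form onSubmit={handleSubmit}>
      <input value={input} onChange={handleInputChange} />
    </form>
  );
}
```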

Custom DataStream

In certain scenarios you need to write custom data or message annotations to an agent's dataStream. This can be useful for:

  • Streaming additional data to the client
  • Passing progress info back to the client in real time

Mastra integrates well with AI SDK to make this possible.

CreateDataStream

The createDataStream function allows you to stream additional data to the client.

```typescript
import { createDataStream } from "ai";
import { openai } from "@ai-sdk/openai";
import { Agent } from "@mastra/core/agent";
import { weatherTool } from "../tools"; // adjust to wherever your tool lives

export const weatherAgent = new Agent({
  name: 'Weather Agent',
  instructions: `
    You are a helpful weather assistant that provides accurate weather information.

    Your primary function is to help users get weather details for specific locations. When responding:
    - Always ask for a location if none is provided
    - If the location name isn't in English, please translate it
    - If giving a location with multiple parts (e.g. "New York, NY"), use the most relevant part (e.g. "New York")
    - Include relevant details like humidity, wind conditions, and precipitation
    - Keep responses concise but informative

    Use the weatherTool to fetch current weather data.
  `,
  model: openai('gpt-4o'),
  tools: { weatherTool },
});

const stream = createDataStream({
  async execute(dataStream) {
    // Write data
    dataStream.writeData({ value: 'Hello' });

    // Write annotation
    dataStream.writeMessageAnnotation({ type: 'status', value: 'processing' });

    // Mastra agent stream
    const agentStream = await weatherAgent.stream('What is the weather');

    // Merge agent stream
    agentStream.mergeIntoDataStream(dataStream);
  },
  onError: error => `Custom error: ${error.message}`,
});
```

CreateDataStreamResponse

The createDataStreamResponse function creates a Response object that streams data to the client.

app/api/chat/route.ts
```typescript
import { createDataStreamResponse } from "ai";
import { mastra } from "@/src/mastra";

export async function POST(req: Request) {
  const { messages } = await req.json();
  const myAgent = mastra.getAgent("weatherAgent");

  // Mastra agent stream
  const agentStream = await myAgent.stream(messages);

  const response = createDataStreamResponse({
    status: 200,
    statusText: 'OK',
    headers: {
      'Custom-Header': 'value',
    },
    async execute(dataStream) {
      // Write data
      dataStream.writeData({ value: 'Hello' });

      // Write annotation
      dataStream.writeMessageAnnotation({ type: 'status', value: 'processing' });

      // Merge agent stream
      agentStream.mergeIntoDataStream(dataStream);
    },
    onError: error => `Custom error: ${error.message}`,
  });

  return response;
}
```