# Integrate Mastra in your Next.js project

In this guide, you'll build a tool-calling AI agent using Mastra, then connect it to Next.js by importing and calling the agent directly from your routes. You'll use [AI SDK UI](https://ai-sdk.dev/docs/ai-sdk-ui/overview) and [AI Elements](https://ai-sdk.dev/elements) to create a beautiful, interactive chat experience.

![Screenshot of a chat-style web app displaying a completed "weatherTool" tool call, answering "What is the weather in London?" with a JSON result. A message suggests offering activity ideas, and a text input field is at the bottom.](/assets/images/nextjs-quickstart-11fce4f78d172367bb97a14f132d701f.png)

What you'll build: an agent that can call a weather tool, display the JSON result, stream a weather summary in the chat UI, and persist conversation history across reloads.

## Before you begin

- You'll need an API key from a supported [model provider](https://mastra.ai/models). If you don't have a preference, use [OpenAI](https://mastra.ai/models/providers/openai).
- Install Node.js `v22.13.0` or later.

## Create a new Next.js app (optional)

If you already have a Next.js app, skip to the next step.
Run the following command to [create a new Next.js app](https://nextjs.org/docs/app/getting-started/installation):

**npm**:

```bash
npx create-next-app@latest my-nextjs-agent --yes --ts --eslint --tailwind --src-dir --app --turbopack --no-react-compiler --no-import-alias
```

**pnpm**:

```bash
pnpm dlx create-next-app@latest my-nextjs-agent --yes --ts --eslint --tailwind --src-dir --app --turbopack --no-react-compiler --no-import-alias
```

**Yarn**:

```bash
yarn dlx create-next-app@latest my-nextjs-agent --yes --ts --eslint --tailwind --src-dir --app --turbopack --no-react-compiler --no-import-alias
```

**Bun**:

```bash
bun x create-next-app@latest my-nextjs-agent --yes --ts --eslint --tailwind --src-dir --app --turbopack --no-react-compiler --no-import-alias
```

This creates a project called `my-nextjs-agent`, but you can replace it with any name you want.

## Initialize Mastra

Navigate to your Next.js project:

```bash
cd my-nextjs-agent
```

Run [`mastra init`](https://mastra.ai/reference/cli/mastra). When prompted, choose a provider (e.g. OpenAI) and enter your key:

**npm**:

```bash
npx --force mastra@latest init
```

**pnpm**:

```bash
pnpm dlx --force mastra@latest init
```

**Yarn**:

```bash
yarn dlx --force mastra@latest init
```

**Bun**:

```bash
bun x --force mastra@latest init
```

This creates a `src/mastra` folder with an example weather agent and the following files:

- `index.ts` - Mastra config, including memory
- `tools/weather-tool.ts` - a tool to fetch weather for a given location
- `agents/weather-agent.ts` - a weather agent with a prompt that uses the tool

You'll call `weather-agent.ts` from your Next.js routes in the next steps.
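To give a feel for what the generated tool does, here is a rough, hypothetical sketch of one step it performs: translating numeric WMO weather codes from the weather API into readable conditions. The names and the code table below are illustrative, not the exact generated code.

```typescript
// Hypothetical helper: map WMO weather codes to human-readable conditions,
// similar in spirit to what the generated weather-tool.ts does with API results.
const WEATHER_CONDITIONS: Record<number, string> = {
  0: 'Clear sky',
  1: 'Mainly clear',
  2: 'Partly cloudy',
  3: 'Overcast',
  61: 'Slight rain',
  95: 'Thunderstorm',
};

function describeWeatherCode(code: number): string {
  // Fall back to a generic label for codes we don't recognize.
  return WEATHER_CONDITIONS[code] ?? 'Unknown';
}

console.log(describeWeatherCode(0));  // → Clear sky
console.log(describeWeatherCode(42)); // → Unknown
```

Returning a structured, human-readable condition (rather than a raw code) is what lets the agent quote the tool's JSON output directly in its answers.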
## Install AI SDK UI & AI Elements

Install AI SDK UI along with the Mastra adapter:

**npm**:

```bash
npm install @mastra/ai-sdk@latest @ai-sdk/react ai
```

**pnpm**:

```bash
pnpm add @mastra/ai-sdk@latest @ai-sdk/react ai
```

**Yarn**:

```bash
yarn add @mastra/ai-sdk@latest @ai-sdk/react ai
```

**Bun**:

```bash
bun add @mastra/ai-sdk@latest @ai-sdk/react ai
```

Next, initialize AI Elements. When prompted, choose the default options:

**npm**:

```bash
npx ai-elements@latest
```

**pnpm**:

```bash
pnpm dlx ai-elements@latest
```

**Yarn**:

```bash
yarn dlx ai-elements@latest
```

**Bun**:

```bash
bun x ai-elements@latest
```

This downloads the entire AI Elements UI component library into a `@/components/ai-elements` folder.

## Create a chat route

Create `src/app/api/chat/route.ts`:

```ts
import { handleChatStream } from '@mastra/ai-sdk';
import { toAISdkV5Messages } from '@mastra/ai-sdk/ui';
import { createUIMessageStreamResponse } from 'ai';
import { NextResponse } from 'next/server';
import { mastra } from '@/mastra';

const THREAD_ID = 'example-user-id';
const RESOURCE_ID = 'weather-chat';

export async function POST(req: Request) {
  const params = await req.json();

  const stream = await handleChatStream({
    mastra,
    agentId: 'weather-agent',
    params: {
      ...params,
      memory: {
        ...params.memory,
        thread: THREAD_ID,
        resource: RESOURCE_ID,
      },
    },
  });

  return createUIMessageStreamResponse({ stream });
}

export async function GET() {
  const memory = await mastra.getAgentById('weather-agent').getMemory();

  let response = null;
  try {
    response = await memory?.recall({
      threadId: THREAD_ID,
      resourceId: RESOURCE_ID,
    });
  } catch {
    console.log('No previous messages found.');
  }

  const uiMessages = toAISdkV5Messages(response?.messages || []);
  return NextResponse.json(uiMessages);
}
```

The `POST` route accepts a prompt and streams the agent's response back in AI SDK format, while the `GET` route fetches message history from memory so the UI can be hydrated when the client reloads.
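The messages the `GET` route returns follow the AI SDK UI message shape that the chat page will consume: each message carries a `role` and a `parts` array, where plain text parts have `type: 'text'` and tool invocations have a `type` prefixed with `tool-`. A minimal sketch of that shape, with simplified types and illustrative data:

```typescript
// Simplified sketch of the AI SDK UI message shape the chat page consumes.
// Real UIMessage types from the `ai` package carry more fields (state, ids, etc.).
type TextPart = { type: 'text'; text: string };
type ToolPart = { type: `tool-${string}`; input?: unknown; output?: unknown };
type Part = TextPart | ToolPart;

interface UIMessageLike {
  id: string;
  role: 'user' | 'assistant';
  parts: Part[];
}

// The chat page branches on part.type; a type guard makes that explicit.
function isToolPart(part: Part): part is ToolPart {
  return part.type.startsWith('tool-');
}

const example: UIMessageLike = {
  id: 'msg-1',
  role: 'assistant',
  parts: [
    { type: 'tool-weatherTool', input: { location: 'London' }, output: { tempC: 12 } },
    { type: 'text', text: 'It is 12°C in London.' },
  ],
};

const toolCalls = example.parts.filter(isToolPart).length;
console.log(toolCalls); // → 1
```

Because a single assistant message can mix tool parts and text parts like this, the UI in the next step renders them per-part rather than per-message.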
## Create a chat page

Create `src/app/chat/page.tsx`:

```tsx
'use client';

import '@/app/globals.css';
import { useEffect, useState } from 'react';
import { DefaultChatTransport, ToolUIPart } from 'ai';
import { useChat } from '@ai-sdk/react';
import {
  PromptInput,
  PromptInputBody,
  PromptInputTextarea,
} from '@/components/ai-elements/prompt-input';
import {
  Conversation,
  ConversationContent,
  ConversationScrollButton,
} from '@/components/ai-elements/conversation';
import { Message, MessageContent, MessageResponse } from '@/components/ai-elements/message';
import {
  Tool,
  ToolHeader,
  ToolContent,
  ToolInput,
  ToolOutput,
} from '@/components/ai-elements/tool';

function Chat() {
  const [input, setInput] = useState('');

  const { messages, setMessages, sendMessage, status } = useChat({
    transport: new DefaultChatTransport({
      api: '/api/chat',
    }),
  });

  // Hydrate the conversation from memory on first render.
  useEffect(() => {
    const fetchMessages = async () => {
      const res = await fetch('/api/chat');
      const data = await res.json();
      setMessages([...data]);
    };
    fetchMessages();
  }, [setMessages]);

  const handleSubmit = async () => {
    if (!input.trim()) return;
    sendMessage({ text: input });
    setInput('');
  };

  return (
    <div className="mx-auto flex h-screen max-w-3xl flex-col p-4">
      <Conversation>
        <ConversationContent>
          {messages.map((message) => (
            <Message from={message.role} key={message.id}>
              <MessageContent>
                {message.parts?.map((part, i) => {
                  if (part.type === 'text') {
                    return (
                      <MessageResponse key={`${message.id}-${i}`}>
                        {part.text}
                      </MessageResponse>
                    );
                  }
                  if (part.type?.startsWith('tool-')) {
                    const toolPart = part as ToolUIPart;
                    return (
                      <Tool key={`${message.id}-${i}`}>
                        <ToolHeader type={toolPart.type} state={toolPart.state} />
                        <ToolContent>
                          <ToolInput input={toolPart.input} />
                          <ToolOutput
                            output={toolPart.output}
                            errorText={toolPart.errorText}
                          />
                        </ToolContent>
                      </Tool>
                    );
                  }
                  return null;
                })}
              </MessageContent>
            </Message>
          ))}
        </ConversationContent>
        <ConversationScrollButton />
      </Conversation>
      <PromptInput onSubmit={handleSubmit}>
        <PromptInputBody>
          <PromptInputTextarea
            onChange={(e) => setInput(e.target.value)}
            className="md:leading-10"
            value={input}
            placeholder="Type your message..."
            disabled={status !== 'ready'}
          />
        </PromptInputBody>
      </PromptInput>
    </div>
  );
}

export default Chat;
```

This component connects [`useChat()`](https://ai-sdk.dev/docs/reference/ai-sdk-ui/use-chat) to the `/api/chat` endpoint, sending prompts there and streaming the response back in chunks. It renders the response text using the [`<MessageResponse>`](https://ai-sdk.dev/elements/components/message#messageresponse-) component, and shows any tool invocations with the [`<Tool>`](https://ai-sdk.dev/elements/components/tool) component.

## Test your agent

1. Run your Next.js app with `npm run dev`
2. Open the chat at http://localhost:3000/chat
3. Try asking about the weather. If your API key is set up correctly, you'll get a response.

## Next steps

Congratulations on building your Mastra agent with Next.js! 🎉 From here, you can extend the project with your own tools and logic:

- Learn more about [agents](https://mastra.ai/docs/agents/overview)
- Give your agent its own [tools](https://mastra.ai/docs/agents/using-tools)
- Add human-like [memory](https://mastra.ai/docs/agents/agent-memory) to your agent

When you're ready, read more about how Mastra integrates with AI SDK UI and Next.js, and how to deploy your agent anywhere, including Vercel:

- Integrate Mastra with [AI SDK UI](https://mastra.ai/guides/build-your-ui/ai-sdk-ui)
- Deploy your agent to [Vercel](https://mastra.ai/guides/deployment/vercel-deployer)
- Deploy your agent [anywhere](https://mastra.ai/docs/deployment/overview)