
Next.js Quickstart

In this quickstart, you'll build an AI agent with tool-calling capabilities using Mastra, then connect it to Next.js by importing and calling the agent directly within your routes.

You'll use AI SDK UI and AI Elements to create a beautiful, interactive chat experience.

In ~5 minutes, you'll have a functional AI-powered app ready to extend with your own tools and logic.

[Screenshot: a chat-style web app showing a completed weatherTool call answering "What is the weather in London?" with its JSON result, a streamed follow-up suggesting activity ideas, and a text input at the bottom.]

What you'll build: an agent that can call a weather tool, display the JSON result, stream a weather summary in the chat UI, and persist conversation history across reloads.

Before you begin

  • You'll need an API key from a supported model provider. If you don't have a preference, use OpenAI.
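If you choose OpenAI, the key is read from the OPENAI_API_KEY environment variable. The mastra init step below will prompt you for it, but you can also set it yourself in a .env file at the project root (the filename and variable name here assume OpenAI and Next.js's built-in env loading):

.env
OPENAI_API_KEY=your-api-key-here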

Create a new Next.js app

Run the following command to create a new Next.js app:

npx create-next-app \
  my-nextjs-agent \
  --yes \
  --ts \
  --eslint \
  --tailwind \
  --src-dir \
  --app \
  --turbopack \
  --no-react-compiler \
  --no-import-alias

This creates a project called my-nextjs-agent, but you can replace it with any name you want.

Initialize Mastra

cd into the project and run mastra init.

When prompted, choose a provider (e.g. OpenAI) and enter your key:

cd my-nextjs-agent
npx --force mastra@latest init

This creates a src/mastra folder with an example weather agent and the following files:

  • index.ts - Mastra config, including memory
  • tools/weather-tool.ts - a tool to fetch weather for a given location
  • agents/weather-agent.ts - a weather agent with a prompt that uses the tool

You'll call weather-agent.ts from your Next.js routes in the next steps.
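The exact contents of these files depend on the Mastra version you install, but the detail that matters for the next steps is that the weather agent is registered on the Mastra instance under the key weatherAgent. Here's a simplified sketch of src/mastra/index.ts (the generated file will likely also configure storage, memory, and logging):

src/mastra/index.ts (simplified)
import { Mastra } from "@mastra/core/mastra";
import { weatherAgent } from "./agents/weather-agent";

export const mastra = new Mastra({
  // Registered under the key "weatherAgent"; this is what
  // mastra.getAgent("weatherAgent") looks up in the chat route later.
  agents: { weatherAgent },
});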

Install AI SDK UI and AI Elements

Install AI SDK UI along with the Mastra adapter:

npm install \
  @mastra/ai-sdk \
  @ai-sdk/react

Next, initialize AI Elements. When prompted, choose the default options:

npx ai-elements@latest

This downloads the full AI Elements UI component library into the @/components/ai-elements folder.

Create a chat route

Create src/app/api/chat/route.ts:

src/app/api/chat/route.ts
import { mastra } from "@/mastra";
import { NextResponse } from "next/server";
import { toAISdkFormat } from "@mastra/ai-sdk";
import { convertMessages } from "@mastra/core/agent";
import { createUIMessageStreamResponse } from "ai";

const weatherAgent = mastra.getAgent("weatherAgent");

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Stream the agent's response, persisting the exchange to memory.
  const stream = await weatherAgent.stream(messages, {
    memory: {
      thread: "example-user-id",
      resource: "weather-chat",
    },
  });

  // Convert Mastra's stream into the AI SDK UI message stream format.
  return createUIMessageStreamResponse({
    stream: toAISdkFormat(stream, { from: "agent" }),
  });
}

export async function GET() {
  // Read the stored conversation back out of memory for the same thread.
  const memory = await weatherAgent.getMemory();
  const response = await memory?.query({
    threadId: "example-user-id",
    resourceId: "weather-chat",
  });

  const uiMessages = convertMessages(response?.uiMessages ?? []).to("AIV5.UI");
  return NextResponse.json(uiMessages);
}

The POST route accepts the chat messages and streams the agent's response back in AI SDK format, while the GET route fetches message history from memory so the UI can be rehydrated when the client reloads.
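Once the dev server is running and you've sent a message or two from the UI, you can sanity-check the GET route from the command line; it should return the stored UI messages as JSON (the exact message shape depends on your AI SDK version):

curl http://localhost:3000/api/chat
# Returns a JSON array of UI messages for the "example-user-id" thread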

Create a chat page

Create src/app/chat/page.tsx:

src/app/chat/page.tsx
'use client';

import '@/app/globals.css';
import { useEffect, useState } from 'react';
import { DefaultChatTransport } from 'ai';
import { useChat } from '@ai-sdk/react';

import {
  PromptInput,
  PromptInputBody,
  PromptInputTextarea,
} from '@/components/ai-elements/prompt-input';

import {
  Conversation,
  ConversationContent,
  ConversationScrollButton,
} from '@/components/ai-elements/conversation';

import {
  Message,
  MessageContent,
  MessageResponse,
} from '@/components/ai-elements/message';

import {
  Tool,
  ToolHeader,
  ToolContent,
  ToolInput,
  ToolOutput,
} from '@/components/ai-elements/tool';

function Chat() {
  const [input, setInput] = useState<string>('');

  const { messages, setMessages, sendMessage, status } = useChat({
    transport: new DefaultChatTransport({
      api: '/api/chat',
    }),
  });

  // Hydrate the conversation with any history stored in memory.
  useEffect(() => {
    const fetchMessages = async () => {
      const res = await fetch('/api/chat');
      const data = await res.json();
      setMessages([...data]);
    };
    fetchMessages();
  }, [setMessages]);

  const handleSubmit = async () => {
    if (!input.trim()) return;

    sendMessage({ text: input });
    setInput('');
  };

  return (
    <div className="w-full p-6 relative size-full h-screen">
      <div className="flex flex-col h-full">
        <Conversation className="h-full">
          <ConversationContent>
            {messages.map((message: any) => (
              <div key={message.id}>
                {message.parts?.map((part: any, i: number) => {
                  // Plain text parts render as chat bubbles.
                  if (part.type === 'text') {
                    return (
                      <Message key={`${message.id}-${i}`} from={message.role}>
                        <MessageContent>
                          <MessageResponse>{part.text}</MessageResponse>
                        </MessageContent>
                      </Message>
                    );
                  }

                  // Tool parts (e.g. "tool-weatherTool") render as collapsible tool cards.
                  if (part.type?.startsWith('tool-')) {
                    return (
                      <Tool key={`${message.id}-${i}`}>
                        <ToolHeader
                          type={part.type}
                          state={part.state || 'output-available'}
                          className="cursor-pointer"
                        />
                        <ToolContent>
                          <ToolInput input={part.input || part.args || {}} />
                          <ToolOutput
                            output={part.output || part.result}
                            errorText={part.errorText || part.error}
                          />
                        </ToolContent>
                      </Tool>
                    );
                  }

                  return null;
                })}
              </div>
            ))}
            <ConversationScrollButton />
          </ConversationContent>
        </Conversation>

        <PromptInput onSubmit={handleSubmit} className="mt-20">
          <PromptInputBody>
            <PromptInputTextarea
              onChange={(e: any) => setInput(e.target.value)}
              className="md:leading-10"
              value={input}
              placeholder="Type your message..."
              disabled={status !== 'ready'}
            />
          </PromptInputBody>
        </PromptInput>
      </div>
    </div>
  );
}

export default Chat;

This component connects useChat() to the /api/chat endpoint, sending each prompt there and streaming the response back in chunks. On mount, it also fetches stored history from the same endpoint so the conversation survives a page reload.

It renders response text with the <MessageResponse> component and shows any tool invocations with the <Tool> component.
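To make the branching above concrete, here is roughly what an assistant message's parts array can look like after a tool call. This is an illustrative sketch only; exact part types, states, and field names vary with the AI SDK version, which is why the code reads both part.input/part.args and part.output/part.result:

// Illustrative only: shapes vary by AI SDK version.
const exampleAssistantMessage = {
  id: 'msg_1',
  role: 'assistant',
  parts: [
    {
      type: 'tool-weatherTool',   // rendered by the <Tool> branch
      state: 'output-available',
      input: { location: 'London' },
      output: { temperature: 12, conditions: 'Cloudy' },
    },
    {
      type: 'text',               // rendered by the <Message> branch
      text: "It's about 12°C and cloudy in London right now.",
    },
  ],
};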

Test your agent

  1. Run your Next.js app with npm run dev

  2. Open the chat at http://localhost:3000/chat

  3. Try asking about the weather. If your API key is set up correctly, you'll get a response.

Next steps

Congratulations on building your Mastra agent with Next.js! 🎉

From here, you can extend the project with your own tools and logic:

  • Learn more about agents
  • Give your agent its own tools
  • Add human-like memory to your agent

When you're ready, read more about how Mastra integrates with AI SDK and Next.js, and how to deploy to Vercel.