# Integrate Mastra in your Nuxt project
In this guide, you'll build a tool-calling AI agent using Mastra, then connect it to Nuxt by importing and calling the agent directly from your server routes.
You'll use [AI SDK UI](https://ai-sdk.dev/docs/ai-sdk-ui/overview) to create a beautiful, interactive chat experience with Vue.
## Before you begin
- You'll need an API key from a supported [model provider](https://mastra.ai/models). If you don't have a preference, use [OpenAI](https://mastra.ai/models/providers/openai).
- Install Node.js `v22.13.0` or later
## Create a new Nuxt app (optional)
If you already have a Nuxt app, skip to the next step.
Run the following command to [create a new Nuxt app](https://nuxt.com/docs/getting-started/installation):
**npm**:
```bash
npm create nuxt@latest mastra-nuxt -- --template minimal --packageManager npm --gitInit --modules
```
**pnpm**:
```bash
pnpm create nuxt mastra-nuxt --template minimal --packageManager pnpm --gitInit --modules
```
**Yarn**:
```bash
yarn create nuxt mastra-nuxt --template minimal --packageManager yarn --gitInit --modules
```
**Bun**:
```bash
bunx create-nuxt mastra-nuxt --template minimal --packageManager bun --gitInit --modules
```
This creates a project called `mastra-nuxt`, but you can replace it with any name you want.
## Initialize Mastra
Navigate to your Nuxt project:
```bash
cd mastra-nuxt
```
Run [`mastra init`](https://mastra.ai/reference/cli/mastra). When prompted, choose a provider (e.g. OpenAI) and enter your key:
**npm**:
```bash
npx mastra@latest init
```
**pnpm**:
```bash
pnpm dlx mastra@latest init
```
**Yarn**:
```bash
yarn dlx mastra@latest init
```
**Bun**:
```bash
bun x mastra@latest init
```
This creates a `mastra` folder with an example weather agent and the following files:
- `index.ts` - Mastra config, including memory
- `tools/weather-tool.ts` - a tool to fetch weather for a given location
- `agents/weather-agent.ts` - a weather agent with a prompt that uses the tool
You'll call `weather-agent.ts` from your Nuxt server routes in the next steps.
## Install AI SDK UI
Install AI SDK UI along with the Mastra adapter:
**npm**:
```bash
npm install @mastra/ai-sdk@latest @ai-sdk/vue ai
```
**pnpm**:
```bash
pnpm add @mastra/ai-sdk@latest @ai-sdk/vue ai
```
**Yarn**:
```bash
yarn add @mastra/ai-sdk@latest @ai-sdk/vue ai
```
**Bun**:
```bash
bun add @mastra/ai-sdk@latest @ai-sdk/vue ai
```
## Create a chat route
Create `server/api/chat.ts`:
```ts
import { handleChatStream } from '@mastra/ai-sdk';
import { toAISdkV5Messages } from '@mastra/ai-sdk/ui';
import { createUIMessageStreamResponse } from 'ai';
import { mastra } from '../../src/mastra';

// The thread identifies the conversation; the resource identifies the user.
const THREAD_ID = 'weather-chat';
const RESOURCE_ID = 'example-user-id';

export default defineEventHandler(async (event) => {
  const method = event.method;

  if (method === 'POST') {
    const params = await readBody(event);

    const stream = await handleChatStream({
      mastra,
      agentId: 'weather-agent',
      params: {
        ...params,
        memory: {
          ...params.memory,
          thread: THREAD_ID,
          resource: RESOURCE_ID,
        },
      },
    });

    return createUIMessageStreamResponse({ stream });
  }

  if (method === 'GET') {
    const memory = await mastra.getAgentById('weather-agent').getMemory();

    let response = null;
    try {
      response = await memory?.recall({
        threadId: THREAD_ID,
        resourceId: RESOURCE_ID,
      });
    } catch {
      console.log('No previous messages found.');
    }

    const uiMessages = toAISdkV5Messages(response?.messages || []);
    return uiMessages;
  }
});
```
The `POST` handler accepts a prompt and streams the agent's response back in AI SDK format, while the `GET` handler fetches message history from memory so the UI can be hydrated when the client reloads.
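The handler pins every request to a fixed thread and resource by merging them over whatever the client sent. That merge can be isolated as a small pure function; the `ChatParams` shape and the `withPinnedMemory` helper below are illustrative, not part of `@mastra/ai-sdk`:

```typescript
// Hypothetical helper mirroring the POST handler's memory merge.
type ChatParams = { memory?: Record<string, unknown> } & Record<string, unknown>;

function withPinnedMemory(
  params: ChatParams,
  thread: string,
  resource: string,
): ChatParams {
  return {
    ...params,
    // Client-supplied memory options survive, but the server-side
    // thread and resource ids always take precedence.
    memory: { ...params.memory, thread, resource },
  };
}

const merged = withPinnedMemory(
  { messages: [], memory: { thread: 'client-chosen', semanticRecall: true } },
  'thread-1',
  'user-1',
);
console.log(merged.memory); // thread/resource come from the server constants
```

Spreading the client's `memory` object first keeps any other memory options intact while preventing a client from steering the conversation into someone else's thread.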
## Add the chat UI
Replace the contents of `app/app.vue` with a chat component along the following lines. This is a minimal sketch: the markup and the `STATE_TO_LABEL_MAP` labels are illustrative, and the `Chat` transport wiring assumes the `@ai-sdk/vue` API from AI SDK v5.

```vue
<script setup lang="ts">
import { ref } from 'vue';
import { Chat } from '@ai-sdk/vue';
import { DefaultChatTransport } from 'ai';
import type { ToolUIPart } from 'ai';

// Human-readable labels for each tool-call state.
const STATE_TO_LABEL_MAP: Record<string, string> = {
  'input-streaming': 'Pending',
  'input-available': 'Running',
  'output-available': 'Success',
  'output-error': 'Failed',
};

const chat = new Chat({
  transport: new DefaultChatTransport({ api: '/api/chat' }),
});

const input = ref('');

function handleSubmit(e: Event) {
  e.preventDefault();
  chat.sendMessage({ text: input.value });
  input.value = '';
}
</script>

<template>
  <div>
    <div v-for="message in chat.messages" :key="message.id">
      <strong>{{ message.role }}</strong>
      <template v-for="(part, i) in message.parts" :key="i">
        <p v-if="part.type === 'text'">{{ part.text }}</p>
        <details v-else-if="part.type.startsWith('tool-')">
          <summary>
            {{ (part as ToolUIPart).type?.split('-').slice(1).join('-') }} -
            {{ STATE_TO_LABEL_MAP[(part as ToolUIPart).state ?? 'output-available'] }}
          </summary>
          <h4>Parameters</h4>
          <pre>{{ JSON.stringify((part as ToolUIPart).input, null, 2) }}</pre>
          <h4>{{ (part as ToolUIPart).errorText ? 'Error' : 'Result' }}</h4>
          <pre v-if="(part as ToolUIPart).output">{{ JSON.stringify((part as ToolUIPart).output, null, 2) }}</pre>
          <pre v-if="(part as ToolUIPart).errorText">{{ (part as ToolUIPart).errorText }}</pre>
        </details>
      </template>
    </div>
    <form @submit="handleSubmit">
      <input v-model="input" placeholder="Ask about the weather" />
      <button type="submit">Send</button>
    </form>
  </div>
</template>
```
This component connects [`Chat()`](https://ai-sdk.dev/docs/reference/ai-sdk-ui/use-chat) to the `/api/chat` endpoint, sending prompts there and streaming the response back in chunks.
It renders the response text using custom message styling and shows any tool invocations in a collapsible details element.
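The tool name shown alongside each invocation is derived from the part's `type`, which AI SDK encodes as `tool-<name>`. The derivation can be sketched as a small helper (`toolNameFromPartType` is illustrative, not a library export):

```typescript
// Illustrative helper: recover the tool name from an AI SDK tool part type.
// Tool parts have types like 'tool-weatherTool'; dropping the leading
// 'tool-' segment leaves the tool's own name, internal hyphens included.
function toolNameFromPartType(type: string): string {
  return type.split('-').slice(1).join('-');
}

console.log(toolNameFromPartType('tool-weatherTool')); // weatherTool
console.log(toolNameFromPartType('tool-get-weather')); // get-weather
```

Joining with `'-'` after slicing is what keeps multi-hyphen tool names like `get-weather` intact.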
## Test your agent
1. Run your Nuxt app with `npm run dev`
2. Open the chat at `http://localhost:3000`
3. Try asking about the weather. If your API key is set up correctly, you'll get a response
## Next steps
Congratulations on building your Mastra agent with Nuxt! 🎉
From here, you can extend the project with your own tools and logic:
- Learn more about [agents](https://mastra.ai/docs/agents/overview)
- Give your agent its own [tools](https://mastra.ai/docs/agents/using-tools)
- Add human-like [memory](https://mastra.ai/docs/agents/agent-memory) to your agent
When you're ready, read more about how Mastra integrates with AI SDK UI and Nuxt, and how to deploy your agent anywhere:
- Integrate Mastra with [AI SDK UI](https://mastra.ai/guides/build-your-ui/ai-sdk-ui)
- Deploy your agent [anywhere](https://mastra.ai/docs/deployment/overview)