Using Assistant UI
Assistant UI is the TypeScript/React library for AI Chat. Built on shadcn/ui and Tailwind CSS, it enables developers to create beautiful, enterprise-grade chat experiences in minutes.
For a full-stack integration approach where Mastra runs directly in your Next.js API routes, see the Full-Stack Integration Guide on Assistant UI's documentation site.
Visit Mastra's "UI Dojo" to see real-world examples of Assistant UI integrated with Mastra.
Integration Guide
Run Mastra as a standalone server and connect your Next.js frontend (with Assistant UI) to its API endpoints.
Set up your directory structure. A possible layout looks like this:
```
project-root
├── mastra-server
│   ├── src
│   │   └── mastra
│   └── package.json
└── my-app
    └── package.json
```

Bootstrap your Mastra server:

```bash
npx create-mastra@beta
```

This command launches an interactive wizard that scaffolds a new Mastra project, prompting you for a project name and setting up the basic configuration. Follow the prompts to create your server project.
Navigate to your newly created Mastra server directory:
```bash
cd mastra-server # Replace with the actual directory name you provided
```

You now have a basic Mastra server project ready. You should have the following files and folders:
```
src
└── mastra
    ├── agents
    │   └── weather-agent.ts
    ├── scorers
    │   └── weather-scorer.ts
    ├── tools
    │   └── weather-tool.ts
    ├── workflows
    │   └── weather-workflow.ts
    └── index.ts
```

Note: Ensure that you have set the appropriate environment variables for your LLM provider in the `.env` file.
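For example, if you chose OpenAI as your provider during scaffolding, a minimal `.env` might look like the snippet below; the exact variable name depends on the provider you selected.

```bash
# Example only: use the variable your chosen LLM provider expects
OPENAI_API_KEY=sk-...
```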
Create a chat route for the Assistant UI frontend by using the `chatRoute()` helper from `@mastra/ai-sdk`. Add it to your Mastra project:

```bash
npm install @mastra/ai-sdk@beta
```

In your `src/mastra/index.ts` file, register the chat route:

```ts
// src/mastra/index.ts
import { Mastra } from '@mastra/core/mastra';
import { chatRoute } from '@mastra/ai-sdk';
// Rest of the imports...

export const mastra = new Mastra({
  // Rest of the configuration...
  server: {
    apiRoutes: [
      chatRoute({
        path: '/chat/:agentId'
      })
    ]
  }
});
```

This will make all agents available in AI SDK-compatible formats, including the `weatherAgent` at the endpoint `/chat/weatherAgent`.

Run the Mastra server using the following command:
```bash
npm run dev
```

By default, the Mastra server will run on `http://localhost:4111`. Keep this server running for the next steps, where we'll set up the Assistant UI frontend to connect to it.
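Optionally, smoke-test the endpoint before wiring up the frontend. The request below is a hypothetical example: it assumes the chat route accepts an AI SDK-style `messages` payload, so adjust the body to whatever your version of `@mastra/ai-sdk` expects.

```bash
# Hypothetical smoke test; assumes an AI SDK-style messages payload
curl -X POST http://localhost:4111/chat/weatherAgent \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "parts": [{"type": "text", "text": "What is the weather in Tokyo?"}]}]}'
```

If the route is wired up correctly, you should see a streamed response from the agent.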
Go up one directory to your project root:

```bash
cd ..
```

Create a new `assistant-ui` project with the following command:

```bash
npx assistant-ui@latest create
```

Note: For detailed setup instructions, including adding API keys, basic configuration, and manual setup steps, please refer to assistant-ui's official documentation.
The default Assistant UI setup configures the chat runtime to use a local API route (`/api/chat`) within the Next.js project. Since our Mastra agent is running on a separate server, we need to update the frontend to point to that server's endpoint.

Open the file in your assistant-ui frontend project that contains the `useChatRuntime` hook (usually `app/assistant.tsx` or `src/app/assistant.tsx`). Find the `useChatRuntime` hook and change the `api` property to the full URL of your Mastra agent's stream endpoint:

```tsx
// app/assistant.tsx
"use client";
// Rest of the imports...

export const Assistant = () => {
  const runtime = useChatRuntime({
    transport: new AssistantChatTransport({
      api: "http://localhost:4111/chat/weatherAgent",
    }),
  });
  // Rest of the component...
};
```

Now the Assistant UI frontend will send chat requests directly to your running Mastra server.
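Hardcoding `http://localhost:4111` is fine for local development, but you may prefer to read the server URL from an environment variable when deploying. Below is a minimal sketch, assuming a hypothetical `NEXT_PUBLIC_MASTRA_URL` variable and the import paths used by the default assistant-ui scaffold.

```tsx
// app/assistant.tsx (sketch only; NEXT_PUBLIC_MASTRA_URL is an assumed variable name)
"use client";

import { useChatRuntime, AssistantChatTransport } from "@assistant-ui/react-ai-sdk";

// Fall back to the local Mastra server when no URL is configured
const MASTRA_URL = process.env.NEXT_PUBLIC_MASTRA_URL ?? "http://localhost:4111";

export const Assistant = () => {
  const runtime = useChatRuntime({
    transport: new AssistantChatTransport({
      api: `${MASTRA_URL}/chat/weatherAgent`,
    }),
  });
  // Rest of the component...
};
```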
You're ready to connect the pieces! Make sure the Mastra server is still running, then start the Next.js development server from your assistant-ui project:
```bash
npm run dev
```

You should now be able to chat with your agent in the browser.
Congratulations! You have successfully integrated Mastra with Assistant UI using a separate server approach. Your Assistant UI frontend now communicates with a standalone Mastra agent server.