
Using with Assistant UI

Assistant UI is the TypeScript/React library for AI Chat. Built on shadcn/ui and Tailwind CSS, it enables developers to create beautiful, enterprise-grade chat experiences in minutes.

info

For a full-stack integration approach where Mastra runs directly in your Next.js API routes, see the Full-Stack Integration Guide on Assistant UI's documentation site.

Integration Guide

Run Mastra as a standalone server and connect your Next.js frontend (with Assistant UI) to its API endpoints.

  1. Set up your directory structure. A possible layout could look like this:

    project-root
    ├── mastra-server
    │   ├── src
    │   │   └── mastra
    │   └── package.json
    └── nextjs-frontend
        └── package.json

    Bootstrap your Mastra server:

    npx create-mastra@latest

    This command launches an interactive wizard that scaffolds a new Mastra project, prompting you for a project name and setting up a basic configuration. Follow the prompts to create your server project.

    You now have a basic Mastra server project ready. You should have the following files and folders:

    src
    └── mastra
        ├── agents
        │   └── weather-agent.ts
        ├── tools
        │   └── weather-tool.ts
        ├── workflows
        │   └── weather-workflow.ts
        └── index.ts
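
    Among these, src/mastra/index.ts is the server's entry point: it registers the scaffolded agent and workflow on a Mastra instance, which is what exposes them through the server's API. A rough sketch of what the scaffold generates (exact import paths can vary between Mastra versions):

    src/mastra/index.ts
    import { Mastra } from "@mastra/core/mastra";

    import { weatherAgent } from "./agents/weather-agent";
    import { weatherWorkflow } from "./workflows/weather-workflow";

    // Registering weatherAgent here is what makes it reachable at
    // /api/agents/weatherAgent/* once the server is running.
    export const mastra = new Mastra({
      agents: { weatherAgent },
      workflows: { weatherWorkflow },
    });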
    note

    Ensure that you have set the appropriate environment variables for your LLM provider in the .env file.
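
    For example, if you selected OpenAI during the wizard, a single key is enough (the variable name depends on the provider you chose):

    .env
    OPENAI_API_KEY=your-api-key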

  2. Run the Mastra server using the following command:

    npm run dev

    By default, the Mastra server will run on http://localhost:4111. Your weatherAgent should now be accessible via a POST request endpoint, typically http://localhost:4111/api/agents/weatherAgent/stream. Keep this server running for the next steps where we'll set up the Assistant UI frontend to connect to it.
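
    To verify the endpoint before touching the frontend, you can send it a quick test request. A sketch with curl, assuming the route accepts a JSON body with a messages array (check the Mastra playground or API docs for the exact request shape):

    curl -X POST http://localhost:4111/api/agents/weatherAgent/stream \
      -H "Content-Type: application/json" \
      -d '{"messages": [{"role": "user", "content": "What is the weather in Berlin?"}]}'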

  3. Create a new assistant-ui project with the following command.

    npx assistant-ui@latest create
    note

    For detailed setup instructions, including adding API keys, basic configuration, and manual setup steps, please refer to assistant-ui's official documentation.

  4. The default Assistant UI setup configures the chat runtime to use a local API route (/api/chat) within the Next.js project. Since our Mastra agent is running on a separate server, we need to update the frontend to point to that server's endpoint.

    Find the useChatRuntime hook in the assistant-ui project (typically in app/assistant.tsx) and change the api property to the full URL of your Mastra agent's stream endpoint:

    app/assistant.tsx
    import {
      useChatRuntime,
      AssistantChatTransport,
    } from "@assistant-ui/react-ai-sdk";

    const runtime = useChatRuntime({
      transport: new AssistantChatTransport({
        api: "MASTRA_ENDPOINT",
      }),
    });

    Now, the Assistant UI frontend will send chat requests directly to your running Mastra server.
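
    For context, the rest of the scaffolded component stays as generated by the CLI: the runtime is handed to the provider that the chat components read from. A rough sketch of the full file, assuming the default component paths created by assistant-ui's CLI and the typical local endpoint from step 2:

    app/assistant.tsx
    "use client";

    import { AssistantRuntimeProvider } from "@assistant-ui/react";
    import {
      useChatRuntime,
      AssistantChatTransport,
    } from "@assistant-ui/react-ai-sdk";
    import { Thread } from "@/components/assistant-ui/thread";

    export const Assistant = () => {
      const runtime = useChatRuntime({
        transport: new AssistantChatTransport({
          // Point at the standalone Mastra server instead of the local /api/chat route.
          api: "http://localhost:4111/api/agents/weatherAgent/stream",
        }),
      });

      return (
        <AssistantRuntimeProvider runtime={runtime}>
          <Thread />
        </AssistantRuntimeProvider>
      );
    };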

  5. You're ready to connect the pieces! With the Mastra server still running, start the Next.js development server:

    npm run dev

    You should now be able to chat with your agent in the browser.

Congratulations! You have successfully integrated Mastra with Assistant UI using a separate server approach. Your Assistant UI frontend now communicates with a standalone Mastra agent server.
