
Deploy Mastra to Vercel

Use @mastra/deployer-vercel to deploy your Mastra server as serverless functions on Vercel. The deployer bundles your code and generates a .vercel/output directory conforming to Vercel's Build Output API, ready to deploy with no additional configuration.

info

If you're using a server adapter or web framework, deploy the way you normally would for that framework.

Before you begin

You'll need a Mastra application and a Vercel account.

Vercel Functions use an ephemeral filesystem, so any storage you configure (including observability storage) must be hosted externally. If you're using LibSQLStore with a file URL, switch to a remotely hosted database.
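For example, a `LibSQLStore` pointed at a remotely hosted libSQL database might look like the sketch below. This assumes the `@mastra/libsql` package and uses placeholder environment variables (`DATABASE_URL`, `DATABASE_AUTH_TOKEN`) for your own database credentials:

```typescript
import { Mastra } from "@mastra/core";
import { LibSQLStore } from "@mastra/libsql";

export const mastra = new Mastra({
  storage: new LibSQLStore({
    // A remote libsql:// URL instead of a local file: URL,
    // since the Vercel filesystem is ephemeral
    url: process.env.DATABASE_URL!,
    authToken: process.env.DATABASE_AUTH_TOKEN,
  }),
});
```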

Installation

Add the @mastra/deployer-vercel package to your project:

```bash
npm install @mastra/deployer-vercel@latest
```

Import VercelDeployer and set it as the deployer in your Mastra configuration:

src/mastra/index.ts

```ts
import { Mastra } from "@mastra/core";
import { VercelDeployer } from "@mastra/deployer-vercel";

export const mastra = new Mastra({
  deployer: new VercelDeployer(),
});
```

Usage

  1. Push your project to a remote Git provider (e.g. GitHub) and connect your repository to Vercel.

    By default, Vercel runs npm run build, which triggers mastra build. If you don't have a build script, add "build": "mastra build" to your package.json.

    note

    Remember to set any environment variables your application needs (e.g. your model provider API key).

  2. Once you're ready, click Deploy and wait for the first deployment to complete.

  3. Verify your deployment at https://<your-project>.vercel.app/api/agents, which should return a JSON list of your agents.

    Since the Mastra server prefixes every endpoint with /api, include that prefix in your URLs when making requests.

  4. You can now call your Mastra endpoints over HTTP.

    note

    Set up authentication before exposing your endpoints publicly.
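The verification in step 3 can be done from the command line; `<your-project>` is a placeholder for your Vercel project name:

```shell
# Should return a JSON list of your registered agents
curl https://<your-project>.vercel.app/api/agents
```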

Optional overrides

The Vercel deployer supports configuration options that are written to the Build Output API function config. See the VercelDeployer reference for available options like maxDuration, memory, and regions.
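For example, a configuration using these options might look like the following sketch. The specific values and the `iad1` region ID are illustrative; consult the VercelDeployer reference for the exact option names and accepted values:

```typescript
import { Mastra } from "@mastra/core";
import { VercelDeployer } from "@mastra/deployer-vercel";

export const mastra = new Mastra({
  deployer: new VercelDeployer({
    // Illustrative values, written to the Build Output API function config
    maxDuration: 60,   // seconds before the function times out
    memory: 1024,      // MB allocated to the function
    regions: ["iad1"], // Vercel region IDs to deploy to
  }),
});
```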

Observability

Serverless functions can terminate immediately after returning a response. Any pending async work, such as sending telemetry, may be killed before it completes. Awaiting flush() ensures all traces are sent before the function exits.

api/chat.ts

```ts
import type { VercelRequest, VercelResponse } from "@vercel/node";
import { mastra } from "../src/mastra";

export default async function handler(req: VercelRequest, res: VercelResponse) {
  const { message } = req.body;
  const agent = mastra.getAgent("myAgent");
  const result = await agent.generate([{ role: "user", content: message }]);

  // Flush pending traces before the function exits
  const observability = mastra.getObservability();
  await observability?.flush();

  return res.json(result);
}
```

warning

The Vercel deployer doesn't add flush calls for you. If you need them, wrap the handler yourself and flush before returning the response. Alternatively, deploy to a long-running server, such as a virtual machine, where this isn't an issue.
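One way to sketch such a wrapper is a hypothetical `withFlush` helper (not part of the deployer) that runs the flush in a `try`/`finally`, so it executes even when the handler throws:

```typescript
// Hypothetical helper: runs a handler, then awaits a flush callback
// before the result is returned, even if the handler throws.
async function withFlush<T>(
  handler: () => Promise<T>,
  flush: () => Promise<void>,
): Promise<T> {
  try {
    return await handler();
  } finally {
    await flush();
  }
}
```

In the handler above, you could then write `return withFlush(() => run(req), () => mastra.getObservability()?.flush() ?? Promise.resolve())`, where `run` is your own function containing the agent call.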
