Deploy Mastra to Vercel
Use @mastra/deployer-vercel to deploy your Mastra server as serverless functions on Vercel. The deployer bundles your code and generates a .vercel/output directory conforming to Vercel's Build Output API, ready to deploy with no additional configuration.
If you're using a server adapter or web framework, deploy the way you normally would for that framework.
Before you begin
You'll need a Mastra application and a Vercel account.
Vercel Functions use an ephemeral filesystem, so any storage you configure (including observability storage) must be hosted externally. If you're using LibSQLStore with a file URL, switch to a remotely hosted database.
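As a sketch of what that switch might look like, assuming the LibSQLStore from the @mastra/libsql package (the package name, option names, and environment variable below are assumptions; check the LibSQLStore reference for the exact API):

```typescript
import { Mastra } from "@mastra/core";
// Package name and constructor options are assumptions; verify against
// the LibSQLStore reference documentation.
import { LibSQLStore } from "@mastra/libsql";

export const mastra = new Mastra({
  storage: new LibSQLStore({
    // A remotely hosted libsql database instead of a file: URL,
    // since Vercel Functions have an ephemeral filesystem.
    url: "libsql://your-database.turso.io",
    authToken: process.env.DATABASE_AUTH_TOKEN,
  }),
});
```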
Installation
Add the @mastra/deployer-vercel package to your project:
- npm: npm install @mastra/deployer-vercel@latest
- pnpm: pnpm add @mastra/deployer-vercel@latest
- Yarn: yarn add @mastra/deployer-vercel@latest
- Bun: bun add @mastra/deployer-vercel@latest
Import VercelDeployer and set it as the deployer in your Mastra configuration:
import { Mastra } from "@mastra/core";
import { VercelDeployer } from "@mastra/deployer-vercel";

export const mastra = new Mastra({
  deployer: new VercelDeployer(),
});
Usage
Push your project to a remote Git provider (e.g. GitHub) and connect your repository to Vercel.
By default, Vercel runs npm run build, which triggers mastra build. If you don't have a build script, add "build": "mastra build" to your package.json.
Note: Remember to set the environment variables your application needs (e.g. your model provider API key).
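The resulting build script looks like this in package.json (other fields elided):

```json
{
  "scripts": {
    "build": "mastra build"
  }
}
```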
Once you're ready, click Deploy and wait for the first deployment to complete.
Verify your deployment at https://<your-project>.vercel.app/api/agents, which should return a JSON list of your agents. Since the Mastra server prefixes every endpoint with /api, you have to add it to your URLs when making requests. You can now call your Mastra endpoints over HTTP.
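As a sketch, such a call from TypeScript might look like the following; "your-project" is a placeholder for your actual Vercel project name, and /api/agents is the same listing endpoint used for verification above:

```typescript
// Sketch: calling a deployed Mastra server over HTTP.
// "your-project" is a placeholder for your Vercel project name.
const BASE_URL = "https://your-project.vercel.app";

// The Mastra server prefixes every endpoint with /api,
// so prepend it when building request URLs.
function apiUrl(path: string): string {
  return `${BASE_URL}/api${path}`;
}

// Fetch the JSON list of registered agents (the same check as
// opening the URL in a browser).
async function listAgents(): Promise<unknown> {
  const res = await fetch(apiUrl("/agents"));
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}
```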
Note: Set up authentication before exposing your endpoints publicly.
Optional overrides
The Vercel deployer supports configuration options that are written to the Vercel Output API function config. See the VercelDeployer reference for available options like maxDuration, memory, and regions.
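For instance, a configuration using the options named above might look like this; the option names come from this guide, but the exact shapes and the values below are illustrative, so verify them against the VercelDeployer reference:

```typescript
import { Mastra } from "@mastra/core";
import { VercelDeployer } from "@mastra/deployer-vercel";

export const mastra = new Mastra({
  deployer: new VercelDeployer({
    // Illustrative values; confirm the option shapes in the
    // VercelDeployer reference before using them.
    maxDuration: 300, // function timeout in seconds
    memory: 1024, // memory in MB
    regions: ["iad1"], // Vercel region identifiers
  }),
});
```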
Observability
Serverless functions can terminate immediately after returning a response, so any pending async work, such as sending telemetry, may be killed before it completes. Awaiting flush() ensures all traces are sent before the function exits.
import type { VercelRequest, VercelResponse } from "@vercel/node";
import { mastra } from "../src/mastra";

export default async function handler(req: VercelRequest, res: VercelResponse) {
  const { message } = req.body;
  const agent = mastra.getAgent("myAgent");
  const result = await agent.generate([{ role: "user", content: message }]);

  // Flush pending traces before the function exits.
  const observability = mastra.getObservability();
  await observability?.flush();

  return res.json(result);
}
The Vercel deployer doesn't add these flush calls for you. If you need them, wrap the handler yourself so the flush runs before the response is returned. Alternatively, deploy to a long-running server, such as a virtual machine, where this isn't an issue.
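One way to do that wrapping is a generic higher-order function; withFlush and the minimal Flushable type below are illustrative names, not part of Mastra's API:

```typescript
// Minimal shape of an observability object that can be flushed;
// mastra.getObservability() may return undefined, hence the union.
type Flushable = { flush(): Promise<void> } | undefined;

// Wraps a request handler so pending traces are flushed before the
// serverless function exits, even if the handler throws.
function withFlush<Req, Res, Out>(
  getObservability: () => Flushable,
  handler: (req: Req, res: Res) => Promise<Out>,
): (req: Req, res: Res) => Promise<Out> {
  return async (req, res) => {
    try {
      return await handler(req, res);
    } finally {
      // Runs on both success and error paths, so traces are not lost.
      await getObservability()?.flush();
    }
  };
}
```

You would then export the wrapped handler, e.g. `withFlush(() => mastra.getObservability(), handler)`, instead of the bare handler.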