
VercelDeployer

The VercelDeployer class handles deployment of standalone Mastra applications to Vercel. It manages configuration and deployment, and extends the base Deployer class with Vercel-specific functionality.

warning

Vercel Functions use an ephemeral filesystem. Remove any usage of LibSQLStore with file URLs from your Mastra configuration. Use in-memory storage (:memory:) or external storage providers like Turso, PostgreSQL, or Upstash.
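For example, a file-backed LibSQLStore can be swapped for the in-memory variant. This is a sketch, assuming LibSQLStore is imported from @mastra/libsql; adjust to match your actual configuration:

```typescript
// Sketch: replacing a file-backed store with in-memory storage for Vercel.
// ":memory:" keeps data in RAM only, so it is lost between invocations —
// use an external provider (Turso, PostgreSQL, Upstash) for persistence.
import { LibSQLStore } from "@mastra/libsql";

// Before (breaks on Vercel's ephemeral filesystem):
// const storage = new LibSQLStore({ url: "file:./mastra.db" });

// After:
const storage = new LibSQLStore({ url: ":memory:" });
```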

Installation

npm install @mastra/deployer-vercel@latest

Usage example

src/mastra/index.ts
import { Mastra } from "@mastra/core";
import { VercelDeployer } from "@mastra/deployer-vercel";

export const mastra = new Mastra({
  deployer: new VercelDeployer(),
});

See the VercelDeployer API reference for all available configuration options.

Optional overrides

The Vercel deployer can write a few high-value settings into the Vercel Output API function config (.vc-config.json):

  • maxDuration?: number — Function execution timeout (seconds)
  • memory?: number — Function memory allocation (MB)
  • regions?: string[] — Regions (e.g. ['sfo1','iad1'])

Example:

src/mastra/index.ts
deployer: new VercelDeployer({
  maxDuration: 600,
  memory: 1536,
  regions: ["sfo1", "iad1"],
}),

Continuous integration

After connecting your Mastra project’s Git repository to Vercel, update the project settings. In the Vercel dashboard, go to Settings > Build and Deployment, and under Framework settings, set the following:

  • Build command: npm run build (optional)

Environment variables

Before your first deployment, make sure to add any environment variables used by your application. For example, if you're using OpenAI as the LLM, you'll need to set OPENAI_API_KEY in your Vercel project settings.

See Environment variables for more details.

Your project is now configured for automatic deployments, which occur whenever you push to the configured branch of your GitHub repository.

Manual deployment

Manual deployments are also possible using the Vercel CLI. With the Vercel CLI installed, run the following from your project root to build and deploy your application. The --prebuilt flag deploys the existing .vercel/output directory, and --archive=tgz uploads the build as a single tarball.

npm run build && vercel --prod --prebuilt --archive=tgz

You can also run vercel dev from your project root to test your Mastra application locally.

Build output

The build output for Mastra applications using the VercelDeployer includes all agents, tools, and workflows in your project, along with Mastra-specific files required to run your application on Vercel.

.vercel/
├── output/
│   ├── functions/
│   │   └── index.func/
│   │       └── index.mjs
│   └── config.json
└── package.json

The VercelDeployer automatically generates a config.json file in .vercel/output with the following settings:

{
  "version": 3,
  "routes": [
    {
      "src": "/(.*)",
      "dest": "/"
    }
  ]
}

Observability

To enable observability on Vercel, configure external storage (since Vercel's ephemeral filesystem doesn't support local file storage) and set up your preferred exporter(s).

Storage configuration

Use a PostgreSQL provider like Neon or Supabase for persistent storage:

src/mastra/index.ts
import { Mastra } from "@mastra/core";
import { PinoLogger } from "@mastra/loggers";
import { PostgresStore } from "@mastra/pg";
import { Observability, DefaultExporter, CloudExporter } from "@mastra/observability";

export const mastra = new Mastra({
  logger: new PinoLogger(),
  storage: new PostgresStore({
    connectionString: process.env.DATABASE_URL!,
  }),
  observability: new Observability({
    configs: {
      default: {
        serviceName: "my-mastra-app",
        exporters: [
          // Choose based on your observability platform:
          new DefaultExporter(), // Your storage → Mastra Studio
          new CloudExporter(), // Mastra Cloud
          // new ArizeExporter(), // Arize
          // new LaminarExporter(), // Laminar
          // etc.
        ],
      },
    },
  }),
  // ... agents, tools, workflows
});

See the Tracing documentation for all available exporters.

Environment variables

Add these to your Vercel project settings based on your configuration:

Variable                     Description
DATABASE_URL                 PostgreSQL connection string (required for storage)
MASTRA_CLOUD_ACCESS_TOKEN    Required if using CloudExporter
OPENAI_API_KEY               Your LLM provider API key
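You can add these in the Vercel dashboard, or from the command line with the Vercel CLI's env subcommand (each command prompts for the variable's value):

```shell
# Add environment variables to the production environment via the Vercel CLI.
vercel env add DATABASE_URL production
vercel env add MASTRA_CLOUD_ACCESS_TOKEN production
vercel env add OPENAI_API_KEY production
```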

Flushing traces

In serverless environments, call flush() to ensure traces are exported before the function completes:

app/api/chat/route.ts
import { mastra } from "@/mastra";

export async function POST(req: Request) {
  const { message } = await req.json();
  const agent = mastra.getAgent("myAgent");

  const result = await agent.generate([{ role: "user", content: message }]);

  // Flush traces before the serverless function completes
  const observability = mastra.getObservability();
  await observability.flush();

  return Response.json(result);
}

For more details on tracing configuration, see the Tracing documentation.
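Since an unhandled error in the handler would otherwise skip the flush call, one option is to move it into a finally block. This is a sketch built on the route handler above, with the same assumed agent name:

```typescript
// Sketch: flushing traces even when the handler throws.
import { mastra } from "@/mastra";

export async function POST(req: Request) {
  try {
    const { message } = await req.json();
    const agent = mastra.getAgent("myAgent");
    const result = await agent.generate([{ role: "user", content: message }]);
    return Response.json(result);
  } finally {
    // Runs on both success and failure, so traces are exported either way.
    await mastra.getObservability().flush();
  }
}
```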

Next steps