# VercelDeployer

The `VercelDeployer` class handles deployment of standalone Mastra applications to Vercel. It manages configuration and deployment, and extends the base [Deployer](https://mastra.ai/reference/deployer/llms.txt) class with Vercel-specific functionality.

Vercel Functions use an ephemeral filesystem. Remove any usage of [LibSQLStore](https://mastra.ai/reference/storage/libsql/llms.txt) with file URLs from your Mastra configuration. Use in-memory storage (`:memory:`) or external storage providers like Turso, PostgreSQL, or Upstash.

## Installation

```bash
npm install @mastra/deployer-vercel@latest
```

```bash
pnpm add @mastra/deployer-vercel@latest
```

```bash
yarn add @mastra/deployer-vercel@latest
```

```bash
bun add @mastra/deployer-vercel@latest
```

## Usage example

```typescript
import { Mastra } from "@mastra/core";
import { VercelDeployer } from "@mastra/deployer-vercel";

export const mastra = new Mastra({
  deployer: new VercelDeployer(),
});
```

> See the [VercelDeployer](https://mastra.ai/reference/deployer/vercel/llms.txt) API reference for all available configuration options.

### Optional overrides

The Vercel deployer can write a few high-value settings into the Vercel Output API function config (`.vc-config.json`):

- `maxDuration?: number` — Function execution timeout (seconds)
- `memory?: number` — Function memory allocation (MB)
- `regions?: string[]` — Regions (e.g. `['sfo1','iad1']`)

Example:

```ts
deployer: new VercelDeployer({
  maxDuration: 600,
  memory: 1536,
  regions: ["sfo1", "iad1"],
});
```

## Continuous integration

After connecting your Mastra project's Git repository to Vercel, update the project settings. In the Vercel dashboard, go to **Settings** > **Build and Deployment**, and under **Framework settings**, set the following:

- **Build command**: `npm run build` (optional)

### Environment variables

Before your first deployment, make sure to add any environment variables used by your application.
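Besides the dashboard, environment variables can also be managed with the Vercel CLI. A minimal sketch, assuming the CLI is installed and your project is linked:

```shell
# Add a variable to the production environment; the CLI prompts for the value
vercel env add OPENAI_API_KEY production

# List the variables configured for the linked project
vercel env ls
```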
For example, if you're using OpenAI as the LLM, you'll need to set `OPENAI_API_KEY` in your Vercel project settings.

> See [Environment variables](https://vercel.com/docs/environment-variables) for more details.

Your project is now configured with automatic deployments, which occur whenever you push to the configured branch of your GitHub repository.

## Manual deployment

Manual deployments are also possible using the [Vercel CLI](https://vercel.com/docs/cli). With the Vercel CLI installed, run the following from your project root to deploy your application:

```bash
npm run build && vercel --prod --prebuilt --archive=tgz
```

> You can also run `vercel dev` from your project root to test your Mastra application locally.

## Build output

The build output for Mastra applications using the `VercelDeployer` includes all agents, tools, and workflows in your project, along with the Mastra-specific files required to run your application on Vercel.

```text
.vercel/
├── output/
│   ├── functions/
│   │   └── index.func/
│   │       └── index.mjs
│   └── config.json
└── package.json
```

The `VercelDeployer` automatically generates a `config.json` configuration file in `.vercel/output` with the following settings:

```json
{
  "version": 3,
  "routes": [
    {
      "src": "/(.*)",
      "dest": "/"
    }
  ]
}
```

## Observability

To enable observability on Vercel, configure external storage (since Vercel's ephemeral filesystem doesn't support local file storage) and set up your preferred exporter(s).
### Storage configuration

Use a PostgreSQL provider like [Neon](https://neon.tech) or [Supabase](https://supabase.com) for persistent storage:

```typescript
import { Mastra } from "@mastra/core";
import { PinoLogger } from "@mastra/loggers";
import { PostgresStore } from "@mastra/pg";
import { Observability, DefaultExporter, CloudExporter } from "@mastra/observability";

export const mastra = new Mastra({
  logger: new PinoLogger(),
  storage: new PostgresStore({
    connectionString: process.env.DATABASE_URL!,
  }),
  observability: new Observability({
    configs: {
      default: {
        serviceName: "my-mastra-app",
        exporters: [
          // Choose based on your observability platform:
          new DefaultExporter(), // Your storage → Mastra Studio
          new CloudExporter(), // Mastra Cloud
          // new ArizeExporter(), // Arize
          // new LaminarExporter(), // Laminar
          // etc.
        ],
      },
    },
  }),
  // ... agents, tools, workflows
});
```

See the [Tracing documentation](https://mastra.ai/docs/observability/tracing/overview/llms.txt) for all available exporters.
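If you don't need persistence, for example for a quick test deployment, the in-memory LibSQL option mentioned at the top of this page avoids external infrastructure entirely. A minimal sketch, assuming `@mastra/libsql` is installed; note that all data is discarded whenever the function instance is recycled:

```typescript
import { Mastra } from "@mastra/core";
import { LibSQLStore } from "@mastra/libsql";

export const mastra = new Mastra({
  // ":memory:" keeps storage in RAM, so it works on Vercel's
  // ephemeral filesystem, but nothing survives between invocations.
  storage: new LibSQLStore({ url: ":memory:" }),
});
```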
### Environment variables

Add these to your Vercel project settings based on your configuration:

| Variable                    | Description                                         |
| --------------------------- | --------------------------------------------------- |
| `DATABASE_URL`              | PostgreSQL connection string (required for storage) |
| `MASTRA_CLOUD_ACCESS_TOKEN` | Required if using `CloudExporter`                   |
| `OPENAI_API_KEY`            | Your LLM provider API key                           |

### Flushing traces

In serverless environments, call `flush()` to ensure traces are exported before the function completes:

```typescript
import { mastra } from "@/mastra";

export async function POST(req: Request) {
  const { message } = await req.json();
  const agent = mastra.getAgent("myAgent");

  const result = await agent.generate([{ role: "user", content: message }]);

  // Flush traces before the serverless function completes
  const observability = mastra.getObservability();
  await observability.flush();

  return Response.json(result);
}
```

For more details on tracing configuration, see the [Tracing documentation](https://mastra.ai/docs/observability/tracing/overview/llms.txt).

## Next steps

- [Mastra Client SDK](https://mastra.ai/reference/client-js/mastra-client/llms.txt)