CLI Commands

You can use the Command-Line Interface (CLI) provided by Mastra to develop, build, and start your Mastra project.

mastra dev

Starts a development server that exposes Studio and REST endpoints for your agents, tools, and workflows. Once mastra dev is running, you can visit http://localhost:4111/swagger-ui for an overview of all available endpoints.

You can also configure the server.

Flags

The command accepts common flags and the following additional flags:

--https

Enable local HTTPS support for the development server.
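
For example, to serve the development server over HTTPS:

```shell
mastra dev --https
```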

--inspect

Start the development server in inspect mode, helpful for debugging. Optionally specify a custom host and port (e.g., --inspect=0.0.0.0:9229 for Docker). This can't be used together with --inspect-brk.

--inspect-brk

Start the development server in inspect mode and break at the beginning of the script. Optionally specify a custom host and port (e.g., --inspect-brk=0.0.0.0:9229). This can't be used together with --inspect.

--custom-args

Comma-separated list of custom arguments to pass to the development server's underlying Node.js process, e.g. --experimental-transform-types.
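
As a sketch, passing Node.js flags through to the process (the second flag is illustrative, and the `--custom-args=value` form assumes the CLI accepts `flag=value` syntax):

```shell
mastra dev --custom-args=--experimental-transform-types,--trace-warnings
```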

Configs

You can set certain environment variables to modify the behavior of mastra dev.

Disable build caching

Set MASTRA_DEV_NO_CACHE=1 to force a full rebuild rather than using the cached assets under .mastra/:

MASTRA_DEV_NO_CACHE=1 mastra dev

This helps when you are debugging bundler plugins or suspect stale output.

Limit parallelism

MASTRA_CONCURRENCY caps how many expensive operations run in parallel (primarily build and evaluation steps). For example:

MASTRA_CONCURRENCY=4 mastra dev

Leave it unset to let the CLI pick a sensible default for the machine.

Custom provider endpoints

When using providers supported by the Vercel AI SDK you can redirect requests through proxies or internal gateways by setting a base URL. For OpenAI:

OPENAI_API_KEY=<your-api-key> \
OPENAI_BASE_URL=https://openrouter.example/v1 \
mastra dev

For Anthropic:

ANTHROPIC_API_KEY=<your-api-key> \
ANTHROPIC_BASE_URL=https://anthropic.internal \
mastra dev

These variables are forwarded to the Mastra model router and work with any "openai/..." or "anthropic/..." model selection.

mastra build

The mastra build command bundles your Mastra project into a production-ready Hono server. Hono is a lightweight, type-safe web framework that makes it easy to deploy Mastra agents as HTTP endpoints with middleware support.

Under the hood, Mastra uses Rollup to locate your Mastra entry file and bundle it into a production-ready Hono server, tree-shaking your code and generating source maps for debugging along the way.

The output in .mastra can be deployed to any cloud server using mastra start.

If you're deploying to a serverless platform, you need to install the correct deployer so that the correct output is generated in .mastra.

It accepts common flags.
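
A typical invocation, optionally pointing at a non-default Mastra folder via the common --dir flag:

```shell
mastra build --dir src/mastra
```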

Configs

You can set certain environment variables to modify the behavior of mastra build.

Limit parallelism

For CI, or when running in resource-constrained environments, you can cap how many expensive tasks run at once by setting MASTRA_CONCURRENCY.

MASTRA_CONCURRENCY=2 mastra build

mastra start

info

You need to run mastra build before using mastra start.

Starts a local server that serves your built Mastra application in production mode. By default, OTEL tracing is enabled.

Flags

The command accepts common flags and the following additional flags:

--dir

The path to your built Mastra output directory. Defaults to .mastra/output.

--no-telemetry

Disable OTEL tracing.
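
Combining the flags above, a production start that serves the default output directory with tracing disabled might look like:

```shell
mastra build
mastra start --dir .mastra/output --no-telemetry
```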

mastra studio

Starts Mastra Studio as a static server. After starting, you can enter your Mastra instance URL (e.g. http://localhost:4111) to connect Studio to your Mastra backend.

Flags

The command accepts common flags and the following additional flags:

--port

The port to run Studio on. Defaults to 3000.
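
For example, to run Studio on a different port:

```shell
mastra studio --port 3001
```

Then open http://localhost:3001 in your browser and enter your Mastra instance URL to connect.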

mastra lint

The mastra lint command validates the structure and code of your Mastra project to ensure it follows best practices and is error-free.

It accepts common flags.

mastra scorers

The mastra scorers command provides management capabilities for evaluation scorers that measure the quality, accuracy, and performance of AI-generated outputs.

Read the Scorers overview to learn more.

add

Add a new scorer to your project. You can use an interactive prompt:

mastra scorers add

Or provide a scorer name directly:

mastra scorers add answer-relevancy

Use the list command to get the correct ID.

list

List all available scorer templates. Use the ID for the add command.

mastra init

The mastra init command initializes Mastra in an existing project. Use this command to scaffold the necessary folders and configuration without generating a new project from scratch.

Flags

The command accepts the following additional flags:

--default

Uses the default setup: creates files inside src with OpenAI as the model provider and populates the src/mastra folders with example code.

--dir

The directory where Mastra files should be saved to. Defaults to src.

--components

Comma-separated list of components to add. For each component a new folder will be created. Choose from: "agents" | "tools" | "workflows" | "scorers". Defaults to ['agents', 'tools', 'workflows'].

--llm

Default model provider. Choose from: "openai" | "anthropic" | "groq" | "google" | "cerebras" | "mistral".

--llm-api-key

The API key for your chosen model provider. Will be written to an environment variables file (.env).

--example

If enabled, example code is written for the selected components (e.g. example agent code).

--no-example

Do not include example code. Useful when using the --default flag.

--mcp

Configure your code editor with Mastra's MCP server. Choose from: "cursor" | "cursor-global" | "windsurf" | "vscode".
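
Putting the flags above together, a non-interactive scaffold might look like the following:

```shell
mastra init \
  --dir src \
  --components agents,tools,workflows \
  --llm openai \
  --llm-api-key <your-api-key> \
  --example
```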

Common flags

--dir

Available in: dev, build, lint

The path to your Mastra folder. Defaults to src/mastra.

--debug

Available in: dev, build

Enable verbose logging for Mastra's internals. Defaults to false.

--env

Available in: dev, start, studio

Custom environment variables file to include. By default, includes .env.development, .env.local, and .env.

--root

Available in: dev, build, lint

Path to your root folder. Defaults to process.cwd().

--tools

Available in: dev, build, lint

Comma-separated list of tool paths to include. Defaults to src/mastra/tools.
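
As an illustration, several of these common flags combined in one mastra dev invocation (the paths are placeholders):

```shell
mastra dev --root ./my-project --dir src/mastra --env .env.staging --debug
```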

Global flags

Use these flags to get information about the mastra CLI.

--version

Prints the Mastra CLI version and exits.

--help

Prints help message and exits.

Telemetry

By default, Mastra collects anonymous information about your project, such as your OS, Mastra version, and Node.js version. You can read the source code to check exactly what's collected.

You can opt out of the CLI analytics by setting an environment variable:

MASTRA_TELEMETRY_DISABLED=1

You can also set this while using other mastra commands:

MASTRA_TELEMETRY_DISABLED=1 mastra dev