Studio
Studio provides an interactive UI for building, testing, and managing your agents, workflows, and tools. Run it locally during development, or deploy it to production so your team can manage agents, monitor performance, and gain insights through built-in observability.
Add authentication to protect your deployed Studio with login screens, role-based access control, and permission-based UI rendering so you can control what each team member can see and do. You can also create a project in Mastra Cloud for a hosted option.
Start Studio
If you created your application with create mastra, start the development server using the dev script. You can also run it directly with mastra dev.
- npm: `npm run dev`
- pnpm: `pnpm run dev`
- Yarn: `yarn dev`
- Bun: `bun run dev`
Once the server is running, you can:
- Open the Studio UI at http://localhost:4111/ to interact with your agents, workflows, and tools.
- Visit http://localhost:4111/swagger-ui to discover and interact with the underlying REST API.
To run Studio in production, see Deploy Studio.
Studio UI
The Studio UI lets you interact with your agents, workflows, and tools, observe exactly what happens under the hood with each interaction, and tweak things as you go.
Agents
Chat with your agent directly, dynamically switch models, and tweak settings like temperature and top-p to understand how they affect the output.
When you interact with your agent, you can follow each step of its reasoning, view tool call outputs, and observe traces and logs to see how responses are generated. You can also attach scorers to measure and compare response quality over time.
Workflows
Visualize your workflow as a graph and run it step by step with a custom input. During execution, the interface updates in real time to show the active step and the path taken.
When running a workflow, you can also view detailed traces showing tool calls, raw JSON outputs, and any errors that might have occurred along the way.
Tools
Run tools in isolation to observe their behavior. Test them before assigning them to your agent, or run them on their own to debug issues when something goes wrong.
Processors
View the input and output processors attached to each agent. The agent detail panel lists every processor by name and type, so you can verify your guardrails, token limiters, and custom processors are wired up correctly before testing.
See Processors and Guardrails for configuration details.
MCP
List the MCP servers attached to your Mastra instance and explore their available tools.
Observability
When you run an agent or workflow, the Observability tab displays traces that highlight the key AI operations such as model calls, tool executions, and workflow steps. Follow these traces to see how data moves, where time is spent, and what's happening under the hood.
Tracing filters out low-level framework details so your traces stay focused and readable.
Scorers
The Scorers tab displays the results of your agent's scorers as they run. When messages pass through your agent, the defined scorers evaluate each output asynchronously and render their results here. This allows you to understand how your scorers respond to different interactions, compare performance across test cases, and identify areas for improvement.
Datasets
Create and manage collections of test cases to evaluate your agents and workflows. Import items from CSV or JSON, define input and ground-truth schemas, and pin to specific versions so you can reproduce experiments exactly. Run experiments with scorers to compare quality across prompts, models, or code changes.
See Datasets overview for the full API and versioning details.
REST API
Studio is backed by a complete set of REST API routes that let you programmatically interact with your agents, workflows, and tools. These are the same routes exposed by the Mastra Server, so everything you build against locally works identically in production.
The OpenAPI specification at http://localhost:4111/api/openapi.json describes every endpoint along with its request and response schemas.
To explore the API interactively, visit the Swagger UI at http://localhost:4111/swagger-ui. Here, you can discover endpoints and test them directly from your browser.
The OpenAPI and Swagger endpoints are disabled in production by default. To enable them, set server.build.openAPIDocs and server.build.swaggerUI to true.
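As a rough sketch, enabling both endpoints for production builds might look like the following. This assumes a typical Mastra entry point (for example, src/mastra/index.ts) where you construct your Mastra instance; adapt it to your own project layout.

```typescript
import { Mastra } from "@mastra/core";

// Sketch only: enable the OpenAPI spec and Swagger UI in production builds.
// Your existing agents, workflows, and tools are omitted here for brevity.
export const mastra = new Mastra({
  server: {
    build: {
      openAPIDocs: true, // serves /api/openapi.json in production
      swaggerUI: true,   // serves /swagger-ui in production
    },
  },
});
```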
Configuration
By default, Studio runs at http://localhost:4111. You can change the host, port, and studioBase in the Mastra server configuration.
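A minimal configuration sketch changing the defaults is shown below. The host and port options live under server; the exact placement of studioBase is an assumption here, so check the server configuration reference for your Mastra version.

```typescript
import { Mastra } from "@mastra/core";

// Sketch only: override where Studio and the server listen.
export const mastra = new Mastra({
  server: {
    host: "0.0.0.0", // listen on all interfaces instead of localhost
    port: 4111,      // the default port, shown here for clarity
    // studioBase: "/studio", // path prefix for the Studio UI (placement assumed)
  },
});
```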
For production deployments, see Deploy Studio to learn about hosting Studio alongside your server, as a standalone SPA, or on a CDN.
Add authentication to control who can access Studio in production. Studio displays the appropriate login UI, which can be an SSO button, an email/password form, or both. All API routes require authentication. This applies to any request made to your Mastra API, whether from Studio or a direct API call.
Mastra also supports HTTPS development through the --https flag, which automatically creates and manages certificates for your project. When you run mastra dev --https, a private key and certificate are generated for localhost (or your configured host). Visit the HTTPS reference to learn more.
Next steps
- Learn more about Mastra's suggested project structure.
- Integrate Mastra with your frontend framework of choice - Next.js, React, or Astro.