Your workflows can now run on Temporal with durable, fault-tolerant execution.
Your agents can now stream progress feedback during long-running tool calls.
Make your Mastra application production-ready with observability storage that scales.
Your agents can now work with remote filesystems like S3, GCS, and Azure.
Give users access to your agent from Slack, Discord, Telegram, and more.
Give agents semantic understanding of your codebase.
Your agents can now browse and take actions on the web, just like you.
Edit agent instructions, tools, and display conditions from Studio with draft/publish versioning, or programmatically through the editor API.
Mastra Studio now supports logs, traces, and metrics, so you can understand how your agents behave in production.
Secure your deployed Mastra Studio with login screens, third-party identity providers, and role-based permissions.
Three new sandbox providers for safely running agent code in isolated cloud containers.
A new video course designed to get you from zero to deploying your first agent in under two hours.
Run test cases against agents and workflows, score the results, and track quality over time.
Define test cases, run experiments, and track response quality for every change you make.
Mastra Skills teach coding agents about Mastra through structured knowledge files, giving any compatible agent the context it needs to help you build.
File access, command execution, and reusable skills for agents that actually get things done.
Mastra’s new Server Adapters automatically expose your agents, workflows, tools, and MCP servers as HTTP endpoints on the server you’re already running.
We have a new UI dojo. It's a collection of working examples showing Mastra agents integrated with the most popular AI UI frameworks. Try it out at ui-dojo.mastra.ai.
We've renamed Playground to Studio and it's now shareable with your team.
Learn how to use HITL (Human-in-the-Loop) to safely build tools that require human approval.
Mastra workflows now support suspend and resume in Playground. You can also use the new resumeStream API to close streams on suspend and resume them later.
We've unified multi-agent coordination under a new .network() primitive—read about our journey, the experiments, and how you can use it to orchestrate collaborative agents simply.
Access 600+ LLM models from 40+ providers with a single string. Full TypeScript autocomplete turns your IDE into a model search engine.
Model fallbacks ensure your application stays online by automatically switching to backup models when your primary provider fails.
Mastra introduces AI Tracing: filter out noise and use multiple observability platforms.
You can now easily merge Mastra agent templates into your existing projects with just a few clicks.
We're making our new streaming architecture the default experience starting with `@mastra/core` v0.19.0.
Introducing output processors: the gatekeepers that sit between your language model's response and your users.
Cloneable scorers let you copy, customize, and extend Mastra’s evals into your project.
Mastra now controls the agent loop and tool calling with increased orchestration capabilities—while maintaining backward compatibility with AI SDK v4 and v5.
We're excited to announce the release of scorers in Mastra, a new way to evaluate and rank the quality of your agent's responses.
Mastra's nested streaming support provides real-time visibility into agent and workflow execution, with comprehensive cost tracking and a unified messaging interface.
Announcing Mastra Templates: pre-built, production-ready, customizable agent and workflow projects. Plus: join our MASTRA.BUILD hackathon to create and showcase your own templates.
Agent Network (vNext) introduces intelligent AI orchestration that automatically routes and executes complex multi-agent tasks without predetermined workflows.
Explore how our new docs chatbot leverages the Mastra MCP Docs server to enhance user experience and streamline access to documentation.
vNext brings stronger control flow, better type safety, and multi-engine support.
Create and share your AI tools with the Model Context Protocol (MCP), now available in our latest stable release.
Mastra's MCP Registry Registry simplifies the process of finding MCP registries.
We've expanded Mastra Voice with real-time speech-to-speech capabilities for Agents.
We built a Mastra MCP documentation server that provides AI assistants and IDEs with direct access to Mastra.ai's complete knowledge base.
A guide to Mastra's evaluation metrics suite, demonstrating how to measure and validate AI model outputs across dimensions including answer relevancy, completeness, content similarity, context usage, and tone consistency.
Use Mastra's TTS system with OpenAI's speech models to create text-to-speech applications, covering voice selection, audio generation, and streaming capabilities.