Workflow state lets you share values across steps without threading them through every step's `inputSchema` and `outputSchema`.
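The idea can be illustrated with a generic sketch (this is not Mastra's actual API, just the shared-state pattern it describes): steps read and write a common state object, so a value set early is visible later without appearing in every intermediate step's input and output.

```typescript
// Illustrative shared-state workflow (hypothetical helper, not Mastra's API).
type State = Record<string, unknown>;
type Step = (state: State) => State;

// Run steps in order, passing the accumulated state to each one.
function runWorkflow(steps: Step[], initial: State = {}): State {
  return steps.reduce((state, step) => step(state), initial);
}

// The first step writes `userId`; the last step reads it, even though the
// middle step never mentions it in its own inputs or outputs.
const fetchUser: Step = (s) => ({ ...s, userId: "u_123" });
const unrelatedStep: Step = (s) => ({ ...s, count: 1 });
const audit: Step = (s) => ({ ...s, auditLine: `seen ${s.userId}` });

const result = runWorkflow([fetchUser, unrelatedStep, audit]);
console.log(result.auditLine); // prints "seen u_123"
```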
We've unified multi-agent coordination under a new `.network()` primitive. Read about our journey, the experiments along the way, and how you can use it to orchestrate collaborative agents simply.
Access 600+ LLMs from 40+ providers with a single string. Full TypeScript autocomplete turns your IDE into a model search engine.
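The single-string convention takes the shape `"provider/model"`. A minimal sketch of how such an id splits into its parts (the parser here is hypothetical, purely to show the string shape, not Mastra's implementation):

```typescript
// Hypothetical parser for a "provider/model" id string.
function parseModelId(id: string): { provider: string; model: string } {
  const slash = id.indexOf("/");
  if (slash === -1) throw new Error(`expected "provider/model", got "${id}"`);
  // Split only on the first "/" so model names containing slashes survive.
  return { provider: id.slice(0, slash), model: id.slice(slash + 1) };
}

const { provider, model } = parseModelId("openai/gpt-4o-mini");
console.log(provider, model); // prints "openai gpt-4o-mini"
```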
Mastra secures $13M from a coalition of legendary investors to fuel AI agent development and production.
Learn how to migrate from Mastra’s VNext streaming methods to the new standard APIs, with details on renaming, compatibility, and key differences between AI SDK v4 and v5.
New model router and automatic model fallbacks.
Model fallbacks ensure your application stays online by automatically switching to backup models when your primary provider fails.
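The fallback pattern itself is simple to sketch. The following is a generic illustration under assumed names (`ModelCall`, `withFallbacks` are hypothetical, not Mastra's API): try each model in priority order and return the first success.

```typescript
// A model call takes a prompt and resolves to text, or rejects on provider failure.
type ModelCall = (prompt: string) => Promise<string>;

// Try models in order; if one throws, fall through to the next backup.
async function withFallbacks(models: ModelCall[], prompt: string): Promise<string> {
  let lastError: unknown;
  for (const call of models) {
    try {
      return await call(prompt);
    } catch (err) {
      lastError = err; // this provider failed; try the next one
    }
  }
  throw lastError; // every model failed
}

// Usage: a failing primary followed by a working backup.
const primary: ModelCall = async () => { throw new Error("provider down"); };
const backup: ModelCall = async (p) => `echo: ${p}`;

withFallbacks([primary, backup], "hello").then((out) => console.log(out));
// prints "echo: hello"
```

Because the loop only advances on failure, a healthy primary never pays the cost of the backups.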
Mastra introduces AI Tracing: filter out noise and use multiple observability platforms.
You can now easily merge Mastra agent templates into your existing projects with just a few clicks.
New streaming architecture, scoring improvements, and more.
We're making our new streaming architecture the default experience starting from `@mastra/core` v0.19.0.
Resumable streams, structured output API change, bundling overhaul, and more.