Using Cedar-OS
Cedar-OS is an open-source agentic UI framework designed specifically for building the most ambitious AI-native applications. Cedar was built with Mastra in mind.
Also see the Extending Mastra and Custom Backend Implementation guides on Cedar's documentation site for more advanced integration patterns.
Integration Guide
Run Cedar's CLI command:
npx cedar-os-cli plant-seed

If starting from scratch, select the Mastra starter template for a complete setup with both frontend and backend in a monorepo.
If you already have a Mastra backend, choose the blank frontend Cedar repo option instead. The CLI will then let you pick which Cedar components to download and will install all of Cedar's dependencies. We recommend downloading at least one of the chat components to get started.
Wrap your application with the CedarCopilot provider to connect to your Mastra backend:
import { CedarCopilot } from "cedar-os";

function App() {
  return (
    <CedarCopilot
      llmProvider={{
        provider: "mastra",
        baseURL: "http://localhost:4111", // default dev port for Mastra
        apiKey: process.env.NEXT_PUBLIC_MASTRA_API_KEY, // optional — only for backend auth
      }}
    >
      <YourApp />
    </CedarCopilot>
  );
}

Configure your Mastra backend to work with Cedar by following the Mastra Configuration Options.
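For reference, here is a minimal sketch of a Mastra backend the provider above could point at. The agent name, instructions, and model are placeholders rather than anything Cedar requires; substitute your own.

import { Mastra } from "@mastra/core";
import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai";

// Placeholder agent: swap in your own instructions and model
export const chatAgent = new Agent({
  name: "chatAgent",
  instructions: "You are the assistant embedded in this app.",
  model: openai("gpt-4o-mini"),
});

export const mastra = new Mastra({
  agents: { chatAgent },
});

Running mastra dev serves this instance on http://localhost:4111, which matches the baseURL in the provider config above.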
Register the API routes in your Mastra server (or as Next.js serverless routes if you're in a monorepo):
import { registerApiRoute } from "@mastra/core/server";

// POST /chat
// The chat's non-streaming default endpoint
registerApiRoute("/chat", {
  method: "POST",
  // …validate input w/ zod
  handler: async (c) => {
    /* your agent.generate() logic */
  },
});

// POST /chat/stream (SSE)
// The chat's streaming default endpoint
registerApiRoute("/chat/stream", {
  method: "POST",
  handler: async (c) => {
    /* stream agent output in SSE format */
  },
});

Drop Cedar components into your frontend – see Chat Overview.
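As a reference for the handler bodies left as placeholder comments above, here is a rough sketch of how the non-streaming /chat route could be filled in and wired into the Mastra instance through its server configuration. The request and response shapes ({ prompt } in, { content } out), the agent key chatAgent, and the ./agents module are assumptions for illustration only; the exact contract Cedar expects is documented in the Mastra Configuration Options.

import { Mastra } from "@mastra/core";
import { registerApiRoute } from "@mastra/core/server";
import { z } from "zod";

import { chatAgent } from "./agents"; // hypothetical module holding the placeholder agent from the earlier sketch

// Assumed request shape; check Cedar's docs for the actual payload
const chatInput = z.object({ prompt: z.string() });

export const mastra = new Mastra({
  agents: { chatAgent },
  server: {
    apiRoutes: [
      registerApiRoute("/chat", {
        method: "POST",
        handler: async (c) => {
          const { prompt } = chatInput.parse(await c.req.json());

          // Pull the Mastra instance off the request context and run the agent
          const agent = c.get("mastra").getAgent("chatAgent");
          const result = await agent.generate(prompt);

          // Assumed response shape; adjust to what the Cedar chat components expect
          return c.json({ content: result.text });
        },
      }),
    ],
  },
});

The /chat/stream handler would follow the same pattern with agent.stream(), writing the output as server-sent events in the format Cedar's streaming chat expects; see Cedar's documentation for the exact event shape.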
Your backend and frontend are now linked! You're ready to start building AI-native experiences with your Mastra agents.