# Observability Overview

Mastra provides observability features for AI applications. Monitor LLM operations, trace agent decisions, and debug complex workflows with tools that understand AI-specific patterns.

## Key Features

### Tracing

Specialized tracing for AI operations that captures:

- **Model interactions**: Token usage, latency, prompts, and completions
- **Agent execution**: Decision paths, tool calls, and memory operations
- **Workflow steps**: Branching logic, parallel execution, and step outputs
- **Automatic instrumentation**: Decorator-based tracing

## Storage Requirements

The `DefaultExporter` persists traces to your configured storage backend. Not all storage providers support observability; see [Storage Provider Support](https://mastra.ai/docs/observability/tracing/exporters/default/llms.txt) for the full list.

For production environments with high traffic, we recommend using **ClickHouse** for the observability domain via [composite storage](https://mastra.ai/reference/storage/composite/llms.txt). See [Production Recommendations](https://mastra.ai/docs/observability/tracing/exporters/default/llms.txt) for details.

## Quick Start

Configure observability in your Mastra instance:

```typescript
import { Mastra } from "@mastra/core";
import { PinoLogger } from "@mastra/loggers";
import { LibSQLStore } from "@mastra/libsql";
import {
  Observability,
  DefaultExporter,
  CloudExporter,
  SensitiveDataFilter,
} from "@mastra/observability";

export const mastra = new Mastra({
  logger: new PinoLogger(),
  storage: new LibSQLStore({
    id: "mastra-storage",
    url: "file:./mastra.db", // Storage is required for tracing
  }),
  observability: new Observability({
    configs: {
      default: {
        serviceName: "mastra",
        exporters: [
          new DefaultExporter(), // Persists traces to storage for Mastra Studio
          new CloudExporter(), // Sends traces to Mastra Cloud (if MASTRA_CLOUD_ACCESS_TOKEN is set)
        ],
        spanOutputProcessors: [
          new SensitiveDataFilter(), // Redacts sensitive data like passwords, tokens, keys
        ],
      },
    },
  }),
});
```

The `file:./mastra.db` storage URL uses the local filesystem, which doesn't work in serverless environments like Vercel, AWS Lambda, or Cloudflare Workers. For serverless deployments, use external storage, as in the sketch at the end of this page. See the [Vercel deployment guide](https://mastra.ai/guides/deployment/vercel-deployer/llms.txt) for a complete example.

With this basic setup, you will see Traces and Logs in both Studio and Mastra Cloud.

We also support external tracing providers such as MLflow, Langfuse, Braintrust, and any OpenTelemetry-compatible platform (Datadog, New Relic, SigNoz, etc.); a sketch of adding an external exporter appears at the end of this page. See the [Tracing](https://mastra.ai/docs/observability/tracing/overview/llms.txt) documentation for details.

## What's Next?

- **[Set up Tracing](https://mastra.ai/docs/observability/tracing/overview/llms.txt)**: Configure tracing for your application
- **[Configure Logging](https://mastra.ai/docs/observability/logging/llms.txt)**: Add structured logging
- **[API Reference](https://mastra.ai/reference/observability/tracing/instances/llms.txt)**: Detailed configuration options
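## Example: External Storage for Serverless

The Quick Start above uses a local SQLite file, which doesn't persist across stateless serverless invocations. Here is a minimal sketch that swaps in an external Postgres-backed store instead. It assumes `@mastra/pg` exports a `PostgresStore` that accepts a `connectionString`, and that a `DATABASE_URL` environment variable points at your database; check the storage reference for the exact options your provider supports.

```typescript
import { Mastra } from "@mastra/core";
import { PostgresStore } from "@mastra/pg"; // assumed export; any supported external store works
import { Observability, DefaultExporter } from "@mastra/observability";

export const mastra = new Mastra({
  // External storage persists traces across serverless invocations,
  // unlike the local file:./mastra.db database.
  storage: new PostgresStore({
    connectionString: process.env.DATABASE_URL!, // assumed env var name; option names vary by provider
  }),
  observability: new Observability({
    configs: {
      default: {
        serviceName: "mastra",
        exporters: [new DefaultExporter()],
      },
    },
  }),
});
```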
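## Example: Adding an External Tracing Exporter

External providers plug in as additional exporters alongside `DefaultExporter`, so traces continue to reach Studio while also flowing to the external platform. The sketch below assumes a `LangfuseExporter` exported from an `@mastra/langfuse` package that takes the standard Langfuse credentials; the exact package name and constructor options for each provider are covered in the Tracing documentation linked above.

```typescript
import { Mastra } from "@mastra/core";
import { LibSQLStore } from "@mastra/libsql";
import { Observability, DefaultExporter } from "@mastra/observability";
import { LangfuseExporter } from "@mastra/langfuse"; // assumed package and export

export const mastra = new Mastra({
  storage: new LibSQLStore({
    id: "mastra-storage",
    url: "file:./mastra.db",
  }),
  observability: new Observability({
    configs: {
      default: {
        serviceName: "mastra",
        exporters: [
          new DefaultExporter(), // keep persisting traces for Studio
          new LangfuseExporter({
            // assumed option names, mirroring Langfuse's standard credentials
            publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
            secretKey: process.env.LANGFUSE_SECRET_KEY!,
            baseUrl: process.env.LANGFUSE_BASE_URL, // optional, for self-hosted instances
          }),
        ],
      },
    },
  }),
});
```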