
Observability Overview

Mastra provides comprehensive observability features designed specifically for AI applications. Monitor LLM operations, trace agent decisions, and debug complex workflows with specialized tools that understand AI-specific patterns.

Key Features

Structured Logging

Debug applications with contextual logging:

  • Context propagation: Automatic correlation with traces
  • Configurable levels: Filter by severity in development and production
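As a minimal sketch, both features come together by giving the logger an explicit severity level. The `name` and `level` constructor options shown here are assumptions based on Mastra's `@mastra/loggers` package, not confirmed by this page; verify against the logging documentation:

```typescript
// Sketch only: the PinoLogger options below are assumptions.
import { Mastra } from "@mastra/core";
import { PinoLogger } from "@mastra/loggers";

export const mastra = new Mastra({
  // Use "debug" in development and "info" (or stricter) in production
  // to filter log output by severity.
  logger: new PinoLogger({ name: "Mastra", level: "info" }),
});
```

Because logs are automatically correlated with traces, filtering by level does not break the link between a log line and the trace it belongs to.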

AI Tracing

Specialized tracing for AI operations that captures:

  • LLM interactions: Token usage, latency, prompts, and completions
  • Agent execution: Decision paths, tool calls, and memory operations
  • Workflow steps: Branching logic, parallel execution, and step outputs
  • Automatic instrumentation: Zero-configuration tracing with decorators

OTEL Tracing (Deprecated)

Warning: OTEL Tracing via the telemetry configuration is deprecated and will be removed in a future release. Use AI Tracing with the OpenTelemetry exporter instead for OTLP-compatible tracing.

If you are not using the old telemetry system, you should explicitly disable it to suppress deprecation warnings:

telemetry: { enabled: false }

Traditional distributed tracing with OpenTelemetry:

  • Standard OTLP protocol: Compatible with existing observability infrastructure
  • HTTP and database instrumentation: Automatic spans for common operations
  • Provider integrations: Datadog, New Relic, Jaeger, and other OTLP collectors
  • Distributed context: W3C Trace Context propagation
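For migration, the recommended replacement (AI Tracing with an OpenTelemetry exporter) looks roughly like the sketch below. The `@mastra/otel-exporter` package name and the option shape are assumptions here, not taken from this page; consult the AI Tracing documentation for the exact API:

```typescript
// Hypothetical migration sketch: the exporter package and its options
// are assumptions, not confirmed by this page.
import { Mastra } from "@mastra/core";
import { OtelExporter } from "@mastra/otel-exporter";

export const mastra = new Mastra({
  observability: {
    configs: {
      otel: {
        serviceName: "my-service",
        exporters: [
          new OtelExporter({
            // Point at any OTLP-compatible collector endpoint
            // (Datadog, New Relic, Jaeger, etc.).
            provider: {
              custom: {
                endpoint: "http://localhost:4318/v1/traces",
              },
            },
          }),
        ],
      },
    },
  },
});
```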

Quick Start

Configure Observability in your Mastra instance:

src/mastra/index.ts
import { Mastra } from "@mastra/core";
import { PinoLogger } from "@mastra/loggers";
import { LibSQLStore } from "@mastra/libsql";

export const mastra = new Mastra({
  // ... other config
  logger: new PinoLogger(),
  observability: {
    default: { enabled: true }, // Enables AI Tracing
  },
  storage: new LibSQLStore({
    url: "file:./mastra.db", // Storage is required for tracing
  }),
  telemetry: {
    enabled: false, // Disable deprecated OTEL Tracing to suppress warnings
  },
});

With this basic setup, you will see traces and logs in both Studio and Mastra Cloud.

We also support various external tracing providers like Langfuse, Braintrust, and any OpenTelemetry-compatible platform (Datadog, New Relic, SigNoz, etc.). See more about this in the AI Tracing documentation.
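As an illustration, wiring up one such provider might look like the following. The `@mastra/langfuse` package name and the `LangfuseExporter` options are assumptions, not confirmed by this page; see the AI Tracing documentation for the supported exporters and their configuration:

```typescript
// Hypothetical sketch: the LangfuseExporter options are assumptions.
import { Mastra } from "@mastra/core";
import { LangfuseExporter } from "@mastra/langfuse";

export const mastra = new Mastra({
  observability: {
    configs: {
      langfuse: {
        serviceName: "my-service",
        exporters: [
          new LangfuseExporter({
            // Credentials are read from the environment rather than
            // hard-coded into the config.
            publicKey: process.env.LANGFUSE_PUBLIC_KEY,
            secretKey: process.env.LANGFUSE_SECRET_KEY,
            baseUrl: process.env.LANGFUSE_BASE_URL,
          }),
        ],
      },
    },
  },
});
```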

What's Next?