
Logging and Tracing

Effective logging and tracing are crucial for understanding the behavior of your application.

Tracing is especially important for AI engineering. Teams building AI products find that visibility into the inputs and outputs of every step of every run is key to improving accuracy. Mastra's telemetry gives you this visibility.

Logging

In Mastra, logs can detail when certain functions run, what input data they receive, and how they respond.

Basic Setup

Here’s a minimal example that sets up a console logger at the INFO level. This will print out informational messages and above (i.e., INFO, WARN, ERROR) to the console.

mastra.config.ts
import { Mastra, createLogger } from "@mastra/core";
 
export const mastra = new Mastra({
  // Other Mastra configuration...
  logger: createLogger({
    type: "CONSOLE",
    level: "INFO",
  }),
});

In this configuration:

  • type: "CONSOLE" specifies that logs should be output to the console.
  • level: "INFO" sets the minimum severity of logs to record.
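To make the level option concrete, here is a small illustrative sketch (not Mastra's internal implementation) of how a minimum-level filter decides which messages reach the console:

```typescript
// Severity levels in ascending order; a logger configured at a given
// minimum level only emits messages at that level or above.
const LEVELS = ["DEBUG", "INFO", "WARN", "ERROR"] as const;
type Level = (typeof LEVELS)[number];

// Returns true when a message at `messageLevel` passes a logger
// configured with `minLevel` as its threshold.
function shouldLog(minLevel: Level, messageLevel: Level): boolean {
  return LEVELS.indexOf(messageLevel) >= LEVELS.indexOf(minLevel);
}
```

With level: "INFO", a DEBUG message is dropped while INFO, WARN, and ERROR messages are printed.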

Configuration

Telemetry

Mastra supports the OpenTelemetry Protocol (OTLP) for tracing and monitoring your application. When telemetry is enabled, Mastra automatically traces all core primitives including agent operations, LLM interactions, tool executions, integration calls, workflow runs, and database operations. Your telemetry data can then be exported to any OTEL collector.

Basic Configuration

Here’s a simple example of enabling telemetry:

mastra.config.ts
export const mastra = new Mastra({
  // ... other config
  telemetry: {
    serviceName: "my-app",
    enabled: true,
    sampling: {
      type: "always_on",
    },
    export: {
      type: "otlp",
      endpoint: "http://localhost:4318", // SigNoz local endpoint
    },
  },
});

Configuration Options

The telemetry config accepts these properties:

type OtelConfig = {
  // Name to identify your service in traces (optional)
  serviceName?: string;
 
  // Enable/disable telemetry (defaults to true)
  enabled?: boolean;
 
  // Control how many traces are sampled
  sampling?: {
    type: "ratio" | "always_on" | "always_off" | "parent_based";
    probability?: number; // For ratio sampling
    root?: {
      probability: number; // For parent_based sampling
    };
  };
 
  // Where to send telemetry data
  export?: {
    type: "otlp" | "console";
    endpoint?: string;
    headers?: Record<string, string>;
  };
};
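The sampling strategies can be sketched as follows. This is an illustrative model of the behavior described above, not Mastra's internals: "always_on" records every trace, "always_off" records none, and "ratio" records roughly probability of them.

```typescript
// A simplified model of the sampling options (parent_based omitted,
// since it defers to the parent span's sampling decision).
type SamplingConfig =
  | { type: "always_on" }
  | { type: "always_off" }
  | { type: "ratio"; probability: number };

// Decides whether a given trace is recorded. The `random` source is
// injectable so the ratio case can be tested deterministically.
function shouldSample(
  config: SamplingConfig,
  random: () => number = Math.random,
): boolean {
  switch (config.type) {
    case "always_on":
      return true;
    case "always_off":
      return false;
    case "ratio":
      return random() < config.probability;
  }
}
```

In production you would typically use "ratio" with a small probability to keep telemetry volume manageable, and "always_on" during development.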

See the OtelConfig reference documentation for more details.

Environment Variables

You can configure the OTLP endpoint and headers through environment variables:

.env
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
OTEL_EXPORTER_OTLP_HEADERS=x-api-key=your-api-key

Then in your config:

mastra.config.ts
export const mastra = new Mastra({
  // ... other config
  telemetry: {
    serviceName: "my-app",
    enabled: true,
    export: {
      type: "otlp",
      // endpoint and headers will be picked up from env vars
    },
  },
});
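The OTEL_EXPORTER_OTLP_HEADERS value can hold several comma-separated key=value pairs. The helper below is hypothetical (not part of Mastra) and just sketches how such a string maps onto the headers record from the config type above:

```typescript
// Parses an OTLP headers string like "x-api-key=abc,x-team=platform"
// into a headers record. Malformed pairs without a key are skipped.
function parseOtlpHeaders(raw: string): Record<string, string> {
  const headers: Record<string, string> = {};
  for (const pair of raw.split(",")) {
    const idx = pair.indexOf("=");
    if (idx > 0) {
      headers[pair.slice(0, idx).trim()] = pair.slice(idx + 1).trim();
    }
  }
  return headers;
}
```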

Example: SigNoz Integration

Here’s what a traced agent interaction looks like in SigNoz:

[Screenshot: Agent interaction trace showing spans, LLM calls, and tool executions]

Other Supported Providers

For a complete list of supported observability providers and their configuration details, see the Observability Providers reference.

Next.js Configuration [Local Dev]

When developing locally with Next.js, you’ll need to:

  1. Install the instrumentation package:
npm install import-in-the-middle # or require-in-the-middle for CJS
  2. Add it as an external dependency in your Next.js config:
next.config.ts
import type { NextConfig } from "next";
 
const nextConfig: NextConfig = {
  serverExternalPackages: ["import-in-the-middle"],
};
 
export default nextConfig;

This configuration is only necessary for local development to ensure proper instrumentation during hot reloading.
