OpenTelemetry Exporter

The OpenTelemetry (OTEL) exporter sends your traces to any OTEL-compatible observability platform using standardized OpenTelemetry Semantic Conventions for GenAI. This ensures broad compatibility with platforms like Datadog, New Relic, SigNoz, MLflow, Dash0, Traceloop, Laminar, and more.

Looking for bidirectional OTEL integration?

If you have existing OpenTelemetry instrumentation and want Mastra traces to inherit context from active OTEL spans, see the OpenTelemetry Bridge instead.

Installation

Each provider requires a specific protocol package. Install the base exporter together with the protocol package that matches your provider:

For HTTP/Protobuf Providers (SigNoz, New Relic, Laminar, MLflow)

npm install @mastra/otel-exporter@beta @opentelemetry/exporter-trace-otlp-proto

For gRPC Providers (Dash0, Datadog)

npm install @mastra/otel-exporter@beta @opentelemetry/exporter-trace-otlp-grpc @grpc/grpc-js

For HTTP/JSON Providers (Traceloop)

npm install @mastra/otel-exporter@beta @opentelemetry/exporter-trace-otlp-http

Provider Configurations

MLflow

MLflow supports native Mastra tracing through its OTLP endpoint at /v1/traces. Use the custom provider with HTTP/Protobuf and include the experiment header so traces land in the correct MLflow experiment:

src/mastra/index.ts
new OtelExporter({
  provider: {
    custom: {
      endpoint: `${process.env.MLFLOW_TRACKING_URI}/v1/traces`,
      protocol: "http/protobuf",
      headers: {
        "x-mlflow-experiment-id": process.env.MLFLOW_EXPERIMENT_ID,
      },
    },
  },
})
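
The tracking URI and experiment ID above come from your own environment. A minimal .env sketch, assuming a locally running MLflow tracking server on its default port (5000) and a placeholder experiment ID:

.env
# Local MLflow tracking server (5000 is MLflow's default port)
MLFLOW_TRACKING_URI=http://localhost:5000
# Placeholder value; use the ID of your own MLflow experiment
MLFLOW_EXPERIMENT_ID=123456789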

Dash0

Dash0 provides real-time observability with automatic insights.

src/mastra/index.ts
import { Mastra } from "@mastra/core";
import { Observability } from "@mastra/observability";
import { OtelExporter } from "@mastra/otel-exporter";

export const mastra = new Mastra({
  observability: new Observability({
    configs: {
      otel: {
        serviceName: "my-service",
        exporters: [
          new OtelExporter({
            provider: {
              dash0: {
                apiKey: process.env.DASH0_API_KEY,
                endpoint: process.env.DASH0_ENDPOINT, // e.g., 'ingress.us-west-2.aws.dash0.com:4317'
                dataset: "production", // Optional dataset name
              },
            },
            resourceAttributes: {
              // Optional OpenTelemetry Resource Attributes for the trace
              ["deployment.environment"]: "dev",
            },
          }),
        ],
      },
    },
  }),
});
info

Get your Dash0 endpoint from your dashboard. It should be in the format ingress.{region}.aws.dash0.com:4317.

SigNoz

SigNoz is an open-source APM alternative with built-in tracing support.

src/mastra/index.ts
new OtelExporter({
  provider: {
    signoz: {
      apiKey: process.env.SIGNOZ_API_KEY,
      region: "us", // 'us' | 'eu' | 'in'
      // endpoint: 'https://my-signoz.example.com', // For self-hosted
    },
  },
});

New Relic

New Relic provides comprehensive observability with AI monitoring capabilities.

src/mastra/index.ts
new OtelExporter({
  provider: {
    newrelic: {
      apiKey: process.env.NEW_RELIC_LICENSE_KEY,
      // endpoint: 'https://otlp.eu01.nr-data.net', // For EU region
    },
  },
});

Traceloop

Traceloop specializes in LLM observability with automatic prompt tracking.

src/mastra/index.ts
new OtelExporter({
  provider: {
    traceloop: {
      apiKey: process.env.TRACELOOP_API_KEY,
      destinationId: "my-destination", // Optional
    },
  },
});

Laminar

Laminar provides specialized LLM observability and analytics.

src/mastra/index.ts
new OtelExporter({
  provider: {
    laminar: {
      apiKey: process.env.LMNR_PROJECT_API_KEY,
      // teamId: process.env.LAMINAR_TEAM_ID, // Optional, for backwards compatibility
    },
  },
});

Datadog

Datadog APM provides application performance monitoring with distributed tracing. To send traces to Datadog via OTLP, you need the Datadog Agent running with OTLP ingestion enabled.

Datadog uses gRPC for OTLP ingestion, which requires explicit imports and bundler configuration to work correctly:

src/mastra/index.ts
// Explicitly import gRPC dependencies for the bundler
import "@grpc/grpc-js";
import "@opentelemetry/exporter-trace-otlp-grpc";
import { Mastra } from "@mastra/core";
import { Observability } from "@mastra/observability";
import { OtelExporter, type ExportProtocol } from "@mastra/otel-exporter";

export const mastra = new Mastra({
// Add grpc-js to externals so it's handled at runtime
bundler: {
externals: ["@grpc/grpc-js"],
},

observability: new Observability({
configs: {
default: {
serviceName: "my-service",
exporters: [
new OtelExporter({
provider: {
custom: {
endpoint:
process.env.OTEL_EXPORTER_OTLP_ENDPOINT ||
"http://localhost:4317",
protocol: (process.env.OTEL_EXPORTER_OTLP_PROTOCOL ||
"grpc") as ExportProtocol,
headers: {},
},
},
}),
],
},
},
}),
});
info

The Datadog Agent must be configured with OTLP ingestion enabled. Add the following to your datadog.yaml:

otlp_config:
  receiver:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317

The default OTLP endpoint is http://localhost:4317 when running the Datadog Agent locally.
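
If you would rather drive the endpoint and protocol through the environment variables referenced in the example configuration above, a minimal .env sketch (mirroring the defaults the code falls back to) could look like this:

.env
# Matches the defaults in the example configuration above
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
OTEL_EXPORTER_OTLP_PROTOCOL=grpc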

warning

The explicit imports of @grpc/grpc-js and @opentelemetry/exporter-trace-otlp-grpc at the top of the file, along with the bundler.externals configuration, are required for the gRPC transport to work correctly. Without these, you may encounter connection issues.

Custom/Generic OTEL Endpoints

For other OTEL-compatible platforms or custom collectors:

src/mastra/index.ts
new OtelExporter({
  provider: {
    custom: {
      endpoint: "https://your-collector.example.com/v1/traces",
      protocol: "http/protobuf", // 'http/json' | 'http/protobuf' | 'grpc'
      headers: {
        "x-api-key": process.env.API_KEY,
      },
    },
  },
});

Configuration Options

Complete Configuration

new OtelExporter({
  // Provider configuration (required)
  provider: {
    // Use one of: dash0, signoz, newrelic, traceloop, laminar, custom
  },

  // Export configuration
  timeout: 30000, // Export timeout in milliseconds
  batchSize: 100, // Number of spans per batch

  // Debug options
  logLevel: "info", // 'debug' | 'info' | 'warn' | 'error'
});

OpenTelemetry Semantic Conventions

The exporter follows OpenTelemetry Semantic Conventions for GenAI v1.38.0, ensuring compatibility with observability platforms:

Span Naming

  • LLM Operations: chat {model}
  • Tool Execution: execute_tool {tool_name}
  • Agent Runs: invoke_agent {agent_id}
  • Workflow Runs: invoke_workflow {workflow_id}

Key Attributes

  • gen_ai.operation.name - Operation type (chat, tool.execute, etc.)
  • gen_ai.provider.name - AI provider (openai, anthropic, etc.)
  • gen_ai.request.model - Model identifier
  • gen_ai.input.messages - Chat history provided to the model
  • gen_ai.output.messages - Messages returned by the model
  • gen_ai.usage.input_tokens - Number of input tokens
  • gen_ai.usage.output_tokens - Number of output tokens
  • gen_ai.request.temperature - Sampling temperature
  • gen_ai.response.finish_reasons - Completion reasons
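
To make these attributes concrete, a single chat span exported under these conventions might carry a set of attributes along the following lines; the values are purely illustrative (including the model name) and not taken from a real trace:

{
  "gen_ai.operation.name": "chat",
  "gen_ai.provider.name": "openai",
  "gen_ai.request.model": "gpt-4o",
  "gen_ai.request.temperature": 0.7,
  "gen_ai.usage.input_tokens": 42,
  "gen_ai.usage.output_tokens": 128,
  "gen_ai.response.finish_reasons": ["stop"]
}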

Protocol Selection Guide

Choose the right protocol package based on your provider:

| Provider | Protocol | Required Package |
| --- | --- | --- |
| Dash0 | gRPC | @opentelemetry/exporter-trace-otlp-grpc |
| Datadog | gRPC | @opentelemetry/exporter-trace-otlp-grpc |
| SigNoz | HTTP/Protobuf | @opentelemetry/exporter-trace-otlp-proto |
| New Relic | HTTP/Protobuf | @opentelemetry/exporter-trace-otlp-proto |
| Traceloop | HTTP/JSON | @opentelemetry/exporter-trace-otlp-http |
| Laminar | HTTP/Protobuf | @opentelemetry/exporter-trace-otlp-proto |
| Custom | Varies | Depends on your collector |
warning

Make sure to install the protocol package that matches your provider. If it is missing, the exporter raises a helpful error message telling you which package to install.

Troubleshooting

Missing Dependency Error

If you see an error like:

HTTP/Protobuf exporter is not installed (required for signoz).
To use HTTP/Protobuf export, install the required package:
npm install @opentelemetry/exporter-trace-otlp-proto

Install the suggested package for your provider.

Common Issues

  1. Wrong protocol package: Verify you installed the correct exporter for your provider
  2. Invalid endpoint: Check endpoint format matches provider requirements
  3. Authentication failures: Verify API keys and headers are correct

Using Tags

Tags help you categorize and filter traces in your observability platform. Add tags when executing agents or workflows:

const result = await agent.generate({
  messages: [{ role: "user", content: "Hello" }],
  tracingOptions: {
    tags: ["production", "experiment-v2", "user-request"],
  },
});

Tags are exported as a JSON string in the mastra.tags span attribute for broad backend compatibility. Common use cases include:

  • Environment labels: "production", "staging"
  • Experiment tracking: "experiment-v1", "control-group"
  • Priority levels: "priority-high", "batch-job"
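
For reference, the three tags from the agent example above are serialized into a single JSON string on each exported span, roughly:

mastra.tags = '["production","experiment-v2","user-request"]'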