# OpenTelemetry Exporter

The OpenTelemetry (OTEL) exporter sends your traces to any OTEL-compatible observability platform using standardized [OpenTelemetry Semantic Conventions for GenAI](https://opentelemetry.io/docs/specs/semconv/gen-ai/). This ensures broad compatibility with platforms like Datadog, New Relic, SigNoz, MLflow, Dash0, Traceloop, Laminar, and more.

If you have existing OpenTelemetry instrumentation and want Mastra traces to inherit context from active OTEL spans, see the [OpenTelemetry Bridge](https://mastra.ai/docs/observability/tracing/bridges/otel/llms.txt) instead.

## Installation

Each provider requires specific protocol packages. Install the base exporter plus the protocol package for your provider:

### For HTTP/Protobuf Providers (SigNoz, New Relic, Laminar, MLflow)

```bash
npm install @mastra/otel-exporter@latest @opentelemetry/exporter-trace-otlp-proto
```

```bash
pnpm add @mastra/otel-exporter@latest @opentelemetry/exporter-trace-otlp-proto
```

```bash
yarn add @mastra/otel-exporter@latest @opentelemetry/exporter-trace-otlp-proto
```

```bash
bun add @mastra/otel-exporter@latest @opentelemetry/exporter-trace-otlp-proto
```

### For gRPC Providers (Dash0, Datadog)

```bash
npm install @mastra/otel-exporter@latest @opentelemetry/exporter-trace-otlp-grpc @grpc/grpc-js
```

```bash
pnpm add @mastra/otel-exporter@latest @opentelemetry/exporter-trace-otlp-grpc @grpc/grpc-js
```

```bash
yarn add @mastra/otel-exporter@latest @opentelemetry/exporter-trace-otlp-grpc @grpc/grpc-js
```

```bash
bun add @mastra/otel-exporter@latest @opentelemetry/exporter-trace-otlp-grpc @grpc/grpc-js
```

### For HTTP/JSON Providers (Traceloop)

```bash
npm install @mastra/otel-exporter@latest @opentelemetry/exporter-trace-otlp-http
```

```bash
pnpm add @mastra/otel-exporter@latest @opentelemetry/exporter-trace-otlp-http
```

```bash
yarn add @mastra/otel-exporter@latest @opentelemetry/exporter-trace-otlp-http
```

```bash
bun add @mastra/otel-exporter@latest @opentelemetry/exporter-trace-otlp-http
```

## Environment Variables

All providers support zero-config setup via environment variables. Set the appropriate variables and the exporter will automatically use them:

| Provider  | Environment Variables                                                                                  |
| --------- | ------------------------------------------------------------------------------------------------------ |
| Dash0     | `DASH0_API_KEY` (required), `DASH0_ENDPOINT` (required), `DASH0_DATASET` (optional)                     |
| SigNoz    | `SIGNOZ_API_KEY` (required), `SIGNOZ_REGION` (optional), `SIGNOZ_ENDPOINT` (optional)                   |
| New Relic | `NEW_RELIC_LICENSE_KEY` (required), `NEW_RELIC_ENDPOINT` (optional)                                     |
| Traceloop | `TRACELOOP_API_KEY` (required), `TRACELOOP_DESTINATION_ID` (optional), `TRACELOOP_ENDPOINT` (optional)  |
| Laminar   | `LMNR_PROJECT_API_KEY` (required), `LAMINAR_ENDPOINT` (optional)                                        |

## Provider Configurations

### MLflow

[MLflow](https://mlflow.org/docs/latest/genai/tracing/integrations/listing/mastra) supports native Mastra tracing through its OTLP endpoint at `/v1/traces`. Use the `custom` provider with HTTP/Protobuf and include the experiment header so traces land in the correct MLflow experiment:

```typescript
new OtelExporter({
  provider: {
    custom: {
      endpoint: `${process.env.MLFLOW_TRACKING_URI}/v1/traces`,
      protocol: "http/protobuf",
      headers: {
        "x-mlflow-experiment-id": process.env.MLFLOW_EXPERIMENT_ID,
      },
    },
  },
})
```

### Dash0

[Dash0](https://www.dash0.com/) provides real-time observability with automatic insights.
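Dash0 (like Datadog) ingests over gRPC, so both `@opentelemetry/exporter-trace-otlp-grpc` and `@grpc/grpc-js` from the Installation section must be present. If you want to sanity-check that the packages resolve before wiring up the exporter, a small sketch using Node's module resolution (`isInstalled` is a hypothetical helper, not part of the exporter API):

```typescript
import { createRequire } from "node:module";

const require = createRequire(import.meta.url);

// Hypothetical helper: true if the package resolves from this project.
function isInstalled(pkg: string): boolean {
  try {
    require.resolve(pkg);
    return true;
  } catch {
    return false;
  }
}

// Both packages must resolve for gRPC providers such as Dash0 and Datadog.
console.log(isInstalled("@opentelemetry/exporter-trace-otlp-grpc"));
console.log(isInstalled("@grpc/grpc-js"));
```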
#### Zero-Config Setup

Set environment variables and use the exporter with an empty config:

```bash
# Required
DASH0_API_KEY=your-api-key
DASH0_ENDPOINT=ingress.us-west-2.aws.dash0.com:4317

# Optional
DASH0_DATASET=production
```

```typescript
import { Mastra } from "@mastra/core";
import { Observability } from "@mastra/observability";
import { OtelExporter } from "@mastra/otel-exporter";

export const mastra = new Mastra({
  observability: new Observability({
    configs: {
      otel: {
        serviceName: "my-service",
        exporters: [new OtelExporter({ provider: { dash0: {} } })],
      },
    },
  }),
});
```

#### Explicit Configuration

```typescript
import { Mastra } from "@mastra/core";
import { Observability } from "@mastra/observability";
import { OtelExporter } from "@mastra/otel-exporter";

export const mastra = new Mastra({
  observability: new Observability({
    configs: {
      otel: {
        serviceName: "my-service",
        exporters: [
          new OtelExporter({
            provider: {
              dash0: {
                apiKey: process.env.DASH0_API_KEY,
                endpoint: process.env.DASH0_ENDPOINT, // e.g., 'ingress.us-west-2.aws.dash0.com:4317'
                dataset: "production", // Optional dataset name
              },
            },
            resourceAttributes: {
              // Optional OpenTelemetry Resource Attributes for the trace
              ["deployment.environment"]: "dev",
            },
          }),
        ],
      },
    },
  }),
});
```

Get your Dash0 endpoint from your dashboard. It should be in the format `ingress.{region}.aws.dash0.com:4317`.

### SigNoz

[SigNoz](https://signoz.io/) is an open-source APM alternative with built-in tracing support.
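For SigNoz Cloud, the exporter derives the ingestion endpoint from the `region` setting for you. If you need the URL elsewhere (for a collector config or a curl smoke test), the pattern can be sketched as below; the `ingest.{region}.signoz.cloud` hostname is an assumption based on SigNoz Cloud's regional ingestion URLs, so verify it against your account settings:

```typescript
// Hypothetical helper: build the SigNoz Cloud OTLP/HTTP traces URL
// from a region code. The hostname pattern is an assumption — confirm
// it in your SigNoz Cloud ingestion settings.
type SignozRegion = "us" | "eu" | "in";

function signozTracesEndpoint(region: SignozRegion): string {
  return `https://ingest.${region}.signoz.cloud:443/v1/traces`;
}

console.log(signozTracesEndpoint("eu"));
```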
#### Zero-Config Setup

```bash
# Required
SIGNOZ_API_KEY=your-api-key

# Optional
SIGNOZ_REGION=us # 'us' | 'eu' | 'in'
SIGNOZ_ENDPOINT=https://my-signoz.example.com # For self-hosted
```

```typescript
new OtelExporter({ provider: { signoz: {} } })
```

#### Explicit Configuration

```typescript
new OtelExporter({
  provider: {
    signoz: {
      apiKey: process.env.SIGNOZ_API_KEY,
      region: "us", // 'us' | 'eu' | 'in'
      // endpoint: 'https://my-signoz.example.com', // For self-hosted
    },
  },
});
```

### New Relic

[New Relic](https://newrelic.com/) provides comprehensive observability with AI monitoring capabilities.

#### Zero-Config Setup

```bash
# Required
NEW_RELIC_LICENSE_KEY=your-license-key

# Optional
NEW_RELIC_ENDPOINT=https://otlp.eu01.nr-data.net # For EU region
```

```typescript
new OtelExporter({ provider: { newrelic: {} } })
```

#### Explicit Configuration

```typescript
new OtelExporter({
  provider: {
    newrelic: {
      apiKey: process.env.NEW_RELIC_LICENSE_KEY,
      // endpoint: 'https://otlp.eu01.nr-data.net', // For EU region
    },
  },
});
```

### Traceloop

[Traceloop](https://www.traceloop.com/) specializes in LLM observability with automatic prompt tracking.

#### Zero-Config Setup

```bash
# Required
TRACELOOP_API_KEY=your-api-key

# Optional
TRACELOOP_DESTINATION_ID=my-destination
TRACELOOP_ENDPOINT=https://custom.traceloop.com
```

```typescript
new OtelExporter({ provider: { traceloop: {} } })
```

#### Explicit Configuration

```typescript
new OtelExporter({
  provider: {
    traceloop: {
      apiKey: process.env.TRACELOOP_API_KEY,
      destinationId: "my-destination", // Optional
    },
  },
});
```

### Laminar

[Laminar](https://laminar.sh/) provides specialized LLM observability and analytics.
#### Zero-Config Setup

```bash
# Required
LMNR_PROJECT_API_KEY=your-api-key

# Optional
LAMINAR_ENDPOINT=https://api.lmnr.ai/v1/traces
```

```typescript
new OtelExporter({ provider: { laminar: {} } })
```

#### Explicit Configuration

```typescript
new OtelExporter({
  provider: {
    laminar: {
      apiKey: process.env.LMNR_PROJECT_API_KEY,
    },
  },
});
```

For Laminar-specific features like native span paths, metadata, and tags rendering in the Laminar dashboard, consider using the dedicated [`@mastra/laminar`](https://mastra.ai/docs/observability/tracing/exporters/laminar/llms.txt) exporter instead. It provides optimized integration with Laminar's platform.

### Datadog

[Datadog](https://www.datadoghq.com/) APM provides application performance monitoring with distributed tracing. To send traces to Datadog via OTLP, you need the Datadog Agent running with OTLP ingestion enabled.

Datadog uses gRPC for OTLP ingestion, which requires explicit imports and [bundler configuration](https://mastra.ai/reference/configuration/llms.txt) to work correctly:

```typescript
// Explicitly import gRPC dependencies for the bundler
import "@grpc/grpc-js";
import "@opentelemetry/exporter-trace-otlp-grpc";

import { Mastra } from "@mastra/core";
import { Observability } from "@mastra/observability";
import { OtelExporter, type ExportProtocol } from "@mastra/otel-exporter";

export const mastra = new Mastra({
  // Add grpc-js to externals so it's handled at runtime
  bundler: {
    externals: ["@grpc/grpc-js"],
  },
  observability: new Observability({
    configs: {
      default: {
        serviceName: "my-service",
        exporters: [
          new OtelExporter({
            provider: {
              custom: {
                endpoint: process.env.OTEL_EXPORTER_OTLP_ENDPOINT || "http://localhost:4317",
                protocol: (process.env.OTEL_EXPORTER_OTLP_PROTOCOL || "grpc") as ExportProtocol,
                headers: {},
              },
            },
          }),
        ],
      },
    },
  }),
});
```

The Datadog Agent must be configured with OTLP ingestion enabled.
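If the Agent runs as a container, OTLP ingestion is typically enabled through environment variables rather than by editing the config file. A sketch of that route; the `DD_`-prefixed variable name below is an assumption based on Datadog's standard mapping of `datadog.yaml` keys to environment variables, so verify it against the Agent documentation:

```shell
# Hypothetical container setup: otlp_config.receiver.protocols.grpc.endpoint
# expressed in its assumed DD_ environment-variable form.
docker run -d \
  -e DD_API_KEY=your-datadog-api-key \
  -e DD_OTLP_CONFIG_RECEIVER_PROTOCOLS_GRPC_ENDPOINT=0.0.0.0:4317 \
  -p 4317:4317 \
  gcr.io/datadoghq/agent:latest
```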
Add the following to your `datadog.yaml`:

```yaml
otlp_config:
  receiver:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
```

The default OTLP endpoint is `http://localhost:4317` when running the Datadog Agent locally.

The explicit imports of `@grpc/grpc-js` and `@opentelemetry/exporter-trace-otlp-grpc` at the top of the file, along with the [`bundler.externals`](https://mastra.ai/reference/configuration/llms.txt) configuration, are required for the gRPC transport to work correctly. Without them, you may encounter connection issues.

For Datadog-specific features like automatic span type mapping, LLM span categorization, and simplified setup without gRPC configuration, consider using the dedicated [`@mastra/datadog`](https://mastra.ai/docs/observability/tracing/exporters/datadog/llms.txt) exporter instead. It provides optimized integration with Datadog's APM platform.

### Custom/Generic OTEL Endpoints

For other OTEL-compatible platforms or custom collectors:

```typescript
new OtelExporter({
  provider: {
    custom: {
      endpoint: "https://your-collector.example.com/v1/traces",
      protocol: "http/protobuf", // 'http/json' | 'http/protobuf' | 'grpc'
      headers: {
        "x-api-key": process.env.API_KEY,
      },
    },
  },
});
```

## Configuration Options

### Complete Configuration

```typescript
new OtelExporter({
  // Provider configuration (required)
  provider: {
    // Use one of: dash0, signoz, newrelic, traceloop, laminar, custom
  },

  // Export configuration
  timeout: 30000, // Export timeout in milliseconds
  batchSize: 100, // Number of spans per batch

  // Debug options
  logLevel: "info", // 'debug' | 'info' | 'warn' | 'error'
});
```

## OpenTelemetry Semantic Conventions

The exporter follows [OpenTelemetry Semantic Conventions for GenAI v1.38.0](https://github.com/open-telemetry/semantic-conventions/tree/v1.38.0/docs/gen-ai), ensuring compatibility with observability platforms:

### Span Naming

- **LLM Operations**: `chat {model}`
- **Tool Execution**: `execute_tool {tool_name}`
- **Agent Runs**: `invoke_agent {agent_id}`
- **Workflow Runs**: `invoke_workflow {workflow_id}`

### Key Attributes

- `gen_ai.operation.name` - Operation type (chat, tool.execute, etc.)
- `gen_ai.provider.name` - AI provider (openai, anthropic, etc.)
- `gen_ai.request.model` - Model identifier
- `gen_ai.input.messages` - Chat history provided to the model
- `gen_ai.output.messages` - Messages returned by the model
- `gen_ai.usage.input_tokens` - Number of input tokens
- `gen_ai.usage.output_tokens` - Number of output tokens
- `gen_ai.request.temperature` - Sampling temperature
- `gen_ai.response.finish_reasons` - Completion reasons

## Protocol Selection Guide

Choose the right protocol package based on your provider:

| Provider  | Protocol      | Required Package                           |
| --------- | ------------- | ------------------------------------------ |
| Dash0     | gRPC          | `@opentelemetry/exporter-trace-otlp-grpc`  |
| Datadog   | gRPC          | `@opentelemetry/exporter-trace-otlp-grpc`  |
| SigNoz    | HTTP/Protobuf | `@opentelemetry/exporter-trace-otlp-proto` |
| New Relic | HTTP/Protobuf | `@opentelemetry/exporter-trace-otlp-proto` |
| Traceloop | HTTP/JSON     | `@opentelemetry/exporter-trace-otlp-http`  |
| Laminar   | HTTP/Protobuf | `@opentelemetry/exporter-trace-otlp-proto` |
| Custom    | Varies        | Depends on your collector                  |

Make sure to install the correct protocol package for your provider. The exporter will raise a helpful error message if the required package is missing.

## Troubleshooting

### Missing Dependency Error

If you see an error like:

```text
HTTP/Protobuf exporter is not installed (required for signoz).
To use HTTP/Protobuf export, install the required package:
  npm install @opentelemetry/exporter-trace-otlp-proto
```

Install the suggested package for your provider.

### Common Issues

1. **Wrong protocol package**: Verify you installed the correct exporter for your provider
2. **Invalid endpoint**: Check that the endpoint format matches your provider's requirements
3. **Authentication failures**: Verify API keys and headers are correct

## Related

- [Tracing Overview](https://mastra.ai/docs/observability/tracing/overview/llms.txt)
- [OpenTelemetry Bridge](https://mastra.ai/docs/observability/tracing/bridges/otel/llms.txt)
- [OpenTelemetry Semantic Conventions for GenAI v1.38.0](https://github.com/open-telemetry/semantic-conventions/tree/v1.38.0/docs/gen-ai)
- [OTEL Exporter Reference](https://mastra.ai/reference/observability/tracing/exporters/otel/llms.txt)