# Arize Exporter
Arize provides observability platforms for AI applications through Phoenix (open-source) and Arize AX (enterprise). The Arize exporter sends AI traces over OpenTelemetry using the OpenInference semantic conventions, so it works with any OTLP-compatible platform that understands OpenInference.
## When to Use Arize
Arize is ideal when you need:
- **OpenInference standards** - Industry-standard semantic conventions for AI traces
- **Flexible deployment** - Self-hosted Phoenix or managed Arize AX
- **OpenTelemetry compatibility** - Works with any OTLP-compatible platform
- **Comprehensive AI observability** - LLM traces, embeddings, and retrieval analytics
- **Open-source option** - Full-featured local deployment with Phoenix
## Installation

```bash
npm install @mastra/arize
```

## Configuration
### Phoenix Setup
Phoenix is an open-source observability platform that can be self-hosted or used via Phoenix Cloud.
#### Prerequisites
- **Phoenix Instance**: Deploy using Docker or sign up at Phoenix Cloud
- **Endpoint**: Your Phoenix endpoint URL (ends in `/v1/traces`)
- **API Key**: Optional for unauthenticated instances, required for Phoenix Cloud
- **Environment Variables**: Set your configuration
```bash
PHOENIX_ENDPOINT=http://localhost:6006/v1/traces # Or your Phoenix Cloud URL
PHOENIX_API_KEY=your-api-key                     # Optional for local instances
PHOENIX_PROJECT_NAME=mastra-service              # Optional, defaults to 'mastra-service'
```

#### Basic Setup
```typescript
import { Mastra } from "@mastra/core";
import { ArizeExporter } from "@mastra/arize";

export const mastra = new Mastra({
  observability: {
    configs: {
      arize: {
        serviceName: process.env.PHOENIX_PROJECT_NAME || 'mastra-service',
        exporters: [
          new ArizeExporter({
            endpoint: process.env.PHOENIX_ENDPOINT!,
            apiKey: process.env.PHOENIX_API_KEY,
            projectName: process.env.PHOENIX_PROJECT_NAME,
          }),
        ],
      },
    },
  },
});
```

#### Quick Start with Docker
Test locally with an in-memory Phoenix instance:
```bash
docker run --pull=always -d --name arize-phoenix -p 6006:6006 \
  -e PHOENIX_SQL_DATABASE_URL="sqlite:///:memory:" \
  arizephoenix/phoenix:latest
```

Set `PHOENIX_ENDPOINT=http://localhost:6006/v1/traces` and run your Mastra agent to see traces at `localhost:6006`.
### Arize AX Setup
Arize AX is an enterprise observability platform with advanced features for production AI systems.
#### Prerequisites
- **Arize AX Account**: Sign up at app.arize.com
- **Space ID**: Your organization's space identifier
- **API Key**: Generate in Arize AX settings
- **Environment Variables**: Set your credentials
```bash
ARIZE_SPACE_ID=your-space-id
ARIZE_API_KEY=your-api-key
ARIZE_PROJECT_NAME=mastra-service # Optional
```

#### Basic Setup
```typescript
import { Mastra } from "@mastra/core";
import { ArizeExporter } from "@mastra/arize";

export const mastra = new Mastra({
  observability: {
    configs: {
      arize: {
        serviceName: process.env.ARIZE_PROJECT_NAME || 'mastra-service',
        exporters: [
          new ArizeExporter({
            apiKey: process.env.ARIZE_API_KEY!,
            spaceId: process.env.ARIZE_SPACE_ID!,
            projectName: process.env.ARIZE_PROJECT_NAME,
          }),
        ],
      },
    },
  },
});
```

## Configuration Options
The Arize exporter supports advanced configuration for fine-tuning OpenTelemetry behavior:
### Complete Configuration
```typescript
new ArizeExporter({
  // Phoenix Configuration
  endpoint: 'https://your-collector.example.com/v1/traces', // Required for Phoenix

  // Arize AX Configuration
  spaceId: 'your-space-id', // Required for Arize AX

  // Shared Configuration
  apiKey: 'your-api-key', // Required for authenticated endpoints
  projectName: 'mastra-service', // Optional project name

  // Optional OTLP settings
  headers: {
    'x-custom-header': 'value', // Additional headers for OTLP requests
  },

  // Debug and performance tuning
  logLevel: 'debug', // Logging: debug | info | warn | error
  batchSize: 512, // Batch size before exporting spans
  timeout: 30000, // Timeout in ms before exporting spans

  // Custom resource attributes
  resourceAttributes: {
    'deployment.environment': process.env.NODE_ENV,
    'service.version': process.env.APP_VERSION,
  },
})
```

### Batch Processing Options
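The setup examples above use TypeScript's non-null assertion (`!`) on required environment variables, which defers failures to the first export. A small helper that validates configuration up front can surface misconfiguration earlier; this is a hypothetical utility, not part of `@mastra/arize`:

```typescript
// Hypothetical helper (not part of @mastra/arize): fail fast when a
// required environment variable is missing, instead of relying on `!`.
function requireEnv(
  env: Record<string, string | undefined>,
  name: string,
): string {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage sketch:
// const endpoint = requireEnv(process.env, "PHOENIX_ENDPOINT");
```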
Control how traces are batched and exported:
```typescript
new ArizeExporter({
  endpoint: process.env.PHOENIX_ENDPOINT!,
  apiKey: process.env.PHOENIX_API_KEY,

  // Batch processing configuration
  batchSize: 512, // Number of spans to batch (default: 512)
  timeout: 30000, // Max time in ms to wait before export (default: 30000)
})
```

### Resource Attributes
Add custom attributes to all exported spans:
```typescript
new ArizeExporter({
  endpoint: process.env.PHOENIX_ENDPOINT!,
  resourceAttributes: {
    'deployment.environment': process.env.NODE_ENV,
    'service.namespace': 'production',
    'service.instance.id': process.env.HOSTNAME,
    'custom.attribute': 'value',
  },
})
```

## OpenInference Semantic Conventions
This exporter implements the OpenInference Semantic Conventions for generative AI applications, providing standardized trace structure across different observability platforms.
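For a sense of what that standardization looks like, here is a sketch of attributes an LLM span might carry under the OpenInference conventions. The attribute names (`openinference.span.kind`, `llm.model_name`, `input.value`, `output.value`, token counts) come from the conventions; the values are purely illustrative:

```typescript
// Example span attributes following OpenInference naming (values illustrative).
const llmSpanAttributes: Record<string, string | number> = {
  "openinference.span.kind": "LLM",        // span kind: LLM, CHAIN, RETRIEVER, ...
  "llm.model_name": "gpt-4o-mini",         // model that handled the call
  "input.value": "What is observability?", // prompt sent to the model
  "output.value": "Observability is ...",  // model response
  "llm.token_count.prompt": 12,            // tokens in the prompt
  "llm.token_count.completion": 48,        // tokens in the completion
};
```

Because any OpenInference-aware backend understands these names, the same traces render meaningfully in Phoenix, Arize AX, or other compatible platforms.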