ArizeExporter
Sends AI tracing data to Arize Phoenix, Arize AX, or any OpenTelemetry-compatible observability platform that supports OpenInference semantic conventions.
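For example, because the exporter only needs an OTLP endpoint, it can also target a self-hosted, OpenInference-compatible collector. The sketch below uses only fields documented in ArizeExporterConfig; the endpoint URL, header name, and environment variable are placeholders, not values required by the library.

import { ArizeExporter } from '@mastra/arize';

// Placeholder endpoint and auth header for a self-hosted collector.
const exporter = new ArizeExporter({
  endpoint: 'https://otel-collector.example.com/v1/traces',
  headers: { Authorization: `Bearer ${process.env.COLLECTOR_API_KEY}` },
  projectName: 'my-ai-project',
});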
Constructor
new ArizeExporter(config: ArizeExporterConfig)

ArizeExporterConfig
interface ArizeExporterConfig {
// Phoenix / OpenTelemetry configuration
endpoint?: string;
apiKey?: string;
// Arize AX configuration
spaceId?: string;
// Common configuration
projectName?: string;
headers?: Record<string, string>;
// Inherited from OtelExporterConfig
timeout?: number;
batchSize?: number;
logLevel?: 'debug' | 'info' | 'warn' | 'error';
resourceAttributes?: Record<string, any>;
}

Methods
exportEvent
async exportEvent(event: AITracingEvent): Promise<void>

Exports a tracing event to the configured endpoint.
export
async export(spans: ReadOnlyAISpan[]): Promise<void>

Batch exports spans using OpenTelemetry with OpenInference semantic conventions.
shutdown
async shutdown(): Promise<void>

Flushes pending data and shuts down the client.
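One common pattern (an illustration, not something the library prescribes) is to flush on a shutdown signal so buffered spans are not lost when the process exits:

// Flush pending spans before exiting.
process.on('SIGTERM', async () => {
  await exporter.shutdown();
  process.exit(0);
});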
Usage
Phoenix Configuration
import { ArizeExporter } from '@mastra/arize';
const exporter = new ArizeExporter({
endpoint: 'http://localhost:6006/v1/traces',
apiKey: process.env.PHOENIX_API_KEY, // Optional for local Phoenix
projectName: 'my-ai-project',
});

Arize AX Configuration
import { ArizeExporter } from '@mastra/arize';
const exporter = new ArizeExporter({
spaceId: process.env.ARIZE_SPACE_ID!,
apiKey: process.env.ARIZE_API_KEY!,
projectName: 'my-ai-project',
});

OpenInference Semantic Conventions
The ArizeExporter implements the OpenInference Semantic Conventions for generative AI applications, providing a standardized trace structure across different observability platforms.
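For illustration, an exported LLM span carries attributes along these lines. The keys shown are a representative subset of the OpenInference conventions; the values are made up, and the exact set emitted depends on the span being exported.

// Illustrative OpenInference attributes on an exported LLM span
// (example values, not output captured from the exporter).
const exampleAttributes = {
  'openinference.span.kind': 'LLM',
  'llm.model_name': 'gpt-4o-mini',
  'llm.token_count.prompt': 412,
  'llm.token_count.completion': 96,
  'input.value': 'Summarize the attached report.',
  'output.value': 'The report covers Q3 revenue growth...',
};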