Basic AI Tracing Example
This example demonstrates how to set up basic AI tracing in a Mastra application with automatic instrumentation for agents and workflows.
Prerequisites
- Mastra v0.14.0 or higher
- Node.js 18+
- A configured storage backend (libsql or memory)
Setup
1. Install Dependencies
npm install @mastra/core @mastra/libsql
If you want to view traces in Mastra Cloud, create a .env
file containing your access token (no shell `export` keyword in a .env file):
MASTRA_CLOUD_ACCESS_TOKEN=your_token_here
2. Configure Mastra with Default Tracing
Create your Mastra configuration with AI tracing enabled:
src/mastra/index.ts
import { Mastra } from "@mastra/core";
import { LibSQLStore } from "@mastra/libsql";
import { exampleAgent } from "./agents/example-agent";

export const mastra = new Mastra({
  // Register the agent (created in step 3) so it can be looked up later
  agents: { exampleAgent },
  // Configure storage (required for DefaultExporter)
  storage: new LibSQLStore({
    url: "file:local.db",
  }),
  // Enable AI tracing with default configuration
  observability: {
    default: { enabled: true },
  },
});
This default configuration automatically includes:
- DefaultExporter - Persists traces to your storage
- CloudExporter - Sends to Mastra Cloud (if token is set)
- SensitiveDataFilter - Redacts sensitive fields
- 100% sampling - All traces are collected
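To get an intuition for what the SensitiveDataFilter does, here is a rough, self-contained sketch of the idea: recursively walk span input/output and replace values whose keys look sensitive. The key list and function names below are illustrative, not Mastra's actual implementation.

```typescript
// Illustrative only: mimics the idea behind SensitiveDataFilter,
// not Mastra's actual field list or redaction logic.
const SENSITIVE_KEYS = ["apikey", "password", "token", "secret", "authorization"];

function redact(value: unknown): unknown {
  if (Array.isArray(value)) return value.map(redact);
  if (value && typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value as Record<string, unknown>).map(([key, v]) =>
        SENSITIVE_KEYS.includes(key.toLowerCase())
          ? [key, "[REDACTED]"] // hide the sensitive value, keep the key
          : [key, redact(v)],   // recurse into nested objects
      ),
    );
  }
  return value;
}
```

A payload like `{ apiKey: "sk-1", user: { password: "p", name: "a" } }` would come out with `apiKey` and `password` replaced by `"[REDACTED]"` while `name` is preserved.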
3. Create an Agent with Automatic Tracing
src/mastra/agents/example-agent.ts
import { Agent } from "@mastra/core/agent";
import { createTool } from "@mastra/core/tools";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

// Create a tool using createTool
const getCurrentTime = createTool({
  id: "getCurrentTime",
  description: "Get the current time",
  inputSchema: z.object({}),
  execute: async () => {
    // Tool calls are automatically traced
    return { time: new Date().toISOString() };
  },
});
export const exampleAgent = new Agent({
  name: "example-agent",
  instructions: "You are a helpful AI assistant.",
  model: openai("gpt-4"),
  tools: {
    getCurrentTime,
  },
});
4. Execute and View Traces
src/example.ts
import { mastra } from "./mastra";

async function main() {
  // Get the agent by the key it was registered under
  const agent = mastra.getAgent("exampleAgent");

  // Execute agent - automatically creates traces
  const result = await agent.generate("What time is it?");

  console.log("Agent response:", result.text);
  console.log("Trace ID:", result.traceId);
  console.log("View trace at: http://localhost:3000/traces/" + result.traceId);
}

main().catch(console.error);
What Gets Traced
When you run this example, Mastra automatically creates spans for:
- AGENT_RUN - The complete agent execution
- LLM_GENERATION - The model execution inside the agent
- TOOL_CALL - Tool executions
Example trace hierarchy:
AGENT_RUN (example-agent)
├── LLM_GENERATION (gpt-4) - Model input & output
└── TOOL_CALL (getCurrentTime) - Tool execution
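The hierarchy above is just a tree of spans. As a small self-contained illustration (the `Span` shape here is hypothetical, not Mastra's actual span interface), a tree like that can be rendered with a few lines of recursion:

```typescript
// Hypothetical span shape for illustration; Mastra's real span
// interface carries more fields (timing, attributes, metadata, ...).
interface Span {
  type: string;
  name: string;
  children?: Span[];
}

// Render a span tree in the same ASCII style as the example above.
function renderTrace(span: Span): string[] {
  const out = [`${span.type} (${span.name})`];
  const kids = span.children ?? [];
  kids.forEach((kid, i) => {
    const last = i === kids.length - 1;
    const sub = renderTrace(kid);
    out.push((last ? "└── " : "├── ") + sub[0]);
    for (const line of sub.slice(1)) {
      out.push((last ? "    " : "│   ") + line); // indent grandchildren
    }
  });
  return out;
}
```

Feeding it an `AGENT_RUN` root with `LLM_GENERATION` and `TOOL_CALL` children reproduces the diagram shown above.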
Viewing Traces
In the Playground or in Mastra Cloud, go to the Observability page and click on a trace. You can also click on individual spans to inspect their input, output, attributes, and metadata.
Adding Custom Metadata
Enhance your traces with custom metadata:
src/example-with-metadata.ts
const result = await agent.generate("What time is it?", {
  // Add custom metadata to traces
  metadata: {
    userId: "user_123",
    sessionId: "session_abc",
    feature: "time-query",
    environment: "development",
  },
});
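One reason to attach metadata like this is that it lets you slice stored traces by user, session, or environment later. As a self-contained sketch of that idea (the `TraceRecord` shape and `filterByMetadata` helper are hypothetical, not a Mastra API):

```typescript
// Hypothetical record shape for illustration; real stored traces
// carry spans, timing, and more alongside the metadata.
interface TraceRecord {
  traceId: string;
  metadata: Record<string, string>;
}

// Keep only traces whose metadata matches every requested key/value pair.
function filterByMetadata(
  traces: TraceRecord[],
  match: Record<string, string>,
): TraceRecord[] {
  return traces.filter((t) =>
    Object.entries(match).every(([key, value]) => t.metadata[key] === value),
  );
}
```

With the metadata from the example above, `filterByMetadata(traces, { userId: "user_123" })` would return only that user's traces.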
Related
Documentation
- AI Tracing Overview - Complete guide to tracing
- Sensitive Data Filter - Redact sensitive information
- Configuration Patterns - Best practices
Reference
- Configuration - ObservabilityConfig API
- Exporters - DefaultExporter details
- Span Types - Span interfaces and methods
- AITracing Classes - Core tracing classes