LangSmith Exporter
LangSmith is LangChain's platform for monitoring and evaluating LLM applications. The LangSmith exporter sends your traces to LangSmith, providing insights into model performance, debugging capabilities, and evaluation workflows.
Installation
- npm: npm install @mastra/langsmith@latest
- pnpm: pnpm add @mastra/langsmith@latest
- Yarn: yarn add @mastra/langsmith@latest
- Bun: bun add @mastra/langsmith@latest
Configuration
Prerequisites
- LangSmith Account: Sign up at smith.langchain.com
- API Key: Generate an API key in LangSmith Settings → API Keys
- Environment Variables: Set your credentials
# Required
LANGSMITH_API_KEY=ls-xxxxxxxxxxxx
# Optional
LANGCHAIN_PROJECT=my-project # Default project for traces
LANGSMITH_BASE_URL=https://api.smith.langchain.com # For self-hosted
Zero-Config Setup
With environment variables set, use the exporter with no configuration:
import { Mastra } from "@mastra/core";
import { Observability } from "@mastra/observability";
import { LangSmithExporter } from "@mastra/langsmith";

export const mastra = new Mastra({
  observability: new Observability({
    configs: {
      langsmith: {
        serviceName: "my-service",
        exporters: [new LangSmithExporter()],
      },
    },
  }),
});
Explicit Configuration
You can also pass credentials directly (explicit values take precedence over environment variables):
import { Mastra } from "@mastra/core";
import { Observability } from "@mastra/observability";
import { LangSmithExporter } from "@mastra/langsmith";

export const mastra = new Mastra({
  observability: new Observability({
    configs: {
      langsmith: {
        serviceName: "my-service",
        exporters: [
          new LangSmithExporter({
            apiKey: process.env.LANGSMITH_API_KEY,
          }),
        ],
      },
    },
  }),
});
Configuration Options
Complete Configuration
new LangSmithExporter({
  // Required credentials
  apiKey: process.env.LANGSMITH_API_KEY!,

  // Optional settings
  apiUrl: process.env.LANGSMITH_BASE_URL, // Default: https://api.smith.langchain.com
  projectName: "my-project", // Project to send traces to (overrides LANGCHAIN_PROJECT env var)
  callerOptions: {
    // HTTP client options
    timeout: 30000, // Request timeout in ms
    maxRetries: 3, // Retry attempts
  },
  logLevel: "info", // Diagnostic logging: debug | info | warn | error

  // LangSmith-specific options
  hideInputs: false, // Hide input data in UI
  hideOutputs: false, // Hide output data in UI
});
Environment Variables
| Variable | Description |
|---|---|
| LANGSMITH_API_KEY | Your LangSmith API key (required) |
| LANGCHAIN_PROJECT | Default project name for traces (optional, defaults to "default") |
| LANGSMITH_BASE_URL | API URL for self-hosted instances (optional) |
The projectName config option takes precedence over the LANGCHAIN_PROJECT environment variable, allowing you to programmatically route traces to different projects.
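For example, a minimal sketch of this precedence, assuming LANGCHAIN_PROJECT is also set in the environment; the project name used here is hypothetical:

import { LangSmithExporter } from "@mastra/langsmith";

// Even if LANGCHAIN_PROJECT is set (e.g. to "default"), the explicit
// projectName below wins, so traces from this exporter land in
// "billing-service-traces" (a hypothetical project name).
new LangSmithExporter({
  apiKey: process.env.LANGSMITH_API_KEY!,
  projectName: "billing-service-traces",
});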
Dynamic Configuration
You can dynamically override LangSmith settings per-span using withLangsmithMetadata. This is useful for routing traces to different projects based on runtime conditions (e.g., customer, environment, or feature).
Using the Helper
Use withLangsmithMetadata with buildTracingOptions to set LangSmith-specific options:
import { Agent } from "@mastra/core/agent";
import { buildTracingOptions } from "@mastra/observability";
import { withLangsmithMetadata } from "@mastra/langsmith";

export const supportAgent = new Agent({
  id: "support-agent",
  name: "support-agent",
  instructions: "You are a helpful support agent.",
  model: "openai/gpt-4o",
  defaultOptions: {
    tracingOptions: buildTracingOptions(
      withLangsmithMetadata({ projectName: "customer-support" }),
    ),
  },
});
Dynamic Project Routing
Use requestContext to route traces to different projects based on runtime conditions.
import { Agent } from "@mastra/core/agent";
import { buildTracingOptions } from "@mastra/observability";
import { withLangsmithMetadata } from "@mastra/langsmith";

export const supportAgent = new Agent({
  id: "support-agent",
  name: "support-agent",
  instructions: "You are a helpful support agent.",
  model: "openai/gpt-4o",
  defaultOptions: ({ requestContext }) => {
    const userTier = requestContext?.get("user-tier") as string;
    const userId = requestContext?.get("user-id") as string;

    return {
      tracingOptions: buildTracingOptions(
        withLangsmithMetadata({
          projectName:
            userTier === "enterprise"
              ? "enterprise-traces"
              : "standard-traces",
          sessionId: userId,
        }),
      ),
    };
  },
});
Available Fields
The withLangsmithMetadata helper accepts these fields:
| Field | Type | Description |
|---|---|---|
| projectName | string | Override the project for this trace |
| sessionId | string | Group related traces by session |
| sessionName | string | Display name for the session |
All fields are optional. The helper merges with any existing metadata, so you can call it multiple times or combine with other tracing options.
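As a rough sketch of that merging behavior, the example below assumes buildTracingOptions accepts multiple helpers and shallow-merges the metadata each one contributes; the project and session values are hypothetical:

import { buildTracingOptions } from "@mastra/observability";
import { withLangsmithMetadata } from "@mastra/langsmith";

// Sketch only: assumes buildTracingOptions is variadic and merges metadata
// from each helper. Project and session values are hypothetical.
const tracingOptions = buildTracingOptions(
  withLangsmithMetadata({ projectName: "customer-support" }),
  withLangsmithMetadata({ sessionId: "session-123", sessionName: "Support chat" }),
);
// The merged metadata would route the trace to the "customer-support"
// project and group it under the "Support chat" session.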