LangSmith

LangSmith is LangChain's platform for debugging, testing, evaluating, and monitoring LLM applications.

Note: Currently, this integration only traces AI-related calls in your application. Other types of operations are not captured in the telemetry data.

Configuration

To use LangSmith with Mastra, you'll need to configure the following environment variables:

LANGSMITH_TRACING=true
LANGSMITH_ENDPOINT=https://api.smith.langchain.com
LANGSMITH_API_KEY=your-api-key
LANGSMITH_PROJECT=your-project-name
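
These variables must be present in the process environment before the Mastra instance is created. Below is a minimal sketch, assuming you keep them in a local .env file and use the dotenv package; if your runtime loads .env files automatically, the import is unnecessary:

// Load variables from .env into process.env (assumes `dotenv` is installed).
import "dotenv/config";

// Fail fast if the key is missing so traces aren't silently dropped.
if (!process.env.LANGSMITH_API_KEY) {
  throw new Error("LANGSMITH_API_KEY is not set");
}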

Implementation

Here's how to configure Mastra to use LangSmith:

import { Mastra } from "@mastra/core";
import { AISDKExporter } from "langsmith/vercel";

export const mastra = new Mastra({
  // ... other config
  telemetry: {
    serviceName: "your-service-name",
    enabled: true,
    export: {
      type: "custom",
      exporter: new AISDKExporter(),
    },
  },
});
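
With the exporter configured, AI calls made through Mastra should be sent to LangSmith. As a quick check, you can invoke an agent and then look for the trace in your project. This is a minimal sketch: the import path and the agent name "myAgent" are placeholders, and it assumes an agent is registered in the "other config" section above:

// Import path is a placeholder for wherever your Mastra instance is defined.
import { mastra } from "./mastra";

// "myAgent" is a hypothetical agent registered on the Mastra instance above.
const agent = mastra.getAgent("myAgent");

// This generate call is an AI-related operation, so it should appear
// as a trace in the LangSmith project named in LANGSMITH_PROJECT.
const result = await agent.generate("Say hello");
console.log(result.text);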

Dashboard

Access your traces and analytics in the LangSmith dashboard at smith.langchain.com.

Note: After running your workflows, data may not appear in a new project right away. In the dashboard, sort by the Name column to list all projects, select your project, then filter by LLM Calls instead of Root Runs.
