
LangSmith

LangSmith is LangChain’s platform for debugging, testing, evaluating, and monitoring LLM applications.

Note: Currently, this integration traces only AI-related calls in your application; other types of operations are not captured in the telemetry data.

Configuration

To use LangSmith with Mastra, you’ll need to configure the following environment variables:

LANGSMITH_TRACING=true
LANGSMITH_ENDPOINT=https://api.smith.langchain.com
LANGSMITH_API_KEY=your-api-key
LANGSMITH_PROJECT=your-project-name
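If any of these variables are missing, traces may be dropped silently. As an optional safeguard, you can check for them at startup. The helper below is a minimal sketch and is not part of Mastra or LangSmith; the function name `missingLangSmithVars` is hypothetical.

```typescript
// Sketch (not part of the official setup): report which LangSmith
// environment variables are missing so the app can fail fast at startup.
const REQUIRED_LANGSMITH_VARS = [
  "LANGSMITH_TRACING",
  "LANGSMITH_ENDPOINT",
  "LANGSMITH_API_KEY",
  "LANGSMITH_PROJECT",
] as const;

export function missingLangSmithVars(
  env: Record<string, string | undefined> = process.env,
): string[] {
  // A variable counts as missing if it is unset or empty.
  return REQUIRED_LANGSMITH_VARS.filter((name) => !env[name]);
}
```

You could call this once during application startup and throw if the returned list is non-empty.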

Implementation

Here’s how to configure Mastra to use LangSmith:

import { Mastra } from "@mastra/core";
import { AISDKExporter } from "langsmith/vercel";
 
export const mastra = new Mastra({
  // ... other config
  telemetry: {
    serviceName: "your-service-name",
    enabled: true,
    export: {
      type: "custom",
      exporter: new AISDKExporter(),
    },
  },
});
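Once telemetry is enabled, AI calls made through the Mastra instance are exported to the configured LangSmith project. The fragment below is an illustrative sketch only: `myAgent` is a hypothetical agent assumed to be registered in the Mastra configuration above, and the call requires a live model provider, so treat it as a usage outline rather than a runnable snippet.

```typescript
// Illustrative usage: "myAgent" is a hypothetical agent assumed to be
// registered on the Mastra instance configured above. With telemetry
// enabled, this generate() call is traced and exported to LangSmith.
const agent = mastra.getAgent("myAgent");
const result = await agent.generate("Summarize today's deployment logs.");
console.log(result.text); // the trace appears under the LANGSMITH_PROJECT project
```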

Dashboard

Access your traces and analytics in the LangSmith dashboard at smith.langchain.com. Traces are grouped under the project specified by LANGSMITH_PROJECT.