
Langfuse

Langfuse is an open-source observability platform designed specifically for LLM applications.

Note: Currently, only AI-related calls will contain detailed telemetry data. Other operations will create traces but with limited information.

Configuration

To use Langfuse with Mastra, you’ll need to configure the following environment variables:

```bash
LANGFUSE_PUBLIC_KEY=your_public_key
LANGFUSE_SECRET_KEY=your_secret_key
LANGFUSE_BASEURL=https://cloud.langfuse.com # Optional - defaults to cloud.langfuse.com
```
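If you want your application to fail fast when credentials are missing, you can add a small guard before constructing the exporter. This is a minimal sketch; the `langfuseConfig` name is just for illustration:

```typescript
// Minimal sketch: collect the Langfuse settings from the environment.
// LANGFUSE_BASEURL is optional and falls back to the hosted default.
const langfuseConfig = {
  publicKey: process.env.LANGFUSE_PUBLIC_KEY,
  secretKey: process.env.LANGFUSE_SECRET_KEY,
  baseUrl: process.env.LANGFUSE_BASEURL ?? "https://cloud.langfuse.com",
};

if (!langfuseConfig.publicKey || !langfuseConfig.secretKey) {
  throw new Error(
    "LANGFUSE_PUBLIC_KEY and LANGFUSE_SECRET_KEY must be set for Langfuse telemetry",
  );
}
```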

Important: When configuring the telemetry export settings, the serviceName property must be set to "ai" for the Langfuse integration to work properly.

Implementation

Here’s how to configure Mastra to use Langfuse:

```typescript
import { Mastra } from "@mastra/core";
import { LangfuseExporter } from "langfuse-vercel";

export const mastra = new Mastra({
  // ... other config
  telemetry: {
    // this must be set to "ai" so that the LangfuseExporter
    // thinks it's an AI SDK trace
    serviceName: "ai",
    enabled: true,
    export: {
      type: "custom",
      exporter: new LangfuseExporter({
        publicKey: process.env.LANGFUSE_PUBLIC_KEY,
        secretKey: process.env.LANGFUSE_SECRET_KEY,
        baseUrl: process.env.LANGFUSE_BASEURL,
      }),
    },
  },
});
```
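With this in place, any AI call made through Mastra is exported to Langfuse. As a quick smoke test, you can invoke an agent and check the dashboard for the resulting trace. This is a sketch; it assumes you have registered an agent named "myAgent" in the config above:

```typescript
// Hypothetical usage: "myAgent" stands in for whatever agent you registered.
const agent = mastra.getAgent("myAgent");

// AI calls like this one produce detailed traces in Langfuse;
// other operations appear as traces with limited information.
const result = await agent.generate("Say hello in one sentence.");
console.log(result.text);
```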

Dashboard

Once configured, you can view your traces and analytics in the Langfuse dashboard at cloud.langfuse.com (or at your self-hosted instance, if you set LANGFUSE_BASEURL).