Reference: Keywords AI Integration

Keywords AI is a full-stack LLM engineering platform that helps developers and PMs build reliable AI products faster. In a shared workspace, product teams can build, monitor, and improve AI performance.

This tutorial shows how to set up Keywords AI tracing with Mastra to monitor and trace your AI-powered applications.

To help you get started quickly, we’ve provided a pre-built example. You can find the code on GitHub.

Setup

The steps below walk through the Mastra Weather Agent example.

1. Install Dependencies

pnpm install
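
If you are adding Keywords AI tracing to an existing Mastra project rather than cloning the example, you will also need the exporter package imported later in this guide. The exact command depends on your package manager; with pnpm it would be:

pnpm add @keywordsai/exporter-vercel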

2. Environment Variables

Copy the example environment file and add your API keys:

cp .env.local.example .env.local

Update .env.local with your credentials:

OPENAI_API_KEY=your-openai-api-key
KEYWORDSAI_API_KEY=your-keywordsai-api-key
KEYWORDSAI_BASE_URL=https://api.keywordsai.co

3. Setup Mastra client with Keywords AI tracing

Configure the Mastra instance with Keywords AI telemetry in src/mastra/index.ts:

src/mastra/index.ts
import { Mastra } from "@mastra/core/mastra";
import { KeywordsAIExporter } from "@keywordsai/exporter-vercel";

export const mastra = new Mastra({
  // ...agents, workflows, and other options for your project
  telemetry: {
    serviceName: "keywordai-mastra-example",
    enabled: true,
    export: {
      type: "custom",
      exporter: new KeywordsAIExporter({
        apiKey: process.env.KEYWORDSAI_API_KEY,
        baseUrl: process.env.KEYWORDSAI_BASE_URL,
        debug: true,
      }),
    },
  },
});
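
The example registers a weather agent on this Mastra instance. As a rough sketch of what such an agent could look like, assuming the @ai-sdk/openai provider and a file at src/mastra/agents/weather-agent.ts (the agent in the example repository may use different instructions, tools, or model settings):

src/mastra/agents/weather-agent.ts
import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai";

// Hypothetical weather agent for illustration only.
export const weatherAgent = new Agent({
  name: "Weather Agent",
  instructions: "Answer questions about the current weather in a given city.",
  model: openai("gpt-4o-mini"),
});

It would then be registered in src/mastra/index.ts alongside the telemetry block, for example by passing agents: { weatherAgent } in the new Mastra({ ... }) options.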

4. Run the Project

mastra dev

This opens the Mastra playground where you can interact with the weather agent.
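
You can also invoke the agent programmatically instead of through the playground. A minimal sketch, assuming the agent is registered under the key weatherAgent on the Mastra instance exported from src/mastra/index.ts:

import { mastra } from "./src/mastra";

// Look up the registered agent and run a single generation; the call is
// traced and exported to Keywords AI through the telemetry configuration above.
const agent = mastra.getAgent("weatherAgent");
const result = await agent.generate("What is the weather in Berlin right now?");
console.log(result.text);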

Observability

Once configured, you can view your traces and analytics in the Keywords AI platform.