
Generate Text with Gemini

Many developers need to use different language models but don’t want to learn multiple APIs. Mastra provides a unified interface for working with various LLM providers, handling the complexity of different API implementations. This example shows how to use Google’s Gemini model through the same interface used for other providers.

```typescript
import { Mastra } from '@mastra/core';

const mastra = new Mastra();

// Configure the Gemini model through Mastra's provider-agnostic LLM interface.
const llm = mastra.LLM({
  provider: 'GOOGLE',
  name: 'gemini-1.5-flash',
  apiKey: process.env.GEMINI_API_KEY,
});

// Send a prompt and read the generated text from the result.
const result = await llm.generate('Who invented the submarine?');

console.log(result.text);
```
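
Because the interface is provider-agnostic, switching models is a configuration change rather than a code rewrite. The sketch below reuses the same `mastra.LLM` and `generate` calls with a different provider; the `'OPEN_AI'` provider string, the `gpt-4o-mini` model name, and the `OPENAI_API_KEY` variable are illustrative assumptions, so check the Mastra provider reference for the exact values your version supports.

```typescript
import { Mastra } from '@mastra/core';

const mastra = new Mastra();

// Same unified interface, different provider. The provider string and model
// name below are assumptions for illustration, not taken from this example.
const llm = mastra.LLM({
  provider: 'OPEN_AI',
  name: 'gpt-4o-mini',
  apiKey: process.env.OPENAI_API_KEY,
});

const result = await llm.generate('Who invented the submarine?');

console.log(result.text);
```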

View Example on GitHub
