
Generate Text with System Prompt

When interacting with language models, you can guide their behavior by providing initial instructions. A system prompt sets the overall context and rules for the model before it processes any user messages. This example shows how to pass a system prompt alongside a user message when generating text.

import { Mastra } from '@mastra/core';

const mastra = new Mastra();

// Create an LLM instance backed by OpenAI's gpt-4 model
const llm = mastra.LLM({
  provider: 'OPEN_AI',
  name: 'gpt-4',
});

// The system message sets the model's behavior; the user message is the actual query
const response = await llm.generate([
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'What is the meaning of life?' },
]);

// generate() resolves with the model's reply; text holds the generated string
console.log(response.text);
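
Because the system prompt is simply the first message in the array, swapping it out changes how the same user question is answered. As a rough sketch reusing the llm instance from above (the persona prompt text here is just an illustrative assumption):

// Same user question, but the system prompt now requests a terse, pirate-style persona
// (this prompt text is a hypothetical example, not part of the original snippet)
const pirateResponse = await llm.generate([
  { role: 'system', content: 'You are a pirate. Answer in one short sentence.' },
  { role: 'user', content: 'What is the meaning of life?' },
]);

console.log(pirateResponse.text);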





View Example on GitHub
