Agent.getLLM()

The .getLLM() method retrieves the language model instance configured for an agent, resolving it if it's a function. You can also pass a request-scoped model override without mutating the agent's configured model.

Usage example

```typescript
await agent.getLLM()

await agent.getLLM({
  model: 'openai/gpt-5.4',
})
```

Parameters

options?:

{ requestContext?: RequestContext; model?: MastraModelConfig | DynamicArgument<MastraModelConfig> } = {}

Optional configuration object containing the request context and an optional request-scoped model override.

requestContext?:

RequestContext
The request context, used for dependency injection and passing contextual information.

model?:

MastraModelConfig | DynamicArgument<MastraModelConfig>
Optional request-scoped model override. The agent's configured model is not mutated.

Returns

llm:

MastraLLMV1 | Promise<MastraLLMV1>
The language model instance configured for the agent, returned either directly or as a promise that resolves to the LLM.

Extended usage example

```typescript
await agent.getLLM({
  requestContext: new RequestContext(),
  model: 'openai/gpt-5.4',
})
```
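To make the resolution semantics concrete, here is a minimal, self-contained sketch of the behavior described above: a per-call override takes precedence, a function-valued (dynamic) model config is invoked with the request context, and the agent's configured model is never mutated. The `SketchAgent` class, `ModelConfig` alias, and `getModel` name are simplified stand-ins invented for illustration; they are not Mastra's actual internals.

```typescript
// Simplified stand-in types (assumptions, not Mastra's real definitions).
type ModelConfig = string;
type DynamicArgument<T> =
  | T
  | ((ctx: { requestContext?: unknown }) => T | Promise<T>);

class SketchAgent {
  // The configured model is stored once and never reassigned.
  constructor(private readonly configuredModel: DynamicArgument<ModelConfig>) {}

  // Resolve the model for this call:
  // 1. a request-scoped override (options.model) wins over the configured model,
  // 2. a function-valued config is invoked with the request context,
  // 3. this.configuredModel itself is left untouched.
  async getModel(
    options: {
      requestContext?: unknown;
      model?: DynamicArgument<ModelConfig>;
    } = {},
  ): Promise<ModelConfig> {
    const source = options.model ?? this.configuredModel;
    return typeof source === 'function'
      ? await source({ requestContext: options.requestContext })
      : source;
  }
}
```

With this sketch, `new SketchAgent('openai/gpt-5.4').getModel({ model: 'other-model' })` resolves to `'other-model'` for that call only, while a later `getModel()` still resolves to `'openai/gpt-5.4'`.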