Embed
Mastra uses the AI SDK's embed and embedMany functions to generate vector embeddings for text inputs, enabling similarity search and RAG workflows.
Single Embedding
The embed function generates a vector embedding for a single text input:
```ts
import { embed } from "ai";
import { openai } from "@ai-sdk/openai";

const result = await embed({
  model: openai.embedding("text-embedding-3-small"),
  value: "Your text to embed",
  maxRetries: 2, // optional, defaults to 2
});
```
Parameters
- model (EmbeddingModel): The embedding model to use (e.g. openai.embedding('text-embedding-3-small')).
- value (string | Record<string, any>): The text content or object to embed.
- maxRetries? (number, default 2): Maximum number of retries per embedding call. Set to 0 to disable retries.
- abortSignal? (AbortSignal): Optional abort signal to cancel the request.
- headers? (Record<string, string>): Additional HTTP headers for the request (only for HTTP-based providers).
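The optional abortSignal and headers parameters can be combined with a standard AbortController to bound how long an embedding call may run. The sketch below is illustrative: the 5-second timeout and the header name are arbitrary choices, not values required by the SDK.

```ts
import { embed } from "ai";
import { openai } from "@ai-sdk/openai";

// Cancel the request if it takes longer than 5 seconds (timeout value is illustrative).
const controller = new AbortController();
const timeout = setTimeout(() => controller.abort(), 5_000);

try {
  const { embedding } = await embed({
    model: openai.embedding("text-embedding-3-small"),
    value: "Text to embed with a timeout",
    abortSignal: controller.signal,
    // Extra headers are only forwarded by HTTP-based providers;
    // the header name below is a placeholder.
    headers: { "X-Request-Source": "docs-example" },
  });
  console.log(embedding.length);
} finally {
  clearTimeout(timeout);
}
```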
Return Value
- embedding (number[]): The embedding vector for the input.
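Because the result is a plain numeric vector, it can be compared directly against other embeddings for similarity search. Below is a minimal sketch that embeds a query and a document and scores them with a hand-rolled cosine similarity function; the texts are placeholders, and in a real RAG setup you would typically compare against vectors stored in a vector database.

```ts
import { embed } from "ai";
import { openai } from "@ai-sdk/openai";

// Plain cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

const { embedding: query } = await embed({
  model: openai.embedding("text-embedding-3-small"),
  value: "How do I configure retries?",
});
const { embedding: doc } = await embed({
  model: openai.embedding("text-embedding-3-small"),
  value: "maxRetries controls how often a failed embedding call is retried.",
});

console.log(cosineSimilarity(query, doc)); // closer to 1 means more similar
```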
Multiple Embeddings
For embedding multiple texts at once, use the embedMany function:
```ts
import { embedMany } from "ai";
import { openai } from "@ai-sdk/openai";

const result = await embedMany({
  model: openai.embedding("text-embedding-3-small"),
  values: ["First text", "Second text", "Third text"],
  maxRetries: 2, // optional, defaults to 2
});
```
Parameters
- model (EmbeddingModel): The embedding model to use (e.g. openai.embedding('text-embedding-3-small')).
- values (string[] | Record<string, any>[]): Array of text content or objects to embed.
- maxRetries? (number, default 2): Maximum number of retries per embedding call. Set to 0 to disable retries.
- abortSignal? (AbortSignal): Optional abort signal to cancel the request.
- headers? (Record<string, string>): Additional HTTP headers for the request (only for HTTP-based providers).
Return Value
- embeddings (number[][]): Array of embedding vectors corresponding to the input values.
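The embeddings array preserves the order of the values array, so embeddings[i] is the vector for values[i]. The sketch below pairs them up before storage; the record shape (id, vector, metadata) is illustrative rather than a specific vector-store API.

```ts
import { embedMany } from "ai";
import { openai } from "@ai-sdk/openai";

const values = ["First text", "Second text", "Third text"];

const { embeddings } = await embedMany({
  model: openai.embedding("text-embedding-3-small"),
  values,
});

// embeddings[i] corresponds to values[i]; pair them up before storing.
// The record shape here is illustrative, not a specific vector-store API.
const records = values.map((text, i) => ({
  id: `chunk-${i}`,
  vector: embeddings[i],
  metadata: { text },
}));

console.log(records.length); // 3
```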
Example Usage
```ts
import { embed, embedMany } from "ai";
import { openai } from "@ai-sdk/openai";

// Single embedding
const singleResult = await embed({
  model: openai.embedding("text-embedding-3-small"),
  value: "What is the meaning of life?",
});

// Multiple embeddings
const multipleResult = await embedMany({
  model: openai.embedding("text-embedding-3-small"),
  values: [
    "First question about life",
    "Second question about universe",
    "Third question about everything",
  ],
});
```
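In a RAG workflow, embedMany is typically fed the chunks of a larger document. The sketch below assumes Mastra's MDocument from @mastra/rag with its default chunking; the import path, chunking options, and the chunk.text field should be checked against the RAG documentation for your Mastra version.

```ts
import { embedMany } from "ai";
import { openai } from "@ai-sdk/openai";
// Assumed API: MDocument from Mastra's RAG package; verify against your version.
import { MDocument } from "@mastra/rag";

// Split a long document into chunks, then embed every chunk in one call.
const doc = MDocument.fromText(
  "A long document about life, the universe, and everything...",
);
const chunks = await doc.chunk();

const { embeddings } = await embedMany({
  model: openai.embedding("text-embedding-3-small"),
  values: chunks.map((chunk) => chunk.text),
});

console.log(embeddings.length === chunks.length); // true: one vector per chunk
```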
For more detailed information about embeddings in the Vercel AI SDK, see the AI SDK Embeddings documentation.