# Embed

Mastra uses the AI SDK's `embed` and `embedMany` functions to generate vector embeddings for text inputs, enabling similarity search and RAG workflows.

## Single Embedding

The `embed` function generates a vector embedding for a single text input:

```typescript
import { embed } from "ai";
import { ModelRouterEmbeddingModel } from "@mastra/core/llm";

const result = await embed({
  model: new ModelRouterEmbeddingModel("openai/text-embedding-3-small"),
  value: "Your text to embed",
  maxRetries: 2, // optional, defaults to 2
});
```

### Parameters

- **`model`** (`EmbeddingModel`): The embedding model to use (e.g. `openai.embedding('text-embedding-3-small')`).
- **`value`** (`string | Record<string, unknown>`): The text content or object to embed.
- **`maxRetries?`** (`number`): Maximum number of retries per embedding call. Set to 0 to disable retries.
- **`abortSignal?`** (`AbortSignal`): Optional abort signal to cancel the request.
- **`headers?`** (`Record<string, string>`): Additional HTTP headers for the request (only for HTTP-based providers).

### Return Value

- **`embedding`** (`number[]`): The embedding vector for the input.

## Multiple Embeddings

For embedding multiple texts at once, use the `embedMany` function:

```typescript
import { embedMany } from "ai";
import { ModelRouterEmbeddingModel } from "@mastra/core/llm";

const result = await embedMany({
  model: new ModelRouterEmbeddingModel("openai/text-embedding-3-small"),
  values: ["First text", "Second text", "Third text"],
  maxRetries: 2, // optional, defaults to 2
});
```

### Parameters

- **`model`** (`EmbeddingModel`): The embedding model to use (e.g. `openai.embedding('text-embedding-3-small')`).
- **`values`** (`string[] | Record<string, unknown>[]`): Array of text content or objects to embed.
- **`maxRetries?`** (`number`): Maximum number of retries per embedding call. Set to 0 to disable retries.
- **`abortSignal?`** (`AbortSignal`): Optional abort signal to cancel the request.
- **`headers?`** (`Record<string, string>`): Additional HTTP headers for the request (only for HTTP-based providers).

### Return Value

- **`embeddings`** (`number[][]`): Array of embedding vectors corresponding to the input values.

## Example Usage

```typescript
import { embed, embedMany } from "ai";
import { ModelRouterEmbeddingModel } from "@mastra/core/llm";

// Single embedding
const singleResult = await embed({
  model: new ModelRouterEmbeddingModel("openai/text-embedding-3-small"),
  value: "What is the meaning of life?",
});

// Multiple embeddings
const multipleResult = await embedMany({
  model: new ModelRouterEmbeddingModel("openai/text-embedding-3-small"),
  values: [
    "First question about life",
    "Second question about universe",
    "Third question about everything",
  ],
});
```

For more detailed information about embeddings in the Vercel AI SDK, see:

- [AI SDK Embeddings Overview](https://sdk.vercel.ai/docs/ai-sdk-core/embeddings)
- [embed()](https://sdk.vercel.ai/docs/reference/ai-sdk-core/embed)
- [embedMany()](https://sdk.vercel.ai/docs/reference/ai-sdk-core/embed-many)
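
As a closing sketch of the similarity-search use case mentioned at the top of this page, the snippet below ranks a small corpus against a query using the AI SDK's `cosineSimilarity` helper together with the same model router model used in the examples above. The corpus strings and query are illustrative placeholders, not part of the API.

```typescript
import { embed, embedMany, cosineSimilarity } from "ai";
import { ModelRouterEmbeddingModel } from "@mastra/core/llm";

const model = new ModelRouterEmbeddingModel("openai/text-embedding-3-small");

// Illustrative corpus; in a real RAG setup these would be document chunks.
const documents = [
  "Mastra agents can call tools and workflows",
  "Embeddings map text into a high-dimensional vector space",
  "RAG retrieves relevant chunks before generating an answer",
];

// Embed the corpus in one call and the query separately.
const { embeddings } = await embedMany({ model, values: documents });
const { embedding: queryEmbedding } = await embed({
  model,
  value: "How does retrieval-augmented generation work?",
});

// Rank documents by cosine similarity to the query (higher means more similar).
const ranked = documents
  .map((text, i) => ({
    text,
    score: cosineSimilarity(queryEmbedding, embeddings[i]),
  }))
  .sort((a, b) => b.score - a.score);

console.log(ranked[0]); // most similar document and its score
```

In practice you would store the corpus embeddings in a vector store and query it at retrieval time rather than recomputing them per request; this sketch only shows how the raw `embedding` and `embeddings` values can be compared.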