Streaming Working Memory
This example demonstrates how to create an agent that maintains working memory for relevant conversational details, such as the user's name, location, or preferences.
Setup
First, set up the memory system with working memory enabled. Note that new Memory() without a configured storage provider will not persist data across application restarts. For persistence, you can configure a storage provider such as @mastra/libsql:
import { Memory } from "@mastra/memory";
import { LibSQLStore } from "@mastra/libsql";

const memory = new Memory({
  options: {
    workingMemory: {
      enabled: true,
    },
  },
  storage: new LibSQLStore({
    url: "file:../mastra.db",
  }),
});
Add the memory instance to an agent:
import { openai } from "@ai-sdk/openai";
import { Agent } from "@mastra/core/agent";

const agent = new Agent({
  name: "Memory agent",
  instructions: "You are a helpful AI assistant.",
  model: openai("gpt-4o-mini"),
  memory,
});
Usage Example
Now that working memory is set up, you can interact with the agent, and it will remember key details across interactions.
import { randomUUID } from "crypto";

const threadId = randomUUID();
const resourceId = "SOME_USER_ID";

const response = await agent.stream("Hello, my name is Jane", {
  threadId,
  resourceId,
});

for await (const chunk of response.textStream) {
  process.stdout.write(chunk);
}
Summary
This example demonstrates:
- Setting up memory with working memory enabled
- The agent maintaining relevant user info between interactions
Advanced use cases
For examples of controlling which information is relevant for working memory, or of showing loading states while working memory is being saved, see our advanced working memory example.
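As a quick preview, a working memory configuration can include a template that guides which details the agent tracks. The sketch below is illustrative: the headings and field names are assumptions, not a required schema.

const memoryWithTemplate = new Memory({
  options: {
    workingMemory: {
      enabled: true,
      // Illustrative template: the agent fills in these fields as it learns them.
      template: `# User
- name:
- location:
- preferences:
`,
    },
  },
});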
To learn more about agent memory, including other memory types and storage options, check out the Memory documentation page.