
Memory with Upstash

This example demonstrates how to use Mastra’s memory system with Upstash as the storage and vector search backend.

Setup

First, set up the memory system with Upstash storage and vector capabilities:

import { Memory } from "@mastra/memory";
import { UpstashStore, UpstashVector } from "@mastra/upstash";
import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai";
 
// Initialize memory with Upstash storage and vector search
const memory = new Memory({
  storage: new UpstashStore({
    url: process.env.UPSTASH_REDIS_REST_URL,
    token: process.env.UPSTASH_REDIS_REST_TOKEN,
  }),
  vector: new UpstashVector({
    // Upstash Vector is a separate service from Upstash Redis and has its own REST credentials
    url: process.env.UPSTASH_VECTOR_REST_URL,
    token: process.env.UPSTASH_VECTOR_REST_TOKEN,
  }),
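  // Assumption: semantic recall needs an embedding model; any AI SDK embedder should work here
  embedder: openai.embedding("text-embedding-3-small"),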
  options: {
    lastMessages: 10,
    semanticRecall: {
      topK: 3,
      messageRange: 2,
    },
  },
});
 
// Create an agent with memory capabilities
const chefAgent = new Agent({
  name: "chefAgent",
  instructions:
    "You are Michel, a practical and experienced home chef who helps people cook great meals with whatever ingredients they have available.",
  model: openai("gpt-4o-mini"),
  memory,
});

Environment Setup

Make sure to set your Upstash credentials in environment variables. Note that Upstash Redis (used for storage) and Upstash Vector (used for semantic recall) are separate services with separate credentials:

UPSTASH_REDIS_REST_URL=your-redis-url
UPSTASH_REDIS_REST_TOKEN=your-redis-token
UPSTASH_VECTOR_REST_URL=your-vector-index-url
UPSTASH_VECTOR_REST_TOKEN=your-vector-index-token

You will also need an OPENAI_API_KEY set for the chat model (and for embeddings, if you use the OpenAI embedder).
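As an optional extra (not part of the original example), you can fail fast if any credential is missing before the stores are constructed:

// Illustrative sanity check: throw early if a required variable is missing
for (const key of [
  "UPSTASH_REDIS_REST_URL",
  "UPSTASH_REDIS_REST_TOKEN",
  "UPSTASH_VECTOR_REST_URL",
  "UPSTASH_VECTOR_REST_TOKEN",
]) {
  if (!process.env[key]) {
    throw new Error(`Missing environment variable: ${key}`);
  }
}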

Usage Example

import { randomUUID } from "crypto";
 
// Start a conversation
const threadId = randomUUID();
const resourceId = "SOME_USER_ID";
 
// Ask about ingredients
const response1 = await chefAgent.stream(
  "In my kitchen I have: pasta, canned tomatoes, garlic, olive oil, and some dried herbs (basil and oregano). What can I make?",
  {
    threadId,
    resourceId,
  },
);
 
// Ask about different ingredients
const response2 = await chefAgent.stream(
  "Now I'm over at my friend's house, and they have: chicken thighs, coconut milk, sweet potatoes, and curry powder.",
  {
    threadId,
    resourceId,
  },
);
 
// Use memory to recall previous conversation
const response3 = await chefAgent.stream(
  "What did we cook before I went to my friends house?",
  {
    threadId,
    resourceId,
    memoryOptions: {
      lastMessages: 3, // Get last 3 messages for context
      semanticRecall: {
        topK: 2, // Also get 2 most relevant messages
        messageRange: 2, // Include context around matches
      },
    },
  },
);
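
Each chefAgent.stream() call returns a streaming result; to read the reply you can iterate its textStream, as in this minimal sketch:

// Print the streamed answer as chunks arrive
for await (const chunk of response3.textStream) {
  process.stdout.write(chunk);
}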

The example shows:

  1. Setting up Upstash storage with vector search capabilities
  2. Configuring environment variables for Upstash connection
  3. Creating an agent with memory integration
  4. Using both recent history and semantic search in the same query