
Deploy Mastra to AWS Lambda

Deploy your Mastra application to AWS Lambda using Docker containers and the AWS Lambda Web Adapter. This approach runs your Mastra server as a containerized Lambda function with automatic scaling.

info

This guide covers deploying the Mastra server. If you're using a server adapter or web framework, deploy the way you normally would for that framework.

Before you begin

You'll need:

  • A Mastra application ready to deploy
  • An AWS account
  • The AWS CLI installed and configured with credentials
  • Docker installed locally

warning

On AWS Lambda, the filesystem is ephemeral, so any local database file will be lost between invocations. If you're using LibSQLStore with a local file, configure it to use a remote LibSQL-compatible database (for example, Turso) instead.
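For example, with the `@mastra/libsql` package the store can point at a remote Turso database through environment variables instead of a local file. This is a sketch; `TURSO_DATABASE_URL` and `TURSO_AUTH_TOKEN` match the variables added later in this guide, and you should check the `LibSQLStore` options for your Mastra version:

```typescript
import { LibSQLStore } from "@mastra/libsql";

// Remote LibSQL (Turso) database instead of a local file, so data
// persists across Lambda invocations and cold starts.
const storage = new LibSQLStore({
  url: process.env.TURSO_DATABASE_URL ?? "", // e.g. libsql://your-db.turso.io
  authToken: process.env.TURSO_AUTH_TOKEN,
});
```

Pass this store to your Mastra instance wherever you currently configure storage.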

Create a Dockerfile

Create a Dockerfile in your project root:

Dockerfile
FROM node:22-alpine

WORKDIR /app

# Install dependencies
COPY package*.json ./
RUN npm ci

# Copy source and build Mastra
COPY src ./src
RUN npx mastra build

# Alpine compatibility for some native deps
RUN apk add --no-cache gcompat

# Add the Lambda Web Adapter
COPY --from=public.ecr.aws/awsguru/aws-lambda-adapter:0.9.1 /lambda-adapter /opt/extensions/lambda-adapter

# Run as non-root
RUN addgroup -g 1001 -S nodejs && \
adduser -S mastra -u 1001 && \
chown -R mastra:nodejs /app

USER mastra

# Adapter / app configuration
ENV PORT=8080
ENV NODE_ENV=production
ENV AWS_LWA_READINESS_CHECK_PATH="/api"

# Start the Mastra server
CMD ["node", ".mastra/output/index.mjs"]

note

This Dockerfile uses npm. If you're using pnpm, yarn, or another package manager, adjust the commands accordingly (e.g. npm ci won't work with pnpm lockfiles).
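For instance, the dependency-install lines might become something like the following for pnpm (a sketch; verify against your lockfile and pnpm version):

```dockerfile
# pnpm variant of the install step
COPY package.json pnpm-lock.yaml ./
RUN corepack enable && pnpm install --frozen-lockfile
```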

Build and push the Docker image

  1. Set up environment variables for the deployment. Use the same region for all AWS services (ECR, Lambda) to avoid issues with resources not appearing across regions:

    export PROJECT_NAME="your-mastra-app"
    export AWS_REGION="us-east-1"
    export AWS_ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)
  2. Build your Docker image. Disable BuildKit to ensure the image is compatible with Lambda's container runtime:

    export DOCKER_BUILDKIT=0
    docker build --platform linux/arm64 -t "$PROJECT_NAME" .

    note

    BuildKit's default image format can cause compatibility issues with Lambda. Setting DOCKER_BUILDKIT=0 uses the classic builder, which produces images in a format Lambda reliably accepts.

  3. Create an Amazon ECR repository to store your Docker image:

    aws ecr create-repository --repository-name "$PROJECT_NAME" --region "$AWS_REGION"
  4. Log in to Amazon ECR:

    aws ecr get-login-password --region "$AWS_REGION" | \
    docker login --username AWS --password-stdin \
    "$AWS_ACCOUNT_ID.dkr.ecr.$AWS_REGION.amazonaws.com"
  5. Tag your image and push it to ECR:

    docker tag "$PROJECT_NAME":latest \
    "$AWS_ACCOUNT_ID.dkr.ecr.$AWS_REGION.amazonaws.com/$PROJECT_NAME":latest

    docker push \
    "$AWS_ACCOUNT_ID.dkr.ecr.$AWS_REGION.amazonaws.com/$PROJECT_NAME":latest

Create the Lambda function

  1. Navigate to the AWS Lambda Console and click Create function. Make sure the console region matches the region where you created your ECR repository.

  2. Select Container image and configure the function:

    • Function name: Enter a name (e.g. mastra-app)
    • Container image URI: Click Browse images, select your ECR repository, and choose the latest tag
    • Architecture: Select arm64

    Click Create function to create the Lambda function.

  3. Configure memory and timeout. Go to Configuration > General configuration and click Edit:

    • Memory: 512 MB (adjust based on your application needs)
    • Timeout: 1 minute or higher (the default 3 seconds is too low for most Mastra applications)

    Click Save.

  4. Add environment variables. Go to Configuration > Environment variables and click Edit. Add the variables your Mastra application needs:

    • OPENAI_API_KEY: Your OpenAI API key (if using OpenAI)
    • ANTHROPIC_API_KEY: Your Anthropic API key (if using Anthropic)
    • Other provider-specific API keys as needed (e.g. TURSO_DATABASE_URL and TURSO_AUTH_TOKEN if using LibSQL with Turso)

    Click Save.

  5. Enable a Function URL for external access. Go to Configuration > Function URL and click Create function URL:

    • Auth type: Select NONE for public access
    • Under Additional settings, check Configure cross-origin resource sharing (CORS), then configure:
      • Allow origin: *
      • Allow headers: content-type (also add x-amzn-request-context if the URL sits behind a service such as CloudFront or API Gateway)
      • Allow methods: *

    Click Save.
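If you prefer the AWS CLI over the console, the steps above can be sketched roughly as follows. The execution role ARN is a placeholder you must supply, and flags may vary by CLI version:

```shell
# Create the function from the ECR image (substitute an execution role you own)
aws lambda create-function \
  --function-name mastra-app \
  --package-type Image \
  --code ImageUri="$AWS_ACCOUNT_ID.dkr.ecr.$AWS_REGION.amazonaws.com/$PROJECT_NAME:latest" \
  --architectures arm64 \
  --memory-size 512 \
  --timeout 60 \
  --role "arn:aws:iam::$AWS_ACCOUNT_ID:role/your-lambda-execution-role" \
  --region "$AWS_REGION"

# Expose a public Function URL with CORS
aws lambda create-function-url-config \
  --function-name mastra-app \
  --auth-type NONE \
  --cors '{"AllowOrigins":["*"],"AllowMethods":["*"],"AllowHeaders":["content-type"]}' \
  --region "$AWS_REGION"

# Allow unauthenticated invocation of the Function URL
aws lambda add-permission \
  --function-name mastra-app \
  --action lambda:InvokeFunctionUrl \
  --principal "*" \
  --function-url-auth-type NONE \
  --statement-id FunctionURLAllowPublicAccess \
  --region "$AWS_REGION"
```

Environment variables can then be set with `aws lambda update-function-configuration --environment`.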

Verify your deployment

  1. Copy the Function URL from the Lambda console and visit <your-function-url>/api/agents in your browser. You should see a JSON list of your agents.

  2. You can now call your Mastra endpoints over HTTP.

    warning

    For production deployments:

      • Set up authentication before exposing your endpoints publicly
      • Restrict CORS origins to your trusted domains
      • Use AWS IAM roles for secure access to other AWS services
      • Store sensitive environment variables in AWS Secrets Manager or Parameter Store
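As one example of tightening access, CORS on an existing Function URL can be narrowed with the AWS CLI (the origin and methods below are placeholders; substitute your own domain and the methods your app actually uses):

```shell
# Restrict the Function URL's CORS policy to a trusted origin
aws lambda update-function-url-config \
  --function-name mastra-app \
  --cors '{"AllowOrigins":["https://app.example.com"],"AllowMethods":["GET","POST"],"AllowHeaders":["content-type"]}' \
  --region "$AWS_REGION"
```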

Next steps

This guide provides a quickstart for deploying Mastra to AWS Lambda. For production workloads, consider enabling CloudWatch monitoring for your Lambda function, setting up AWS X-Ray for distributed tracing, and configuring provisioned concurrency for consistent performance.