AWS Labs Aurora DSQL MCP Server

An AWS Labs Model Context Protocol (MCP) server for Aurora DSQL

Features

  • Converts human-readable questions and commands into structured, Postgres-compatible SQL queries and executes them against the configured Aurora DSQL database.
  • Read-only by default; write transactions can be enabled with --allow-writes
  • Connection reuse between requests for improved performance

Prerequisites

  1. An AWS account with an Aurora DSQL cluster and permissions to access it
  2. AWS credentials configured with aws configure or environment variables (see the sketch below)
  3. This MCP server can only be run locally on the same host as your LLM client.
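
Either approach works; below is a minimal sketch (the profile name and placeholder values are examples, not real credentials):

# Option A: configure a named profile interactively
aws configure --profile default

# Option B: export temporary credentials as environment variables
export AWS_ACCESS_KEY_ID=<your access key>
export AWS_SECRET_ACCESS_KEY=<your secret key>
export AWS_SESSION_TOKEN=<your session token>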

Installation

Install MCP Server

Using uv

  1. Install uv from Astral or the GitHub README
  2. Install Python using uv python install 3.10 (see the example below)
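
For reference, a typical sequence looks like the following; the install script URL is the one documented by Astral, so adjust if you install uv another way:

# Install uv, then use it to install Python 3.10
curl -LsSf https://astral.sh/uv/install.sh | sh
uv python install 3.10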

Configure the MCP server in your MCP client configuration (e.g., for Amazon Q Developer CLI, edit ~/.aws/amazonq/mcp.json):

{
  "mcpServers": {
    "awslabs.aurora-dsql-mcp-server": {
      "command": "uvx",
      "args": [
        "awslabs.aurora-dsql-mcp-server@latest",
        "--cluster_endpoint",
        "[your dsql cluster endpoint]",
        "--region",
        "[your dsql cluster region, e.g. us-east-1]",
        "--database_user",
        "[your dsql username]",
        "--profile", "default"
      ],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}
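
If you want to confirm the package resolves and starts outside of your MCP client, you can invoke it directly with uvx. This is an optional sanity check and assumes the CLI exposes a standard --help flag:

# Optional: verify the server package can be fetched and started
uvx awslabs.aurora-dsql-mcp-server@latest --help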

Using Docker

  1. 'git clone https://github.com/awslabs/mcp.git'
  2. Go to sub-directory 'src/aurora-dsql-mcp-server/'
  3. Run 'docker build -t awslabs/aurora-dsql-mcp-server:latest .'
  4. Create an .env file with temporary credentials:

Either manually:

# fictitious `.env` file with AWS temporary credentials
AWS_ACCESS_KEY_ID=<from the profile you set up>
AWS_SECRET_ACCESS_KEY=<from the profile you set up>
AWS_SESSION_TOKEN=<from the profile you set up>

Or using aws configure:

aws configure export-credentials --profile your-profile-name --format env | sed 's/^export //' > .env

  5. Configure the MCP server in your MCP client configuration to run the container:

{
  "mcpServers": {
    "awslabs.aurora-dsql-mcp-server": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "--env-file",
        "/full/path/to/file/above/.env",
        "awslabs/aurora-dsql-mcp-server:latest",
        "--cluster_endpoint", "[your data]",
        "--database_user", "[your data]",
        "--region", "[your data]"
      ]
    }
  }
}

Server Configuration Options

--allow-writes

By default, the Aurora DSQL MCP server does not allow write operations ("read-only mode"). Any invocation of the transact tool will fail in this mode. To use the transact tool, enable writes by passing the --allow-writes parameter.

We recommend using least-privilege access when connecting to DSQL; for example, connect with a read-only database role whenever possible. Read-only mode is enforced on the client side on a best-effort basis to reject mutations.
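
For example, to enable writes in the uvx-based configuration shown earlier, append the flag to the args list:

"args": [
  "awslabs.aurora-dsql-mcp-server@latest",
  "--cluster_endpoint", "[your dsql cluster endpoint]",
  "--region", "[your dsql cluster region, e.g. us-east-1]",
  "--database_user", "[your dsql username]",
  "--allow-writes"
]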

--cluster_endpoint

This is a mandatory parameter that specifies the cluster to connect to. It should be the full endpoint of your cluster, e.g., 01abc2ldefg3hijklmnopqurstu.dsql.us-east-1.on.aws
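
If you do not have the endpoint handy, you can look up your cluster identifiers with the AWS CLI; this sketch assumes the aws dsql list-clusters command is available in your CLI version, and the endpoint then follows the <cluster id>.dsql.<region>.on.aws pattern shown above:

# List DSQL cluster ARNs in the chosen region (the identifier is the last path segment)
aws dsql list-clusters --region us-east-1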

--database_user

This is a mandatory parameter to specify the database user to connect as, for example admin or my_user. Note that the AWS credentials you are using must have permission to log in as that user. For more information on setting up and using database roles in DSQL, see Using database roles with IAM roles.
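
As a rough illustration, the IAM identity behind your credentials typically needs a DSQL connect permission for the target cluster. The action name and ARN format below are assumptions to adapt to your account; the admin user generally requires dsql:DbConnectAdmin instead of dsql:DbConnect:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "dsql:DbConnect",
      "Resource": "arn:aws:dsql:us-east-1:111122223333:cluster/[your cluster id]"
    }
  ]
}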

--profile

You can specify the AWS profile to use for your credentials. Note that this is not supported for the Docker installation.

Using the AWS_PROFILE environment variable in your MCP configuration is also supported:

"env": {
  "AWS_PROFILE": "your-aws-profile"
}

If neither is provided, the MCP server defaults to using the "default" profile in your AWS configuration file.

--region

This is a mandatory parameter to specify the region of your DSQL database.

Development and Testing

Running Tests

This project includes comprehensive tests to validate the read-only enforcement mechanisms. To run the tests:

# Install dependencies and run tests
uv run pytest tests/test_readonly_enforcement.py -v

# Run all tests
uv run pytest -v

# Run tests with coverage
uv run pytest --cov=awslabs.aurora_dsql_mcp_server tests/ -v

Local Docker Testing

To test the MCP server locally using Docker:

  1. Build the Docker image:

    cd src/aurora-dsql-mcp-server
    docker build -t awslabs/aurora-dsql-mcp-server:latest .
    

  2. Create AWS credentials file:

Option A - Manual creation:

# Create .env file with your AWS credentials
cat > .env << EOF
AWS_ACCESS_KEY_ID=your_access_key_here
AWS_SECRET_ACCESS_KEY=your_secret_key_here
AWS_SESSION_TOKEN=your_session_token_here
EOF

Option B - Export from AWS CLI:

aws configure export-credentials --profile your-profile-name --format env > temp_aws_credentials.env
sed 's/^export //' temp_aws_credentials.env > .env
rm temp_aws_credentials.env

  3. Test the container directly:

    docker run -i --rm \
      --env-file .env \
      awslabs/aurora-dsql-mcp-server:latest \
      --cluster_endpoint "your-dsql-cluster-endpoint" \
      --database_user "your-username" \
      --region "us-east-1"
    

  4. Test with write operations enabled:

    docker run -i --rm \
      --env-file .env \
      awslabs/aurora-dsql-mcp-server:latest \
      --cluster_endpoint "your-dsql-cluster-endpoint" \
      --database_user "your-username" \
      --region "us-east-1" \
      --allow-writes
    

Note: Replace the placeholder values with your actual DSQL cluster endpoint, username, and region.
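
To see the container actually respond on stdio (beyond simply starting), you can pipe a single MCP initialize request into it. This is a rough smoke-test sketch: the JSON-RPC handshake shown here is abbreviated, and the protocolVersion value may need to match what your MCP client sends.

# Hypothetical stdio smoke test: send one initialize request and read the response
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.1"}}}' | \
  docker run -i --rm \
    --env-file .env \
    awslabs/aurora-dsql-mcp-server:latest \
    --cluster_endpoint "your-dsql-cluster-endpoint" \
    --database_user "your-username" \
    --region "us-east-1"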