# AWS Bedrock Data Automation MCP Server
A Model Context Protocol (MCP) server for Amazon Bedrock Data Automation that enables AI assistants to analyze documents, images, videos, and audio files using Amazon Bedrock Data Automation projects.
## Features
- Project Management: List and get details about Bedrock Data Automation projects
- Asset Analysis: Extract insights from unstructured content using Bedrock Data Automation
- Support for Multiple Content Types: Process documents, images, videos, and audio files
- Integration with Amazon S3: Seamlessly upload and download assets and results
## Prerequisites

- Install `uv` from Astral or the GitHub README
- Install Python using `uv python install 3.10`
- Set up AWS credentials with access to Amazon Bedrock Data Automation
  - You need an AWS account with Amazon Bedrock Data Automation enabled
  - Configure AWS credentials with `aws configure` or environment variables
  - Ensure your IAM role/user has permissions to use Amazon Bedrock Data Automation
- Create an AWS S3 bucket
  - Example AWS CLI command to create the bucket:

    ```bash
    aws s3 mb s3://<bucket-name>
    ```
## Installation

Here are some ways you can work with MCP across AWS, with support for more products, including Amazon Q Developer CLI, being added. Add the server to your MCP client configuration (e.g., for the Amazon Q Developer CLI, `~/.aws/amazonq/mcp.json`):
```json
{
  "mcpServers": {
    "awslabs.aws-bedrock-data-automation-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.aws-bedrock-data-automation-mcp-server@latest"],
      "env": {
        "AWS_PROFILE": "your-aws-profile",
        "AWS_REGION": "us-east-1",
        "AWS_BUCKET_NAME": "your-s3-bucket-name",
        "BASE_DIR": "/path/to/base/directory",
        "FASTMCP_LOG_LEVEL": "ERROR"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}
```
Or use Docker, after a successful `docker build -t awslabs/aws-bedrock-data-automation-mcp-server .`:
```json
{
  "mcpServers": {
    "awslabs.aws-bedrock-data-automation-mcp-server": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "--interactive",
        "--env", "AWS_PROFILE",
        "--env", "AWS_REGION",
        "--env", "AWS_BUCKET_NAME",
        "--env", "BASE_DIR",
        "--env", "FASTMCP_LOG_LEVEL",
        "awslabs/aws-bedrock-data-automation-mcp-server:latest"
      ],
      "env": {
        "AWS_PROFILE": "your-aws-profile",
        "AWS_REGION": "us-east-1",
        "AWS_BUCKET_NAME": "your-s3-bucket-name",
        "BASE_DIR": "/path/to/base/directory",
        "FASTMCP_LOG_LEVEL": "ERROR"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}
```
## Environment Variables

- `AWS_PROFILE`: AWS CLI profile to use for credentials
- `AWS_REGION`: AWS region to use (default: `us-east-1`)
- `AWS_BUCKET_NAME`: S3 bucket name for storing assets and results
- `BASE_DIR`: Base directory for file operations (optional)
- `FASTMCP_LOG_LEVEL`: Logging level (`ERROR`, `WARNING`, `INFO`, `DEBUG`)
## AWS Authentication

The server uses the AWS profile specified in the `AWS_PROFILE` environment variable. If the variable is not set, it falls back to the default credential provider chain.
```json
"env": {
  "AWS_PROFILE": "your-aws-profile",
  "AWS_REGION": "us-east-1"
}
```
Make sure the AWS profile has permissions to access Amazon Bedrock Data Automation. The MCP server creates a boto3 session using the specified profile to authenticate with AWS services. Amazon Bedrock Data Automation is currently available in the following regions: `us-east-1` and `us-west-2`.
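Because the service is limited to those two regions, a client-side guard can fail fast on an unsupported region. This is a hypothetical helper (not part of the server), and the region list is a point-in-time assumption taken from the note above:

```python
# Regions where Amazon Bedrock Data Automation is currently available,
# per the documentation above; this set may grow over time.
SUPPORTED_REGIONS = {"us-east-1", "us-west-2"}


def validate_region(region: str) -> str:
    """Return the region unchanged if supported, otherwise raise."""
    if region not in SUPPORTED_REGIONS:
        raise ValueError(
            f"Amazon Bedrock Data Automation is not available in {region!r}; "
            f"choose one of {sorted(SUPPORTED_REGIONS)}"
        )
    return region
```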
## Tools
### getprojects

Get a list of data automation projects.

```python
getprojects() -> list
```

Returns a list of available Bedrock Data Automation projects.
### getprojectdetails

Get details of a specific data automation project.

```python
getprojectdetails(projectArn: str) -> dict
```

Returns detailed information about a specific Bedrock Data Automation project.
### analyzeasset

Analyze an asset using a data automation project.

```python
analyzeasset(assetPath: str, projectArn: Optional[str] = None) -> dict
```

Extracts insights from unstructured content (documents, images, videos, audio) using Amazon Bedrock Data Automation.

- `assetPath`: Path to the asset file to analyze
- `projectArn`: ARN of the Bedrock Data Automation project to use (optional; uses the default public project if not provided)
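The service handles all four content types through this one tool. Purely as an illustration of those modalities (the helper and its extension map are hypothetical, not how the server actually classifies assets), a local pre-check might map a file extension to a modality before upload:

```python
from pathlib import Path

# Hypothetical extension-to-modality map covering the four supported
# content types: documents, images, videos, and audio files.
MODALITY_BY_EXTENSION = {
    ".pdf": "document", ".docx": "document",
    ".jpg": "image", ".jpeg": "image", ".png": "image",
    ".mp4": "video", ".mov": "video",
    ".mp3": "audio", ".wav": "audio", ".flac": "audio",
}


def guess_modality(asset_path: str) -> str:
    """Guess the content modality of an asset from its file extension."""
    ext = Path(asset_path).suffix.lower()
    try:
        return MODALITY_BY_EXTENSION[ext]
    except KeyError:
        raise ValueError(f"Unrecognized asset type: {ext or asset_path!r}")
```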
## Example Usage

```python
# List available projects
projects = await getprojects()
# Get details of a specific project
project_details = await getprojectdetails(projectArn="arn:aws:bedrock:us-east-1:123456789012:data-automation-project/my-project")
# Analyze a document
results = await analyzeasset(assetPath="/path/to/document.pdf")
# Analyze an image with a specific project
results = await analyzeasset(
assetPath="/path/to/image.jpg",
projectArn="arn:aws:bedrock:us-east-1:123456789012:data-automation-project/my-project"
)
```
## Security Considerations
- Use AWS IAM roles with appropriate permissions
- Store credentials securely
- Use temporary credentials when possible
- Ensure S3 bucket permissions are properly configured
## License
This project is licensed under the Apache License, Version 2.0. See the LICENSE file for details.