Python MCP Server
Generate a Python Model Context Protocol (MCP) server for providing context to Large Language Models (LLMs), and optionally deploy it to Amazon Bedrock AgentCore.
What is MCP?
The Model Context Protocol (MCP) is an open standard that allows AI assistants to interact with external tools and resources. It provides a consistent way for LLMs to:
- Execute tools (functions) that perform actions or retrieve information
- Access resources that provide context or data
Generate an MCP Server
You can generate a Python MCP server in two ways:
- Install the Nx Console VSCode Plugin if you haven't already
- Open the Nx Console in VSCode
- Click `Generate (UI)` in the "Common Nx Commands" section
- Search for `@aws/nx-plugin - py#mcp-server`
- Fill in the required parameters
- Click `Generate`
```sh
pnpm nx g @aws/nx-plugin:py#mcp-server
yarn nx g @aws/nx-plugin:py#mcp-server
npx nx g @aws/nx-plugin:py#mcp-server
bunx nx g @aws/nx-plugin:py#mcp-server
```
You can also perform a dry-run to see what files would be changed:
```sh
pnpm nx g @aws/nx-plugin:py#mcp-server --dry-run
yarn nx g @aws/nx-plugin:py#mcp-server --dry-run
npx nx g @aws/nx-plugin:py#mcp-server --dry-run
bunx nx g @aws/nx-plugin:py#mcp-server --dry-run
```
Options
| Parameter | Type | Default | Description |
|---|---|---|---|
| project (required) | string | - | The project to add an MCP server to |
| computeType | string | BedrockAgentCoreRuntime | The type of compute to host your MCP server. Select None for no hosting. |
| name | string | - | The name of your MCP server (default: mcp-server) |
| iacProvider | string | CDK | The preferred IaC provider |
Generator Output
The generator will add the following files to your existing Python project:
```
your-project/
  your_module/
    mcp_server/ (or custom name if specified)
      __init__.py    Python package initialization
      server.py      Main server definition with sample tools and resources
      stdio.py       Entry point for STDIO transport, useful for simple local MCP servers
      http.py        Entry point for Streamable HTTP transport, useful for hosting your MCP server
      Dockerfile     Entry point for hosting your MCP server (excluded when computeType is set to None)
  pyproject.toml     Updated with MCP dependencies
  project.json       Updated with MCP server serve targets
```
Infrastructure
Since this generator vends infrastructure as code based on your chosen iacProvider, it will create a project in packages/common which includes the relevant CDK constructs or Terraform modules.
The common infrastructure as code project is structured as follows:
```
packages/common/constructs/
  src/
    app/         Constructs for infrastructure specific to a project/generator
      …
    core/        Generic constructs which are reused by constructs in app
      …
    index.ts     Entry point exporting constructs from app
  project.json   Project build targets and configuration
```
```
packages/common/terraform/
  src/
    app/         Terraform modules for infrastructure specific to a project/generator
      …
    core/        Generic modules which are reused by modules in app
      …
  project.json   Project build targets and configuration
```
For deploying your MCP Server, the following files are generated:
```
packages/common/constructs/src/
  app/
    mcp-servers/
      <project-name>/
        <project-name>.ts   CDK construct for deploying your MCP Server
        Dockerfile          Passthrough Dockerfile used by the CDK construct
  core/
    agent-core/
      runtime.ts            Generic CDK construct for deploying to Bedrock AgentCore Runtime
```
```
packages/common/terraform/src/
  app/
    mcp-servers/
      <project-name>/
        <project-name>.tf   Module for deploying your MCP Server
  core/
    agent-core/
      runtime.tf            Generic module for deploying to Bedrock AgentCore Runtime
```
Working with Your MCP Server
Adding Tools
Tools are functions that the AI assistant can call to perform actions. The Python MCP server uses the MCP Python SDK (FastMCP) library, which provides a simple decorator-based approach for defining tools.
You can add new tools in the server.py file:
```python
@mcp.tool(description="Your tool description")
def your_tool_name(param1: str, param2: int) -> str:
    """Tool implementation with type hints"""
    # Your tool logic here
    return f"Result: {param1} with {param2}"
```
The FastMCP library automatically handles:
- Type validation based on your function’s type hints
- JSON schema generation for the MCP protocol
- Error handling and response formatting
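To illustrate what FastMCP automates for you, here is a minimal stdlib-only sketch of deriving a JSON schema from a function's type hints. The helper name and type mapping are illustrative, not FastMCP internals:

```python
import inspect
from typing import get_type_hints

# Illustrative mapping from Python types to JSON schema types
# (a simplified stand-in for what FastMCP derives automatically)
_JSON_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}

def tool_schema(fn):
    """Build a minimal JSON schema for a tool function's parameters."""
    hints = get_type_hints(fn)
    hints.pop("return", None)
    params = inspect.signature(fn).parameters
    return {
        "type": "object",
        "properties": {name: {"type": _JSON_TYPES[hints[name]]} for name in params},
        # Parameters without defaults are required
        "required": [n for n, p in params.items() if p.default is inspect.Parameter.empty],
    }

def your_tool_name(param1: str, param2: int) -> str:
    return f"Result: {param1} with {param2}"

schema = tool_schema(your_tool_name)
```

This is why accurate type hints matter: they become the contract the AI assistant sees when deciding how to call your tool.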
Adding Resources
Resources provide context to the AI assistant. You can add resources using the @mcp.resource decorator:
```python
@mcp.resource("example://static-resource", description="Static resource example")
def static_resource() -> str:
    """Return static content"""
    return "This is static content that provides context to the AI"
```
```python
@mcp.resource("dynamic://resource/{item_id}", description="Dynamic resource example")
def dynamic_resource(item_id: str) -> str:
    """Return dynamic content based on parameters"""
    # Fetch data based on item_id
    data = fetch_data_for_item(item_id)
    return f"Dynamic content for {item_id}: {data}"
```
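A dynamic resource URI template binds path segments like {item_id} to function parameters. As a rough stdlib-only sketch of how such matching can work (this is not the MCP SDK's actual implementation), a template can be compiled to a regex with named groups:

```python
import re

def compile_template(template: str):
    """Compile a URI template like 'dynamic://resource/{item_id}' into a regex.

    Illustrative only: the MCP SDK performs its own matching internally.
    """
    escaped = re.escape(template)
    # Re-introduce the {param} placeholders (escaped above) as named capture groups
    pattern = re.sub(r"\\\{(\w+)\\\}", r"(?P<\1>[^/]+)", escaped)
    return re.compile(f"^{pattern}$")

matcher = compile_template("dynamic://resource/{item_id}")
match = matcher.match("dynamic://resource/abc123")
params = match.groupdict()  # extracted parameters passed to the resource function
```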
Configuring with AI Assistants
Configuration Files
Most AI assistants that support MCP use a similar configuration approach. You'll need to create or update a configuration file with your MCP server details:
```json
{
  "mcpServers": {
    "your-mcp-server": {
      "command": "uv",
      "args": ["run", "python", "-m", "my_module.mcp_server.stdio"],
      "env": {
        "VIRTUAL_ENV": "/path/to/your/project/.venv"
      }
    }
  }
}
```
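If you manage several servers or environments, you may prefer to generate this configuration programmatically. A small sketch, where the helper name and parameters are illustrative:

```python
import json

def mcp_server_config(name: str, module: str, venv: str) -> dict:
    """Build an mcpServers entry for an assistant configuration file.

    Convenience sketch only; adjust keys to match your assistant's schema.
    """
    return {
        "mcpServers": {
            name: {
                "command": "uv",
                "args": ["run", "python", "-m", module],
                "env": {"VIRTUAL_ENV": venv},
            }
        }
    }

config = mcp_server_config(
    "your-mcp-server", "my_module.mcp_server.stdio", "/path/to/your/project/.venv"
)
print(json.dumps(config, indent=2))
```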
Assistant-Specific Configuration
Please refer to the following documentation for configuring MCP with specific AI Assistants:
Running Your MCP Server
Inspector
The generator configures a target named <your-server-name>-inspect, which starts the MCP Inspector with the configuration to connect to your MCP server using STDIO transport.
```sh
pnpm nx run your-project:your-server-name-inspect
yarn nx run your-project:your-server-name-inspect
npx nx run your-project:your-server-name-inspect
bunx nx run your-project:your-server-name-inspect
```
This will start the inspector at http://localhost:6274. Get started by clicking the "Connect" button.
The easiest way to test and use an MCP server is by using the inspector or configuring it with an AI assistant (as above).
You can, however, run your server with STDIO transport directly using the <your-server-name>-serve-stdio target.
```sh
pnpm nx run your-project:your-server-name-serve-stdio
yarn nx run your-project:your-server-name-serve-stdio
npx nx run your-project:your-server-name-serve-stdio
bunx nx run your-project:your-server-name-serve-stdio
```
This command uses uv run to execute your MCP server with STDIO transport.
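With STDIO transport, client and server exchange JSON-RPC 2.0 messages over stdin/stdout, beginning with an initialize handshake. As a rough sketch of the first message a client sends (field values here are illustrative; consult the MCP specification for the authoritative shape):

```python
import json

def initialize_request(request_id: int = 1) -> str:
    """Serialize an MCP initialize request as a JSON-RPC 2.0 message.

    The protocolVersion and clientInfo values are example placeholders.
    """
    message = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }
    return json.dumps(message)
```

The inspector and AI assistants handle this handshake for you, which is why they are the easiest way to exercise the server.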
Streamable HTTP
If you would like to run your MCP server locally using Streamable HTTP transport, you can use the <your-server-name>-serve-http target.
```sh
pnpm nx run your-project:your-server-name-serve-http
yarn nx run your-project:your-server-name-serve-http
npx nx run your-project:your-server-name-serve-http
bunx nx run your-project:your-server-name-serve-http
```
This command uses uv run to execute your MCP server with HTTP transport, typically running on port 8000.
Deploying Your MCP Server to Bedrock AgentCore Runtime
Infrastructure as Code
If you selected BedrockAgentCoreRuntime for computeType, the relevant CDK or Terraform infrastructure is generated, which you can use to deploy your MCP server to Amazon Bedrock AgentCore Runtime.
A CDK construct is generated for you, named based on the name you chose when running the generator, or <ProjectName>McpServer by default.
You can use this CDK construct in a CDK application:
```ts
import { Stack } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import { MyProjectMcpServer } from ':my-scope/common-constructs';

export class ExampleStack extends Stack {
  constructor(scope: Construct, id: string) {
    super(scope, id);
    // Add the MCP server to your stack
    new MyProjectMcpServer(this, 'MyProjectMcpServer');
  }
}
```
A Terraform module is generated for you, named based on the name you chose when running the generator, or <ProjectName>-mcp-server by default.
You can use this Terraform module in a Terraform project:
```hcl
# MCP Server
module "my_project_mcp_server" {
  # Relative path to the generated module in the common/terraform project
  source = "../../common/terraform/src/app/mcp-servers/my-project-mcp-server"
}
```
Authentication
By default, your MCP server is secured using IAM authentication; simply deploy it without any arguments:
```ts
import { Stack } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import { MyProjectMcpServer } from ':my-scope/common-constructs';

export class ExampleStack extends Stack {
  constructor(scope: Construct, id: string) {
    super(scope, id);
    new MyProjectMcpServer(this, 'MyProjectMcpServer');
  }
}
```
You can grant access to invoke your MCP server on Bedrock AgentCore Runtime using the grantInvoke method. For example, you may wish for an agent generated with the py#strands-agent generator to call your MCP server:
```ts
import { Stack } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import { MyProjectAgent, MyProjectMcpServer } from ':my-scope/common-constructs';

export class ExampleStack extends Stack {
  constructor(scope: Construct, id: string) {
    super(scope, id);
    const agent = new MyProjectAgent(this, 'MyProjectAgent');
    const mcpServer = new MyProjectMcpServer(this, 'MyProjectMcpServer');

    mcpServer.agentCoreRuntime.grantInvoke(agent.agentCoreRuntime);
  }
}
```
```hcl
# MCP Server
module "my_project_mcp_server" {
  # Relative path to the generated module in the common/terraform project
  source = "../../common/terraform/src/app/mcp-servers/my-project-mcp-server"
}
```
To grant access to invoke your MCP server, you will need to add a policy such as the following, referencing the module.my_project_mcp_server.agent_core_runtime_arn output:
```hcl
{
  Effect = "Allow"
  Action = [
    "bedrock-agentcore:InvokeAgentRuntime"
  ]
  Resource = [
    module.my_project_mcp_server.agent_core_runtime_arn,
    "${module.my_project_mcp_server.agent_core_runtime_arn}/*"
  ]
}
```
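The same statement, expressed as an IAM policy document, can be built programmatically. A small sketch (the ARN value is a placeholder) that makes the wildcard resource explicit:

```python
def invoke_policy_statement(runtime_arn: str) -> dict:
    """Build an IAM statement granting InvokeAgentRuntime on a runtime.

    Both the runtime ARN itself and its sub-resources (the "/*" wildcard)
    are included, mirroring the Terraform statement above.
    """
    return {
        "Effect": "Allow",
        "Action": ["bedrock-agentcore:InvokeAgentRuntime"],
        "Resource": [runtime_arn, f"{runtime_arn}/*"],
    }

# Hypothetical ARN for illustration only
stmt = invoke_policy_statement(
    "arn:aws:bedrock-agentcore:us-east-1:123456789012:runtime/example"
)
```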
Cognito JWT Authentication
The following demonstrates how to configure Cognito authentication for your MCP server.
To configure JWT authentication, you can pass the authorizerConfiguration property to your MCP server construct. Here is an example which configures a Cognito user pool and client to secure the MCP server:
```ts
import { Stack } from 'aws-cdk-lib';
import { UserPool } from 'aws-cdk-lib/aws-cognito';
import { Construct } from 'constructs';
import { MyProjectMcpServer } from ':my-scope/common-constructs';

export class ExampleStack extends Stack {
  constructor(scope: Construct, id: string) {
    super(scope, id);
    const userPool = new UserPool(this, 'UserPool');
    const client = userPool.addClient('Client', {
      authFlows: {
        userPassword: true,
      },
    });

    new MyProjectMcpServer(this, 'MyProjectMcpServer', {
      authorizerConfiguration: {
        customJWTAuthorizer: {
          discoveryUrl: `https://cognito-idp.${Stack.of(userPool).region}.amazonaws.com/${userPool.userPoolId}/.well-known/openid-configuration`,
          allowedClients: [client.userPoolClientId],
        },
      },
    });
  }
}
```
To configure JWT authentication, you can edit your MCP Server module to configure the customJWTAuthorizer variable as follows:
```hcl
data "aws_region" "current" {}

locals {
  aws_region = data.aws_region.current.name

  # Replace with your user pool and client ids or expose as variables
  user_pool_id         = "xxx"
  user_pool_client_ids = ["yyy"]
}

module "agent_core_runtime" {
  source             = "../../../core/agent-core"
  agent_runtime_name = "MyProjectMcpServer"
  docker_image_tag   = "my-scope-my-project-agent:latest"
  server_protocol    = "MCP"
  customJWTAuthorizer = {
    discoveryUrl   = "https://cognito-idp.${local.aws_region}.amazonaws.com/${local.user_pool_id}/.well-known/openid-configuration"
    allowedClients = local.user_pool_client_ids
  }
  env                              = var.env
  additional_iam_policy_statements = var.additional_iam_policy_statements
  tags                             = var.tags
}
```
Bundle and Docker Targets
To build your MCP server for Bedrock AgentCore Runtime, a bundle target is added to your project, which:
- Exports your Python dependencies to a requirements.txt file using uv export
- Installs dependencies for the target platform (aarch64-manylinux2014) using uv pip install
A docker target specific to your MCP server is also added, which:
- Builds a docker image from the Dockerfile which runs your MCP server on port 8000, as per the MCP protocol contract
Observability
Your MCP server is automatically configured with observability using the AWS Distro for OpenTelemetry (ADOT), by configuring auto-instrumentation in your Dockerfile.
You can find traces in the CloudWatch AWS Console, by selecting “GenAI Observability” in the menu. Note that for traces to be populated you will need to enable Transaction Search.
For more details, refer to the AgentCore documentation on observability.