
Python MCP Server

Generate a Python Model Context Protocol (MCP) server for providing context to Large Language Models (LLMs), and optionally deploy it to Amazon Bedrock AgentCore.

The Model Context Protocol (MCP) is an open standard that allows AI assistants to interact with external tools and resources. It provides a consistent way for LLMs to:

  • Execute tools (functions) that perform actions or retrieve information
  • Access resources that provide context or data

You can generate a Python MCP server using the Nx Console VSCode Plugin:

  1. Install the Nx Console VSCode Plugin if you haven't already
  2. Open the Nx Console in VSCode
  3. Click Generate (UI) in the "Common Nx Commands" section
  4. Search for @aws/nx-plugin - py#mcp-server
  5. Fill in the required parameters
  6. Click Generate
    Parameter    Type    Default                  Description
    project      string  -                        (Required) The project to add an MCP server to
    computeType  string  BedrockAgentCoreRuntime  The type of compute to host your MCP server. Select None for no hosting.
    name         string  mcp-server               The name of your MCP server
    iacProvider  string  CDK                      The preferred IaC provider

    The generator will add the following files to your existing Python project:

    • your-project/
      • your_module/
        • mcp_server/ (or custom name if specified)
          • __init__.py - Python package initialization
          • server.py - Main server definition with sample tools and resources
          • stdio.py - Entry point for STDIO transport, useful for simple local MCP servers
          • http.py - Entry point for Streamable HTTP transport, useful for hosting your MCP server
          • Dockerfile - Entry point for hosting your MCP server (excluded when computeType is set to None)
      • pyproject.toml - Updated with MCP dependencies
      • project.json - Updated with MCP server serve targets

    Since this generator vends infrastructure as code based on your chosen iacProvider, it will create a project in packages/common which includes the relevant CDK constructs or Terraform modules.

    The common infrastructure as code project is structured as follows:

    • packages/common/constructs
      • src
        • app/ - Constructs for infrastructure specific to a project/generator
        • core/ - Generic constructs which are reused by constructs in app
        • index.ts - Entry point exporting constructs from app
      • project.json - Project build targets and configuration

    For deploying your MCP Server, the following files are generated:

    • packages/common/constructs/src
      • app
        • mcp-servers
          • <project-name>
            • <project-name>.ts - CDK construct for deploying your MCP Server
            • Dockerfile - Passthrough Dockerfile used by the CDK construct
      • core
        • agent-core
          • runtime.ts - Generic CDK construct for deploying to Bedrock AgentCore Runtime

    Tools are functions that the AI assistant can call to perform actions. The Python MCP server uses the MCP Python SDK (FastMCP) library, which provides a simple decorator-based approach for defining tools.

    You can add new tools in the server.py file:

    @mcp.tool(description="Your tool description")
    def your_tool_name(param1: str, param2: int) -> str:
        """Tool implementation with type hints"""
        # Your tool logic here
        return f"Result: {param1} with {param2}"

    The FastMCP library automatically handles:

    • Type validation based on your function’s type hints
    • JSON schema generation for the MCP protocol
    • Error handling and response formatting
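As a rough illustration of the first two bullets, here is a simplified, stdlib-only sketch of how a JSON schema can be derived from a function's type hints. FastMCP itself uses Pydantic and supports many more types; `tool_schema` and `_JSON_TYPES` below are hypothetical helpers, not part of the SDK:

```python
import inspect
from typing import get_type_hints

# A tool function mirroring the example above.
def your_tool_name(param1: str, param2: int) -> str:
    """Tool implementation with type hints"""
    return f"Result: {param1} with {param2}"

# Simplified mapping from Python types to JSON schema types.
_JSON_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}

def tool_schema(fn):
    """Derive a JSON schema for fn's parameters from its type hints."""
    hints = get_type_hints(fn)
    hints.pop("return", None)
    params = inspect.signature(fn).parameters
    return {
        "type": "object",
        "properties": {name: {"type": _JSON_TYPES[hints[name]]} for name in params},
        # Parameters without defaults are required.
        "required": [n for n, p in params.items() if p.default is inspect.Parameter.empty],
    }

schema = tool_schema(your_tool_name)
# schema == {"type": "object",
#            "properties": {"param1": {"type": "string"}, "param2": {"type": "integer"}},
#            "required": ["param1", "param2"]}
```

A schema like this is what the server advertises to clients for each tool, so the assistant knows exactly which arguments to supply.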

    Resources provide context to the AI assistant. You can add resources using the @mcp.resource decorator:

    @mcp.resource("example://static-resource", description="Static resource example")
    def static_resource() -> str:
        """Return static content"""
        return "This is static content that provides context to the AI"

    @mcp.resource("dynamic://resource/{item_id}", description="Dynamic resource example")
    def dynamic_resource(item_id: str) -> str:
        """Return dynamic content based on parameters"""
        # Fetch data based on item_id
        data = fetch_data_for_item(item_id)
        return f"Dynamic content for {item_id}: {data}"
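The URI template in the second example (`dynamic://resource/{item_id}`) is matched against concrete resource URIs by FastMCP internally. A rough stdlib-only sketch of that matching (`match_template` is a hypothetical helper, not the SDK's actual implementation):

```python
import re

def match_template(template: str, uri: str):
    """Match a URI against a template, returning extracted parameters or None."""
    # Turn "{param}" placeholders into named capture groups. A robust
    # implementation would also escape regex metacharacters in the
    # literal parts of the template; this sketch skips that.
    pattern = re.sub(r"\{(\w+)\}", r"(?P<\1>[^/]+)", template)
    match = re.fullmatch(pattern, uri)
    return match.groupdict() if match else None

params = match_template("dynamic://resource/{item_id}", "dynamic://resource/abc-123")
# params == {"item_id": "abc-123"}
```

The extracted parameters are then passed to your resource function as keyword arguments.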

    Most AI assistants that support MCP use a similar configuration approach. You’ll need to create or update a configuration file with your MCP server details:

    {
      "mcpServers": {
        "your-mcp-server": {
          "command": "uv",
          "args": [
            "run",
            "python",
            "-m",
            "my_module.mcp_server.stdio"
          ],
          "env": {
            "VIRTUAL_ENV": "/path/to/your/project/.venv"
          }
        }
      }
    }
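If you script your assistant setup, an entry like the one above can be merged into an existing configuration without clobbering other registered servers. A small sketch (`register_server` is a hypothetical helper; substitute your own module path and virtual environment path):

```python
import json

def register_server(config: dict, name: str, module: str, venv_path: str) -> dict:
    """Add (or update) an MCP server entry, preserving existing servers."""
    config.setdefault("mcpServers", {})[name] = {
        "command": "uv",
        "args": ["run", "python", "-m", module],
        "env": {"VIRTUAL_ENV": venv_path},
    }
    return config

config = register_server(
    {},  # or json.load(...) of an existing configuration file
    "your-mcp-server",
    "my_module.mcp_server.stdio",
    "/path/to/your/project/.venv",
)
print(json.dumps(config, indent=2))  # matches the JSON shown above
```

Write the result back to your assistant's configuration file location.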

    Refer to your AI assistant's documentation for its specific MCP configuration instructions.

    The generator configures a target named <your-server-name>-inspect, which starts the MCP Inspector preconfigured to connect to your MCP server over STDIO transport.

    pnpm nx run your-project:your-server-name-inspect

    This will start the inspector at http://localhost:6274. Get started by clicking on the “Connect” button.

    The easiest way to test and use an MCP server is with the inspector or by configuring it with an AI assistant (as above).

    You can, however, run your server with STDIO transport directly using the <your-server-name>-serve-stdio target.

    pnpm nx run your-project:your-server-name-serve-stdio

    This command uses uv run to execute your MCP server with STDIO transport.
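Under STDIO transport, the server exchanges newline-delimited JSON-RPC 2.0 messages over stdin/stdout. As a rough sketch of the first message a client sends (the protocol version and capability fields here are illustrative; consult the MCP specification for the exact handshake):

```python
import json

# Sketch of the first message an MCP client writes to the server's stdin.
# The protocol version and capabilities shown are illustrative.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# STDIO framing: one JSON-RPC message per line.
line = json.dumps(initialize_request) + "\n"
```

In practice your AI assistant or the MCP Inspector performs this handshake for you; the sketch only shows what travels over the pipe.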

    If you would like to run your MCP server locally using Streamable HTTP transport, you can use the <your-server-name>-serve-http target.

    pnpm nx run your-project:your-server-name-serve-http

    This command uses uv run to execute your MCP server with HTTP transport, typically running on port 8000.

    Deploying Your MCP Server to Bedrock AgentCore Runtime


    If you selected BedrockAgentCoreRuntime for computeType, the relevant CDK or Terraform infrastructure is generated which you can use to deploy your MCP server to Amazon Bedrock AgentCore Runtime.

    A CDK construct is generated for you, named after the name you chose when running the generator, or <ProjectName>McpServer by default.

    You can use this CDK construct in a CDK application:

    import { Stack } from 'aws-cdk-lib';
    import { Construct } from 'constructs';
    import { MyProjectMcpServer } from ':my-scope/common-constructs';

    export class ExampleStack extends Stack {
      constructor(scope: Construct, id: string) {
        super(scope, id);

        // Add the MCP server to your stack
        new MyProjectMcpServer(this, 'MyProjectMcpServer');
      }
    }

    By default, your MCP server is secured with IAM authentication; simply deploy it without any arguments, as shown above.

    You can grant access to invoke your MCP server on Bedrock AgentCore Runtime using the grantInvoke method. For example, you may wish for an agent generated with the py#strands-agent generator to call your MCP server:

    import { Stack } from 'aws-cdk-lib';
    import { Construct } from 'constructs';
    import { MyProjectAgent, MyProjectMcpServer } from ':my-scope/common-constructs';

    export class ExampleStack extends Stack {
      constructor(scope: Construct, id: string) {
        super(scope, id);

        const agent = new MyProjectAgent(this, 'MyProjectAgent');
        const mcpServer = new MyProjectMcpServer(this, 'MyProjectMcpServer');

        mcpServer.agentCoreRuntime.grantInvoke(agent.agentCoreRuntime);
      }
    }

    To configure JWT authentication instead, pass the authorizerConfiguration property to your MCP server construct. The following example configures a Cognito user pool and client to secure the MCP server:

    import { Stack } from 'aws-cdk-lib';
    import { UserPool } from 'aws-cdk-lib/aws-cognito';
    import { Construct } from 'constructs';
    import { MyProjectMcpServer } from ':my-scope/common-constructs';

    export class ExampleStack extends Stack {
      constructor(scope: Construct, id: string) {
        super(scope, id);

        const userPool = new UserPool(this, 'UserPool');
        const client = userPool.addClient('Client', {
          authFlows: {
            userPassword: true,
          },
        });

        new MyProjectMcpServer(this, 'MyProjectMcpServer', {
          authorizerConfiguration: {
            customJWTAuthorizer: {
              discoveryUrl: `https://cognito-idp.${Stack.of(userPool).region}.amazonaws.com/${userPool.userPoolId}/.well-known/openid-configuration`,
              allowedClients: [client.userPoolClientId],
            },
          },
        });
      }
    }

    To build your MCP server for Bedrock AgentCore Runtime, a bundle target is added to your project, which:

    • Exports your Python dependencies to a requirements.txt file using uv export
    • Installs dependencies for the target platform (aarch64-manylinux2014) using uv pip install

    A docker target specific to your MCP server is also added, which:

    • Builds a Docker image from the Dockerfile, which runs your MCP server on port 8000 as per the MCP protocol contract

    Your MCP server is automatically configured with observability using the AWS Distro for OpenTelemetry (ADOT), via auto-instrumentation configured in your Dockerfile.

    You can find traces in the CloudWatch AWS Console, by selecting “GenAI Observability” in the menu. Note that for traces to be populated you will need to enable Transaction Search.

    For more details, refer to the AgentCore documentation on observability.