Python Strands Agent
Generate a Python Strands Agent for building AI agents with tools, and optionally deploy it to Amazon Bedrock AgentCore Runtime.
What is Strands?
Strands is a lightweight, production-ready Python framework for building AI agents. Key features include:
- Lightweight and customizable: Simple agent loop that gets out of your way
- Production ready: Full observability, tracing, and deployment options for scale
- Model and provider agnostic: Supports many different models from various providers
- Community-driven tools: Powerful set of community-contributed tools
- Multi-agent support: Advanced techniques like agent teams and autonomous agents
- Flexible interaction modes: Conversational, streaming, and non-streaming support
Generate a Strands Agent
You can generate a Python Strands Agent in two ways:
Using the Nx Console VSCode plugin:

- Install the Nx Console VSCode Plugin if you haven't already
- Open the Nx Console in VSCode
- Click `Generate (UI)` in the "Common Nx Commands" section
- Search for `@aws/nx-plugin - py#strands-agent`
- Fill in the required parameters
- Click `Generate`
Using the CLI:

```bash
pnpm nx g @aws/nx-plugin:py#strands-agent
yarn nx g @aws/nx-plugin:py#strands-agent
npx nx g @aws/nx-plugin:py#strands-agent
bunx nx g @aws/nx-plugin:py#strands-agent
```

You can also perform a dry-run to see what files would be changed:

```bash
pnpm nx g @aws/nx-plugin:py#strands-agent --dry-run
yarn nx g @aws/nx-plugin:py#strands-agent --dry-run
npx nx g @aws/nx-plugin:py#strands-agent --dry-run
bunx nx g @aws/nx-plugin:py#strands-agent --dry-run
```

Options
| Parameter | Type | Default | Description |
|---|---|---|---|
| project Required | string | - | The project to add the Strands Agent to |
| computeType | string | BedrockAgentCoreRuntime | The type of compute to host your Strands Agent. |
| name | string | - | The name of your Strands Agent (default: agent) |
| iacProvider | string | Inherit | The preferred IaC provider. By default this is inherited from your initial selection. |
Generator Output
The generator will add the following files to your existing Python project:
- `your-project/`
  - `your_module/`
    - `agent/` (or custom name if specified)
      - `__init__.py` Python package initialization
      - `agent.py` Main agent definition with sample tools
      - `main.py` Entry point for Bedrock AgentCore Runtime
      - `agentcore_mcp_client.py` Client factory useful for invoking MCP servers also hosted on Bedrock AgentCore Runtime
      - `Dockerfile` Entry point for hosting your agent (excluded when `computeType` is set to `None`)
  - `pyproject.toml` Updated with Strands dependencies
  - `project.json` Updated with agent serve targets
Infrastructure
Since this generator vends infrastructure as code based on your chosen `iacProvider`, it will create a project in `packages/common` which includes the relevant CDK constructs or Terraform modules.
The common infrastructure as code project is structured as follows:
- `packages/common/constructs/`
  - `src/`
    - `app/` Constructs for infrastructure specific to a project/generator
      - …
    - `core/` Generic constructs which are reused by constructs in `app`
      - …
    - `index.ts` Entry point exporting constructs from `app`
  - `project.json` Project build targets and configuration
- `packages/common/terraform/`
  - `src/`
    - `app/` Terraform modules for infrastructure specific to a project/generator
      - …
    - `core/` Generic modules which are reused by modules in `app`
      - …
  - `project.json` Project build targets and configuration
For deploying your Strands Agent, the following files are generated:
- `packages/common/constructs/src/`
  - `app/`
    - `agents/`
      - `<project-name>/`
        - `<project-name>.ts` CDK construct for deploying your agent
        - `Dockerfile` Passthrough docker file used by the CDK construct
  - `core/`
    - `agent-core/`
      - `runtime.ts` Generic CDK construct for deploying to Bedrock AgentCore Runtime
- `packages/common/terraform/src/`
  - `app/`
    - `agents/`
      - `<project-name>/`
        - `<project-name>.tf` Module for deploying your agent
  - `core/`
    - `agent-core/`
      - `runtime.tf` Generic module for deploying to Bedrock AgentCore Runtime
Working with Your Strands Agent
Adding Tools
Tools are functions that the AI agent can call to perform actions. The Strands framework uses a simple decorator-based approach for defining tools.
You can add new tools in the agent.py file:
```python
from strands import Agent, tool

@tool
def calculate_sum(numbers: list[int]) -> int:
    """Calculate the sum of a list of numbers"""
    return sum(numbers)

@tool
def get_weather(city: str) -> str:
    """Get weather information for a city"""
    # Your weather API integration here
    return f"Weather in {city}: Sunny, 25°C"

# Add tools to your agent
agent = Agent(
    system_prompt="You are a helpful assistant with access to various tools.",
    tools=[calculate_sum, get_weather],
)
```

The Strands framework automatically handles:
- Type validation based on your function’s type hints
- JSON schema generation for tool calling
- Error handling and response formatting
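Once tools are registered, you can invoke the agent directly to see tool calling in action. A minimal usage sketch (the prompt and output handling are illustrative only):

```python
# Ask a question that should cause the agent to call the calculate_sum tool
result = agent("What is the sum of 2, 4 and 6?")
print(result)
```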
Using Pre-built Tools
Strands provides a collection of pre-built tools through the `strands-tools` package:
```python
from strands import Agent
from strands_tools import current_time, http_request, file_read

agent = Agent(
    system_prompt="You are a helpful assistant.",
    tools=[current_time, http_request, file_read],
)
```

Model Configuration
By default, Strands agents use Claude 4 Sonnet, but you can customize the model provider. See the Strands documentation on model providers for configuration options:
```python
from strands import Agent
from strands.models import BedrockModel

# Create a BedrockModel
bedrock_model = BedrockModel(
    model_id="anthropic.claude-sonnet-4-20250514-v1:0",
    region_name="us-west-2",
    temperature=0.3,
)

agent = Agent(model=bedrock_model)
```

Consuming MCP Servers
You can add tools from MCP servers to your Strands agent.
For consuming MCP Servers which you have created using the py#mcp-server or ts#mcp-server generators (or others hosted on Bedrock AgentCore Runtime), a client factory is generated for you in agentcore_mcp_client.py.
You can update your get_agent method in agent.py to create MCP clients and add tools. The following example shows how to perform this with IAM (SigV4) authentication:
```python
import os
from contextlib import contextmanager

import boto3
from strands import Agent

from .agentcore_mcp_client import AgentCoreMCPClient

# Obtain the region and credentials
region = os.environ["AWS_REGION"]
boto_session = boto3.Session(region_name=region)
credentials = boto_session.get_credentials()

@contextmanager
def get_agent(session_id: str):
    mcp_client = AgentCoreMCPClient.with_iam_auth(
        agent_runtime_arn=os.environ["MCP_AGENTCORE_RUNTIME_ARN"],
        credentials=credentials,
        region=region,
        session_id=session_id,
    )

    with mcp_client:
        mcp_tools = mcp_client.list_tools_sync()

        yield Agent(
            system_prompt="...",
            tools=[*mcp_tools],
        )
```

With the IAM authentication example above, we need to configure two things in our infrastructure: first, the environment variable our agent reads for the MCP server's AgentCore Runtime ARN, and second, permission for our agent to invoke the MCP server. This can be achieved as follows:
For CDK:

```ts
import { MyProjectAgent, MyProjectMcpServer } from ':my-scope/common-constructs';

export class ExampleStack extends Stack {
  constructor(scope: Construct, id: string) {
    super(scope, id);

    const mcpServer = new MyProjectMcpServer(this, 'MyProjectMcpServer');

    const agent = new MyProjectAgent(this, 'MyProjectAgent', {
      environment: {
        MCP_AGENTCORE_RUNTIME_ARN: mcpServer.agentCoreRuntime.arn,
      },
    });

    mcpServer.agentCoreRuntime.grantInvoke(agent.agentCoreRuntime);
  }
}
```

For Terraform:

```hcl
# MCP Server
module "my_project_mcp_server" {
  source = "../../common/terraform/src/app/mcp-servers/my-project-mcp-server"
}

# Agent
module "my_project_agent" {
  source = "../../common/terraform/src/app/agents/my-project-agent"

  env = {
    MCP_AGENTCORE_RUNTIME_ARN = module.my_project_mcp_server.agent_core_runtime_arn
  }

  additional_iam_policy_statements = [
    {
      Effect = "Allow"
      Action = [
        "bedrock-agentcore:InvokeAgentRuntime"
      ]
      Resource = [
        module.my_project_mcp_server.agent_core_runtime_arn,
        "${module.my_project_mcp_server.agent_core_runtime_arn}/*"
      ]
    }
  ]
}
```

For a more in-depth guide to writing Strands agents, refer to the Strands documentation.
FastAPI Server
The generator uses FastAPI to create the HTTP server for your Strands Agent. FastAPI provides a modern, fast web framework for building APIs with Python, with automatic API documentation and type validation.
The generated server includes:
- FastAPI application setup with CORS middleware
- Error handling middleware
- OpenAPI schema generation
- Health check endpoint (`/ping`)
- Agent invocation endpoint (`/invocations`)
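As a rough illustration, the generated `main.py` wires these pieces together along the following lines (a simplified sketch, not the exact generated code; handler names, middleware settings, and the ping response shape may differ):

```python
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

app = FastAPI()
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_methods=["*"],
    allow_headers=["*"],
)

class InvokeInput(BaseModel):
    prompt: str
    session_id: str

async def handle_invoke(input: InvokeInput):
    # Placeholder for the generated handler, which streams text from the Strands agent
    yield f"echo: {input.prompt}"

@app.get("/ping")
async def ping():
    # Health check endpoint polled by Bedrock AgentCore Runtime
    return {"status": "Healthy"}

@app.post("/invocations", openapi_extra={"x-streaming": True})
async def invoke(input: InvokeInput) -> str:
    # Agent invocation endpoint; chunks stream back with the text/event-stream media type
    return StreamingResponse(handle_invoke(input), media_type="text/event-stream")
```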
Customizing Invoke Inputs and Outputs with Pydantic
The agent's invocation endpoint uses Pydantic models to define and validate the request and response schemas. You can customize these models in main.py to match your agent's requirements.
Defining Input Models
The default InvokeInput model accepts a prompt and session ID.
```python
from pydantic import BaseModel

class InvokeInput(BaseModel):
    prompt: str
    session_id: str
```

You can extend this model to include any additional fields your agent needs.
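For example, you might add optional fields for per-request configuration. The extra fields below are purely illustrative:

```python
from pydantic import BaseModel

class InvokeInput(BaseModel):
    prompt: str
    session_id: str
    # Illustrative additions; name them to suit your agent
    user_id: str | None = None
    temperature: float = 0.3
```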
Defining Output Models
For streaming responses, the return type annotation on your endpoint corresponds to the type of each value yielded by your generator function. By default, the agent yields strings containing the agent's response text as it streams back from Strands:
```python
@app.post("/invocations", openapi_extra={"x-streaming": True})
async def invoke(input: InvokeInput) -> str:
    """Entry point for agent invocation"""
    return StreamingResponse(handle_invoke(input), media_type="text/event-stream")
```

You can define a Pydantic model to yield structured data instead:
```python
from pydantic import BaseModel

class StreamChunk(BaseModel):
    content: str
    timestamp: str
    token_count: int

@app.post("/invocations", openapi_extra={"x-streaming": True})
async def invoke(input: InvokeInput) -> StreamChunk:
    return StreamingResponse(handle_invoke(input), media_type="application/json")
```
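When yielding structured chunks, your streaming handler is responsible for producing them (and, since StreamingResponse sends str/bytes, for serializing each one). A possible sketch, assuming the `get_agent` context manager from `agent.py` and Strands' `stream_async` event iterator; the serialization and field values here are illustrative:

```python
from datetime import datetime, timezone

async def handle_invoke(input: InvokeInput):
    with get_agent(input.session_id) as agent:
        async for event in agent.stream_async(input.prompt):
            # Strands emits text deltas under the "data" key as the model streams
            if "data" in event:
                chunk = StreamChunk(
                    content=event["data"],
                    timestamp=datetime.now(timezone.utc).isoformat(),
                    token_count=0,  # illustrative; populate from usage metadata if available
                )
                yield chunk.model_dump_json() + "\n"
```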
Bedrock AgentCore Python SDK

The generator includes a dependency on the Bedrock AgentCore Python SDK for the PingStatus constants. If desired, it is straightforward to use BedrockAgentCoreApp instead of FastAPI; however, note that type safety is lost.
You can find more details about the SDK’s capabilities in the documentation here.
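For reference, a BedrockAgentCoreApp-based entry point looks roughly like the following. This is a sketch based on the AgentCore getting-started examples (check the SDK documentation for the exact API), and the untyped payload dict is where the type safety mentioned above is lost:

```python
from bedrock_agentcore.runtime import BedrockAgentCoreApp
from strands import Agent

app = BedrockAgentCoreApp()
agent = Agent(system_prompt="You are a helpful assistant.")

@app.entrypoint
def invoke(payload):
    # payload is an untyped dict; no Pydantic validation is applied
    result = agent(payload.get("prompt", ""))
    return {"result": str(result)}

if __name__ == "__main__":
    app.run()
```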
Running Your Strands Agent
Section titled “Running Your Strands Agent”Local Development
The generator configures a target named `<your-agent-name>-serve`, which starts your Strands Agent locally for development and testing.
```bash
pnpm nx run your-project:agent-serve
yarn nx run your-project:agent-serve
npx nx run your-project:agent-serve
bunx nx run your-project:agent-serve
```

This command uses `uv run` to execute your Strands Agent using the Bedrock AgentCore Python SDK.
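Once the local server is running, you can exercise it with any HTTP client. A quick sketch using `requests`, assuming the server listens on port 8080 (the AgentCore runtime contract default); adjust the URL and payload to match your port and `InvokeInput` model:

```python
import requests

payload = {"prompt": "What is the sum of 2, 4 and 6?", "session_id": "local-test-session"}

# Stream the agent's response from the local /invocations endpoint
with requests.post("http://localhost:8080/invocations", json=payload, stream=True) as response:
    response.raise_for_status()
    for line in response.iter_lines(decode_unicode=True):
        if line:
            print(line)
```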
Deploying Your Strands Agent to Bedrock AgentCore Runtime
Section titled “Deploying Your Strands Agent to Bedrock AgentCore Runtime”Infrastructure as Code
If you selected `BedrockAgentCoreRuntime` for `computeType`, the relevant CDK or Terraform infrastructure is generated, which you can use to deploy your Strands Agent to Amazon Bedrock AgentCore Runtime.
A CDK construct is generated for you, named based on the name you chose when running the generator, or `<ProjectName>Agent` by default.
You can use this CDK construct in a CDK application:
```ts
import { MyProjectAgent } from ':my-scope/common-constructs';

export class ExampleStack extends Stack {
  constructor(scope: Construct, id: string) {
    super(scope, id);

    // Add the agent to your stack
    const agent = new MyProjectAgent(this, 'MyProjectAgent');

    // Grant permissions to invoke the relevant models in bedrock
    agent.agentCoreRuntime.role.addToPolicy(
      new PolicyStatement({
        actions: [
          'bedrock:InvokeModel',
          'bedrock:InvokeModelWithResponseStream',
        ],
        // You can scope the below down to the specific models you use
        resources: ['arn:aws:bedrock:*::foundation-model/*'],
      }),
    );
  }
}
```

A Terraform module is generated for you, named based on the name you chose when running the generator, or `<ProjectName>-agent` by default.
You can use this Terraform module in a Terraform project:
```hcl
# Agent
module "my_project_agent" {
  # Relative path to the generated module in the common/terraform project
  source = "../../common/terraform/src/app/agents/my-project-agent"

  # Grant permissions to invoke the relevant models in bedrock
  additional_iam_policy_statements = [
    {
      Effect = "Allow"
      Action = [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ]
      Resource = [
        "arn:aws:bedrock:*::foundation-model/*"
      ]
    }
  ]
}
```

Bundle and Docker Targets
In order to build your Strands Agent for Bedrock AgentCore Runtime, a bundle target is added to your project, which:
- Exports your Python dependencies to a `requirements.txt` file using `uv export`
- Installs dependencies for the target platform (`aarch64-manylinux2014`) using `uv pip install`
A docker target specific to your Strands Agent is also added, which:
- Builds a docker image from the `Dockerfile` which runs your agent, as per the AgentCore runtime contract
Authentication
By default, your Strands Agent is secured using IAM authentication; simply deploy it without any arguments:
```ts
import { MyProjectAgent } from ':my-scope/common-constructs';

export class ExampleStack extends Stack {
  constructor(scope: Construct, id: string) {
    super(scope, id);

    new MyProjectAgent(this, 'MyProjectAgent');
  }
}
```

You can grant access to invoke your agent on Bedrock AgentCore Runtime using the `grantInvoke` method, for example:
```ts
import { MyProjectAgent } from ':my-scope/common-constructs';

export class ExampleStack extends Stack {
  constructor(scope: Construct, id: string) {
    super(scope, id);

    const agent = new MyProjectAgent(this, 'MyProjectAgent');
    const lambdaFunction = new Function(this, ...);

    agent.agentCoreRuntime.grantInvoke(lambdaFunction);
  }
}
```

With Terraform, the agent module is likewise deployed without any additional arguments:

```hcl
# Agent
module "my_project_agent" {
  # Relative path to the generated module in the common/terraform project
  source = "../../common/terraform/src/app/agents/my-project-agent"
}
```

To grant access to invoke your agent, you will need to add a policy such as the following, referencing the `module.my_project_agent.agent_core_runtime_arn` output:

```hcl
{
  Effect = "Allow"
  Action = [
    "bedrock-agentcore:InvokeAgentRuntime"
  ]
  Resource = [
    module.my_project_agent.agent_core_runtime_arn,
    "${module.my_project_agent.agent_core_runtime_arn}/*"
  ]
}
```

Cognito JWT Authentication
The following demonstrates how to configure Cognito authentication for your agent.
To configure JWT authentication, you can pass the authorizerConfiguration property to your agent construct. Here is an example which configures a Cognito user pool and client to secure the agent:
```ts
import { MyProjectAgent } from ':my-scope/common-constructs';

export class ExampleStack extends Stack {
  constructor(scope: Construct, id: string) {
    super(scope, id);

    const userPool = new UserPool(this, 'UserPool');
    const client = userPool.addClient('Client', {
      authFlows: {
        userPassword: true,
      },
    });

    new MyProjectAgent(this, 'MyProjectAgent', {
      authorizerConfiguration: {
        customJwtAuthorizer: {
          discoveryUrl: `https://cognito-idp.${Stack.of(userPool).region}.amazonaws.com/${userPool.userPoolId}/.well-known/openid-configuration`,
          allowedClients: [client.userPoolClientId],
        },
      },
    });
  }
}
```

To configure JWT authentication with Terraform, you can edit your agent module to configure the `customJWTAuthorizer` variable as follows:
data "aws_region" "current" {}
locals { aws_region = data.aws_region.current.name
# Replace with your user pool and client ids or expose as variables user_pool_id = "xxx" user_pool_client_ids = ["yyy"]}
module "agent_core_runtime" { source = "../../../core/agent-core" agent_runtime_name = "MyProjectAgent" docker_image_tag = "my-scope-my-project-agent:latest" server_protocol = "HTTP" customJWTAuthorizer = { discoveryUrl = "https://cognito-idp.${local.aws_region}.amazonaws.com/${local.user_pool_id}/.well-known/openid-configuration", allowedClients = local.user_pool_client_ids } env = var.env additional_iam_policy_statements = var.additional_iam_policy_statements tags = var.tags}Observability
Your agent is automatically configured with observability using the AWS Distro for OpenTelemetry (ADOT), via auto-instrumentation configured in your Dockerfile.
You can find traces in the CloudWatch AWS Console by selecting "GenAI Observability" in the menu. Note that for traces to be populated, you will need to enable Transaction Search.
For more details, refer to the AgentCore documentation on observability.