FastAPI
FastAPI is a framework for building APIs in Python.
The FastAPI generator creates a new FastAPI project with AWS CDK infrastructure set up. The generated backend uses AWS Lambda for serverless deployment, exposed via an AWS API Gateway HTTP API. It sets up AWS Lambda Powertools for observability, including logging, AWS X-Ray tracing, and CloudWatch metrics.
Usage
Generate a FastAPI
You can generate a new FastAPI in two ways:
Using the Nx Console VSCode plugin:
- Install the Nx Console VSCode Plugin if you haven't already
- Open the Nx Console in VSCode
- Click Generate (UI) in the "Common Nx Commands" section
- Search for @aws/nx-plugin - py#fast-api
- Fill in the required parameters
- Click Generate

From the command line, using your preferred package manager:
```sh
pnpm nx g @aws/nx-plugin:py#fast-api
yarn nx g @aws/nx-plugin:py#fast-api
npx nx g @aws/nx-plugin:py#fast-api
bunx nx g @aws/nx-plugin:py#fast-api
```
You can also perform a dry-run to see what files would be changed:
```sh
pnpm nx g @aws/nx-plugin:py#fast-api --dry-run
yarn nx g @aws/nx-plugin:py#fast-api --dry-run
npx nx g @aws/nx-plugin:py#fast-api --dry-run
bunx nx g @aws/nx-plugin:py#fast-api --dry-run
```
Options
| Parameter | Type | Default | Description |
|---|---|---|---|
| name (required) | string | - | The project name. |
| directory | string | packages | The directory to store the application in. |
Generator Output
The generator will create the following project structure in the <directory>/<api-name> directory:
- project.json: Project configuration and build targets
- pyproject.toml: Python project configuration and dependencies
- <module_name>/
  - __init__.py: Module initialisation
  - init.py: Sets up the FastAPI app and configures Powertools middleware
  - main.py: API implementation
The generator also creates CDK constructs which can be used to deploy your API; these reside in the packages/common/constructs directory.
Implementing your FastAPI
The main API implementation is in main.py. This is where you define your API routes and their implementations. Here's an example:
```python
from .init import app, tracer
from pydantic import BaseModel

class Item(BaseModel):
    name: str

@app.get("/items/{item_id}")
def get_item(item_id: int) -> Item:
    return Item(name=...)

@app.post("/items")
def create_item(item: Item):
    return ...
```
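FastAPI derives request validation and the OpenAPI schema from these type annotations. As an illustration (the list_items route and its query parameters below are hypothetical, not generated code), a route with validated query parameters might look like this:

```python
from typing import Optional

from fastapi import Query
from pydantic import BaseModel

from .init import app

# Same shape as the Item model above; repeated so this sketch is self-contained.
class Item(BaseModel):
    name: str

# Hypothetical route: limit must be between 1 and 100, and FastAPI returns a
# 422 response automatically if validation fails.
@app.get("/items")
def list_items(
    limit: int = Query(default=10, ge=1, le=100),
    name_filter: Optional[str] = None,
) -> list[Item]:
    items = [Item(name=f"item-{i}") for i in range(limit)]
    if name_filter is not None:
        items = [item for item in items if name_filter in item.name]
    return items
```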
The generator sets up several features automatically (sketched below):
- AWS Lambda Powertools integration for observability
- Error handling middleware
- Request/response correlation
- Metrics collection
- AWS Lambda handler using Mangum
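These features are wired up in the generated init.py module. The exact contents depend on the generator version, but conceptually the wiring looks roughly like the following sketch (the service names and the handler variable are illustrative, not the generated code):

```python
from aws_lambda_powertools import Logger, Metrics, Tracer
from fastapi import FastAPI
from mangum import Mangum

# Illustrative names; the generated init.py may differ.
logger = Logger(service="my-api")
tracer = Tracer(service="my-api")
metrics = Metrics(namespace="MyApi", service="my-api")

app = FastAPI()

# Wrap the FastAPI app so it can be invoked as a Lambda handler. The generated
# module also registers middleware for error handling, correlation IDs, and
# per-request metrics.
handler = Mangum(app)
```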
Observability with AWS Lambda Powertools
Logging
The generator configures structured logging using AWS Lambda Powertools. You can access the logger in your route handlers:
```python
from .init import app, logger

@app.get("/items/{item_id}")
def read_item(item_id: int):
    logger.info("Fetching item", extra={"item_id": item_id})
    return {"item_id": item_id}
```
The logger automatically includes:
- Correlation IDs for request tracing
- Request path and method
- Lambda context information
- Cold start indicators
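In addition to these automatic fields, you can attach your own keys so that they appear on every subsequent log line for the request; a minimal sketch using the Powertools logger's append_keys method (the tenant_id key is an illustrative example):

```python
from .init import app, logger

@app.get("/items/{item_id}")
def read_item(item_id: int):
    # Illustrative: tenant_id will be included in all later log statements
    # emitted during this invocation.
    logger.append_keys(tenant_id="example-tenant")
    logger.info("Fetching item", extra={"item_id": item_id})
    return {"item_id": item_id}
```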
Tracing
AWS X-Ray tracing is configured automatically. You can add custom subsegments to your traces:
```python
from .init import app, tracer

@app.get("/items/{item_id}")
@tracer.capture_method
def read_item(item_id: int):
    # Creates a new subsegment
    with tracer.provider.in_subsegment("fetch-item-details"):
        # Your logic here
        return {"item_id": item_id}
```
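You can also record searchable annotations and unindexed metadata on the current trace; a brief sketch (the annotation key and metadata value are illustrative):

```python
from .init import app, tracer

@app.get("/items/{item_id}")
@tracer.capture_method
def read_item(item_id: int):
    # Annotations are indexed by X-Ray and usable in filter expressions;
    # metadata is stored with the trace but not indexed.
    tracer.put_annotation(key="ItemId", value=item_id)
    tracer.put_metadata(key="item_details", value={"source": "example"})
    return {"item_id": item_id}
```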
Metrics
CloudWatch metrics are collected automatically for each request. You can add custom metrics:
```python
from .init import app, metrics
from aws_lambda_powertools.metrics import MetricUnit

@app.get("/items/{item_id}")
def read_item(item_id: int):
    metrics.add_metric(name="ItemViewed", unit=MetricUnit.Count, value=1)
    return {"item_id": item_id}
```
Default metrics include:
- Request counts
- Success/failure counts
- Cold start metrics
- Per-route metrics
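On top of these defaults, you can attach dimensions and metadata to your custom metrics; a brief sketch (the environment dimension and item_id metadata are illustrative):

```python
from .init import app, metrics
from aws_lambda_powertools.metrics import MetricUnit

@app.get("/items/{item_id}")
def read_item(item_id: int):
    # Dimensions partition the metric in CloudWatch; metadata is attached to
    # the underlying EMF log record but is not indexed as a dimension.
    metrics.add_dimension(name="environment", value="dev")
    metrics.add_metric(name="ItemViewed", unit=MetricUnit.Count, value=1)
    metrics.add_metadata(key="item_id", value=str(item_id))
    return {"item_id": item_id}
```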
Error Handling
The generator includes comprehensive error handling:
```python
from fastapi import HTTPException

from .init import app

@app.get("/items/{item_id}")
def read_item(item_id: int):
    if item_id < 0:
        raise HTTPException(status_code=400, detail="Item ID must be positive")
    return {"item_id": item_id}
```
Unhandled exceptions are caught by the middleware, which will:
- Log the full exception with its stack trace
- Record a failure metric
- Return a safe 500 response to the client
- Preserve the correlation ID
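For errors you expect and want to map to specific status codes, you can also register your own FastAPI exception handlers alongside the generated middleware; a minimal sketch (ItemNotFoundError is a hypothetical application exception):

```python
from fastapi import Request
from fastapi.responses import JSONResponse

from .init import app, logger

# Hypothetical application-specific exception.
class ItemNotFoundError(Exception):
    def __init__(self, item_id: int):
        self.item_id = item_id

@app.exception_handler(ItemNotFoundError)
async def item_not_found_handler(request: Request, exc: ItemNotFoundError):
    # Expected error case, so log at warning level rather than error.
    logger.warning("Item not found", extra={"item_id": exc.item_id})
    return JSONResponse(
        status_code=404,
        content={"detail": f"Item {exc.item_id} not found"},
    )

@app.get("/items/{item_id}")
def read_item(item_id: int):
    if item_id > 100:
        raise ItemNotFoundError(item_id)
    return {"item_id": item_id}
```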
Streaming
With FastAPI, you can stream a response to the caller with the StreamingResponse response type.
Infrastructure Changes
Since AWS API Gateway does not support streaming responses, you will need to deploy your FastAPI to a platform which supports this. The simplest option is to use an AWS Lambda Function URL. To achieve this, you can adjust the generated HttpApi construct to add an option for streaming, and conditionally instantiate the relevant constructs.
Example Changes
```diff
 import { Construct } from 'constructs';
-import { CfnOutput, Duration } from 'aws-cdk-lib';
+import { CfnOutput, Duration, Stack } from 'aws-cdk-lib';
 import {
   CorsHttpMethod,
   HttpApi as _HttpApi,
@@ -7,7 +7,16 @@ import {
   IHttpRouteAuthorizer,
 } from 'aws-cdk-lib/aws-apigatewayv2';
@@ ... @@
       },
     });

-    this.api = new _HttpApi(this, id, {
-      corsPreflight: {
-        allowOrigins: props.allowedOrigins ?? ['*'],
-        allowMethods: [CorsHttpMethod.ANY],
-        allowHeaders: [
-          'authorization',
-          'content-type',
-          'x-amz-content-sha256',
-          'x-amz-date',
-          'x-amz-security-token',
-        ],
-      },
-      defaultAuthorizer: props.defaultAuthorizer,
-    });
+    let apiUrl;
+    if (props.apiType === 'api-gateway') {
+      this.api = new _HttpApi(this, id, {
+        corsPreflight: {
+          allowOrigins: props.allowedOrigins ?? ['*'],
+          allowMethods: [CorsHttpMethod.ANY],
+          allowHeaders: [
+            'authorization',
+            'content-type',
+            'x-amz-content-sha256',
+            'x-amz-date',
+            'x-amz-security-token',
+          ],
+        },
+        defaultAuthorizer: props.defaultAuthorizer,
+      });

-    this.api.addRoutes({
-      path: '/{proxy+}',
-      methods: [
-        HttpMethod.GET,
-        HttpMethod.DELETE,
-        HttpMethod.POST,
-        HttpMethod.PUT,
-        HttpMethod.PATCH,
-        HttpMethod.HEAD,
-      ],
-      integration: new HttpLambdaIntegration(
-        'RouterIntegration',
-        this.routerFunction,
-      ),
-    });
+      this.api.addRoutes({
+        path: '/{proxy+}',
+        methods: [
+          HttpMethod.GET,
+          HttpMethod.DELETE,
+          HttpMethod.POST,
+          HttpMethod.PUT,
+          HttpMethod.PATCH,
+          HttpMethod.HEAD,
+        ],
+        integration: new HttpLambdaIntegration(
+          'RouterIntegration',
+          this.routerFunction,
+        ),
+      });
+      apiUrl = this.api.url;
+    } else {
+      const stack = Stack.of(this);
+      this.routerFunction.addLayers(
+        LayerVersion.fromLayerVersionArn(
+          this,
+          'LWALayer',
+          `arn:aws:lambda:${stack.region}:753240598075:layer:LambdaAdapterLayerX86:24`,
+        ),
+      );
+      this.routerFunction.addEnvironment('PORT', '8000');
+      this.routerFunction.addEnvironment(
+        'AWS_LWA_INVOKE_MODE',
+        'response_stream',
+      );
+      this.routerFunction.addEnvironment(
+        'AWS_LAMBDA_EXEC_WRAPPER',
+        '/opt/bootstrap',
+      );
+      this.routerFunctionUrl = this.routerFunction.addFunctionUrl({
+        authType: FunctionUrlAuthType.AWS_IAM,
+        invokeMode: InvokeMode.RESPONSE_STREAM,
+        cors: {
+          allowedOrigins: props.allowedOrigins ?? ['*'],
+          allowedHeaders: [
+            'authorization',
+            'content-type',
+            'x-amz-content-sha256',
+            'x-amz-date',
+            'x-amz-security-token',
+          ],
+        },
+      });
+      apiUrl = this.routerFunctionUrl.url;
+    }

-    new CfnOutput(this, `${props.apiName}Url`, { value: this.api.url! });
+    new CfnOutput(this, `${props.apiName}Url`, { value: apiUrl! });

     RuntimeConfig.ensure(this).config.httpApis = {
       ...RuntimeConfig.ensure(this).config.httpApis!,
-      [props.apiName]: this.api.url!,
+      [props.apiName]: apiUrl,
     };
   }

   public grantInvokeAccess(role: IRole) {
-    role.addToPrincipalPolicy(
-      new PolicyStatement({
-        effect: Effect.ALLOW,
-        actions: ['execute-api:Invoke'],
-        resources: [this.api.arnForExecuteApi('*', '/*', '*')],
-      }),
-    );
+    if (this.api) {
+      role.addToPrincipalPolicy(
+        new PolicyStatement({
+          effect: Effect.ALLOW,
+          actions: ['execute-api:Invoke'],
+          resources: [this.api.arnForExecuteApi('*', '/*', '*')],
+        }),
+      );
+    } else if (this.routerFunction) {
+      role.addToPrincipalPolicy(
+        new PolicyStatement({
+          effect: Effect.ALLOW,
+          actions: ['lambda:InvokeFunctionUrl'],
+          resources: [this.routerFunction.functionArn],
+          conditions: {
+            StringEquals: {
+              'lambda:FunctionUrlAuthType': 'AWS_IAM',
+            },
+          },
+        }),
+      );
+    }
   }
 }
```
After making these changes, make sure you update packages/common/constructs/src/app/http-apis/<my-api>.ts to use your new function URL option.
Implementation
Once you’ve updated the infrastructure to support streaming, you can implement a streaming API in FastAPI. The API should:
- Return a StreamingResponse
- Declare the return type of each response chunk
- Add the OpenAPI vendor extension x-streaming: true if you intend to use the API Connection.
For example, if you would like to stream a series of JSON objects from your API, you can implement this as follows:
```python
from datetime import datetime

from pydantic import BaseModel
from fastapi.responses import StreamingResponse

from .init import app

class Chunk(BaseModel):
    message: str
    timestamp: datetime

async def stream_chunks():
    for i in range(0, 100):
        chunk = Chunk(message=f"This is chunk {i}", timestamp=datetime.now())
        # Serialise each chunk to JSON so it can be written to the response stream
        yield chunk.model_dump_json() + "\n"

@app.get("/stream", openapi_extra={'x-streaming': True})
def my_stream() -> Chunk:
    return StreamingResponse(stream_chunks(), media_type="application/json")
```
Consumption
To consume a stream of responses, you can make use of the API Connection Generator, which provides a type-safe method for iterating over your streamed chunks.
Deploying your FastAPI
The FastAPI generator creates a CDK construct for deploying your API in the common/constructs folder. You can use this in a CDK application:
```ts
import { MyApi } from ':my-scope/common-constructs';

export class ExampleStack extends Stack {
  constructor(scope: Construct, id: string) {
    super(scope, id);
    // Add the api to your stack
    const api = new MyApi(this, 'MyApi');
  }
}
```
This sets up:
- AWS Lambda function running your FastAPI application
- API Gateway HTTP API as the function trigger
- IAM roles and permissions
- CloudWatch log group
- X-Ray tracing configuration
- CloudWatch metrics namespace
Granting Access
You can use the grantInvokeAccess method to grant access to your API:

```ts
api.grantInvokeAccess(myIdentityPool.authenticatedRole);
```
Local Development
The generator configures a local development server that you can run with:
```sh
pnpm nx run my-api:serve
yarn nx run my-api:serve
npx nx run my-api:serve
bunx nx run my-api:serve
```
This starts a local FastAPI development server with:
- Auto-reload on code changes
- Interactive API documentation at /docs or /redoc
- OpenAPI schema at /openapi.json
Invoking your FastAPI
To invoke your API from a React website, you can use the api-connection generator.