Migrating from AWS PDK

This guide walks you through an example migration of an AWS PDK project to the Nx Plugin for AWS, and provides general guidance on the topic.

Migrating to the Nx Plugin for AWS provides the following benefits over PDK:

  • Faster builds
  • Easier to use (UI and CLI)
  • Vibe-coding friendly (try our MCP server!)
  • More modern technologies
  • Local API and website development
  • More control (modify vended files to fit your use-case)
  • And more!

Example Migration: Shopping List Application


In this guide, we will use the Shopping List Application from the PDK Tutorial as our target project to migrate. Follow the steps in that tutorial to create the target project if you wish to follow along yourself.

The shopping list application consists of the following PDK project types:

  • MonorepoTsProject
  • TypeSafeApiProject
  • CloudscapeReactTsWebsiteProject
  • InfrastructureTsProject

To start, we’ll create a new workspace for our new project. While more extreme than an in-place migration, this approach gives us the cleanest end result. Creating an Nx workspace is equivalent to using PDK’s MonorepoTsProject:

Terminal window
npx create-nx-workspace@21.4.1 shopping-list --pm=pnpm --preset=@aws/nx-plugin@0.50.0 --iacProvider=CDK --ci=skip

Open up the shopping-list directory this command creates in your favourite IDE.

The TypeSafeApiProject used in the shopping list application made use of:

  • Smithy as the modelling language
  • TypeScript for implementing operations
  • TypeScript hook generation for integrating with a react website

We can therefore use the ts#smithy-api generator to provide equivalent functionality.

Run the ts#smithy-api generator to set up your api project in packages/api:

  1. Install the Nx Console VSCode Plugin if you haven't already
  2. Open the Nx Console in VSCode
  3. Click Generate (UI) in the "Common Nx Commands" section
  4. Search for @aws/nx-plugin - ts#smithy-api
  5. Fill in the required parameters
    • name: api
    • namespace: com.aws
    • auth: IAM
  6. Click Generate

You will notice this generates a model project, as well as a backend project. The model project contains your Smithy model, and backend contains your server implementation.

The backend uses the Smithy Server Generator for TypeScript. We will explore this further below.

Now that we have the basic structure for our Smithy API project, we can migrate the model:

  1. Delete the generated example Smithy files in packages/api/model/src

  2. Copy your model from the PDK project’s packages/api/model/src/main/smithy directory into your new project’s packages/api/model/src directory.

  3. Update the service name and namespace in smithy-build.json to match the PDK application:

    smithy-build.json
    "plugins": {
      "openapi": {
        "service": "com.aws#MyApi",
        ...
  4. Update the service in main.smithy to add the ValidationException error, which is required when using the Smithy TypeScript Server SDK.

    main.smithy
    use smithy.framework#ValidationException

    /// My Shopping List API
    @restJson1
    service MyApi {
        version: "1.0"
        operations: [
            GetShoppingLists
            PutShoppingList
            DeleteShoppingList
        ]
        errors: [
            BadRequestError
            NotAuthorizedError
            InternalFailureError
            ValidationException
        ]
    }
  5. Add an extensions.smithy file to packages/api/model/src where we will define a trait that provides pagination information to the generated client:

    extensions.smithy
    $version: "2"

    namespace com.aws

    use smithy.openapi#specificationExtension

    @trait
    @specificationExtension(as: "x-cursor")
    structure cursor {
        inputToken: String
        enabled: Boolean
    }
  6. Add the new @cursor trait to the GetShoppingLists operation in get-shopping-lists.smithy:

    operations/get-shopping-lists.smithy
    @readonly
    @http(method: "GET", uri: "/shopping-list")
    @paginated(inputToken: "nextToken", outputToken: "nextToken", pageSize: "pageSize", items: "shoppingLists")
    @cursor(inputToken: "nextToken")
    @handler(language: "typescript")
    operation GetShoppingLists {
        input := with [PaginatedInputMixin] {
            @httpQuery("shoppingListId")
            shoppingListId: ShoppingListId
        }
    }

    Any @paginated operations should also use @cursor if you’re using the client generator provided by the Nx Plugin for AWS (via the api-connection generator).

  7. Finally, remove the @handler trait from all operations as this isn’t supported by the Nx Plugin for AWS. Using ts#smithy-api, we don’t need the auto-generated lambda function CDK constructs and bundling targets generated by this trait, as we use a single bundle for all lambda functions.

At this point, let’s run a build to check our model changes and ensure we have some generated server code to work with. There will be some failures in the backend project (@shopping-list/api) but we’ll address those next.

Terminal window
pnpm nx run-many --target build

You can consider the api/backend project as somewhat equivalent to Type Safe API’s api/handlers/typescript project.

One of the main differences between Type Safe API and the ts#smithy-api generator is that handlers are implemented using the Smithy Server Generator for TypeScript, rather than Type Safe API’s own generated handler wrappers (found in the api/generated/typescript/runtime project).

The shopping list application’s lambda handlers rely on the @aws-sdk/client-dynamodb package, so let’s install that first:

Terminal window
pnpm add -w @aws-sdk/client-dynamodb

Then, let’s copy the handlers/src/dynamo-client.ts file from the PDK project to backend/src/operations so it’s available for our handlers.

To migrate the handlers, you can follow these general steps:

  1. Copy the handler from your PDK project’s packages/api/handlers/typescript/src directory to your new project’s packages/api/backend/src/operations directory.

  2. Remove myapi-typescript-runtime imports and instead import the operation type from the generated TypeScript Server SDK, as well as the ServiceContext, for example:

    - import {
    -   deleteShoppingListHandler,
    -   DeleteShoppingListChainedHandlerFunction,
    -   INTERCEPTORS,
    -   Response,
    -   LoggingInterceptor,
    - } from 'myapi-typescript-runtime';
    + import { DeleteShoppingList as DeleteShoppingListOperation } from '../generated/ssdk/index.js';
    + import { ServiceContext } from '../context.js';
  3. Delete the handler wrapper export

    - export const handler = deleteShoppingListHandler(
    -   ...INTERCEPTORS,
    -   deleteShoppingList,
    - );
  4. Update the signature for your operation handler to use the SSDK:

    - export const deleteShoppingList: DeleteShoppingListChainedHandlerFunction = async (request) => {
    + export const DeleteShoppingList: DeleteShoppingListOperation<ServiceContext> = async (input, ctx) => {
  5. Replace usage of the LoggingInterceptor with ctx.logger. (Also applies to metrics and tracing interceptors):

    - LoggingInterceptor.getLogger(request).info('...');
    + ctx.logger.info('...');
  6. Update references to input parameters. Since the SSDK provides types that match your Smithy model exactly (rather than grouping path/query/header parameters separately to the body parameter), update any input references accordingly:

    - const shoppingListId = request.input.requestParameters.shoppingListId;
    + const shoppingListId = input.shoppingListId;
  7. Remove use of Response. We instead just return plain objects in the SSDK.

    - return Response.success({ shoppingListId });
    + return { shoppingListId };

    We also no longer throw or return Response; instead, we throw the SSDK’s generated errors:

    - throw Response.badRequest({ message: 'oh no' });
    - return Response.badRequest({ message: 'oh no' });
    + import { BadRequestError } from '../generated/ssdk/index.js';
    + throw new BadRequestError({ message: 'oh no' });
  8. Update any imports to use ESM syntax, namely adding the .js extension to relative imports.

  9. Add the operation to service.ts

    service.ts
    import { ServiceContext } from './context.js';
    import { MyApiService } from './generated/ssdk/index.js';
    import { DeleteShoppingList } from './operations/delete-shopping-list.js';
    import { GetShoppingLists } from './operations/get-shopping-lists.js';
    import { PutShoppingList } from './operations/put-shopping-list.js';

    // Register operations to the service here
    export const Service: MyApiService<ServiceContext> = {
      PutShoppingList,
      GetShoppingLists,
      DeleteShoppingList,
    };
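Putting the steps above together, a migrated handler ends up looking roughly like the following. This is a hedged sketch: the Operation type alias, ServiceContext shape, and logger are illustrative stand-ins for the generated SSDK types and your context.ts (the real names come from generated/ssdk/index.js as shown above), and the DynamoDB call is elided.

```typescript
// Hedged sketch of a fully migrated operation handler. The Operation type
// and ServiceContext below are illustrative stand-ins for the generated
// SSDK types and your context.ts; they are not the real generated names.
type Operation<I, O, Ctx> = (input: I, ctx: Ctx) => Promise<O>;

interface ServiceContext {
  logger: { info: (message: string) => void };
}

interface DeleteShoppingListInput {
  shoppingListId: string;
}

interface DeleteShoppingListOutput {
  shoppingListId: string;
}

// Plain object in, plain object out: no Response wrapper, no interceptor
// chain, and logging comes from the context rather than an interceptor.
export const DeleteShoppingList: Operation<
  DeleteShoppingListInput,
  DeleteShoppingListOutput,
  ServiceContext
> = async (input, ctx) => {
  ctx.logger.info(`Deleting shopping list ${input.shoppingListId}`);
  // ... call DynamoDB here to delete the item ...
  return { shoppingListId: input.shoppingListId };
};
```

Errors are thrown as the SSDK's generated error classes rather than returned, as described in step 7.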

We generated the Smithy API project with the name api initially as we wanted it to be added to packages/api for consistency with the PDK project. Since our Smithy API now defines service MyApi instead of service Api, we need to update any instances of getApiServiceHandler with getMyApiServiceHandler.

Make this change to handler.ts:

packages/api/backend/src/handler.ts
- import { getApiServiceHandler } from './generated/ssdk/index.js';
+ import { getMyApiServiceHandler } from './generated/ssdk/index.js';

process.env.POWERTOOLS_METRICS_NAMESPACE = 'Api';
process.env.POWERTOOLS_SERVICE_NAME = 'Api';

const tracer = new Tracer();
const logger = new Logger();
const metrics = new Metrics();

- const serviceHandler = getApiServiceHandler(Service);
+ const serviceHandler = getMyApiServiceHandler(Service);

And to local-server.ts:

packages/api/backend/src/local-server.ts
- import { getApiServiceHandler } from './generated/ssdk/index.js';
+ import { getMyApiServiceHandler } from './generated/ssdk/index.js';

const PORT = 3001;

const tracer = new Tracer();
const logger = new Logger();
const metrics = new Metrics();

- const serviceHandler = getApiServiceHandler(Service);
+ const serviceHandler = getMyApiServiceHandler(Service);

Additionally, update packages/api/backend/project.json and update metadata.apiName to my-api:

packages/api/backend/project.json
  "metadata": {
    "generator": "ts#smithy-api",
-   "apiName": "api",
+   "apiName": "my-api",
    "auth": "IAM",
    "modelProject": "@shopping-list/api-model",
    "ports": [3001]
  },

We can now build the project to check that the migration has worked so far:

Terminal window
pnpm nx run-many --target build

The CloudscapeReactTsWebsiteProject used in the shopping list application configured a React website with CloudScape and Cognito authentication built in.

This project type leveraged create-react-app, which is now deprecated. For migrating the website in this guide, we will use the ts#react-website generator, which uses more modern and supported technologies, namely Vite.

As part of the migration, we will also move from PDK’s configured React Router to TanStack Router, which adds additional type-safety to website routing.

Run the ts#react-website generator to set up your website project in packages/website:

  1. Install the Nx Console VSCode Plugin if you haven't already
  2. Open the Nx Console in VSCode
  3. Click Generate (UI) in the "Common Nx Commands" section
  4. Search for @aws/nx-plugin - ts#react-website
  5. Fill in the required parameters
    • name: website
  6. Click Generate

The React website generator above doesn’t bundle Cognito authentication by default like CloudscapeReactTsWebsiteProject did; instead, it’s added explicitly via the ts#react-website#auth generator.

  1. Install the Nx Console VSCode Plugin if you haven't already
  2. Open the Nx Console in VSCode
  3. Click Generate (UI) in the "Common Nx Commands" section
  4. Search for @aws/nx-plugin - ts#react-website#auth
  5. Fill in the required parameters
    • project: website
    • cognitoDomain: shopping-list
  6. Click Generate

This adds React components which manage the appropriate redirects to ensure users log in using the Cognito hosted UI. This also adds a CDK construct to deploy the Cognito resources in packages/common/constructs, called UserIdentity.

In PDK you could pass the vended Projen projects to one another to trigger integration code to be vended. This was used in the shopping list application to configure the website to be able to integrate with the API.

With the Nx Plugin for AWS, API integration is supported via the api-connection generator. Next, we use this generator so that our website can invoke our Smithy API:

  1. Install the Nx Console VSCode Plugin if you haven't already
  2. Open the Nx Console in VSCode
  3. Click Generate (UI) in the "Common Nx Commands" section
  4. Search for @aws/nx-plugin - api-connection
  5. Fill in the required parameters
    • sourceProject: website
    • targetProject: api
  6. Click Generate

This generates the necessary client providers and build targets for your website to call your API via a generated TypeScript client.

The CloudscapeReactTsWebsiteProject automatically included a dependency on @aws-northstar/ui which is used in our shopping list application, so we add it here:

Terminal window
pnpm add -w @aws-northstar/ui

The shopping list application has one component called CreateItem, and two pages, ShoppingList and ShoppingLists. We’ll migrate these to the new website, making a few adjustments since we’re using TanStack Router and the Nx Plugin for AWS TypeScript client code generator.

  1. Copy packages/website/src/components/CreateItem/index.tsx from the PDK project into the exact same location in the new project.

  2. Copy packages/website/src/pages/ShoppingLists/index.tsx to packages/website/src/routes/index.tsx, since ShoppingLists is our home page and we use file-based routing with TanStack router.

  3. Copy packages/website/src/pages/ShoppingList/index.tsx to packages/website/src/routes/$shoppingListId.tsx, since ShoppingList was the page we want to show on the /:shoppingListId route.

Note that you’ll now see some build errors in your IDE; we’ll need to make a few more changes to fit the new framework, outlined below.

Migrate from React Router to TanStack Router


Since we’re using file-based routing, we can use the website local development server to manage automatically generating route configuration. Let’s start the local website server:

Terminal window
pnpm nx serve-local website

You’ll see some errors, but the local website server should start on port 4200, as well as the local Smithy API server on port 3001.

Follow the below steps in both routes/index.tsx and routes/$shoppingListId.tsx to migrate to TanStack Router:

  1. Add createFileRoute to register each route:

    + import { createFileRoute } from "@tanstack/react-router";
    ...
    - export default ShoppingLists;
    + export const Route = createFileRoute('/')({
    +   component: ShoppingLists,
    + });

    After you save the file, you’ll notice that the type errors for the call to createFileRoute have gone.

  2. Replace the useNavigate hook.

    Update the import:

    - import { useNavigate } from 'react-router-dom';
    + import { useNavigate } from '@tanstack/react-router';

    Update calls to the navigate method (returned by useNavigate) to pass in the type-safe routes:

    - navigate(`/${cell.shoppingListId}`);
    + navigate({
    +   to: '/$shoppingListId',
    +   params: { shoppingListId: cell.shoppingListId },
    + });
  3. Replace the useParams hook.

    Remove the import:

    - import { useParams } from 'react-router-dom';

    Update calls to useParams with the hook provided by the Route created above. These are now type-safe!

    - const { shoppingListId } = useParams();
    + const { shoppingListId } = Route.useParams();

Since our route files aren’t as deeply nested in the file tree as they were in our PDK project, we need to fix the import for CreateItem in both routes/index.tsx and routes/$shoppingListId.tsx:

- import CreateItem from "../../components/CreateItem";
+ import CreateItem from "../components/CreateItem";

The AppLayoutContext is also vended in a slightly different location in our new project:

- import { AppLayoutContext } from "../../layouts/App";
+ import { AppLayoutContext } from "../components/AppLayout";

Migrate to use the new Generated TypeScript Client


We’re getting closer now! Next, we need to migrate to the TypeScript client vended by the Nx Plugin for AWS, which has a few improvements compared to Type Safe API. To achieve this, follow the steps below:

  1. Import the new generated client and types instead of the old, for example:

    - import {
    -   ShoppingList,
    -   usePutShoppingList,
    -   useDeleteShoppingList,
    -   useGetShoppingLists,
    - } from "myapi-typescript-react-query-hooks";
    + import { ShoppingList } from "../generated/my-api/types.gen";
    + import { useMyApi } from "../hooks/useMyApi";
    + import { useInfiniteQuery, useMutation } from "@tanstack/react-query";

    Note that routes/$shoppingListId.tsx imports the ShoppingList type as _ShoppingList; in that file we should do the same, but again import from types.gen.

    Note also we import the relevant hooks directly from @tanstack/react-query, since the generated client provides methods to generate options for TanStack query hooks, rather than hook wrappers.

  2. Instantiate the new TanStack Query hooks, for example:

    - const getShoppingLists = useGetShoppingLists({ pageSize: PAGE_SIZE });
    - const putShoppingList = usePutShoppingList();
    - const deleteShoppingList = useDeleteShoppingList();
    + const api = useMyApi();
    + const getShoppingLists = useInfiniteQuery(
    +   api.getShoppingLists.infiniteQueryOptions(
    +     { pageSize: PAGE_SIZE },
    +     { getNextPageParam: (p) => p.nextToken },
    +   ),
    + );
    + const putShoppingList = useMutation(api.putShoppingList.mutationOptions());
    + const deleteShoppingList = useMutation(
    +   api.deleteShoppingList.mutationOptions(),
    + );
  3. Remove the wrapper <operation>RequestContent for calls to operations which accept parameters in the request body:

      await putShoppingList.mutateAsync({
    -   putShoppingListRequestContent: {
          name: item,
    -   },
      });

There are a few errors left to fix due to differences between TanStack Query v4 (used by PDK) and v5 which the api-connection generator added:

  1. Replace isLoading with isPending for mutations, for example:

    - putShoppingList.isLoading
    + putShoppingList.isPending
  2. The shopping list application made use of the InfiniteQueryTable from @aws-northstar/ui which expects a type from TanStack Query v4. This actually works with infinite queries from v5, so we can just suppress the type error:

      <InfiniteQueryTable
    -   query={getShoppingLists}
    +   query={getShoppingLists as any}

You can now visit the local website at http://localhost:4200/

The website should load now that everything’s been migrated! The only infrastructure the shopping list application relies on besides the API, website, and identity is the DynamoDB table, so if you have a DynamoDB table named shopping_list in the region, and local AWS credentials which can access it, the website will be fully functional!

If not, that’s ok, we’ll migrate the infrastructure next.


The last project we need to migrate for our shopping list application is the InfrastructureTsProject. This is a TypeScript CDK project, for which the Nx Plugin for AWS equivalent is the ts#infra generator.

As well as the Projen projects, PDK also vended CDK constructs which these projects depend on. We will migrate the shopping list application from these CDK constructs too, in favour of the ones generated by the Nx Plugin for AWS.

Generate a TypeScript CDK Infrastructure Project


Run the ts#infra generator to set up your infrastructure project in packages/infra:

  1. Install the Nx Console VSCode Plugin if you haven't already
  2. Open the Nx Console in VSCode
  3. Click Generate (UI) in the "Common Nx Commands" section
  4. Search for @aws/nx-plugin - ts#infra
  5. Fill in the required parameters
    • name: infra
  6. Click Generate

The PDK shopping list application instantiated the following constructs within the CDK application stack:

  • DatabaseConstruct for the DynamoDB table storing shopping lists
  • UserIdentity for Cognito resources, imported directly from PDK
  • MyApi for deploying the Smithy API, which used the generated TypeScript CDK construct with type-safe integrations, depending on PDK’s TypeSafeRestApi CDK construct under the hood.
  • Website for deploying the Website, wrapping PDK’s StaticWebsite CDK construct.

Next, we will migrate each of these to the new project.

Copy packages/infra/src/stacks/application-stack.ts from the PDK shopping list application to the exact same location in your new project. You’ll see some TypeScript errors which we’ll address below.

The PDK shopping list application had a Database construct in packages/src/constructs/database.ts. Copy this to the exact same location in your new project.

Since the Nx Plugin for AWS uses Checkov for security tests which is a little stricter than PDK Nag, we also need to add some suppressions:

constructs/database.ts
import { suppressRules } from ':shopping-list/common-constructs';

...

suppressRules(
  this.shoppingListTable,
  ['CKV_AWS_28', 'CKV_AWS_119'],
  'Backup and KMS key not required for this project',
);

In application-stack.ts, update the import for the DatabaseConstruct to use ESM syntax:

stacks/application-stack.ts
- import { DatabaseConstruct } from '../constructs/database';
+ import { DatabaseConstruct } from '../constructs/database.js';

The UserIdentity construct can generally be swapped out without changes by adjusting the imports.

- import { UserIdentity } from "@aws/pdk/identity";
+ import { UserIdentity } from ':shopping-list/common-constructs';

...

const userIdentity = new UserIdentity(this, `${id}UserIdentity`);

Note that the underlying constructs used by the new UserIdentity construct are vended directly from aws-cdk-lib, where PDK used @aws-cdk/aws-cognito-identitypool-alpha.

The PDK shopping list application had a construct in constructs/apis/myapi.ts which instantiated a CDK construct which Type Safe API generated from your Smithy model.

As well as this construct, since the PDK project used the @handler trait, generated lambda function CDK constructs were also generated.

Like Type Safe API, the Nx Plugin for AWS provides type-safety for integrations based on your Smithy model, however it’s achieved in a much simpler and more flexible way. Instead of generating an entire CDK construct at build time, only minimal “metadata” is generated, which the packages/common/constructs/src/app/apis/api.ts uses in a generic fashion. You can learn more about how to use the construct in the ts#smithy-api generator guide.

Follow the below steps:

  1. Instantiate the Api construct in application-stack.ts

    stacks/application-stack.ts
    - import { MyApi } from "../constructs/apis/myapi";
    + import { Api } from ':shopping-list/common-constructs';

    ...

    - const myapi = new MyApi(this, "MyApi", {
    -   databaseConstruct,
    -   userIdentity,
    - });
    + const api = new Api(this, 'MyApi', {
    +   integrations: Api.defaultIntegrations(this).build(),
    + });

    Notice here we use Api.defaultIntegrations(this).build() - the default behaviour is to create a lambda function for each operation in our API, which is the same behaviour we had in myapi.ts.

  2. Grant permissions for the lambda functions to access the DynamoDB table.

    In the PDK shopping list application, the DatabaseConstruct was passed into MyApi, which managed adding the relevant permissions to each generated function construct. We’ll do this directly in the application-stack.ts file by accessing the Api construct’s type-safe integrations property:

    stacks/application-stack.ts
    // Grant our lambda functions scoped access to call Dynamo
    databaseConstruct.shoppingListTable.grantReadData(
      api.integrations.getShoppingLists.handler,
    );
    [
      api.integrations.putShoppingList.handler,
      api.integrations.deleteShoppingList.handler,
    ].forEach((f) => databaseConstruct.shoppingListTable.grantWriteData(f));
  3. Grant permissions for authenticated users to invoke the API.

    Within the PDK application’s myapi.ts, authenticated users were also granted IAM permissions to invoke the API. We will do the equivalent in application-stack.ts:

    stacks/application-stack.ts
    api.grantInvokeAccess(userIdentity.identityPool.authenticatedRole);

Finally, we add the Website construct from packages/common/constructs/src/app/static-websites/website.ts to application-stack.ts, since this is the equivalent of the PDK shopping list application’s packages/infra/src/constructs/websites/website.ts.

- import { Website } from "../constructs/websites/website";
+ import { Website } from ':shopping-list/common-constructs';

...

- new Website(this, "Website", {
-   userIdentity,
-   myapi,
- });
+ new Website(this, 'Website');

Notice that we don’t pass the identity or API to the website - runtime config is managed within each construct vended by the Nx Plugin for AWS, where UserIdentity and Api register the necessary values, and Website manages deploying it to /runtime-config.json on your static website.
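To illustrate the pattern, the website fetches /runtime-config.json at startup and reads whatever values the deployed constructs registered. The sketch below is a hedged illustration only: the RuntimeConfig shape (the apis and cognitoProps keys) is an assumption for the example, not the plugin's actual schema, which is determined by the constructs you deploy.

```typescript
// Illustrative only: the real keys in /runtime-config.json are defined by
// the constructs you deploy (UserIdentity, Api, etc.). This shape is an
// assumption made for the sketch.
interface RuntimeConfig {
  apis?: Record<string, string>; // hypothetical, e.g. { MyApi: 'https://.../prod/' }
  cognitoProps?: {
    userPoolId: string;
    userPoolWebClientId: string;
    identityPoolId: string;
  };
}

// Parse and minimally validate a fetched runtime config document.
export function parseRuntimeConfig(json: string): RuntimeConfig {
  const parsed = JSON.parse(json) as RuntimeConfig;
  if (parsed.apis !== undefined && typeof parsed.apis !== 'object') {
    throw new Error('runtime-config.json: "apis" must be an object');
  }
  return parsed;
}

// At startup the website would do something like:
//   const config = parseRuntimeConfig(
//     await (await fetch('/runtime-config.json')).text(),
//   );
```

Because the constructs own both the writing and the reading of this file, neither the API URL nor the Cognito details need to be threaded through the stack by hand.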

Let’s build the project now that we’ve migrated all the relevant parts of the codebase to our new project.

Terminal window
pnpm nx run-many --target build

Now we’ve got our fully migrated codebase, we can look at deploying it. There are two paths we can take at this point.

The simplest approach is to treat this as a completely new application, meaning we’ll “start again” with a fresh DynamoDB table and Cognito User Pool - losing all users and their shopping lists. For this approach, simply:

  1. Delete the DynamoDB table named shopping_list

  2. Deploy the new application:

    Terminal window
    pnpm nx deploy infra shopping-list-infra-sandbox/*

🎉 And we’re done! 🎉

Migrate Existing Stateful Resources with no Outage (More Complex)


In reality, it’s more likely that you will want to migrate existing AWS resources so that they are managed by the new codebase, while avoiding any downtime for your customers.

For our shopping list application, the stateful resources we care about are the DynamoDB table which contains our users’ shopping lists, and the User Pool which contains the details of all of our registered users. Our high level plan will be to retain these two key resources and move them such that they’re managed by our new stack, then to update DNS to point to our new website (and API if exposed to customers).

  1. Update your new application to reference the existing resources you wish to retain.

    For the shopping list application, we do this for the DynamoDB table

    constructs/database.ts
    - this.shoppingListTable = new Table(this, 'ShoppingList', {
    -   ...
    + this.shoppingListTable = Table.fromTableName(
    +   this,
    +   'ShoppingList',
    +   'shopping_list',
    + );

    And for the Cognito User Pool

    packages/common/constructs/src/core/user-identity.ts
    - this.userPool = this.createUserPool();
    + this.userPool = UserPool.fromUserPoolId(
    +   this,
    +   'UserPool',
    +   '<your-user-pool-id>',
    + );
  2. Build and deploy the new application:

    Terminal window
    pnpm nx run-many --target build
    Terminal window
    pnpm nx deploy infra shopping-list-infra-sandbox/*

    Now we have our new application stood up referencing the existing resources, but not yet taking any traffic.

  3. Perform full integration testing to ensure the new application works as expected. For the shopping list application, load the website and check you can sign in and create, view, edit and delete shopping lists.

  4. Revert the changes which reference the existing resources in your new application, but do not deploy them yet.

    constructs/database.ts
    + this.shoppingListTable = new Table(this, 'ShoppingList', {
    +   ...
    - this.shoppingListTable = Table.fromTableName(
    -   this,
    -   'ShoppingList',
    -   'shopping_list',
    - );

    And for the Cognito User Pool

    packages/common/constructs/src/core/user-identity.ts
    + this.userPool = this.createUserPool();
    - this.userPool = UserPool.fromUserPoolId(
    -   this,
    -   'UserPool',
    -   '<your-user-pool-id>',
    - );

    And then run a build

    Terminal window
    pnpm nx run-many --target build
  5. Use cdk import in your new application’s packages/infra folder to see which resources we’ll be prompted to import.

    New Application
    cd packages/infra
    pnpm exec cdk import shopping-list-infra-sandbox/Application --force

    Step through the prompts by hitting enter. The import will fail because the resources are managed by another stack. This is expected; we ran this step to confirm which resources we’ll need to retain. You’ll see output like this:

    Terminal window
    shopping-list-infra-sandbox/Application/ApplicationUserIdentity/UserPool/smsRole/Resource (AWS::IAM::Role): enter RoleName (empty to skip)
    shopping-list-infra-sandbox/Application/ApplicationUserIdentity/UserPool/Resource (AWS::Cognito::UserPool): enter UserPoolId (empty to skip)
    shopping-list-infra-sandbox/Application/Database/ShoppingList/Resource (AWS::DynamoDB::Table): import with TableName=shopping_list (y/n) y

    This tells us that there are actually 3 resources we’ll need to import into our new stack.

  6. Update your old PDK project to set RemovalPolicy to RETAIN for the resources discovered from the previous step. At the time of writing this is the default for both the User Pool and the DynamoDB table, but we need to update it for the SMS Role we discovered above:

    application-stack.ts
    const userIdentity = new UserIdentity(this, `${id}UserIdentity`, {
      userPool,
    });
    const smsRole = userIdentity.userPool.node.findAll().filter(
      (c) => CfnResource.isCfnResource(c) &&
        c.node.path.includes('/smsRole/'))[0] as CfnResource;
    smsRole.applyRemovalPolicy(RemovalPolicy.RETAIN);
  7. Deploy your PDK project so that the removal policies are applied

    PDK Application
    cd packages/infra
    npx projen deploy
  8. Take a look at the CloudFormation console and record the values you were prompted for in the above cdk import step

    1. The User Pool ID, eg us-west-2_XXXXX
    2. The SMS Role Name, eg infra-sandbox-UserIdentityUserPoolsmsRoleXXXXXX
  9. Update your PDK project to reference the existing resources instead of creating them

    constructs/database.ts
    - this.shoppingListTable = new Table(this, 'ShoppingList', {
    -   ...
    + this.shoppingListTable = Table.fromTableName(
    +   this,
    +   'ShoppingList',
    +   'shopping_list',
    + );

    And for the Cognito User Pool

    application-stack.ts
    const userPool = UserPool.fromUserPoolId(
      this,
      'UserPool',
      '<your-user-pool-id>',
    );
    const userIdentity = new UserIdentity(this, `${id}UserIdentity`, {
      // PDK construct accepts UserPool not IUserPool, but this still works!
      userPool: userPool as any,
    });
  10. Deploy your PDK project again, this will mean the resources are no longer managed by our PDK project’s CloudFormation stack.

    PDK Application
    cd packages/infra
    npx projen deploy
  11. Now that the resources are unmanaged, we can run cdk import in our new application to actually perform the import:

    New Application
    cd packages/infra
    pnpm exec cdk import shopping-list-infra-sandbox/Application --force

    Enter the values when prompted; the import should complete successfully.

  12. Deploy the new application again to make sure that any changes to these existing resources (now managed by your new stack) are made:

    Terminal window
    pnpm nx deploy infra shopping-list-infra-sandbox/*
  13. Perform a full test of your new application again

  14. Update DNS records to point to your new Website (and API if required).

    We recommend a gradual approach using Route53 Weighted Routing, whereby a fraction of requests are directed to the new application to begin with. As you monitor your metrics you can increase the weight for the new application until no traffic is sent to your old PDK application.

    If you don’t have any DNS and used the auto-generated domains for the website and API, you can always look at proxying requests (eg via a CloudFront HTTP origin or API Gateway HTTP integration(s)).

  15. Monitor PDK application metrics to ensure there is no traffic, and finally destroy the old CloudFormation stack:

    Terminal window
    cd packages/infra
    npx projen destroy
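The weighted routing mentioned in step 14 can be sketched in CDK. This is only an illustration: the hosted zone, record name, and distribution references are placeholders for your own resources, and it assumes a recent aws-cdk-lib version in which `RecordSetOptions` supports `weight` and `setIdentifier`.

```ts
import { Construct } from 'constructs';
import * as route53 from 'aws-cdk-lib/aws-route53';
import * as targets from 'aws-cdk-lib/aws-route53-targets';
import * as cloudfront from 'aws-cdk-lib/aws-cloudfront';

// Placeholders: your construct scope, hosted zone and the two website distributions
declare const scope: Construct;
declare const zone: route53.IHostedZone;
declare const oldDistribution: cloudfront.IDistribution;
declare const newDistribution: cloudfront.IDistribution;

// Start with ~90% of traffic on the old PDK application...
new route53.ARecord(scope, 'WebsiteOld', {
  zone,
  recordName: 'www',
  target: route53.RecordTarget.fromAlias(
    new targets.CloudFrontTarget(oldDistribution),
  ),
  weight: 90,
  setIdentifier: 'pdk-app',
});

// ...and ~10% on the new application
new route53.ARecord(scope, 'WebsiteNew', {
  zone,
  recordName: 'www',
  target: route53.RecordTarget.fromAlias(
    new targets.CloudFrontTarget(newDistribution),
  ),
  weight: 10,
  setIdentifier: 'nx-app',
});
```

Raise the weight on the new record (and lower it on the old) in subsequent deployments as you monitor your metrics, until all traffic lands on the new stack.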

That was quite a bit more involved, but we successfully migrated our users seamlessly to the new application! 🎉🎉🎉

We now have the new benefits of the Nx Plugin for AWS over PDK:

  • Faster builds
  • Local API development support
  • A vibe-coding friendly codebase (try our MCP server!)
  • More intuitive type-safe client/server code
  • And more!

This section provides guidance for features of PDK that aren’t covered by the example migration above.

As a general rule when moving from PDK, we recommend starting any project with an Nx Workspace, given its similarities to the PDK Monorepo. We also recommend using our generators as the primitives on which to build any new project types.

Terminal window
npx create-nx-workspace@21.4.1 my-project --pm=pnpm --preset=@aws/nx-plugin --ci=skip

CDK Graph built graphs of your connected CDK resources, and provided two plugins:

The CDK Graph Diagram Plugin generates AWS architecture diagrams from your CDK infrastructure.

For a similar deterministic approach, a viable alternative is CDK-Dia.

With the advancements in Generative AI, many foundation models are capable of creating high-quality diagrams from your CDK infrastructure. We recommend trying out the AWS Diagram MCP Server. Check out this blog post for a walkthrough.

The CDK Graph Threat Composer Plugin generates a starter Threat Composer threat model from your CDK code.

This plugin worked by filtering a base threat model containing example threats down to those relevant to the resources your stack made use of.

If you’re interested in these specific example threats you can copy and filter the base threat model, or use it as context to help a foundation model generate a similar one.
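The filtering approach is straightforward to reproduce yourself. As a toy illustration (the `Threat` shape and `resourceTypes` tag below are hypothetical, not the actual Threat Composer format):

```typescript
// Toy illustration: keep only the example threats whose tagged resource
// types actually appear in the stack. Adapt the shapes to the real
// threat model JSON you copy from PDK.
interface Threat {
  id: string;
  statement: string;
  // Hypothetical tag: CloudFormation resource types the threat applies to
  resourceTypes: string[];
}

function filterThreats(
  baseThreats: Threat[],
  stackResourceTypes: Set<string>,
): Threat[] {
  return baseThreats.filter((t) =>
    t.resourceTypes.some((rt) => stackResourceTypes.has(rt)),
  );
}

// Example: a stack containing a DynamoDB table and a Lambda function
const base: Threat[] = [
  {
    id: 'T1',
    statement: 'An actor tampers with table items',
    resourceTypes: ['AWS::DynamoDB::Table'],
  },
  {
    id: 'T2',
    statement: 'A bucket is made public',
    resourceTypes: ['AWS::S3::Bucket'],
  },
];
const relevant = filterThreats(
  base,
  new Set(['AWS::DynamoDB::Table', 'AWS::Lambda::Function']),
);
// relevant contains only T1
```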

AWS Arch provided mappings between CloudFormation resources and their associated architecture icons, which were consumed by CDK Graph above.

Refer to the AWS Architecture Icons page for icon related resources. Diagrams also provides a way to build diagrams as code.

If you were using this directly, consider forking the project and taking ownership!

PDK provided a PDKPipelineProject which set up a CDK infrastructure project and made use of a CDK construct which wrapped some CDK Pipelines resources.

To migrate from this, you can use the CDK Pipelines constructs directly. In practice, however, it is likely more straightforward to use something like GitHub Actions or GitLab CI/CD, where you define CDK Stages and run the deploy command for the appropriate stage directly.

PDK Nag wraps CDK Nag, and provides a set of rules specific to building prototypes.

To migrate from PDK Nag, use CDK Nag directly. If you need the same set of rules you can create a “pack” of your own by following the documentation here.
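Creating your own pack is a small amount of code. A sketch of the shape, following the custom NagPack pattern from the cdk-nag documentation (the pack name and rule chosen here are illustrative; pick whichever rules from cdk-nag's rule set you need):

```ts
import { CfnResource } from 'aws-cdk-lib';
import { IConstruct } from 'constructs';
import { NagMessageLevel, NagPack, NagPackProps, rules } from 'cdk-nag';

export class PrototypingChecks extends NagPack {
  constructor(props?: NagPackProps) {
    super(props);
    this.packName = 'PrototypingChecks';
  }

  public visit(node: IConstruct): void {
    if (node instanceof CfnResource) {
      // Apply whichever rules matter for your prototypes
      this.applyRule({
        info: 'S3 buckets should require SSL.',
        explanation: 'Requests over plain HTTP can be intercepted.',
        level: NagMessageLevel.ERROR,
        rule: rules.s3.S3BucketSSLRequestsOnly,
        node,
      });
    }
  }
}

// Usage: Aspects.of(app).add(new PrototypingChecks());
```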

The most commonly used components from Type Safe API are covered in the example migration above, however there are other features, for which migration details are below.

The Nx Plugin for AWS supports APIs modelled in Smithy, but not those modelled directly in OpenAPI. The ts#smithy-api generator is a good starting point which you can then modify: define your OpenAPI specification in the model project’s src folder instead of Smithy, and modify the build.Dockerfile to use your desired code generation tool for clients/servers if they aren’t available on NPM. If your desired tools are on NPM, you can simply install them as dev dependencies in your Nx workspace and call them directly as Nx build targets.

For type-safe backends modelled in OpenAPI, you can consider using one of the OpenAPI Generator Server Generators. These won’t generate code directly for AWS Lambda, but you can use the AWS Lambda Web Adapter to bridge the gap for many of them.

For TypeScript clients, you can use the ts#react-website generator and api-connection generator with an example ts#smithy-api to see how clients are generated and integrated with a website. This configures build targets which generate clients by invoking our open-api#ts-client or open-api#ts-hooks generators. You can use these generators yourself by pointing them at your OpenAPI Specification.

For other languages, you can also see if any of the generators from OpenAPI Generator fit your needs.

You can also build a bespoke generator using the ts#nx-generator generator. Refer to that generator’s documentation for details about how to generate code from OpenAPI. You can use the templates from the Nx Plugin for AWS as a starting point, and even refer to the templates from the PDK codebase for more inspiration, noting that the data structure the templates operate on differs a little from the Nx Plugin for AWS.

For TypeSpec, the above section on OpenAPI applies too. You can start by generating a ts#smithy-api, install the TypeSpec compiler and OpenAPI packages in your Nx workspace, and update the model project’s compile target to run tsp compile instead, ensuring it outputs an OpenAPI specification to the dist directory.

The recommended approach would be to use the TypeSpec HTTP Server generator for JavaScript to generate your server code, since this works directly on your TypeSpec model.

You can use the AWS Lambda Web Adapter for running the generated server on AWS Lambda.

You can also use any of the above OpenAPI options.

TypeSpec has its own code generators for clients in all three of Type Safe API’s supported languages:

The above OpenAPI section also applies since TypeSpec can compile to OpenAPI.

The above example migration outlines migrating to use the ts#smithy-api generator. This section covers the options for Python and Java backends and clients.

Smithy Java is the Smithy code generator for Java. It has a Java server generator, as well as an adapter to run the generated Java server on AWS Lambda.

Smithy doesn’t have a server generator for Python, so you will need to go via OpenAPI. Refer to the above section regarding APIs Modelled with OpenAPI for potential options.

Smithy Java, the Smithy code generator for Java, also includes a Java client generator.

For Python clients, you can check out Smithy Python.

For TypeScript, check out Smithy TypeScript, or use the same approach we’ve taken in ts#smithy-api by going via OpenAPI (we opted for this as it gives us consistency between tRPC, FastAPI and Smithy APIs via TanStack Query hooks).

Type Safe API provided a Projen project type named SmithyShapeLibraryProject, which configured a project containing Smithy models that could be reused by multiple Smithy-based APIs.

The most straightforward way to achieve this is to do the following:

  1. Create your shape library using the smithy#project generator:

    1. Install the Nx Console VSCode Plugin if you haven't already
    2. Open the Nx Console in VSCode
    3. Click Generate (UI) in the "Common Nx Commands" section
    4. Search for @aws/nx-plugin - smithy#project
    5. Fill in the required parameters
      • Click Generate

      Specify any name for the serviceName option, as we will remove the service shape.

    6. Replace the default model in src with the shapes you wish to define

    7. Update smithy-build.json to remove the plugins and any unused maven dependencies

    8. Replace build.Dockerfile with minimal build steps:

      build.Dockerfile
      FROM public.ecr.aws/docker/library/node:24 AS builder
      # Output directory
      RUN mkdir /out
      # Install Smithy CLI
      # https://smithy.io/2.0/guides/smithy-cli/cli_installation.html
      WORKDIR /smithy
      ARG TARGETPLATFORM
      RUN if [ "$TARGETPLATFORM" = "linux/arm64" ]; then ARCH="aarch64"; else ARCH="x86_64"; fi && \
          mkdir -p smithy-install/smithy && \
          curl -L https://github.com/smithy-lang/smithy/releases/download/1.61.0/smithy-cli-linux-$ARCH.zip -o smithy-install/smithy-cli-linux-$ARCH.zip && \
          unzip -qo smithy-install/smithy-cli-linux-$ARCH.zip -d smithy-install && \
          mv smithy-install/smithy-cli-linux-$ARCH/* smithy-install/smithy
      RUN smithy-install/smithy/install
      # Copy project files
      COPY smithy-build.json .
      COPY src src
      # Smithy build with Maven cache mount
      RUN --mount=type=cache,target=/root/.m2/repository,id=maven-cache \
          smithy build
      RUN cp -r build/* /out/
      # Export the /out directory
      FROM scratch AS export
      COPY --from=builder /out /

  2. In your service model project(s), make the following changes to consume the shape library:

    1. Update the compile target in project.json to add the workspace as build context, and a dependency on the shape library’s build target

      project.json
      {
        "cache": true,
        "outputs": ["{workspaceRoot}/dist/{projectRoot}/build"],
        "executor": "nx:run-commands",
        "options": {
          "commands": [
            "rimraf dist/packages/api/model/build",
            "make-dir dist/packages/api/model/build",
            "docker build --build-context workspace=. -f packages/api/model/build.Dockerfile --target export --output type=local,dest=dist/packages/api/model/build packages/api/model"
          ],
          "parallel": false,
          "cwd": "{workspaceRoot}"
        },
        "dependsOn": ["@my-project/shapes:build"]
      }
    2. Update the build.Dockerfile to copy the src directory from your shape library. For example, assuming the shape library is located in packages/shapes:

      build.Dockerfile
      # Copy project files
      COPY smithy-build.json .
      COPY src src
      COPY --from=workspace packages/shapes/src shapes
    3. Update smithy-build.json to add the shapes directory to its sources:

      smithy-build.json
      {
        "version": "1.0",
        "sources": ["src/", "shapes/"],
        "plugins": {
          ...
        }
      }

    Type Safe API provided the following default interceptors:

    • Logging, tracing and metrics interceptors using Powertools for AWS Lambda
    • Try-catch interceptor for handling uncaught exceptions
    • CORS interceptor for returning CORS headers

    The ts#smithy-api generator instruments logging, tracing and metrics with Powertools for AWS Lambda using Middy. The behaviour of the try-catch interceptor is built into the Smithy TypeScript SSDK, and CORS headers are added in handler.ts.

    For logging, tracing and metrics interceptors in any language, use Powertools for AWS Lambda directly.
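For example, in TypeScript a handler instrumented with Powertools via Middy middleware might look like the following sketch. The service names are placeholders, and the subpath middleware imports follow Powertools for AWS Lambda (TypeScript) v2; verify against your installed version:

```ts
import middy from '@middy/core';
import { Logger } from '@aws-lambda-powertools/logger';
import { injectLambdaContext } from '@aws-lambda-powertools/logger/middleware';
import { Metrics } from '@aws-lambda-powertools/metrics';
import { logMetrics } from '@aws-lambda-powertools/metrics/middleware';
import { Tracer } from '@aws-lambda-powertools/tracer';
import { captureLambdaHandler } from '@aws-lambda-powertools/tracer/middleware';

const logger = new Logger({ serviceName: 'shopping-list' });
const metrics = new Metrics({ namespace: 'ShoppingList' });
const tracer = new Tracer({ serviceName: 'shopping-list' });

export const handler = middy(async (event: unknown) => {
  logger.info('handling request');
  // ... your business logic ...
  return { statusCode: 200, body: '{}' };
})
  .use(injectLambdaContext(logger))
  .use(logMetrics(metrics))
  .use(captureLambdaHandler(tracer));
```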

    For migrating custom interceptors, we recommend using the following libraries:

    Type Safe API provided documentation generation using Redocly CLI. This is very easy to add to an existing project once you’ve migrated it as above.

    1. Install the Redocly CLI

      Terminal window
      pnpm add -Dw @redocly/cli
    2. Add a documentation generation target to your model project using redocly build-docs, for example:

      model/project.json
      {
        ...
        "documentation": {
          "cache": true,
          "outputs": ["{workspaceRoot}/dist/{projectRoot}/documentation"],
          "executor": "nx:run-commands",
          "options": {
            "command": "redocly build-docs dist/packages/api/model/build/openapi/openapi.json --output=dist/packages/api/model/documentation/index.html",
            "cwd": "{workspaceRoot}"
          },
          "dependsOn": ["compile"]
        }
      }

    You can also consider the OpenAPI Generator documentation generators.

    Type Safe API generated mocks for you within its generated infrastructure package.

    You can move to JSON Schema Faker, which can create mock data based on JSON Schemas. It works directly on an OpenAPI specification, and has a CLI which you could run as part of your model project build.

    You can update your CDK infrastructure to read the JSON file output by JSON Schema Faker and return the appropriate API Gateway MockIntegration, based on the generated metadata.gen.ts (assuming you used the ts#smithy-api generator).
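If you prefer to avoid another dependency, simple mock data generation from JSON-schema-like definitions is also easy to hand-roll. A minimal sketch covering only a few schema keywords (unlike the full library, which handles the complete specification):

```typescript
// Minimal mock-data generator for a small subset of JSON Schema.
// Handles only type/properties/items/enum, but illustrates the idea
// of deriving mock responses from your OpenAPI specification.
type Schema = {
  type?: string;
  properties?: Record<string, Schema>;
  items?: Schema;
  enum?: unknown[];
};

function mock(schema: Schema): unknown {
  if (schema.enum) return schema.enum[0];
  switch (schema.type) {
    case 'string':
      return 'string';
    case 'number':
    case 'integer':
      return 0;
    case 'boolean':
      return true;
    case 'array':
      return schema.items ? [mock(schema.items)] : [];
    case 'object': {
      const out: Record<string, unknown> = {};
      for (const [key, prop] of Object.entries(schema.properties ?? {})) {
        out[key] = mock(prop);
      }
      return out;
    }
    default:
      return null;
  }
}

// Example: mock response for a hypothetical shopping list item schema
const item = mock({
  type: 'object',
  properties: {
    id: { type: 'string' },
    quantity: { type: 'integer' },
  },
});
// item → { id: 'string', quantity: 0 }
```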

    Type Safe API supported implementing APIs with a mixture of different languages in the backend. This can also be achieved by providing “overrides” to integrations when instantiating your API construct in CDK:

    application-stack.ts
    const pythonLambdaHandler = new Function(this, 'PythonImplementation', {
      runtime: Runtime.PYTHON_3_12,
      ...
    });
    new MyApi(this, 'MyApi', {
      integrations: Api.defaultIntegrations(this)
        .withOverrides({
          echo: {
            integration: new LambdaIntegration(pythonLambdaHandler),
            handler: pythonLambdaHandler,
          },
        })
        .build(),
    });

    You will need to “stub” your service/router for your service to compile if using the ts#smithy-api and the TypeScript Server SDK, eg:

    service.ts
    export const Service: ApiService<ServiceContext> = {
      ...
      Echo: () => {
        throw new Error(`Not Implemented`);
      },
    };

    Type Safe API added native API Gateway validation for request bodies based on your OpenAPI specification since it used the SpecRestApi construct under the hood.

    With the ts#smithy-api generator, validation is performed by the Server SDK itself. This is the same for most server generators.

    If you would like to implement native API Gateway validation, you could do so by modifying packages/common/constructs/src/core/api/rest-api.ts to read the relevant JSON schema for each operation’s request body from your OpenAPI specification.
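Alternatively, if you deploy from an OpenAPI specification (as Type Safe API did with SpecRestApi), API Gateway supports OpenAPI vendor extensions for native request validation. A sketch of the root-level entries you would add to the spec (the validator name "all" is arbitrary):

```json
{
  "x-amazon-apigateway-request-validators": {
    "all": {
      "validateRequestBody": true,
      "validateRequestParameters": true
    }
  },
  "x-amazon-apigateway-request-validator": "all"
}
```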

    Unfortunately there is no straightforward migration path for Type Safe API’s websocket API, which used API Gateway and Lambda with model-driven API development. However, this section of the guide aims to at least offer a few ideas.

    Consider using AsyncAPI to model your API instead of OpenAPI or TypeSpec since this is designed to handle asynchronous APIs. The AsyncAPI NodeJS Template can generate a Node websocket backend which you could host on ECS for example.

    You can also consider AppSync Events for infrastructure, and use Powertools. This blog post is worth a read!

    Another option is to use GraphQL APIs with websockets on AppSync, for which we have a GitHub issue you can +1! Refer to the AppSync developer guide for details and links to sample projects.

    You can also consider rolling your own code generators which interpret the same vendor extensions as Type Safe API. Refer to the APIs Modelled with OpenAPI section for details around building custom OpenAPI-based code generators. You can find the templates Type Safe API uses for API Gateway Websocket API Lambda handlers here, and the client here.

    You can also consider migrating to the ts#trpc-api generator to use tRPC. At the time of writing we don’t yet have support for subscriptions/streaming, but if this is something you need, add a +1 to our GitHub issue tracking this.

    Smithy is protocol agnostic, but does not yet have support for the WebSocket protocol; refer to this GitHub issue tracking support.

    PDK supported CDK infrastructure written in Python and Java. We do not support this in the Nx Plugin for AWS at the time of writing.

    The recommended path forward would be to either migrate your CDK infrastructure to TypeScript, or to use our generators and migrate the common constructs package to your desired language. You can use Generative AI to accelerate these kinds of migrations, for example Amazon Q CLI. You can have an AI agent iterate on the migration until the synthesized CloudFormation templates are identical.

    The same applies for Type Safe API’s generated infrastructure in Python or Java - you can translate the generic rest-api.ts construct from the common constructs package, and implement your own simple metadata generator for your target language (refer to the APIs Modelled with OpenAPI section).

    You can use the py#project generator for a base Python project to add your CDK code to (and move over your cdk.json file, adding relevant targets). You can use Nx’s @nx/gradle plugin for Java projects, or @jnxplus/nx-maven for Maven.

    PDK was built on top of Projen. Projen and Nx Generators have fairly fundamental differences, meaning that while it is technically possible to combine them, it is likely an anti-pattern. Projen manages project files as code such that they cannot be modified directly, whereas Nx generators vend project files once, after which the code can be freely modified.

    If you would like to continue to use Projen, you can implement your desired Projen project types yourself. To follow patterns from the Nx Plugin for AWS, you can run our generators or examine their source code on GitHub to see how your desired project types are constructed, and implement the relevant parts using Projen’s primitives.

    Introducing the Nx Plugin for AWS MCP Server

    In a rapidly evolving landscape of software development, AI assistants have become valuable collaborators in our coding journey. Many developers have embraced what we affectionately call “vibe-coding” - the collaborative dance between human creativity and AI assistance. Like any emerging practice, it comes with both exciting benefits and notable challenges. This post introduces the Nx Plugin for AWS MCP Server, which enhances the AI-assisted development experience when working with AWS products and services.

    Vibe-coding, the practice of collaboratively building software with AI assistants, has transformed how many organizations approach software development. You describe what you want to build, and your AI assistant helps bring your vision to life through writing code and tests, running build commands, and collaboratively iterating to complete tasks both large and small.

    This collaborative approach has accelerated development cycles significantly, as complex implementations that might previously have taken hours to write manually can often be completed in minutes.

    Despite its benefits, vibe-coding comes with pitfalls that can disrupt your flow and lead to frustration. AI tools can produce inconsistent patterns across a project, which can create maintenance headaches down the line. Without specific guidance, AI may miss important AWS-specific best practices or security considerations that experienced developers would naturally incorporate.

    Without a clear project structure, AI-assisted code can become disorganized and difficult to maintain. AI may create custom implementations for problems that already have established solutions, unnecessarily reinventing the wheel.

    These challenges can lead to technical debt, security vulnerabilities, and frustration, especially when working with various interconnecting AWS services and not just within the bounds of a single framework.

    The Nx Plugin for AWS provides a structured foundation for building AWS applications using the Nx monorepo tooling. Instead of starting with a blank canvas, the plugin offers a consistent framework for project organization.

    The plugin ensures consistent project scaffolding through generators for common project types, which maintains structural integrity across your codebase. It incorporates pre-configured templates that follow AWS best practices, helping developers avoid common pitfalls and security issues. The integrated tooling provides built-in commands for building, testing, and deploying AWS applications, and streamlines the development workflow through local development servers. Additionally, it leverages Nx’s powerful dependency management for complex projects, simplifying monorepo management.

    By providing this structure, the Nx Plugin for AWS gives AI assistants a clear structure to work within. Rather than inventing patterns from scratch, AI assistants can follow established conventions, leading to a more consistent and maintainable codebase.

    Model Context Protocol (MCP) is an open standard that allows AI assistants to interact with external tools and resources. The Nx Plugin for AWS MCP server extends your AI assistant’s capabilities with specialized knowledge about the Nx Plugin for AWS.

    The MCP server provides contextual information about best practices, available project structures, and implementation patterns specific to AWS development. It enables your AI tooling to create workspaces and run generators to scaffold common project types. This contextual awareness helps the AI make more informed suggestions that align with established patterns and avoid common pitfalls.

    Instead of producing code that might not align with best practices or might reference non-existent features, your AI assistant can leverage the MCP server to lay a foundation for your project. The result is a more deterministic and reliable development experience, where you can start with a solid base for the core components of your project and use AI to fill in the business logic.

    If you’re interested in exploring AI-assisted AWS development with more structure and reliability, try the Nx Plugin for AWS MCP Server. You can set it up in your favourite AI Assistant (Amazon Q Developer, Cline, Claude Code, etc) with the following MCP Server configuration:

    {
      "mcpServers": {
        "aws-nx-mcp": {
          "command": "npx",
          "args": ["-y", "-p", "@aws/nx-plugin", "aws-nx-mcp"]
        }
      }
    }

    For detailed instructions, refer to our Building with AI guide.

    Welcome to the @aws/nx-plugin

    Aaaand we’re live! 🚀

    The Nx Plugin for AWS is an Nx plugin that provides a toolkit for simplifying the process of building and deploying full-stack applications on AWS. It provides developers with pre-configured templates for both application and IaC code, significantly reducing the time spent on setup and configuration. The plugin handles the complexity of AWS service integration while maintaining flexibility for customization.

    Users simply pick and choose which components they want from the list of available generators, provide any configuration options, and have the @aws/nx-plugin generate the required starter code. Several generators exist within this toolkit which can create APIs, websites and infrastructure, and even do more sophisticated things like integrating a frontend with a backend (including updating existing files via AST transforms!) with type-safe clients.


    To learn more, get started with our Dungeon Adventure tutorial, which covers all the main components of the plugin and should give you a good flavour of how to use it.

    We’re keen to hear your feedback, please don’t hesitate to post a discussion or raise an issue to let us know what you think and what you’d like to see next!

    Try it out!