
TypeScript MCP Server

Generate a TypeScript Model Context Protocol (MCP) server for providing context to Large Language Models (LLMs).

The Model Context Protocol (MCP) is an open standard that allows AI assistants to interact with external tools and resources. It provides a consistent way for LLMs to:

  • Execute tools (functions) that perform actions or retrieve information
  • Access resources that provide context or data

You can generate a TypeScript MCP server in two ways: through the Nx Console UI in VSCode, or directly from the command line (an example command follows the parameter table below).

  1. Install the Nx Console VSCode Plugin if you haven't already
  2. Open the Nx Console in VSCode
  3. Click Generate (UI) in the "Common Nx Commands" section
  4. Search for @aws/nx-plugin - ts#mcp-server
  5. Fill in the required parameters
  6. Click Generate
The generator accepts the following parameters:

Parameter  Type    Default  Description
project    string  -        (Required) The project to add an MCP server to
name       string  -        The name of your MCP server (default: mcp-server)
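
If you prefer the command line, the same generator can likely be invoked directly with Nx, along the lines of the following sketch (the generator identifier matches the search term above, while the option names are assumptions based on standard Nx conventions):

pnpm nx g @aws/nx-plugin:ts#mcp-server --project=your-project --name=mcp-server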

The generator will add the following files to your existing TypeScript project:

• your-project/
  • src/
    • mcp-server/ (or custom name if specified)
      • index.ts - Entry point for the MCP server
      • server.ts - Main server definition
      • tools/
        • add.ts - Sample tool
      • resources/
        • sample-guidance.ts - Sample resource
  • package.json - Updated with bin entry and MCP dependencies
  • project.json - Updated with MCP server serve target
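
For reference, an entry point like index.ts typically just connects the server to the STDIO transport. A minimal sketch, assuming the standard MCP TypeScript SDK and that server.ts exports the configured server instance (the actual generated code may differ):

#!/usr/bin/env node
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
import { server } from './server.js';

// Connect the MCP server over STDIO so an AI assistant can spawn it as a subprocess
const transport = new StdioServerTransport();
await server.connect(transport);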

Tools are functions that the AI assistant can call to perform actions. You can add new tools in the server.ts file:

server.tool(
  "toolName",
  "tool description",
  { param1: z.string(), param2: z.number() }, // Input schema using Zod
  async ({ param1, param2 }) => {
    // Tool implementation
    return {
      content: [{ type: "text", text: "Result" }],
    };
  },
);
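
As a concrete example, the generated add.ts sample tool presumably looks something like the following sketch (the export name and exact shape of the generated code are assumptions; only the server.tool API comes from the MCP TypeScript SDK):

import { z } from 'zod';
import type { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';

// Register a simple tool that adds two numbers and returns the sum as text
export const registerAddTool = (server: McpServer) =>
  server.tool(
    'add',
    'Add two numbers together',
    { a: z.number(), b: z.number() },
    async ({ a, b }) => ({
      content: [{ type: 'text', text: `${a + b}` }],
    }),
  );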

Resources provide context to the AI assistant. You can add static resources from files or dynamic resources:

// Static resource
const exampleContext = 'some context to return';

server.resource('resource-name', 'example://resource', async (uri) => ({
  contents: [{ uri: uri.href, text: exampleContext }],
}));

// Dynamic resource
server.resource('dynamic-resource', 'dynamic://resource', async (uri) => {
  const data = await fetchSomeData();
  return {
    contents: [{ uri: uri.href, text: data }],
  };
});
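
If the static content lives in a file rather than in code, one option is to read the file inside the resource callback. A short sketch (the resource name, URI, and file path here are illustrative):

import { readFile } from 'node:fs/promises';

// Static resource whose content is loaded from a file on disk
server.resource('project-guidance', 'guidance://project', async (uri) => ({
  contents: [
    { uri: uri.href, text: await readFile('docs/guidance.md', 'utf-8') },
  ],
}));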

Most AI assistants that support MCP use a similar configuration approach. You’ll need to create or update a configuration file with your MCP server details:

{
  "mcpServers": {
    "your-mcp-server": {
      "command": "npx",
      "args": ["tsx", "/path/to/your-mcp-server/index.ts"]
    }
  }
}

While developing your MCP server, you may wish to add the --watch flag so that the AI assistant always sees the latest versions of tools/resources:

{
  "mcpServers": {
    "your-mcp-server": {
      "command": "npx",
      "args": ["tsx", "--watch", "/path/to/your-mcp-server/index.ts"]
    }
  }
}

Please refer to the documentation for your specific AI assistant for details on where its MCP configuration file lives and how to configure it.

The easiest way to test and use an MCP server is to configure it with an AI assistant (as above). However, you can also run the server using the <your-server-name>-serve target, which can be useful if you switch from the STDIO transport to the Streamable HTTP transport.

pnpm nx run your-project:your-server-name-serve

This command uses tsx --watch to automatically restart the server when files change.
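
For context, the serve target that the generator adds to project.json presumably resembles the following sketch (the executor, command, and paths shown are assumptions; nx:run-commands is a standard Nx executor, but the generator's actual output may differ):

{
  "targets": {
    "your-server-name-serve": {
      "executor": "nx:run-commands",
      "options": {
        "command": "tsx --watch src/mcp-server/index.ts",
        "cwd": "packages/your-project"
      }
    }
  }
}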