Supercharging AWS database development with AWS MCP servers

June 30, 2025

The Model Context Protocol (MCP) is redefining the interaction between AI systems and various data sources, facilitating seamless integration with databases, APIs, file systems, and specialized business applications. As discussions around MCP servers gain momentum, many are curious about their implications for existing database systems. Customers are increasingly inquiring about how to incorporate AI models into their daily operations, leveraging AI-assisted tools that comprehend context and offer insightful enhancements.

At Amazon Web Services (AWS), our mission is to empower developers to build more efficiently and intuitively in the cloud. Services like Amazon Aurora, Amazon DynamoDB, and Amazon ElastiCache are widely adopted by developers managing critical workloads, from global commerce platforms to financial systems and real-time analytics applications. With the rise of AI, the way developers engage with these services is evolving. To enhance productivity, developers are increasingly integrating AI-assisted tools that not only understand context but also provide suggestions and assist in system configuration. MCP is at the forefront of this transformation, rapidly changing how developers can embed AI assistants into their development workflows. This article delves into the fundamental principles of MCP and illustrates how the new AWS MCP servers can expedite database development through natural language prompts.

Traditional Development Challenges

In conventional development settings, developers often find themselves dedicating significant time to crafting boilerplate queries and toggling between various development tools and database interfaces, such as psql or MySQL clients, to inspect schemas and data. This frequent context switching necessitates a constant re-familiarization with diverse schemas, syntaxes, paradigms, and best practices, ultimately hindering development speed and amplifying the risk of errors.

These challenges intensify when applications utilize a mix of relational, non-relational, and caching database technologies. Within the relational realm alone, developers must navigate nuanced yet critical differences among SQL dialects. A valid PostgreSQL query may fail in MySQL, and vice versa, compelling developers to maintain multiple mental models and continuously translate between dialects. The cognitive load escalates further when transitioning between relational and non-relational databases, each requiring distinct syntax and fundamentally different approaches to data modeling, query optimization, and application architecture. Traditional development tools were not designed to accommodate the reality of constant context-switching across diverse database ecosystems.

Introducing MCP Servers for AWS Databases

AWS’s approach emphasizes secure, protocol-based access to structured metadata, tailored for local development environments and collaborative settings. To facilitate this, we have released open-source MCP servers for several database services, including:

  • Amazon Aurora
  • Amazon DynamoDB
  • Amazon ElastiCache

What is MCP?

MCP is an open protocol that standardizes the connection between AI assistants and the external environment, encompassing content repositories, data sources, business tools, and development environments. At its essence, MCP employs a client-server architecture, allowing a host application to connect with multiple servers. The architecture comprises:

  • MCP hosts and clients – AI-powered applications such as Amazon Q CLI, Cursor, and Claude Desktop that require access to external data or tools.
  • MCP servers – Lightweight servers that provide specific functionalities through tools, connecting to local or remote data sources.
  • Data sources – Databases, files, or services containing the information necessary for AI assistants.

The following diagram illustrates how MCP enables large language model (LLM) agents to access and perform tasks on structured data stored in databases.

Each AWS database MCP server adheres to the same protocol while managing database-specific connections through appropriate mechanisms. This standardization eliminates the need for developers to create custom integrations for each database when connecting AI tools. By streamlining how AI assistant tools access metadata across database services, MCP fosters intelligent suggestions, context-aware query assistance, and real-time comprehension of database structures.

Accelerating Database Development with MCP Servers

Development often begins with fundamental questions: What am I building, and which tools should I use? Database MCP servers address these questions by integrating database context directly into development environments. These servers expose a curated set of tools that agents can discover and invoke to accomplish specific tasks. For instance, the Aurora DSQL MCP server offers three essential tools: get_schema, readonly_query, and transact. During development, the AI agent relies on the LLM to select the appropriate tool. A typical workflow might begin with get_schema to identify available tables, continue with readonly_query to examine table structures, and finish with transact to insert rows into the relevant tables.
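
To make that flow concrete, the following is a minimal sketch of driving those tools directly with the MCP Python SDK (the mcp package). The Docker arguments mirror the Aurora DSQL configuration shown later in this post; the cluster endpoint, the orders table, and the exact argument names the tools accept (such as "sql") are placeholder assumptions, and in practice the AI agent selects and invokes these tools on your behalf.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Launch the Aurora DSQL MCP server in a container, the same way the
    # mcp.json examples later in this post do (endpoint and region are placeholders).
    server = StdioServerParameters(
        command="docker",
        args=[
            "run", "-i", "--rm",
            "awslabs/aurora-dsql-mcp-server:latest",
            "--cluster_endpoint", "YOUR_DSQL_ENDPOINT",
            "--database_user", "admin",
            "--region", "us-east-1",
        ],
    )

    async def main() -> None:
        async with stdio_client(server) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()

                # 1. Discover the tools the server exposes
                #    (get_schema, readonly_query, transact).
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])

                # 2. Inspect the schema, then run a read-only query against it.
                #    The "sql" argument name and the orders table are assumptions.
                schema = await session.call_tool("get_schema", {})
                rows = await session.call_tool(
                    "readonly_query", {"sql": "SELECT * FROM orders LIMIT 5"}
                )
                print(schema, rows)

    asyncio.run(main())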

Equipping the AI assistant with such contextual awareness aids developers in addressing critical questions that traditionally impede development:

  • How are tables related?
  • Which keys drive specific access patterns?
  • What data is available for implementing a particular feature or constructing analytic dashboards?

By exposing structured metadata from AWS databases, database MCP servers bolster workflows that enhance development speed, safety, and visibility. They empower AI agents to reason about schemas, access patterns, and recent changes in real time. The following sections will explore common patterns that emerge when utilizing MCP in database development workflows.

Schema-Driven Feature Development

Once the appropriate application design and data model are established, developers can leverage integrated development environment (IDE) integrations with MCP servers to access real-time schema details and relationships. This allows developers to evolve the data model confidently through natural language interactions with the AI assistant. For example, developers can add a table to Aurora and clearly understand how it connects to existing tables through foreign keys. They can modify a DynamoDB attribute structure based on new requirements while understanding its impact on query patterns. Additionally, they can investigate how cached data is stored and used in ElastiCache.
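
When you ask how a new table relates to existing ones, the assistant can answer by running ordinary information_schema queries through the server's read-only query tool. The following sketch shows that kind of foreign-key introspection expressed directly with psycopg for clarity; the connection string and the orders table are hypothetical placeholders.

    # Foreign-key introspection of the kind an assistant performs (via the MCP
    # server's read-only query tool) when asked how a table relates to others.
    # The connection string and the 'orders' table are placeholders.
    import psycopg

    FK_QUERY = """
    SELECT tc.table_name, kcu.column_name,
           ccu.table_name  AS references_table,
           ccu.column_name AS references_column
    FROM information_schema.table_constraints tc
    JOIN information_schema.key_column_usage kcu
      ON tc.constraint_name = kcu.constraint_name
    JOIN information_schema.constraint_column_usage ccu
      ON tc.constraint_name = ccu.constraint_name
    WHERE tc.constraint_type = 'FOREIGN KEY'
      AND tc.table_name = 'orders';
    """

    with psycopg.connect("postgresql://USER:PASSWORD@HOST:5432/DBNAME") as conn:
        for row in conn.execute(FK_QUERY):
            print(row)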

A video demonstration showcases how developers can utilize Amazon Q CLI with Cursor to locate tables, comprehend the database schema, and generate create, read, update, and delete (CRUD) APIs in an Amazon Aurora PostgreSQL-Compatible Edition database within minutes using natural language prompts. Although this demonstration focuses on Aurora, the same principles apply to other databases, including Amazon Aurora MySQL-Compatible Edition.

Exploring Data to Power Business Insight

Modern applications extend beyond mere data storage; they reason about it. With access to an MCP server, the AI assistant can handle data contextualization, relationship mapping, and visualization recommendations, so robust dashboards can be constructed in minutes. A demonstration employs the Aurora DSQL MCP server to create a dashboard for an e-commerce database on Aurora DSQL.

Automated Test Code Generation Aligned with Your Database Schemas

Test coverage is only as effective as its accuracy. By utilizing agents with MCP servers, developers can generate tests based on live metadata and query patterns, streamlining various testing activities. For instance, developers can inspect the current Aurora schema and create specific tests to validate constraints and relationships. For DynamoDB, they can generate tests based on a table’s access patterns and indexes. Similarly, with ElastiCache, they can construct tests to simulate specific Time To Live (TTL) configurations and fallback scenarios. With the context of each database accessible to the AI assistant, these tests are precise and purpose-built to validate the database’s live metadata and query patterns. This approach allows developers to spend less time writing and maintaining tests, accelerating delivery without compromising confidence in application behavior. A video demonstration illustrates how MCP servers facilitate AI-assisted test generation, ensuring that test logic remains aligned with the current system structure.
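
To give a sense of what such generated tests can look like, here is a minimal pytest sketch that validates a foreign-key constraint on a hypothetical orders table. The tables, columns, and connection string are assumptions for illustration; an assistant would instead derive them from the live schema through the MCP server.

    # Illustrative pytest sketch of a schema-derived constraint test; the orders
    # and customers tables, columns, and DSN are hypothetical placeholders.
    import psycopg
    import pytest

    DSN = "postgresql://USER:PASSWORD@HOST:5432/DBNAME"

    def test_orders_rejects_unknown_customer():
        """A foreign-key violation on orders.customer_id must be rejected."""
        with psycopg.connect(DSN) as conn:
            with pytest.raises(psycopg.errors.ForeignKeyViolation):
                conn.execute(
                    "INSERT INTO orders (id, customer_id, total) "
                    "VALUES (%s, %s, %s)",
                    (999999, -1, 10.00),  # -1 assumed absent from customers
                )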

Monitor and Troubleshoot Issues

For operations engineers managing complex database systems in production, MCP serves as a powerful tool for systemic troubleshooting. When confronted with database performance issues or anomalies, an operations engineer can employ an AI-assisted workflow directly from their preferred tools to gain real-time insights. This capability aids in diagnosing issues such as high memory usage, slow queries, and other potential problems before they escalate into critical situations.

A companion demonstration illustrates how operations teams can utilize an AI-assisted workflow to interact with an Amazon ElastiCache Valkey cache. This workflow retrieves information such as used memory, peak memory, number of connected clients, replication status, and more. Although this data can be extensive and challenging for humans to quickly interpret, an AI agent can efficiently summarize the information and highlight the most pertinent results that may be contributing to performance issues.
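
Those metrics come from the engine's INFO command. The sketch below, assuming the valkey-py client (which mirrors the redis-py API) and a placeholder endpoint, collects the handful of fields an agent would summarize.

    # Pull the INFO fields an AI agent would summarize when troubleshooting.
    # The endpoint is a placeholder for your ElastiCache Valkey cache, and the
    # valkey-py package is assumed (the redis-py API works the same way).
    import valkey

    client = valkey.Valkey(host="YOUR_CACHE_ENDPOINT", port=6379, ssl=True)

    info = client.info()  # same data as running INFO in valkey-cli
    snapshot = {
        "used_memory_human": info.get("used_memory_human"),
        "used_memory_peak_human": info.get("used_memory_peak_human"),
        "connected_clients": info.get("connected_clients"),
        "role": info.get("role"),                      # replication role
        "connected_slaves": info.get("connected_slaves"),
    }
    print(snapshot)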

Getting Started

The MCP servers for Aurora, DynamoDB, and ElastiCache are available as an open-source project maintained by AWS Labs. These servers can be run locally in a Docker container on the machine where the AI assistant client runs. Before using the MCP servers, complete the following prerequisite steps:

Prerequisites

  1. Ensure Docker is installed on your development environment. For installation instructions, visit Docker Desktop download.
  2. Execute the following commands in your shell based on the database service(s) you utilize:
    # Clone the repository
    git clone https://github.com/awslabs/mcp.git
    # Navigate to the folder of the database service (example shown for Aurora PostgreSQL)
    cd mcp/src/postgres-mcp-server/
    # Build the Docker image
    docker build -t awslabs/postgres-mcp-server:latest .
    Database service         Directory
    Aurora DSQL              aurora-dsql-mcp-server
    Aurora MySQL             mysql-mcp-server
    Aurora PostgreSQL        postgres-mcp-server
    DynamoDB                 dynamodb-mcp-server
    ElastiCache (Valkey)     valkey-mcp-server
    ElastiCache (Memcached)  memcached-mcp-server
    Amazon Neptune           amazon-neptune-mcp-server
    Amazon Timestream        timestream-for-influxdb-mcp-server
    Amazon DocumentDB        aws-documentdb-mcp-server
    Amazon Keyspaces         amazon-keyspaces-mcp-server
  3. Add the MCP server to your client application’s configuration file. The configuration will depend on the database service and specific client applications, as demonstrated in the following sections.

Once you have completed the prerequisites, you are ready to utilize your preferred IDE and generative AI tools. The next sections will explore common IDEs and tools.

Amazon Q CLI with Cursor

  1. Ensure Cursor is installed on your machine. For installation instructions, visit Installation.
  2. Ensure Amazon Q CLI and the Amazon Q Developer extension are installed for Cursor. For more information on installation, refer to Installing the Amazon Q Developer extension or plugin in your IDE and Installing Amazon Q for command line.
  3. Use the following example workspace configuration to run the MCP servers in containers with Amazon Q CLI, based on your database service. Alternatively, you can use the global configuration as specified in the MCP configuration section in our documentation.
  4. Here are sample configurations for the mcp.json file based on the MCP server you choose:

    For Aurora MySQL and Aurora PostgreSQL:

    {
      "mcpServers": {
        "awslabs.postgres-mcp-server": {
          "command": "docker",
          "args": [
            "run",
            "-i",
            "--rm",
            "-e", "AWS_ACCESS_KEY_ID=YOUR_KEY_HERE",
            "-e", "AWS_SECRET_ACCESS_KEY=YOUR_SECRET_HERE",
            "-e", "AWS_REGION=YOUR_REGION_HERE",
            "awslabs/postgres-mcp-server:latest",
            "--resource_arn", "YOUR_CLUSTER_ARN",
            "--secret_arn", "YOUR_DB_SECRET_ARN",
            "--database", "YOUR_DB_NAME",
            "--region", "YOUR_REGION_NAME",
            "--readonly", "True"
          ]
        },
        "awslabs.mysql-mcp-server": {
          "command": "docker",
          "args": [
            "run",
            "-i",
            "--rm",
            "-e", "AWS_ACCESS_KEY_ID=YOUR_KEY_HERE",
            "-e", "AWS_SECRET_ACCESS_KEY=YOUR_SECRET_HERE",
            "-e", "AWS_REGION=YOUR_REGION_HERE",
            "awslabs/mysql-mcp-server:latest",
            "--resource_arn", "YOUR_CLUSTER_ARN",
            "--secret_arn", "YOUR_DB_SECRET_ARN",
            "--database", "YOUR_DB_NAME",
            "--region", "YOUR_REGION_NAME",
            "--readonly", "True"
          ]
        }
      }
    }

    For Aurora DSQL:

    {
      "mcpServers": {
        "awslabs.aurora-dsql-mcp-server": {
          "command": "docker",
          "args": [
            "run",
            "-i",
            "--rm",
            "-e", "AWS_ACCESS_KEY_ID=YOUR_KEY_HERE",
            "-e", "AWS_SECRET_ACCESS_KEY=YOUR_SECRET_HERE",
            "-e", "AWS_REGION=YOUR_REGION_HERE",
            "awslabs/aurora-dsql-mcp-server:latest",
            "--cluster_endpoint", "DSQL cluster endpoint",
            "--database_user", "admin",
            "--region", "us-east-1"
          ]
        }
      }
    }

    For DynamoDB:

    {
      "mcpServers": {
        "awslabs.dynamodb-mcp-server": {
          "command": "docker",
          "args": [
            "run",
            "--rm",
            "--interactive",
            "--env",
            "FASTMCP_LOG_LEVEL=ERROR",
            "awslabs/dynamodb-mcp-server:latest"
          ],
          "env": {},
          "disabled": false,
          "autoApprove": []
        }
      }
    }

    For ElastiCache:

    {
      "mcpServers": {
        "awslabs.valkey-mcp-server": {
          "command": "docker",
          "args": [
            "run",
            "--rm",
            "--interactive",
            "--env",
            "FASTMCP_LOG_LEVEL=ERROR",
            "--env",
            "VALKEY_HOST=hostname",
            "--env",
            "VALKEY_PORT=6379",
            "awslabs/valkey-mcp-server:latest"
          ],
          "env": {},
          "disabled": false,
          "autoApprove": []
        }
      }
    }

Amazon Q CLI with Visual Studio Code (VS Code)

  1. Ensure VS Code is installed on your machine. Refer to Setting up Visual Studio Code for guidance.
  2. Ensure Amazon Q CLI and the Amazon Q Developer extension are installed for VS Code. For more information, refer to Installing the Amazon Q Developer extension or plugin in your IDE and Installing Amazon Q for command line.
  3. Utilize the example workspace configuration for Amazon Q CLI based on the database service, as explained in the Amazon Q CLI with Cursor section, or use the global configuration as specified in the MCP configuration section in our documentation.
  4. The remaining configuration is identical to that of Amazon Q CLI with Cursor, as discussed previously.

Claude Desktop

  1. Ensure Claude Desktop is installed and running on your machine. Refer to Installing Claude for Desktop for more information.
  2. Open Claude Desktop and navigate to Settings, then select Edit Config.
  3. Select claude_desktop_config.json. The content of claude_desktop_config.json mirrors that of the mcp.json file as described in the Amazon Q CLI with Cursor section.

Where We’re Headed

The introduction of MCP servers signifies a broader transition in how we support builders. Our objective is to integrate AWS databases into the tools where development begins, encompassing local environments, collaborative editors, and AI-assisted workflows.

MCP servers are open source and readily available for use. You can find the code, documentation, and setup instructions in the AWS Labs GitHub repository. Begin integrating them into your workflows to fully leverage the context of AWS databases within your development environment.
