AWS cost estimation using Amazon Q CLI and the AWS Cost Analysis MCP server


Managing and optimizing AWS infrastructure costs is a critical challenge for organizations of all sizes. Traditional cost analysis approaches often involve the following:

  • Complex spreadsheets – Building and maintaining detailed cost models, which requires significant effort
  • Multiple tools – Switching between the AWS Pricing Calculator, AWS Cost Explorer, and third-party tools
  • Specialized knowledge – Understanding the nuances of AWS pricing across services and AWS Regions
  • Time-consuming analysis – Manually comparing different deployment options and scenarios
  • Delayed optimization – Cost insights often arrive too late to inform architectural decisions

Amazon Q Developer CLI with the Model Context Protocol (MCP) offers a transformative approach to AWS cost analysis. By using generative AI through natural language prompts, teams can now generate detailed cost estimates, comparisons, and optimization recommendations in minutes rather than hours, while maintaining accuracy through integration with official AWS pricing data.

In this post, we explore how to use Amazon Q CLI with the AWS Cost Analysis MCP server to perform sophisticated cost analysis that follows AWS best practices. We cover basic setup and advanced techniques, with detailed examples and step-by-step instructions.

Solution overview

Amazon Q Developer CLI is a command line interface that brings the generative AI capabilities of Amazon Q directly to your terminal. Developers can interact with Amazon Q through natural language prompts, making it a valuable tool for a wide range of development tasks.
Developed by Anthropic as an open protocol, the Model Context Protocol (MCP) provides a standardized way to connect AI models to different data sources or tools. Using a client-server architecture (as illustrated in the following diagram), the MCP helps developers expose their data through lightweight MCP servers while building AI applications as MCP clients that connect to those servers.

The MCP uses a client-server architecture containing the following components:

  • Host – A program or AI tool that requires access to data through the MCP protocol, such as Anthropic’s Claude Desktop, an integrated development environment (IDE), or another AI application
  • Client – Protocol clients that maintain one-to-one connections with servers
  • Server – Lightweight programs that expose capabilities through the standardized MCP or act as tools
  • Data sources – Local data sources such as databases and file systems, or external systems available over the internet through APIs (web APIs) that MCP servers can connect to

(Diagram: MCP client-server architecture)

As announced in April 2025, the MCP enables Amazon Q Developer to connect to specialized servers that extend its capabilities beyond what’s possible with the base model alone. MCP servers act as plugins for Amazon Q, providing domain-specific knowledge and functionality. The AWS Cost Analysis MCP server specifically enables Amazon Q to generate detailed cost estimates, reports, and optimization recommendations using real-time AWS pricing data.

Prerequisites

To implement this solution, you must have an AWS account with appropriate permissions and follow the steps below.

Set up your environment

Before you can start analyzing costs, you need to set up your environment with Amazon Q CLI and the AWS Cost Analysis MCP server. This section provides detailed instructions for installation and configuration.

Install Amazon Q Developer CLI

Amazon Q Developer CLI is available as a standalone installation. Complete the following steps to install it:

  1. Download and install Amazon Q Developer CLI. For instructions, see Using Amazon Q Developer on the command line.
  2. Verify the installation by running the following command (collected with the login command in the terminal sketch after these steps): q --version
    You should see output similar to the following: Amazon Q Developer CLI version 1.x.x
  3. Configure Amazon Q CLI with your AWS credentials: q login
  4. Choose the login method suitable for you.
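
A minimal terminal sketch of steps 2 and 3, shown here for convenience (the version string is illustrative and will differ on your machine):

    # Confirm the CLI is installed and on your PATH
    q --version
    # Example output: Amazon Q Developer CLI version 1.x.x

    # Authenticate the CLI with your AWS credentials, then pick a login method
    q login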

Set up MCP servers

Before using the AWS Cost Analysis MCP server with Amazon Q CLI, you must install several tools and configure your environment. The following steps guide you through installing the necessary tools and setting up the MCP server configuration (the installation commands are consolidated in the shell sketch after this list):

  1. Install Pandoc, which is used to convert the output to PDF, using the following command (you can also install it with brew): pip install pandoc
  2. Install uv with the following command: pip install uv
  3. Install Python 3.10 or newer: uv python install 3.10
  4. Add the following server configuration to your ~/.aws/amazonq/mcp.json file:
    {
      "mcpServers": {
        "awslabs.cost-analysis-mcp-server": {
          "command": "uvx",
          "args": ["awslabs.cost-analysis-mcp-server"],
          "env": {
            "FASTMCP_LOG_LEVEL": "ERROR"
          },
          "autoApprove": [],
          "disabled": false
        }
      }
    }
    

    Now, Amazon Q CLI automatically discovers MCP servers in the ~/.aws/amazonq/mcp.json file.
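
For reference, the installation commands from steps 1–3 as a single shell sketch (shown with pip; on macOS you can install Pandoc with Homebrew instead, as noted above):

    # Pandoc is used to convert generated cost reports to PDF
    pip install pandoc

    # Install uv, then use it to install Python 3.10
    pip install uv
    uv python install 3.10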

Understanding MCP server tools

The AWS Cost Analysis MCP server provides several powerful tools:

  • get_pricing_from_web – Retrieves pricing information from AWS pricing webpages
  • get_pricing_from_api – Fetches pricing data from the AWS Price List API
  • generate_cost_report – Creates detailed cost analysis reports with breakdowns and visualizations
  • analyze_cdk_project – Analyzes AWS Cloud Development Kit (AWS CDK) projects to identify services used and estimate costs
  • analyze_terraform_project – Analyzes Terraform projects to identify services used and estimate costs
  • get_bedrock_patterns – Retrieves architecture patterns for Amazon Bedrock with cost considerations

These tools work together to help you create accurate cost estimates that follow AWS best practices.

Test your setup

Let’s verify that everything is working correctly by generating a simple cost analysis:

  1. Start the Amazon Q CLI chat interface and check that the output shows the MCP server being loaded and initialized: q chat
  2. In the chat interface, enter the following prompt: Please create a cost analysis for a simple web application with an Application Load Balancer, two t3.medium EC2 instances, and an RDS db.t3.medium MySQL database. Assume 730 hours of usage per month and average traffic of about 100 GB data transfer. Convert estimation to a PDF format.
  3. Amazon Q CLI will ask for permission to trust the tool that is being used; enter t to trust it. Amazon Q should then generate and display a detailed cost analysis. Your output should look like the following screenshot. (This flow is condensed into the terminal sketch after these steps.)

    If you see the cost analysis report, your environment is set up correctly. If you encounter issues, verify that Amazon Q CLI can access the MCP server by making sure you installed the necessary tools and that the server is listed in the ~/.aws/amazonq/mcp.json file.
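
The test flow from these steps, condensed into a shell-style sketch (the startup output and trust prompt wording will vary by Amazon Q CLI version):

    # Start the chat interface; the startup output should list
    # awslabs.cost-analysis-mcp-server among the loaded MCP servers
    q chat

    # Inside the chat, paste the cost analysis prompt from step 2,
    # then enter "t" when asked for permission to trust the tool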

Configuration options

The AWS Cost Analysis MCP server supports several configuration options to customize your cost analysis experience; an illustrative prompt that combines several of them follows this list:

  • Output format – Choose from Markdown, CSV, or PDF (which we installed the Pandoc package for) for cost reports
  • Pricing model – Specify On-Demand, Reserved Instances, or Savings Plans
  • Assumptions and exclusions – Customize the assumptions and exclusions for your cost analysis
  • Detailed cost data – Provide specific usage patterns for more accurate estimates
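
As an illustration only (this prompt and the m5.large workload are our own example, not one of the scenarios later in this post), a single prompt can combine several of these options:

Create a cost analysis for two m5.large EC2 instances running 730 hours per month. Compare On-Demand pricing with a 1-year Savings Plan, assume 50 GB of outbound data transfer per month, exclude support costs, and provide the report in CSV format.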

Now that the environment is set up, let’s create some cost analyses.

Create AWS Cost Analysis reports

In this section, we walk through the process of creating AWS cost analysis reports using Amazon Q CLI with the AWS Cost Analysis MCP server.

When you provide a prompt to Amazon Q CLI, the AWS Cost Analysis MCP server completes the following steps:

  1. Interpret your requirements.
  2. Retrieve pricing data from AWS pricing sources.
  3. Generate a detailed cost analysis report.
  4. Provide optimization recommendations.

This process happens seamlessly, so you can focus on describing what you want rather than how to create it.

AWS cost analysis reports typically include the following information:

  • Service costs – Breakdown of costs by AWS service
  • Unit pricing – Detailed unit pricing information
  • Usage quantities – Estimated usage quantities for each service
  • Calculation details – Step-by-step calculations showing how costs were derived
  • Assumptions – Clearly stated assumptions used in the analysis
  • Exclusions – Costs that were not included in the analysis
  • Recommendations – Cost optimization suggestions

Example 1: Analyze a serverless application

Let’s create a cost analysis for a simple serverless application. Use the following prompt:

Create a cost analysis for a serverless application using API Gateway, Lambda, and DynamoDB. Assume 1 million API calls per month, average Lambda execution time of 200ms with 512MB memory, and 10GB of DynamoDB storage with 5 million read requests and 1 million write requests per month. Convert estimation to a PDF format.

Upon entering your prompt, Amazon Q CLI will retrieve pricing data using the get_pricing_from_web or get_pricing_from_api tool, and will then use the generate_cost_report tool from awslabs.cost-analysis-mcp-server.

You should receive an output giving a detailed cost breakdown based on the prompt, along with optimization recommendations.

The generated cost analysis shows the following information:

  • Amazon API Gateway costs for 1 million requests
  • AWS Lambda costs for compute time and requests
  • Amazon DynamoDB costs for storage, read, and write capacity
  • Total monthly cost estimate
  • Cost optimization recommendations

Example 2: Analyze multi-tier architectures

Multi-tier architectures separate applications into functional layers (presentation, application, and data) to improve scalability and security. This example analyzes costs for implementing such an architecture on AWS with components for each tier:

Create a cost analysis for a three-tier web application with a presentation tier (ALB and CloudFront), application tier (ECS with Fargate), and data tier (Aurora PostgreSQL). Include costs for 2 Fargate tasks with 1 vCPU and 2GB memory each, an Aurora db.r5.large instance with 100GB storage, an Application Load Balancer with 10

This time, we’re formatting the output as both PDF and DOCX.

The cost analysis shows the following information:

Example 3: Compare deployment options

When deploying containers on AWS, choosing between Amazon ECS with Amazon Elastic Compute Cloud (Amazon EC2) or Fargate involves different cost structures and management overhead. This example compares these options to determine the most cost-effective solution for a specific workload:

Compare the costs between running a containerized application on ECS with the EC2 launch type versus the Fargate launch type. Assume 4 containers each needing 1 vCPU and 2GB memory, running 24/7 for a month. For EC2, use t3.medium instances. Provide a recommendation on which option is more cost-effective for this workload. Convert estimation to an HTML webpage.

This time, we’re formatting the output as an HTML webpage.

The cost comparison includes the following information:

  • Amazon ECS with Amazon EC2 launch type costs
  • Amazon ECS with Fargate launch type costs
  • Detailed breakdown of each option’s pricing components
  • Side-by-side comparison of total costs
  • Recommendations for the most cost-effective option
  • Considerations for when each option might be preferred

Real-world examples

Let’s explore some real-world architecture patterns and how to analyze their costs using Amazon Q CLI with the AWS Cost Analysis MCP server.

Ecommerce platform

Ecommerce platforms require scalable, resilient architectures with careful cost management. These systems typically use microservices to handle various functions independently while maintaining high availability. This example analyzes costs for a complete ecommerce solution with multiple components serving moderate traffic levels:

Create a cost analysis for an e-commerce platform with microservices architecture. Include components for product catalog, shopping cart, checkout, payment processing, order management, and user authentication. Assume moderate traffic of 500,000 monthly active users, 2 million page views per day, and 50,000 orders per month. Ensure the analysis follows AWS best practices for cost optimization. Convert estimation to a PDF format.

The cost analysis includes the following key components:

Data analytics platform

Modern data analytics platforms need to efficiently ingest, store, process, and visualize large volumes of data while managing costs effectively. This example examines the AWS services and costs involved in building a complete analytics pipeline that handles significant daily data volumes with multiple user access requirements:

Create a cost analysis for a data analytics platform processing 500GB of new data daily. Include components for data ingestion (Kinesis), storage (S3), processing (EMR), and visualization (QuickSight). Assume 50 users accessing dashboards daily and data retention of 90 days. Ensure the analysis follows AWS best practices for cost optimization and includes recommendations for cost-effective scaling. Convert estimation to an HTML webpage.

The cost analysis includes the following key components:

  • Data ingestion costs (Amazon Kinesis Data Streams and Amazon Data Firehose)
  • Storage costs (Amazon S3 with lifecycle policies)
  • Processing costs (Amazon EMR cluster)
  • Visualization costs (Amazon QuickSight)
  • Data transfer costs between services
  • Total monthly cost estimate
  • Cost optimization recommendations for each component
  • Scaling considerations and their cost implications

Clean up

If you no longer want to use the AWS Cost Analysis MCP server with Amazon Q CLI, you can remove it from your configuration:

  1. Open your ~/.aws/amazonq/mcp.json file.
  2. Remove or comment out the “awslabs.cost-analysis-mcp-server” entry.
  3. Save the file.

This prevents the server from being loaded when you start Amazon Q CLI in the future, as shown in the minimal example that follows.
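
After removing the entry, a minimal ~/.aws/amazonq/mcp.json would look like the following. Alternatively, you might keep the entry and set its "disabled" field to true, assuming your version of Amazon Q CLI honors that field:

    {
      "mcpServers": {}
    }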

Conclusion

In this post, we explored how to use Amazon Q CLI with the AWS Cost Analysis MCP server to create detailed cost analyses that use accurate AWS pricing data. This approach offers significant advantages over traditional cost estimation methods:

  • Time savings – Generate complex cost analyses in minutes instead of hours
  • Accuracy – Make sure estimates use the latest AWS pricing information
  • Comprehensive – Include relevant cost components and considerations
  • Actionable – Receive specific optimization recommendations
  • Iterative – Quickly compare different scenarios through simple prompts
  • Validation – Check estimates against official AWS pricing

As you continue exploring AWS cost analysis, we encourage you to deepen your knowledge by learning more about the Model Context Protocol (MCP) and how it enhances the capabilities of Amazon Q. For hands-on cost estimation, the AWS Pricing Calculator offers an interactive experience to model and compare different deployment scenarios. To make sure your architectures follow financial best practices, the AWS Well-Architected Framework Cost Optimization Pillar provides comprehensive guidance on building cost-efficient systems. And to stay at the forefront of these tools, keep an eye on updates to the official AWS MCP servers, which are constantly evolving with new features to make your cost analysis experience even more powerful and accurate.


About the Authors

Joel Asante, an Austin-based Solutions Architect at Amazon Web Services (AWS), works with GovTech (Government Technology) customers. With a strong background in data science and application development, he brings deep technical expertise to creating secure and scalable cloud architectures for his customers. Joel is passionate about data analytics, machine learning, and robotics, leveraging his development experience to design innovative solutions that meet complex government requirements. He holds 13 AWS certifications and enjoys family time, fitness, and cheering for the Kansas City Chiefs and Los Angeles Lakers in his spare time.

Dunieski Otano is a Solutions Architect at Amazon Web Services based out of Miami, Florida. He works with Worldwide Public Sector MNO (Multi-International Organizations) customers. His passions are security, machine learning and artificial intelligence, and serverless. He works with his customers to help them build and deploy highly available, scalable, and secure solutions. Dunieski holds 14 AWS certifications and is an AWS Golden Jacket recipient. In his free time, you can find him spending time with his family and dog, watching a great movie, coding, or flying his drone.

Varun Jasti is a Solutions Architect at Amazon Web Services, working with AWS Partners to design and scale artificial intelligence solutions for public sector use cases to meet compliance standards. With a background in computer science, his work covers a wide range of ML use cases, primarily focusing on LLM training/inferencing and computer vision. In his spare time, he loves playing tennis and swimming.



