Nuances, Limitations, and Shortcomings of the CUR

Author Alexander Yu

Last updated 26 Sep, 2023

5 mins read


So far, we’ve established that if you’re an AWS customer, you should probably set up the CUR sooner rather than later. However, we’ve also hinted at some of the CUR’s flaws. After all, the CUR is far from a perfect tool!

That’s why before you start working with the AWS Cost and Usage Report, you’ll want to be aware of its various nuances, limitations, and shortcomings. Here are some of the major ones:

  • The CUR in its native form is unreadable
  • It requires an analytics tool to parse and interpret data
  • There's a learning curve for CUR-specific structure and terminology
  • CUR updates can be inconsistent
  • CUR features and documentation can be misaligned
  • You must opt into the CUR to get detailed data

We’ll take a closer look at each one in turn.

Unreadable in Native Form

While the AWS Cost and Usage Report (CUR) is essentially just a database table, it’s a really big database table. In the words of the AWS documentation: “The size of an individual report can grow to more than a gigabyte and might exceed the capacity of desktop spreadsheet applications to display every line.” Indeed, as we saw in an earlier screenshot of the CUR, it’s not something you’d want to view in Excel.

Why is this? A single AWS Cost and Usage Report can have over a hundred columns, and that’s with the default settings. If you use extensive tagging on your resources, the column count grows further still, since each cost allocation tag gets its own column.

Depending on how you set up your CUR, you could also have many rows referring to a single resource. For example, if you choose the hourly time granularity, AWS generates one row per hour for each resource. At that rate, roughly 1,400 resources produce about a million rows in a 30-day month (1,400 × 24 × 30 ≈ 1,008,000), so it’s easy to see how CUR files, even for small and mid-sized organizations, can surpass the 1 million lines per file limit. When this happens, your CUR is split into multiple files.

And finally, even if CUR files were manageable in size, their column names and data values can be very difficult to understand. Unless you’re well-versed in CUR-speak, it’s hard to decipher what the column “lineItem/UsageType” refers to, especially when possible values include the ever-cryptic “USE1-EUC1-AWS-Out-Bytes”, “APN2-FreeEventsRecorded”, and “Requests-Tier1”.
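To get a feel for this, here’s a minimal sketch of peeking at a single CUR data file locally with pandas. The file name is a placeholder; your actual CUR objects live in the S3 bucket and prefix you configured, typically as gzipped CSV chunks.

```python
# A minimal sketch of peeking at one CUR data file locally with pandas.
# "my-report-00001.csv.gz" is a placeholder name -- your real CUR objects
# live in the S3 bucket/prefix you configured when you set up the report.
import pandas as pd

cur = pd.read_csv("my-report-00001.csv.gz", compression="gzip", low_memory=False)

print(len(cur.columns))                                    # often well over a hundred columns
print(cur["lineItem/UsageType"].value_counts().head(10))   # cryptic values like "USE1-EUC1-AWS-Out-Bytes"
```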


Requires an Analytics Tool

To handle a beast like the CUR, a data analytics tool is a must. The AWS CUR offers native integrations with Athena, Redshift, and QuickSight. This is great for convenience… though it’s also clever thinking on AWS’s part, since it pulls you deeper into their ecosystem.

While these tools are extremely powerful, some technical expertise is required to be able to use them effectively. For example, a DevOps team might not know what to do with a CUR file on their own. You’ll need to make sure someone is fluent in SQL so that you can capitalize on a CUR integration with Athena or Redshift.

As an additional hidden nuance, switching between native integrations (if you ever choose to do so) can be quirky. Athena and Redshift format CUR columns slightly differently, so changing analytics tools might mean editing your SQL queries to make sure they still work.
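To illustrate what “fluent in SQL” looks like in practice, here’s a rough sketch of kicking off a CUR query through Athena with boto3. The database, table, and result-bucket names are hypothetical, and the exact column spellings are an assumption: Athena typically flattens a column like “lineItem/UnblendedCost” into line_item_unblended_cost, while Redshift formats columns differently, which is exactly why switching tools can mean rewriting queries.

```python
# A rough sketch of running a CUR query through Athena with boto3.
# Database, table, and result-bucket names below are hypothetical.
import boto3

athena = boto3.client("athena", region_name="us-east-1")

query = """
SELECT line_item_usage_type,
       SUM(line_item_unblended_cost) AS unblended_cost
FROM cur_database.cur_table          -- hypothetical database/table names
WHERE line_item_usage_start_date >= date '2023-09-01'
GROUP BY line_item_usage_type
ORDER BY unblended_cost DESC
LIMIT 20;
"""

athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "cur_database"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},  # hypothetical bucket
)
```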

Learning Curve

The first two points highlight an overarching theme of adopting the CUR: there can be a significant learning curve. This applies both to understanding the CUR’s complex columns and format, and to becoming fluent in the analytics tool you choose.

Just consider some of the CUR billing terminology. One topic that newer AWS users often get tangled up in is the difference between “unblended” and “blended” costs (a quick numeric sketch follows the list):

  • Unblended costs represent actual, real-time costs: if you spend $1/day on EC2 in an account, your unblended cost is $1/day. 
  • Blended costs represent average costs across accounts in an organization: if your organization spends $1/day on EC2 in account A, and $5/day on EC2 in account B, the blended cost for both accounts is $3/day (the unblended cost for account A is still $1/day, and $5/day for account B).
  • Apart from these two, you also have amortized costs, which distribute recurring charges evenly across a more reasonable timeframe. For example, if you’re charged monthly for AWS EC2 Reserved Instances, it often helps to view amortized costs to distribute that charge evenly across each day so you don’t see a spike in billing at the start of each month.
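Here’s a tiny sketch that makes the arithmetic concrete, using the numbers from the bullets above (the $300 monthly Reserved Instance charge is made up purely for illustration):

```python
# Toy numbers from the bullets above: two linked accounts, EC2 only.
unblended = {"account_a": 1.00, "account_b": 5.00}       # actual $/day per account

# Blended cost: the organization-wide average rate applied to every account.
blended_rate = sum(unblended.values()) / len(unblended)  # (1 + 5) / 2 = 3.00 per day

# Amortized cost: a hypothetical $300 monthly Reserved Instance charge spread
# over 30 days, so daily reporting doesn't spike at the start of each month.
monthly_ri_charge = 300.00
amortized_per_day = monthly_ri_charge / 30               # 10.00 per day

print(blended_rate, amortized_per_day)
```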

But apart from terminology, there’s also the file structure that you have to get used to, which varies based on the integration you choose. For example, with a Redshift integration, you’d get a CSV file, a Manifest JSON file, a SQL command to create a Redshift table, and a Redshift Manifest JSON file. The Manifest JSON file contains metadata about the CUR, such as a description of the columns, the report ID, and billing period.
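If you want to see what a delivery actually contains, the Manifest JSON is a good place to start. Here’s a small sketch of reading one with Python; the exact keys can vary by report version, so treat the field names below as illustrative rather than a guaranteed schema.

```python
# A small sketch of reading a CUR Manifest JSON to see what's in a delivery.
# Key names here are illustrative; check your own manifest for the exact schema.
import json

with open("my-report-Manifest.json") as f:   # hypothetical local copy of the manifest
    manifest = json.load(f)

print(manifest.get("reportId"))
print(manifest.get("billingPeriod"))
print([f"{c['category']}/{c['name']}" for c in manifest.get("columns", [])][:10])
print(manifest.get("reportKeys"))            # the data files included in this delivery
```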


All this to say, the CUR isn’t the most intuitive tool in the world. Expect to devote some time to learning how to use it.

Inconsistent CUR Updates

Every day, AWS updates your CUR file with costs incurred the day before (in UTC). At most, updates happen three to four times a day, at regular intervals of roughly six to eight hours. In reality, however, AWS can be very inconsistent in delivering timely updates. For example, CUR updates are most commonly delayed near the beginning and end of the month, without much explanation from AWS as to why.

End-of-month reporting can also be tricky to navigate, since AWS often makes retroactive updates to the previous month’s report in addition to updating the current month’s. For example, if you have an enterprise discount, AWS may apply it at the end of the monthly billing cycle, which means last month’s report gets updated to include the discount details, a change that’s easy to miss.

For these reasons, we’d recommend that you avoid relying on AWS to deliver CUR updates on a timely basis. Instead, set up a system that consumes CUR updates whenever they’re delivered, as suggested by AWS Well-Architected Labs.
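One common pattern is to react to deliveries instead of polling on a schedule: configure an S3 event notification on your CUR prefix and let a small Lambda function hand new files to whatever loads your analytics store. The sketch below assumes that setup; the bucket, prefix, and downstream load step are placeholders.

```python
# A minimal sketch of consuming CUR deliveries as they land, rather than
# polling on a schedule. An S3 "ObjectCreated" notification on the CUR prefix
# triggers this Lambda handler. The downstream load step is a placeholder.
import urllib.parse

def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Only react to actual report data files, not manifests or other objects.
        if key.endswith(".csv.gz"):
            print(f"New CUR file delivered: s3://{bucket}/{key}")
            # load_into_warehouse(bucket, key)   # hypothetical downstream step
```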

Misalignment Between CUR Features and Documentation

The CUR data dictionary in the AWS documentation does a great job of explaining the columns you’ll find in your CUR file. However, it’s also prone to becoming outdated, especially since AWS periodically releases new fields and columns. In addition, just about any new AWS feature release could introduce new possible values or completely change the way something is reported, leaving the data dictionary to play catch-up.

In fact, the data dictionary acknowledges this in its introduction: it describes only “a subset of columns that you see in your report,” so new columns may not be documented right away.

Must Opt in to Get Detailed Data

Perhaps the biggest shortcoming of the CUR is that despite all these other technical nuances and limitations, you’re still pretty much forced to opt into it. After all, it’s AWS’s only offering for receiving detailed spending data. 

And not only are you forced to opt in, but you should opt in ASAP. There’s no backfilling of past spending data with the CUR, so you’re missing out on a gold mine of data if you sit idle. For instance, if you enable the CUR three years after opening your AWS account, that’s over two years of spending data that’s impossible to recover: AWS Cost Explorer does give you basic spending information for the most recent 12 months, but that’s nothing compared to the detailed data you get with the CUR.

AWS should prompt you to enable the CUR the moment you set up your account, but unfortunately that isn’t the case. We absolutely recommend enabling the CUR as early as possible so you can start accumulating spending data in your S3 bucket.

Author Alexander Yu
Alexander Yu is a technical writer at AWS by day and a freelance writer by night. After completing his BS in electrical engineering and computer science from UC Berkeley, he became a software developer at AWS for almost three years before transitioning into technical writing. He lives in Seattle with his dog Yuna.
