In an effort to understand Amazon Web Services (AWS) further and get a better grasp of how it is configured and priced, I asked Data Canopy’s CTO, Andrew Iwamoto, to walk me through some of the console. I’m not an IT manager by trade, but Andrew has spent years (and years and years) working in and around data centers. Walking through the AWS interface with him, two things became immediately clear: (1) customizations will cost you, and (2) it’s not at all simple to understand just how much.
On-Demand vs. Reserved Instances
Amazon has laid out several aggressively priced packages that work wonderfully for developers who need that EXACT package. Reserved Instances carry significant discounts compared to On-Demand Instances, and Amazon recommends them for steady-state usage. One caveat: Reserved Instance attributes can only be changed for equal or greater value. If you wind up needing less than you purchased, there is no option to scale down.
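To see why the On-Demand vs. Reserved trade-off matters, here is a minimal back-of-the-envelope comparison. All dollar figures below are hypothetical placeholders, not actual AWS rates – the point is the shape of the math, not the numbers.

```python
# Hypothetical comparison of On-Demand vs. 1-year Reserved Instance costs.
# The rates used are illustrative placeholders, NOT real AWS prices.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_cost_on_demand(hourly_rate: float) -> float:
    """Annual cost if the instance runs 24/7 at the On-Demand rate."""
    return hourly_rate * HOURS_PER_YEAR

def annual_cost_reserved(upfront: float, effective_hourly: float) -> float:
    """Annual cost of a 1-year Reserved Instance: upfront fee plus hourly charges."""
    return upfront + effective_hourly * HOURS_PER_YEAR

on_demand = annual_cost_on_demand(0.10)       # placeholder: $0.10/hr
reserved = annual_cost_reserved(200.0, 0.06)  # placeholder: $200 upfront + $0.06/hr
savings_pct = 100 * (on_demand - reserved) / on_demand
print(f"On-Demand: ${on_demand:.0f}/yr, Reserved: ${reserved:.0f}/yr, "
      f"savings: {savings_pct:.0f}%")
```

Note the catch the paragraph above describes: the Reserved savings only materialize if you actually use the capacity you committed to for the full term.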
Public Storage Costs
The basic storage option has two disconcerting qualities: it doesn’t allow data to move across virtual machines, and it doesn’t explain clearly what an upgrade from this pooled storage type would cost. To understand the pricing of each potential upgrade – and there are a lot of them – you must open a pricing page and calculate the price per hour for each specification change. We’re all comfortable with math here, but that is a lot to sort through when you consider the sheer volume of specifications one can set on a single virtual machine.
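The per-hour tallying described above can be sketched in a few lines. The line items and rates here are hypothetical stand-ins – in practice each number has to be looked up on AWS’s pricing pages for the exact specification you chose.

```python
# Sketch of projecting per-hour line items over a billing month.
# All rates are hypothetical placeholders, not real AWS prices.

def monthly_cost(hourly_rates: dict, hours: int = 730) -> float:
    """Sum the per-hour price of each spec line item, projected over
    roughly 730 billable hours per month."""
    return sum(hourly_rates.values()) * hours

config = {  # hypothetical per-hour rates for one VM configuration
    "compute": 0.0416,
    "extra_memory": 0.0104,
    "provisioned_storage": 0.0137,
}
print(f"Estimated monthly cost: ${monthly_cost(config):.2f}")
```

Multiply this exercise by every VM and every spec change, and the bookkeeping burden the paragraph describes becomes clear.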
Storage and snapshots require an expert with excellent attention to detail to ensure that snapshots are executed regularly and the data is kept for only a specified amount of time – all of which must be accomplished by writing scripts. Anecdotally, a company we have worked with had not executed this task precisely and was surprised to find its bill contained thousands of dollars’ worth of unexpected storage costs. Little oversights can result in big charges – all the more reason to engage an expert to help you navigate the waters of migrating to the public cloud.
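The heart of such a script is the retention logic: deciding which snapshots have outlived their window and should be deleted. Below is a minimal sketch of that logic with snapshots simplified to (ID, creation time) pairs; a real script would fetch the list from the cloud API and issue the delete calls, which is exactly where the attention to detail comes in.

```python
from datetime import datetime, timedelta, timezone

def expired_snapshots(snapshots, retention_days, now=None):
    """Return IDs of snapshots created before the retention cutoff.

    snapshots: iterable of (snapshot_id, created_at) pairs,
    where created_at is a timezone-aware datetime.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=retention_days)
    return [snap_id for snap_id, created in snapshots if created < cutoff]

# Hypothetical snapshot inventory, evaluated at a fixed point in time
now = datetime(2024, 6, 30, tzinfo=timezone.utc)
snaps = [
    ("snap-001", datetime(2024, 6, 1, tzinfo=timezone.utc)),  # 29 days old
    ("snap-002", datetime(2024, 5, 1, tzinfo=timezone.utc)),  # 60 days old
]
print(expired_snapshots(snaps, retention_days=30, now=now))  # ['snap-002']
```

If a filter like this silently stops matching – or the script stops running at all – old snapshots simply accumulate, which is precisely how surprise storage bills happen.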
Staying in Control in AWS
Your two biggest costs are compute resources (CPU and RAM) and storage. Long-term storage or development environments are well suited to the AWS cloud. Constantly accessing large amounts of data or moving large amounts of data between environments can be cost prohibitive. Production environments in AWS need application provisioning and autoscaling logic to ensure your usage is running optimally. Planning and vigilance are key to keeping costs under control.
In addition to cost concerns, there are landmines in the selection process that an uninformed purchaser might step on, such as choosing to terminate upon shutdown, which can result in the complete loss of all data – not great. The upshot? You’ll need expertise (either in-house or from a consultant) to navigate the AWS portal and get exactly what you want – at a price you can afford.