As Amazon AWS strives to cover even more use cases, high-memory applications remain a traditional hardware niche that EC2 services. AWS EC2 has offered memory-focused instance types including the M2, CR1, R3, and R4. Now comes the next generation: the AWS R5 EC2 instances. These instances are targeted at applications such as in-memory analytics, where having large RAM capacity is extremely important to minimize access to slower storage.
AWS R5 EC2 Instances
Underpinning the AWS R5 EC2 instances are Intel Xeon Platinum 8000 series parts running at up to a 3.1GHz all-core turbo. Amazon claims that the Skylake-SP Xeon chips used are custom. There is a good chance AWS is using “M” series SKUs. You can read about the M series in Intel Xeon Scalable Processor Family SKUs and Value Analysis. These SKUs allow for up to 1.5TB per socket using 12x 128GB DIMMs (two per channel across Skylake-SP's six memory channels). The instances support NVMe storage and the Elastic Network Adapter (ENA) as well.
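As a quick sanity check on that per-socket figure, here is the DIMM math, assuming two 128GB modules on each of Skylake-SP's six memory channels:

```python
# Quick sketch of the per-socket memory math behind the "M" SKU claim:
# six memory channels per Skylake-SP socket, two DIMMs per channel, 128GB each.
channels_per_socket = 6
dimms_per_channel = 2
dimm_capacity_gb = 128

per_socket_gb = channels_per_socket * dimms_per_channel * dimm_capacity_gb
print(f"{per_socket_gb} GB per socket (~{per_socket_gb / 1024:.1f} TB)")
# 1536 GB per socket (~1.5 TB)
```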
Here is the table of AWS R5 EC2 instances that the company announced:
| Instance Name | vCPUs | Memory | EBS-Optimized Bandwidth | Network Bandwidth |
|---------------|-------|---------|-------------------------|-------------------|
| r5.large | 2 | 16 GiB | Up to 3.5 Gbps | Up to 10 Gbps |
| r5.xlarge | 4 | 32 GiB | Up to 3.5 Gbps | Up to 10 Gbps |
| r5.2xlarge | 8 | 64 GiB | Up to 3.5 Gbps | Up to 10 Gbps |
| r5.4xlarge | 16 | 128 GiB | 3.5 Gbps | Up to 10 Gbps |
| r5.12xlarge | 48 | 384 GiB | 7.0 Gbps | 10 Gbps |
| r5.24xlarge | 96 | 768 GiB | 14.0 Gbps | 25 Gbps |
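For readers who want to try the new instance types once they reach general availability, here is a minimal boto3 sketch that launches a single r5.large. The region and AMI ID are placeholders for illustration, not recommendations:

```python
import boto3

# Minimal sketch: launch one r5.large once the type is available in your region.
# The AMI ID below is a placeholder; substitute a real image ID for your account.
ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="r5.large",
    MinCount=1,
    MaxCount=1,
    EbsOptimized=True,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}")
```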
AWS did not announce pricing; we expect to see it as these machines come online and become generally available. The largest instances top out at 25Gbps of network bandwidth, so for those who need large clusters with high inter-node traffic, this may still be below what one would see in an on-prem data center, where 100GbE is reasonably priced within a few racks these days.
There is also an r5d instance type that adds local NVMe storage to the standard r5 instances. We expect the r5d.24xlarge to have 3.68TB of local NVMe storage, with proportionally less local storage on the smaller instances.
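On an r5d, the local disks should appear as NVMe block devices alongside any NVMe-attached EBS volumes. Below is a short sketch of telling the two apart on Linux by reading each controller's model string from sysfs; the exact model strings are an assumption based on other NVMe-backed instance families:

```python
from pathlib import Path

# Sketch: distinguish local NVMe instance store from NVMe-attached EBS volumes
# by reading each NVMe controller's model string from sysfs. The model strings
# matched here are assumptions, not confirmed for R5d.
for ctrl in sorted(Path("/sys/class/nvme").glob("nvme*")):
    model = (ctrl / "model").read_text().strip()
    if "Instance Storage" in model:
        kind = "local instance store"
    elif "Elastic Block Store" in model:
        kind = "EBS volume"
    else:
        kind = "unknown"
    print(f"/dev/{ctrl.name}: {model} -> {kind}")
```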
Final Words
These are the types of machines where we see a major opportunity for Intel Optane Persistent Memory; building instance types for large Redis nodes is one example. We published details on the Next-Gen Intel Cascade Lake 28C Supporting 3.84TB Memory using the new technology. We expect a new generation later this year or next year to take advantage of Optane and offer significantly more capacity as a result.