inf1.6xlarge
The inf1.6xlarge instance is in the Machine Learning ASIC instances family, with 24 vCPUs, 48.0 GiB of memory, and 25 Gbps of network bandwidth, starting at $1.18 per hour.
Pricing
From $413.91 per month (-52% vs. on-demand) with Autopilot.
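The monthly figures above can be derived from the hourly on-demand rate. A minimal sketch, assuming AWS's 730-hour billing month convention and the -52% Autopilot discount quoted above:

```python
# Derive monthly cost from the hourly on-demand rate quoted above.
# Assumes a 730-hour month (AWS pricing convention: 8760 h / 12).
HOURLY_ON_DEMAND = 1.18   # USD per hour, from the summary above
HOURS_PER_MONTH = 730

monthly_on_demand = HOURLY_ON_DEMAND * HOURS_PER_MONTH
autopilot_monthly = monthly_on_demand * (1 - 0.52)  # -52% discount

print(f"On-demand: ${monthly_on_demand:.2f}/month")  # ≈ $861.40
print(f"Autopilot: ${autopilot_monthly:.2f}/month")  # ≈ $413.47, close to the $413.91 quote
```

The small gap between the computed $413.47 and the quoted $413.91 comes from the discount percentage being rounded to a whole number on the page.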
Family Sizes
Size | vCPUs | Memory (GiB) |
---|---|---|
inf1.xlarge | 4 | 8 |
inf1.2xlarge | 8 | 16 |
inf1.6xlarge | 24 | 48 |
inf1.24xlarge | 96 | 192 |
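Every size in the family keeps the same memory-to-vCPU ratio, which is a quick way to sanity-check the table. A short check using the figures above:

```python
# Verify the inf1 family's constant 2 GiB-per-vCPU ratio from the table above.
family = {
    "inf1.xlarge":   (4, 8),     # (vCPUs, memory in GiB)
    "inf1.2xlarge":  (8, 16),
    "inf1.6xlarge":  (24, 48),
    "inf1.24xlarge": (96, 192),
}
for size, (vcpus, mem_gib) in family.items():
    assert mem_gib / vcpus == 2.0
    print(f"{size}: {mem_gib / vcpus:.1f} GiB per vCPU")
```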
Instance Details
Compute | Value |
---|---|
vCPUs | 24 |
Memory (GiB) | 48.0 |
Memory per vCPU (GiB) | 2.0 |
Physical Processor | Intel Xeon Platinum 8275CL (Cascade Lake) |
Clock Speed (GHz) | None |
CPU Architecture | x86_64 |
GPU | 4 |
GPU Architecture | AWS Inferentia |
Video Memory (GiB) | 0 |
GPU Compute Capability | 0 |
FPGA | 0 |
Networking | Value |
---|---|
Network Performance (Gbps) | 25 |
Enhanced Networking | True |
IPV6 | True |
Placement Group | True |
Storage | Value |
---|---|
EBS Optimized | True |
Max Bandwidth (Mbps) on EBS | 4750 |
Max Throughput (MB/s) on EBS | 593.75 |
Max I/O Operations/second (IOPS) | 20000 |
Baseline Bandwidth (Mbps) on EBS | 4750 |
Baseline Throughput (MB/s) on EBS | 593.75 |
Baseline I/O Operations/second (IOPS) | 20000 |
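The EBS throughput figure is simply the bandwidth figure converted from megabits to megabytes, which is worth knowing when comparing the two rows:

```python
# The EBS throughput row is the bandwidth row converted from Mbps to MB/s.
EBS_BANDWIDTH_MBPS = 4750          # from the table above

throughput_mb_s = EBS_BANDWIDTH_MBPS / 8  # 8 bits per byte
print(throughput_mb_s)             # 593.75, matching the table
```

Because the baseline and maximum rows are identical (4750 Mbps, 20,000 IOPS), this instance sustains its full EBS performance continuously rather than relying on burst credits.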
Amazon | Value |
---|---|
Generation | current |
Instance Type | inf1.6xlarge |
Family | Machine Learning ASIC Instances |
Name | INF1 6xlarge |
Elastic Map Reduce (EMR) | False |