inf1.24xlarge

The inf1.24xlarge instance is in the Machine Learning ASIC instances family, with 96 vCPUs, 192.0 GiB of memory, and 100 Gbps of network bandwidth, starting at $4.721 per hour.
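
These figures can also be pulled straight from the EC2 API. The sketch below is a minimal example using boto3's describe_instance_types call; the region and credential setup are assumptions, and the printed fields should line up with the values on this page.

```python
import boto3

# Minimal sketch: assumes AWS credentials are configured; the region is an example.
ec2 = boto3.client("ec2", region_name="us-east-1")

resp = ec2.describe_instance_types(InstanceTypes=["inf1.24xlarge"])
info = resp["InstanceTypes"][0]

print(info["VCpuInfo"]["DefaultVCpus"])           # 96 vCPUs
print(info["MemoryInfo"]["SizeInMiB"] / 1024)     # 192.0 GiB
print(info["NetworkInfo"]["NetworkPerformance"])  # e.g. "100 Gigabit"
```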


Pricing

On Demand: $4.721 per hour

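As a rough sanity check on any monthly estimate, the on-demand rate can be extrapolated by hand; the 730-hour month below is a common billing approximation (8,760 hours in a year divided by 12), not an AWS-published constant.

```python
on_demand_hourly = 4.721      # USD per hour, from the on-demand price above

hours_per_month = 730         # approximation: 8,760 hours / 12 months
monthly = on_demand_hourly * hours_per_month
yearly = on_demand_hourly * 24 * 365

print(f"~${monthly:,.2f} per month")  # ~$3,446.33
print(f"~${yearly:,.2f} per year")    # ~$41,355.96
```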

Family Sizes

Size vCPUs Memory (GiB)
inf1.xlarge 4 8
inf1.2xlarge 8 16
inf1.6xlarge 24 48
inf1.24xlarge 96 192
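
Every size in the family keeps the same 2 GiB-of-memory-per-vCPU ratio reported in the instance details further down. A quick check over the table values, with the inf1.24xlarge row included:

```python
# Sizes taken from the table above: (vCPUs, memory in GiB).
inf1_sizes = {
    "inf1.xlarge":   (4, 8),
    "inf1.2xlarge":  (8, 16),
    "inf1.6xlarge":  (24, 48),
    "inf1.24xlarge": (96, 192),
}

for name, (vcpus, mem_gib) in inf1_sizes.items():
    print(f"{name}: {mem_gib / vcpus:.1f} GiB per vCPU")  # prints 2.0 for every size
```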

Instance Variants

Variant
inf1
inf2

Instance Details

Compute
vCPUs: 96
Memory (GiB): 192.0
Memory per vCPU (GiB): 2.0
Physical Processor: Intel Xeon Platinum 8275CL (Cascade Lake)
Clock Speed (GHz): None
CPU Architecture: x86_64
GPU: 16
GPU Architecture: AWS Inferentia
Video Memory (GiB): 0
GPU Compute Capability: 0
FPGA: 0

Networking
Network Performance (Gbps): 100
Enhanced Networking: True
IPv6: True
Placement Group: True
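
Because the instance supports placement groups and enhanced networking, latency-sensitive deployments typically launch it into a cluster placement group. The boto3 sketch below is illustrative only: the placement group name and AMI ID are placeholders, and ENA enhanced networking is assumed to come from a current-generation AMI rather than being configured here.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # example region

# Create a cluster placement group (the name is a placeholder).
ec2.create_placement_group(GroupName="inf1-cluster-pg", Strategy="cluster")

# Launch the instance into the group; the AMI ID is a placeholder, not a real image.
resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="inf1.24xlarge",
    MinCount=1,
    MaxCount=1,
    Placement={"GroupName": "inf1-cluster-pg"},
)
print(resp["Instances"][0]["InstanceId"])
```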

Storage
EBS Optimized: True
Max EBS Bandwidth (Mbps): 19000
Max EBS Throughput (MB/s): 2375.0
Max EBS I/O Operations per Second (IOPS): 80000
Baseline EBS Bandwidth (Mbps): 19000
Baseline EBS Throughput (MB/s): 2375.0
Baseline EBS I/O Operations per Second (IOPS): 80000
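
The throughput figure is the bandwidth figure expressed in bytes (19,000 Mbps divided by 8 bits per byte is 2,375 MB/s), and because the baseline and maximum values are identical, these are sustained limits rather than burst limits. A one-line check:

```python
ebs_bandwidth_mbps = 19_000   # from the Storage values above
ebs_throughput_mbs = 2_375.0

# megabits per second / 8 bits per byte = megabytes per second
assert ebs_bandwidth_mbps / 8 == ebs_throughput_mbs
```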

Amazon
Generation: current
Instance Type: inf1.24xlarge
Family: Machine Learning ASIC Instances
Name: INF1 24xlarge
Elastic MapReduce (EMR): False
