inf1.24xlarge

The inf1.24xlarge instance is in the Machine Learning ASIC Instances family with 96 vCPUs, 192 GiB of memory, and 100 Gbps of network bandwidth, starting at $4.721 per hour.

Pricing

On Demand: $4.721 per hour
Spot: $1.6051 per hour
1-Year Reserved: $2.974 per hour
3-Year Reserved: $2.269 per hour
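To put the hourly rates in perspective, here is a quick back-of-the-envelope calculation (a sketch, assuming a 730-hour billing month, the convention AWS uses for monthly estimates):

```python
# Hourly rates for inf1.24xlarge, taken from the pricing list above (USD).
ON_DEMAND = 4.721
SPOT = 1.6051

HOURS_PER_MONTH = 730  # assumed: AWS's convention for monthly estimates

monthly_on_demand = ON_DEMAND * HOURS_PER_MONTH
spot_savings_pct = (1 - SPOT / ON_DEMAND) * 100

print(f"On-demand monthly: ${monthly_on_demand:,.2f}")        # ~$3,446.33
print(f"Spot savings vs on-demand: {spot_savings_pct:.0f}%")  # ~66%
```

Actual monthly cost varies with usage hours and region; the spot figure in particular fluctuates over time.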


Family Sizes

Size            vCPUs   Memory (GiB)
inf1.xlarge     4       8
inf1.2xlarge    8       16
inf1.6xlarge    24      48
inf1.24xlarge   96      192
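One property worth noting in the size table: every inf1 size keeps the same 2 GiB-per-vCPU ratio, so memory scales linearly with vCPU count. A quick check of the table's numbers:

```python
# (size, vCPUs, memory GiB) rows from the Family Sizes table above.
SIZES = [
    ("inf1.xlarge", 4, 8),
    ("inf1.2xlarge", 8, 16),
    ("inf1.6xlarge", 24, 48),
    ("inf1.24xlarge", 96, 192),
]

for name, vcpus, mem_gib in SIZES:
    # Each row divides to the same 2 GiB per vCPU.
    print(f"{name}: {mem_gib / vcpus:g} GiB per vCPU")
```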

Instance Variants

inf1, inf2


Instance Details

Compute
vCPUs: 96
Memory (GiB): 192
Memory per vCPU (GiB): 2
Physical Processor: Intel Xeon Platinum 8275CL (Cascade Lake)
Clock Speed (GHz): N/A
CPU Architecture: x86_64
GPU: 16
GPU Architecture: AWS Inferentia
Video Memory (GiB): 0
GPU Compute Capability: 0
FPGA: 0
Networking
Network Performance (Gbps): 100
Enhanced Networking: true
IPv6: true
Placement Group: (not specified)
Storage
EBS Optimized: true
Max EBS Bandwidth (Mbps): 19000
Max EBS Throughput (MB/s): 2375
Max EBS I/O Operations per Second (IOPS): 80000
Baseline EBS Bandwidth (Mbps): 19000
Baseline EBS Throughput (MB/s): 2375
Baseline EBS I/O Operations per Second (IOPS): 80000
Devices: 0
Swap Partition: false
NVMe Drive: false
Disk Space (GiB): 0
SSD: false
Initialize Storage: false
Amazon
Generation: current
Instance Type: inf1.24xlarge
Family: Machine Learning ASIC Instances
Name: INF1 24xlarge
Elastic Map Reduce (EMR): false