
inf2.24xlarge

The inf2.24xlarge instance is in the Machine Learning ASIC Instances family, with 96 vCPUs, 384 GiB of memory, and 50 Gbps of network bandwidth, starting at $6.4906 per hour.

Pricing

Option           Hourly Rate
On Demand        $6.4906
Spot             $1.363
1-Year Reserved  $4.0891
3-Year Reserved  $2.804

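To put these rates in perspective, a short Python sketch (prices hard-coded from the pricing table above; the page does not state a region, so these are taken at face value) computes each purchase option's discount relative to On Demand:

```python
# Hourly prices for inf2.24xlarge as listed above.
ON_DEMAND = 6.4906

prices = {
    "Spot": 1.363,
    "1-Year Reserved": 4.0891,
    "3-Year Reserved": 2.804,
}

# Discount of each purchase option relative to On Demand, as a percentage.
discounts = {
    option: round((1 - price / ON_DEMAND) * 100, 1)
    for option, price in prices.items()
}

for option, pct in discounts.items():
    print(f"{option}: {pct}% cheaper than On Demand")
```

Spot works out to roughly a 79% discount here, with the 1- and 3-year reservations at about 37% and 57% respectively; Spot rates fluctuate, so treat that figure as a snapshot.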

Family Sizes

Size           vCPUs  Memory (GiB)
inf2.xlarge    4      16
inf2.8xlarge   32     128
inf2.24xlarge  96     384
inf2.48xlarge  192    768
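One pattern worth noting in the size table is the fixed memory-to-vCPU ratio. A quick check (rows hard-coded from the table above) confirms every inf2 size carries 4 GiB per vCPU:

```python
# (size, vCPUs, memory in GiB) rows from the Family Sizes table above.
family = [
    ("inf2.xlarge", 4, 16),
    ("inf2.8xlarge", 32, 128),
    ("inf2.24xlarge", 96, 384),
    ("inf2.48xlarge", 192, 768),
]

# Memory per vCPU for each size; all come out to 4.0 GiB.
ratios = {name: mem / vcpus for name, vcpus, mem in family}
print(ratios)
```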

Instance Variants

Variant
inf1
inf2

Having trouble making sense of your EC2 costs? Check out cur.vantage.sh for an AWS billing code lookup tool.

Instance Details

Compute                                Value
vCPUs                                  96
Memory (GiB)                           384
Memory per vCPU (GiB)                  4
Physical Processor                     AMD EPYC 7R13
Clock Speed (GHz)                      2.95
CPU Architecture                       x86_64
GPUs                                   6
GPU Architecture                       AWS Inferentia2
Video Memory (GiB)                     192
GPU Compute Capability                 0
FPGAs                                  0

Networking                             Value
Network Performance (Gbps)             50
Enhanced Networking                    true
IPv6                                   true
Placement Group

Storage                                Value
EBS Optimized                          true
Max Bandwidth on EBS (Mbps)            30000
Max Throughput on EBS (MB/s)           3750
Max I/O Operations/Second (IOPS)       120000
Baseline Bandwidth on EBS (Mbps)       30000
Baseline Throughput on EBS (MB/s)      3750
Baseline I/O Operations/Second (IOPS)  120000
Devices                                0
Swap Partition                         false
NVMe Drive                             false
Disk Space (GiB)                       0
SSD                                    false
Initialize Storage                     false

Amazon                                 Value
Generation                             current
Instance Type                          inf2.24xlarge
Family                                 Machine Learning ASIC Instances
Name                                   INF2 24xlarge
Elastic Map Reduce (EMR)               false
Docs                                   By Vantage
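The EBS figures above are internally consistent: converting the bandwidth figure from megabits to megabytes per second recovers the throughput figure exactly. As a sketch (the per-operation arithmetic in the last step is my own back-of-the-envelope framing, not from the page):

```python
# EBS figures for inf2.24xlarge from the table above.
EBS_BANDWIDTH_MBPS = 30_000  # megabits per second
EBS_THROUGHPUT_MBS = 3_750   # megabytes per second
EBS_IOPS = 120_000

# 8 bits per byte: the MB/s figure is just the Mbps figure divided by 8.
assert EBS_BANDWIDTH_MBPS / 8 == EBS_THROUGHPUT_MBS

# Average I/O size at which the throughput and IOPS ceilings saturate
# together: 3,750,000 KB/s divided by 120,000 ops/s is about 31 KB per op.
avg_io_kb = EBS_THROUGHPUT_MBS * 1000 / EBS_IOPS
print(f"~{avg_io_kb:.0f} KB per I/O saturates both limits")
```

In practice this means workloads averaging smaller I/Os will hit the IOPS limit first, while larger sequential I/Os will hit the throughput limit first.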