inf2.48xlarge

The inf2.48xlarge instance is in the Machine Learning ASIC instances family, with 192 vCPUs, 768.0 GiB of memory, and 100 Gbps of network bandwidth, starting at $12.98127 per hour.
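The same headline figures can be read back from the EC2 API. Below is a minimal boto3 sketch, assuming AWS credentials and a default region are already configured in the environment, that prints the published vCPU, memory, and network figures for inf2.48xlarge.

```python
# Minimal sketch: read the published specs for inf2.48xlarge from the EC2 API.
# Assumes boto3 is installed and AWS credentials/region are configured.
import boto3

ec2 = boto3.client("ec2")
resp = ec2.describe_instance_types(InstanceTypes=["inf2.48xlarge"])
it = resp["InstanceTypes"][0]

print("vCPUs:  ", it["VCpuInfo"]["DefaultVCpus"])                 # expected: 192
print("Memory: ", it["MemoryInfo"]["SizeInMiB"] / 1024, "GiB")    # expected: 768.0
print("Network:", it["NetworkInfo"]["NetworkPerformance"])        # expected: 100 Gigabit
```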


Pricing

On Demand: $12.98127 per hour
Spot
1 Yr Reserved
3 Yr Reserved
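On Demand rates can also be retrieved programmatically from the AWS Price List API. The sketch below is one way to do it with boto3; the Pricing API is served from us-east-1, and the filter values (Linux, shared tenancy, the "US East (N. Virginia)" location string) are assumptions that may need adjusting for other regions or platforms.

```python
# Sketch: look up the on-demand price for inf2.48xlarge (Linux, shared tenancy)
# from the AWS Price List API. Filter values below are assumptions.
import json
import boto3

pricing = boto3.client("pricing", region_name="us-east-1")
resp = pricing.get_products(
    ServiceCode="AmazonEC2",
    Filters=[
        {"Type": "TERM_MATCH", "Field": "instanceType", "Value": "inf2.48xlarge"},
        {"Type": "TERM_MATCH", "Field": "operatingSystem", "Value": "Linux"},
        {"Type": "TERM_MATCH", "Field": "tenancy", "Value": "Shared"},
        {"Type": "TERM_MATCH", "Field": "preInstalledSw", "Value": "NA"},
        {"Type": "TERM_MATCH", "Field": "capacitystatus", "Value": "Used"},
        {"Type": "TERM_MATCH", "Field": "location", "Value": "US East (N. Virginia)"},
    ],
)

# Each PriceList entry is a JSON document describing one product and its terms.
for item in resp["PriceList"]:
    product = json.loads(item)
    for term in product["terms"]["OnDemand"].values():
        for dim in term["priceDimensions"].values():
            print(dim["description"], dim["pricePerUnit"]["USD"])
```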


Family Sizes

Size            vCPUs   Memory (GiB)
inf2.xlarge     4       16
inf2.8xlarge    32      128
inf2.24xlarge   96      384
inf2.48xlarge   192     768
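Every size in the family keeps the same memory-to-vCPU ratio. A quick Python check over the figures in the table above makes that explicit.

```python
# Quick check: memory per vCPU across the inf2 family (figures from the table above).
family = {
    "inf2.xlarge":   (4,   16),
    "inf2.8xlarge":  (32,  128),
    "inf2.24xlarge": (96,  384),
    "inf2.48xlarge": (192, 768),
}

for size, (vcpus, mem_gib) in family.items():
    print(f"{size}: {mem_gib / vcpus:.1f} GiB per vCPU")  # prints 4.0 for every size
```

All four sizes come out at 4.0 GiB per vCPU, matching the "Memory per vCPU (GiB)" figure in the instance details below.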

Instance Variants

Variant
inf1
inf2

Instance Details

Compute Value
vCPUs 192
Memory (GiB) 768.0
Memory per vCPU (GiB) 4.0
Physical Processor AMD EPYC 7R13 Processor
Clock Speed (GHz) 2.95
CPU Architecture x86_64
GPU 12
GPU Architecture AWS Inferentia2
Video Memory (GiB) 384
GPU Compute Capability 0
FPGA 0
Networking Value
Network Performance (Gbps) 100
Enhanced Networking True
IPV6 True
Placement Group True
Storage Value
EBS Optimized True
Max Bandwidth (Mbps) on EBS 60000
Max Throughput (MB/s) on EBS 7500.0
Max I/O Operations/second (IOPS) 240000
Baseline Bandwidth (Mbps) on EBS 60000
Baseline Throughput (MB/s) on EBS 7500.0
Baseline I/O Operations/second (IOPS) 240000
Amazon Value
Generation current
Instance Type inf2.48xlarge
Family Machine Learning ASIC Instances
Name INF2 48xlarge
Elastic Map Reduce (EMR) False
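The storage rows above define the per-instance EBS envelope; the 60,000 Mbps bandwidth figure is the same limit expressed as 7,500 MB/s of throughput. The sketch below checks a purely hypothetical gp3 volume layout against those limits.

```python
# Sanity-check a hypothetical gp3 volume layout against the per-instance
# EBS limits listed above for inf2.48xlarge.
INSTANCE_MAX_THROUGHPUT_MBS = 7500.0   # Max/Baseline Throughput (MB/s) on EBS
INSTANCE_MAX_IOPS = 240_000            # Max/Baseline IOPS on EBS

# Hypothetical layout: 4 gp3 volumes, each provisioned at 1,000 MB/s and 16,000 IOPS.
volumes = [{"throughput_mbs": 1000, "iops": 16_000}] * 4

total_throughput = sum(v["throughput_mbs"] for v in volumes)
total_iops = sum(v["iops"] for v in volumes)

print(f"Throughput: {total_throughput} / {INSTANCE_MAX_THROUGHPUT_MBS} MB/s")
print(f"IOPS:       {total_iops} / {INSTANCE_MAX_IOPS}")
assert total_throughput <= INSTANCE_MAX_THROUGHPUT_MBS
assert total_iops <= INSTANCE_MAX_IOPS
```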