inf2.8xlarge

The inf2.8xlarge instance is in the Machine Learning ASIC instances family, with 32 vCPUs, 128.0 GiB of memory, and up to 25 Gbps of network bandwidth, starting at $1.96786 per hour.
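As a quick worked example, the hourly rate above can be turned into a rough monthly on-demand estimate (a sketch using the common 730-hours-per-month convention; it ignores Spot or Reserved discounts):

```python
# Hourly on-demand rate for inf2.8xlarge, from the figure above
HOURLY_RATE = 1.96786

def monthly_cost(hours_per_month: float = 730.0) -> float:
    """Estimate on-demand monthly cost in USD (730 h is the usual monthly average)."""
    return round(HOURLY_RATE * hours_per_month, 2)
```

Running the instance around the clock for a month comes to roughly $1,437 on demand.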


Pricing

On-demand pricing starts at $1.96786 per hour; Spot, 1 Yr Reserved, and 3 Yr Reserved purchase options are also available.


Family Sizes

Size vCPUs Memory (GiB)
inf2.xlarge 4 16
inf2.8xlarge 32 128
inf2.24xlarge 96 384
inf2.48xlarge 192 768
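The family sizes above scale memory and vCPUs together; a small sketch (size figures taken from the table, including the 32 vCPU / 128 GiB inf2.8xlarge described on this page) confirms the constant 4 GiB-per-vCPU ratio reported under Instance Details:

```python
# inf2 family sizes from the table: size -> (vCPUs, memory in GiB)
SIZES = {
    "inf2.xlarge": (4, 16),
    "inf2.8xlarge": (32, 128),
    "inf2.24xlarge": (96, 384),
    "inf2.48xlarge": (192, 768),
}

def memory_per_vcpu(size: str) -> float:
    """Memory-to-vCPU ratio in GiB per vCPU for a given inf2 size."""
    vcpus, mem_gib = SIZES[size]
    return mem_gib / vcpus
```

Every size in the family works out to 4.0 GiB per vCPU, matching the "Memory per vCPU (GiB) 4.0" row below.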

Instance Variants

Variant
inf1
inf2

Instance Details

Compute Value
vCPUs 32
Memory (GiB) 128.0
Memory per vCPU (GiB) 4.0
Physical Processor AMD EPYC 7R13 Processor
Clock Speed (GHz) 2.95
CPU Architecture x86_64
GPU 1
GPU Architecture AWS Inferentia2
Video Memory (GiB) 32
GPU Compute Capability (?) 0
FPGA 0
Networking Value
Network Performance (Gbps) Up to 25
Enhanced Networking True
IPv6 True
Placement Group (?) True
Storage Value
EBS Optimized True
Max Bandwidth (Mbps) on EBS 10000
Max Throughput (MB/s) on EBS 1250.0
Max I/O Operations/second (IOPS) 40000
Baseline Bandwidth (Mbps) on EBS 10000
Baseline Throughput (MB/s) on EBS 1250.0
Baseline I/O Operations/second (IOPS) 40000
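The EBS bandwidth and throughput rows are two views of the same limit: megabits per second divided by 8 gives megabytes per second. A small sketch checking the figures above:

```python
def mbps_to_mb_per_s(mbps: float) -> float:
    """Convert megabits per second to megabytes per second (1 byte = 8 bits)."""
    return mbps / 8
```

For this instance, 10000 Mbps / 8 = 1250.0 MB/s, which matches both the max and baseline throughput rows.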
Amazon Value
Generation current
Instance Type inf2.8xlarge
Family Machine Learning ASIC Instances
Name INF2 Eight Extra Large
Elastic Map Reduce (EMR) False