inf1.2xlarge

The inf1.2xlarge instance is in the Machine Learning ASIC instances family, with 8 vCPUs, 16.0 GiB of memory, and up to 25 Gbps of network bandwidth, starting at $0.362 per hour.


Pricing

On Demand: $0.362 per hour
Spot, 1 Yr Reserved, and 3 Yr Reserved rates are also available.
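As a rough illustration, the on-demand rate converts to approximate monthly and yearly figures (assuming the common ~730 hours-per-month convention; actual billing varies by region and usage):

```python
HOURLY_RATE = 0.362      # inf1.2xlarge on-demand, USD per hour
HOURS_PER_MONTH = 730    # common approximation (365 * 24 / 12)

monthly = HOURLY_RATE * HOURS_PER_MONTH
yearly = HOURLY_RATE * 24 * 365

print(f"Monthly: ${monthly:.2f}")  # ≈ $264.26
print(f"Yearly:  ${yearly:.2f}")   # ≈ $3171.12
```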


Family Sizes

Size            vCPUs   Memory (GiB)
inf1.xlarge         4              8
inf1.2xlarge        8             16
inf1.6xlarge       24             48
inf1.24xlarge      96            192
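Every size in the inf1 family keeps the same 2 GiB-per-vCPU ratio, which makes scaling between sizes predictable. A quick sketch using the figures from the table above:

```python
# vCPU and memory (GiB) figures from the inf1 family table
inf1_sizes = {
    "inf1.xlarge":   (4, 8),
    "inf1.2xlarge":  (8, 16),
    "inf1.6xlarge":  (24, 48),
    "inf1.24xlarge": (96, 192),
}

for name, (vcpus, mem_gib) in inf1_sizes.items():
    print(f"{name}: {mem_gib / vcpus:.1f} GiB per vCPU")  # 2.0 for every size
```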

Instance Details

Compute                     Value
vCPUs                       8
Memory (GiB)                16.0
Memory per vCPU (GiB)       2.0
Physical Processor          Intel Xeon Platinum 8275CL (Cascade Lake)
Clock Speed (GHz)           None
CPU Architecture            x86_64
GPU                         0
GPU Architecture            None
Video Memory (GiB)          0
GPU Compute Capability      0
FPGA                        0
Networking                              Value
Network Performance (Gbps)              Up to 25
Enhanced Networking                     None
IPv6                                    True
Placement Group                         True

Storage                                 Value
EBS Optimized                           True
Max Bandwidth (Mbps) on EBS             4750
Max Throughput (MB/s) on EBS            593.75
Max I/O Operations/second (IOPS)        20000
Baseline Bandwidth (Mbps) on EBS        1190
Baseline Throughput (MB/s) on EBS       148.75
Baseline I/O Operations/second (IOPS)   6000
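The EBS throughput figures above follow directly from the bandwidth figures: dividing megabits per second by 8 gives megabytes per second. A minimal check:

```python
def mbps_to_mb_per_s(mbps: float) -> float:
    """Convert megabits/s to megabytes/s (8 bits per byte)."""
    return mbps / 8

print(mbps_to_mb_per_s(4750))  # max EBS bandwidth -> 593.75 MB/s
print(mbps_to_mb_per_s(1190))  # baseline EBS bandwidth -> 148.75 MB/s
```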
Amazon                      Value
Generation                  current
Instance Type               inf1.2xlarge
Family                      Machine Learning ASIC Instances
Name                        INF1 Double Extra Large
Elastic Map Reduce (EMR)    False