p2.16xlarge

The p2.16xlarge instance is in the GPU instance family with 64 vCPUs, 732.0 GiB of memory, and 25 Gbps of network bandwidth, starting at $14.40 per hour.
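These figures can also be pulled programmatically from the EC2 API. Below is a minimal sketch using boto3's describe_instance_types call (assuming AWS credentials and a region are configured; the field names shown are from the EC2 API response, but verify them against the SDK documentation for your version).

    import boto3

    # Query the EC2 API for the published specs of p2.16xlarge.
    ec2 = boto3.client("ec2", region_name="us-east-1")
    resp = ec2.describe_instance_types(InstanceTypes=["p2.16xlarge"])
    info = resp["InstanceTypes"][0]

    print(info["VCpuInfo"]["DefaultVCpus"])               # 64 vCPUs
    print(info["MemoryInfo"]["SizeInMiB"] / 1024)         # 732.0 GiB
    gpus = info["GpuInfo"]["Gpus"][0]
    print(gpus["Count"], gpus["Manufacturer"], gpus["Name"])
    print(info["GpuInfo"]["TotalGpuMemoryInMiB"] / 1024)  # total GPU memory, GiB
    print(info["NetworkInfo"]["NetworkPerformance"])      # e.g. "25 Gigabit"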


Pricing

On Demand $14.40 per hour
Spot
1 Yr Reserved
3 Yr Reserved

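As a quick sanity check on the hourly rate quoted above, a rough monthly on-demand cost works out as follows (a sketch only; it assumes the common 730-hour-month approximation and ignores spot or reserved discounts):

    # Rough monthly on-demand cost for p2.16xlarge from the hourly rate above.
    hourly_rate = 14.40           # USD per hour, on demand
    hours_per_month = 730         # 365 * 24 / 12, a common approximation
    monthly = hourly_rate * hours_per_month
    print(f"${monthly:,.2f} per month")   # $10,512.00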


Family Sizes

Size vCPUs Memory (GiB)
p2.xlarge 4 61
p2.8xlarge 32 488
p2.16xlarge 64 732

Instance Details

Compute Value
vCPUs 64
Memory (GiB) 732.0
Memory per vCPU (GiB) 11.44
Physical Processor Intel Xeon E5-2686 v4 (Broadwell)
Clock Speed (GHz) 2.3
CPU Architecture x86_64
GPU 16
GPU Architecture NVIDIA Tesla K80
Video Memory (GiB) 192
GPU Compute Capability 3.7
FPGA 0
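From inside a running instance, the GPU configuration above can be confirmed at runtime. The snippet below is a minimal sketch using PyTorch's torch.cuda helpers (an assumption: it needs a CUDA-enabled PyTorch build, and recent CUDA releases have dropped support for compute capability 3.7, so an older PyTorch/CUDA combination may be required for the K80):

    import torch

    # Enumerate the GPUs visible to CUDA and report name, compute capability,
    # and per-device memory; on p2.16xlarge this should list 16 Tesla K80s.
    n = torch.cuda.device_count()
    print(f"{n} CUDA devices")
    for i in range(n):
        name = torch.cuda.get_device_name(i)                # e.g. "Tesla K80"
        major, minor = torch.cuda.get_device_capability(i)  # expected (3, 7)
        mem_gib = torch.cuda.get_device_properties(i).total_memory / 2**30
        print(f"gpu {i}: {name}, sm_{major}{minor}, {mem_gib:.0f} GiB")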
Networking Value
Network Performance (Gbps) 25
Enhanced Networking True
IPV6 True
Placement Group True
Storage Value
EBS Optimized True
Max Bandwidth on EBS (Mbps) 10000
Max Throughput on EBS (MB/s) 1250.0
Max I/O Operations/second (IOPS) 65000
Baseline Bandwidth on EBS (Mbps) 10000
Baseline Throughput on EBS (MB/s) 1250.0
Baseline I/O Operations/second (IOPS) 65000
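The EBS throughput figures follow directly from the bandwidth figures, since 1 byte is 8 bits; a one-line check:

    # EBS bandwidth (megabits/s) converted to throughput (megabytes/s).
    max_bandwidth_mbps = 10_000
    print(max_bandwidth_mbps / 8)   # 1250.0 MB/s, matching the table above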
Amazon Value
Generation current
Instance Type p2.16xlarge
Family GPU instance
Name P2 General Purpose GPU 16xlarge
Elastic MapReduce (EMR) True
