g3.16xlarge
The g3.16xlarge instance is in the GPU instance family with 64 vCPUs, 488.0 GiB of memory, and 25 Gbps of network bandwidth, starting at $4.56 per hour.
Pricing
On Demand
Spot
1 Yr Reserved
3 Yr Reserved
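
The hourly rates behind these terms can also be pulled programmatically. Below is a minimal sketch using boto3 and the AWS Price List (GetProducts) API to fetch the on-demand Linux rate for g3.16xlarge; the region, operating system, and filter values are assumptions and can be adjusted.

```python
# Minimal sketch, assuming boto3 is installed and credentials allow
# pricing:GetProducts. Field names follow the public Price List schema.
import json
import boto3

pricing = boto3.client("pricing", region_name="us-east-1")  # Price List API endpoint

resp = pricing.get_products(
    ServiceCode="AmazonEC2",
    Filters=[
        {"Type": "TERM_MATCH", "Field": "instanceType", "Value": "g3.16xlarge"},
        {"Type": "TERM_MATCH", "Field": "operatingSystem", "Value": "Linux"},
        {"Type": "TERM_MATCH", "Field": "tenancy", "Value": "Shared"},
        {"Type": "TERM_MATCH", "Field": "preInstalledSw", "Value": "NA"},
        {"Type": "TERM_MATCH", "Field": "capacitystatus", "Value": "Used"},
        {"Type": "TERM_MATCH", "Field": "location", "Value": "US East (N. Virginia)"},
    ],
)

# Each PriceList entry is a JSON string; the on-demand term holds the hourly rate.
product = json.loads(resp["PriceList"][0])
for term in product["terms"]["OnDemand"].values():
    for dim in term["priceDimensions"].values():
        print(dim["pricePerUnit"]["USD"], "per", dim["unit"])  # e.g. 4.56 per Hrs
```
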
Family Sizes
Size | vCPUs | Memory (GiB) |
---|---|---|
g3.4xlarge | 16 | 122 |
g3.8xlarge | 32 | 244 |
g3.16xlarge | 64 | 488 |
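
The size, vCPU, and memory figures in this table can be read straight from the EC2 DescribeInstanceTypes API. A minimal sketch with boto3, assuming configured credentials; the region is an assumption.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption
sizes = ["g3.4xlarge", "g3.8xlarge", "g3.16xlarge"]

resp = ec2.describe_instance_types(InstanceTypes=sizes)
for it in sorted(resp["InstanceTypes"], key=lambda x: x["VCpuInfo"]["DefaultVCpus"]):
    mem_gib = it["MemoryInfo"]["SizeInMiB"] / 1024
    print(f'{it["InstanceType"]:<14} {it["VCpuInfo"]["DefaultVCpus"]:>3} vCPUs  {mem_gib:.0f} GiB')
```
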
Instance Details
Compute | Value |
---|---|
vCPUs | 64 |
Memory (GiB) | 488.0 |
Memory per vCPU (GiB) | 7.62 |
Physical Processor | Intel Xeon E5-2686 v4 (Broadwell) |
Clock Speed (GHz) | 2.3 |
CPU Architecture | x86_64 |
GPUs | 4 |
GPU Architecture | NVIDIA Tesla M60 |
Video Memory (GiB) | 32 |
GPU Compute Capability | 5.2 |
FPGA | 0 |
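
From inside a running g3.16xlarge, the GPU count, per-GPU memory, and compute capability listed above can be checked at runtime. A minimal sketch using PyTorch, assuming NVIDIA drivers are installed and the CUDA-enabled PyTorch build in use still supports compute capability 5.2 (Maxwell).

```python
import torch

assert torch.cuda.is_available()
print(torch.cuda.device_count())                # expected: 4 Tesla M60 GPUs
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(props.name,                           # 'Tesla M60'
          props.total_memory // 2**30, "GiB",   # ~8 GiB per GPU (32 GiB total)
          f"compute capability {props.major}.{props.minor}")  # 5.2
```
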
Networking | Value |
---|---|
Network Performance (Gbps) | 25 |
Enhanced Networking | True |
IPV6 | True |
Placement Group | True |
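
Because the instance supports placement groups and enhanced networking (ENA), the full 25 Gbps is most useful between instances launched into the same cluster placement group. A minimal sketch with boto3; the AMI ID is a placeholder and must be an ENA-enabled image, and the region and group name are assumptions.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption

# Cluster placement groups pack instances for low-latency, high-throughput networking.
ec2.create_placement_group(GroupName="g3-cluster", Strategy="cluster")

ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder: substitute an ENA-enabled AMI
    InstanceType="g3.16xlarge",
    MinCount=1,
    MaxCount=1,
    Placement={"GroupName": "g3-cluster"},
)
```
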
Storage | Value |
---|---|
EBS Optimized | True |
Max Bandwidth on EBS (Mbps) | 14000 |
Max Throughput on EBS (MB/s) | 1750.0 |
Max I/O Operations per Second (IOPS) | 80000 |
Baseline Bandwidth on EBS (Mbps) | 14000 |
Baseline Throughput on EBS (MB/s) | 1750.0 |
Baseline I/O Operations per Second (IOPS) | 80000 |
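
These EBS figures are instance-level ceilings aggregated across all attached volumes; a single gp3 volume tops out at 16,000 IOPS and 1,000 MB/s, so reaching 80,000 IOPS or 1,750 MB/s means spreading the load over several volumes. A minimal sketch with boto3 that provisions five gp3 volumes sized to approach the aggregate limits; the volume size, throughput split, and Availability Zone are illustrative assumptions.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
az = "us-east-1a"  # must match the instance's Availability Zone

volume_ids = []
for _ in range(5):  # 5 x 16,000 IOPS = 80,000 IOPS aggregate
    vol = ec2.create_volume(
        AvailabilityZone=az,
        Size=500,            # GiB, illustrative
        VolumeType="gp3",
        Iops=16000,          # gp3 per-volume maximum
        Throughput=350,      # MB/s per volume; 5 x 350 = 1,750 MB/s aggregate
    )
    volume_ids.append(vol["VolumeId"])

print(volume_ids)  # attach and stripe (e.g. RAID 0) on the instance to use the full limit
```
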
Amazon | Value |
---|---|
Generation | current |
Instance Type | g3.16xlarge |
Family | GPU instance |
Name | G3 Graphics GPU 16xlarge |
Elastic Map Reduce (EMR) | True |
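
Since EMR support is listed, g3.16xlarge can be used as a node type in an EMR cluster, for example to give Spark executors access to the M60 GPUs. A minimal sketch with boto3's run_job_flow, assuming the default EMR service and EC2 roles already exist; the cluster name, release label, and instance counts are illustrative.

```python
import boto3

emr = boto3.client("emr", region_name="us-east-1")

emr.run_job_flow(
    Name="g3-spark-cluster",               # illustrative name
    ReleaseLabel="emr-6.15.0",             # illustrative release label
    Applications=[{"Name": "Spark"}],
    Instances={
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"InstanceRole": "CORE", "InstanceType": "g3.16xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": True,
    },
    JobFlowRole="EMR_EC2_DefaultRole",      # default instance profile, assumed to exist
    ServiceRole="EMR_DefaultRole",          # default service role, assumed to exist
)
```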