g5.4xlarge
The g5.4xlarge instance is in the GPU instance family, with 16 vCPUs, 64.0 GiB of memory, and up to 25 Gbps of network bandwidth, starting at $1.624 per hour.
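A minimal sketch of pulling the same figures programmatically via the EC2 DescribeInstanceTypes API, assuming boto3 is installed and AWS credentials are configured (the region is illustrative):

```python
# Fetch g5.4xlarge specs from the EC2 API and print the headline figures.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
resp = ec2.describe_instance_types(InstanceTypes=["g5.4xlarge"])
info = resp["InstanceTypes"][0]

print("vCPUs:", info["VCpuInfo"]["DefaultVCpus"])               # 16
print("Memory (GiB):", info["MemoryInfo"]["SizeInMiB"] / 1024)  # 64.0
gpu = info["GpuInfo"]["Gpus"][0]
print("GPU:", gpu["Manufacturer"], gpu["Name"], "x", gpu["Count"])  # NVIDIA A10G x 1
print("Network:", info["NetworkInfo"]["NetworkPerformance"])    # e.g. "Up to 25 Gigabit"
```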
Pricing
Purchase Option | Price |
---|---|
On Demand | $1.624 per hour |
Spot | - |
1 Yr Reserved | - |
3 Yr Reserved | - |
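A quick back-of-the-envelope projection from the $1.624 hourly on-demand rate, assuming the common 730-hour month and no spot or reserved discounts:

```python
# Rough on-demand cost projection for g5.4xlarge.
HOURLY_RATE = 1.624          # USD per hour, on demand
HOURS_PER_MONTH = 730        # 8,760 hours per year / 12 months

monthly = HOURLY_RATE * HOURS_PER_MONTH
annual = HOURLY_RATE * 8760
print(f"Monthly: ${monthly:,.2f}")  # $1,185.52
print(f"Annual:  ${annual:,.2f}")   # $14,226.24
```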
Family Sizes
Size | vCPUs | Memory (GiB) |
---|---|---|
g5.xlarge | 4 | 16 |
g5.2xlarge | 8 | 32 |
g5.4xlarge | 16 | 64 |
g5.8xlarge | 32 | 128 |
g5.12xlarge | 48 | 192 |
g5.16xlarge | 64 | 256 |
g5.24xlarge | 96 | 384 |
g5.48xlarge | 192 | 768 |
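Every size in the family keeps the same 4 GiB-of-memory-per-vCPU ratio, so the table above can be reproduced from the vCPU counts alone; a small illustrative check:

```python
# Rebuild the G5 family size table from the 4 GiB-per-vCPU ratio.
family_vcpus = {
    "g5.xlarge": 4, "g5.2xlarge": 8, "g5.4xlarge": 16, "g5.8xlarge": 32,
    "g5.12xlarge": 48, "g5.16xlarge": 64, "g5.24xlarge": 96, "g5.48xlarge": 192,
}
GIB_PER_VCPU = 4

for size, vcpus in family_vcpus.items():
    print(f"{size:>12}: {vcpus:>3} vCPUs, {vcpus * GIB_PER_VCPU:>3} GiB")
```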
Instance Details
Compute | Value |
---|---|
vCPUs | 16 |
Memory (GiB) | 64.0 |
Memory per vCPU (GiB) | 4.0 |
Physical Processor | AMD EPYC 7R32 |
Clock Speed (GHz) | 2.8 |
CPU Architecture | x86_64 |
GPU | 1 |
GPU Model | NVIDIA A10G |
Video Memory (GiB) | 24 |
GPU Compute Capability | 8.6 |
FPGA | 0 |
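Compute capability 8.6 corresponds to the A10G's Ampere generation; it can be confirmed from inside the instance with a sketch like the one below, assuming a CUDA-enabled PyTorch build is installed:

```python
# Verify the GPU model and compute capability from inside the instance.
import torch

if torch.cuda.is_available():
    name = torch.cuda.get_device_name(0)                 # e.g. "NVIDIA A10G"
    major, minor = torch.cuda.get_device_capability(0)   # (8, 6) on the A10G
    # Reported total memory is slightly below the nominal 24 GiB.
    mem_gib = torch.cuda.get_device_properties(0).total_memory / 2**30
    print(f"{name}: compute capability {major}.{minor}, {mem_gib:.1f} GiB VRAM")
else:
    print("No CUDA device visible")
```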
Networking | Value |
---|---|
Network Performance (Gbps) | Up to 25 |
Enhanced Networking | True |
IPV6 | True |
Placement Group | True |
Storage | Value |
---|---|
EBS Optimized | True |
Max Bandwidth on EBS (Mbps) | 4750 |
Max Throughput on EBS (MB/s) | 593.75 |
Max I/O Operations/second (IOPS) | 20000 |
Baseline Bandwidth on EBS (Mbps) | 4750 |
Baseline Throughput on EBS (MB/s) | 593.75 |
Baseline I/O Operations/second (IOPS) | 20000 |
Devices | 1 |
Swap Partition | False |
NVME Drive | True |
Disk Space (GiB) | 600 |
SSD | True |
Initialize Storage | False |
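The EBS throughput figure is the bandwidth figure converted from megabits to megabytes per second (divide by 8):

```python
# EBS bandwidth (Mbps) to throughput (MB/s): 8 bits per byte.
EBS_BANDWIDTH_MBPS = 4750

throughput_mb_s = EBS_BANDWIDTH_MBPS / 8
print(f"{EBS_BANDWIDTH_MBPS} Mbps / 8 = {throughput_mb_s} MB/s")  # 593.75 MB/s
```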
Amazon | Value |
---|---|
Generation | current |
Instance Type | g5.4xlarge |
Family | GPU instance |
Name | G5 Graphics and Machine Learning GPU Quadruple Extra Large |
Elastic Map Reduce (EMR) | True |