Form Factor
4U rack-mount server
CPU
Dual AMD EPYC 7763 CPUs (64 cores each, Milan)
Memory
Up to 8TB DDR4-3200 (32x DIMM slots)
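The 8TB ceiling follows directly from the slot count; a quick sanity check, assuming 256GB DDR4-3200 LRDIMMs (the largest commonly available modules) in every slot:

```python
# Sanity-check the advertised memory ceiling: 32 slots fully populated.
# The 256 GB module size is an assumption, not a line from the spec.
dimm_slots = 32
dimm_capacity_gb = 256
total_tb = dimm_slots * dimm_capacity_gb / 1024
print(f"Maximum memory: {total_tb:.0f} TB")   # -> 8 TB
```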
Expansion PCIe
8x PCIe 4.0 x16 slots (for NVIDIA A100 PCIe 80GB GPUs), 2x PCIe 4.0 x8 slots
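For context, a rough estimate of the host bandwidth available to the GPU slots, assuming the nominal PCIe 4.0 signalling rate and 128b/130b encoding; real-world throughput is somewhat lower:

```python
# Approximate one-direction bandwidth per PCIe 4.0 x16 slot and across
# all eight GPU slots combined.
lanes = 16
gt_per_lane = 16                 # GT/s per lane, PCIe 4.0
encoding = 128 / 130             # 128b/130b line-coding overhead
per_slot_gbs = lanes * gt_per_lane * encoding / 8
print(f"Per x16 slot: ~{per_slot_gbs:.1f} GB/s")      # ~31.5 GB/s
print(f"Eight slots:  ~{8 * per_slot_gbs:.0f} GB/s")  # ~252 GB/s
```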
Storage
Up to 24x 2.5"/3.5" SATA/SAS/NVMe hot-swap bays
Network Interfaces
- Supports NVIDIA ConnectX-7 NICs (200GbE/400GbE) or NVIDIA Quantum InfiniBand (link rate can be verified as in the sketch after this list)
- 2x 1GbE LAN ports (management)
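A minimal sketch for confirming that a ConnectX-7 port has linked at the expected rate on a Linux host; the interface name is a placeholder and will differ per system:

```python
# Read the negotiated link speed from sysfs (Linux reports it in Mb/s).
from pathlib import Path

iface = "enp65s0f0"   # placeholder: substitute the actual ConnectX-7 interface name
speed_mbps = int(Path(f"/sys/class/net/{iface}/speed").read_text())
print(f"{iface}: {speed_mbps // 1000} Gb/s")   # expect 200 or 400 on a healthy link
```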
Performance Features
- Secured three leading results in the MLPerf Training v2.0 benchmarks (2022) across AI models including BERT and UNET3D
- Supports the NVIDIA HGX A100 platform with 8x GPUs for large-scale AI training (see the multi-GPU training sketch after this list)
- High-density GPU and storage support for deep learning and HPC
- Scalable for AI data pipelines
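As a sketch of how the eight GPUs are typically driven together, the following is a minimal PyTorch DistributedDataParallel loop; the model and data are toy placeholders, not the MLPerf BERT/UNET3D workloads cited above.

```python
# Minimal data-parallel training sketch across eight A100s, using PyTorch
# DistributedDataParallel with the NCCL backend.
# Launch: torchrun --nproc_per_node=8 train_sketch.py
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    dist.init_process_group("nccl")              # one process per GPU, NCCL collectives
    local_rank = int(os.environ["LOCAL_RANK"])   # set by torchrun: 0..7 on this box
    torch.cuda.set_device(local_rank)

    model = DDP(torch.nn.Linear(1024, 1024).cuda(local_rank), device_ids=[local_rank])
    opt = torch.optim.SGD(model.parameters(), lr=0.01)

    for _ in range(10):                          # placeholder training loop
        x = torch.randn(64, 1024, device=f"cuda:{local_rank}")
        loss = model(x).square().mean()
        opt.zero_grad()
        loss.backward()                          # gradients all-reduced across GPUs
        opt.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

torchrun starts one process per GPU and sets LOCAL_RANK for each, so the same script scales from a single node to multiple nodes without code changes.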
Management
- ASUS Control Center (ACC) for server management (an out-of-band monitoring sketch follows this list)
- Integrates with NVIDIA AI Enterprise for AI workflows
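Alongside ACC, a hedged sketch of out-of-band health polling, assuming the onboard BMC exposes a standard DMTF Redfish service on the management LAN port; the address and credentials are placeholders:

```python
# Poll power state and overall health via Redfish; self-signed BMC
# certificates are skipped with verify=False for brevity.
import requests

BMC = "https://10.0.0.10"        # placeholder: BMC address on the 1GbE mgmt port
AUTH = ("admin", "password")     # placeholder credentials

systems = requests.get(f"{BMC}/redfish/v1/Systems", auth=AUTH, verify=False).json()
system_path = systems["Members"][0]["@odata.id"]   # first (usually only) system
system = requests.get(f"{BMC}{system_path}", auth=AUTH, verify=False).json()
print(system.get("PowerState"), system.get("Status", {}).get("Health"))
```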
Power Supply
- Redundant, hot-swappable 2200W power supplies (see the power-budget sketch after this list)
- Optimized for AI data center efficiency
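A rough power-budget estimate under nominal TDPs (300W per A100 PCIe 80GB, 280W per EPYC 7763); the allowance for memory, drives, NICs, and fans is an assumption, not a vendor figure:

```python
# Back-of-the-envelope system power budget; redundancy requires
# additional PSUs on top of the count printed here.
gpu_w = 8 * 300                  # A100 PCIe 80GB: 300 W TDP each
cpu_w = 2 * 280                  # EPYC 7763: 280 W TDP each
other_w = 600                    # assumption: DIMMs, drives, NICs, cooling
total_w = gpu_w + cpu_w + other_w
psus_needed = -(-total_w // 2200)    # ceiling division over 2200 W units
print(f"Estimated load: ~{total_w} W; 2200 W PSUs to carry it: {psus_needed}")
```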
Target Market
Hyperscalers, enterprises, and research institutions running large-scale AI workloads