Datacenter NVIDIA March 2024

NVIDIA DGX B200

The DGX B200 is NVIDIA's turnkey AI server: 8 B200 GPUs connected via NVLink 5.0, with 1,440GB of HBM3e memory in total. It is designed as a self-contained unit for organisations that want to run frontier AI workloads on-premises rather than in cloud infrastructure. A single DGX B200 is priced at approximately $300,000. The DGX line has historically been the entry point for institutions building on-premises AI infrastructure — universities, government labs, and enterprises that cannot or will not send proprietary data to cloud providers. The B200 generation's memory capacity means a single DGX B200 can serve models of up to roughly 700 billion parameters at 16-bit precision.
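The ~700-billion-parameter figure follows from weight memory alone: at 16-bit precision (FP16/BF16) each parameter occupies 2 bytes, so 1,440GB of HBM holds about 720 billion weights before accounting for KV cache and activation overhead. A back-of-envelope sketch:

```python
# Rough capacity estimate for a DGX B200 serving a model in 16-bit precision.
# Weight memory only; real deployments also reserve HBM for KV cache,
# activations, and framework overhead, so practical limits are lower.

TOTAL_HBM_GB = 1440      # 8x B200, 180GB HBM3e each
BYTES_PER_PARAM = 2      # FP16/BF16 weights

max_params_billions = TOTAL_HBM_GB * 1e9 / BYTES_PER_PARAM / 1e9
print(f"~{max_params_billions:.0f}B parameters")  # ~720B
```

Quantizing weights to 8-bit or 4-bit roughly doubles or quadruples this ceiling, which is why the same box is often quoted with much larger serving capacities.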

Specifications

GPUs: 8× B200 SXM
Total GPU memory: 1,440GB HBM3e
GPU interconnect: NVLink 5.0 (1.8 TB/s per GPU)
Network: 8× 400GbE
Storage: 60TB NVMe
CPU: 2× Intel Xeon
System memory: 2.3TB