Chip · NVIDIA · November 2023
NVIDIA H200 SXM
The H200 is NVIDIA's incremental upgrade to the H100, released in late 2023. The compute specifications are identical (it uses the same Hopper GPU die), but the memory system is substantially upgraded: 141GB of HBM3e versus the H100's 80GB, with memory bandwidth rising from 3.35 TB/s to 4.8 TB/s. For large-model inference, where memory capacity and bandwidth, not raw compute, are often the bottleneck, this is a meaningful improvement: the H200 can hold larger models, longer contexts, or larger batch sizes on a single GPU before model parallelism becomes necessary. It became a primary GPU for hyperscale inference workloads through 2024.
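The capacity difference can be made concrete with a back-of-envelope memory budget: weights plus KV cache must fit in HBM to avoid model parallelism. A minimal sketch, using a hypothetical 60B-parameter FP16 model with assumed GQA shapes (the model sizes and shapes below are illustrative assumptions, not measurements):

```python
# Back-of-envelope check: can a dense FP16 model be served on one GPU
# without model parallelism? All model shapes here are assumptions.

GB = 1e9  # vendors quote HBM capacity in decimal gigabytes

def weight_bytes(n_params, bytes_per_param=2):
    # FP16/BF16 weights: 2 bytes per parameter
    return n_params * bytes_per_param

def kv_cache_bytes(n_layers, n_kv_heads, head_dim, seq_len, batch,
                   bytes_per_val=2):
    # One K and one V tensor per layer, per token, per sequence in the batch
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * batch * bytes_per_val

# Hypothetical 60B-parameter model with grouped-query attention:
weights = weight_bytes(60e9)                                   # 120 GB
kv = kv_cache_bytes(n_layers=80, n_kv_heads=8, head_dim=128,
                    seq_len=4096, batch=8)                     # ~10.7 GB

fits = {}
for name, capacity in [("H100 80GB", 80 * GB), ("H200 141GB", 141 * GB)]:
    fits[name] = weights + kv <= capacity
    print(f"{name}: need {(weights + kv) / GB:.1f} GB -> fits: {fits[name]}")
```

Under these assumed shapes the model spills past 80GB but fits comfortably in 141GB, which is exactly the regime where the H200's extra capacity removes a parallelism requirement.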
Specifications
Architecture: Hopper | Process: TSMC 4N | FP16 Tensor: 1,979 TFLOPS (with sparsity) | FP8 Tensor: 3,958 TFLOPS (with sparsity) | Memory: 141GB HBM3e | Bandwidth: 4.8 TB/s | TDP: 700W | Interconnect: NVLink 4.0 (900 GB/s)
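The bandwidth figures above translate directly into a decode-throughput ceiling: in memory-bound autoregressive generation, every output token must stream the full weight set from HBM at least once, so per-sequence tokens/s is bounded by bandwidth divided by weight bytes. A roofline-style sketch, again assuming a hypothetical 60B-parameter FP16 model (ignoring KV-cache reads and any compute limits):

```python
# Roofline-style upper bound for memory-bound autoregressive decode.
# Assumption: each generated token streams all weights from HBM once,
# so tokens/s per sequence <= bandwidth / weight_bytes. Illustrative only.

def decode_ceiling_tokens_per_s(bandwidth_bytes_per_s, weight_bytes):
    return bandwidth_bytes_per_s / weight_bytes

w = 60e9 * 2  # hypothetical 60B-parameter model in FP16 (assumption)

h100 = decode_ceiling_tokens_per_s(3.35e12, w)  # H100: 3.35 TB/s
h200 = decode_ceiling_tokens_per_s(4.8e12, w)   # H200: 4.8 TB/s

print(f"H100 ceiling: {h100:.1f} tok/s, H200 ceiling: {h200:.1f} tok/s")
print(f"speedup: {h200 / h100:.2f}x")  # tracks the 4.8 / 3.35 bandwidth ratio
```

The ~1.43x ceiling improvement follows purely from the bandwidth ratio, which is why the H200's gains show up most clearly in bandwidth-bound serving rather than in compute-bound training.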