Chip · NVIDIA · September 2022

NVIDIA H100 SXM5

The H100 is the chip that defined the AI infrastructure buildout of 2023–2024. Based on the Hopper architecture (80 billion transistors, TSMC 4N process), the H100 SXM5 delivers up to 1,979 TFLOPS of FP16 tensor compute (3,958 TFLOPS at FP8, with sparsity) and 3.35 TB/s of HBM3 memory bandwidth. Its Transformer Engine, which dynamically mixes FP8 and FP16 precision across transformer layers to maximise throughput without losing accuracy, made it the GPU of choice for training and serving large language models. A single H100 SXM5 costs approximately $30,000–$40,000. The H100 became a geopolitically significant object: the US government restricted its export to China in October 2022 and tightened those restrictions in October 2023, making NVIDIA GPU access a proxy for national AI capability. Data centres acquiring H100s in 2023 and 2024 spent billions of dollars; Microsoft, Google, Meta, and Amazon each deployed tens of thousands of units.

Specifications

Architecture: Hopper | Process: TSMC 4N | Transistors: 80B | FP16 Tensor: 1,979 TFLOPS (sparse) | FP8 Tensor: 3,958 TFLOPS (sparse) | Memory: 80GB HBM3 | Bandwidth: 3.35 TB/s | TDP: 700W | Interconnect: NVLink 4.0 (900 GB/s)
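The compute-to-bandwidth ratio implied by these specifications is what determines whether a workload on the H100 is limited by the tensor cores or by HBM3. A minimal sketch of that back-of-envelope calculation, using the standard roofline-model "ridge point" (peak FLOPS divided by peak bandwidth; the constant names and function are illustrative, not from any NVIDIA API):

```python
# Roofline ridge point for the H100 SXM5, from the datasheet numbers above.
# A kernel whose arithmetic intensity (FLOPs per byte moved from HBM) falls
# below the ridge point is bandwidth-bound; above it, compute-bound.

PEAK_FP16_TFLOPS = 1979    # FP16 tensor throughput, with sparsity
PEAK_FP8_TFLOPS = 3958     # FP8 tensor throughput, with sparsity
HBM3_BANDWIDTH_TBPS = 3.35 # HBM3 memory bandwidth, TB/s

def ridge_point(peak_tflops: float, bandwidth_tbps: float) -> float:
    """Arithmetic intensity (FLOPs/byte) at which compute and memory
    bandwidth are exactly balanced. TFLOPS / (TB/s) reduces to FLOPs/byte."""
    return peak_tflops / bandwidth_tbps

print(f"FP16 ridge point: ~{ridge_point(PEAK_FP16_TFLOPS, HBM3_BANDWIDTH_TBPS):.0f} FLOPs/byte")
print(f"FP8 ridge point:  ~{ridge_point(PEAK_FP8_TFLOPS, HBM3_BANDWIDTH_TBPS):.0f} FLOPs/byte")
```

The FP8 ridge point works out to roughly 1,180 FLOPs per byte, which is why large matrix multiplications saturate the tensor cores while low-intensity operations such as single-token decoding are bound by the 3.35 TB/s of memory bandwidth instead.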