Chip · Google · August 2023

Google TPU v5e

Google's Tensor Processing Units are application-specific integrated circuits (ASICs) designed from the ground up for matrix multiplication workloads in neural networks. The TPU v5e, announced in August 2023, is Google's efficiency-focused variant — designed for large-scale training and inference at lower cost per operation than its high-performance sibling, the v5p. TPUs do not compete on raw peak FLOPS with NVIDIA GPUs; they compete on total cost of ownership for specific workloads. Google trains Gemini on TPUs. The existence of a competitive internal accelerator is why Google is less dependent on NVIDIA than Microsoft or Meta — a structural advantage in the AI infrastructure arms race. TPUs are available via Google Cloud but not sold as hardware.

Specifications

Architecture: Custom ASIC
Process: Custom
BF16 performance: 197 TFLOPS per chip
Memory: 16 GB HBM
Interconnect: Inter-Chip Interconnect (ICI)
Form factor: Cloud-only (not purchasable)
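One way to make the 197 TFLOPS BF16 figure concrete is a back-of-envelope bound on matrix-multiply time. The sketch below is illustrative arithmetic, not a benchmark: function names are made up here, and real kernels never reach 100% of peak, so measured times will be higher.

```python
# Theoretical lower bound on matmul time at the v5e's quoted
# 197 TFLOPS BF16 peak (per chip, from the spec sheet above).
PEAK_BF16_FLOPS = 197e12

def matmul_flops(m: int, n: int, k: int) -> float:
    """FLOPs for an (m x k) @ (k x n) matmul:
    one multiply and one add per output element per k step."""
    return 2.0 * m * n * k

def min_time_seconds(m: int, n: int, k: int,
                     peak: float = PEAK_BF16_FLOPS) -> float:
    """Best-case time assuming the chip runs at full peak throughput."""
    return matmul_flops(m, n, k) / peak

t = min_time_seconds(4096, 4096, 4096)
print(f"{t * 1e3:.2f} ms")  # ~0.70 ms at peak
```

This kind of compute-bound estimate is the starting point for the cost-per-operation comparisons the article mentions: what matters for total cost of ownership is how close real workloads get to that peak, and at what price.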