Google · cloud-model
Verified 2026-05-01
Google Cloud TPU v5p
Google's largest training TPU pod — 3D-torus interconnect, BF16-strong chips, accessible as Cloud TPU.
TPU v5p is Google's flagship training chip, accessible only as a Cloud TPU. Each chip pairs 459 BF16 TFLOPS with 95 GiB of HBM at 2,575 GiB/s. Up to 8,960 chips connect via a 3D-torus inter-chip interconnect at 1,200 GB/s bidirectional per chip.
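The per-chip figures above imply substantial pod-level totals. A minimal sketch, using only the numbers quoted in this page, of the aggregate compute and memory of a full 8,960-chip pod:

```python
# Pod-level aggregates derived from the per-chip specs quoted above.
CHIPS_PER_POD = 8_960        # maximum v5p pod size
BF16_TFLOPS_PER_CHIP = 459   # per-chip BF16 throughput
HBM_GIB_PER_CHIP = 95        # per-chip HBM capacity

# 1 EFLOPS = 1e6 TFLOPS; 1 TiB = 1024 GiB
pod_bf16_eflops = CHIPS_PER_POD * BF16_TFLOPS_PER_CHIP / 1e6
pod_hbm_tib = CHIPS_PER_POD * HBM_GIB_PER_CHIP / 1024

print(f"{pod_bf16_eflops:.2f} BF16 EFLOPS")  # ~4.11 EFLOPS
print(f"{pod_hbm_tib:.2f} TiB HBM")          # 831.25 TiB
```

These are peak, marketing-style aggregates; sustained throughput depends on the workload and interconnect utilization.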
Specs
- compute: Google TPU v5p
- topology: 3D torus
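In a 3D torus, each chip links directly to six neighbors, one in each direction along the three axes, with wrap-around links at the edges so every chip is topologically identical. A minimal sketch of the neighbor relation, assuming a hypothetical 16 x 20 x 28 grid (which multiplies out to 8,960 chips; the actual pod dimensions are not stated on this page):

```python
# Wrap-around neighbors in a 3D torus.
# The 16 x 20 x 28 grid shape is an illustrative assumption
# (16 * 20 * 28 == 8,960), not a figure from this page.
DIMS = (16, 20, 28)

def torus_neighbors(x, y, z, dims=DIMS):
    """Return the six wrap-around neighbors of chip (x, y, z)."""
    nbrs = []
    for axis in range(3):
        for step in (-1, 1):
            coord = [x, y, z]
            # Modulo gives the wrap-around link at each grid edge.
            coord[axis] = (coord[axis] + step) % dims[axis]
            nbrs.append(tuple(coord))
    return nbrs

# A corner chip still has six neighbors thanks to the wrap-around links.
print(torus_neighbors(0, 0, 0))
```

The wrap-around links are why a torus halves worst-case hop count relative to a plain mesh of the same dimensions: traffic can route the short way around each axis.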