
Verified 2026-05-01

Google Cloud TPU v5p

Google's largest training TPU pod — 3D-torus interconnect, BF16-strong chips, accessible as Cloud TPU.

TPU v5p is Google's flagship training chip, accessible only as a Cloud TPU. Each chip pairs 459 BF16 TFLOPS with 95 GiB of HBM at 2,575 GiB/s. Up to 8,960 chips connect via a 3D-torus inter-chip interconnect at 1,200 GB/s bidirectional per chip.
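The 3D-torus interconnect means every chip has six directly linked neighbors, with links wrapping around at each axis boundary. A minimal sketch of that neighbor relation in plain Python (the 16 x 20 x 28 dimensions are illustrative, chosen only so the chip count multiplies out to 8,960):

```python
def torus_neighbors(coord, dims):
    """Return the six wraparound neighbors of a chip in a 3D torus.

    coord: (x, y, z) position of the chip; dims: torus extent per axis.
    Wraparound is handled by taking each step modulo the axis extent.
    """
    x, y, z = coord
    dx, dy, dz = dims
    return [
        ((x + 1) % dx, y, z), ((x - 1) % dx, y, z),  # +/- x
        (x, (y + 1) % dy, z), (x, (y - 1) % dy, z),  # +/- y
        (x, y, (z + 1) % dz), (x, y, (z - 1) % dz),  # +/- z
    ]

# A corner chip in a hypothetical 16 x 20 x 28 slice (8,960 chips total):
# its "negative" neighbors wrap to the far face of the torus.
print(torus_neighbors((0, 0, 0), (16, 20, 28)))
```

Because every axis wraps, no chip sits on an edge: all 8,960 chips see the same six-neighbor local topology, which is what makes the torus attractive for uniform collective operations.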

Specs

- compute: Google TPU v5p
- topology: 3D torus
Google Cloud TPU flagship. Available exclusively through Google Cloud; there is no on-prem deployment path.