AWS · cloud-model
Verified 2026-05-01
AWS Trainium2
Amazon's second-generation training/inference silicon: 96 GB HBM3e per chip; UltraServers scale to 64 chips via NeuronLink.
Trainium2 is AWS's second-generation in-house AI accelerator. Each chip pairs eight third-gen NeuronCores with 96 GB of HBM3e at 2.9 TB/s. Trn2 instances host 16 chips for 20.8 FP8 PFLOPS; Trn2 UltraServers connect 64 chips across four instances for 83.2 FP8 PFLOPS.
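The instance and UltraServer figures above are internally consistent; a minimal sketch (assumed Python, all values taken from the description, not from AWS documentation) checks the arithmetic:

```python
# Values as stated in the text above.
TRN2_INSTANCE_CHIPS = 16
TRN2_INSTANCE_PFLOPS_FP8 = 20.8

# Per-chip FP8 throughput implied by a 16-chip Trn2 instance.
per_chip_pflops = TRN2_INSTANCE_PFLOPS_FP8 / TRN2_INSTANCE_CHIPS

# An UltraServer connects four instances (64 chips) over NeuronLink.
ultraserver_chips = 4 * TRN2_INSTANCE_CHIPS
ultraserver_pflops = per_chip_pflops * ultraserver_chips

print(f"per chip: {per_chip_pflops:.1f} PFLOPS FP8")      # 1.3
print(f"UltraServer: {ultraserver_pflops:.1f} PFLOPS FP8") # 83.2, matching the text
```

The 83.2 PFLOPS UltraServer figure is thus exactly linear scaling of the 16-chip instance across four instances, i.e. it assumes no interconnect overhead.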
Specs
- compute: AWS Trainium2