Why Google Built Two AI Chips Instead of One — Inside the TPU v8 Split
Google's eighth-generation TPU isn't a chip — it's two. The TPU 8t targets training, the TPU 8i targets agentic inference, and the reason for the split tells you where AI hardware is heading. This post builds the story from the silicon up: how CPUs, GPUs, and TPUs actually differ, what a systolic array does, and why one chip can no longer be optimal for both halves of the AI workload.
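As a preview of the systolic-array discussion, here is a toy functional sketch (my own illustration, not Google's design): a weight-stationary grid of multiply-accumulate cells where weights stay put, activations stream across rows, and partial sums accumulate down columns. It captures only the dataflow, ignoring pipelining and cycle timing.

```python
def systolic_matmul(A, B):
    """Compute A @ B by simulating a weight-stationary systolic array.

    Cell (i, j) permanently holds weight B[i][j]. Activations stream
    across the rows of cells; partial sums accumulate down the columns.
    A purely functional sketch: no clocking or pipeline skew modeled.
    """
    n, k = len(A), len(B)
    m = len(B[0])
    C = [[0] * m for _ in range(n)]
    for r in range(n):                  # stream one row of activations at a time
        for i in range(k):              # activation A[r][i] crosses row i of cells
            a = A[r][i]
            for j in range(m):          # each cell adds a * weight to column j's sum
                C[r][j] += a * B[i][j]
    return C
```

Every cell does one multiply-add per step with no cache lookups or instruction decode, which is the core efficiency argument for matrix units over general-purpose cores.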
AI Hardware · TPU · GPU