GOOGL
Google is strengthening its semiconductor footprint by unveiling an eighth generation of Tensor Processing Units (TPUs) engineered for artificial intelligence. This generation introduces a distinction between two processor types: one dedicated to model training and the other to real-world execution. Both chips are expected to be available by the end of 2026. This strategic move comes as part of a broader effort to meet the surging demand for AI agent development and to optimize performance for specific use cases.
Esteban Tesson
Published on 04/22/2026 at 09:11 am EDT
This approach aligns with a broader trend within the tech sector, where players such as Amazon, Microsoft, Apple and Meta are also developing proprietary silicon. Google, a pioneer in the field since 2015, is highlighting significant gains with this new generation, notably a 2.8x performance increase for training at equivalent cost and an 80% improvement in inference. The TPU 8i chip stands out with SRAM memory increased to 384 MB per unit, enabling the rapid and simultaneous execution of millions of AI agents.

Despite this momentum, Nvidia remains the market leader, even as Google emphasizes the growing adoption of its solutions. Companies such as Citadel Securities and the US Department of Energy laboratories are already using these technologies, while Anthropic plans to leverage several gigawatts of TPU capacity. According to analysts, the combined entity formed by TPU operations and Google DeepMind could reach a valuation of approximately $900bn, illustrating the economic potential of these investments.