
Google Seeks Partnership with Marvell to Develop AI Inference Chips, Accelerating Shift Away from Broadcom

Google is in talks with Marvell to jointly develop two custom AI inference chips: a memory processing unit (MPU) designed to work alongside TPUs, and a new TPU specialized for inference workloads, with a planned production volume of nearly 2 million units. The move is Google's latest step in a systematic effort to reduce its reliance on Broadcom, and NVIDIA's March launch of the LPU has further accelerated Google's strategic timeline. As demand for inference computing power surges in the era of AI agents, this chip race is quietly picking up speed.

