Google is reportedly in discussions with semiconductor company Marvell Technology to develop two new artificial intelligence (AI) chips aimed at strengthening its infrastructure for AI inference workloads.
The talks are said to be at an early stage, and no formal agreement has been finalised. Sources indicate the potential collaboration is focused on meeting Google’s growing demand for AI inference capacity, the computing power required to generate responses from trained AI models. This is distinct from AI training, which involves building and refining the models themselves.
As per the report, one of the proposed chips is expected to function as a memory processing unit designed to work alongside Google’s existing Tensor Processing Units (TPUs). The second chip is being considered as a new TPU specifically optimised for AI inference tasks, which are increasingly important as AI-driven services expand globally.
The reported initiative would not replace Google’s existing hardware roadmap. The company recently introduced “Ironwood,” its seventh-generation TPU built for inference workloads. The system is capable of scaling up to thousands of liquid-cooled chips connected through a high-speed interconnect network, designed to handle large-scale AI computing demands.
Even with a Marvell partnership, Google is expected to continue its existing collaborations with major chip partners, including Broadcom, MediaTek, and TSMC. The potential deal is viewed as a step to diversify its semiconductor supply chain rather than a replacement for current arrangements.
Industry observers suggest that the move reflects the broader competition among global technology firms to secure advanced AI computing infrastructure. As demand for generative AI and large-scale machine learning applications continues to rise, companies are increasingly investing in specialised chips to improve performance, efficiency, and scalability across their AI ecosystems.