Meta and Broadcom: Pioneering the Future of AI Chips
Meta Platforms and Broadcom announced on Tuesday an expanded multi-year strategic partnership to co-develop successive generations of Meta's custom AI accelerator chips, with a roadmap extending through 2029 and an initial deployment commitment exceeding one gigawatt of computing capacity — enough to power roughly 750,000 U.S. homes.
The deal, which Broadcom described as "the first phase of a sustained, multi-generation roadmap," will leverage Broadcom's XPU custom accelerator platform for chip design, advanced packaging, and networking across multiple silicon generations. Broadcom confirmed that the next MTIA chips will be the first custom AI silicon in the industry to use a 2-nanometer manufacturing process.
A Deepening Silicon Alliance
Meta CEO Mark Zuckerberg framed the partnership as essential to the company's ambitions. "Meta is collaborating with Broadcom on chip design, packaging, and networking to build out the massive computing foundation we need to deliver personal superintelligence to billions of people," Zuckerberg said in a statement.
As part of the arrangement, Broadcom CEO Hock Tan will step down from Meta's board of directors when his term expires at the company's annual meeting, transitioning into an advisory role. Tan had served on Meta's board since 2024. In a securities filing disclosed this week, Meta revealed it paid Broadcom $2.3 billion in 2025 — a rare glimpse into the financial scale of such chip design relationships.
Four Chip Generations in Two Years
The partnership builds on Meta's aggressive custom silicon roadmap, unveiled in March, which calls for four generations of MTIA chips — the 300, 400, 450, and 500 — to be deployed within roughly two years. The MTIA 300, optimized for ranking and recommendation systems, is already in production with hundreds of thousands of units deployed across Facebook and Instagram. The MTIA 400, 450, and 500 will progressively shift toward generative AI inference workloads, with mass deployments planned through 2027.
Meta builds MTIA chips on the open-source RISC-V architecture, with TSMC handling fabrication. Each successive generation brings substantial gains: from the MTIA 300 to the MTIA 500, HBM bandwidth is set to increase 4.5 times and compute performance 25 times.
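For a rough sense of what those headline figures imply generation over generation, the following sketch assumes (purely as an illustration; Meta has not published per-generation numbers) that gains are spread evenly across the three steps from the MTIA 300 to the 400, 450, and 500:

```python
# Illustrative only: the 4.5x HBM bandwidth and 25x compute figures are
# from the announced roadmap, but the even per-step split is an assumption.
hbm_bandwidth_gain = 4.5   # MTIA 500 vs. MTIA 300, per the roadmap
compute_gain = 25.0        # MTIA 500 vs. MTIA 300, per the roadmap
steps = 3                  # 300 -> 400 -> 450 -> 500

per_step_bandwidth = hbm_bandwidth_gain ** (1 / steps)  # ~1.65x per generation
per_step_compute = compute_gain ** (1 / steps)          # ~2.92x per generation

print(f"Implied per-generation HBM bandwidth gain: {per_step_bandwidth:.2f}x")
print(f"Implied per-generation compute gain: {per_step_compute:.2f}x")
```

Under that uniform-scaling assumption, each generation would need to roughly triple compute while lifting memory bandwidth by about two-thirds, a ratio consistent with the stated shift toward generative AI inference workloads, which tend to be compute-heavy.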
The Broader AI Hardware Shift
The Broadcom deal is part of a broader push by Meta to diversify its AI infrastructure beyond Nvidia GPUs. Meta has also announced commitments for six gigawatts of AMD GPUs, millions of Nvidia chips, and custom processors designed with Arm Holdings, alongside capacity rented from cloud providers CoreWeave and Nebius. Meta's capital expenditure could reach between $115 billion and $135 billion in 2026.
Broadcom has separately expanded its collaboration with Google through a long-term agreement running until 2031 covering future generations of TPUs, underscoring its growing role as the chip design partner of choice for hyperscalers building their own AI hardware.
