The headline writes itself: Arm makes its first chip. After three decades of licensing designs to the world's biggest technology companies, the Cambridge-born firm held up its own silicon on a stage in San Francisco and dared its partners to keep smiling.
They did, mostly. Meta is the lead partner and co-developer. OpenAI, Cerebras, Cloudflare, SAP, and SK Telecom have all signed on as customers. Nvidia's Jensen Huang, Amazon's James Hamilton, and Google's Amin Vahdat appeared in pre-taped video testimonials. The chip is called the AGI CPU; it packs 136 Neoverse V3 cores fabbed by TSMC at 3nm, and Arm says it will reach full production in the second half of this year.
But the real story is not that Arm built a chip. The real story is why the world's most successful design licensor decided that licensing alone could no longer capture enough value from the AI buildout, and what that tells us about where the semiconductor industry is actually headed.
The licensing trap
Arm's business model has been elegant for decades. Design once, license everywhere. Apple, Qualcomm, Samsung, Amazon, Nvidia, Microsoft, Tesla: all of them pay Arm for the privilege of building on its architecture. Arm collects royalties on every chip shipped. By the 2010s, Arm-based processors ran in virtually every smartphone on the planet.
That model worked because the phone market was enormous and fragmented. Hundreds of manufacturers needed Arm designs. Nobody wanted to build from scratch.
AI data centers are different. The customers are fewer and richer. They are already designing their own chips: Google has TPUs, Amazon has Graviton and Trainium, Meta has MTIA. When your biggest licensees start doing in-house silicon, the royalty math changes. You are still collecting a fee, but you are watching other companies capture the margins on the hardware that runs the most valuable workloads in computing.
CEO Rene Haas put it plainly at the launch event: "Let me be clear: We are now in a new business for Arm, and we are supplying CPUs." He framed it as customer demand. That is true, but incomplete. Arm is also chasing a market that its own licensing model was never built to fully monetize.
Who benefits
The immediate winners are the companies too small or too focused to design their own data center CPUs. Arm's cloud AI head Mohamed Awad told CNBC the goal is to serve companies that cannot afford in-house processors. That is a real gap. Not every AI startup or enterprise can do what Google and Amazon do.
Meta benefits too, but differently. The company is both a customer and a co-developer, committed to "multiple generations" of the AGI CPU roadmap according to Arm's official announcement. Meta's infrastructure chief Santosh Janardhan appeared on stage and talked about needing more silicon for "personal superintelligence," the deeply personalized AI the company is building into its apps. Meta is also hedging with its own chip program. Partnering with Arm on CPU design while continuing to develop MTIA for inference and training gives Meta a two-track approach to silicon.



