Meta Platforms continues to develop its own in-house hardware for its push into the artificial intelligence (AI) sector.
Reuters, citing two sources, reported that Meta is testing its first in-house chip for training artificial intelligence systems. This is a key milestone as the company moves to design more of its own custom silicon and reduce its reliance on external suppliers such as Nvidia, which recently reported a profit increase of 130 percent.
It comes after Meta in April 2024 unveiled the second generation of its custom AI processor, MTIA v2 (codenamed Artemis) – an AI inference accelerator. MTIA v2 was designed to work with Meta's existing tech infrastructure, while also supporting future AI advancements.

Image credit Meta Platforms
AI training
Now Reuters has reported that Meta has begun a small deployment of its AI training chip, and plans to ramp up production for wide-scale use if the test goes well, according to the sources.
The push to develop in-house chips is part of a long-term plan at Meta to bring down its huge infrastructure costs, as it banks on AI tools to drive growth.
And it should be noted that AI chips are far from cheap.
Nvidia's GPUs control roughly 80 percent of the market. Nvidia does not disclose the prices of its H100, for example, but each chip can reportedly sell for between $16,000 and $100,000, depending on the volume purchased and other factors.
Meta in 2024 had stated it planned to bring its total stock of H100 chips to 350,000 by the end of that year, demonstrating the hefty financial investment required to compete in this sector, and the value of the organisations supplying these semiconductors.
Looking forward, Meta has forecast total 2025 expenses of $114 billion to $119 billion, including up to $65 billion in capital expenditure largely driven by spending on AI infrastructure, Reuters noted.
One of the sources told Reuters that Meta’s new training chip is a dedicated accelerator, meaning it is designed to handle only AI-specific tasks.
This can make it more power-efficient than the integrated graphics processing units (GPUs) generally used for AI workloads.
In-house AI chips
Meta is not the only firm developing its own AI processors.
Last month it was reported that OpenAI is finalising the design for its first in-house AI accelerator chip and plans to send it to Taiwan’s TSMC for an initial fabrication run, called tape-out, in the next few months.
Also last month it was reported that British chip designer ARM Holdings is developing its own chip, with Meta Platforms secured as one of its first customers.
This marked a major departure from ARM's usual business model of licensing its chip blueprints to other companies.