
TrustFinance Global Insights
March 11, 2026
2 min read

Meta has announced the development of four new generations of its custom in-house chips, part of the Meta Training and Inference Accelerator (MTIA) family. This move is a core component of the company's aggressive data center expansion plans to support its growing artificial intelligence workloads.
The social media giant plans to deploy these four new chips within the next two years, a pace significantly faster than typical industry chip cycles. Manufactured by Taiwan Semiconductor, these chips are designed to enhance Meta's AI capabilities. The first new chip, MTIA 300, was deployed recently to train smaller AI models for core ranking and recommendation tasks on platforms like Facebook and Instagram. Future versions will handle more advanced generative AI inference tasks.
By designing its own silicon, Meta aims to optimize price-per-performance across its data centers, reducing its reliance on external vendors. Yee Jiun Song, Meta's Vice President of Engineering, stated this strategy provides greater supply chain diversity and insulates the company from price fluctuations. This vertical integration reflects a broader trend among tech giants to control their AI hardware stack amid massive capital expenditures on data center infrastructure and a highly competitive AI market.
Meta's accelerated chip development signals a strategic push for greater efficiency and control over its AI infrastructure. The successful deployment of these MTIA chips could significantly impact its competitive position and long-term operational costs. Market observers will closely watch the performance benchmarks of these custom chips and their effect on Meta's AI service capabilities.
Q: What are the new Meta chips used for?
A: The chips are designed for AI ranking, recommendations, and generative AI inference workloads, but not for training giant large language models.
Q: Why is Meta building its own AI chips?
A: To improve cost-efficiency, gain more control over its hardware performance, and diversify its silicon supply chain.
Q: What is the timeline for these new chips?
A: The first chip is already deployed, the second is ready for deployment, and two more are expected to be operational in 2027.
Source: Investing.com

An AI-assisted editorial team at TrustFinance curating reliable financial and economic news from verified global sources.