Bhushan is a FinTech enthusiast with a flair for understanding financial markets. His interest in economics and finance drew his attention to the emerging blockchain technology and cryptocurrency markets. He is continuously learning and stays motivated by sharing the knowledge he acquires. In his free time he reads thriller fiction novels and sometimes explores his culinary skills.
The new Meta Training and Inference Accelerator (MTIA) chip will specifically handle “inference”, the stage at which an already-trained AI model makes a prediction or takes an action.
For the very first time, Facebook parent Meta Platforms Inc (NASDAQ: META) has publicly announced that it is working on custom computer chips that will help the company with its artificial intelligence and video-processing tasks.
Ahead of a virtual event on Thursday, May 18, Meta disclosed its internal silicon chip projects while discussing the company’s investments in AI technical infrastructure. Investors have been keen to learn about Meta’s AI investments and the related data center hardware.
Over the last year, Meta has announced multiple rounds of layoffs as it streamlines operations during what it calls “a year of efficiency”. Designing and building in-house chips, however, is an expensive path for any company to undertake.
But Meta’s vice president of infrastructure, Alexis Bjorlin, believes the improved performance will justify the investment. Meta has recently been overhauling its data center designs to focus on energy-efficient techniques, such as liquid cooling, for managing excess heat.
For example, one of the company’s new chips, the Meta Scalable Video Processor (MSVP), will process and transmit video to users while cutting down on energy requirements. Bjorlin said there were no commercially available options that could process and deliver 4 billion videos a day as efficiently as Meta wants.
Another processor, the Meta Training and Inference Accelerator (MTIA), is the first in Meta’s family of chips aimed at AI-specific tasks. The MTIA chip will specifically handle “inference”, the stage at which an already-trained AI model makes a prediction or takes an action.
This inference chip will power Meta’s recommendation algorithms, which decide what content and ads appear in people’s feeds. Bjorlin did not name the manufacturer, but the blog post notes that the chip is fabricated on TSMC’s 7nm process.
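For readers unfamiliar with the term, the difference between training and inference can be sketched in a few lines of plain Python. This is purely an illustrative toy (a one-weight linear model), not Meta's actual recommendation workload: training fits the model's weights once, while inference repeatedly applies those frozen weights to new inputs, which is the workload MTIA is built to accelerate.

```python
# Toy illustration of "training" vs "inference".
# Training: fit a single weight w for y ~ w * x via least squares.
# Inference: apply the already-fitted weight to new inputs.

def train(samples):
    """Fit w minimizing squared error over (x, y) pairs: w = sum(xy)/sum(xx)."""
    num = sum(x * y for x, y in samples)
    den = sum(x * x for x, _ in samples)
    return num / den

def infer(w, x):
    """Inference step: a cheap, fixed computation using the trained weight."""
    return w * x

if __name__ == "__main__":
    # Training happens once, on historical data (here, y = 2x exactly).
    w = train([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)])
    # Inference then runs billions of times on fresh inputs.
    print(infer(w, 10.0))  # 20.0
```

In production systems the "model" has billions of weights rather than one, but the asymmetry is the same: training is done occasionally and expensively, while inference is the high-volume serving path worth building dedicated silicon for.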
Meta’s Multi-Generational Roadmap
Bjorlin said that Meta has a “multi-generational roadmap” for its family of AI chips, which will include processors used for training AI models. According to an earlier Reuters report, Meta canceled one AI inference chip project and started another, scheduled to roll out by 2025.
Bjorlin said that since Meta isn’t in the cloud computing business, the company hadn’t felt compelled to discuss its internal data center chip projects. She added:
“If you look at what we’re sharing – our first two chips that we developed – it’s definitely giving a little bit of a view into what are we doing internally. We haven’t had to advertise this, and we don’t need to advertise this, but you know, the world is interested.”
Meta’s new hardware is designed to work effectively with the company’s PyTorch software, which has become one of the most popular tools for third-party developers building AI apps. The new hardware will eventually power several metaverse-related tasks, including augmented reality and virtual reality.
Additionally, Meta has created a generative-AI-powered coding assistant to help the company’s developers build software more easily. Meta said it will continue contributing to open-source technologies and AI research that drive technological innovation.