
Meta showcases next-generation AI chipsets for building large-scale AI infrastructure.


On Wednesday, Meta unveiled the next generation of the Meta Training and Inference Accelerator (MTIA), its family of custom chipsets for artificial intelligence (AI) workloads. The upgrade comes almost a year after the company introduced its first AI chips. These inference accelerators will power the tech giant’s current and future AI products and services within its social media platforms. Specifically, Meta highlighted that the chipset’s capabilities will be used to serve its ranking and recommendation models.

Announcing the chipset in a blog post, Meta said, “The next generation of Meta’s massive infrastructure is being built with AI in mind, including new generative AI (GenAI) products and services, recommendation systems, and cutting-edge AI research. This is an investment that we expect to increase in the coming years as the sophistication of the models increases as well as the compute requirements to support AI models.”

According to Meta, the new AI chip offers significant improvements in both power efficiency and performance thanks to an improved architecture. The next-generation MTIA doubles the compute and memory bandwidth of its predecessor. It also serves the recommendation models Meta uses to personalise content for users on its social media platforms.

On the hardware side, Meta said the system has a rack-based design holding 72 accelerators: three chassis, each containing 12 boards with two accelerators per board. The processor is clocked at 1.35GHz, well above its predecessor’s 800MHz, and runs at a higher power envelope of 90W. The fabric between the accelerators and the host has also been upgraded to PCIe Gen5.
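The rack layout and clock figures above can be sanity-checked with a little arithmetic. This is a minimal sketch using only the numbers quoted in the article; the layout maths and the uplift calculation are ours, not Meta's.

```python
# Rack topology described by Meta: 3 chassis x 12 boards x 2 accelerators.
CHASSIS_PER_RACK = 3
BOARDS_PER_CHASSIS = 12
ACCELERATORS_PER_BOARD = 2

accelerators_per_rack = (
    CHASSIS_PER_RACK * BOARDS_PER_CHASSIS * ACCELERATORS_PER_BOARD
)
assert accelerators_per_rack == 72  # matches the 72 accelerators Meta cites

# Clock uplift over the first-generation MTIA (1.35GHz vs 800MHz).
NEW_CLOCK_GHZ = 1.35
OLD_CLOCK_GHZ = 0.80
clock_uplift = NEW_CLOCK_GHZ / OLD_CLOCK_GHZ

print(f"{accelerators_per_rack} accelerators per rack, "
      f"{clock_uplift:.2f}x clock uplift over the first-generation MTIA")
```

Note that the clock speed alone rises by roughly 1.69x; the doubling of compute and memory bandwidth that Meta claims comes from the wider architecture, not just the higher clock.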

The software stack is where the company has made the biggest improvements. The chipset is designed to integrate fully with PyTorch 2.0 and its related features. “The lower level compiler for MTIA takes the outputs from the frontend and produces highly efficient and device-specific code,” the company explained.

Meta added, “Results so far show that this MTIA chip can handle both the low-complexity (LC) and high-complexity (HC) ranking and recommendation models that are components of Meta’s products. Across these models, the model size and the amount of compute per input sample can vary by roughly 10x to 100x. Because we control the whole stack, we can achieve greater efficiency compared to commercially available GPUs. Realising these gains is an ongoing effort, and we continue to improve performance per watt as we build up and deploy MTIA chips in our systems.”

With the rise of AI, many tech companies are now focusing on building custom AI chipsets that meet their specific needs. These processors provide the massive computing power that lets servers run products such as generalist AI chatbots and task-specific AI tools.



For more details, visit www.gadgets360.com
