Meta Platforms is building custom silicon chips to power its artificial intelligence systems as part of an ambitious infrastructure upgrade plan. The first chip, the Meta Training and Inference Accelerator (MTIA), is designed to provide greater performance and efficiency than CPUs for the company's AI workloads.
The social network company is also building a new AI-optimized data center, liquid-cooled to support thousands of AI chips for large-scale model training. These efforts complement MSVP, the in-house chip Meta designed to handle its fast-growing video infrastructure.
More Details About Meta Platforms’ New AI Chips
Meta Platforms (META) is also building a supercomputer called the Research SuperCluster. The system packs 16,000 GPUs and is considered one of the fastest AI systems in the world. It's designed to train next-generation AI models for augmented reality, content understanding, and real-time translation.
By custom-designing much of its infrastructure, from data centers to servers to chips, Meta says it can optimize performance end to end, making the different hardware components work together and tailoring designs to its specific AI workloads. This customized infrastructure will enable Meta to build larger AI models and deploy them efficiently at scale, powering features such as better personalization and generative AI content across its family of apps, including Facebook, Instagram, and WhatsApp.
Meta announced these plans at its AI Infra @ Scale conference. The firm's Head of Infrastructure, Santosh Janardhan, delivered opening and closing remarks at the virtual event and guided participants through six technical presentations on Meta's latest AI infrastructure investments and the opportunities ahead.
A panel of Meta AI infrastructure leaders discussed “The Future of AI Infra: The Opportunities and Challenges That Await Us On Our Journey” at the May 18, 2023 event.
Meta's chips will compete with those of companies like Nvidia, IBM, Intel Corporation, and Alphabet (GOOG). By designing its own silicon, Meta may be trying to secure its chip supply so it can scale its AI systems without constraints.
The chip shortage that unfolded during the pandemic may have exposed significant weaknesses in the industry's ability to rapidly supply these much-needed components at a time when demand from cloud services and data centers is booming.
Meta is Already Using Generative AI to Help its Engineers
In a recent update to its so-called “Year of Efficiency” pledge, the Chief Executive Officer and founder of Meta Platforms, Mark Zuckerberg, highlighted that the firm is actively building new AI tools to help its programmers work faster.
Among the initiatives already in use at the tech company is Buck2. According to Zuckerberg, it is an open-source build tool that completes builds 50% faster, so engineers waste less time waiting between iterations.
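For readers unfamiliar with Buck2, build targets are declared in Starlark, a Python-like configuration language. The snippet below is a minimal, illustrative sketch of what a build file might look like; the target and file names are hypothetical and not drawn from Meta's codebase.

    # BUCK: hypothetical build file for a small C++ program.
    # Declares one binary target that Buck2 can build, cache, and rebuild incrementally.
    cxx_binary(
        name = "hello",        # target name, referenced as //:hello
        srcs = ["hello.cpp"],  # source files compiled into the binary
    )

Running buck2 build //:hello from the repository root would then build the target; aggressive caching and parallel execution of this kind are the sort of features behind the faster builds Zuckerberg described.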
In addition, Meta said today that it has deployed an AI-powered tool called CodeCompose, which helps engineers be more productive throughout the software development cycle.
The billionaire founder of the social media company has stepped back somewhat from his relentless focus on developing the so-called metaverse (the reason the company changed its name in 2021) and now frames that ambitious project as a long-term endeavor.
Meanwhile, in the short term, Meta is reportedly focused on advancing AI and incorporating the technology into its user experience, both to help advertisers create better campaigns and to offer regular users a more tailored experience.
Meta has also slashed its headcount over the past six months, letting go of 21,000 people, a one-quarter reduction of the firm's global workforce. The company is seemingly right-sizing its organization after going on a pandemic-prompted hiring spree.
In addition, the tech firm may also be responding to changes in the macroeconomic landscape, including an economic slowdown that could weigh on its top-line results as businesses reduce their advertising spending.