Nvidia vs Qualcomm

Qualcomm Inc’s artificial intelligence chips beat those made by Nvidia on power efficiency in MLPerf test results released on Wednesday. According to the results from MLCommons, Qualcomm’s chips surpassed Nvidia’s in two of the three power-efficiency tests, while a Taiwanese startup outperformed both companies in the third.

Nvidia leads the market for training AI models on large amounts of data. Once trained, however, those models are deployed far more widely for inference, performing tasks such as determining whether an image contains a cat or generating text answers to questions.
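To make the distinction concrete, the following is a minimal sketch of what a single image-classification inference request looks like in code, assuming PyTorch with a pretrained torchvision ResNet-50 and a hypothetical local image file; it is an illustration of the workload, not the MLPerf benchmark harness.

```python
# Minimal illustration of one image-classification inference request.
# Assumes torchvision >= 0.13 with a pretrained ResNet-50; "photo.jpg" is a
# hypothetical input file. This is a sketch, not the MLPerf harness.
import torch
from torchvision import models
from torchvision.io import read_image

weights = models.ResNet50_Weights.DEFAULT          # pretrained ImageNet weights
model = models.resnet50(weights=weights).eval()    # inference mode, no training
preprocess = weights.transforms()                  # resize/normalize as the model expects

img = read_image("photo.jpg")                      # hypothetical input image
batch = preprocess(img).unsqueeze(0)               # add a batch dimension

with torch.no_grad():                              # gradients are not needed for inference
    probs = model(batch).softmax(dim=1)

label = weights.meta["categories"][probs.argmax().item()]
print(f"Top prediction: {label}")                  # e.g. "tabby" if the photo shows a cat
```

In production, a server runs millions of such requests, which is why throughput and power draw, rather than training speed, become the figures that matter.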

As companies around the world build AI technology into their products, the market for data center inference chips is expected to expand significantly. Integrating AI, however, brings extra costs, chief among them computing power, and companies such as Google are already looking for ways to limit those additional expenses.

Electricity is one of those significant costs, and Qualcomm has drawn on its experience designing chips for battery-powered devices such as smartphones to build the Cloud AI 100, a chip aimed at frugal power consumption.

MLCommons is an engineering consortium that maintains testing benchmarks widely used in the AI chip industry. Running neural networks in production is known as AI inference, and every six months MLCommons releases a new set of benchmark results for this workload.

Qualcomm Tops The Power Efficiency Tests

In the most recent round of results, Qualcomm’s Cloud AI 100 chip outperformed Nvidia’s flagship H100 chip on power efficiency when classifying images, achieving 197.6 server queries per watt to Nvidia’s 108.4. Neuchips, a startup founded by renowned Taiwanese chip academic Youn-Long Lin, won the benchmark outright with 227 queries per watt.

In the object detection test, Qualcomm again outperformed Nvidia, scoring 3.2 queries per watt to Nvidia’s 2.4.

Nvidia, however, won the natural language processing test on both absolute performance and power efficiency. Natural language processing is the technology behind chatbot systems such as ChatGPT. In that test, Nvidia achieved 10.8 samples per watt, followed by Neuchips at 8.9 samples per watt and Qualcomm at 7.5 samples per watt.
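For context on where such per-watt figures come from: an efficiency score of this kind is essentially measured throughput divided by the average system power drawn during the run. The sketch below illustrates that arithmetic with made-up placeholder numbers, not actual MLPerf measurements.

```python
# Rough sketch of how a queries-per-watt efficiency figure can be derived:
# throughput (queries/second) divided by average system power (watts).
# The numbers below are illustrative placeholders, not MLPerf measurements.

def queries_per_watt(total_queries: int, run_seconds: float, avg_power_watts: float) -> float:
    """Measured throughput divided by average power draw during the run."""
    throughput = total_queries / run_seconds
    return throughput / avg_power_watts

# Hypothetical example: 2,000,000 queries served in 600 s at 660 W average draw.
print(round(queries_per_watt(2_000_000, 600.0, 660.0), 1))  # -> 5.1 queries per watt
```

This is why a chip can lose on raw throughput yet still win the efficiency ranking: drawing far less power can outweigh serving somewhat fewer queries.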

The other submitters included Alibaba, ASUSTeK, Azure, cTuning, Deci, Dell, Gigabyte, H3C, HPE, Inspur, Intel, Krai, Lenovo, Moffett, Nettrix, Neural Magic, Quanta Cloud Technology, Rebellions, SiMa, Supermicro, VMware, and xFusion.

In addition to image classification and object detection, the companies made submissions for various other tasks, including medical imaging, speech-to-text, language processing, and recommendation.

Nvidia On A Winning Streak In Performance

While Nvidia was strongly challenged on power efficiency, its H100 Tensor Core GPUs delivered the highest performance in every test of AI inference. The GPUs also delivered performance gains of up to 54% over their debut in September, thanks to software optimizations.

The company also debuted its new L4 Tensor Core GPUs, which delivered more than three times the performance of the earlier-generation T4 GPUs.

Notably, Nvidia employs significantly more software engineers than many of its rivals, and those engineers keep squeezing more performance out of each new chip generation.

This gives Nvidia a competitive advantage that any challenger will find very tough to overcome, especially in data center and edge applications that require the flexibility to run many different models.

While Qualcomm did not top any of the raw performance tests, the company has demonstrated a 75% performance improvement since it began submitting results three years ago.
