Breaking Away from NVIDIA Dependency, Microsoft Reportedly Developing In-House AI Server High-Speed Network Card

Microsoft is reportedly developing a customized network card for AI servers, according to sources cited by The Information. The card is expected to enhance the performance of its in-house AI chip, the Azure Maia 100, while reducing Microsoft's dependency on NVIDIA as its primary supplier of high-performance network cards.

Leading the initiative is Pradeep Sindhu, co-founder of Juniper Networks. Microsoft acquired Sindhu's data center technology startup, Fungible, last year, and he has since joined Microsoft to head the team developing the network card.

According to The Information, the network card is similar to NVIDIA's ConnectX-7 interface card, which supports Ethernet speeds of up to 400 Gb/s and is sold alongside NVIDIA GPUs.

Developing high-speed networking equipment tailored specifically for AI workloads may take over a year. If successful, the card could shorten the time OpenAI needs to train models on Microsoft's AI servers and lower the associated training costs.

In November last year, Microsoft unveiled the Azure Maia 100 for its data centers, an AI accelerator chip manufactured on TSMC's 5-nanometer process. The chip is designed to run workloads such as OpenAI models, ChatGPT, Bing, GitHub Copilot, and other AI tasks.

Microsoft is also designing the next generation of the chip. It is not alone in striving to reduce its reliance on NVIDIA: OpenAI, Tesla, Google, Amazon, and Meta are likewise investing in their own AI accelerator chips, which are expected to compete with NVIDIA's flagship H100 AI accelerator.

width="640"


