Microsoft is going all in on AI. Last year, the company unveiled a custom AI chip to boost the performance of its infrastructure and train large language models (LLMs) cost-effectively. In another step towards bringing costs down and avoiding costly reliance on third-party chipmakers, Microsoft is reportedly working on AI server tech to run its programmes.
According to a report by The Information, Microsoft is developing server network cards to replace the ones supplied by Nvidia, which would help the company save money. Server network cards move data quickly between servers, the report said, noting that the new cards could also improve the performance of both its Nvidia chip servers and its own Maia AI chips.
“The software giant is developing a networking card to ensure that data moves quickly between its servers, as an alternative to one supplied by Nvidia,” the report claimed, citing a person with direct knowledge.
“In addition to potentially saving Microsoft money, the company hopes the new networking gear will improve the performance of its Nvidia chip servers,” it added.
The move comes as Nvidia rides a wave of increased demand for its AI chips. Microsoft is not alone: several companies are reportedly working on in-house chips and other infrastructure to boost the performance of their systems and lessen their dependence on Nvidia.
Microsoft’s AI infrastructure plans
At Microsoft Ignite 2023, the company unveiled the Microsoft Azure Maia 100 AI Accelerator, which is optimised for AI tasks.
At the time, Scott Guthrie, executive vice president of Microsoft's Cloud + AI Group, said the company was building the infrastructure to support AI innovation and "reimagining every aspect of our data centres to meet the needs of our customers." Microsoft announced that the chips would start rolling out this year.