Built with TSMC's 3nm process, Microsoft's new Maia 200 AI accelerator will reportedly 'dramatically improve the economics of ...
By Toby Sterling and Nathan Vifflin AMSTERDAM, Jan 27 (Reuters) - As artificial intelligence firms jostle for the Nvidia ...
Maia 200 is the most efficient inference system Microsoft has ever deployed, with 30% better performance per dollar than the latest ...
Hyperscaler leverages a two-tier Ethernet-based topology, custom AI Transport Layer and software tools to deliver a tightly integrated, low-latency platform ...
Microsoft has unveiled its Maia 200 AI accelerator, claiming triple the inference performance of Amazon's Trainium 3 and superiority over Google's TPU v7.
Microsoft Corp. is rolling out its second-generation artificial intelligence chip, the centerpiece of the company's push to ...
Microsoft introduced a second-generation AI chip to run its services more efficiently while providing an alternative to ...
Data centers have gotten a bad rap lately, but here's why we think they are actually good for Arizona. It starts with jobs.