Microsoft launches Maia 200 AI chip on TSMC 3nm for Azure, delivering 10+ petaFLOPS to reduce costs and Nvidia reliance in ...
Microsoft has unveiled its second-generation Maia 200 AI chip to boost AI inference across Azure, cut costs, and support ...
Microsoft is not just the world’s biggest consumer of OpenAI models, but also still the largest partner providing compute, networking, and storage to ...
Microsoft says the new chip is competitive against in-house solutions from Google and Amazon, but stops short of comparing to ...
Cryptopolitan on MSN
Microsoft introduces Maia 200 to reduce AI cloud costs and power use
Microsoft has unveiled its second-generation artificial intelligence chip, Maia 200, as it pushes to strengthen its cloud business and ease reliance on Nvidia processors. Demand for artificial ...
The software giant is vying for a bigger piece of the AI pie.
See how three organizations strengthen security and accelerate AI innovation with Microsoft’s family of security products.
Microsoft says its new Maia 200 chip outperforms Amazon's and Google's latest AI silicon on key benchmarks.
The company said Maia 200 offers three times the compute performance of Amazon Web Services Inc.’s most advanced Trainium processor on certain popular AI benchmarks, while exceeding Google LLC’s ...
Calling it the highest-performance custom cloud accelerator, the company says Maia 200 is optimized for AI inference across multiple models.
The Maia 200 AI chip is described as an inference powerhouse, meaning it could help AI models apply their knowledge to ...
Microsoft’s new Maia 200 inference accelerator enters this overheated market as a new chip that aims to cut the price ...