Microsoft “Athena” AI chips take aim at Nvidia for AI supremacy
Microsoft has reportedly been developing its own AI chip, codenamed “Athena,” in order to avoid an overreliance on the Nvidia A100 GPU for AI training, according to a report from The Verge.
It’s no secret that Microsoft has the capacity to conjure up its own bespoke chip designs. With the AI war currently raging, one clear victor seems to have emerged: Nvidia. The GPU manufacturer was outspoken about its successes at its latest GTC conference, held just last month, where CEO Jensen Huang called it the “iPhone moment” for generative AI.
Nvidia boasted about the deployment of its A100 GPUs across multiple AI facilities, including their use in training OpenAI’s ChatGPT. With OpenAI itself ordering 30,000 of the chips and Elon Musk picking up over 10,000, there’s a clear need for more competitors on the market.
The Verge’s report states that Microsoft has been working on a dedicated AI chip, codenamed “Athena,” since 2019. The chip is reportedly being tested by OpenAI for performance on GPT-4.
The AI hardware boom is just beginning
Nvidia’s graphics cards are industry-leading in just about every way. With a huge segment of its profits stemming from the business-to-business market, AI is just another avenue through which the gaming GPU manufacturer can earn more cash. With little to no opposition, Nvidia has been able to corner the industry, leaving buyers with few alternatives to its products.
However, Microsoft has seemingly been looking in-house to see if it can save a buck on its own AI push. There are currently no details on whether Microsoft’s “Athena” chips will be available to Azure cloud customers, nor has the project been formally announced to the public.
That said, the chips are alleged to be arriving as soon as 2024, with a full roadmap of more advanced chips to follow in the years after. It’s also claimed that Microsoft’s chips are not meant to replace the Nvidia A100 outright, but could instead offer businesses a more cost-effective option for smaller, less intensive AI applications.
Google is no stranger to making its own silicon either, and companies like Apple and many more are beginning to break from their reliance on external hardware. For most businesses and AI startups, however, Nvidia still remains the top choice for powering bigger AI models.