Microsoft (NASDAQ: MSFT) has announced plans to introduce an artificial intelligence (AI) chip to support the development of large language models (LLMs) and another for general-purpose computing.

The big tech company confirmed plans to enter the chipmaking business at its Ignite conference after months of planning. Called the Maia 100, Microsoft’s latest AI chip is expected to be a close competitor to Nvidia’s (NASDAQ: NVDA) chips while serving as a cheaper alternative.

Priced at around $40,000, Nvidia’s H100 chips have seized a large chunk of the market, with the company scrambling to keep up with demand. Microsoft is likewise keen to meet the demand of AI developers with its own offering for running cloud AI workloads.

Microsoft’s testing of Maia on OpenAI’s systems has since yielded impressive results, with the chip’s feature set geared toward faster model training and inference.

“We were excited when Microsoft first shared their designs for the Maia chip, and we’ve worked together to refine and test it with our models,” said OpenAI CEO Sam Altman. “Azure’s end-to-end AI architecture, now optimized down to the silicon with Maia, paves the way for training more capable models and making those models cheaper for our customers.”

Experts opine that Microsoft can catch up with industry leaders like Nvidia and Advanced Micro Devices (NASDAQ: AMD) with its debut Maia 100. Microsoft’s new chip packs 105 billion transistors, supports 8-bit data types, and runs on liquid-cooled server processors.
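
To put the 8-bit data types mentioned above in concrete terms, the sketch below shows a simple symmetric int8 quantization of model weights in Python with NumPy. It is an illustrative assumption of how low-precision formats cut memory and bandwidth roughly fourfold versus 32-bit floats, not Microsoft’s or Maia’s actual implementation.

```python
# Illustrative sketch only: minimal 8-bit integer quantization of model weights.
# This is NOT Microsoft's or Maia's implementation; it simply shows why 8-bit
# data types matter for AI accelerators: ~4x less memory and bandwidth than
# float32, at the cost of a small rounding error.
import numpy as np

def quantize_int8(weights: np.ndarray) -> tuple[np.ndarray, float]:
    """Symmetrically quantize float32 weights to int8 and return the scale."""
    scale = np.abs(weights).max() / 127.0   # map the largest magnitude to 127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize_int8(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate float32 tensor from int8 values and the scale."""
    return q.astype(np.float32) * scale

# Example: a small random weight matrix uses ~4x less memory after quantization.
w = np.random.randn(256, 256).astype(np.float32)
q, s = quantize_int8(w)
error = np.abs(w - dequantize_int8(q, s)).mean()
print(f"float32 bytes: {w.nbytes}, int8 bytes: {q.nbytes}, mean abs error: {error:.5f}")
```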

The company says it will share its custom server rack designs with its partners but will keep the chip designs private.

“The goal here was to enable higher density of servers at higher efficiencies,” said Microsoft. “Because we’re reimagining the entire stack we purposely think through every layer, so these systems are actually going to fit in our current data center footprint.”

Alongside Maia 100, Microsoft announced a second chip, Cobalt 100, designed for general computing tasks and expected to be a direct competitor to Intel processors.

Microsoft keeping pace with rivals

Given the speed of industry innovation, Microsoft faces a tough road ahead in its attempt to keep pace with Nvidia and AMD. Barely months after rolling out H100 chips, Nvidia has hinted at a successor, the H200, expected to launch in 2024.

Flush with cash from its AI chip business, Nvidia has poured a fortune into research and development for advanced chips, confirming an annual innovation cadence rather than one every two years. Big Tech firms are no longer throwing their weight behind the development of chips for block reward mining but are scrambling to satisfy the demand stemming from generative AI.

In order for artificial intelligence (AI) to work right within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek’s coverage on this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: Artificial intelligence needs blockchain
