Microsoft (NASDAQ: MSFT) has announced plans to introduce an artificial intelligence (AI) chip to support the development of large language models (LLMs), along with a second chip for general-purpose computing tasks.

The big tech company confirmed its plans to enter the chipmaking business at its Ignite conference after months of preparation. Called the Maia 100, Microsoft's first AI chip is expected to compete closely with Nvidia's (NASDAQ: NVDA) offerings while serving as a cheaper alternative.

Priced at around $40,000, Nvidia's H100 chips have seized a large chunk of the market, with the company scrambling to keep up with demand. Microsoft is also keen to serve AI developers with its own offering for running cloud AI workloads.

Microsoft's testing of Maia with OpenAI's systems has yielded impressive results, thanks to features designed for faster model training and inference.

“We were excited when Microsoft first shared their designs for the Maia chip, and we’ve worked together to refine and test it with our models,” said OpenAI CEO Sam Altman. “Azure’s end-to-end AI architecture, now optimized down to the silicon with Maia, paves the way for training more capable models and making those models cheaper for our customers.”

Experts say the debut Maia 100 could help Microsoft catch up with industry leaders like Nvidia and Advanced Micro Devices (NASDAQ: AMD). The new chip packs 105 billion transistors, supports 8-bit data types, and runs in liquid-cooled servers.
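
The 8-bit data types mentioned above matter because lower-precision arithmetic cuts memory and bandwidth requirements for training and inference. The minimal sketch below, written in plain NumPy and not specific to Maia or any vendor's toolchain, illustrates the general idea: float32 weights are mapped to 8-bit integers and back, trading a small amount of accuracy for a 4x reduction in storage.

```python
# Illustrative 8-bit quantization sketch (hypothetical example, not Maia-specific).
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor quantization of float32 weights to int8."""
    scale = np.abs(weights).max() / 127.0                      # map the largest magnitude to 127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 representation."""
    return q.astype(np.float32) * scale

weights = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
print("max error:", np.abs(weights - approx).max())            # small quantization error
print("bytes:", weights.nbytes, "->", q.nbytes)                 # 64 bytes -> 16 bytes
```

The same principle is why AI accelerators advertise native support for 8-bit formats: smaller values mean more weights per memory transfer and more arithmetic per watt.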

The company says it will share the designs for its custom server racks with its partners but will keep the chip designs private.

“The goal here was to enable higher density of servers at higher efficiencies,” said Microsoft. “Because we’re reimagining the entire stack we purposely think through every layer, so these systems are actually going to fit in our current data center footprint.”

Alongside the Maia 100, Microsoft announced a second chip, the Cobalt 100, designed for general computing tasks and expected to compete directly with Intel processors.

Microsoft keeping pace with rivals

Given the pace of industry innovation, Microsoft faces an uphill battle in its attempt to keep pace with Nvidia and AMD. Barely months after rolling out its H100 chips, Nvidia has already teased a successor, the H200, expected to launch in 2024.

Flush with cash from its AI chip business, Nvidia has poured a fortune into research and development of advanced chips, confirming a move to an annual release cadence rather than one every two years. Big Tech firms are no longer throwing their weight behind chips for block reward mining but are scrambling to satisfy the demand stemming from generative AI.

In order for artificial intelligence (AI) to work right within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek's coverage on this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: Artificial intelligence needs blockchain
