
Apple’s new open-source lightweight AI model developed to run offline

Apple (NASDAQ: AAPL) has thrown its hat in the ring for small language models (SLMs) after falling behind Google (NASDAQ: GOOGL) and OpenAI.

In an official announcement on the open-source platform Hugging Face, Apple confirmed the release of eight small artificial intelligence (AI) models, split evenly between pre-trained and instruction-tuned variants. The models range from 270 million parameters at the smallest to 3 billion parameters for the largest of the lot.

Dubbed OpenELM, the lightweight models are, Apple says, specifically designed to run locally on devices without the need for internet connectivity. Although the announcement stops short of naming exact use cases, pundits predict several utilities, including forming the base layer for mobile assistants with specific functionalities.

To close the gap with other industry first-movers, Apple is opting for an open-source route, stealing a page from Meta’s (NASDAQ: META) Llama 2. OpenELM was trained on public datasets, and Apple is allowing users to inspect the “recipe” for the models by sharing the CoreNet code.
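For readers who want to try the released weights, a checkpoint can be pulled from Hugging Face and then run locally. The snippet below is a minimal sketch using the Hugging Face transformers library; the model ID apple/OpenELM-270M-Instruct and the pairing with a Llama-2 tokenizer are assumptions based on how the checkpoints are published, not details stated in the article.

```python
# Minimal sketch: run a small OpenELM checkpoint locally with Hugging Face transformers.
# Assumptions (not from the article): the model ID "apple/OpenELM-270M-Instruct" and the
# use of the Llama-2 tokenizer, which these checkpoints are commonly paired with.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M-Instruct"      # assumed checkpoint name
tokenizer_id = "meta-llama/Llama-2-7b-hf"     # assumed tokenizer; requires access approval

tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Explain what a small language model is.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Once the weights and tokenizer have been downloaded, inference runs entirely on-device, which mirrors the offline-first use case Apple is pitching.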

According to the official report, OpenELM holds its own in logic and reasoning when placed side by side with larger models. Apple says the models achieve this feat through the efficient allocation of parameters, leaning on a layer-wise scaling method.

“We introduce OpenELM, a family of Open-source Efficient Language Models,” read a company statement. “OpenELM uses a layer-wise scaling strategy to efficiently allocate parameters within each layer of the transformer model, leading to enhanced accuracy.”
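To make the quoted strategy concrete, layer-wise scaling means the transformer’s layers are not all the same width: the number of attention heads and the feed-forward dimension grow from the early layers toward the later ones rather than staying fixed across depth. The sketch below illustrates that idea with simple linear interpolation; the function name, ranges, and toy values are illustrative assumptions, not Apple’s published configuration.

```python
# Illustrative sketch of layer-wise scaling: instead of giving every transformer layer
# the same width, attention heads and feed-forward width are interpolated across depth
# so parameters are concentrated where they help most. The values below are made up
# for illustration and are not OpenELM's actual configuration.
def layerwise_scaling(num_layers, min_heads, max_heads, min_ffn_mult, max_ffn_mult, d_model):
    layers = []
    for i in range(num_layers):
        t = i / max(num_layers - 1, 1)  # 0.0 at the first layer, 1.0 at the last
        heads = round(min_heads + t * (max_heads - min_heads))
        ffn_mult = min_ffn_mult + t * (max_ffn_mult - min_ffn_mult)
        layers.append({"layer": i, "num_heads": heads, "ffn_dim": int(ffn_mult * d_model)})
    return layers

# Example: a toy 16-layer model whose later layers are wider than its earlier ones.
for cfg in layerwise_scaling(16, min_heads=4, max_heads=12,
                             min_ffn_mult=1.0, max_ffn_mult=4.0, d_model=1024):
    print(cfg)
```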

The new release puts Apple in direct competition with Microsoft (NASDAQ: MSFT), which recently released the third iteration of its Phi family of lightweight models. Following the success of the previous two versions, Microsoft says Phi-3 Mini will not require extensive computing power but will broaden the horizon for enterprises choosing their AI strategy.

“What we’re going to start to see is not a shift from large to small, but a shift from a singular category of models to a portfolio of models where customers get the ability to make a decision on what is the best model for their scenario,” said Microsoft’s generative AI product manager, Sonali Yadav.

A comparative advantage

Despite playing second fiddle, Apple has several things working in its favor in the race for SLMs. Experts say Apple holds a significant advantage over Windows machines for local AI use, stemming from its unified memory architecture, which pools device RAM and VRAM together.

On Windows, users typically have to buy GPUs with as much as 32GB of VRAM to run AI models locally. However, since AI hardware is predominantly Nvidia-based (NASDAQ: NVDA), Apple still has a mountain to climb with its in-house chips, given the firm’s slow-and-steady approach to emerging technologies in recent years.
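As a rough illustration of why memory matters here, the weights alone for a 3-billion-parameter model already occupy several gigabytes. The back-of-envelope sketch below assumes 16-bit weights and ignores activations and KV-cache overhead, so real-world requirements would be higher.

```python
# Back-of-envelope sketch: approximate memory needed just to hold model weights.
# Assumes 2 bytes per parameter (fp16/bf16); activations, KV cache, and runtime
# overhead would push the real requirement higher.
def weight_memory_gb(num_params, bytes_per_param=2):
    return num_params * bytes_per_param / (1024 ** 3)

print(f"270M-parameter model: ~{weight_memory_gb(270e6):.1f} GB of weights")
print(f"3B-parameter model:   ~{weight_memory_gb(3e9):.1f} GB of weights")
```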

In order for artificial intelligence (AI) to work right within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek’s coverage on this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: How blockchain will keep AI honest
