
A recent paper from a group of Huawei researchers is calling for “embodied” artificial intelligence (AI) as the next big thing for the emerging ecosystem, poking holes in the argument for scaling.

Achieving artificial general intelligence (AGI) will not be a walk in the park for frontier companies, given the absence of a consensus on the definition of the concept, according to the paper. Generally, AGI refers to intelligence that matches or exceeds human abilities, including the capacity to learn and to solve novel challenges.

Despite the leaps in large language models (LLMs), AI is far from achieving superintelligence, with current models described as “static and unable to evolve with time and experience.” The researchers argue that the best path to superintelligent AI systems is through embodiment rather than through scaling LLMs, the widely held approach.

“It is a prevalent belief that simply scaling up such models, in terms of data volume and computational power, could lead to AGI,” the paper read. “We contest this view. We propose that true understanding, not only propositional truth but also the value of propositions that guide us how to act, is achievable only through E-AI agents that live in the world and learn of it by interacting with it.”

For the researchers, giving AI a body capable of interacting with its surroundings offers several benefits, including a route to general intelligence. Four components are necessary for embodiment, with perception first among them.

The second required trait is the ability to take action based on perceived data, with the researchers subdividing this functionality into reactive and goal-directed actions. Reactive actions serve primarily for “self-preservation,” while goal-directed actions are necessary to achieve “complex, high-level objectives.”

Memory is another essential component embodied agents must possess to achieve general intelligence. Finally, the ability to learn from memory is a distinguishing factor for intelligent AI systems, with simulators and other emerging technologies offering new learning pathways.
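The four components above can be pictured as a simple agent loop. The sketch below is purely illustrative and not from the paper; the class, method names, and dictionary keys (`EmbodiedAgent`, `perceive`, `hazard`, etc.) are hypothetical stand-ins for how perception, reactive and goal-directed action, memory, and learning might fit together.

```python
# Hypothetical sketch of the four embodiment components described in the
# article: perception, action (reactive vs. goal-directed), memory, and
# learning. All names here are illustrative assumptions, not the paper's.

class EmbodiedAgent:
    def __init__(self):
        # Memory: a store of past (observation, action, outcome) experiences.
        self.memory = []

    def perceive(self, environment):
        # Perception: sample the agent's surroundings.
        # Assumes the environment exposes an observe() method.
        return environment.observe()

    def act(self, observation, goal=None):
        # Reactive action: an immediate "self-preservation" response.
        if observation.get("hazard"):
            return "avoid"
        # Goal-directed action: work toward a complex, high-level objective.
        if goal is not None:
            return f"move_toward:{goal}"
        return "idle"

    def learn(self, observation, action, outcome):
        # Learning from memory: record the experience so future decisions
        # can draw on the agent's accumulated interaction history.
        self.memory.append((observation, action, outcome))
```

In this framing, perception feeds action, action produces outcomes, and outcomes accumulate in memory that learning can later exploit, which matches the ordering the researchers lay out.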

Challenges to embodying AI

The paper highlights several downsides associated with embodying AI, including “noise and uncertainty,” which can affect the system’s decision-making abilities.

Hardware limitations pose another challenge for embodying AI, given the cost and energy consumption rates of GPU clusters.

There is also the concern that embodied AI systems will adopt an “egocentric” perspective, which may raise a whole new set of issues. Given the novelty of the technique, the researchers predict ethical dilemmas around communication with humans, data collection methods, and outputs.

For artificial intelligence (AI) to work within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership—allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek’s coverage of this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: What does blockchain and AI have in common? It’s data
