Meta (NASDAQ: META) recently announced the latest enhancement to its Ray-Ban smart glasses: a suite of multimodal artificial intelligence features that gives users new ways to interact with and understand the environment around them.

The core of this upgrade lies in the glasses’ ability to process environmental data through their camera and microphones, providing users with contextual information about their surroundings.

Interactive AI features in Meta’s Ray-Ban smart glasses

To activate the AI, users say the voice command “Hey Meta, take a look at this,” followed by a specific question. For instance, if a user asks, “Hey Meta, take a look at this plate of food and tell me what ingredients were used,” the glasses capture an image and use generative AI to analyze and break down the various subjects and elements within the frame.

View this post on Instagram: A post shared by Mark Zuckerberg (@zuck)

When a user asks the Meta AI about their visual surroundings, the glasses capture a photo and send it to Meta’s cloud for processing. After this, the AI delivers an audio response directly through the glasses. Additionally, users can review their request, the corresponding image, and the AI’s response in the Meta View phone app, which pairs with the glasses.
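
Conceptually, the round trip is a simple capture-and-query loop: grab a frame, pair it with the spoken question, send both to a cloud-hosted model, and read the answer back through the speakers. The Python sketch below is purely illustrative; the endpoint URL, payload fields, and the ask_about_photo helper are hypothetical stand-ins, since Meta’s actual service and API are not publicly documented.

import base64
import requests

# Hypothetical endpoint: Meta's real cloud service for the glasses is not public.
ASSISTANT_ENDPOINT = "https://example.com/multimodal-assistant"


def ask_about_photo(image_path: str, question: str) -> str:
    """Send a captured frame plus the user's question to a cloud model and
    return the text answer that would be read back through the speakers."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("ascii")

    payload = {
        "prompt": question,   # e.g. "Tell me what ingredients were used."
        "image": image_b64,   # frame captured when the wake phrase is heard
    }
    response = requests.post(ASSISTANT_ENDPOINT, json=payload, timeout=30)
    response.raise_for_status()
    return response.json()["answer"]  # hypothetical response field


if __name__ == "__main__":
    answer = ask_about_photo("plate_of_food.jpg", "What ingredients were used in this dish?")
    print(answer)  # on the glasses, this text would be spoken aloud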

At the moment, Meta is rolling out these new multimodal AI features to a limited number of users through an early access program, but it plans to make the features available to all users in the new year.

Challenges in the AI wearable sector

Meta’s Ray-Ban smart glasses seem to be a step in the right direction for AI wearables. However, the sector still has to overcome significant user experience challenges: although these devices can add utility in some areas of life, they tend to complicate it in others.

The Humane AI Pin, launched earlier this year by former Apple (NASDAQ: AAPL) employees, is a prime example. This screenless AI device is futuristic and innovative in design, but the number of steps required to accomplish simple tasks raises questions about its practicality. Sending a text message involves so many steps that it makes you wonder whether the existing alternatives to a device like the Humane AI Pin already provide a superior experience.

These issues are not unique to the Humane AI Pin; they represent a broader problem facing many AI wearables: finding practical use cases. Even the Meta Ray-Ban smart glasses are bound to run into this obstacle. Their multimodal AI capabilities give them more utility than other AI wearables currently on the market, and they should reduce the time it takes to retrieve information and generate content, especially in private settings. In public, however, using the glasses could be an awkward and uncomfortable experience.

Imagine standing in line at a farmers market and asking your glasses what exotic fruit or vegetable you are looking at instead of simply asking the vendor. Talking to a device in public remains a largely unfamiliar and uncomfortable experience for many people, and the scenario highlights the inherent awkwardness and social discomfort that voice-command AI wearables bring, particularly in public spaces. Addressing these problems will be crucial to the future success and acceptance of these devices.

The future of AI wearables

AI wearables are an emerging market. Innovations in the artificial intelligence space continue to enhance the capabilities of these devices, but there are many obstacles on their road to mass adoption. A significant challenge for the industry lies in identifying and establishing practical use cases that convince consumers and businesses to move away from existing alternatives toward these AI wearable products.

The key to widespread acceptance lies in developing a device that not only integrates advanced AI capabilities but also addresses real-world needs in a user-friendly manner. To succeed, the industry must focus on creating wearables that blend seamlessly into daily life, offering tangible benefits without adding complexity or discomfort, especially in social settings.

For artificial intelligence (AI) to work within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, keeping data safe while guaranteeing its immutability. Check out CoinGeek’s coverage of this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: Artificial intelligence needs blockchain
