
Meta (NASDAQ: META) recently announced the latest enhancement to its Ray-Ban smart glasses: a suite of multimodal artificial intelligence features that gives users new ways to interact with and understand the environment around them.

The core of this upgrade lies in the glasses' ability to process environmental data through their camera and microphones, providing users with contextual information based on their surroundings.

Interactive AI features in Meta’s Ray-Ban smart glasses

To activate the AI, users must use the voice command “Hey Meta, take a look at this,” followed by their specific question. For instance, if a user asks, “Hey Meta, take a look at this plate of food and tell me what ingredients were used,” the glasses capture an image and utilize generative AI to analyze and break down the various subjects and elements within the frame.
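As a rough illustration of that interaction pattern, the sketch below shows how a wake-phrase handler might separate the trigger from the user's actual question. This is a minimal Python sketch for illustration only, not Meta's implementation; the constant and function names are hypothetical.

# Hypothetical sketch of wake-phrase handling (illustrative only; not Meta's code).
WAKE_PHRASE = "hey meta, take a look at this"

def extract_question(utterance: str) -> str | None:
    """Return the user's question if the utterance begins with the wake phrase."""
    text = utterance.strip()
    if not text.lower().startswith(WAKE_PHRASE):
        return None  # Not addressed to the assistant; ignore it.
    # Everything after the wake phrase is treated as the question for the AI.
    return text[len(WAKE_PHRASE):].lstrip(" ,")

if __name__ == "__main__":
    print(extract_question(
        "Hey Meta, take a look at this plate of food and tell me what ingredients were used"
    ))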

[Embedded Instagram post shared by Mark Zuckerberg (@zuck)]

When a user asks the Meta AI about their visual surroundings, the glasses capture a photo and send it to Meta’s cloud for processing. After this, the AI delivers an audio response directly through the glasses. Additionally, users can review their request, the corresponding image, and the AI’s response in the Meta View phone app, which pairs with the glasses.
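The round trip described above, capturing a photo on the glasses, processing it in Meta's cloud, answering over audio, and logging the exchange to the companion app, can be pictured with the simplified sketch below. It is a hypothetical Python outline under stated assumptions, not a published Meta API; every function is a stand-in for a component the article describes.

# Hypothetical end-to-end flow for a visual question (illustrative only).
from dataclasses import dataclass

@dataclass
class Interaction:
    question: str
    image: bytes
    answer: str

def capture_photo() -> bytes:
    # Stand-in for the on-device camera capture.
    return b"<jpeg bytes>"

def query_multimodal_model(question: str, image: bytes) -> str:
    # Stand-in for the round trip to a cloud-hosted multimodal model.
    return f"Answer to: {question}"

def speak(text: str) -> None:
    # Stand-in for the audio response played through the glasses.
    print(f"[audio] {text}")

def log_to_companion_app(interaction: Interaction) -> None:
    # Stand-in for syncing the request, image, and reply to the phone app.
    print(f"[app] saved: {interaction.question!r}")

def answer_visual_question(question: str) -> Interaction:
    """Capture, query, speak, and log a single visual request."""
    image = capture_photo()
    answer = query_multimodal_model(question, image)
    speak(answer)
    interaction = Interaction(question=question, image=image, answer=answer)
    log_to_companion_app(interaction)
    return interaction

if __name__ == "__main__":
    answer_visual_question("What ingredients were used in this plate of food?")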

At the moment, Meta is rolling out these new multimodal AI features to a limited number of users through an early access program, but it plans to make the features available to all users in the new year.

Challenges in the AI wearable sector

Meta's Ray-Ban smart glasses seem to be a step in the right direction for AI wearables. However, the sector still has significant user experience challenges to overcome. Although these devices can add genuine utility in some areas of life, they tend to complicate it in many others.

The Humane AI Pin, launched earlier this year by former Apple (NASDAQ: AAPL) employees, is a prime example. The screenless device is futuristic and innovative in design, but its complexity raises questions about its practicality. Performing a task as simple as sending a text message involves so many steps that it is fair to ask whether existing alternatives to a device like the Humane AI Pin already provide a superior experience.

These issues are not unique to the Humane AI Pin; they represent a broader problem facing many AI wearables: finding practical use cases. Even the Meta Ray-Ban smart glasses are bound to face this obstacle. The glasses offer more utility than other AI wearables currently on the market because their multimodal AI capabilities should shorten the time it takes to retrieve information and generate content, especially in private settings. Using the glasses in public, however, could be an awkward and uncomfortable experience.

Imagine standing in line at a farmers' market and asking your glasses what exotic fruit or vegetable you are looking at instead of simply asking the vendor. For many people, talking to a device in public remains an unfamiliar and uncomfortable experience, and it highlights the social awkwardness that voice-command AI wearables bring. Addressing this discomfort will be crucial to the future success and acceptance of voice-command AI wearables.

The future of AI wearables

AI wearables are an emerging market. Innovations in the artificial intelligence space continue to enhance the capabilities of these devices, but there are many obstacles on their road to mass adoption. A significant challenge for the industry lies in identifying and establishing practical use cases that convince consumers and businesses to move away from existing alternatives toward these AI wearable products.

The key to widespread acceptance and success lies in developing a device that not only integrates advanced AI capabilities but also addresses real-world needs in a user-friendly manner. To succeed, the industry must focus on creating wearables that blend seamlessly into daily life, offering tangible benefits without adding complexity or discomfort, especially in social settings.

For artificial intelligence (AI) to work within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek's coverage on this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: Artificial intelligence needs blockchain
