
Meta adds multimodal AI to Ray-Ban smart glasses

Meta (NASDAQ: META) recently announced the latest enhancement to their Ray-Ban smart glasses—a suite of multimodal artificial intelligence features that gives users new ways to interact with and understand the environment around them.

The core of this upgrade lies in the glasses’ ability to process environmental data through the built-in camera and microphones, providing users with contextual information about their surroundings.

Interactive AI features in Meta’s Ray-Ban smart glasses

To activate the AI, users must use the voice command “Hey Meta, take a look at this,” followed by their specific question. For instance, if a user asks, “Hey Meta, take a look at this plate of food and tell me what ingredients were used,” the glasses capture an image and utilize generative AI to analyze and break down the various subjects and elements within the frame.
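To make the wake-phrase flow more concrete, here is a minimal sketch of how such a command could trigger an image capture and a multimodal query. Meta has not published the glasses’ internal APIs, so the function names, camera interface, and model call below are hypothetical stand-ins, not Meta’s actual implementation.

```python
# Illustrative sketch only: the camera interface and model call are
# hypothetical stand-ins, since Meta's internal APIs are not public.

WAKE_PHRASE = "hey meta, take a look at this"

def handle_voice_command(transcript: str, camera, multimodal_model):
    """Capture an image when the wake phrase is heard and pass the
    remaining question, plus the photo, to a generative multimodal model."""
    normalized = transcript.lower().strip()
    if not normalized.startswith(WAKE_PHRASE):
        return None  # not an AI request; ignore

    # Whatever follows the wake phrase is treated as the user's question.
    question = normalized[len(WAKE_PHRASE):].lstrip(" ,.") or "describe what you see"
    image = camera.capture()  # hypothetical on-device camera call
    return multimodal_model.answer(image=image, prompt=question)
```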

A post shared by Mark Zuckerberg (@zuck) on Instagram.

When a user asks the Meta AI about their visual surroundings, the glasses capture a photo and send it to Meta’s cloud for processing. After this, the AI delivers an audio response directly through the glasses. Additionally, users can review their request, the corresponding image, and the AI’s response in the Meta View phone app, which pairs with the glasses.
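The round trip described above can be sketched as a simple capture-and-ask routine. The endpoint path, payload format, and device interfaces below are assumptions made for illustration; none of these names correspond to Meta’s actual services.

```python
# Illustrative sketch only: the endpoint, payload format, and device
# interfaces below are assumptions, not Meta's actual APIs.
import base64

def ask_about_surroundings(question, camera, cloud_client, speaker, app_history):
    """Capture a photo, send it with the question for cloud processing,
    speak the answer through the glasses, and log the exchange for the app."""
    photo = camera.capture_jpeg()  # capture what the wearer is looking at
    payload = {
        "prompt": question,
        "image_b64": base64.b64encode(photo).decode(),  # photo travels with the request
    }
    answer = cloud_client.post("/multimodal/query", payload)  # hypothetical cloud call
    speaker.speak(answer)          # audio response delivered through the glasses
    app_history.append({           # reviewable later in the companion phone app
        "question": question,
        "image": photo,
        "answer": answer,
    })
    return answer
```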

At the moment, Meta is rolling out these new multimodal AI features to a limited number of users through an early access program, but it plans to make the features available to all users in the new year.

Challenges in the AI wearable sector

Meta’s Ray-Ban smart glasses seem to be a step in the right direction for AI wearables. However, the sector still has to overcome significant user experience challenges: although these devices can provide increased utility in certain aspects of life, they tend to complicate it in many other areas.

The Humane AI Pin, launched earlier this year by former Apple (NASDAQ: AAPL) employees, is a prime example. This screenless AI device is futuristic and innovative in design, but the number of steps required to accomplish simple tasks raises questions about its practicality. Performing something as basic as sending a text message involves so many steps that existing alternatives arguably provide a superior experience.

These issues are not unique to the Humane AI Pin; they represent a broader problem facing many AI wearables: finding practical use cases. Even the Meta Ray-Ban smart glasses are bound to face this obstacle. Their multimodal AI capabilities give them more utility than other AI wearables currently on the market, since they should reduce the time it takes to retrieve information and generate content, especially in private settings. In public settings, however, using the glasses could be an awkward and uncomfortable experience.

Imagine standing in line at a farmers market and asking your glasses what exotic fruit or vegetable you are looking at instead of simply asking the vendor. Talking to a device in public remains a largely unfamiliar and uncomfortable experience for many people, and it highlights the social awkwardness that voice-command AI wearables bring, particularly in public spaces. Addressing this discomfort will be crucial for the future success and acceptance of these devices.

The future of AI wearables

AI wearables are an emerging market. Innovations in the artificial intelligence space continue to enhance the capabilities of these devices, but there are many obstacles on their road to mass adoption. A significant challenge for the industry lies in identifying and establishing practical use cases that convince consumers and businesses to move away from existing alternatives toward these AI wearable products.

The key to widespread acceptance and success lies in developing a device that not only integrates advanced AI capabilities but also addresses real-world needs in a user-friendly manner. To get there, the industry must focus on creating wearables that blend seamlessly into daily life, offering tangible benefits without adding complexity or discomfort, especially in social settings.

For artificial intelligence (AI) to work within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek’s coverage on this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: Artificial intelligence needs blockchain


New to blockchain? Check out CoinGeek’s Blockchain for Beginners section, the ultimate resource guide to learn more about blockchain technology.