When Meta first launched its Ray-Ban smart glasses, there was one feature I was excited to try but couldn't. The promise of a multimodal AI device capable of answering questions based on what the user is looking at sounded like a dream wearable, but Meta wouldn't be rolling out that functionality to its $299 smart glasses until "next year." That idealized future may be closer than I anticipated.
Also: Meta’s $299 Ray-Ban smart glasses may be the most useful gadget I’ve tested all year
Today, the company is launching an early access program that will allow Ray-Ban Meta smart glasses users to test the new multimodal AI features, all of which leverage the onboard camera and microphones to process environmental data and provide contextual information, such as identifying what the user is looking at.
How it all works is rather straightforward. You start a Meta AI prompt by saying, “Hey Meta, take a look at this,” followed by the specifics. For example, “Hey Meta, take a look at this plate of food and tell me what ingredients were used.” To answer the question, the glasses capture an image of what’s in front of you and then break down the various subjects and elements with generative AI.
The functionality goes beyond the usual "What is this building?" or "What's the weather like today?" prompts, of course, as Meta CEO Mark Zuckerberg demoed in an Instagram Reel. In the video, Zuckerberg asks Meta AI, "Look and tell me what pants to wear with this shirt," as he holds up a rainbow-striped button-down. Not only does the voice assistant identify the apparel, but it also suggests pairing it with dark-washed jeans or solid-colored trousers. (The real question is whether tech CEOs actually wear outfits beyond monochromatic t-shirts and dark-colored pants.)
(Side note: Up until today, Meta AI on the Ray-Ban glasses had a knowledge cutoff of December 2022. According to Meta CTO Andrew Bosworth, the glasses now have access to real-time information thanks to Bing.)
Also: Meta rolls out its AI-powered image generator as a dedicated website
Only a small batch of users will receive the new update at first, as Meta plans to collect feedback and refine its upcoming AI features before the official release. To participate, update the Meta View app to the latest version, tap the gear icon in the bottom right of the menu bar, swipe down to "Early Access," and tap "Join Early Access."
I'm not seeing anything resembling an early access program in my Android and iOS apps, but you can bet that when the update arrives, I'll be quick to download it and start testing — because what was already one of the most useful tech gadgets I tested in 2023 is about to become even more useful.