Ask your glasses: Ray-Ban Meta Smart Glasses to double as your virtual tour guide

By admin | Last update: 12 March 2024
In fall 2023, the Ray-Ban Meta smart glasses, formerly known as Ray-Ban Stories, received a major upgrade. They gained a faster processor, better cameras, improved audio, and the ability to livestream to Facebook and Instagram. Meta AI is now on board as well. The recent version 2 update raised the bar further with better image quality, global volume control, and enhanced security features. And more updates are in progress: Meta CTO Andrew Bosworth shared in a post (via Engadget) that a new feature is arriving with the latest beta. It identifies landmarks and provides additional information about them, essentially serving as a virtual tour guide for travelers.

Bosworth showed off some sample photos, explaining why the Golden Gate Bridge has an orange tint (apparently it's easier to spot in fog), telling some stories about the iconic "painted ladies" homes, and shedding some light on Coit Tower in San Francisco. Below these snapshots, descriptions appeared, adding context to the visuals. Meanwhile, Mark Zuckerberg took to Instagram to show off the new glasses feature in a series of videos filmed in Montana. This time the glasses shifted gears, using audio to deliver a spoken overview of Big Sky Mountain and the history behind Roosevelt Arch. Zuckerberg also made a quirky request, asking Meta AI to explain how snow forms in a primitive, caveman-like way.
Meta previewed this feature at its Connect event last year, demonstrating "multimodal" capabilities that let the glasses answer questions based on your surroundings. The glasses pull in real-time information, with help from Bing Search.

The feature is similar to Google Lens, allowing users to "show" the AI items they see through the glasses and ask questions about them. Right now it's only available to users in Meta's early access program, but it will roll out to more users over time.