Ray-Ban Meta Smart Glasses have been a big success, and the company is continuing to expand the capabilities of these stylish tech shades, which include a camera and speakers. You'll soon get live translation, reminders, and more, along with a new clear style. Since these Ray-Bans can see and hear, Meta is leveraging the advanced AI capabilities of its new Llama 3.2 model to enable live translation. In a live demo, Meta founder Mark Zuckerberg spoke with Brandon Moreno, one speaking English and the other Spanish, while their Meta glasses translated for each person. Despite a small delay, about one to three seconds before the AI spoke the translation, it's a great addition for owners of these smart glasses.
Zuckerberg mentioned live translations of Spanish, French, and Italian, with more languages to follow. Another Meta AI improvement gives the glasses memory.
You’ll be able to ask your smart glasses to remind you where you parked or to pick up apples at the store. With Ray-Ban Meta Smart Glasses becoming more helpful in everyday life, you might find yourself reaching for them more often. The Ray-Ban smart glasses can already describe and answer questions about what you see, snapping a photo with the integrated camera.
Now the multimodal AI is gaining more practical capabilities. With the latest update, you can scan QR codes with your Ray-Ban Meta Smart Glasses and call phone numbers you see in print or on a billboard just by asking. Your stylish Ray-Bans will also give fashion advice.