🕶️ What’s New:
Meta has rolled out a major update to its Ray-Ban Meta Smart Glasses, turning them into true multimodal AI assistants. The glasses now combine computer vision and voice input to understand and respond to the world around you.
🔍 Multimodal Features:
- Look + Ask: You can now look at something, ask “What is this?”, and the glasses respond with an AI-generated answer.
- Scene Summarization: Get descriptions of your surroundings, including landmarks, objects, or even people (if permitted).
- Translation: Real-time visual + audio translation (e.g., read a sign in French and hear it in English).
- Shopping Help: Look at a product and get reviews, pricing info, or style tips.
🎧 Other Smart Capabilities:
- Hands-free photo/video capture with “Hey Meta, take a picture”
- Built-in speakers for music, calls, and AI responses
- Integration with Instagram and Facebook Stories
🧠 Powered by Meta AI:
- Uses Meta’s latest Llama 3 multimodal model
- AI processing is done both on-device and in the cloud
- Continual learning through user interaction (which Meta says is privacy-respecting)
⚖️ Privacy & Criticism:
- Concerns remain over facial recognition and passive video capture.
- Meta has added visible LED recording indicators and made sensitive features opt-in only to address these concerns.
📅 Availability:
- Free update rolling out in phases (June–July 2025)
- Available for all Ray-Ban Meta 2nd gen glasses
