Ray-Ban Meta smart glasses are a standard consumer product that would be worth considering for their basic features alone. Paired with your smartphone they offer hands-free calling and messaging, as well as a handy AI chatbot. But for blind and low-vision people they do even more: the camera in the glasses frame can take photos, and the AI chatbot can answer your questions about them, whether you just want a description of what's in front of you, printed or handwritten text read aloud, or to know which door number you are standing in front of in your hotel. All of this comes in the standard consumer package at no extra cost, and the glasses can be bought from around £300 online from Ray-Ban or at stores including Argos in the UK. There are other wearable products designed specifically for blind people that you may wish to consider, including Envision smart glasses and ARx Vision, but for many of us the Ray-Ban Meta glasses will be excellent value for money and a very effective tool. As I write this in May 2025, they are also still acquiring new features fast. Note that a promised device for connecting ARx Vision to iPhones was not yet available when I last checked, although that product does work with Android phones.
So what's so good about Ray-Ban Meta?
comfortable to wear and very like standard sunglasses
no trailing wires
good-quality speakers in the arms, by your ears, which play all phone sound including music and VoiceOver speech
a built-in microphone
the glasses work in conjunction with the Meta AI app on your phone, and the app has excellent training material which you can read on the phone
hands-free operation with "Hey Meta" commands
hands-free calling and messaging, including with WhatsApp or Messenger if you wish
commands prefixed with "Hey Meta look and..." will take a photo and answer questions about what is in front of you, for example "Hey Meta look and tell me if there are baked potatoes on this menu"
access to assistance from a human volunteer through Be My Eyes, or you can start a WhatsApp video call with a friend to request their assistance
as an alternative to human assistance over a video call, Meta started rolling out "Live AI" in the US and Canada in May 2025. This lets users ask the glasses questions about the live video feed from their glasses, though it will certainly drain the glasses battery relatively quickly. We'll probably have to wait a while for this in the UK; Meta roll-outs seem to be very slow and gradual, and certainly not a whole country at a time.
Be aware that:
the smartphone or tablet you pair the glasses with must be nearby
the glasses have a limited battery life, maybe 4 hours for what Meta describes as typical use, but they recharge in their case and the battery in the case holds up to 9 full charges of the glasses. I haven't yet managed to run my glasses battery down below 50% in a full morning or afternoon session, but I don't use them much for streaming music. There's an on/off switch in the left arm which can be used if you just want them to be sunglasses for a while, but they seem to consume little battery while idle
functions that use Meta AI, including all the visual features, require an internet connection through your smartphone, so the glasses will be less useful in areas of poor mobile reception where there is no good wi-fi for you to use
all AI systems can make mistakes; if you suspect the response is wrong, or it fails in some other way, try asking again
people nearby may be able to hear sound from the glasses speakers; you are in control of volume
when buying, you can't mix and match frame style, colour, fit and lens type and colour to get precisely what you want. You must select from the combinations which are currently available. Lenses available include clear, various colours and transitions which change from clear to a dark shade in strong UV light; prescription lenses are also available
Ray-Ban Meta smart glasses are a very successful product and many competitors can be expected, including Apple, though Apple is unlikely to enter the field any time soon, perhaps not even by the end of 2026. Many of the products under development are augmented reality glasses with the additional capability of overlaying visual information on your view of the world around you; although this may be accessible to some low-vision people, it certainly won't help those of us with severe sight impairment. If Apple finally delivers the more useful Siri it promised in summer 2024, its glasses may allow users to do a great deal with their phones hands free, but don't hold your breath. Google has claimed that it is close to introducing smart glasses for use with Android phones, which will perform all sorts of tasks hands free using Google's Gemini AI when they hit the market; Google is probably well ahead of Apple in the race to bring a product to market. It's going to be interesting, but don't let any of this stop you buying the current Ray-Ban offering
Meta is currently developing augmented reality glasses which can project an image near the bottom of the right eye's field of view. This is unlikely to be of much use to many visually impaired users. It is predicted that Meta will not discontinue the Ray-Ban product, and will hope that some Ray-Ban users choose to upgrade to the considerably more expensive AR glasses
And finally a few tips:
when using "Hey Meta look and read" you'll often get a summary; if you need the full text ask for it with "Hey Meta look and read the full text"
if you don't use VoiceOver and sometimes use a magnifier to read your phone screen, why not ask Meta to read it
the glasses can read dials and meters, but repeat the request to be more confident of the answer
speak at normal speed but try to speak clearly; I used to find that the glasses sometimes told me about the previous photo I'd taken in the supermarket when I asked them to look at something new. When I checked in the History tab of the Meta AI app I discovered that the glasses hadn't heard all or part of "look and". I now try to emphasise the word "look" and pause briefly before continuing with the word "and" when I'm in a noisy environment; that improves performance substantially.
mobile phone network connections are often poor in large buildings like supermarkets; if you want to use your glasses to help you shop, you'll probably find a free wi-fi network to sign up to; signing up can be fiddly but you only need do it once
if you can't get a good internet signal and need to read things, you could take out your phone and use Seeing AI, which doesn't need the internet for instant reading, though of course you aren't hands-free any more
if you don't like saying "Hey Meta" you can set a tap and hold on the touchpad in the right arm as an alternative, though of course that isn't hands free; the setting is in the glasses settings in the Meta AI app under Gestures. There's another setting in the glasses settings you might like: go to Meta AI in the glasses settings, then Hey Meta Preferences, and turn on the Customise switch; now set Respond Without Hey Meta. This keeps Meta AI listening for a few seconds after each of its responses, so that you can have a conversation with Meta AI without repeatedly saying "Hey Meta" or touching and holding the touchpad
more generally, take a tour of the glasses settings in the Meta AI app; there may be other things to help; for example, there are several alternative languages and voices you can set for Meta AI. My glasses were originally set to US English but UK English is available along with a limited choice of UK English voices including Judi Dench, Atlas and Clover. Unfortunately, in June 2025 Judi seems to speak American English with a British accent. She told me that my bathroom sink had two faucets.