‘When does this milk expire?’: GPT-4 is now helping blind people navigate everyday tasks


By Talib Visram

Lucy Edwards, 27, is legally blind. She lives an independent life in Birmingham, England, with her two guide dogs and her sighted fiancé. But when he’s not around, there are things she needs help with—often “rudimentary questions” about everyday tasks. It could be something as mundane as distinguishing between two cans of beans, so that she doesn’t end up putting baked beans in her chili for dinner.

Since 2015, Be My Eyes has been helping blind and low-vision people like Edwards navigate these routines by connecting them to volunteers on video calls who can describe what’s in front of them or walk them through a process. But some users are uncomfortable relying on other people, and try to carry on without asking for help. Now, a small number of users are trialing a new version of the app that integrates GPT-4. Testers are finding the level of detail and speed impressive; the technology could be a route to greater independence, and one of the more promising uses of the AI model so far.

In its original form, the voice-assisted Be My Eyes app lets users call one of the more than 6 million sighted volunteers in its database—across 150 countries and 180 languages—at any time, for free. Volunteers can help a person determine what’s in their fridge and when items expire, distinguish colors, or provide navigation. “It’s this lovely merger of technology and human kindness,” says CEO Mike Buckley.

Volunteers sign up because they’re willing to help. They receive calls during the day in their time zones, and can pick up if they’re available. But the company, founded by a visually impaired Danish furniture craftsman, Hans Jørgen Wiberg, found that blind users weren’t using the service to its full potential, for various reasons. A company survey found that some people believe calling would take away from those who need the volunteers more (even though there are plenty); others feel discomfort calling a stranger to ask about something personal; and many don’t like the constant reliance on other people.

Then came GPT-4, the newest version of OpenAI’s artificial intelligence model, which can respond rapidly to a wide range of human prompts, whether answering general-knowledge questions or writing creative copy, from song lyrics to your next cover letter.

Crucially for Be My Eyes, GPT-4 can also read and understand photos. The app has integrated the model to create a new version of the service, the Virtual Volunteer. Since February, a number of users have been testing it in beta (Buckley won’t say exactly how many, but it’s fewer than a hundred).

Be My Eyes says initial testers have been impressed by the tool’s vision-recognition and descriptive capabilities, its speed, and its layers of analytical detail. For example, show it a photo of the inside of your fridge and it can not only tell you what’s in there, but also suggest what to cook with those ingredients, and then provide full recipes.

Edwards, a content creator and disability advocate (and a paid ambassador for Be My Eyes), is one of the beta testers. She lost her sight at age 17, in 2013, due to a rare genetic condition, and started using Be My Eyes two years later.

[Embedded TikTok from @lucyedwards: “AD I’m blind, can AI tell me where to go on the London tube? @bemyeyesapp #ChatGPT #OpenAI #AI #ArtificialIntelligence #HugeIfTrue”]

Edwards says she is astounded by the speed of the GPT-4 version. She recently took a photo of the Heathrow Underground station, and within five seconds the app gave her detailed instructions on how to get to central London. “That is quicker than any sighted person on the planet,” she says.

Another advantage GPT-4 has over a human is that it eliminates subjective opinion. If she shows it a photo of her eyeshadow palette, it tells her the precise, true colors, whereas humans—like her partner and friends—may debate over what to call a particular shade.


The GPT-4 version will remain free when it rolls out publicly, though the company doesn’t yet have a date for that. The main concern is safety, and ensuring that users exercise caution: the tool should be an addition to the other digital and physical aids that blind people employ, not a replacement. “This doesn’t replace a white cane, it doesn’t replace a guide dog,” Buckley says. “Let’s not use this to cross the street.”

And, in its early stages, the Virtual Volunteer is still sometimes incorrect. It can generate wrong outputs for an image (known in the industry as “hallucinations”), misidentifying a toaster as a slow cooker, or a Sonos speaker as an air purifier. But the model is constantly being fine-tuned and improved, Buckley says. And, importantly, the app always suggests a human call as an alternative; Buckley is adamant the company is not doing away with its volunteer community.

Still, Edwards doesn’t miss the original version. While human connection is nice, it’s something she already has in her life. The GPT version feels like a tool she’ll take more advantage of, and she’s excited to see how the tech continues to evolve. “The more AI learns, the more autonomous I’m going to be as a blind person,” she says. “I thought I gave that up 10 years ago. I thought I would never get that back.”

Correction: An earlier version of this story misstated Edwards’ relationship to Be My Eyes. She is a paid ambassador, not an investor.

Fast Company
