The ubiquitous generative AI chatbot is already answering daily queries for visually impaired people with more detail and efficiency than human volunteers.
While the recreational uses of AI continue to make headlines every week, the technology’s ability to make day-to-day living more accessible is flying under the radar.
Largely regarded as the benchmark for quality in tech’s latest race, OpenAI’s GPT-4 can now accurately interpret image prompts as well as text.
This upgrade instantly drew commercial interest from companies focused on accessibility, such as Be My Eyes. Since 2012, the company has helped people with visual impairments navigate their daily routines by connecting callers with sighted volunteers through video chat.
In the majority of cases, the caller asks a volunteer to describe something simple in front of them or to explain a process that isn’t accessible through other means.
It’s a simple yet creative way of making life more convenient, described by its CEO Mike Buckley as a ‘lovely merger of technology and human kindness.’ Could AI help to refine the process even further in the near future?
A beta version of Be My Eyes, trialled by a small pool of users, is testing an integration with GPT-4, and the early results are extremely positive.
Several participants have praised the detail and speed of the responses, and expressed a newfound, or rediscovered, sense of independence that comes from solving problems without the assistance of another person.