The technologically advanced world that we live in seems to be constantly expanding – and three individuals based in northern California are working to directly contribute to that growth. Alberto Rizzolo, Marita Cheng, and Simon Edwardsson have created a smartphone application called Aipoly Vision. Aipoly Vision is intended for use by people with visual impairments and people who are color blind.
Using a neural network built into the app, Aipoly Vision interprets input from the phone’s camera and describes aloud what it is viewing. Simply put, when a person points the phone at a nearby object, Aipoly Vision verbally identifies the object.
Aipoly Vision is also capable of identifying over 900 colors through the same method. The application does not require an internet connection and can recognize objects within three seconds. As the company has pointed out, the app could be particularly helpful in “situations where using your hands to feel objects is not ideal,” such as in the bathroom.
In the future, the app will be able to describe the settings and surroundings of objects as well – for example, “a dog near a lamp post.” According to the company’s website, Aipoly helps “the blind and visually impaired quickly identify objects using affordable, cutting-edge technology,” and is endorsed by multiple prominent individuals in the community of people who are blind, including Rob Turner, the President of the Silicon Valley Council of the Blind, and Lisa Maria Carvalho, the Vice President of the Lighthouse Center for the Blind.
Aipoly Vision was released to the public on January 3, 2016, and is free to download on iTunes.
Check out the video below to hear what several people with visual impairments think of Aipoly Vision!
This video may begin with a commercial which was not chosen by or for the benefit of Rooted in Rights.