Colorful conversations are the game plan for an app for the visually impaired developed by a group of research assistants.
The group will launch their app EyeCYou, which is designed to enhance conversations for people with low vision, late in the fall semester. Azmat Qureshi, computer science and engineering research assistant, said EyeCYou will help detect features someone with vision impairments may not pick up on in conversations with others.
Low vision is a reduced level of vision that cannot be fully corrected with conventional glasses and is not the same as blindness, according to the Kellogg Eye Center at the University of Michigan.
“When we go and meet people, subconsciously we store a lot of information,” he said. “When I meet you, I will not concentrate on your shirt color or skin color but my eyes will grasp that information.”
Tosin Oluwadare, alumnus and former computer science and engineering research assistant, said knowing what someone looks like and what they're wearing can help people get more involved in discussions. He said our eyes naturally perceive this information, which can also help inform people of the personality of the individuals they're talking to.
Oluwadare said the app intends to provide the same context for people with vision impairments in a matter of seconds.
“It just makes it feel like I can actually see you,” he said.
The current prototype of the app is makeshift, Qureshi said. He said it works by running the application on a tablet, which processes images from a camera taped to a pair of glasses. He said the app identifies the gender, shirt color, skin complexion and age group of individuals captured by the camera. Although the prototype doesn't look pretty, he said, they just want to show people it can be used.
Back in April, the group went to the Dallas Lighthouse for the Blind to have people test the app for them, Oluwadare said. He said the feedback from respondents suggested they also wanted the ability to detect people’s facial expressions while conversing with them.
In scenarios where the sighted help the unsighted, assumptions can cause people to overlook what’s necessary, said Terry McManus, vice president for Rehabilitation & Community Initiatives at the Dallas Lighthouse for the Blind.
McManus said people with vision impairments can sometimes take offense at the subject of color, because they're being pressed about something that doesn't affect them. "It can be a frustrating, trivial thing," he said.
That is why knowing body language — when someone laughs, frowns or fidgets — would benefit conversations for the visually impaired, McManus said. He said this way someone would know if the person they’re talking to is uncomfortable or enjoying the conversation.
Lamar Upshaw, assistive technology specialist at the Dallas Lighthouse for the Blind, said he’s been blind since he was five and got to test out the app when the researchers visited. Although the app wouldn’t have significant meaning for him, it could help serve people with low vision, Upshaw said.
Oluwadare said his group, which includes research assistant Cameron Moreau, Qureshi and himself, is working to incorporate the testers' feedback into the app. He said they're currently working to make it compatible with all mobile devices, one of the reasons it may not be ready until the end of the fall semester.
When the app launches, it will be free, but additional features, such as detecting hair color and length, may come at a charge, Oluwadare said.