Researchers at the University of Georgia have developed a prototype wearable AI system in the form of a voice-activated backpack that allows visually impaired people to navigate the world around them without a guide dog, according to Forbes.
The ground-breaking system is embedded within a backpack and connected to several sensors in a vest. Audio notifications are relayed to the visually impaired wearer about the environment around them via a Bluetooth earpiece.
The revolutionary device is the brainchild of Jagadish K. Mahendran, a Computer Vision/Artificial Intelligence Engineer at the University of Georgia's Institute for Artificial Intelligence.
In one demonstration, the vision system backpack uses a computer-vision algorithm to identify objects in a real-life environment, telling the wearer that a stop sign and a crosswalk are ahead. In another example, the device alerts the wearer to an approaching curb and how far away it is.
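To illustrate the kind of logic such a system might use to turn detections into spoken notifications, here is a minimal sketch. The `Detection` structure, class names, and thresholds are illustrative assumptions, not details of Mahendran's actual implementation.

```python
# Hypothetical sketch: converting object detections into short audio alerts.
# All names, labels, and thresholds here are assumptions for illustration,
# not taken from the actual vision system backpack.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "stop sign", "crosswalk", "curb"
    distance_ft: float  # estimated distance, e.g. from a depth camera
    confidence: float   # detector confidence, 0.0-1.0

def make_alerts(detections, min_confidence=0.6, max_distance_ft=30.0):
    """Filter detections by confidence and range, then phrase them nearest-first."""
    relevant = [
        d for d in detections
        if d.confidence >= min_confidence and d.distance_ft <= max_distance_ft
    ]
    relevant.sort(key=lambda d: d.distance_ft)  # announce the closest hazard first
    return [f"{d.label}, {d.distance_ft:.0f} feet ahead" for d in relevant]

alerts = make_alerts([
    Detection("stop sign", 22.0, 0.91),
    Detection("crosswalk", 18.0, 0.85),
    Detection("car", 80.0, 0.95),      # too far away to announce
    Detection("pothole", 10.0, 0.30),  # confidence too low
])
print(alerts)  # ['crosswalk, 18 feet ahead', 'stop sign, 22 feet ahead']
```

In a real system, each phrase would then be sent to a text-to-speech engine and played through the Bluetooth earpiece; sorting by distance ensures the most urgent obstacle is announced first.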
Mahendran said his team was motivated by earlier work on robotic vision. “Last year, when I met up with a visually impaired friend, I was struck by the irony that while I have been teaching robots to see, there are many people who cannot see and need help,” he said.
The vision system backpack is built around a high-tech Intel processing chip that outperforms the hardware used by other visual-assistance systems. Mahendran said rapid advances in the technology have slimmed down the prototype's weight, making it simple and wearable for anyone with vision issues.
As Mahendran explains, “Without these neural compute sticks from Intel, the wearer would be looking at carrying something like five graphics processing units in the backpack. Each one weighing around a quarter of a pound and that’s not to mention all the fans and power sources that would be needed.”
“It would be unaffordable and impractical for users,” he continues.
“However, thanks to these neural compute sticks and the Intel Movidius processor, this huge GPU capacity is being compressed into a USB stick-sized hardware, so you can just plug it anywhere and you can run these complex, deep learning models.
“This is why the solution that we have developed is so simple because we can just put everything in a small backpack, and it’s portable, cheap and has a very simple form factor.”
The added ingenuity is that, as configured, the system doesn't even look like a piece of assistive technology at all. – Forbes
As the Fourth Industrial Revolution commences, guide dogs, commonly referred to as service dogs, may finally be relieved of their duties sometime this decade, a prospect that would please the vegan population, who view service dogs as a form of animal exploitation.