An Italian technology startup called Eyra Ltd will soon release a wearable device known as the Horus that allows the visually impaired to explore public space beyond the limitations of the walking cane. "The white cane is not enough because it only solves the issue of detecting obstacles touching the ground," said Saverio Murgia, CEO and co-founder of Eyra. "It can never detect obstacles like tree branches or cars parked near crosswalks."
This new headset-and-pocket-computer combination, which hits the market in January, is meant to act like an audio personal assistant: it reads out words from books and public signs, detects obstacles and offers facial recognition. Worn like a pair of headphones, the headset houses two cameras that capture 3D images of the environment and send them to a smartphone-sized pocket computer, which translates them into sound.
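The capture-process-speak loop described above can be sketched in a few lines. This is an illustrative outline only, not Eyra's actual software: every class and function name here (`StereoFrame`, `estimate_depth`, `describe`) is a hypothetical stand-in, and the depth value is a fixed placeholder where a real stereo-vision model would run.

```python
# Illustrative sketch of a Horus-style local pipeline: stereo capture ->
# on-device analysis -> spoken feedback. All names are hypothetical.

from dataclasses import dataclass

@dataclass
class StereoFrame:
    left: bytes   # raw image from the left headset camera
    right: bytes  # raw image from the right headset camera

def estimate_depth(frame: StereoFrame) -> float:
    """Placeholder for GPU-accelerated stereo depth estimation.
    Here it simply returns a fixed distance in meters."""
    return 2.5

def describe(frame: StereoFrame, depth_m: float) -> str:
    """Placeholder for the vision models that detect obstacles, text
    and faces; returns a verbal message for the user."""
    return f"obstacle ahead, about {depth_m:.1f} meters"

def process_frame(frame: StereoFrame) -> str:
    # Every step runs locally on the pocket computer; no network needed.
    depth = estimate_depth(frame)
    return describe(frame, depth)

print(process_frame(StereoFrame(left=b"", right=b"")))
```

The key design point the article emphasizes is that the whole loop runs on-device, which is why no internet connection is required.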
It is powered by the NVIDIA Tegra K1, a popular mobile processor (a variant will be used in Nintendo's upcoming Switch console). The chip also drives display units inside Audi cars and electric sedans. "It has GPU-accelerated computer vision, deep learning and sensors that process, analyze and describe images from two cameras," said Murgia.
The Horus works by scanning the area around the visually impaired person and translating visual information into verbal messages. The headset's battery lasts roughly 12 hours, and the device is currently offered in three languages: Italian, English and Japanese. "It can detect text on curved surfaces, virtually unwarp it and then translate that text into sound that the user can listen to," said Murgia. "There is no need for an internet connection as every step of the computation is performed locally."
The development of Horus began in 2014, when Murgia and his colleague Luca Nardelli were working as computer vision researchers at the University of Genoa. "One day, we met a blind person on the street asking for help to reach the bus stop from the main train station in Genoa," said Murgia. "He was having issues because street construction disrupted his usual route to the bus stop. Talking with him, we discovered a world of issues caused by the fact that the world is not designed for blind people, and we realized that we could bring algorithms and technology from robotics to a wearable device that can improve the life of millions of people."
The wearable helps the visually impaired avoid obstacles with an audible, 3D soundscape where each object in front of the user corresponds to a sound coming from the place where the object is. "If there is a pole or a wall on the right, the user will hear a sound coming from the right," said Murgia. "It is similar to parking sensors: there is a beeping sound coming from either the front, left or right side and the closer the obstacle is the higher the frequency becomes."
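The parking-sensor analogy can be made concrete with a small mapping function: the obstacle's direction selects the audio channel, and its distance sets the beep repetition rate, rising as the obstacle gets closer. The angle cutoffs, distance range and beep rates below are illustrative assumptions, not values from Eyra.

```python
# Sketch of the parking-sensor-style mapping described above: direction
# chooses the stereo channel, distance sets the beep rate (closer obstacle
# -> faster beeps). All thresholds here are assumed for illustration.

def beep_for_obstacle(azimuth_deg: float, distance_m: float):
    """Map an obstacle to (channel, beeps_per_second).

    azimuth_deg: angle relative to straight ahead; negative = left.
    distance_m: distance to the obstacle, clamped to [0.3, 5.0] meters.
    """
    if azimuth_deg < -15:
        channel = "left"
    elif azimuth_deg > 15:
        channel = "right"
    else:
        channel = "front"
    d = min(max(distance_m, 0.3), 5.0)
    # Linear ramp: 5.0 m away -> 1 beep/s, 0.3 m away -> 10 beeps/s.
    rate = 1.0 + 9.0 * (5.0 - d) / (5.0 - 0.3)
    return channel, round(rate, 1)

print(beep_for_obstacle(30.0, 0.5))  # a pole on the right, very close
print(beep_for_obstacle(0.0, 4.0))   # a wall straight ahead, still far
```

A real spatial-audio implementation would render 3D sound rather than picking a discrete channel, but the mapping idea is the same.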
The facial recognition feature must be trained individually by each user. "The first time it sees a person, the user has to teach that person to the device, taking pictures of them and then saying their name out loud," said Murgia. From that moment on, Horus is able to recognize that person in real time, even at a distance. "Horus continuously detects all the faces in front of the person and checks them against all the people who have been taught to the device."
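The enroll-then-recognize flow Murgia describes can be sketched with a tiny gallery matcher. Real face recognizers compare learned face embeddings; in this hedged illustration each "embedding" is just a plain vector, matching is nearest cosine similarity over the enrolled gallery, and the class name, threshold and vectors are all assumptions.

```python
# Illustrative enroll-then-recognize sketch: "teaching" a person stores
# their face embeddings under a name; recognition compares a new face
# against every enrolled sample. Names and values are hypothetical.

import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

class FaceGallery:
    def __init__(self, threshold=0.8):
        self.people = {}          # name -> list of enrolled embeddings
        self.threshold = threshold

    def enroll(self, name, embedding):
        """'Teach' a person: store an embedding captured while the
        user says the person's name out loud."""
        self.people.setdefault(name, []).append(embedding)

    def recognize(self, embedding):
        """Check a detected face against everyone taught to the device."""
        best_name, best_score = None, self.threshold
        for name, samples in self.people.items():
            for sample in samples:
                score = cosine(embedding, sample)
                if score > best_score:
                    best_name, best_score = name, score
        return best_name  # None if nobody matches well enough

gallery = FaceGallery()
gallery.enroll("Luca", [0.9, 0.1, 0.2])
print(gallery.recognize([0.88, 0.12, 0.21]))  # prints "Luca"
print(gallery.recognize([0.0, 1.0, 0.0]))     # prints "None"
```

Because everything is computed on the pocket computer, the gallery never has to leave the device.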
When the English-language version of Horus goes on sale in January, it will cost roughly $2,000. But it still needs fine-tuning before its launch. "We are mostly improving the user interface, making it easy for everyone, even people who don't regularly use technology," said Murgia, who has tested the device with 100 visually impaired people since 2014. "Blind people always tell us that it will make a huge difference in their everyday life."