Wearables – Navigation in Footwear
Japanese start-up Ashirase is developing a navigation system integrated into shoes to help people with visual impairments walk. It consists of a smartphone app and a three-dimensional vibration device with a motion sensor that is placed inside the shoe. Based on the route set in the app, the system vibrates to give navigation instructions: to go straight, it vibrates at the front of the foot; to turn, it vibrates on the right or left side, matching the direction of the turn. Lower Austrian company Tec-Innovation, in collaboration with the Institute for Machine Vision and Display at Graz University of Technology, focuses on artificial intelligence in footwear. Its current prototype is a shoe with ultrasonic sensors at the toe and an integrated camera. The system uses machine-learning algorithms to recognize an obstacle-free, and thus hazard-free, walkable area in camera images taken from the foot's perspective.
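The mapping Ashirase describes, from route instruction to vibration zone, can be pictured roughly as follows. This is an illustrative sketch with assumed names (the instruction set, zone labels, and the vibrate() stub are hypothetical), not the company's firmware.

```python
from enum import Enum

class Instruction(Enum):
    STRAIGHT = "straight"
    TURN_LEFT = "turn_left"
    TURN_RIGHT = "turn_right"

# Assumed vibration zones in the shoe: front of the foot, left side, right side.
ZONE_FOR_INSTRUCTION = {
    Instruction.STRAIGHT: "front",
    Instruction.TURN_LEFT: "left",
    Instruction.TURN_RIGHT: "right",
}

def vibrate(zone: str, duration_ms: int = 300) -> None:
    """Stand-in for the actuator driver; here it only logs the pulse."""
    print(f"vibrating {zone} zone for {duration_ms} ms")

def signal(instruction: Instruction) -> None:
    """Translate a navigation instruction from the app into a vibration cue."""
    vibrate(ZONE_FOR_INSTRUCTION[instruction])

signal(Instruction.TURN_RIGHT)  # -> vibrating right zone for 300 ms
```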
Routago – AI for Visually Impaired Pedestrians
Together with its research partner, the Karlsruhe Institute of Technology (KIT), the start-up "Routago" has developed a technology platform that uses artificial intelligence to support blind and visually impaired people in road traffic. The navigation takes street sides, sidewalks, parks, and pedestrian zones into account and guides users to safe street crossings at traffic lights, crosswalks, pedestrian bridges, and underpasses. The system combines navigation, object recognition, environmental information, and individual route management; to do so, it augments existing geodata with pedestrian-specific routing information. Users hear the directions as voice output.
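How a pedestrian-specific routing layer might weight safe crossings can be illustrated with a simple cost function. The sketch below is purely hypothetical, with invented crossing categories and penalty values; it is not Routago's algorithm.

```python
from typing import Optional

# Hypothetical safety penalties, expressed as "extra meters", per crossing type.
CROSSING_PENALTY = {
    "traffic_light": 0.0,
    "bridge_or_underpass": 2.0,
    "crosswalk": 5.0,
    "uncontrolled": 60.0,   # strongly discouraged for blind pedestrians
}

def edge_cost(length_m: float, crossing_type: Optional[str] = None) -> float:
    """Cost of a sidewalk segment: walking distance plus a safety penalty."""
    penalty = CROSSING_PENALTY.get(crossing_type, 0.0) if crossing_type else 0.0
    return length_m + penalty

# A slightly longer detour over a signalized crossing beats the direct route:
print(edge_cost(80.0, "uncontrolled"))    # 140.0
print(edge_cost(120.0, "traffic_light"))  # 120.0
```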
NaviLens – A Digital Map for the Blind
Based in Murcia, Spain, NaviLens has developed a smartphone-based solution that makes bus travel, sightseeing, and shopping possible for people with impaired vision. At the heart of the application are special QR codes measuring around 20 square centimeters that can be recognized by smartphone camera from up to twelve meters away, including from different angles. The app recognizes the codes even without focusing. It downloads the linked information at lightning speed and outputs it to users as voice instructions. The technology is currently being used in transit systems in the cities of Barcelona, Madrid, and Murcia. New York and Los Angeles are currently testing the system in a subway station and a train station.
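The general pattern behind such tag-based guidance, scan a code, look up the linked content, and read it aloud, can be sketched as follows. NaviLens's tag format and back end are proprietary; the decoder, lookup URL, and speech call below are hypothetical stand-ins.

```python
import json
import urllib.request
from typing import Optional

def decode_tag(frame_bytes: bytes) -> Optional[str]:
    """Hypothetical decoder: return a tag ID if a code is visible in the frame."""
    # A real implementation would use a computer-vision library here.
    return "bus-stop-4711"

def fetch_linked_text(tag_id: str, language: str = "en") -> str:
    """Fetch the text linked to a tag from a hypothetical lookup service."""
    url = f"https://example.org/tags/{tag_id}?lang={language}"
    with urllib.request.urlopen(url) as response:
        return json.load(response)["text"]

def announce(text: str) -> None:
    """Stand-in for the platform's text-to-speech output."""
    print(f"[speaking] {text}")

# Typical flow once the camera delivers a frame:
# tag_id = decode_tag(frame)
# if tag_id:
#     announce(fetch_linked_text(tag_id))
```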
Indoor Navigation – everGuide from the Fraunhofer Institute FOKUS
The indoor navigation app "everGuide" from the Berlin-based Fraunhofer Institute FOKUS is designed to let users find their way safely through complex, sprawling buildings. To keep the app's building data up to date, the Fraunhofer researchers send out a robot with cameras and laser scanners to create an exact digital map. The smartphone plays the key role in navigation: its acceleration and angular-rate sensors help determine the user's location by recording increases and decreases in speed as well as changes in the device's orientation. Signs installed in the building provide further input in the form of special QR codes, which the smartphone's camera recognizes automatically and which also serve as position fixes. The navigation software on the smartphone merges these data sources and adjusts the route guidance accordingly. Field tests of the Fraunhofer app are currently under way in the House of Health and Family in Berlin, in the city hall of the North Rhine-Westphalian city of Solingen, and in the Foreigners' Registration Office in Cologne.
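The interplay of motion sensors and QR-code signs can be pictured as dead reckoning with occasional absolute corrections. The sketch below assumes a simple step-and-heading model with made-up numbers; it is not Fraunhofer's everGuide implementation.

```python
import math
from typing import Tuple

class IndoorPositioner:
    def __init__(self, start: Tuple[float, float]):
        self.x, self.y = start          # position in meters on the floor plan

    def on_step(self, heading_rad: float, step_length_m: float = 0.7) -> None:
        """Advance the estimate when the motion sensors detect a step."""
        self.x += step_length_m * math.cos(heading_rad)
        self.y += step_length_m * math.sin(heading_rad)

    def on_qr_fix(self, qr_position: Tuple[float, float]) -> None:
        """Snap the drifting estimate to the surveyed position of a scanned sign."""
        self.x, self.y = qr_position

pos = IndoorPositioner(start=(0.0, 0.0))
for _ in range(10):                      # walk roughly east for ten steps
    pos.on_step(heading_rad=0.0)
pos.on_qr_fix((7.2, 0.3))                # camera recognizes a sign; drift corrected
print(round(pos.x, 1), round(pos.y, 1))  # 7.2 0.3
```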
Peek Vision – Measuring Visual Acuity with Smartphones
Five years ago, the British company Peek Vision launched Peek Acuity, an app that is available in more than 190 countries as a certified medical aid for checking vision. Trained laypeople can use the Peek Acuity procedure to measure a patient's visual acuity with an ordinary smartphone and forward the data to doctors for remote diagnosis. To do this, the subject is shown the letter E in various orientations and sizes on the smartphone display – just like on a classic eye chart at the ophthalmologist's office. Sitting on a chair three meters from the screen with one eye covered, the subject points a finger in the direction the opening of the E faces. Once the test is complete, the app displays the calculated visual acuity.
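The arithmetic underlying such an acuity test is standard: the gap in a tumbling E is one fifth of the letter height, and acuity follows from the angle that gap subtends at the test distance. The sketch below shows that logMAR calculation in general terms; Peek Acuity's exact scoring procedure may differ.

```python
import math

def logmar(letter_height_mm: float, distance_m: float) -> float:
    """LogMAR for the smallest E read correctly at the given distance."""
    gap_mm = letter_height_mm / 5.0                     # stroke/gap width
    angle_rad = math.atan(gap_mm / 1000.0 / distance_m)
    angle_arcmin = math.degrees(angle_rad) * 60.0       # minutes of arc
    return math.log10(angle_arcmin)                     # gap of 1 arcmin -> logMAR 0

# An 8.7 mm tall E read correctly at 3 m subtends a gap of about 2 arcmin:
print(round(logmar(8.7, 3.0), 2))   # ~0.30, i.e. Snellen 6/12 (20/40)
```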