The Feel the View prototype uses haptic feedback and artificial intelligence to let blind and visually impaired passengers experience the view out the car window.

First, a camera takes a picture of the passing scenery and converts it to grayscale. Each shade of gray is then mapped to one of 255 distinct vibration intensities on the window glass. By placing their hands on the window, blind and visually impaired passengers can “read” these vibrations much as they would Braille text. An AI connected to the vehicle’s audio system provides auditory cues that further describe what is in the image and help the user put it into context.
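The shade-to-vibration mapping described above can be sketched in a few lines. The scaling below is a hypothetical illustration (assuming darker pixels produce stronger vibration, with levels 1–255); the actual mapping used in the prototype has not been published.

```python
def gray_to_intensity(gray: int) -> int:
    """Map an 8-bit grayscale value (0-255) to one of 255 vibration
    intensity levels (1-255).

    This inversion is an assumption for illustration: black (0) maps
    to the strongest level (255), white (255) to the weakest (1).
    """
    if not 0 <= gray <= 255:
        raise ValueError("grayscale value must be in 0-255")
    return max(1, 255 - gray)

# Example: one row of pixels from a captured grayscale image.
row = [0, 64, 128, 255]
print([gray_to_intensity(g) for g in row])  # → [255, 191, 127, 1]
```

In a full system, each region of the window glass would be driven at the intensity computed for the corresponding region of the image, so a hand sweeping across the glass traces the light and dark areas of the scene.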