Event Dates: June 11, 2014 - 4:00pm
The seminar will cover my Ph.D. work on an interactive system that aims to give visually impaired people direct perceptual access to images via an acoustic signal. The user actively explores the image on a touch screen or touch pad and receives auditory feedback about the image content at the current finger position. Computer vision and machine learning approaches are harnessed to pre-evaluate the image and convey both low-level information, such as color, edges, and roughness, and high-level information, such as objects recognized at the user's position. Experiments show that congenitally blind participants could use such a system to successfully interpret and understand whole scenes.
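The exploration loop described above, touch position in and sound out, can be sketched roughly as follows. This is a minimal illustrative assumption, not the system's actual sonification: the function name and the brightness-to-pitch mapping are hypothetical stand-ins for the low-level feedback.

```python
# Hypothetical sketch of position-dependent auditory feedback.
# The brightness-to-pitch and color-to-pan mappings below are
# illustrative assumptions, not the mappings used in the actual system.

def pixel_to_tone(image, x, y, f_min=200.0, f_max=2000.0):
    """Map the RGB pixel at touch position (x, y) to a tone.

    Brighter pixels produce higher pitches; the red/blue balance
    is mapped to stereo pan (-1 = left, +1 = right) as a crude
    stand-in for a hue cue.
    """
    r, g, b = image[y][x]
    brightness = (r + g + b) / (3 * 255)          # 0.0 .. 1.0
    freq = f_min + brightness * (f_max - f_min)   # linear pitch mapping
    pan = (r - b) / 255                           # -1.0 .. 1.0
    return freq, pan

# Toy 2x2 image: black, white, pure red, and pure blue pixels.
image = [[(0, 0, 0), (255, 255, 255)],
         [(255, 0, 0), (0, 0, 255)]]

print(pixel_to_tone(image, 0, 0))  # black pixel -> lowest pitch
print(pixel_to_tone(image, 1, 0))  # white pixel -> highest pitch
```

In an actual system this mapping would run continuously as the finger moves, with the resulting parameters driving a synthesizer, and higher-level cues (e.g. spoken object labels from a recognizer) layered on top.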