Source: International Business Times UK
By: Jason Murdock
In a heartfelt demo that rounded off this year's Microsoft Build conference keynote, software engineer Saqib Shaikh outlined an ongoing research project that uses machine learning and artificial intelligence to help blind and visually impaired people better 'see' the world around them.
London-based Shaikh, who has been blind since the age of seven, said that talking computer technology inspired him to develop the application, titled SeeingAI, which is built on Microsoft Intelligence APIs to translate real-world events into audio messages.
The application is intended to work on both smartphones and PivotHead smartglasses. The video demonstration, above, depicts Shaikh taking a picture with his glasses, which then describe to him exactly what they 'see' – from business meetings to teenagers skateboarding on the streets of London to a woman throwing a Frisbee in a park. Another scene demonstrates how the smartphone app uses a device's camera to take a picture of a menu and then translate the text into audio.
Continue reading at IBTimes.Co.UK