VizWiz is an iPhone application aimed at enabling blind people to recruit remote sighted workers to help them with visual problems in nearly real-time.

Users take a picture with their phone, speak a question, and then receive multiple spoken answers. We are using VizWiz as a tool to explore human-backed access technology: the idea that access technology would be more reliable and useful if humans could back up fragile (but fast and cheap) automatic approaches. With services like Mechanical Turk, social networks like Facebook and Twitter, and people connected at all times on their mobile devices, the human cloud is ready and waiting; we just need to figure out how to harness it to do useful work!
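The human-backed fallback idea can be sketched roughly as follows: try a fast, cheap automatic answerer first, and escalate to remote human workers only when its confidence is low. This is a minimal illustration, not VizWiz's actual implementation; every name here (`automatic_answer`, `ask_crowd`, the confidence threshold) is a hypothetical placeholder.

```python
# Sketch of "human-backed access technology": answer a visual question
# with a fast automatic recognizer when it is confident, and escalate
# to remote human workers otherwise. All function names are
# hypothetical stand-ins, not part of the real VizWiz system.

from dataclasses import dataclass

@dataclass
class Answer:
    text: str
    source: str        # "automatic" or "human"
    confidence: float  # 0.0 to 1.0

def automatic_answer(photo: bytes, question: str) -> Answer:
    # Stand-in for a cheap, fast, but fragile computer-vision model.
    return Answer(text="unknown", source="automatic", confidence=0.2)

def ask_crowd(photo: bytes, question: str, n_workers: int = 3) -> list:
    # Stand-in for posting the photo and question to a service like
    # Mechanical Turk and collecting several independent answers.
    return [Answer(text="worker %d answer" % i, source="human", confidence=1.0)
            for i in range(n_workers)]

def answer_question(photo: bytes, question: str,
                    threshold: float = 0.8) -> list:
    auto = automatic_answer(photo, question)
    if auto.confidence >= threshold:
        return [auto]                  # fast, cheap automatic path
    return ask_crowd(photo, question)  # reliable human backup

answers = answer_question(b"...", "What color is this shirt?")
```

Returning several independent human answers, rather than one, is what lets a blind user cross-check workers against each other.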

The captioned video below shows an early version of VizWiz being used by a blind person in our lab. We've since improved the interface and released it in a field deployment. Check out our UIST 2010 paper on VizWiz to learn more about that study, how we get answers back from real people in less than 30 seconds, and how you can use humans to prototype new interactions:

VizWiz: Nearly Real-time Answers to Visual Questions (2010).
Jeffrey P. Bigham, Chandrika Jayant, Hanjie Ji, Greg Little, Andrew Miller, Robert C. Miller, Robin Miller, Aubrey Tatarowicz, Brandyn White, Samuel White and Tom Yeh. In Proceedings of the ACM Symposium on User Interface Software and Technology (UIST 2010). New York, New York. To Appear.