MIT develops FingerReader, wearable device to assist in reading text

The Fluid Interfaces Group at the MIT Media Lab has developed the FingerReader, a wearable device that assists people with limited vision and can also aid in language learning.

FingerReader uses a camera housed inside its ring-shaped body to line up words and read them aloud to the user. A demonstration video shows a woman selecting a book by running her finger along its spine and listening as the title is read.

As the user moves a finger underneath a line of text, the words are read aloud, and a small vibration from the device's haptic actuators alerts the user to move down to the next line. The user can control the reading speed and reread lines for deeper understanding.
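That description suggests a simple control loop: capture frames from the fingertip camera, extract the words near the finger, speak them, and trigger a haptic cue at the end of a line. The FingerReader's actual software is not described in detail here, so the sketch below is only an illustration; the camera, OCR, text-to-speech, and vibration interfaces (capture_frame, extract_words_near_finger, speak, vibrate) are hypothetical placeholders.

```python
import time

def read_along_finger(camera, ocr, tts, haptics, poll_interval=0.1):
    """Hypothetical sketch of a FingerReader-style reading loop:
    follow the fingertip, speak nearby words, and buzz at line ends."""
    spoken = set()  # avoid repeating words already read on the current line
    while True:
        frame = camera.capture_frame()                              # assumed camera API
        words, end_of_line = ocr.extract_words_near_finger(frame)   # assumed OCR API
        for word in words:
            if word.id not in spoken:
                tts.speak(word.text)                                # assumed text-to-speech API
                spoken.add(word.id)
        if end_of_line:
            haptics.vibrate()    # cue the user to move down to the next line
            spoken.clear()
        time.sleep(poll_interval)  # polling rate stands in for user-controlled pacing
```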

A FAQ on the Fluid Interfaces Group website notes that almost 3% of the population is visually impaired. This group is the device's primary potential market, along with children and people learning a new language.

The FAQ also compares the FingerReader with other text-reading devices, citing its ability to read text almost immediately as a major benefit. Another advantage is that MIT's algorithm reads multiple words at a time rather than a single word.

The website states in several places that this model was built solely as a proof of concept and is not yet in development as a consumer product. The model in the video consists of a 3D-printed housing with its own camera, text-to-audio converter and power supply.

Roy Shilkrot, Jochen Huber, Connie Liu, Suranga Nanayakkara and Pattie Maes will present their paper, "FingerReader: A Wearable Device to Support Text Reading on the Go", at the 2014 Conference on Human Factors in Computing Systems.


http://fluid.media.mit.edu/projects/fingerreader