Glasses translate languages into subtitles on the fly

Learning a new language can be tough, but what if you could just look at someone speaking a foreign language and have a pair of glasses translate their speech into subtitles in a language you do understand? That would change everything, wouldn’t it?
 
In another DIY spectacular, William Powell returns with his augmented reality glasses, but this time rather than beat Google’s Project Glass at its own game, he’s upped the ante with a headset that can display subtitles of conversations spoken in foreign languages.
 
Powell’s entire translation setup doesn’t use any fancy state-of-the-art equipment. It’s made from parts you can go out and buy yourself: two Raspberry Pis running Debian Squeeze, a Vuzix 1200 Star, a Jawbone mic, a headset mic, a TV, an iPhone, an iPad and an Asus Transformer. See, nothing special.
 
The person wearing the glasses uses the Vuzix 1200 Star, which connects to the S-Video output of the first Raspberry Pi, along with a Jawbone Bluetooth microphone paired to a device such as a smartphone or tablet to provide a clean, noise-cancelled audio feed. The microphone streams whatever Powell says, and whatever it picks up around him, across the network. That speech is then recognised and passed through Microsoft’s translation API, with a caching layer to improve performance on regularly used statements. This API round trip is the biggest source of delay in the subtitles. Once translated, the server passes the text back, where it is picked up by the Raspberry Pi driving the TV and the glasses’ displays. Elizabeth, his conversation partner, uses a headset mic, but she could use her own Raspberry Pi, glasses and Jawbone microphone to get the same experience Powell does.
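The caching layer mentioned above is the interesting performance trick: since the translation-service round trip is the biggest source of delay, repeated phrases can be answered instantly from a local cache. Here is a minimal sketch of that idea in Python. The `remote_translate` callable is a hypothetical stand-in for whatever client actually talks to Microsoft’s translation API; Powell’s real implementation isn’t published, so the names and normalisation strategy here are assumptions.

```python
class CachingTranslator:
    """Sketch of a caching wrapper around a slow translation service.

    remote_translate(text, target_lang) -> translated string is a
    hypothetical stand-in for the real API client.
    """

    def __init__(self, remote_translate):
        self.remote_translate = remote_translate
        self.cache = {}          # (normalised text, lang) -> translation
        self.remote_calls = 0    # counts slow round trips, for illustration

    def translate(self, text, target_lang):
        # Normalise lightly so "Hello" and "hello " share a cache entry.
        key = (text.strip().lower(), target_lang)
        if key not in self.cache:
            self.remote_calls += 1
            self.cache[key] = self.remote_translate(text, target_lang)
        return self.cache[key]
```

In a setup like Powell’s, only the first utterance of a common phrase would pay the full network delay; every repeat would come straight back from memory, keeping the subtitles closer to real time.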