2013, The Dawn Of Wearable Computing?

Sooner rather than later you’ll have a computer attached to your face, and for some, it’ll happen as early as next year. Why? Because the era of wearable computing is dawning as startups and established tech companies focus their efforts on designing eyewear that merges the digital and physical worlds right before your eyes.
 
Whether you wear this headgear in the comfort of your living room or as you walk around in daily life, during your leisure time or at all times, for work or for play, the inescapable truth is that computers are taking the next logical step in their evolution from big, chunky boxes to smaller wearable forms, opening up new ways to be productive, social, and entertained. This window into connected life will take one of two forms: augmented reality, which overlays a digital interface on the physical world, or virtual reality, which immerses you completely in a synthetic world.
 
The key development in 2013 will be computers molded to human anatomy (finally!).
 
Around the time IBM brought personal computers into the mainstream three decades ago, the concept of wearable computing emerged, popularized in part by the 1983 film Brainstorm, which featured a massive helmet device capable of capturing video and recording human sensations. Then, in the 1990s, PCs became far more capable and connected through the Internet, and high-end notebooks (laptops) hitting the market held out the promise of greater mobility.
 
But three things kept most computers firmly planted on desktops everywhere: dependence on the electrical grid for power, the need for Ethernet cables to connect to a network, and a clunky form factor that had changed little since the first PCs.
 
Today, battery technology lets computers run for increasingly long stretches away from the grid. WiFi and other wireless technologies have effectively cut the Ethernet umbilical cord, allowing mobile computing to become widespread. But advances in electronics and miniaturization have yet to free computers from their recognizable rectangular forms. Even smartphones mimic the black-brick shapes of their monolithic predecessors.
 
It’s time for computers to integrate with biology, and there’s no better place to start than with the eyes.