Google announced a while back that they’re planning to get their new smart glasses on the market by the end of this year. A month or so ago, they even came out with a demo video, depicting a day in the life of the average person of the very near future. Almost magically, it seemed, little pictures would pop up around your face and show you which way to turn, or where the music section is in the Strand, since it’s really tough to navigate your way through a bookstore on your own.
Recently, though, it was revealed that the Google Glasses prototypes don’t actually augment reality as much as they appear to in the video. Instead of a full-view interface, the glasses merely show information slightly above the wearer’s sight line, “where the edge of an umbrella might be.”
Information showing up slightly above a person’s normal viewpoint actually seems a lot more sensible. Most people are probably unprepared for such a major shift toward constant interaction between technology and the environment. Since we’re used to picking up a phone or some other device at will, it would be nice to still have a choice about whether to see the information, though with the glasses, it would be as easy as glancing upward. And they make picture-taking a whole lot easier, as demonstrated by Google X Lab founder Sebastian Thrun’s awesome photo of himself and his son:
Despite the toned-down interface, the cool factor of these things has not diminished, as these very artsy-looking snapshots of Google employees show. If only they didn’t look like giant metal unibrows…