Sixth Sense – Turning the Physical World into a Computer Interface
So you are at the supermarket. You are standing in the aisle that features a gazillion different types of toilet paper wondering which product you should buy.
If you were home, you could Google-search each brand to see which one might be the most ecologically sound to purchase. Of course, if you were so inclined, you could also whip out your cell phone, get online and start the process right at the store.
But as amazing as Google is as a search engine, and as amazing as the World Wide Web is as a data storage system, the idea of researching which toilet paper is the most ecologically sound purchase is currently far too cumbersome to act on while standing in the aisle.
However, not too surprisingly, some folks have begun asking: is there a way to make such information available to the shopper, letting us tap the available data via that cell phone without ever taking the device out of our pocket? And could the data stored on the web be transferred, via the cell phone, so that it appears on the very package you are considering purchasing?
Pattie Maes is well known for her contributions to media technology, having been a key architect of the concept called “collaborative filtering” (the principles behind music recommendations at services like Pandora.com). An associate professor in MIT’s Program in Media Arts and Sciences, Maes founded and currently directs the Media Lab’s Fluid Interfaces group. Pranav Mistry, a PhD student within the Fluid Interfaces group, is deemed the genius behind this new concept called Sixth Sense.
Combining two electronic devices already readily available to most people, a camera and a cell phone, with two that are not currently as ‘en vogue,’ a portable projector and a mirror, these researchers have built a prototype device that plugs into the net in an entirely new way. The ingenious, if not-so-chic, wearable device truly invites us to rethink the ways in which humans and computers interact.
With Sixth Sense, “the computer is no longer a distinct object, but a source of intelligence that’s embedded in our environment.” Then “by outfitting ourselves with digital accessories,” we are able to “continually learn from (and teach) our surroundings.”
The concept essentially turns your entire world into a computer, building on the five natural senses we already use to evaluate our surroundings. In doing so, it adds one more, and arguably the most powerful, evaluator: the data stored on the World Wide Web.
Digital World Meets the Physical World
Most noticeably, the device is the first attempt to link connected digital devices directly to the physical world. Instead of information being confined to paper or to a screen connected to a computer, every physical object has the potential to become a computer interface.
The projector and the camera are both connected to whatever mobile computing device might be in the user’s pocket. The projector projects the visual information onto whatever surface is available: a wall, a table, even your hand.
But the real genius lies in the ability to use natural hand gestures and arm movements to interact with the information being made available. The camera recognizes a user’s hand gestures through computer-vision-based techniques.
Imagine using your finger to sketch the @ symbol in the air and immediately having your email projected onto the wall in front of you. How about placing your fingers and hands into a framing gesture and having the camera snap a picture? Or drawing a circle on your wrist to have the device project an analog watch onto your arm?
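The vision side of such gesture tracking can be surprisingly simple: the prototype reportedly tracks brightly colored markers worn on the fingertips, locating each one in every camera frame. The sketch below (hypothetical code, not the actual Sixth Sense software) shows the core idea, finding the centroid of a colored marker in an RGB frame represented as nested lists:

```python
# Hypothetical sketch of color-marker tracking: locate a fingertip marker
# by finding the centroid of pixels near a target color in an RGB frame.

def find_marker(frame, target, tol=30):
    """Return the (row, col) centroid of pixels within `tol` of the
    `target` RGB color, or None if no pixel matches."""
    row_sum, col_sum, count = 0, 0, 0
    for r, row in enumerate(frame):
        for c, pixel in enumerate(row):
            # A pixel "matches" if every channel is within tolerance.
            if all(abs(p - t) <= tol for p, t in zip(pixel, target)):
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None
    return (row_sum / count, col_sum / count)

# Tiny synthetic 4x4 "frame": mostly black, with a red 2x2 marker patch.
BLACK, RED = (0, 0, 0), (255, 0, 0)
frame = [[BLACK] * 4 for _ in range(4)]
for r in (1, 2):
    for c in (1, 2):
        frame[r][c] = RED

print(find_marker(frame, RED))  # centroid of the red patch: (1.5, 1.5)
```

Tracking each marker's centroid from frame to frame yields a motion path; recognizing a gesture then reduces to matching that path against known shapes, such as a circle or an @ symbol.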
Perhaps, with the right gesture, even helping you decide which toilet paper to purchase.