Natural User Interfaces. The Microsoft Kinect is one such device. Introduced in 2010 alongside the Xbox 360 game console, it opened up a new level of interactive game playing, became the fastest-selling consumer electronics device, and was promptly hacked inside out by developers. Why is this device so significant, and what is in store for the future?
Three worlds now exist: the real world as we know it, the world of our imagination,
and the world we have created within our computers.
While each world offers near-endless possibilities, their limitation resides in the ability to efficiently and effectively
interact. [B.J. Rao]
How does your business, solution or product fit into this? More specifically, how does it enhance interaction between these worlds? When developing a new solution or product, it is worth asking exactly that question: anything that enhances the interaction between these worlds can be of great significance.
So what does this interaction relate to exactly? Why is it so significant?
Well, if I have an idea, I can write it down on paper: the process of converting an idea from the world within the mind into the real world as we know it. I could also use a keyboard and mouse to do the same on my computer. Once the idea is within the world of the computer, I can review it back into the real world on screen or print it out.
A more profound example is a microscope or telescope, which enhances your observation of the real world and carries it into the world of your mind. A 3D scanner takes something from the real world (in this case, the surface shape of an object or scene) and converts it into data in the world of your computer. That data can then be converted back into the real world via a 3D printer.
All seemingly trivial. Yes, but what if you discovered a new, better and more efficient means to interact with, for instance, your computer?
Technologies such as these have proven to be extremely significant. You can imagine the profound effect that the microscope had in science, technology and society as well as what 3D scanners and 3D printers are having in today’s market.
“Our senses enable us to perceive only a minute portion of the outside world.” [Nikola Tesla 1905]
A recent example of an emerging technology that greatly enhances the interaction between our world and the one within our computers is the Natural User Interface (NUI).
Most of us, by now, have experienced how intuitive it is to interact with our machines using a touch screen such as the one on the iPhone or iPad. The introduction of the mouse and the GUI decades ago had a very similar effect. NUIs take that a leap forward.
Natural User Interfaces allow you to interact with your computer using gestures. You don't touch anything, yet your machine can interpret finger, hand, arm and body movements as commands and data input. This is achieved via an NUI device. It won't take long before these devices can also respond to commands based on facial gestures and eye movement; they will even be able to detect your mood from facial expressions.
The impact of NUI technologies on the interaction between worlds? It will probably be greater than the impact the invention of the microscope had on science: an evolutionary step and a revolution in the interaction between humans and machines, one that will carry the market, industry and society as we know them into a new phase.
Today technology is prolific: it's in your face, and it's there almost all the time. Our phones, tablets, desktops, screens and webcams are everywhere and in all styles. In my view this will all change. Technology will steadily become submerged, assimilated into everything around us until we no longer see it directly. But it will still be there, in an even bigger way than it is now, and NUIs will form an important means of interacting with the technology around you.
So how do these Natural User Interface devices work?
Today's devices emit encoded light and reserve a 3D working volume. Within this volume the user's gestures can be tracked via a camera: distortion of the projected light pattern encodes the differences in depth across the scene. Put another way, these devices are equipped with a real-time 3D scanner. The depth data allows a person within the working volume to be easily isolated from the surrounding environment and tracked with high precision.
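The "easily isolated" part is simpler than it sounds once you have per-pixel depth. A minimal sketch of the idea, assuming only that the sensor delivers a frame as a 2D array of depth values in millimeters (the function name, thresholds and synthetic frame below are illustrative, not any vendor's API):

```python
import numpy as np

def isolate_user(depth_mm, near=500, far=2500):
    """Keep only pixels whose depth falls inside the working volume.

    depth_mm: 2D array of per-pixel depth in millimeters (0 = no reading).
    near/far: assumed bounds of the reserved working volume, in mm.
    Returns a boolean mask marking the foreground (the user).
    """
    valid = depth_mm > 0                           # sensor returned a reading
    in_volume = (depth_mm >= near) & (depth_mm <= far)
    return valid & in_volume

# Tiny synthetic frame: a background wall at 3000 mm, a "user" at 1200 mm.
frame = np.full((4, 6), 3000)
frame[1:3, 2:4] = 1200
mask = isolate_user(frame)
print(mask.sum())  # 4 foreground pixels
```

A color camera cannot make this separation nearly as cleanly, because foreground and background can share colors and lighting; with depth, the background simply falls outside the volume.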
Many NUI technologies have been developed, and some have been in production for several years already. Two of the most popular and most significant today are the Microsoft Kinect and the Asus Xtion. The core sensing technology of both devices is the same. The Xtion, however, is much less restricted, uses an open-source API, has enhanced hardware and costs less. Unfortunately it is far less well known.
Another recent vision technology to go into commercial production is the Lytro camera. Referred to as a "light field camera", the Lytro is a plenoptic camera. Conventional cameras view a scene through a single lens system onto a single imaging array. Plenoptic cameras use a multitude of imaging arrays and/or lens systems to view the same scene from a multitude of slightly different directions. They can produce a picture that can be refocused after capture, and the same information can be used to build a 3D picture of the scene. A plenoptic camera can therefore also serve as an NUI device, with one key strength: unlike the Kinect and Xtion, no active pattern needs to be projected. The result is a passive 3D NUI camera.
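The underlying principle is classic triangulation: a nearby object shifts more between two slightly different viewpoints than a distant one, and that shift (disparity) converts directly to depth. A minimal sketch of that relationship, using assumed illustrative numbers rather than the Lytro's actual processing pipeline:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    """Triangulate depth from the pixel shift between two viewpoints.

    Z = f * B / d: depth is inversely proportional to disparity,
    so nearer objects shift more between views.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# Assumed numbers: 600 px focal length, 10 mm baseline between views.
near = depth_from_disparity(12.0, 600.0, 10.0)   # large shift -> 500.0 mm
far = depth_from_disparity(2.0, 600.0, 10.0)     # small shift -> 3000.0 mm
```

A plenoptic camera captures many such slightly offset views in a single exposure, which is why one shot yields both refocusable images and a depth map, with no projected pattern required.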
But it should be made clear that it is not the technology that gets things done; it's the people. More specifically, it's the people who carry a vision for the placement and use of technology, and who are able to align the perception of others with that vision and with how the technology is used.
A high-tech living room of the future. Maximum minimalism? No, I won’t give up my iPhone just yet.