“Natural User Interface (NUI) relates to interfaces that are hidden or that eventually turn invisible as the user keeps using them, or that become an integral part with the environment one operates in.” – Ofer Spiegel
I would like to discuss a controversial topic regarding the use of NUIs: the approach game console developers are taking to introduce more NUI technology and expand the video game market. Don’t get me wrong; I am an absolute fan of the NUI concept, and if ever there was an era ready to adopt NUIs and integrate them into the household, it is now. There are pros and cons to consoles such as the Wii or the Xbox Kinect, and this short article cannot do justice to the lengthy discussion that could be had about the UX of gaming NUIs. However, I do believe that current gaming consoles operate at limited potential due to bad UX design, and that virtual reality is the way forward. I am not arguing that all NUI games have bad UX: games on tablets and smartphones, such as Angry Birds and Temple Run, are prime examples of how successful a simple NUI can be.
Here is a quick summary of what is wrong with the Kinect, PS Move or Wii: you can translate only a very limited subset of your seemingly unlimited natural body movements into in-game actions. On one hand, these consoles have allowed us to play tennis in the middle of our living room, which is impressive. But as soon as the novelty wears off, players realise they can sit on the couch and move their wrist at a 62-degree downward angle for the same result: so much for natural movements. In my opinion, a tennis game that captures your entire body’s movement in order to calculate the speed at which your racket hit the ball gets the NUI concept right. A game that simply calculates the relative direction and velocity of a controller’s rotation gets it wrong.
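The contrast between the two approaches can be sketched in a few lines of code. Everything below is illustrative: the function names, the sample data and the formulas are my own assumptions, not any console’s actual API.

```python
import math

def swing_speed_full_body(wrist_positions, dt):
    """The 'right' approach: estimate racket speed from the tracked wrist
    position in world space (metres), i.e. from real body movement."""
    (x0, y0), (x1, y1) = wrist_positions
    return math.hypot(x1 - x0, y1 - y0) / dt  # metres per second

def swing_speed_controller_only(rotation_deg, dt):
    """The 'wrong' approach: infer 'swing strength' from nothing but the
    controller's rotation, so a wrist flick on the couch scores exactly
    like a full-body swing."""
    return abs(rotation_deg) / dt  # degrees per second

# A genuine swing that moves the wrist 1.2 m in a quarter second:
print(swing_speed_full_body([(0.0, 1.0), (1.2, 1.0)], 0.25))  # 4.8 m/s
# A lazy 62-degree wrist flick in the same quarter second:
print(swing_speed_controller_only(62, 0.25))                  # 248.0 deg/s
```

The point of the sketch is that the second function has no way of telling the couch-bound flick from the real swing; only the first rewards natural movement.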
Another flawed aspect of NUIs in gaming is that the “controls” have changed, but the role of the screen as a feedback tool has remained the same. If your natural movement takes the controller beyond the edge of the screen, the signal is lost or the input is simply ignored. In fact, as these consoles are designed and implemented, they do not read the players’ natural movements and turn them into game actions. Instead, players need to figure out which “natural” movements the game actually accepts, and they have to do so for every game and every console. This is not a natural interface; it merely means you learn to lift your left hand when you want to jump instead of pressing up on your joystick. Full-body titles such as dancing games require your entire body to move by following the avatar’s instructions on the screen, which is somewhat closer to the way NUIs should be used, in my opinion.
The systems that take NUI gaming in the right direction, in my opinion, are the Kickstarter-funded Oculus Rift and Virtuix Omni treadmill. The first is a head-mounted virtual reality display that lets players move their head and see as if they were physically present in the game, thanks to the Rift’s accelerometers and motion sensors. The Omni treadmill reads player movement in any direction, at any speed, and even actions such as jumping or kneeling, and translates them into their in-game counterparts. Combine the two, and you are instantly teleported, body and mind, into the game.
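To give a feel for how this kind of head tracking works, here is a toy complementary filter, a common way to fuse a fast-but-drifting gyroscope with a noisy-but-stable accelerometer. This is a simplified illustration under my own assumptions, not the Rift’s actual sensor-fusion code, which is far more sophisticated.

```python
def fuse_pitch(pitch_prev, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Blend the gyroscope's integrated rotation (responsive, but drifts
    over time) with the accelerometer's gravity-based pitch estimate
    (noisy, but drift-free) to track where the player's head points."""
    gyro_estimate = pitch_prev + gyro_rate * dt  # integrate angular rate
    return alpha * gyro_estimate + (1 - alpha) * accel_pitch

pitch = 0.0
# Player tilts their head down at 30 deg/s for half a second,
# sampled 10 times at 20 Hz; the accelerometer reads roughly 15 degrees.
for _ in range(10):
    pitch = fuse_pitch(pitch, gyro_rate=30.0, accel_pitch=15.0, dt=0.05)
print(round(pitch, 2))
```

The blended estimate lands close to the 15 degrees the head actually turned, without the filter ever being told that number directly; the game can then redraw the scene from the new viewpoint, so looking around works exactly as it does in real life.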
The main point I am trying to make is that video game developers should stop treating NUI as if it meant “no joystick needed”. The basic concept is to match the product’s reactions to the player’s natural actions, not to force the player to perform specific body actions (whether voice, touch or gesture) in order to trigger the product’s reactions. Smartphones and tablets are so popular because the way we touch, swipe and tap their screens follows everyday movements and behaviours of objects we have learned since we were babies. The way some NUI games work, it is as if you had to wave your arm in front of a camera every time you wanted to scroll down in your phone’s browser.
In the case of the Omni treadmill and the Oculus Rift, they simply enhance your natural actions of looking and moving about. You don’t need to be told which actions do what: you can move your hands, you can look around you, and you can feel the ground move beneath you. If you don’t move your feet, your body won’t budge. The UX of such virtual reality tools is instantly intuitive, as you only need to be yourself. Now that, in my opinion, sounds pretty natural.
Cover image © nice-cool-pics.com