We interact with computers and technology in almost every sphere of our lives. Scientists at the University of Sussex in the UK aim to bring that interaction even closer to home by using our hands as a display extension for the next generation of smartwatches and other smart devices.
According to lead researcher Professor Sriram Subramanian, SkinHaptics is a major step towards what designers are calling the “eye-free” age of technology.
“Wearables are already big business and will only get bigger. But as we wear technology more, it gets smaller and we look at it less, and therefore multisensory capabilities become much more important,” he says.
“If you imagine you are on your bike and want to change the volume control on your smartwatch, the interaction space on the watch is very small. So companies are looking at how to extend this space to the hand of the user.

“What we offer people is the ability to feel their actions when they are interacting with the hand.”
The system relies on the rapidly growing field of haptics, the science of applying touch (tactile) sensation and control to interaction with computer applications. It sets itself apart from previous work in that, so far, skin-touch displays have relied on vibrations or pins that must make contact with the palm, obstructing the display.
The new SkinHaptics system sends sensations to the back of the hand, leaving the palm free to hold a display.
The findings were presented at the IEEE Haptics Symposium in Philadelphia by the study’s co-author, Dr. Daniel Spelmezan.