The future of touch screens is in good hands
I stumbled upon this video about the possible future of touch screens by Chris Harrison from CMU and found it amazing. You can see him demonstrating his team's TouchTools and TapSense apps for tablets.
The device they put together is able to achieve something really cool: anticipating what a user wants to do next. By positioning their hand over the glass as if they were actually holding a physical object, users get access to an on-screen version of that same object. In fact, the smart screen recognises the grip unique to a product and its intended use.
While some of the examples shown, such as a camera or a mouse, may have little real-world use, this little experiment makes you think about all the possibilities associated with this type of technology.
What I find even more impressive, though, is the device's precise recognition of different types of touch input, such as a fingertip or a knuckle. This opens up a lot of possibilities for interface interactions: a single button could produce different results depending on the nature of the touch, essentially tripling the number of possible actions on a given screen.
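To make that concrete, here is a minimal sketch of what such a dispatch could look like in app code. The `TouchKind` type, `ClassifiedTouch` interface, and `handleButtonTouch` function are all hypothetical; TapSense's actual API isn't public, so this only illustrates the idea of one control mapping to several actions.

```typescript
// Hypothetical touch classification. TapSense distinguishes touches
// acoustically; here we just assume the platform hands us a label.
type TouchKind = "fingertip" | "knuckle" | "nail";

interface ClassifiedTouch {
  kind: TouchKind;
  x: number;
  y: number;
}

// One on-screen button, three distinct actions depending on
// how it was touched.
function handleButtonTouch(touch: ClassifiedTouch): void {
  switch (touch.kind) {
    case "fingertip":
      console.log("Primary action: open item");
      break;
    case "knuckle":
      console.log("Secondary action: show context menu");
      break;
    case "nail":
      console.log("Tertiary action: select for batch editing");
      break;
  }
}

// Example: a knuckle tap triggers the secondary action.
handleButtonTouch({ kind: "knuckle", x: 120, y: 240 });
```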
I can’t think of a current app that would directly benefit from this, except maybe something like Knock. And even then it’s a long shot.
Nevertheless, it’s going to be really interesting to explore new ways to interact with screens in the future.