WorldKit
Creating interfaces in the world, where and when we need them, has been a persistent goal of research areas such as ubiquitous computing, augmented reality, and mobile computing. The WorldKit system makes use of a paired depth camera and projector to make ordinary surfaces instantly interactive. Using this system, touch-based interactivity can, without prior calibration, be placed on nearly any unmodified surface literally with a wave of the hand, as can other new forms of sensed interaction. From a user perspective, such interfaces are easy enough to instantiate that they could, if desired, be recreated or modified “each time we sat down” by “painting” them next to us. From the programmer’s perspective, our system encapsulates these capabilities in a simple set of abstractions that make the creation of interfaces quick and easy. Further, it is extensible to new, custom interactors in a way that closely mimics conventional 2D graphical user interfaces, hiding much of the complexity of working in this new domain.
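To give a flavor of those programmer-facing abstractions, here is a minimal, hypothetical sketch in Python. The class and method names (Interactor, WorldKitApp, paint, and the stubbed sensing call) are illustrative stand-ins, not the actual WorldKit API; the sketch only shows the intended shape of "declare a widget, let the user paint it onto a surface, receive callbacks."

```python
# Hypothetical sketch only: the names here (Interactor, WorldKitApp, paint, ...)
# are illustrative stand-ins, not the actual WorldKit API.

class Interactor:
    """A projected, touch-sensed widget bound to a user-painted surface region."""
    def __init__(self, label, on_touch=None):
        self.label = label
        self.on_touch = on_touch or (lambda: None)
        self.region = None  # assigned when the user paints the interactor

    def handle_touch(self):
        self.on_touch()


class WorldKitApp:
    """Hypothetical application shell pairing a depth camera and projector."""
    def __init__(self):
        self.interactors = []

    def paint(self, interactor):
        """Wait for the user to sweep a hand over a surface, then project
        and activate the interactor on that region."""
        interactor.region = self._wait_for_painted_region()
        self.interactors.append(interactor)

    def _wait_for_painted_region(self):
        # In the real system the depth camera would segment the swept-over
        # patch; this stub returns a fixed rectangle so the sketch runs.
        return {"x": 0.0, "y": 0.0, "w": 0.2, "h": 0.1}


# Usage: place a light switch on the wall next to the couch.
app = WorldKitApp()
lights = Interactor("Lights", on_touch=lambda: print("Toggling lights"))
app.paint(lights)
lights.handle_touch()  # a later touch event landing on the painted region
```

The key design point the sketch tries to capture is that the application never hard-codes surface coordinates; placement is deferred to the user at run time.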
Using Shear as a Supplemental Input Channel for Rich Touchscreen Interaction
Touch input is constrained, typically only providing finger X/Y coordinates. To access and switch between different functions, valuable screen real estate must be allocated to buttons and menus, or users must perform special actions, such as touch-and-hold, double tap, or multi-finger chords. Even so, this only adds a few bits of additional information, leaving touch interaction unwieldy for many tasks. In this work, we suggest using a largely underutilized touch input dimension: shear (force tangential to a screen’s surface). Similar to pressure, shear can be used in concert with conventional finger positional input. However, unlike pressure, shear provides a rich, analog 2D input space, which has many powerful uses.
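As a concrete illustration of treating shear as an analog 2D channel, the sketch below maps a shear force vector into a panning velocity. The dead-zone and clamping thresholds are assumptions chosen for the example, not values from the work itself.

```python
# Illustrative sketch, not the instrumented prototype: map a 2D shear force
# vector (tangential to the screen) into an analog control, here panning
# velocity, with a dead zone so resting fingers do nothing.
import math

DEAD_ZONE = 0.5   # newtons; below this, treat as no shear (assumed threshold)
MAX_SHEAR = 4.0   # newtons; clamp for full-speed panning (assumed)

def shear_to_pan(shear_x, shear_y):
    """Convert a shear vector (N) into a pan velocity (screen fractions/s)."""
    magnitude = math.hypot(shear_x, shear_y)
    if magnitude < DEAD_ZONE:
        return 0.0, 0.0
    # Keep the push direction; scale speed by how hard the user pushes.
    scale = min(magnitude, MAX_SHEAR) / MAX_SHEAR
    return scale * shear_x / magnitude, scale * shear_y / magnitude

# Example: a firm push up-and-right pans the view up-and-right.
print(shear_to_pan(2.0, 3.0))
```

Because both direction and magnitude are continuous, the same mapping could just as easily drive scrolling speed, fine cursor nudging, or direction-based mode selection.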
OmniTouch: Wearable Multitouch Interaction Everywhere
OmniTouch is a body-worn projection/sensing system that enables graphical, interactive, multitouch input on everyday surfaces. Our shoulder-worn implementation allows users to manipulate interfaces projected onto the environment (e.g., walls, tables), held objects (e.g., notepads, books), and even their own bodies (e.g., hands, lap). This approach allows users to capitalize on the tremendous surface area the real world provides. For example, the surface area of one hand alone exceeds that of a typical smartphone; tables are often an order of magnitude larger than a tablet computer. If these ad hoc surfaces can be appropriated in an on-demand way, users could retain all of the benefits of mobility while simultaneously expanding their interactive capability.
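One way to make "multitouch on everyday surfaces" concrete is the click decision itself. The sketch below is a heavily hedged illustration, not OmniTouch's actual pipeline: it assumes a depth camera that already tracks fingertips and knows the depth of the surface behind them, and reduces touch-versus-hover to a proximity test with an assumed threshold.

```python
# Hedged sketch of the general idea only (not OmniTouch's implementation):
# decide touch vs. hover by comparing a tracked fingertip's depth to the
# depth of the surface directly beneath it.

TOUCH_THRESHOLD_MM = 10  # assumed: within ~1 cm of the surface counts as contact

def is_touching(finger_depth_mm, surface_depth_mm):
    """True if the tracked fingertip is close enough to the surface to 'click'."""
    return (surface_depth_mm - finger_depth_mm) < TOUCH_THRESHOLD_MM

# Example: fingertip sensed 6 mm above a projected button on a notepad.
print(is_touching(finger_depth_mm=494, surface_depth_mm=500))  # True
```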
ZoomBoard: A Diminutive QWERTY Keyboard for Ultra-Small Devices
The proliferation of touchscreen devices has made soft keyboards a routine part of life. However, ultra-small computing platforms like the Sony SmartWatch and Apple iPod Nano lack a means of text entry. This limits their potential, despite the fact that they are capable computers. We created a soft keyboard interaction technique called ZoomBoard that enables text entry on ultra-small devices. Our approach uses iterative zooming to enlarge otherwise impossibly tiny keys to a comfortable size. We based our design on a QWERTY layout, so that it is immediately familiar to users and leverages existing skill. As the ultimate test, we ran a text entry experiment on a keyboard measuring just 16 x 6 mm – smaller than a US penny. Users achieved roughly 10 words per minute, allowing them to enter phone numbers and searches such as “closest pizza” and “directions home” both quickly and quietly.
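The following is a minimal sketch of the iterative-zooming idea, simplified to two taps and a fixed zoom factor; the specific geometry (3x zoom, square zoom window) is an assumption for illustration, and the real ZoomBoard's layout handling and animation are omitted.

```python
# Minimal sketch of ZoomBoard-style iterative zooming (a simplification for
# illustration, not the paper's implementation). The first tap enlarges the
# region around the touch; the second tap, now on comfortably large keys,
# selects a character.

QWERTY = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def key_at(x, y):
    """Map a normalized (x, y) in [0, 1)^2 to a key on the tiny keyboard."""
    row = QWERTY[min(int(y * len(QWERTY)), len(QWERTY) - 1)]
    return row[min(int(x * len(row)), len(row) - 1)]

def zoom_window(x, y, zoom=3.0):
    """Return the sub-rectangle (x0, y0, w, h) enlarged by the first tap."""
    w, h = 1.0 / zoom, 1.0 / zoom
    x0 = min(max(x - w / 2, 0.0), 1.0 - w)
    y0 = min(max(y - h / 2, 0.0), 1.0 - h)
    return x0, y0, w, h

def two_tap_select(first_tap, second_tap, zoom=3.0):
    """Resolve two successive taps (normalized screen coords) to a key."""
    x0, y0, w, h = zoom_window(*first_tap, zoom=zoom)
    # The second tap lands on the zoomed view; map it back to keyboard space.
    kx = x0 + second_tap[0] * w
    ky = y0 + second_tap[1] * h
    return key_at(kx, ky)

# Example: a coarse tap near the top-left, refined to 'w' on the zoomed view.
print(two_tap_select((0.15, 0.1), (0.5, 0.2)))
```

With the assumed 3x zoom, the second tap targets keys three times larger in each dimension than the original, near-invisible ones.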
FingerSense: Enhancing Finger Interaction on Touch Surfaces
Six years ago, multitouch devices went mainstream and changed the industry and our lives. However, our fingers can do so much more than just poke and pinch at screens. FingerSense is an enhancement to touch interaction that allows conventional screens to know how the finger is being used for input: fingertip, knuckle or nail. This opens several new and powerful interaction opportunities for touch input, especially on mobile devices, where input bandwidth is limited due to small screens and fat fingers. For example, a knuckle tap could serve as a “right click” for mobile device touch interaction.
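To show how an application might consume this extra input dimension, here is a hedged sketch: the classifier is a crude placeholder (FingerSense's real classification is not reproduced here) and the feature names are invented for illustration. The interesting part is the dispatch, where the same tap location triggers different actions depending on which part of the finger made contact.

```python
# Hedged sketch: the classifier below is a stand-in with invented features,
# not FingerSense's actual model. The point is the dispatch: one (x, y) tap
# can mean different things depending on what produced it.

def classify_touch(features):
    """Placeholder classifier; a real one would be trained on sensor data."""
    if features.get("impact_sharpness", 0.0) > 0.8:
        return "knuckle"
    if features.get("contact_area", 1.0) < 0.2:
        return "nail"
    return "fingertip"

ACTIONS = {
    "fingertip": lambda x, y: print(f"select at ({x}, {y})"),
    "knuckle":   lambda x, y: print(f"context menu at ({x}, {y})"),  # 'right click'
    "nail":      lambda x, y: print(f"annotate at ({x}, {y})"),
}

def on_touch(x, y, features):
    ACTIONS[classify_touch(features)](x, y)

# Example: a sharp knuckle tap opens a context menu instead of selecting.
on_touch(120, 340, {"impact_sharpness": 0.9, "contact_area": 0.5})
```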
Acoustic Barcodes
Acoustic Barcodes are structured patterns of physical notches that, when swiped with a fingernail, produce a complex sound that can be resolved to a unique ID number. A single, inexpensive contact microphone attached to a surface or object is used to capture the waveform. Acoustic Barcodes could be used for information retrieval or to trigger interactive functions. They are passive, durable and inexpensive to produce. Further, they can be applied to a wide range of materials and objects, including plastic, wood, glass and stone.
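As a toy illustration of resolving a swipe to an ID, the sketch below assumes (an assumption for this example, not the paper's encoding) that each bit is carried by a narrow or wide gap between adjacent notches, and that comparing each gap to the median gap keeps decoding roughly independent of swipe speed.

```python
# Toy decoder sketch. Assumed encoding (for illustration only): each bit is a
# narrow (0) or wide (1) gap between adjacent notches; comparing gaps to the
# median gap makes the result roughly invariant to swipe speed.
import statistics

def decode_swipe(impulse_times, wide_ratio=1.5):
    """Turn a list of notch-impulse timestamps (seconds) into a bit string."""
    gaps = [b - a for a, b in zip(impulse_times, impulse_times[1:])]
    if not gaps:
        return ""
    unit = statistics.median(gaps)  # rough estimate of the 'narrow' gap length
    return "".join("1" if g > wide_ratio * unit else "0" for g in gaps)

# Example: a swipe producing mostly even spacing with two wide gaps.
times = [0.00, 0.01, 0.02, 0.04, 0.05, 0.07, 0.08]
print(decode_swipe(times))  # -> "001010"
```

The resulting bit string would then be looked up to retrieve the associated ID and trigger the corresponding function.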