Category Archives: Human-Computer Interaction

OmniTouch: Wearable Multitouch Interaction Everywhere

OmniTouch is a body-worn projection/sensing system that enables graphical, interactive, multitouch input on everyday surfaces. Our shoulder-worn implementation allows users to manipulate interfaces projected onto the environment (e.g., walls, tables), held objects (e.g., notepads, books), and even their own bodies (e.g., hands, lap). This approach lets users capitalize on the tremendous surface area the real world provides. For example, the surface area of one hand alone exceeds that of a typical smartphone; tables are often an order of magnitude larger than a tablet computer. If these ad hoc surfaces can be appropriated on demand, users could retain all of the benefits of mobility while simultaneously expanding their interactive capability.

Zoomboard: A Diminutive QWERTY Keyboard for Ultra-Small Devices

The proliferation of touchscreen devices has made soft keyboards a routine part of life. However, ultra-small computing platforms like the Sony SmartWatch and Apple iPod Nano lack a means of text entry. This limits their potential, despite the fact that they are capable computers. We created a soft keyboard interaction technique called ZoomBoard that enables text entry on ultra-small devices. Our approach uses iterative zooming to enlarge otherwise impossibly tiny keys to a comfortable size. We based our design on a QWERTY layout so that it is immediately familiar to users and leverages existing skill. As the ultimate test, we ran a text entry experiment on a keyboard measuring just 16 x 6 mm – smaller than a US penny. Users achieved roughly 10 words per minute, enough to enter phone numbers and searches such as “closest pizza” and “directions home” both quickly and quietly.
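The core of the iterative-zoom idea fits in a few lines. The Python below is a toy model, not the ZoomBoard implementation: the 10x3 key layout, the zoom factor of 2, and the coordinate scheme are all illustrative assumptions. Each tap zooms the visible region in around the touch point; the final tap selects the key under it.

```python
# Toy model of ZoomBoard-style iterative zooming. The 10x3 QWERTY
# layout, zoom factor, and coordinate scheme are illustrative
# assumptions, not the authors' implementation.

QWERTY = [
    "qwertyuiop",
    "asdfghjkl;",
    "zxcvbnm,. ",
]
KBD_W, KBD_H = 10.0, 3.0  # keyboard size in key units


def screen_to_kbd(region, sx, sy):
    """Map a tap in normalized screen coords (0..1) into keyboard
    coordinates, given the currently visible region (x, y, w, h)."""
    x, y, w, h = region
    return x + sx * w, y + sy * h


def zoom_step(region, kx, ky, factor=2.0):
    """Shrink the visible region around the tapped point, clamped to
    the keyboard bounds, so tiny keys appear twice as large."""
    _, _, w, h = region
    nw, nh = w / factor, h / factor
    nx = min(max(kx - nw / 2, 0.0), KBD_W - nw)
    ny = min(max(ky - nh / 2, 0.0), KBD_H - nh)
    return (nx, ny, nw, nh)


def select_key(taps):
    """Zoom in on every tap except the last, then return the key
    under the final tap."""
    region = (0.0, 0.0, KBD_W, KBD_H)
    for sx, sy in taps[:-1]:
        region = zoom_step(region, *screen_to_kbd(region, sx, sy))
    kx, ky = screen_to_kbd(region, *taps[-1])
    return QWERTY[int(min(ky, KBD_H - 1))][int(min(kx, KBD_W - 1))]
```

Because each zoom step halves the visible region, two or three taps are enough to make even a penny-sized keyboard's keys comfortably large at selection time.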

FingerSense: Enhancing Finger Interaction on Touch Surfaces

Six years ago, multitouch devices went mainstream and changed the industry and our lives. However, our fingers can do much more than poke and pinch at screens. FingerSense is an enhancement to touch interaction that allows conventional screens to know how the finger is being used for input: fingertip, knuckle or nail. This opens several new and powerful interaction opportunities for touch input, especially on mobile devices, where input bandwidth is limited by small screens and fat fingers. For example, a knuckle tap could serve as a “right click” for mobile touch interaction.
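The dispatch from touch type to action can be sketched as in the toy Python below. The two acoustic features, their thresholds, and the simple rules are hypothetical stand-ins; a real system would run a trained classifier over much richer features of the impact signal.

```python
# Toy dispatcher for FingerSense-style input. The features
# (peak amplitude, spectral centroid), thresholds, and rules are
# hypothetical; a real system would use a trained classifier.

def classify_touch(peak_amplitude, spectral_centroid_hz):
    """Guess how the finger struck the screen from two (assumed)
    features of the impact sound."""
    if spectral_centroid_hz > 4000:
        return "nail"       # hard keratin: sharp, high-frequency click
    if peak_amplitude > 0.6:
        return "knuckle"    # bony impact: loud, low-frequency thud
    return "fingertip"      # soft pad contact: quiet and broadband


def on_touch(peak_amplitude, spectral_centroid_hz):
    """Route each touch type to a distinct action, e.g. a knuckle
    tap acting as the mobile "right click"."""
    actions = {
        "fingertip": "select",
        "knuckle": "open context menu",   # the "right click"
        "nail": "toggle annotation mode",
    }
    return actions[classify_touch(peak_amplitude, spectral_centroid_hz)]
```

The payoff is that one physical touch location now carries three distinguishable inputs, multiplying input bandwidth without adding screen real estate.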

John Anderson Earns Highest Honor From Association for Psychological Science

CMU’s John R. Anderson—whose human thought and cognition research has revolutionized how we learn—has been selected to receive the Association for Psychological Science’s (APS) William James Lifetime Achievement Award for Basic Research. The award, APS’s highest honor, recognizes Anderson’s profound impact on the field of psychological science and his significant intellectual contributions to the basic science of psychology.

“John Anderson is being recognized both for the importance of his theoretical contributions and for his success in transitioning his theories into widely used applications having great societal impact,” said John Lehoczky, dean of the Dietrich College of Humanities and Social Sciences. “It is entirely fitting that John would be selected for the William James Lifetime Achievement Award, as he is among the very best scholars of psychological science.”

Acoustic Barcodes

Acoustic Barcodes are structured patterns of physical notches that, when swiped with a fingernail, produce a complex sound that can be resolved to a unique ID number. A single, inexpensive contact microphone attached to a surface or object is used to capture the waveform. Acoustic Barcodes could be used for information retrieval or to trigger interactive functions. They are passive, durable and inexpensive to produce. Further, they can be applied to a wide range of materials and objects, including plastic, wood, glass and stone.
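One way such a swipe could be resolved to an ID is sketched below: each notch produces an acoustic impulse, the gap between consecutive impulses encodes one bit, and normalizing by the median gap cancels out swipe speed. This particular encoding (wide gap = 1, narrow gap = 0) is an illustrative assumption, not the system's exact scheme.

```python
# Toy decoder for an Acoustic-Barcode-like tag. The wide/narrow
# gap encoding and the 1.5x-median threshold are assumptions.
from statistics import median


def decode_barcode(impulse_times):
    """Resolve a sequence of notch-impulse timestamps to an integer
    ID. Each inter-impulse gap is one bit (wide = 1, narrow = 0);
    dividing by the median gap makes decoding roughly invariant to
    how fast the fingernail is swiped."""
    gaps = [b - a for a, b in zip(impulse_times, impulse_times[1:])]
    m = median(gaps)
    bits = "".join("1" if g > 1.5 * m else "0" for g in gaps)
    return int(bits, 2)
```

Because only gap ratios matter, a fast swipe and a slow swipe over the same notch pattern decode to the same ID, which is what makes an unpowered strip of notches usable as a tag.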

Siewiorek Named Director of Quality of Life Technology Center

Daniel P. Siewiorek has been named director of the Quality of Life Technology (QoLT) Center, a National Science Foundation Engineering Research Center. Carnegie Mellon University and the University of Pittsburgh are partners in the center, which focuses on creating intelligent systems that improve quality of life for everyone, particularly older adults and people with disabilities.

Siewiorek, a longtime CMU faculty member who had been acting director since the fall of 2011, was selected for the position following a nine-month national search process. He succeeds Takeo Kanade, director emeritus and professor of computer science and robotics.

Designing Robowall

Robowall was designed to serve as an interactive public exhibition in Newell-Simon Hall of the amazing projects of the HCII and Robotics Institute. It was designed and developed throughout the 2012-2013 school year by students in Professor Haakon Faste’s Deliberate Design Studio. It was built by Jason Block (MHCI), Susan Buenafe (MHCI), Kristine Mendoza (MHCI), Chris Mueller (MHCI), Kevin Schaefer (IS + HCI), and Heidi Yang (Tepper).


Understanding Data Collection in Autism Education

Autism education programs for children collect and use large amounts of behavioral data on each student. Ph.D. student Gabi Marcu has been investigating this practice for several years. Staff use paper almost exclusively to collect these data, despite significant problems with tracking student data in situ, filling out data sheets and graphs on a daily basis, and using the sheets in collaborative decision making.

Gabi and her team conducted fieldwork to understand data collection and use in the domain of autism education to explain why current technology had not met staff needs. They found that data needs are complex and unstandardized, immediate demands of the job interfere with staff ability to collect in situ data, and existing technology for data collection is inadequate. They also identified opportunities for technology to improve sharing and use of data.

CMU Student Startup Places Payments at Users’ Fingertips

It may take two to tango, but payments now are as easy as one touch.

Four Carnegie Mellon University seniors tired of digging through backpacks, pockets and purses for their student identification and debit cards have developed PayTango, a fingerprint-based identification and payment system.

With majors ranging from information systems and human-computer interaction to industrial design, Brian Groudan, Kelly Lau-Kee, Umang Patel and Christian Reyes combined their expertise to launch their startup.

“We believe you should be able to walk into any establishment and prove who you are without carrying anything — no apps, no cards,” Groudan said. “PayTango can be used for everyday activities like paying for a morning coffee or critical scenarios like identifying patients’ medical information in a hospital.”

The startup is attracting attention from media and potential investors. Inc. magazine recently named it among “America’s Coolest College Startups” for 2013.