Monthly Archives: January 2014

Jodi Forlizzi Named to CHI Academy

Jodi Forlizzi, associate professor of human-computer interaction and design, has been named to the CHI Academy, an honorary group of individuals who have made substantial contributions to the HCI field. Each year, the Association for Computing Machinery’s Special Interest Group on Computer-Human Interaction elects five to seven new academy members whose efforts have shaped the discipline and the industry. Members are selected based on cumulative contributions to the HCI field, impact on the field through development of new research directions and/or innovations, influence on the work of others, and participation in the ACM SIGCHI community.

Forlizzi’s work in the field of interaction design ranges from understanding the limits of human attention to understanding how products and services evoke social behavior. She designs and researches systems ranging from peripheral displays to social and assistive robots and the interfaces that control them. Forlizzi has also applied her design research thinking to new research topics including big data, healthcare, and service design.

Forlizzi joins four current HCII faculty members who belong to the CHI Academy: Professor Scott Hudson, Hillman Professor of Computer Science and Human-Computer Interaction Sara Kiesler, Herbert A. Simon Professor of Human-Computer Interaction Robert E. Kraut, and Professor Brad Myers. Former HCII Professor Bonnie John and the late Randy Pausch are also CHI Academy members.

Cassell Assumes Associate Vice Provost Responsibilities

Justine Cassell, the Charles M. Geschke Director of the Human-Computer Interaction Institute and co-director of The Simon Initiative, has taken on additional responsibilities as the university’s Associate Vice Provost of Technology Strategy and Impact.

In an email to faculty, Provost Mark Kamlet said Cassell’s duties as associate vice provost will include strategy and outreach efforts related to the Global Learning Council as well as university-wide efforts that fall broadly within the area of human-computer interaction.

The Global Learning Council, chaired by President Subra Suresh, is a component of The Simon Initiative. The GLC is a distinguished group of thought leaders from across the globe who are committed to the use of science and technology to enhance learning.

Crowdsourced RNA Designs Outperform Computer Algorithms

An enthusiastic group of non-experts, working through an online interface and receiving feedback from lab experiments, has produced designs for RNA molecules that are consistently more successful than those generated by the best computerized design algorithms, researchers at Carnegie Mellon University and Stanford University report.

Moreover, the researchers gathered some of the best design rules and practices generated by players of the online EteRNA design challenge and, using machine learning principles, generated their own automated design algorithm, EteRNABot, which also bested prior design algorithms. Though this improved computer design tool is faster than humans, the designs it generates still don’t match the quality of those of the online community, which now has more than 130,000 members.
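As a rough illustration of how crowd-derived design rules can drive an automated designer, the short Python sketch below scores candidate sequences with a weighted combination of simple rule features and keeps the best one. The features, weights, and function names are hypothetical stand-ins, not the actual EteRNABot model.

    # Purely illustrative: score candidate RNA sequences against simple,
    # crowd-inspired design rules. The features and weights below are
    # hypothetical stand-ins, not EteRNABot's actual learned model.

    def gc_fraction(seq):
        """Fraction of G/C bases, standing in for one simple design rule."""
        return sum(base in "GC" for base in seq) / len(seq)

    def repeat_penalty(seq, run=4):
        """Count long single-base runs, another rule players often cite."""
        return sum(seq[i:i + run] == seq[i] * run for i in range(len(seq) - run + 1))

    # Weights a learner might fit from player-rated designs (hypothetical values).
    WEIGHTS = {"gc_fraction": 1.0, "repeat_penalty": -0.5}

    def score(seq):
        features = {"gc_fraction": gc_fraction(seq), "repeat_penalty": repeat_penalty(seq)}
        return sum(WEIGHTS[name] * value for name, value in features.items())

    candidates = ["GCAUGCGCAU", "AAAAAUUUUU", "GCGCAUAUGC"]
    print(max(candidates, key=score))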

The research will be published this week in the Proceedings of the National Academy of Sciences Online Early Edition.

“The quality of the designs produced by the online EteRNA community is just amazing and far beyond what any of us anticipated when we began this project three years ago,” said Adrien Treuille, an assistant professor of computer science and robotics at Carnegie Mellon, who leads the project with Rhiju Das, an assistant professor of biochemistry at Stanford, and Jeehyung Lee, a Ph.D. student in computer science at Carnegie Mellon.

VizWiz: Nearly Real-Time Answers to Visual Questions

VizWiz, led by Professor Jeff Bigham of the Human-Computer Interaction Institute, answers questions that people who are blind or visually impaired have about the things around them. Users take a photo, ask a question, and quickly receive answers from people on the web (the crowd). VizWiz leverages automatic color identification, text recognition, and recognition of objects seen before, but in practice these automatic methods handle only a small subset of questions. The bulk of the remaining questions, such as “Is there a rash on my baby’s head?” or “What number is on this credit card?”, are answered by humans and serve as challenges for computer vision. The VizWiz living laboratory illustrates the utility of deploying working crowd-powered systems to understand target domains through deployable Wizard-of-Oz studies. Ongoing work is using VizWiz to explore how people may volunteer their friends on social media for microtasks they care about.
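As a rough sketch of the routing described above, the following Python snippet tries automatic recognizers first and falls back to the crowd when none of them can answer. The function names and behavior are illustrative assumptions, not the actual VizWiz code or APIs.

    # A minimal sketch of the "automatic first, crowd as fallback" routing that
    # the description above implies. The recognizer and crowd functions are
    # hypothetical placeholders, not the real VizWiz components.

    def color_identifier(photo, question):
        return None  # would return an answer only for color questions

    def text_recognizer(photo, question):
        return None  # would return recognized text when relevant

    def ask_crowd(photo, question):
        return "answer from a web worker"  # would post the task to the crowd

    AUTOMATIC_METHODS = [color_identifier, text_recognizer]

    def answer(photo, question):
        for recognizer in AUTOMATIC_METHODS:
            result = recognizer(photo, question)
            if result is not None:  # automatic methods cover a small subset
                return result
        return ask_crowd(photo, question)  # most questions go to human answerers

    print(answer(b"...", "What number is on this credit card?"))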

Luis Von Ahn’s Duolingo Named iPhone App of the Year

In an early morning phone call on Tuesday, an Apple executive congratulated Duolingo founder Luis Von Ahn on the Oakland company’s designation as 2013 App of the Year — and warned him to secure company servers for a new onslaught of business.

Not long after, Apple announced publicly that the language learning software app was the editor’s choice for the iTunes App Store 2013 App of the Year.

The free iPhone app, described in the computer giant’s app store as “fantastically well-designed and easy to use,” beat out San Francisco-based photo editing app VSCO Cam and San Francisco-based educational game Endless Alphabet.

Mr. Von Ahn, a professor of computer science at Carnegie Mellon University, reported a definite increase in activity following the announcement.

But despite the push from Apple, an influx of new users isn’t exactly new to the company this year.

Duolingo has seen its user base soar from 3 million in May to 16 million in December.

He said the designation from Apple was particularly notable because it made Duolingo the first education app to take home App of the Year, and the first winner from a company based outside Silicon Valley.

(excerpt from Deborah M. Todd’s article for the Pittsburgh Post-Gazette)

National Science Foundation Features CoBots on “Science Nation”

Collaborative robots, or CoBots, developed by Manuela Veloso and her Carnegie Mellon research team, have been running errands for occupants of the Gates and Hillman centers for more than two years. Now, they are the subject of a “Science Nation” video and special report by the National Science Foundation.

The robots operate autonomously, navigating their own way through the buildings as they deliver mail and messages, or guide visitors. But they also employ what Veloso, professor of computer science, calls “symbiotic autonomy,” in that they recognize their own limitations. Without arms, they must ask people for help pressing elevator buttons, opening doors and placing items in their delivery baskets. They can also search the Internet for information that they lack.

The CoBots move on an omnidirectional base, ask questions using a synthesized voice and accept input from people through a touchscreen interface. Gates and Hillman center occupants can schedule tasks for CoBot through a special web site.
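The idea of symbiotic autonomy can be illustrated with a short, hypothetical Python sketch: the robot executes the steps it can handle alone and asks a nearby person for the manipulation steps it cannot. None of the step names or functions below come from the actual CoBot software.

    # Illustrative sketch of "symbiotic autonomy": when a step is beyond the
    # robot's own abilities, it asks a nearby person instead of failing.
    # The step names and functions are hypothetical, not the CoBot codebase.

    NEEDS_HUMAN_HELP = {"press the elevator button", "open the door", "place the item in the basket"}

    def execute(step):
        print("Executing autonomously:", step)

    def ask_person(step):
        print('Synthesized voice: "Could you please ' + step + ' for me?"')

    def run_task(steps):
        for step in steps:
            if step in NEEDS_HUMAN_HELP:  # the robot recognizes its limitation
                ask_person(step)
            else:
                execute(step)

    run_task(["navigate to the elevator", "press the elevator button", "deliver the mail"])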

Bio-Inspired Robotic Device Could Aid Ankle-Foot Rehabilitation

A soft, wearable device that mimics the muscles, tendons and ligaments of the lower leg could aid in the rehabilitation of patients with ankle-foot disorders such as drop foot, said Yong-Lae Park, an assistant professor of robotics at Carnegie Mellon University.

Park, working with collaborators at Harvard University, the University of Southern California, MIT and BioSensics, developed an active orthotic device using soft plastics and composite materials, instead of a rigid exoskeleton. The soft materials, combined with pneumatic artificial muscles (PAMs), lightweight sensors and advanced control software, made it possible for the robotic device to achieve natural motions in the ankle.

The researchers reported on the development in the journal Bioinspiration & Biomimetics.

Park, who did the work while a post-doctoral researcher at Harvard’s Wyss Institute for Biologically Inspired Engineering, said the same approach could be used to create rehabilitative devices for other joints of the body or even to create soft exoskeletons that increase the strength of the wearer.

DARPA Selects Tartan Rescue Team For Robotics Challenge Funding

The Tartan Rescue Team from Carnegie Mellon University’s National Robotics Engineering Center ranked third among teams competing in the DARPA Robotics Challenge Trials this weekend in Homestead, Fla., and was selected by the agency as one of eight teams eligible for DARPA funding to prepare for next December’s Finals.

The team’s four-limbed CMU Highly Intelligent Mobile Platform, or CHIMP, robot scored 18 out of a possible 32 points during the two-day Trials. It demonstrated its ability to perform such tasks as removing debris, cutting a hole through a wall and closing a series of valves.

The Defense Advanced Research Projects Agency is sponsoring the DARPA Robotics Challenge (DRC) to spur development of robotic technologies that could be used to respond to natural or man-made disasters, such as the 2011 Fukushima nuclear power plant crisis, in environments engineered for humans.

Sixteen teams competed at the Trials. DARPA on Saturday announced it would enter into funding negotiations with Tartan Rescue and the seven other teams that tallied the highest scores during the Trials. Gill Pratt, DARPA’s program manager for the DRC, said the agency has $8 million budgeted for the teams and intends to spread the money evenly among them.