How researchers are turning ‘Star Wars’ droids into reality

Author: Notre Dame News

R2-D2, left, and C-3PO droids from "Star Wars"

After nearly 40 years of pop culture relevance, the “Star Wars” saga continues this month with the Dec. 18 release of “Star Wars: The Force Awakens.” Fans are lining up to see beloved characters return to the screen, including Han Solo and General Leia, and to welcome several new ones, including a variety of droids.

The enduring popularity of and interest in C-3PO and R2-D2 speaks to the fascination many people have with robotics and artificial intelligence. Although no one will have a C-3PO of their own anytime soon, a number of University of Notre Dame researchers are working to make droids more science fact than science fiction.

Attention and emotion detection

Sidney D’Mello, assistant professor of psychology and computer science and engineering, and his colleagues are researching the phenomenon of “mind wandering” and are developing a software system that can both detect when a person’s focus shifts from the task at hand and get that person to refocus. The researchers’ vision is to make computer interfaces intelligent enough to spot a user’s waning attention and take action. The system’s software tracks a person’s eye movements with a commercial eye tracker, the person’s facial features with a webcam and the person’s interaction patterns. If the system determines that the person’s mind is wandering, it can pause the interaction, notify the person, plan a different type of interaction, or tag the interaction for future restudy. Their work also tracks emotions such as confusion, frustration, delight and boredom in order to increase the bandwidth of interaction to encompass what people think and feel in addition to what they say or do.
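The decision logic behind such a system can be sketched in a few lines. This is a hypothetical illustration only: the feature names, weights and thresholds below are made up for clarity and are not D’Mello’s actual model.

```python
def mind_wandering_score(gaze_off_screen_ratio, blink_rate, idle_seconds):
    """Combine simple attention features into a 0-1 wandering score.

    All weights and normalizations here are illustrative assumptions,
    not values from the Notre Dame system.
    """
    score = (0.5 * gaze_off_screen_ratio                # from the eye tracker
             + 0.2 * min(blink_rate / 30.0, 1.0)        # webcam facial cue, capped
             + 0.3 * min(idle_seconds / 60.0, 1.0))     # no keyboard/mouse activity
    return min(score, 1.0)

def choose_intervention(score):
    """Map the score to the interventions described in the article."""
    if score > 0.8:
        return "pause"             # stop and re-engage the user directly
    if score > 0.5:
        return "notify"            # gentle prompt to refocus
    if score > 0.3:
        return "tag_for_restudy"   # mark this material for later review
    return "continue"
```

A heavily distracted reading session (mostly off-screen gaze, long idle stretch) would cross the "pause" threshold, while normal engagement falls through to "continue."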

Contact: Sidney D’Mello, 574-631-8322,

Personal robots

Laurel Riek patient simulators

Laurel Riek, Clare Boothe Luce Assistant Professor of Computer Science and Engineering, builds robots able to sense, respond to and adapt to people. Until fairly recently, robots were separated from people by cages, but now they need to work alongside people in human environments, which are dynamic and constantly changing. Riek’s research tackles the fundamental and applied problems that make this complex, real-world interaction so difficult.

One project explores team coordination, building computer vision and machine learning algorithms able to sense how people coordinate their behaviors in real time. This knowledge is then used by Riek’s robots to automatically adapt their behavior to cooperate with people. In addition to informing robotics, the results of this project will be used in health care settings to enable robots to be more adaptive when helping people with disabilities, and to design new sensing systems that help surgical teams catch fatal mistakes in the operating room before they happen.

On another project, Riek is improving the state of the art for the most commonly used humanoid robots in the world today: robotic human-patient simulators. These are life-sized robots that train clinicians to safely treat patients. Unfortunately, current systems are missing a key feature: Despite the critical importance of facial cues in diagnosis and effective communication, none of the commercially available simulators have expressive faces, which undermines simulation realism and student immersion. Riek’s research, supported by an NSF CAREER award, involves designing intelligent, interactive, high-fidelity robotic patient simulator systems that can express patient signals of pain, stroke and neurological impairment.

Contact: Laurel Riek, 574-631-8380,

Walking robots

KURMET is a Locomotion and Biomechanics Lab robot that hops

Although C-3PO’s walking gait is a bit clunky, it is still decidedly more human-like than that of most existing humanoid robots. Using what is called ZMP-based walking, these robots rely on carefully choreographed walking motions, perfectly flat ground and large feet to ensure stability. This approach is relatively simple but inefficient because it fails to consider the natural dynamics of walking that humans unconsciously exploit to minimize energy. Much of the biped robotics research at Notre Dame’s Locomotion and Biomechanics Laboratory focuses on developing techniques that enable these robots to walk more dynamically, as humans do. One approach employs a mathematical technique known as Hybrid Zero Dynamics (HZD) that generates stable gaits even when the robot lacks ankles and has a foot so small that it can be considered a single point, like a ballerina walking on her toes.
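The role of large feet in ZMP-based walking can be seen in the standard linear inverted pendulum approximation (a textbook model, not the lab’s own code): the zero-moment point shifts away from the center of mass as the robot accelerates, and the gait is only considered stable while that point stays inside the foot. A point foot shrinks that safe region to nothing, which is exactly why it defeats ZMP-based control.

```python
G = 9.81  # gravitational acceleration, m/s^2

def zero_moment_point(x_com, z_com, x_com_accel):
    """ZMP in the sagittal plane for the linear inverted pendulum model
    (constant center-of-mass height z_com)."""
    return x_com - (z_com / G) * x_com_accel

def zmp_stable(x_zmp, foot_heel, foot_toe):
    """ZMP-based walking counts as stable only while the ZMP stays
    inside the foot's support region [foot_heel, foot_toe]."""
    return foot_heel <= x_zmp <= foot_toe
```

For a robot accelerating forward, the ZMP moves backward toward the heel; a longer foot tolerates harder accelerations, while a point foot (heel = toe) tolerates essentially none.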

An alternative strategy applies direct nonlinear optimization to generate dynamically feasible gait trajectories. While lacking the stability guarantees of HZD, it is applicable to a broader range of robot designs. Both techniques have proven effective in experiments with a custom-built robot at Notre Dame.
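The structure of direct trajectory optimization can be shown with a toy problem: choose sampled positions for a single hypothetical joint that hit fixed start, mid-stride and end values while minimizing a finite-difference "effort" cost. Real gait optimization adds full-body dynamics, contact constraints and a proper nonlinear programming solver; this sketch only illustrates the decision-variable-plus-cost structure of the approach.

```python
def effort(traj, dt):
    """Sum of squared accelerations, approximated by central differences."""
    return sum(((traj[i + 1] - 2 * traj[i] + traj[i - 1]) / dt ** 2) ** 2
               for i in range(1, len(traj) - 1))

def optimize_trajectory(start, goal, clearance, n=11, dt=0.1,
                        iters=300, step=0.005):
    """Crude coordinate descent over the free interior sample points.

    The mid-trajectory 'clearance' value stands in for a constraint such
    as swing-foot apex height; it stays fixed, as do the endpoints.
    """
    traj = [start + (goal - start) * i / (n - 1) for i in range(n)]
    mid = n // 2
    traj[mid] = clearance
    fixed = {0, mid, n - 1}
    for _ in range(iters):
        for i in (j for j in range(1, n - 1) if j not in fixed):
            for delta in (step, -step):
                before = effort(traj, dt)
                traj[i] += delta
                if effort(traj, dt) >= before:
                    traj[i] -= delta  # revert moves that don't help
    return traj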

Recently, a flywheel has been mounted on the body of that robot to provide stabilization regardless of foot-ground contact. As an added benefit of this approach, current work has shown that this inertial actuator not only improves stability, but can also improve walking efficiency in some circumstances.
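A body-mounted flywheel stabilizes a robot the same way a reaction wheel stabilizes a satellite: torquing the wheel one way applies an equal and opposite torque to the body, independent of what the feet are touching. A minimal sketch of that idea, with a hypothetical PD control law and made-up gains rather than the lab’s actual controller:

```python
def flywheel_torque(body_angle, body_rate, kp=40.0, kd=8.0):
    """Torque command to the flywheel motor (N*m).

    Gains kp and kd are illustrative. Accelerating the wheel in the
    direction of the tilt applies a reaction torque to the body in the
    opposite direction, pushing the body back toward upright regardless
    of foot-ground contact.
    """
    return kp * body_angle + kd * body_rate
```

A forward tilt (positive angle) produces a positive wheel torque, whose reaction pushes the body back; a backward tilt produces the opposite.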

Contact: James Schmiedeler, associate professor of engineering, 574-631-6403,