March 29, 2024

Sapiens Digital

Scientists Teach Robot To Mimic Human Facial Expressions With Deep Learning

Researchers at the Creative Machines Lab have created EVA, a robotic face that can mimic human facial expressions in real time. Teaching EVA to do so took over five years. The research, conducted at Columbia Engineering in New York, was presented at the International Conference on Robotics and Automation (ICRA) on May 30, 2021.

The team used deep learning, a branch of artificial intelligence, to let EVA watch videos of herself making random faces and teach herself how to produce them. They then used this self-knowledge to mirror onto her face human expressions captured by a video camera.

A smiling robot is not just about programming

From nursing homes to grocery stores, the use of robots is on the rise, and humans and robots are working together more closely than ever. In many workplaces, humans give their robot colleagues a name or draw eyes on them to humanize them. In return, the robots can offer only the standard stoic faces they were manufactured with.

Boyuan Chen, one of the researchers who worked on the EVA project, says, “Robots are intertwined in our lives in a growing number of ways. So building trust between humans and machines is important.”

A few years ago, Hod Lipson, Chen’s advisor, and his students noticed that the robots in their lab seemed to be staring back at them through plastic eyes. Recalling that observation, Lipson says, “There is a limit to how much we can engage emotionally with chatbots or smart-home speakers. Our brains respond well to robots that have a recognizable physical presence.” As director of the Creative Machines Lab, Lipson tasked Zanwar Faraj, an undergraduate student, with building a responsive, expressive robot face.

Faraj had quite a challenge on his hands. For decades, robots have been built from metal or hard plastic, and their faces have housed crude, bulky components such as sensors, circuits, and motors. Faraj’s team turned to 3D printing to overcome this constraint, manufacturing parts that would fit inside a Blue Man Group-inspired face, which they called EVA.

The team spent several weeks figuring out which ‘muscles’ (motors and cables) had to move for a smile, a frown, and the many other nuanced expressions the human face is capable of. Once the mechanics were sorted out, the team moved on to the programming part of the project.
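
To make the idea concrete, here is a minimal sketch in Python of how an expression could map to a pattern of motor positions. Every motor name and position value below is invented for illustration; EVA's actual actuator layout and calibration are described in the team's paper, not here.

```python
# Hypothetical expression presets: normalized actuator positions
# (0.0 = relaxed, 1.0 = fully actuated) for a set of facial
# "muscle" motors. Names and values are purely illustrative.
EXPRESSION_PRESETS = {
    "smile": {"mouth_left": 0.8, "mouth_right": 0.8, "brow_left": 0.2, "brow_right": 0.2},
    "frown": {"mouth_left": 0.1, "mouth_right": 0.1, "brow_left": 0.9, "brow_right": 0.9},
}

def apply_expression(name: str, send_command) -> None:
    """Drive each motor to the preset position for the named expression."""
    for motor, position in EXPRESSION_PRESETS[name].items():
        send_command(motor, position)

# Usage with a stand-in for a real motor interface:
apply_expression("smile", lambda motor, pos: print(f"{motor} -> {pos:.1f}"))
```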

Facial expressions are complex 

Chen, responsible for the software part of the project, realized that facial expressions in social settings were too complex a problem to be governed by pre-set rules. He worked with another team and created a brain for EVA using deep learning neural networks. Inspired by the human brain, a neural network is a set of algorithms that learns the underlying relationships in a data set. In a deep learning setup, the neural network works directly with raw data as its input, and while making sense of that data it also ‘learns’ which features matter for the task it has to perform.
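
As a rough illustration of what such a network looks like in code, here is a minimal feedforward network written with PyTorch. The layer sizes and input/output dimensions are arbitrary choices for the example, not details of EVA's actual model.

```python
import torch
import torch.nn as nn

# A minimal deep neural network: stacked layers that learn a mapping
# from raw input data to an output. All sizes here are illustrative.
class TinyNet(nn.Module):
    def __init__(self, n_inputs: int, n_outputs: int):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(n_inputs, 64),   # hidden layer learns intermediate features
            nn.ReLU(),                 # non-linearity lets the net model complex relationships
            nn.Linear(64, n_outputs),  # output layer produces the prediction
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.layers(x)

net = TinyNet(n_inputs=10, n_outputs=3)
x = torch.randn(1, 10)  # one sample of raw input data
print(net(x))           # the untrained network's current prediction
```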

Chen’s team recorded EVA making random faces and then showed her the footage. EVA’s ‘brain’ paired these facial expressions with her own ‘muscle’ movements and learned how her own face worked. The team then used a second neural network that allowed EVA to mirror onto her own face the images of a human face captured on a video camera.
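
In code, that two-stage setup might look roughly like the following PyTorch sketch. It is a guess at the structure under stated assumptions, with invented motor counts and flat image encodings; the team's actual architecture, described in their ICRA paper, works on real camera frames and is more sophisticated.

```python
import torch
import torch.nn as nn

N_MOTORS = 12        # hypothetical number of facial "muscle" motors
IMG_FEATURES = 128   # hypothetical size of an encoded face image

# Stage 1: an inverse self-model. Given an encoded image of EVA's own
# face, predict the motor commands that produced it. Trained on footage
# of EVA making random faces, paired with the logged motor commands.
self_model = nn.Sequential(
    nn.Linear(IMG_FEATURES, 256), nn.ReLU(),
    nn.Linear(256, N_MOTORS), nn.Sigmoid(),  # motor positions in [0, 1]
)

# Stage 2: a mimicry model. Given an encoded image of a *human* face,
# predict the motor commands that reproduce that expression on EVA.
mimic_model = nn.Sequential(
    nn.Linear(IMG_FEATURES, 256), nn.ReLU(),
    nn.Linear(256, N_MOTORS), nn.Sigmoid(),
)

# One training step for stage 1 on (self-image, motor-command) pairs,
# using placeholder tensors in place of real recorded data:
optimizer = torch.optim.Adam(self_model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

self_images = torch.randn(32, IMG_FEATURES)  # placeholder encoded frames
motor_logs = torch.rand(32, N_MOTORS)        # placeholder logged commands
loss = loss_fn(self_model(self_images), motor_logs)
optimizer.zero_grad()
loss.backward()
optimizer.step()

# (Training the mimicry model, which needs a correspondence between
# human and robot expressions, is omitted here.)

# At run time, a camera frame of a human face goes through the mimicry
# model, and the predicted commands drive EVA's motors:
human_face = torch.randn(1, IMG_FEATURES)
commands = mimic_model(human_face)           # one command per motor
```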

The team at the Creative Machines Lab knows that mimicking expressions is just a drop in the ocean of the complex ways in which humans communicate. While EVA remains a laboratory experiment, they are confident that their technology will have real-world applications. In the future, robots capable of “reading” and responding to human body language could be used at work, in hospitals, and even in schools.

More information about the EVA project can be found on the project website. To dive into a world of curious and creative machines, you can visit the Creative Machines Lab.
