September 23, 2023


Sapiens Digital

New Game Exposes the Risks of AI Emotion Recognition

Your chance to pull a funny face in front of a webcam or smartphone camera has arrived, and we’re not talking about the regular family Zoom call or social media post.

Scientists from the University of Cambridge and University College London in the U.K. have created a website aimed at boosting the discussion around emotion recognition technology, as well as pointing out its risks.

Their site, called Emojify, was built to help people understand how computers can be used to scan people’s faces to guess their emotions — something that’s called artificial intelligence emotion recognition. 

The researchers pointed out that this type of technology is already in use in parts of the world, for example, to detect a job applicant’s enthusiasm during an interview, or to determine someone’s innocence or guilt during a court trial.

Some might say it’s being used to help make some pretty important life decisions, which raises the question: is that ethically right? This is precisely the conversation the research team is trying to start.

How the Emojify website works

On its website, the team has set up a game that asks the user to pull a number of faces at the camera, which the system then tries to match to six human emotions: happiness, sadness, fear, surprise, disgust, and anger.

There are also a few questions the user can answer, such as whether they have used this type of technology before, whether they find it useful or worrying, and whether they had even heard of it.

None of these images or data are saved, and the hope is to boost the conversation around the controversial tech. 

Dr. Alexa Hagerty, project lead and researcher at the University of Cambridge’s Leverhulme Centre for the Future of Intelligence, told Indy500, “Many people are surprised to learn that emotion recognition technology exists and is already in use.”

“Our project gives people a chance to experience these systems for themselves and get a better idea of how powerful they are, but also how flawed,” she continued.

As Dr. Hagerty also pointed out to The Guardian, the main worries surrounding this tech are its huge potential for discrimination and surveillance. Furthermore, the team questions whether technology can accurately infer humans’ emotions from facial movements; for instance, we sometimes smile to be polite, not because we’re happy.

So the team has set out to gather the public’s opinion, and to start the debate amongst those of us whose faces will be scanned without our knowledge.
