Twitter's photo preview feature has faced increasing scrutiny as users around the world probe its neural network cropping algorithm, and this time the results are troubling.
In other words, the app's preview algorithm could have a racial bias.
Racial bias allegations on Zoom, Twitter
The discussion started when a white Twitter user described an apparent racial bias in Zoom, noting that when he and a Black colleague used virtual backgrounds, Zoom showed his face but erased his colleague's.
A faculty member has been asking how to stop Zoom from removing his head when he uses a virtual background. We suggested the usual plain background, good lighting etc, but it didn’t work. I was in a meeting with him today when I realized why it was happening.
— Colin Madland (@colinmadland) September 19, 2020
After this thread took off, other users began testing Twitter's preview cropping themselves, and the results were striking.
Testing with Obama, McConnell
One user experimented with a thread of tweets using photos of Mitch McConnell and Barack Obama. The algorithm apparently chose the white face in most tweets, lending further weight to the claim of racial bias in Twitter's neural network algorithm.
Trying a horrible experiment…
Which will the Twitter algorithm pick: Mitch McConnell or Barack Obama? pic.twitter.com/bR1GRyCkia
— Tony “Abolish (Pol)ICE” Arcieri 🦀 (@bascule) September 19, 2020
Some tried using Carl and Lenny from The Simpsons to see whether the behavior extended to cartoon characters as well. Guess what? Carl didn't appear in the previews.
I wonder if Twitter does this to fictional characters too.
Lenny Carl pic.twitter.com/fmJMWkkYEf
— Jordan Simonovski (@_jsimonovski) September 20, 2020
As users vented their frustration, Twitter's Chief Design Officer Dantley Davis jumped in with some tests of his own.
Here’s another example of what I’ve experimented with. It’s not a scientific test as it’s an isolated example, but it points to some variables that we need to look into. Both men now have the same suits and I covered their hands. We’re still investigating the NN. pic.twitter.com/06BhFgDkyA
— Dantley 🔥✊🏾💙 (@dantley) September 20, 2020
While Twitter works to resolve the issue, Liz Kelley of Twitter Communications thanked everyone for surfacing it and said the company had not found evidence of racial or gender bias when it tested the algorithm. She also added, "but it's clear that we've got more analysis to do."
The company's Chief Technology Officer, Parag Agrawal, replied to one of the users, saying that the neural network model needs "continuous improvement" and that he is "eager to learn" from this experience.
Even though the topic has sparked a large discussion, Twitter's 2018 blog post suggests the behavior stems from how the cropping algorithm works rather than from any deliberate choice.
The 2018 blog post explains that when the app crops photos for previews, the neural network predicts the most "salient" region of the image, the area a viewer's eyes are drawn to first, which tends to include faces, text, and high-contrast areas. That reliance on contrast may well be behind the recent issue with Black and white faces.
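Twitter's actual saliency model is a trained neural network, but the basic idea of a contrast-driven crop can be sketched with a toy example. Everything below is illustrative and assumed, not Twitter's code: it scores each candidate crop window purely by local contrast and keeps the highest-scoring one, which shows how such a signal can systematically favor higher-contrast regions regardless of what they depict.

```python
def local_contrast(window):
    """Toy contrast score: max pixel value minus min pixel value."""
    values = [px for row in window for px in row]
    return max(values) - min(values)

def pick_crop(image, crop_h, crop_w):
    """Slide a crop_h x crop_w window over a 2D grayscale image
    (list of lists of 0-255 ints) and return the top-left corner of
    the highest-contrast window, i.e. the region this naive
    contrast-driven cropper would keep in the preview."""
    best_score, best_pos = -1, (0, 0)
    rows, cols = len(image), len(image[0])
    for r in range(rows - crop_h + 1):
        for c in range(cols - crop_w + 1):
            window = [row[c:c + crop_w] for row in image[r:r + crop_h]]
            score = local_contrast(window)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Toy 4x8 "image": a high-contrast patch on the left half, a
# low-contrast patch on the right half.
image = [
    [30, 240, 30, 240, 100, 110, 100, 110],
    [240, 30, 240, 30, 110, 100, 110, 100],
    [30, 240, 30, 240, 100, 110, 100, 110],
    [240, 30, 240, 30, 110, 100, 110, 100],
]
print(pick_crop(image, 4, 4))  # -> (0, 0): the high-contrast patch wins
```

The point of the sketch is that nothing in the scoring function knows what a face is; if one face photographs with more contrast than another under the same lighting, a cropper driven by this kind of signal will prefer it every time.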