Computers That Consume Our Media Develop Sexist Views of Women
Women don’t play sports and belong in the kitchen, according to sophisticated learning machines.
In a study that has a number of disturbing implications, researchers found that computers become sexist when exposed to too much of our media. University of Virginia computer science professor Vicente Ordóñez noticed a pattern in how the image-recognition software he had developed interpreted photos. “It would see a picture of a kitchen and more often than not associate it with women, not men,” he told Wired.
That made Ordóñez wonder whether researchers were injecting their own biases into the computer’s thought processes. So he found some collaborators and decided to test the industry-standard photo sets, provided by Microsoft and Facebook, that are used to “train” image-recognition software.
They found that both data sets reinforced gender stereotypes in their depiction of activities such as cooking and sports. Pictures of shopping and washing were correlated with women, for example, while coaching and shooting were linked to men.
More worrying, image-recognition software trained with these data sets did not just reflect those biases; it magnified them. If a photo set associated women with cleaning, software trained on that photo set created an even stronger link between women and cleaning.
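To make that concrete, here is a minimal sketch (in Python, and not the researchers’ actual code) of how bias amplification can be measured: compute how skewed an activity’s gender ratio is in the training labels, then compare it with the skew in a trained model’s predictions. The counts below are hypothetical.

```python
# A minimal sketch of measuring bias amplification: compare how skewed an
# activity's gender ratio is in the training labels versus in a trained
# model's predictions. All numbers are hypothetical.

from collections import Counter

def gender_skew(pairs, activity):
    """Fraction of examples of `activity` labelled/predicted as 'woman'."""
    counts = Counter(gender for act, gender in pairs if act == activity)
    total = counts["woman"] + counts["man"]
    return counts["woman"] / total if total else 0.0

# Hypothetical (activity, gender) pairs; real data sets have thousands.
training_labels = [("cooking", "woman")] * 66 + [("cooking", "man")] * 34
model_predictions = [("cooking", "woman")] * 84 + [("cooking", "man")] * 16

train_bias = gender_skew(training_labels, "cooking")        # 0.66
predicted_bias = gender_skew(model_predictions, "cooking")  # 0.84

# A positive gap means the model exaggerates the data set's existing skew.
print(f"training bias:  {train_bias:.2f}")
print(f"predicted bias: {predicted_bias:.2f}")
print(f"amplification:  {predicted_bias - train_bias:+.2f}")
```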
For instance, the research paper shows a photo of a man at a stove that image-recognition software consistently labelled as “woman.”
These types of intelligent machine-learning programs are poised to explode in number and in importance. If we can’t get a handle on how to combat this problem, they could magnify the worst stereotypes society has about race and gender.
This is already happening. In 2015, Google’s automated photo service embarrassingly tagged black people as “gorillas.”
As learning computers become more sophisticated, this problem could have dramatic real-world consequences. Mark Yatskar, a researcher at the Allen Institute for Artificial Intelligence, imagines a future personal-assistant robot trying to guess what a human is doing in the kitchen. It might give a man a beer while offering to help a woman clean.
Last year, researchers from Boston University and Microsoft demonstrated that software that learned from text provided by Google News had acquired sexist biases as well. When asked to complete the statement “Man is to computer programmer as woman is to X,” it replied, “homemaker.”
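That analogy test is straightforward to reproduce with off-the-shelf tools. Here is a minimal sketch using the gensim library and the widely distributed Google News word2vec vectors; the file name is an assumption about your local setup, and this is not the study’s own code.

```python
# A minimal sketch of the analogy test, using gensim and pretrained
# Google News word2vec vectors. The file path is an assumption: substitute
# whatever pretrained vectors you have locally.

from gensim.models import KeyedVectors

vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

# "man is to computer_programmer as woman is to X" becomes vector
# arithmetic: X is the word closest to computer_programmer - man + woman.
result = vectors.most_similar(
    positive=["computer_programmer", "woman"],
    negative=["man"],
    topn=1,
)
print(result)  # reportedly yields 'homemaker' on these vectors
```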
Eric Horvitz, director of Microsoft Research, notes that the materials we give to children often reflect an idealized world, one where men and women are equally likely to be firefighters or homemakers. He suggests that a similar approach might be necessary for learning machines. “It’s a really important question: when should we change reality to make our systems perform in an aspirational way?” he asked.
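One way to act on that idea, sketched below with hypothetical data, is to rebalance the training set so that each activity appears with an even gender split before the model ever sees it.

```python
# A minimal sketch of the "aspirational" approach: downsample the
# over-represented gender within each activity so the training set
# presents a 50/50 world. The data structure is hypothetical.

import random
from collections import defaultdict

def rebalance(examples, seed=0):
    """Downsample the over-represented gender within each activity."""
    random.seed(seed)
    by_group = defaultdict(list)
    for activity, gender in examples:
        by_group[(activity, gender)].append((activity, gender))

    balanced = []
    for activity in {a for a, _ in by_group}:
        women = by_group[(activity, "woman")]
        men = by_group[(activity, "man")]
        n = min(len(women), len(men))
        balanced += random.sample(women, n) + random.sample(men, n)
    return balanced

data = [("cooking", "woman")] * 66 + [("cooking", "man")] * 34
print(len(rebalance(data)))  # 68 examples, now split evenly 34/34
```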
Other experts worry that feeding computers a distorted version of reality will hamper their effectiveness because the data no longer reflects the real world. Aylin Caliskan, a researcher at Princeton, says it’s important for the computer to know that there are more male construction workers in the world so that it can analyze data more effectively. She recommends identifying and correcting bias afterwards rather than providing “bad data” to the machines at the outset. “We risk losing essential information,” she says. “The datasets need to reflect the real statistics in the world.”
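Caliskan’s “correct it afterwards” approach can take many forms. One simple illustration, with made-up numbers, is prior correction: rescale the model’s output probability so the training set’s skew is divided out and a chosen real-world base rate is applied instead.

```python
# A minimal sketch of correcting bias after training rather than curating
# the data beforehand. Prior correction divides out the training base rate
# and applies a target rate. All numbers are illustrative assumptions.

def debias_score(p_woman, train_rate, target_rate):
    """Rescale a predicted P(woman) via prior correction on the odds."""
    # Divide the model's odds by the training-set odds...
    odds = (p_woman / (1 - p_woman)) * ((1 - train_rate) / train_rate)
    # ...then apply the target real-world odds.
    adjusted = odds * target_rate / (1 - target_rate)
    return adjusted / (1 + adjusted)

# The model says 90% "woman" for a cooking image; the training set was
# 66% women for that activity, but suppose the real-world rate is 60%.
print(f"{debias_score(0.90, train_rate=0.66, target_rate=0.60):.2f}")  # 0.87
```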
There may not be a clear-cut answer, but the issue needs to be addressed before these types of learning systems become even more prevalent.
One final thought: if these stereotypes are present in media that is supposed to be curated against racial and gender biases, and computers pick up on them and amplify them, what do you suppose is happening to our most important “learning machines,” the brains of our children, when they are constantly exposed to the unfiltered text and imagery that makes up much of modern society?