You’ll likely see headlines that Siri is racist; the reality is more nuanced

It’s likely we’re going to see headlines suggesting that Siri is racist, based on the findings of a study by Stanford University. The reality, however, is a little different …

The study, published in the journal Proceedings of the National Academy of Sciences, concluded that speech recognition systems created by tech giants exhibit a racial bias, and that Apple's system performed worst. Apple's recognition engine misidentified 23% of words spoken by white people, but 45% of those spoken by African Americans.
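For context, figures like these are word error rates, the standard speech-recognition metric: the number of words substituted, inserted or deleted, divided by the number of words in the reference transcript. Here's a minimal sketch of how such a figure is computed (the function name and example phrase are mine for illustration, not the study's):

def word_error_rate(reference: str, hypothesis: str) -> float:
    ref = reference.split()
    hyp = hypothesis.split()
    # dp[i][j] = minimum edits to turn the first i reference words
    # into the first j hypothesis words (Levenshtein distance over words)
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i  # i deletions
    for j in range(len(hyp) + 1):
        dp[0][j] = j  # j insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(
                dp[i - 1][j] + 1,         # deletion
                dp[i][j - 1] + 1,         # insertion
                dp[i - 1][j - 1] + cost,  # substitution or match
            )
    return dp[len(ref)][len(hyp)] / len(ref)

# One wrong word out of five gives a 20% word error rate.
print(word_error_rate("turn on the kitchen lights",
                      "turn on the chicken lights"))  # 0.2

On this measure, the 45% figure means that nearly half the words spoken by the African American testers were transcribed incorrectly.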

There are, however, two very big caveats to the findings.

First, this wasn’t a test of the actual live systems: Siri, Alexa, Google Assistant and so on. Instead, the researchers used the speech recognition tools the companies provide for use by third-party developers, as the New York Times explains:

The study tested five publicly available tools from Apple, Amazon, Google, IBM and Microsoft that anyone can use to build speech recognition services. These tools are not necessarily what Apple uses to build Siri or Amazon uses to build Alexa. But they may share underlying technology and practices with services like Siri and Alexa.

Each tool was tested last year, in late May and early June, and may operate differently now. The study also points out that, at the time of testing, Apple’s tool was set up differently from the others and required some additional engineering before it could be evaluated.
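To make the distinction concrete: the "publicly available tools" in question are cloud speech-to-text APIs aimed at developers, not the voice assistants themselves. A minimal sketch using one of the five, Google's Cloud Speech-to-Text, via its Python client (the audio file path is a placeholder, and this illustrates the category of tool rather than the researchers' actual test setup):

from google.cloud import speech  # pip install google-cloud-speech

client = speech.SpeechClient()

# Load a short audio clip (placeholder file; 16 kHz mono LINEAR16 WAV assumed).
with open("sample.wav", "rb") as f:
    audio = speech.RecognitionAudio(content=f.read())

config = speech.RecognitionConfig(
    encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=16000,
    language_code="en-US",
)

response = client.recognize(config=config, audio=audio)
for result in response.results:
    # Each result carries candidate transcripts; take the top-ranked one.
    print(result.alternatives[0].transcript)

The other four companies' APIs work along broadly similar lines: you submit audio, the service returns candidate transcripts, and researchers can compare those transcripts against a human-made reference.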

Second, the African-American speech transcribed by the systems was what the researchers describe as African-American Vernacular English, which is to say it contained a lot of slang.

Based in a largely African-American rural community in eastern North Carolina, a midsize city in western New York and Washington, D.C., the black testers spoke in what linguists call African-American Vernacular English — a variety of English sometimes spoken by African-Americans in urban areas and other parts of the United States. The white people were in California, some in the state capital, Sacramento, and others from a rural and largely white area about 300 miles away.

No differences were found when the same slang words were spoken by white and African-American people, so the bias described is in recognition of the terminology rather than in the accents or voices.

That said, it is reasonable to look at how well speech-recognition systems cope with commonly used slang. I’m just a little puzzled as to why the researchers didn’t conduct the study on the actual live services each company offers. So, no, this doesn’t tell us whether or not Siri is racist.

Siri was recently updated with a CDC questionnaire to help determine whether you are likely to have COVID-19.
