Pinterest said it’s so hard to remove anti-vax disinformation that it had to find a plan B

Image-sharing app Pinterest says it has proven so difficult to identify and remove all the anti-vax disinformation posted to the service that it has had to find an unusual solution to the problem …

The WSJ reports that, rather than relying on removing the disinformation itself, Pinterest has instead blocked searches for it.

Pinterest has stopped returning results for searches related to vaccinations, a drastic step the social-media company said is aimed at curbing the spread of misinformation but one that demonstrates the power of tech companies to censor discussion of hot-button issues.

Most shared images on Pinterest relating to vaccination cautioned against it, contradicting established medical guidelines and research showing that vaccines are safe, Pinterest said. The image-searching platform tried to remove the antivaccination content, a Pinterest spokeswoman said, but has been unable to remove it completely.

The company earlier took the same action to block searches for claimed cancer therapies.

In both cases, the company took the view that the risk of harm from false claims was sufficient that it justified the step.

Most anti-vax posts stem from a fraudulent 1998 paper which claimed to have found a link between the MMR vaccine – for measles, mumps, and rubella – and autism. The paper was written by Andrew Wakefield, who was later revealed to have produced it in the hope of generating demand for an alternative vaccination product in which he had a financial interest. Ill-informed people continue to cite the study in their anti-vax disinformation despite the fact that Wakefield was struck off the medical register after being found guilty of both dishonesty and abuse of children.

The WSJ piece points to the difficult tightrope social media companies have to walk at times.

The aggressive move by Pinterest marks another change in the way large tech companies are trying to handle the responsibility of monitoring the flow of information.

“Until recently, social-media companies have drawn a line in the sand saying they’re not arbiters of truth; that they are passive purveyors of information,” said Samuel Woolley, a researcher who studies social-media disinformation at the Institute for the Future think tank.

“There’s been pressure on them for a long time to respond to this because the reality of this is the spread of misinformation—especially around vaccines—leads to extremely bad consequences, including death,” he said.

Google, Facebook, and YouTube are among the other companies that have had to tackle the issue, adjusting their algorithms to make health-related disinformation less prominent.

Author

Ben Lovejoy

Ben Lovejoy is a British technology writer and EU Editor for 9to5Mac. He’s known for his op-eds and diary pieces, exploring his experience of Apple products over time, for a more rounded review. He also writes fiction, with two technothriller novels, a couple of SF shorts and a rom-com!