This piece is part of an ongoing Motherboard series on Facebook's content moderation strategies. You can read the rest of the coverage here.
It only took around ten seconds.
On Wednesday, I created a fresh Instagram account and followed "Beware the Needle," an account with 34,000 followers that posts a steady stream of anti-vaccination content. I also followed the user's "backup" account mentioned in its bio, the creator clearly aware that Instagram may soon ban them. Instagram's "Suggested for You" feature then recommended I follow other accounts, including "Vaccines are Genocide" and "Vaccine Truth." I followed the latter, and checked which accounts Instagram now thought would be a good fit for me: another 24 accounts that were either explicitly against vaccinations in their profile descriptions, or that posted anti-vaccine content.
They included pseudo-scientists claiming that vaccines cause autism; accounts with tens of thousands of followers promising the "truth" about vaccinations through memes and images of misleading statistics; and individual mothers spouting the perceived, but false, dangers of vaccinating children against measles, polio, and other diseases.
“The sheep continue to line up for the massacre. No questions asked,” the caption on a post from "Vaccine Truth" reads.
Over the last few months, politicians, researchers, and media outlets have highlighted issues with how social media platforms promote, amplify, and disseminate anti-vaccine content; content that may reinforce beliefs that people should not vaccinate their children, putting them at risk of disease. This month, Facebook announced it would push changes to Instagram making this sort of content less discoverable on the site, removing anti-vaxx material from the platform's "Explore" option, which shows new content the user may like, as well as removing anti-vaccine posts from hashtags.
But our experiment shows how Instagram’s recommendation engine suggests users follow more anti-vaxx accounts.
The users Instagram recommended to Motherboard's test account had between around 800 and 150,000 followers each, and between dozens and thousands of posts. The majority of the accounts focused solely on anti-vaccine content and ideology, while a handful also criss-crossed with more general conspiracy theories. Instagram also recommended a series of alternative and holistic health accounts, such as those promoting plant-based diets. Some of these accounts do include anti-vaxx posts but aren't specifically focused on the belief. (Motherboard did not count these in its total of anti-vaccine accounts.)
Some of the accounts made use of Instagram's Stories feature too, resharing memes from other accounts or running polls asking their audience whether they would prefer anti-vaccine tote bags or buttons in their online store (the store itself is not operated on Instagram).
“Exposing the vaccine agenda and spreading awareness on related health topics,” the bio of another anti-vaccine account Motherboard was recommended reads.
“Real Stories. Vaccine Injuries. Vaccine Deaths. What the mainstream media won’t show you,” the bio of another reads.
After pressure, a slew of social media sites, including Pinterest, Facebook, and YouTube, have acted against anti-vaccine content. In February, The Guardian reported how Facebook and YouTube help spread anti-vaxx propaganda, with YouTube's recommendations leading viewers from fact-based content over to anti-vaccine material, and Facebook's search results dominated by anti-vaxx pages. Representative Adam Schiff then sent a letter to Facebook CEO Mark Zuckerberg, encouraging him to take additional steps to curb the proliferation of this sort of content. This month, Facebook announced several new changes, such as not including anti-vaxx Facebook groups or Pages in recommendations or predictions when using the site's search function; rejecting ads that include misinformation about vaccinations; and exploring ways to share educational information about vaccines.
Erin Elizabeth, who has blogged on anti-vaccine positions, told Motherboard in an email that Facebook has acted against her content.
“It’s also been disheartening to see that Facebook has chosen to demote us and could possibly delete us, a fact that is even more worrisome since they own Instagram,” she said.
Facebook’s announcement said the company would not show or recommend related content on Instagram’s "Explore" feature or hashtag pages, which show all posts tagged with a certain keyword.
Those measures haven't come into force yet. Hashtags such as "informedconsent," "vaccineskill," and "vaccinescauseautism" have thousands or tens of thousands of posts at the time of writing, and include anti-vaxx content, Motherboard found. Instagram told Motherboard the new mitigations would be rolled out in the coming weeks.
Motherboard's test dealt more specifically with Instagram's "Suggested for You" and "Because You Follow" algorithmic recommendations, however, to see just how quickly those features can provide a stream of other related anti-vaccine accounts to follow.
Although it wasn't explicitly mentioned in Facebook's earlier announcement, Instagram told Motherboard it will be looking at different ways to minimize these sorts of recommendations too, including in the "Suggested for You" section. Unlike with the other measures, Instagram did not give a specific timeframe for this change, but said the company will update the community on its efforts.
For the moment, however, Instagram remains a hotbed of easily discoverable misinformation about vaccinations.