A young woman wears a grey t-shirt that reads “Back the Blue” and shows the Blue Lives Matter version of the American flag. Gesturing heavily with her hands, as if performing slam poetry, she lip-syncs to the following voiceover, read by a man:
“I’m pro-gun and pro-2a,
how does that make me a bigot or anti-gay?
And I’m pretty sure I was conceived during a threeway.
You look at my guns and say they’re a disgrace,
Tell me how many crimes you stop with your safe space.”
This is not an unusual or unexpected result. But Motherboard didn’t conduct this experiment to prove that TikTok is dominated by far-right users; it isn’t. Nor does the experiment prove that TikTok’s algorithm disproportionately promotes conservative content; it doesn’t. Rather, it shows that the TikTok recommendation algorithm was working exactly as we understand it to work: it drives users toward “engaging” content that they are inclined to like and share. This structure implicitly encourages users to spend as much time as possible on the app by showing them only content they already like.
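That feedback loop is simple enough to sketch in miniature. The toy simulation below is a hypothetical illustration, not TikTok’s actual system (which has never been published); the topic names, scoring function, and learning rate are all invented. It shows only the general dynamic: a recommender that always serves the candidate with the highest predicted engagement, then updates its picture of the user after each interaction, quickly narrows the feed toward what that user already likes.

```python
import random

# Toy simulation of an engagement-driven feed. Hypothetical illustration:
# the topics, scoring function, and learning rate are invented, not TikTok's.

TOPICS = ["guns", "comedy", "cooking", "politics", "pets"]

def make_item():
    """Each candidate video leans heavily toward one random topic."""
    lean = random.choice(TOPICS)
    return {t: (1.0 if t == lean else 0.1) for t in TOPICS}

def predicted_engagement(interests, item):
    """Score an item by how well it matches the estimated interests."""
    return sum(interests[t] * item[t] for t in TOPICS)

def simulate(steps=15, candidates_per_step=30, learning_rate=0.3):
    random.seed(1)
    # The recommender starts with a uniform picture of the user...
    interests = {t: 1.0 for t in TOPICS}
    # ...but the (hypothetical) user strongly prefers one topic.
    true_taste = {t: (1.0 if t == "guns" else 0.2) for t in TOPICS}

    for step in range(steps):
        # Serve whichever fresh candidate maximizes predicted engagement.
        pool = [make_item() for _ in range(candidates_per_step)]
        served = max(pool, key=lambda item: predicted_engagement(interests, item))
        # The user engages only when the video matches their real taste.
        liked = predicted_engagement(true_taste, served) > 1.0
        # Feedback loop: each interaction pulls the estimate toward
        # (or away from) whatever was just served.
        for t in TOPICS:
            signal = served[t] if liked else -0.1 * served[t]
            interests[t] += learning_rate * signal
        print(f"step {step:2d}: served {max(served, key=served.get):8s} liked={liked}")

if __name__ == "__main__":
    simulate()
```

Run it and the feed drifts: after a handful of unliked videos, the first liked topic dominates every subsequent recommendation, with no one ever telling the system what the user “believes.”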
"You can just keep getting fed content without thinking of why that content is being placed in front of your eyeballs specifically.”