While sometimes there's a big piece of misinformation that a lot of people latch onto—like The Rapture or the existence of "MedBeds"—the fractured nature of the information sphere has all but killed the overarching conspiracy theory. No longer do big ideas like "we never went to the moon" unite the dumbest minds; instead, the algorithm creates bespoke conspiracy theories. So instead of joining the Flat Earth Society, you might think the actual year is 1728, or that AI secretly imagined a British comedian from the 1980s and seeded the web with evidence of his existence.
But how does it start? And how quickly can social media platforms transform someone from a seeker-of-knowledge into a believer-in-bullshit? YouTuber Benaminute recently posted a video where he dug in to find out. His question: If you start with a benign, broad, randomly chosen subject, and you only watch videos having to do with that subject, how long will it take until TikTok, YouTube Shorts, and Instagram Reels feed you a conspiracy theory video? The answer: not long at all.
Dinosaurs
YouTube Shorts: The initial videos were ads for Jurassic Park, AI slop featuring dinosaurs, and the occasional educational video, but the 541st video was a clip from the Joe Rogan Experience about how the pyramids were not tombs, but "DNA restoration devices."
Instagram Reels: Insta took 661 videos to get from dinosaurs to a "forbidden phone from the 2000s that lets you see into a parallel dimension."
The Vietnam War
YouTube Shorts got to a conspiracy theory video about Noah's Ark in only seven videos.
Reels took 139 videos to get to "Bush did 9/11."
The 2000 election
YouTube Shorts took 136 videos to get to the same Noah's Ark conspiracy as it did for dinosaur fans.
Reels took only 26 videos to land on "The World Trade Center was bombed" (by either Clinton or Bush).
Which social media app leads to conspiracy theories fastest?
What does it all mean?
It would be easy to conclude that the massive tech companies that built YouTube, Instagram, and TikTok weight their recommendation engines so that viewers are led to fake stories. Maybe they have specific political aims and are trying to sway votes, or maybe (as Benaminute posits in a semi-tongue-in-cheek way) these apps are built to "keep us angry, divided, and distracted" from realizing the conflict isn't between Left and Right, but between "up and down."
I have no doubt that a social media platform featuring an algorithm that weighs the truth heavily would fail pretty quickly; the truth is boring compared to conspiracy theories. Conspiracy theories, broadly, make believers feel special, like they have inside knowledge the rest of us lack. People scroll TikTok to have fun; the truth usually isn't fun. Conspiracy theorists can say things like "UFOs are here!" or "They're turning the frogs gay!" Meanwhile, if you're devoted to the truth, you mostly have to go with "the best evidence suggests..." or "it seems logical that..." and who wants to hear that?
This article, "What People Are Getting Wrong This Week: The Short Video to Conspiracy Theory Pipeline," was originally published on Lifehacker.