What People Are Getting Wrong This Week: The Short Video to Conspiracy Theory Pipeline

While sometimes there's a big piece of misinformation that a lot of people latch onto—like the Rapture or the existence of "MedBeds"—the fractured nature of the information sphere has all but killed the overarching conspiracy theory. No longer do big ideas like "we never went to the moon" unite the dumbest minds; instead, the algorithm creates bespoke conspiracy theories. So instead of joining the Flat Earth Society, you might think the actual year is 1728, or that AI secretly imagined a British comedian from the 1980s and seeded the web with evidence of his existence.

But how does it start? And how quickly can social media platforms transform someone from a seeker-of-knowledge into a believer-in-bullshit? YouTuber Benaminute recently posted a video where he dug in to find out. His question: If you start with a benign, broad, randomly chosen subject, and you only watch videos having to do with that subject, how long will it take until TikTok, YouTube Shorts, and Instagram Reels feed you a conspiracy theory video? The answer: not long at all.

For the experiment, Benaminute created "blank" social media profiles and behaved like someone who was innocently curious about one of three topics: dinosaurs, the Vietnam War, and the 2000 presidential election. He put the keyword in each platform's search bar and only watched and liked videos about the initial subject.

Dinosaurs

YouTube Shorts: The initial videos were ads for Jurassic Park, AI slop featuring dinosaurs, and the occasional educational video, but the 541st video was a clip from the Joe Rogan Experience about how the pyramids were not tombs, but "DNA restoration devices."

TikTok: If you thought TikTok would get to conspiracy theories quickly, you'd be right. The 144th video was a fake UFO video that has 24 million views.

Instagram Reels: Insta took 661 videos to get from dinosaurs to a "forbidden phone from the 2000s that lets you see into a parallel dimension."

The Vietnam War

Things get worse for people interested in historical or political events. On all the short-form platforms, an interest in Vietnam will lead you pretty quickly to right-leaning content, which in turn leads you to conspiracy theories.

YouTube Shorts got to a conspiracy theory video about Noah's Ark in only seven videos.

TikTok took a little longer; video 161 was about how financial services company BlackRock had something to do with the attempted assassination of Donald Trump.

Reels took 139 videos to get to "Bush did 9/11."

The 2000 election

The election of 2000 is still a charged topic, but it's been a while, so maybe cooler heads and verified information will win the day? Spoiler: nope.

YouTube Shorts took 136 videos to get to the same Noah's Ark conspiracy as it did for dinosaur fans.

TikTok took only 38 videos to get to "The Rapture is happening on September 24."

Reels took only 26 videos to land on "The World Trade Center was bombed" (by either Clinton or Bush).

Which social media app leads to conspiracy theories fastest?

The champion of "normal search to conspiracy theory" speed runs is TikTok, with an average of 114 videos, or 57 minutes of watching. YouTube Shorts comes in second at 230 videos, or 1 hour and 57 minutes, and Reels takes 275 videos, or 2 hours and 18 minutes. It's a distinction without a difference, though: all three platforms lead to conspiracies in less time than it takes to watch a Marvel movie.

What does it all mean?

It would be easy to conclude that the massive tech companies behind YouTube, Instagram, and TikTok weight their recommendation engines so viewers are led to fake stories. Maybe they have specific political aims and are trying to sway votes, or maybe (as Benaminute posits in a semi-tongue-in-cheek way) these apps are built to "keep us angry, divided, and distracted" from realizing the real conflict isn't between Left and Right, but between "up and down."

This is also a conspiracy theory, however. I'm not saying he's wrong, but we don't have enough information to know why algorithms recommend conspiracy content. It could be because bad actors at the top demand specific results for some purpose, but it seems more likely to me that TikTok et al. don't have an agenda beyond making money.

I have no doubt that a social media platform featuring an algorithm that weights the truth heavily would fail pretty quickly; the truth is boring compared to conspiracy theories. Conspiracy theories, broadly, make believers feel special, like they have inside knowledge the rest of us lack. People scroll TikTok to have fun, and the truth usually isn't fun. Conspiracy theorists can say things like "UFOs are here!" or "They're turning the frogs gay!" Meanwhile, if you're devoted to the truth, you mostly have to go with "the best evidence suggests..." or "it seems logical that..." and who wants to hear that?
