How a debunked COVID-19 video kept spreading after Facebook and YouTube took it down

Although social media giants Facebook and YouTube removed a debunked documentary about the COVID-19 pandemic from their platforms, copies and variations of the video are still available on alternative social media sites, where hundreds of thousands of people are watching them.

And links to that content continue to appear on conventional platforms.


Plandemic is a 26-minute video, originally presented as a vignette from a planned longer documentary, that is full of false and misleading claims about the coronavirus, including about how people can protect themselves.

The video flooded social media platforms in the first week of May. According to The New York Times, it was viewed more than eight million times on major platforms.

The original version was removed from Facebook, YouTube and Vimeo as part of a crackdown on false or misleading information related to the COVID-19 pandemic. Facebook said some of the documentary's claims could cause "imminent harm," and YouTube cited its rules against unsubstantiated medical diagnostic advice.


However, clips and modified versions of the documentary have resurfaced on those sites. Facebook has labeled them as false and linked to fact checks by news organizations that detail a number of problems with the content, including the promotion of conspiracy theories and incorrect medical information.

Complicating this effort is the fact that Plandemic is easy to find on sites known as alt-tech platforms, many of which position themselves as alternatives to popular social media platforms and as defenders of censorship-free speech.


These platforms generally act as a reservoir for content that has been flagged and removed from major sites like Facebook and YouTube, as was the case with Plandemic, and links to content on alt-tech platforms often find their way back onto major social media platforms.

So despite efforts by major sites to crack down on what they considered to be potentially harmful content, alt-tech platforms help to keep it circulating.

Mikovits walks with Mikki Willis, the filmmaker who created Plandemic, in this still from the video. (Screenshot/Elevate)

One of these alt-tech sites is BitChute, a video-sharing platform registered in the U.K. that is similar to YouTube, allowing people to comment and vote on posted videos. A search for the term "Plandemic" returns more than 1,770 results. The top result appears to be a re-posting of the original Plandemic video and had more than 1.6 million views as of Wednesday.


There are about a dozen alt-tech platforms in operation, some built on blockchain technology, which lets them promote themselves as decentralized.


It is not possible to verify how many users and visitors each one has, but some claim sizable communities.

Gab, a Twitter alternative favored by the far right, for example, said last year that it had at least one million registered users.

BitChute recently tweeted that it had 20 million unique visitors in April.

'Boosted by censorship'

Zarine Kharazian, assistant editor at the Atlantic Council's Digital Forensic Research Lab in Washington, D.C., who studies disinformation and how it affects democratic norms, says that in the case of Plandemic, the filmmaker seemed to anticipate that the video would be demoted or removed from major platforms, with initial posts previewing the video.

At the same time, she said, copies began to appear on alternative video platforms.

Then, when the video was flagged and removed from major social media sites, links to the video on alt-tech platforms appeared on Twitter and Facebook.

"This is how the video stays alive, even when the major platforms, with the control they have, tried to remove it and downgrade it," she said.

"I think paradoxically what is happening with the Plandemic the video is a kind of backfire censorship. It's called … the Streisand Effect. "

In 2003, legendary entertainer Barbra Streisand sued to have a photo of her home removed from the internet, which resulted in far more people viewing the photo because of the publicity surrounding the lawsuit.

In the case of Plandemic, Kharazian says, people who are curious about the video may be drawn to its now-taboo status and will seek it out.

"At the same time, … unconditional believers are outraged that the video has been removed. Therefore, they increase their type of effort to get the video in front of as many people as possible," she said.

"These two things together make any effort to moderate content on any platform not stop the conspiracy from spreading after the conspiracy goes viral."

Platforms defend freedom of expression

When asked about Plandemic remaining on the BitChute platform, the company wrote in an email: "There is already legislation that imposes reasonable restrictions on speech, such as not allowing incitement to violence, and we follow it completely.

"The censorship of ideas from Google and Facebook is wrong and counterproductive. Hiding the public from ideas, even bad ones, only makes society more susceptible to dangerous errors and violates the universal human right to people's freedom of expression."

WATCH | How Plandemic and other debunked content continues to spread:

A new video called 'Plandemic' is full of false claims about COVID-19, but that hasn't stopped it from spreading online. An expert explains how 'Plandemic' and other conspiracy theories catch on. 7:47

The administrators of DTube, another alt-tech video-sharing platform, are anonymous, but one who identifies himself as "heimindanger" answered CBC's questions on Discord, another platform where the DTube community hosts a forum.

"I have nothing to say about these videos, I didn't watch them, but I saw them circulating. Any type of content is allowed on DTube. If something is deleted on [YouTube], that same content is often sent back on alternative video solutions like ours ".

The administrator went on to say that users can vote content up or down; content that is upvoted is considered popular and shown to more users.

"So that anyone can fulfill their public duty to limit damage. Nobody controls what freedom of speech is or what is hurting, everything is based on votes," said the administrator.

Matthew Johnson, director of education at MediaSmarts, a Canadian non-profit organization that promotes digital and media literacy, says that in some cases, alt-tech platforms are created specifically to host content that has been taken down elsewhere.

"Sometimes they are created by conspiracy groups, hate groups, health disinformation groups as a refuge. And sometimes it is just that one of these platforms can have a more laissez-faire attitude towards moderating content" said Johnson.

WATCH | Debunking COVID-19 immunity boosters:

Misinformation about so-called miracle cures for COVID-19 is spreading online. Can you really buy your way to a better immune system? We asked an expert: UBC professor Bernie Garrett, who studies health fraud, including alternative medicine. 5:27

Because of the pandemic, the major platforms are relying more on automated content moderation than on human moderators, says Johnson, which can have negative effects.

"In some cases, there is an increase in the rate of false positives, which really … damages the public reputation for content moderation, because you see things that are being withdrawn and that should not be withdrawn," he said.

In March, for example, after a large number of posts linking to legitimate news sites – and even Canadian authorities – were blocked, Facebook blamed its anti-spam system and eventually fixed the problem.

Despite the limitations of moderation, it was important to try to remove Plandemic from the large platforms, because people are then less likely to stumble across it, Johnson said.

"This is particularly important in places like YouTube, where, for example, 70% of views don't happen because someone is looking for someone, they happen because people are watching what was recommended by the YouTube algorithm," he said.

"So, already removing him from these platforms is a victory. He is moving away from the public eye."

How platforms are removing content

The big platforms have stepped up their efforts to flag deceptive or potentially dangerous COVID-19 content with varying degrees of success, according to Philip N. Howard, director of the Oxford Internet Institute at the University of Oxford.

The institute did a study of how much misinformation remained on social media platforms after the initial posts were debunked by fact-checkers.

"Two-thirds of the stories we found on Twitter were still there a week later," said Howard, while on Facebook about 24% of posts remain on the platform without warning.

In a separate study, the group analyzed what kind of information the average person gets when searching YouTube for health information about COVID-19.

"We found that the average person looking for health information on YouTube does not find garbage. They often find things prepared by professional news organizations," he said.

"You need to do a lot of searching on YouTube to find the most conspiratorial content about COVID."

But, Howard said, those looking for that content will inevitably find it elsewhere – or build their own platforms.

"It is the starting point when a social media company tries to manage extreme views and expel them from the platform. These extremists will go elsewhere to try to create their own technologies and maintain their conversations."
