
More than 80 fact-checking organizations today published an open letter to YouTube’s CEO, Susan Wojcicki, calling YouTube “one of the major conduits of online disinformation and misinformation worldwide” and urging her to implement “a roadmap of policy and product interventions to improve the information ecosystem.”

The letter, published through Florida-based journalism nonprofit Poynter Institute, says YouTube’s current policies are “proving insufficient,” particularly when it comes to handling the spread of misinformation in languages other than English.

“YouTube is allowing its platform to be weaponized by unscrupulous actors to manipulate and exploit others, and to organize and fundraise themselves,” the letter says.

The spread of misinformation on YouTube is a well-documented, years-long problem. The company has addressed it broadly in occasional updates, the latest of which was an August 2021 blog post from its chief product officer, Neal Mohan. In that post, Mohan said misinformation is “[n]o longer contained to the sealed-off worlds of Holocaust deniers or 9-11 truthers, [and] it now stretches into every facet of society, sometimes tearing through communities with blistering speed.”

Mohan said YouTube tackles misinformation in part by focusing on how quickly it can remove Community Guidelines-violating videos. To that end, the platform removes nearly 10 million videos per quarter, “the majority of which don’t even reach 10 views,” he said.

He also made it clear, however, that YouTube is hesitant to take “an overly aggressive approach towards removals,” arguing that doing so might “send a message that controversial ideas are unacceptable,” and saying that he “personally believe[s] we’re better off as a society when we can have an open debate.”

That wishy-washy approach is evident in one of YouTube’s newest misinformation policies. Rolled out in October 2021, the policy says YouTube will demonetize, but not remove, content containing misinformation about climate change.

But the open letter to Wojcicki alleges that even topics YouTube has specifically cracked down on or banned, like COVID-19 and election misinformation, are still spreading to millions of people in non-English-language content.

The letter chronicles numerous alleged instances of misinformation: “millions” of views on videos in Greek and Arabic spreading misinformation about COVID-19 vaccines and supposed cures; “tens of thousands of users” in Brazil watching hate-speech-filled videos; and election misinformation videos with 2+ million views “denying human rights abuses and corruption during the Martial law years” in the Philippines.

The letter says these issues are present in English-language videos too. YouTube talked a big game about how it would handle misinformation around the 2020 election, only to hesitate when it came to removing false claims that Donald Trump had won or that Democrats had rigged the election using ballot fraud. The letter alleges that from the evening of Nov. 2, 2020, to Nov. 4, 2020, “YouTube videos supporting the ‘fraud’ narrative were watched more than 33 million times.”

So what does the letter want Wojcicki and YouTube to do about all this?

The 80+ signing organizations first want YouTube to commit to “meaningful transparency about disinformation on the platform”: they want it to allow independent organizations to research both the origins of misinformation spread on its site and “the most effective ways” to debunk that misinformation. The organizations also call for YouTube to publish its “full moderation policy regarding disinformation and misinformation, including the use of artificial intelligence and which data powers it.”

Additionally, the letter asks YouTube to focus more on surfacing factual, debunking information alongside videos, to take action against accounts that repeatedly post misinformation, and to “[e]xtend current and future effort against disinformation and misinformation in languages different from English.”

A YouTube spokesperson told CNET that “fact checking is a crucial tool to help viewers make their own informed decisions, but it’s one piece of a much larger puzzle to address the spread of misinformation.”

“Over the years, we’ve invested heavily in policies and products in all countries we operate to connect people to authoritative content, reduce the spread of borderline misinformation, and remove violative videos,” she said. “We’re always looking for meaningful ways to improve and will continue to strengthen our work with the fact checking community.”


Source: TubeFilter.com
