Video platforms tested as election misinformation runs rampant
Amid a strong effort by social media platforms to curb misinformation around the U.S. election, political operatives are finding loopholes on YouTube and other video platforms.
Google-owned YouTube has come under scrutiny for leaving online a video from a far-right media group claiming Donald Trump had won Tuesday's election, along with content from others close to the president challenging the integrity of the vote-counting process.
In the days after the vote, challenger Joe Biden was closing in on victory as Trump launched unsubstantiated claims of fraud and made clear he was not ready to concede.
Social media watchdogs say other videos containing falsehoods have circulated on TikTok and Facebook livestreams, but the biggest concerns have been raised around YouTube, the behemoth of online video.
"It appears like YouTube has done appreciably worse at policing disinformation around the election including from the president weighed against Twitter or Facebook, both of which have already been very aggressive in responding to the moment," said Daniel Kreiss, a researcher with the University of North Carolina's Center for Information, Technology, and Public Life.
The watchdog group Media Matters for America listed a number of questionable videos that YouTube left online, saying the clips had received several million combined views this week.
"YouTube videos pushing misinformation about the results of the 2020 presidential election have obtained high combined view counts, regardless of the platform's community guidelines prohibiting 'content that aims to mislead persons about voting,'" said Media Matters analyst Alex Kaplan in a blog post.
Analysts say policing video content can be challenging for platforms, which rely on artificial intelligence to scan for keywords and unverified allegations.
"The problem with video, especially live video, is that it is hard for artificial intelligence to discover a problem," said Adam Chiara, a professor of communication at the University of Hartford.
This can be critical because, for many young voters, "this is literally the way they are watching the election unfold... the country's youngest voters are scrolling through their social media feeds," Chiara said.
YouTube this week took down a video in which former Trump strategist Steve Bannon called for the beheading of the FBI director and the top pandemic advisor, but stopped short of banning the account, a step Twitter took.
"We've removed this video for violating our policy against inciting violence. We will still be vigilant as we enforce our policies in the post-election period," YouTube spokesman Alex Joseph said.
The toughened stance taken by Twitter and Facebook on unverified election claims has prompted operatives to turn to video on YouTube, TikTok and even Facebook Live in order to circumvent restrictions.
Media reports said Facebook was moving to limit the distribution of livestream video content about the election.
Video-sharing app TikTok, meanwhile, which had also pledged tough action on election misinformation, allowed several videos with false content to circulate with at least 200,000 combined views, according to Media Matters.
"Misinformation videos alleging mass voter fraud is going viral on TikTok," a Media Matters statement said earlier this week.
The watchdog group said TikTok allowed unfounded "magic ballot" narratives claiming that mail-in votes for Democratic nominee Biden are fraudulent, along with allegations that Arizona poll workers intentionally gave markers to Trump voters so that their ballots would not be processed.
The group said TikTok removed many of those videos after being notified.
Chiara said it is increasingly difficult to keep up with the velocity of social media content like video.
"Just with the sheer quantity of video that is misinformation and disinformation, even if the platforms are able to stop them all before they go viral, all of them are still reaching eyeballs before they are scrubbed," he said.
But some experts say platforms with a clear strategy can curb the flow of political misinformation.
Kreiss said that for YouTube and others, "any serious enforcement action could have started with the institutional political accounts."
He added that Facebook and Twitter appeared to be effective by monitoring videos shared by the president and those in his circle.
"You start from the most notable," he said. "I'd search for enforcement on the president and other elites who would like to undermine the credibility of the ballot box."
Carl Tobias, a University of Richmond law professor, said the battle over misinformation is still raging on social media.
"In the coming days, and until the presidential and Senate elections are concluded, I expect Twitter and Facebook to be equally vigilant because they have been so far, and I am cautiously optimistic that they can learn from some of their own missteps or new 'tricks' from politicians," he said.
Source: japantoday.com