How Facebook, Twitter, and Others Are Handling Election Misinformation


As Election Day drags into Election Week, here’s what Facebook, Instagram, Twitter, TikTok, and YouTube are doing to crack down on misinformation and keep users informed, from demoting content to outright bans.

Social media companies have come under fire this election season from both sides of the aisle, and as Election Day drags into Election Week, Twitter and Facebook have been playing whack-a-mole with conspiracy theorists, liars, and misinformation efforts. Here’s where we are.

Facebook

Amid a flurry of misinformation on the social network, Facebook has announced plans to slow the spread of questionable content. “As vote counting continues, we are seeing more reports of inaccurate claims about the election. While many of these claims have low engagement on our platform, we are taking additional temporary steps, which we’ve previously discussed, to keep this content from reaching more people,” Facebook said in a statement.

Demoting Content and Limiting Live Video

“These include demotions for content on Facebook and Instagram that our systems predict may be misinformation, including debunked claims about voting.”

Facebook also pledged to limit the distribution of election-related live videos. And if someone moves to share a post with an informational election label, they’ll see a message that encourages them to visit Facebook’s Voting Information Center. Much like Twitter is now calling on people to read articles before they tweet them, the idea is to stop people from mindlessly sharing things that might be incorrect.

Removing Potentially Dangerous Facebook Pages

The announcement comes after Facebook deleted a “Stop the Steal” page that was organizing protests at vote-counting locations around the country, the Wall Street Journal reports.

That page, which alleged widespread ballot fraud, surged to 361,000 members in one day, the Journal says, and resulted in people showing up at election offices in Arizona and Philadelphia—sometimes armed. Facebook says it removed the page after it “saw worrying calls for violence from some members of the group.”

As the Journal notes, the Cybersecurity and Infrastructure Security Agency (CISA)—an agency created by a bill that Trump signed into law in 2018—has tweeted multiple times to defend the security of election systems.

Announcing a Winner

Once “Reuters and a majority of independent decision desks at major media outlets, including ABC, CBS, Fox, NBC, CNN and AP project a winner,” Facebook plans to add notifications at the top of Facebook and Instagram feeds.

Limiting Controversial Hashtags

In the interim, Facebook is limiting the ability to find content with controversial hashtags, like #rigged or #sharpiegate. If you search for those terms, you’ll get a prompt that asks you to visit Facebook’s Voting Information Center. Generic election-related hashtags, like #biden or #trump, will still direct you to posts featuring them.

Instagram

Similarly, Instagram is limiting access to the “recent” posts section for hashtags.

Usually when you search a hashtag on the app, there’s the option to see the top posts with that hashtag, as well as the most recent posts that use it. But in a bid to prevent people from hijacking certain hashtags and flooding them with misinformation or unhelpful posts, Instagram has temporarily blocked the “recent” option.

If you search any hashtag on Instagram right now—election-related or not—you’ll see an alert at the top of the page explaining why the “recent” section is unavailable. Clicking the “Learn More” button takes you to an Instagram help page explaining the decision.

TikTok

As Media Matters reports, TikTok struggled to keep election misinformation in check this week. It removed some hashtags, according to TechCrunch, but with so many people on the platform, it’s been difficult to stay on top of it.

Twitter

President Trump’s Twitter feed is now awash in warnings that say content in his posts “is disputed and might be misleading about an election or other civic process.”

It started just after midnight on Nov. 4, when Trump alleged that he was “up BIG, but they are trying to STEAL the Election.” Twitter promptly slapped a label on that tweet, given there was no evidence to back up Trump’s assertion. “This action is in line with our Civic Integrity Policy,” Twitter said at the time.

It has since done the same on 11 other tweets from the president. It also added a label on that bizarre Trump tweet in which he “claimed” wins for “Electoral Vote purposes” in various states despite the fact that votes were still being counted.

Warnings About Vote-Count Delays

Twitter has since added notices atop people’s social feeds to remind them that vote counts will take some time this year, given the increase in mail-in ballots and state rules governing when they can be counted.

“Twitter has been reminding people in the US for several weeks that election results might be delayed. Today, you’ll see an updated reminder when you open Twitter with the latest #Election2020 news and information about when and how votes are counted,” it said on Wednesday.

Banning Trump From Twitter?

There has been a lot of discussion about why Twitter doesn’t suspend Trump; it basically comes down to the fact that he’s the president, so what he says is in the public interest. That could change if he loses re-election, though. As Bloomberg reports, “high-profile politicians no longer in office…are treated like regular users if they violate Twitter’s rules.” Trump could have his tweets removed rather than labeled, and he could be permanently banned from the platform should he repeatedly violate its terms.

YouTube

YouTube has taken heat this week for not acting quickly enough to label or remove misleading videos about the election. A video from conservative news outlet One America News Network, for example, features a woman stating definitively that Trump won four more years in office and baselessly alleging “rampant…voter fraud,” CNBC reports. YouTube has since de-monetized the video, but it remains live with more than 427K views.

YouTube did, however, remove an episode of former Trump advisor Steve Bannon’s show, which featured him calling for violence against FBI Director Christopher Wray and Dr. Anthony Fauci. Twitter also permanently banned Bannon from the platform for “glorifying violence.”

A number of election-related YouTube videos currently have a note underneath that says “Results may not be final. See the latest on Google,” which brings people to a Google search results page for “election results.”
