Congressman calls on sites to remove reports on ‘celebrity pedophilia’ during hearing
During a Congressional hearing on Tuesday, Democratic Rep. Ted Deutch demanded that Facebook, YouTube, and Twitter ban “conspiracy theories” about “celebrity pedophilia” on their platforms.
Representatives from each of the internet giants were called to answer questions before Congress regarding their “content filtering policies.”
Facebook, Twitter, and YouTube all emphasized that they are taking steps to demote independent media sources whilst promoting “authoritative content” from the mainstream news outlets.
All three platforms say they have also introduced a strike system, whereby users or pages that violate too many of the platform’s rules are permanently banned.
However, how many strikes a user gets, and whether or not a strike will be handed out, remain unclear to many users, including members of Congress.
Testifying in front of the House Judiciary Committee, Juniper Downs, Google’s Director of Public Policy and Government Relations, was asked by Deutch what YouTube is doing to remove “conspiracy theories” from the video streaming service.
Deutch elaborated on the “conspiracy theories” he wants to be blocked from the sites, which include:
“Shock reality, social experiments, celebrity pedophilia, false flag rants, and terror-related conspiracy theories dating back to the Oklahoma City attacks in 1995.”
VB reports: The hearing touched on a slew of the most-talked-about controversies involving social media companies in the last year — Russian election interference, Cambridge Analytica, and perceived suppression of conservative voices.
But one revealing exchange came when Rep. Ted Deutch (D-FL), asked Facebook and YouTube’s representatives specifically about what they’re doing to prevent conspiracy theories from spreading on the platform.
Deutch’s district includes Parkland, Florida, which was the site of a mass shooting at Marjory Stoneman Douglas High School several months ago.
He specifically brought up the case of the far-right blog Infowars, which published a YouTube video accusing survivors of the shooting of being crisis actors.
YouTube later took the video down — as YouTube’s global head of public policy and government relations Juniper Downs explained, if an individual or group is claiming that a “specific, well-documented violent attack didn’t happen and you use the name or image of survivors of the attack, that is a malicious attack and it violates our policy.”
YouTube’s community guidelines state that if a channel garners three strikes or three violations of community guidelines in three months, it will be “terminated.”
When asked at the hearing what YouTube was doing to stop the spread of conspiracy theories, Downs said that YouTube’s goal was primarily to “demote low-quality content” and provide “more authoritative content.”
Facebook vice president of global policy management Monika Bickert was also asked by Deutch how many strikes Facebook Groups, Pages, or Profiles have before they’re kicked off the platform.
Bickert responded that Facebook did have a strikes policy, one in which the “threshold varies depending on the severity of different types of violations,” but she didn’t offer any further specifics.
Combined, the two answers are a useful distillation of the problems social media platforms, including Facebook and YouTube, face today in keeping harmful content at bay.
Yes, social media companies are fond of talking about their community guidelines.
But just how consistently these platforms are enforcing guidelines, and what it takes for someone to completely get kicked off the platform, remain the bigger questions.
When asked by Rep. Karen Bass (D-CA) about what options Facebook offered for activists who feel like their content has been unfairly taken down, Bickert said that Facebook had an appeals process.
But Rep. Bass questioned how many users know about Facebook’s appeals process.
Facebook was already in damage-control mode earlier in the day, as a documentary questioning just how consistently the company applies its community guidelines to its content moderation practices aired in the U.K. on Tuesday.
Details about the documentary, which involved a reporter for U.K. broadcaster Channel 4 working undercover at an Ireland-based contractor that works with Facebook’s content moderation team, had already leaked.
One moderator reportedly told the reporter not to take down the page of a far-right activist who had violated Facebook’s policies, because “they have a lot of followers, so they’re generating a lot of revenue for Facebook.”
Facebook later published a blog post, attributed to Bickert, pushing back on the idea that the company believed turning a blind eye to bad content was necessary to generate more revenue.
Much of the hearing fell along partisan lines, as many Republicans used their time to ask the Facebook, YouTube, and Twitter representatives questions about why content favorable to conservatives seemed to be censored or why pages or accounts encouraging violence against conservatives were not.
Rep. Matt Gaetz (R-FL) questioned why a Facebook page that appeared to encourage violence against conservatives was still up on the platform, while Rep. Steve King (R-IA) asked why the Facebook traffic of far-right blog Gateway Pundit appeared to have dropped over the past year.
Democrats such as the vice ranking member of the committee, Jamie Raskin (D-MD), accused the committee of arranging the hearing to “resume consideration of the totally imaginary narrative that social media companies are biased against conservatives.”
The Judiciary Committee had held a hearing on similar topics three months prior, but invited only a pair of conservative video bloggers known as Diamond and Silk, who complained after receiving a message from Facebook characterizing their page as “dangerous.”
Facebook later apologized, saying that the characterization was incorrect.