Amidst vague and uninformed questions during today’s House Judiciary hearing with Facebook, Google, and Twitter on social media filtering practices, Representative Steve King (R-Iowa) dropped a bombshell. “What about converting the large behemoth organizations that we’re talking about here into public utilities?”
King’s suggestion followed his inquiries about right-wing outlet Gateway Pundit losing reach on social media and how Facebook’s algorithm worked. The insinuation was that these companies cannot properly maintain fair platforms for discourse.
The Representative also suggested there may be a need for “review” of Section 230 of the Communications Decency Act, which protects interactive computer services from being treated as the publisher of content users post on their platforms. If that rule were changed, social media companies could be held responsible for illegal content, from copyright infringement to child pornography, appearing on their platforms. That could cripple the social media industry by requiring extensive pre-vetting of any content they display.
The share prices of the tech giants did not see significant declines upon the Representative’s comments, indicating the markets don’t necessarily fear that overbearing regulation of this nature is likely.
Here’s the exchange between King and Juniper Downs, Google’s Global Head of Public Policy and Government Relations for YouTube:
King: “Ms Downs, I think you have a sense of my concern about where this is going. I’m all for freedom of speech, and free enterprise, and for competition and finding a way that competition itself does its own regulation so government doesn’t have to. But if this gets further out of hand, it appears to me that Section 230 needs to be reviewed.
And one of the discussions that I’m hearing is ‘what about converting the large behemoth organizations that we’re talking about here into public utilities?’ How do you respond to that inquiry?”
Downs: “As I said previously, we operate in a highly competitive environment, the tech industry is incredibly dynamic, we see new entrants all the time. We see competitors across all of our products at Google, and we believe that the framework that governs our services is an appropriate way to continue to support innovation.”
Unfortunately, many of the Representatives frittered away their five minutes each on questions the companies had already answered in previous congressional hearings or public announcements, allowing the witnesses to burn the time without providing much new information. Republican reps focused many of their questions on whether social media platforms are biased against conservatives. Democrats cited studies saying metrics do not show this bias, and concentrated their questions on how the platforms could protect elections from disinformation.
Protesters during the hearing held up signs behind Facebook’s Head of Global Policy Management Monika Bickert depicting Facebook CEO Mark Zuckerberg and COO Sheryl Sandberg as the heads of an octopus sitting atop a globe; the protesters were later removed.
One surprise came when Representative Jerrold Nadler (D-New York) motioned to cut the hearing short for an executive session to discuss President Trump’s comments at yesterday’s Helsinki press conference, which Nadler characterized as submissive to Russian President Vladimir Putin. However, the motion was defeated 12-10.
Facebook Says “The Threshold Varies” for Deleting Fake News
Later in the hearing, Rep. Ted Deutch (D-Florida) questioned Facebook and Google’s YouTube about how they deal with conspiracy theorists. The issue has been a huge pain point for Facebook this week after it gave vague answers for why it hasn’t deleted known faker Alex Jones’ Infowars Page, and tweeted that “We see Pages on both the left and the right pumping out what they consider opinion or analysis – but others call fake news.” Bickert today reiterated that “sharing information that is false does not violate our policies.”
As I detailed in this opinion piece, I think the right solution is to quarantine the Pages of Infowars and similar fake news outlets, preventing their posts or shares of links to their web domain from getting any visibility in the News Feed. Deleting the Page outright, absent instances of it directly inciting violence, would only make Jones a martyr and strengthen his counterfactual movement.
When Deutch asked how Infowars’ claims in YouTube videos that survivors of the Parkland shooting were crisis actors squared with the company’s policy, Downs explained that “We have a specific policy that says that if you say a well documented violent attack didn’t happen and you use the name or image of the survivors or victims of that attack, that is a malicious attack and it violates our policy.” She noted that YouTube has a ‘three strikes’ policy, that it is “demoting low quality content and promoting more authoritative content”, and that it’s now showing boxes atop results pages for problematic searches like ‘is the earth flat?’ with facts to dispel conspiracies.
Facebook’s answer was much less clear. Bickert told Deutch that “We do use a strikes model. What that means is that if a Page, or profile, or group is posting content and some of that violates our policies, we always remove the violating posts at a certain point” (emphasis mine). That’s where Facebook became suddenly less transparent.
“It depends on the nature of the content that is violating our policies. At a certain point we would also remove the Page, or the profile, or the group at issue,” Bickert continued. Deutch then asked how many strikes conspiracy theorists get. Bickert noted that ‘crisis actors’ claims violate its policy and it removes that content. “And we would continue to remove any violations from the Infowars Page.” But regarding Page-level removals, she got wishy-washy, saying “If they posted sufficient content that it violated our threshold, then the page would come down. The threshold varies depending on the different types of violations.”
Facebook will need to come up with a much clearer rubric for exactly how that threshold varies, and make it publicly available, or it will continue to be seen as indecisive and lacking a proper response.