Twitter, Facebook CEOs Face Senate Grilling


Mark Zuckerberg and Jack Dorsey face four hours of questioning on Tuesday by US Senate members over content moderation and who makes content decisions

The bosses of both Facebook and Twitter faced four hours of questioning on Tuesday over the way their platforms handled an article from the New York Post.

The New York Post article last month was about US presidential candidate Joe Biden and his son Hunter.

Twitter last month changed the policy that had blocked users from sharing a link to the article, a policy intended to stop the spread of content acquired through hacking.

Senate hearing

Twitter had changed its rules so that such posts would now be flagged as containing hacked material, rather than blocked.

The Post article contained screenshots of emails allegedly sent and received by Hunter Biden, as well as personal photos of him, allegedly taken from a laptop computer that had been left at a shop for repairs.

Both Mark Zuckerberg and Jack Dorsey answered questions about their policies, and also discussed possible future regulation of their industry, CNN reported.

It was the second time the CEOs had been summoned to testify in as many months.

Both CEOs were subjected to the usual share of allegations from Republican lawmakers convinced that social media platforms have a bias against conservative viewpoints.

Last month Senator Ted Cruz hit out at Twitter CEO Jack Dorsey, with the grandstanding Republican senator demanding of Dorsey: “who the hell elected you?”

[Image: Twitter chief executive Jack Dorsey]

Less grandstanding

There was less grandstanding this time around, but the theme of the hearing according to CNN was to establish what responsibilities tech companies should have for moderating content, and what role the US government should play.

Leading members of the Senate Judiciary Committee reportedly said they did not think it was appropriate for the US government to get directly involved in online content moderation.

“I am not, nor should we be on this committee, interested in being a member of the speech police,” said Sen. Richard Blumenthal, the panel’s top Democrat.

But Blumenthal indicated that he wants private citizens to be able to sue tech platforms for harms they’ve suffered as a result of the companies’ handling of content. They cannot do so now under Section 230 of the Communications Act, the signature US law that grants tech platforms legal immunity for many of their content decisions.

Blumenthal and Sen. Lindsey Graham, the committee’s Republican chairman, said changes are likely coming to Section 230, which has been targeted by both US President Donald Trump and President-elect Joe Biden.

Way forward

Zuckerberg and Dorsey reportedly spent hours debating with US lawmakers on, among other things, whether social media platforms are analogous to news publishers or telecommunications companies.

Zuckerberg pushed back, arguing that social media represents an entirely new sector of the economy that the federal government should hold accountable under a unique model.

“We do have responsibilities, and it may make sense for there to be liability for some of the content that is on the platform,” Zuckerberg said. “But I don’t think the analogies to these other industries … will ever be fully the right way to think about this.”

Zuckerberg reiterated his preference for clear rules for the internet.

Dorsey, by contrast, said federal policy should not depend too heavily on any single set of algorithms to moderate content.

Instead, he argued, consumers should be able to choose among many algorithms – or even to opt out of having content decisions made algorithmically altogether.

“As we look forward,” Dorsey reportedly said, “we have more and more of our decisions, of our operations, moving to algorithms which have a difficult time explaining why they make decisions, bringing transparency around those decisions. And that is why we believe that we should have more choice in how these algorithms are applied to our content, whether we use them at all, so we can turn them on and off – and have clarity around the outcomes that they are projecting and how they affect our experience.”

Author: Tom Jowitt