The chief executives of Facebook and Twitter on Tuesday defended the steps they took to limit election misinformation, testifying before lawmakers on both sides of the aisle who have grown increasingly critical of Big Tech.
Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey told the Senate Judiciary Committee they had been more aggressive this year than ever before in labeling misleading posts, clamping down on conspiracy theories and amplifying credible information about voting. Those measures started well before Election Day, as a record number of Americans cast ballots early. The social media companies have continued their monitoring into the weeks since November 3, as results were counted and later questioned by President Trump and his supporters.
Testifying remotely, Dorsey told the committee Twitter flagged 300,000 tweets between October 27 and November 11 for “disputed and potentially misleading” information about the election.
“More than a year ago, the public asked us to offer additional context to help make potentially misleading information more apparent. We did exactly that,” he said.
While Facebook also labeled many election-related posts, Zuckerberg — also testifying remotely — instead emphasized the company’s efforts to highlight authoritative sources of information in users’ feeds and its voter registration campaign.
“All taken together, I think that we really went quite far in terms of helping to distribute reliable and accurate information about what was going on during this election,” he said.
Democrats and Republicans are skeptical of social media, for different reasons
But some Democrats on the committee said the executives should be doing more to stamp out the baseless voter-fraud conspiracies and false claims of victory that the president and his allies continue to promote two weeks after Election Day.
“I recognize the steps — they’re really baby steps — that you’ve taken so far. The destructive, incendiary information is still a scourge on both your platforms,” said Sen. Richard Blumenthal, D-Conn. He added that executives from Google’s YouTube should also be facing lawmakers’ questions about the spread of misinformation through video.
When Zuckerberg told Sen. Mike Lee, R-Utah, that the company occasionally made “mistakes” in enforcing its rules, Lee responded: “They may be mistakes but they’re mistakes that rhyme … and the consistent theme happens to be Republicans, conservatives and pro-life activists.”
Dorsey also acknowledged that Twitter had made its own errors. He said the company had been wrong to block users from sharing a New York Post article about President-elect Joe Biden’s son, Hunter, in late October. After the decision sparked outrage — it was the initial impetus for Tuesday’s hearing — Twitter quickly backtracked, changing its policy within 24 hours.
“This demonstrates our ability to take feedback amid the stakes and make changes transparently to the public,” Dorsey said.
Senators drill down on content moderation decisions
Tuesday’s hearing produced fewer fireworks than Dorsey and Zuckerberg’s past appearances before Congress, including one just six days before Election Day.
Lawmakers from both parties spent the bulk of the hearing focused on how Facebook and Twitter moderate what users post on their platforms, in many cases pressing them over specific decisions to label individual posts.
Several Democrats grilled Zuckerberg on Facebook’s decision not to ban former White House adviser Steve Bannon after he called in a live-streamed video for the beheadings of Dr. Anthony Fauci and FBI Director Christopher Wray.
“How many times is Steve Bannon allowed to call for the murder of government officials before Facebook suspends his account?” Blumenthal asked, pointing out that Twitter had permanently suspended Bannon. Zuckerberg said that if Bannon repeatedly violated Facebook’s rules, it would take further action.
Bipartisan agreement that more regulation is needed
The one thing everyone at the hearing seemed to agree on? Tech companies need more regulation. Appetite is growing on Capitol Hill to consider new laws to address concerns ranging from antitrust and privacy to encryption and content moderation.
“There’s Republican and Democrat concern about the power that’s being used by social media outlets to tell us what we can see and what we can’t, what’s true and what’s not,” said Sen. Lindsey Graham, R-S.C., the committee chairman.
Even Zuckerberg and Dorsey said on Tuesday they were open to reforming long-standing legal protections that shield online platforms from liability over what users post.
Zuckerberg said tech companies should have to produce transparency reports about what posts they decide to remove. Dorsey called for users to be given more control over the algorithms that determine what content shows up in their timelines.
But what such regulation would look like is still unclear — especially since Republicans and Democrats do not agree on whether social media companies are doing too much, or not enough, to police their platforms.
Platforms’ next test looms in Georgia
Just how effective the companies’ efforts to curb election misinformation have been is still unclear. Facebook and Twitter both say they are studying the data and will share their findings publicly.
In the meantime, the companies are already facing pressure over another political race: the special elections to fill two Georgia Senate seats in January, which will determine which party controls the chamber.
“I’m concerned that both of your companies are, in fact, backsliding or retrenching — that you are failing to take action against dangerous disinformation, exactly the same kind of voter suppression tactics that existed in the last election,” Blumenthal said.
Zuckerberg replied that Facebook is taking a “similar approach” to Georgia as it did with the presidential election. Dorsey said Twitter was doing the same, adding, “we intend to learn from all of our experience with this election.”
Dorsey did confirm that one thing will change in the wake of November’s vote: President Trump will no longer qualify for Twitter’s policy exempting world leaders from some of its rules.
Under that policy, tweets from the president that would have been removed, had they been posted by any other user, were left up on the platform.
“If an account suddenly is not a world leader anymore, that particular policy goes away,” Dorsey said.
Editor’s note: Facebook is among NPR’s financial supporters.