On March 25th Mark Zuckerberg, Jack Dorsey and Sundar Pichai returned (virtually) to Washington D.C. to testify before the House Energy and Commerce Committee about the handling of fake news on their platforms. After more than 6 hours and 40 minutes of relentless questioning, lawmakers seemed to agree with the Chairman’s premise that “The time for self-regulation is over. It is time we legislate to hold you accountable”.
Chairman Frank Pallone set the tone of the hearing with especially unforgiving opening remarks, saying that over the last five years Facebook, Google, and Twitter “were warned about – but simply ignored – their platforms’ role in spreading disinformation”, save for a few “inadequate attempts to appease critics” by “tweaking some underlying policies”. The members of the Committee proceeded with a methodical grilling of the CEOs, making the most of the five minutes each was allocated to ask questions, some of which went unanswered, such as:
– Do you recognize that your site plays a role in spreading falsehoods around Covid-19 and racial biases?
– How much are you making in advertising revenues from kids under the age of 13?
– Is your algorithm designed to encourage the public to stay on the site?
– Responding to Congresswoman Schakowsky, Zuckerberg seemed to walk back Facebook COO Sheryl Sandberg’s downplaying of the role that the platform played in enabling the organisation of the Capitol riot.
– Jack Dorsey answered “yes” to the question “does your platform bear responsibility with regard to the #stopthesteal narrative that fueled the Capitol riot?”; his two counterparts answered a definite “no”.
– When pressed, all denied that their platforms’ algorithms were created to intentionally create addiction and increase revenues as a result.
What now? Probably Section 230 reform
After more than six hours of hearing, there was bipartisan agreement that the platforms play an active role in the rise of disinformation and extremism, and that regulation is in order. Rep. Angie Craig (D., Minn.): “Your industry cannot be trusted to regulate itself.” In particular, Section 230 and its reform loomed in the background throughout the hearing.
Facebook has clear ideas on how Section 230 should be reformed:
1) transparency obligations for platforms;
2) an obligation for big tech to have effective measures to handle clearly illegal content (i.e. opioids, child exploitation, terrorist content);
3) the reformed rule should not apply to startups and smaller platforms.
Twitter and Google stopped short of endorsing Zuckerberg’s proposal. Pichai said there is merit in efforts to increase transparency and accountability. Dorsey said he supports more transparency around content moderation practices but sees challenges in defining small platforms.