Whistleblower Frances Haugen on the alliance to hold social media accountable
We need to act now
The former Facebook manager joins the Council for Responsible Social Media, a new coalition created to press big tech to change.
Frances Haugen left her role as a product manager at Facebook in 2021, bringing with her a cache of internal documents illustrating allegations of wrongdoing at the company.
But a year later, despite congressional hearings and investigations, Meta has made few meaningful changes to its policies, Haugen says, and as the US midterm elections approach, the stakes are high.
“I’m extremely concerned about the upcoming election, and I’m even more concerned about future elections,” Haugen told the Guardian. “Without transparency and without oversight, we should expect [Facebook] will not spend enough on safety – they won’t produce a level of safety that we deserve.”
Frustrated by the inaction, Haugen is one of dozens of former government officials, independent researchers and public health advocates who are joining a new bipartisan coalition that hopes to force fundamental change to the world’s major tech platforms.
Launching on Thursday, the Council for Responsible Social Media (CRSM) aims to advocate for “bipartisan solutions” and serve as “a critical mechanism” in holding these companies accountable.
“The council is trying to bring together a bipartisan, diverse set of people to emphasize that these are not partisan issues,” Haugen told the Guardian. “These are common sense solutions that can make a really big difference, and we need to act now.”
Launched in partnership with political reform group Issue One, the CRSM will advocate for change in three main areas: kids, communities and national security.
Other members of the CRSM include former defense secretaries Chuck Hagel and Leon Panetta, former Congress members Claire McCaskill and Dick Gephardt, and former National Security Agency director Michael Rogers.
“Social media defines nearly every aspect of our social fabric and has changed the world as we know it. We can now see clearly that the companies operating these platforms have too often failed to be responsible stewards of our political, social and communications spaces,” Gephardt said.
Some 68% of Americans believe big tech firms have too much power and influence on the economy, and 56% say they should be more regulated than they are currently. But despite broad bipartisan support for action, Congress has for years failed to pass effective legislation.
The new council aims to advocate for reforms to bring more transparency and oversight to these companies, Haugen said, adding that there are several pieces of “low-hanging fruit” regulation that could be passed imminently.
That includes the Platform Accountability and Transparency Act, a bill introduced in 2021 that would require social media firms to comply with researcher data requests for external audits. Under the proposed law, failure to do so could result in loss of legal protections for content hosted on their platform.
Haugen also highlighted the Kids Online Safety Act, a bill introduced in 2022 that would install more safeguards and transparency for minors using social media.
“There are a number of large opportunities today that were not on the table a year ago in terms of moving forward in a bipartisan way,” she said. “They just need a push over the finish line.”
Haugen said these protections were only growing more important as tech companies continue to expand their reach. Nearly one year ago, Facebook’s parent company officially changed its name to Meta, with its chief executive, Mark Zuckerberg, announcing a new focus on building out a digital world called the metaverse.
“The idea that we’re going to let Mark Zuckerberg for a second time define a critical piece of social public infrastructure without any accountability or transparency is amazing,” she said. “He has not earned the privilege of being able to act with this level of impunity.”
Council looks to bridge divides on tech
Public officials in Washington for years have sparred along partisan lines over whether social media platforms take down too much or too little hate speech and misinformation.
A council launching this week aims to sidestep those disputes by proposing reforms that tackle issues of bipartisan concern, including children’s safety and national security.
The newly minted Council for Responsible Social Media, set up by the nonpartisan nonprofit Issue One, features a wide-ranging and influential lineup of former U.S. lawmakers and federal officials, advocates, scholars, industry leaders and whistleblowers.
They include former Democratic House majority leader Dick Gephardt, former defense secretary and White House chief of staff Leon Panetta, former Jan. 6 House select committee adviser and Republican congressman Denver Riggleman, former Google ethicist and Center for Humane Technology co-founder Tristan Harris and Facebook whistleblower Frances Haugen.
“This is not a think tank. This is an action tank,” Gephardt told The Technology 202. “We want to see results.”
The group is staking out three initial areas of focus: how social media platforms may adversely affect children’s mental health, how they can be weaponized by foreign adversaries to the detriment of U.S. national security and how they may exacerbate societal divisions.
“The core goal of the commission is to really show that there are bipartisan paths forward … that involve having companies have to actually talk about what is their role in society,” Haugen said in an interview.
Haugen said the council can move the debate around social media accountability forward by focusing on areas of “common ground,” like concerns around algorithmic amplification, transparency and platform design choices.
Haugen said proposals the council might explore include giving users, particularly children, the option to “reset algorithms” so they do not keep wandering down the same risky “rabbit holes.”
By focusing on systemic issues, she said, the group might be able to help build support for ideas that sidestep thorny speech debates.
The council has received roughly $250,000 in funding collectively from the Omidyar Network, the Newton and Rochelle Becker Charitable Trust and the Wend Collective, according to Issue One spokesperson Cory Combs. Combs said the council has not received direct funding from any tech companies. Some of the members, including Gephardt and Panetta, have ties to firms that have represented industry giants.
The Omidyar Network, a philanthropic venture launched by eBay founder Pierre Omidyar, has pushed for greater regulation of the tech giants. The Newton and Rochelle Becker Charitable Trust is a foundation that describes itself as a “support organization” for Jewish communities. The Wend Collective is a philanthropic group launched by Walmart magnate James Walton.
The council may also rally around legislation that already has bipartisan support, such as recent Senate bills on kids’ online safety and platform transparency, Haugen said.
One of those proposals, the Kids Online Safety Act, would give parents greater control over their children’s online activity and require that platforms give kids the option to opt out of algorithmic recommendations and other potentially harmful features. The other, the Platform Accountability and Transparency Act, would force companies to turn over data about their services to researchers, or face fresh liability threats if they don’t.
Both proposals have bipartisan support, and the children’s safety bill advanced out of committee in the Senate with broad backing. But neither has a counterpart in the House, and the latter hasn’t been formally introduced, signaling they may both still face a long path to passage.
The council is also poised to shine a brighter spotlight on how U.S. companies may be playing into the hands of foreign adversaries — scrutiny that has largely focused on TikTok, owned by Beijing-based tech giant ByteDance.
The council includes a slew of national security heavyweights, including former defense secretary Chuck Hagel, former CIA director Porter Goss and former Cybersecurity and Infrastructure Security Agency director Chris Krebs.
Haugen said one concept the group may explore is requiring “consistent reporting” by companies about how much they are investing to counter foreign influence operations.
The group is launching ahead of a summit in Washington on Thursday, where it plans to lay out a “plan of action for the months and years ahead,” according to a release.
Frances Haugen on Future of Facebook, Twitter
Former Facebook product manager and Beyond the Screen founder Frances Haugen joins Emily Chang to discuss what has and hasn’t changed in the year since she testified before the US Congress about Facebook’s harms to society. She also shares her thoughts on the state of big tech antitrust legislation and Elon Musk’s purchase of Twitter.