Facebook whistleblower Frances Haugen speaks at Brown

Haugen discusses misinformation, internal reform, Twitter takeover in interview with The Herald

Frances Haugen, a former Facebook employee who released internal documents revealing that the company knew its platform had helped fuel ethnic violence and mental health issues, spoke at the University Wednesday in a talk titled “Reforming Social Media From the Inside.” The event was one of the first in a series of talks Haugen plans to give at university campuses.

President Christina Paxson P’19 opened the talk and was followed by Mark Blyth, director of the William R. Rhodes Center for International Economics and Finance, who introduced Haugen.

Haugen spoke about the harms of Facebook, possible reforms and the ways students and Facebook employees alike can make a difference.

Haugen said that Facebook is especially damaging in developing countries, where Facebook offers free internet for “a billion people in the world, maybe more,” to use its platform.

As a result, quitting Facebook is not an option for many users, as “for a majority of languages in the world, 80% to 90% of all the content available on the internet in that language is only on Facebook,” Haugen said in an interview with The Herald.

And according to Haugen, 87% of Facebook’s fall 2020 budget to curb misinformation was focused on American content — even though Americans comprise only about 10% of Facebook users — because Facebook is more concerned about regulatory bodies in the United States and the European Union than in developing countries. As a result, the U.S. has “the cleanest version of Facebook in the world,” she said.

In contrast, the documents that Haugen leaked show that Facebook was aware that its platform was promoting misinformation and hate speech, which led to ethnic violence in Myanmar and Ethiopia. This disparity in Facebook’s policing of content is why Haugen said that censorship or removing content is not the solution.

“When we focus on censorship or moderation, we have to do it language by language,” she said. “We leave behind people who speak smaller languages, and often the places that have the most (ethnic) conflict … are also linguistically diverse.”

Instead, she said she advocates for solutions based on product changes that would work in any language, such as modifying algorithms to reduce the amount of content that users see but never asked for.

Haugen sorted content on Facebook into two buckets: content you asked for — such as posts from family and friends or from groups you chose to join — and content you did not consent to — such as posts that friends commented on or suggestions based on your activity.

“If we just show people more content that they consented to … you get less hate speech, less violence, less nudity, because the problem is not your friends. It is not your family. Even when your uncle is crazy, that’s not the problem,” she said. “The problem is systems that give the most reach to the most extreme ideas.”
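
To make the two-bucket distinction concrete, here is a minimal sketch of how such a classification might look in code. The User and FeedItem fields and the classification rules are illustrative assumptions drawn from Haugen’s examples, not Facebook’s actual ranking logic.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class User:
    friends: set = field(default_factory=set)        # people the user befriended
    joined_groups: set = field(default_factory=set)  # groups the user chose to join

@dataclass
class FeedItem:
    author: str
    group: Optional[str] = None     # group the post was made in, if any
    surfaced_by: str = "follow"     # e.g. "follow", "friend_comment", "suggestion"

def is_consented(item: FeedItem, user: User) -> bool:
    """Bucket 1: content the user asked for -- posts from friends and family,
    or posts in groups the user chose to join. Everything else (posts a friend
    merely commented on, activity-based suggestions) falls into bucket 2:
    content the user never consented to see."""
    if item.author in user.friends:
        return True
    if item.group is not None and item.group in user.joined_groups:
        return True
    return False

# Ranking more of bucket 1 and less of bucket 2 is language-agnostic:
# it requires no per-language moderation.
user = User(friends={"alice"}, joined_groups={"book_club"})
items = [
    FeedItem(author="alice"),                                   # friend: consented
    FeedItem(author="stranger", group="book_club"),             # joined group: consented
    FeedItem(author="stranger", surfaced_by="friend_comment"),  # not consented
]
consented_feed = [item for item in items if is_consented(item, user)]
```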

Haugen’s other suggestions included requiring users to click on links before resharing them, which she hypothesized would decrease misinformation, and limiting the number of times content can be reshared off of other reshares.
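
A rough sketch of how a platform might enforce both frictions follows. The Post structure, the click check and the depth cutoff of two (echoing the “cut reshare chains at two” question Haugen poses later) are assumptions for illustration, not a description of any real Facebook mechanism.

```python
from dataclasses import dataclass
from typing import Optional

MAX_RESHARE_DEPTH = 2  # hypothetical cutoff: stop chains of reshares-of-reshares

@dataclass
class Post:
    author: str
    link: Optional[str] = None           # outbound link attached to the post, if any
    reshare_of: Optional["Post"] = None  # the post this one reshares, if any

def reshare_depth(post: Post) -> int:
    # Number of reshare hops between this post and the original.
    depth = 0
    while post.reshare_of is not None:
        post = post.reshare_of
        depth += 1
    return depth

def can_reshare(post: Post, user_clicked_link: bool) -> bool:
    """Apply both proposed frictions:
    1. A post with a link can only be reshared after the user opens the link.
    2. Chains of reshares stop at MAX_RESHARE_DEPTH; past that, the user must
       create a fresh post rather than amplify the chain further."""
    if post.link is not None and not user_clicked_link:
        return False
    if reshare_depth(post) >= MAX_RESHARE_DEPTH:
        return False
    return True

# Example: an article reshared twice already cannot be reshared a third time.
original = Post(author="a", link="https://example.com/story")
second = Post(author="b", reshare_of=original)
third = Post(author="c", reshare_of=second)
assert can_reshare(second, user_clicked_link=True)     # depth 1: allowed
assert not can_reshare(third, user_clicked_link=True)  # depth 2: blocked
```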

But Haugen said that profit incentives are currently preventing these types of changes. As a result, she said she is “cautiously optimistic” about Elon Musk’s recent acquisition of Twitter: the platform’s transition to a privately held company, with less emphasis on maximizing profits for investors, could allow for policy improvements.

“We got to where we are with Facebook because of incentives. Some of those incentives are because it’s a publicly held company. They have to publicly report how many users they have. They have to publicly report their revenue,” she said.

Because investors care about metrics like how many users a platform has, Facebook has a disincentive to crack down on bots, which artificially inflate its user counts and, in turn, its profits, Haugen said.

“I’m very against the idea that we should view the idea of a billionaire saving us as a positive thing,” she said. “But at the same time, Elon announced he wanted to crack down on bots, and cracking down on bots is an example of something that’s very easy to do when you’re a privately held company because you don’t have to report your income anymore.”

Haugen also talked about the importance of Facebook employees working toward reform within the company. Specifically, she said that colleges have a role in preparing students to create change while working at companies like Facebook, but that they are currently unable to do so adequately.

“If we were talking about an oil company, every year we graduate 50,000 environmental science majors, maybe more globally, who learn how to test the water, test the soil, who learn what good regulation and bad regulation looks like, that learn all the different potential consequences and questions to ask,” she said. “When it comes to social media, we graduate zero people globally who can talk about things like, ‘Should we cut reshare chains at two?’ ‘Should you have to click on a link before you reshare?’ (and) a whole bunch of much more complicated things that are about network dynamics.”

She said that one part of educating students for more socially conscious work at tech companies is building simulations of algorithmic outcomes, a project Haugen hopes to develop.

“The reason we can educate those people for environmental science is because we have a chemistry lab bench. You can go blow stuff up, breathe stuff in you shouldn’t breathe in. You get a sense of what it is to be a chemist,” she said. “Imagine if we had simulated social networks that had the level of complexity or an approximation of what you see in your job right now.”

“You need to start developing curricula where you come in there and you have an argument about ‘how does this experiment perform for a child versus an adult, for a low-literacy person versus a high-literacy person.’ Until we have a lab bench, we’re not going to be able to teach those classes,” she added.

Still, Haugen emphasized her belief that employees and recent graduates can make change now.

“Go to Facebook. Please, please, please go to Facebook,” Haugen said. But she added that once there, recent graduates should focus on addressing pressing issues such as hate speech, arguing that there is time to meet the expectations of the job while still addressing social issues.

Haugen closed her remarks by emphasizing that “every single tech platform is disruptive”: The invention of the printing press led to religious wars and witch hunts; cheap printing enabled yellow journalism and wars fought for illegitimate reasons; and radio and cinema gave a platform to fascists, she said. But every time, Haugen added, people have learned to contain such harms.

“Every single time we’ve invented a new form of media, we’ve realized our limitations and figured out ways to course correct,” she said. “The reason this feels overwhelming right now is that it is our burden to figure out what’s next. But we’ve done it every single time before. We’re very, very resilient, and we make a difference.”
