Colorado Academy Welcomes ‘Facebook Whistleblower’ Frances Haugen

Colorado Academy Head of School Dr. Mike Davis welcomed families on January 16 to a Zoom discussion with Frances Haugen, the “Facebook Whistleblower” known for courageously revealing tens of thousands of pages of internal documents demonstrating that the social media giant prioritizes profits over public safety, putting people’s lives at risk around the world.

“Ever since I saw Haugen’s appearance on 60 Minutes, I’ve been hoping to have her speak to our community,” Dr. Davis said. In that interview, Haugen accused Facebook’s parent company, Meta, of misleading the public and investors about how it handles issues such as climate change, misinformation, and hate speech.

CA, Davis went on, is deeply engaged in thinking about how social media use and technologies such as AI are affecting students, and he is concerned that surveys of their technology use reveal that most Upper School students have witnessed disturbing online content or behavior on their phones. The school recently rolled out a voluntary pledge to “wait until eighth,” enlisting families in a broader movement to delay putting smartphones into children’s hands, a response to the growing body of research showing that social media, in particular, has had a devastating impact on the mental health of young people.

“These are issues everyone in schools should be concerned about,” argued Davis, “and I’m honored to welcome an advocate who is leading the way toward change.”

With a degree in Electrical and Computer Engineering from Olin College and an MBA from Harvard University, Haugen is a specialist in algorithmic product management, having worked on ranking algorithms at Google, Pinterest, Yelp, and Facebook. In 2019, she was recruited to Facebook to be the lead Product Manager on the Civic Misinformation team, which dealt with issues related to democracy and misinformation, and she later also worked on counter-espionage.

During her time at Facebook, Haugen became increasingly alarmed by the choices the company was making despite the dangers to users, and she decided to go public. She has since published The Power of One, a memoir about the career that led her to blow the whistle on Facebook, and has testified before the US Congress, the UK and EU Parliaments, and the French Senate and National Assembly. Through her nonprofit, Beyond the Screen, she works internationally to address the negative externalities of social media platforms.

Yet, Haugen told an audience of nearly 150 at the beginning of her talk, the issues she first helped to reveal in 2021 have still not received the attention they deserve, given the severity and prevalence of their impacts, especially on children. “Everyone in our country knows young people who have been harmed by social media,” she said, “and the reality is that in the next few years we will make decisions that will determine the new ‘digital normal.’”

Haugen cited data showing that rising rates of depression, anxiety, and self-harm among children and teens paralleled the growth of social media, and she described the role of platforms such as Facebook in stoking ethnic violence in countries including Myanmar and Ethiopia.

“We can have social media we enjoy, that connects us,” Haugen continued, “without tearing apart our democracy, putting our children in danger, and sowing violence across the world. We can do better.”

The key to making a difference is understanding how social media platforms such as Facebook really work, how they create harm and division, and how they can be made safer, she argued. “Meta has spent hundreds of millions of dollars on lobbying and other efforts to keep the discussion about safety focused on so-called ‘bad content’ and censorship,” she explained, “and to hide from the public any real knowledge of how Facebook works or how its own research proves it harms users.”

But the lawsuit filed against Meta in fall 2023 by 41 states, alleging that the company lied about the safety of its products and about what it did or did not do to keep kids safe, shined a spotlight on the business practices that, Haugen insists, drove corporate policies intended to maximize profits without regard for harmful consequences that were well known within the company. The revelations, she made clear, apply equally to Instagram, which Meta also owns, and to platforms such as TikTok and Snapchat.

Quoting from the opening paragraphs of the lawsuit, Haugen said, “Over the past decade, Meta has profoundly altered the psychological and social realities of a generation of young Americans. Meta has harnessed powerful and unprecedented technologies to entice, engage, and ultimately ensnare youth and teens. Its motive is profit, and in seeking to maximize its financial gains, Meta has repeatedly misled the public about the substantial dangers of its social media platforms. It has concealed the ways in which these platforms exploit and manipulate its most vulnerable consumers: teenagers and children. And it has ignored the sweeping damage these platforms have caused to the mental and physical health of our nation’s youth.”

“The crux of the issue,” underscored Haugen, “is that in doing so, Meta engaged in, and continues to engage in, deceptive and unlawful conduct in violation of state and federal law.”

[Photo: Haugen testifying before Congress in October 2021]

Haugen went on to detail some of the most concerning information contained in the states’ lawsuit, much of which was based on the documents she released:

  • First, Meta’s business model is based on maximizing the time young users spend on the platform. Because young users are so valuable to Meta’s advertisers, the company intentionally targets them and incentivizes employees to find ways to increase the time they spend online, contrary to the repeated testimony of Meta executives before Congress that this is not the case.
  • Second, Meta carefully developed an extensive set of psychologically manipulative product features specifically designed to influence children and teens and maximize their use of Facebook and Instagram—despite claiming that technologies such as dopamine-stimulating algorithms, intrusive notifications, and visual filters were not used for this purpose.
  • Third, Facebook’s content algorithms, which determine the unique mix of stories and advertisements that each user sees in the app, privilege disturbing, divisive, and potentially damaging material, which the company’s research shows drives deeper user engagement and revenue. (A simplified sketch of this kind of engagement-based ranking follows this list.)
  • And finally, Meta routinely published profoundly misleading reports purporting to show extremely low rates of negative and harmful experiences on its platforms, while maintaining a parallel body of internal research convincingly pinpointing Facebook’s role in promoting self-harm and eating disorders and in driving increased rates of depression, anxiety, insomnia, and disrupted learning among young people.
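
To make the ranking mechanism described above concrete, here is a deliberately simplified, hypothetical sketch of engagement-based feed ranking. Everything in it, including the Post fields, the weights, and the function names, is invented for illustration; it is not Meta’s actual system. It shows only the structural point Haugen makes: when a feed is sorted purely by predicted engagement, whatever provokes the strongest reactions rises to the top, whether or not it is good for the user.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float    # model's estimate of click probability
    predicted_comments: float  # model's estimate of comment probability
    predicted_reshares: float  # model's estimate of reshare probability

def engagement_score(post: Post) -> float:
    # Hypothetical weights: interactions that keep users on the platform
    # longer (comments, reshares) count for more than simple clicks.
    return (1.0 * post.predicted_clicks
            + 5.0 * post.predicted_comments
            + 10.0 * post.predicted_reshares)

def rank_feed(candidates: list[Post]) -> list[Post]:
    # Sort purely by predicted engagement: the objective never asks
    # whether the engagement is healthy or harmful.
    return sorted(candidates, key=engagement_score, reverse=True)
```

Nothing in this objective distinguishes a heartfelt reunion photo from an enraging rumor; if the rumor is predicted to draw more comments and reshares, it ranks higher.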

Meta and companies like it, Haugen concluded, cannot be trusted to keep kids safe. “Instead, I am a proponent of acting locally and quickly, even as we wait for governments to pass new laws that might help to regulate this industry.”

What we can do today, she went on, is work together to develop shared norms about how we care for children, even in the face of the efforts of Meta and its peers to keep them engaged.

“Let’s all agree to ‘wait until eighth,’” Haugen suggested. “Let’s work with kids to help them develop their own ‘rituals of governance’ to demand social media that works for them. Let’s build muscle around accountability in these systems.”

Young people around the world, Haugen said, are realizing they’re not powerless against companies like Meta; they’re beginning to demand something different. “We are starting to acknowledge that kids have genuinely lost something over the past decade or so—the ability to develop real-world social skills and build community offline.”

“Social media is only the first of many highly disruptive, intangible technologies we’re going to have to learn how to govern,” asserted Haugen. “Though we all may see the world individually, filtered by algorithms on our phones, these are not individual problems—these are ‘we’ problems. How are we going to stand up together to demand technology that we feel good about?”