As former Facebook employee Frances Haugen testified before Congress about the way the company’s algorithm and corporate culture spawn everything from anxiety and polarization to the hollowing out of the self-image of younger users, one question seemed to rise to the surface: Is it time to regulate (and perhaps break up) Facebook?
From a Catholic standpoint, the answer is absolutely and emphatically yes—and not just because Monday’s six-hour worldwide blackout of Facebook, Instagram, Messenger and WhatsApp meant that none of us had any idea whose birthday was on Oct. 4.
Here are three reasons the Catholic Church should be fighting to see Facebook regulated.
1) Facebook is essential to the world and the church.
First, let’s consider the question of whether Facebook should even continue to exist. It was a question no small number of people were asking yesterday.
Yesterday was nice #facebookdown pic.twitter.com/t7RZOx5iaf
— THE Jason 𝔻𝕒𝕧𝕚𝕕 Hyatt 🤘🏼🤙🏼 (@wolfast) October 5, 2021
#facebookdown This video of me walking into Twitter from my 4 year Instagram addiction yesterday when it went down pic.twitter.com/291dtimvcY
— Summer (@1nottheseason1) October 5, 2021
LETS HAVE AN ELECTION WHILE FACEBOOK’S DOWN
— 🔪 andrew born kimler 🔪 (@AndyKimy) October 4, 2021
But in a lot of places, there was nothing funny about Facebook losing service. As reported by both The Washington Post and The New York Times, WhatsApp has become a primary means of communication and commerce in many developing countries, where standard forms of telecommunications are expensive. It’s so essential to life in Tanzania that the government’s chief spokesperson posted a video on Twitter yesterday reassuring citizens that “all other government services provided online still remain available.”
According to Facebook itself, 2.76 billion people used at least one of its products each day in June. And as much as some in the United States and elsewhere might scoff at the idea of spending time on a site that sells every bit of information you post on it (or even draft), Facebook’s social media platforms are also the easiest and in many cases the only way for people to find and interact with old friends, colleagues across the world and also that guy I met at the bar. At this point, it really is the crossroads of humanity.
And during the pandemic we discovered exactly what that means for the church, as our 1.3 billion Catholics worldwide (and billions of non-Catholics, too) looked for ways of staying connected and spiritually fed at a time when none of us could be together in person. Parishes livestreamed Masses and meditations, prayer services and Scripture study on Facebook. Our own James Martin, S.J., launched a daily chat that averaged a thousand participants.
In the process we all discovered not only that it was possible to do these kinds of things (which for some of us oldsters was no small revelation) but also that doing so could expand our reach. During the pandemic I was running the Facebook feed for the Loyola Marymount Jesuit Community, and I was shocked to find how many people sometimes saw our own little stream of hopefulness, spirituality and Jeff Goldblum memes. Everyone hears about how Facebook’s engagement-based algorithm generates hostility, doubt and anxiety (see below). But it is also capable of tremendously boosting people’s spirits and sense of connection.
Facebook’s products are an essential part of the life and livelihood of many people worldwide, especially for many on the margins. And through Facebook the church and many others are able to reach people and do real good. The idea of somehow permanently “turning Facebook off” at this point is nonsensical anyway. But even if it were possible, it would be a mistake.
2) Facebook’s current business model is doing harm to people and society.
Having said that, the way Facebook does business is currently broken in a hundred different ways. It has been a constant source of misinformation for years now, and the company has repeatedly dragged its feet on stopping the flow. One in three American and British girls who use Instagram ends up feeling worse about her own body. And its business is based around the sale of people’s information, something the company can argue is a small cost for free use of its services and the vast communities they allow you to access. But given the pervasiveness of its products, many people really don’t have the option not to join. Which makes what Facebook is doing akin to the electric company selling businesses information on what you own, which rooms you’re using, even where you are at every moment of the day.
Of more immediate concern are the ongoing revelations from Frances Haugen about the ways in which Facebook consciously capitalizes on its users’ resentments and anxieties to increase engagement and profits.
“No one at Facebook is malevolent,” Ms. Haugen explained on “60 Minutes” on Sunday. “But the incentives are misaligned. Facebook makes more money when you consume more content.... The more anger they get exposed to, the more they interact, the more they consume.”
Whether the underlying intentions of the Facebook management team are in any way malevolent seems a very open question. But the documents Ms. Haugen obtained from Facebook before leaving paint a picture of the consequences of the system currently in place. The Facebook algorithm tends to boost things that will make us angry or afraid because we are more likely to react to them. Media outlets, businesses and political parties consequently find themselves having to become more sensationalistic, divisive or extreme not only in their posts but in their positions, so as to attract online attention. “We know if we don’t take those positions,” Ms. Haugen said, summarizing a 2019 complaint to Facebook from European political parties, “we won’t win in the marketplace of social media.”
The Facebook algorithm’s promotion of toxic or false content has been linked to ethnic cleansing in Myanmar, violence and instability in Libya and murders in India. While Facebook insists its platform cannot be blamed for the January 6 assault on the Capitol, it was a main hub through which people who participated met and planned. And, Ms. Haugen notes in her “60 Minutes” interview, the group that Facebook put into place to prevent misinformation during the 2020 election cycle was closed down (Facebook says its members were redistributed) as soon as the election was over. “Fast forward a couple months and we got the insurrection.”
In the case of Instagram, Ms. Haugen notes that “Facebook’s own research says that it’s not just that Instagram is dangerous for teenagers, that it harms teenagers. It’s that it is distinctly worse than other forms of social media.” Again, she explains, that is the Facebook engagement algorithm at work—anxiety and depression actually draw people to stay on their platforms in order to feel better. “And so they end up in this feedback cycle where they hate their bodies more and more.”
Facebook platforms are harming children, instilling anxiety and division, even destabilizing whole societies. Truly, what more evidence do we need that some kind of government intervention is required?
3) Facebook’s leadership refuses to change.
None of the problems that Ms. Haugen and others are speaking about constitute truly new information. The cycle of depression that social media can create in teenagers, the algorithm’s accentuation of outrage, the civil unrest Facebook has helped create—they’re all things we’ve been talking about for years.
What is new is the insight Ms. Haugen provides into how Facebook has handled these questions internally. Many of the documents that Ms. Haugen has revealed include internal research that demonstrates the company’s clear knowledge of the problems generated by their platforms and algorithm. “We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world,” said one internal report read on “60 Minutes.”
The Wall Street Journal’s series on the leaked documents likewise reveals that Facebook management has exempted millions of its higher-profile users from strict enforcement of its rules on misinformation and inappropriate content, a decision that has allowed content violating Facebook’s rules to be viewed more than 16 billion times before being taken down. The Journal also reports that staffers warned chief executive officer Mark Zuckerberg that the platform’s algorithm was making users angrier, but that he resisted change because he feared it would diminish overall engagement; and that content moderators have at times warned their superiors about people using the company’s platforms to incite violence, sell organs or commit human trafficking, with the company’s response at times being “nothing at all.”
All of which suggests a culture that is at best incapable of real change and at worst just unwilling. “Facebook has demonstrated they cannot act independently,” Ms. Haugen told “60 Minutes.” “Facebook over and over again has shown that it chooses profit over safety.”
We can argue that Facebook is a business, not a public service. We can point out that the task the company has set for itself, offering media platforms and communication services for literally billions of people in over a hundred countries, is just plain hard.
But at the end of the day, the evidence to support Ms. Haugen’s conclusion seems overwhelming. (It should: she has tens of thousands of pages of internal documents.) Despite its own research highlighting the issues it needs to address, the company only seems to do the right thing after it’s been publicly shamed, and even then only partially or for a little while.
Maybe that is a function of its leadership or the massiveness of its business. Or maybe it is an unavoidable consequence of the conflict between its drive to make money and its mission to connect people. Can a company that absolutely needs you to keep interacting on its platforms ever really avoid the kinds of manipulations we’re seeing?
Whatever your answer, that Facebook needs some kind of government intervention seems clear: not only because it is doing too much damage, but because it is too important to the world.