Justine Limpitlaw | May 18, 2021
Composite by America based on illustrations by Carlos TX/Unsplash

I bet you don’t think of yourself as capable of being radicalized—as someone swayed by extremist views online. Think again.

You are—and the most popular social media platform in the world is engineering your radicalization, intentionally.

The recent ruling of the Facebook Oversight Board on the suspension of former President Donald J. Trump’s account has been widely publicized, but its strong criticism that Facebook’s own architecture contributed to the attack on the Capitol on Jan. 6 has received far less attention. That criticism is a much-overlooked aspect of the decision by the Oversight Board, the independent body set up and funded by Facebook to make determinations about the moderation of content on the company’s two social media platforms, Facebook and Instagram.

On Jan. 7, the day after the attack, former President Trump was suspended indefinitely from Facebook. In his post explaining the suspension of the then-president of the United States, Mark Zuckerberg, the founder and C.E.O. of Facebook, said that Mr. Trump had “used our platform to incite violent insurrection against a democratically elected government.” It appears that this was the post that proved to be the final straw:

These are the things and events that happen when a sacred landslide election victory is so unceremoniously & viciously stripped away from great patriots who have been badly & unfairly treated for so long. Go home with love & in peace. Remember this day forever!

On Jan. 21, Facebook referred its indefinite suspension of Mr. Trump’s account to the Oversight Board. On May 5, the board ruled that it would uphold the suspension, finding that Mr. Trump had “severely violated Facebook’s Community Standards.”

But the Oversight Board also criticized Facebook for imposing an “indefinite” suspension on the former president’s account. “It is not permissible for Facebook to keep a user off the platform for an undefined period,” the board ruled, “with no criteria for when and whether the account will be restored.” Facebook was given six months to examine its “arbitrary” penalty and decide on the appropriate punishment for the former president of the United States.

Buried on page 21 of the Oversight Board’s ruling is this:

In this case, the Board asked Facebook 46 questions, and Facebook declined to answer seven entirely, and two partially. The questions that Facebook did not answer included questions about how Facebook’s news feed and other features impacted the visibility of Mr. Trump’s content; whether Facebook has researched, or plans to research, those design decisions in relation to the events of January 6, 2021.

Still further down, on page 29 of the ruling:

The Board sought clarification from Facebook about the extent to which the platform’s design decisions, including algorithms, policies, procedures and technical features, amplified Mr. Trump’s posts after the election and whether Facebook had conducted any internal analysis of whether such design decisions may have contributed to the events of January 6. Facebook declined to answer these questions. This makes it difficult for the Board to assess whether less severe measures, taken earlier, may have been sufficient to protect the rights of others.

This is oversight dynamite.

Facebook is quite happy for its Oversight Board to opine on the application of its Community Standards to content and even on the need to bolster those Community Standards in particular areas. What Facebook is clearly very unhappy about is the Oversight Board poking its nose into the very thing that drives the profit-making machine that is Facebook—its design, its algorithms—the things it does to keep you doomscrolling through those posts in the dead of night.

I was one of over 9,000 people who made public submissions on the Trump case to the Oversight Board. You can find my contribution on page 8,550 of the Public Comments Appendix to the Oversight Board’s Ruling. (That page number is not a typo!)

“There is a significant and relevant issue that Facebook failed to refer to the Oversight Board,” I write in my public submission. “The Oversight Board should be asking questions, searching ones, about Facebook’s algorithms and what they do to global dialogue and debate.”

Facebook’s algorithms are designed to keep you on Facebook for as long as possible so as to expose you to as many advertisements as possible. The content algorithms are set up to achieve this in two main ways:

  • The attention grab algorithm: Outrage is the surest way to get your attention. By design, topics likely to outrage you find their way onto your Facebook timeline.
  • The echo chamber algorithm: Your own views (and prejudices and quirks) are fed back to you rather than challenged. You are soothed into believing your outrage is shared and that your ideas are mainstream.

The combined effect is that Facebook holds your attention by polarizing its community. The platform is designed to push you to the extremes of your inclinations: it feeds you a barrage of content that you find outrageous and feel an ardent need to respond to, while the echo chamber removes any doubt that you could be wrong, that there could be other quite reasonable ways of seeing a situation.
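
To make this design concrete, consider a minimal, purely illustrative sketch of an engagement-maximizing feed ranker, written in Python. Facebook’s actual ranking systems are proprietary and vastly more complex; every name, signal and weight below is a hypothetical stand-in, invented only to show the incentive at work: outrage and agreement are rewarded, and nothing in the objective penalizes polarization.

    # Illustrative sketch only: hypothetical signals and weights, not Facebook's code.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        predicted_outrage: float    # 0..1: how provocative a model predicts the post is
        agreement_with_user: float  # 0..1: how closely it matches the user's existing views

    def engagement_score(post: Post) -> float:
        # Outrage grabs attention; agreement (the echo chamber) keeps it comfortable.
        # Absent from the objective: accuracy, civility, viewpoint diversity.
        return 0.6 * post.predicted_outrage + 0.4 * post.agreement_with_user

    def rank_feed(posts: list[Post]) -> list[Post]:
        # The feed is simply sorted by predicted engagement, a proxy for time on platform.
        return sorted(posts, key=engagement_score, reverse=True)

    feed = rank_feed([
        Post("A calm, nuanced take you might disagree with", 0.1, 0.2),
        Post("An enraging story that flatters your side", 0.9, 0.8),
    ])
    print([p.text for p in feed])  # the enraging, flattering post ranks first

The point of the sketch is what is missing from it: when the only quantity being maximized is attention, polarization is not an accident but the direction of optimization.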

The quietening of the middle, of the many reasonable voices that disagree but do so respectfully, of the many ordinary people contributing to the marketplace of ideas rather than burning it down, is a terrible side effect of Facebook’s business model.

If you are a regular Facebook user, there is a good chance you are being radicalized. You are being pushed to the extremes of positions you care about and encouraged (and deliberately so) to disparage and despise (not just disagree with) views that are different.

One response is to delete your account. But the reach and convenience of Facebook make that a realistic option for only a vanishingly small number of users. The other, much harder, option is to train yourself to use the platform with morality, with skepticism and with a desire (not just a willingness) to hear voices outside your echo chamber.

The key is to develop your own manifesto about how to be on Facebook with integrity:

  • Morality: I will not troll, threaten, insult or disparage someone whose posts I don’t agree with.
  • Skepticism: I will check posts that spark outrage against other sources to guard against spreading disinformation.
  • Other voices: I will deliberately set out to find people with different views. Here’s the trick: not polar opposite views, which are too easy to discount because the chasm of disagreement is so wide, but views closer to the center of your own. Encourage yourself to change your mind, to be persuaded by argument and not by tribal political or social affiliation.

Now, Facebook is advocating for more governmental regulation. Why? Because that is easier to deal with than considered internal criticism and oversight. And no government regulator would ever ask it to change its business model to prevent the radicalization of you and me.

And so, preventing that radicalization is up to me, and you.
