
Our Asocial Media Domestication

12/11/19

Roberto Rivera

If “ideas have consequences and bad ideas have victims,” then few ideas are worse than the belief that technology is morally neutral.

Every human invention is rooted in ideas about what constitutes human flourishing and how that flourishing is best accomplished. These inventions only seem morally neutral because we take their underlying beliefs about human flourishing for granted.

This “taking for granted” is no big deal when it comes to, say, the plow, but it’s been disastrous when it comes to the internet and social media. The worldviews of the engineer-entrepreneurs who created social media are reflected in the algorithms that govern it. These algorithms, in turn, changed the way we talk and thus changed who we are, and not for the better.

The engineers and techno-utopians who created social media had an almost messianic belief in the power of information to transform the world. At the same time, they were agnostic about, if not hostile towards, the idea of evaluating information to determine whether it’s true or false, important or trivial, or even beneficial or harmful.

Chuck Colson was fond of quoting, albeit not approvingly, William James’ summary of what truth means from “Pragmatism”:  “According to the pragmatic criterion of truth, true ideas have practical value, false ideas do not. Truth has ‘cash-value’ in experiential terms. If we know the truth, we can cash it in, or make use of it.” Or, put more pithily, “Truth is the cash-value of an idea.”

The architects of Web 2.0 operated from a similar assumption. In his new book, “Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation,” Andrew Marantz of the New Yorker writes that after the fall of traditional gatekeepers, those institutions that had traditionally controlled the flow of information, “many decisions about the spread of information were now made algorithmically.”

These algorithms have “no way of prioritizing what really matters.” Even worse, they “were not designed to gauge whether an idea was true or false, prosocial or antisocial.” All they are designed to do is “measure whether a meme [is] causing a spike of activating emotion in a large number of people.”

That’s because, for the engineer-entrepreneurs who created the algorithms, competitive success was entirely a matter of provoking “activating emotions,” which prompts an obvious question: What is an “activating emotion?”

An “activating emotion” is one that prompts a response, “engagement” in Web 2.0-speak. This “engagement” can be a “like,” sharing the meme with others, or commenting on it. This “activation” is a function of arousal, and this arousal is why social media is so toxic.

There is such a thing as positive emotional arousal. It’s why uplifting stories can go viral and prompt people to be generous. Less uplifting but still positively arousing are videos like “I put some Bee Gees music over North Korean marching,” which has been viewed more than 10 million times on YouTube and never fails to make me smile.

Negative emotional arousal, which activates emotions such as disgust, fear, outrage, and bigotry, is easier to manufacture. If it weren’t, then there would have been no need for the Incarnation, Good Friday, Easter, and the continuing sanctifying work of the Holy Spirit.

And since the algorithms are indifferent to veracity or mendacity, and reward the engagement produced by negative arousal just as much as that produced by positive arousal, why appeal to the better angels of human nature?

Incredibly, this possibility never occurred to the engineer-entrepreneur architects of Web 2.0. They believed that the “marketplace of ideas” they were constructing would arrive at the truth via some magical version of “the wisdom of crowds.”

People like Mike Cernovich, who features prominently in Marantz’s book, knew better. They understood that the system, i.e., the algorithms, could be “gamed” and that, to paraphrase what Winston Churchill never actually said, a lie could circle the earth many times before the truth even thought to put on its pants.

No one who has spent any time on social media, whether it’s Twitter, Reddit, or Facebook, can deny that Cernovich and company understood Web 2.0 better than its creators.

As for the rest of us, as I said in Part I, we’ve come to believe that social media is where our politics, and much of the rest of life as well, must be conducted, and we conduct these according to its rules: an emphasis on negative emotional arousal, i.e., fear, outrage, and disgust, and, if not outright mendacity, a less-than-scrupulous concern for the truth and simple fairness.

And Christians are not an exception. A few weeks ago, a report about Planned Parenthood’s lawsuit against David Daleiden and others said that the judge had directed a “guilty verdict” against the defendants.

There’s no such thing as a directed guilty verdict in American law. A judge can direct a verdict of acquittal, but not a guilty verdict, and in any event the distinction was irrelevant in this instance because the case was a civil, not criminal, suit. The story demonstrated an almost complete lack of knowledge of the difference between criminal and civil justice.

No one caught this, and apparently no one cared since the story was linked to by many reputable Christian sites, including some that should have known better. Why let the basic facts of American law get in the way of outrage and clicks?

How did this happen? Algorithm-driven negative emotional arousal. Outrage may not be a strategy, as we say at the Colson Center, but our domestication by social media is so complete that it doesn’t matter. As Marantz says, social media has changed the way we talk and, in the process, who we are.

And not for the better.
