
July 30, 2024

We Are Losing Our Minds

Rusty Guinn·article

What a stupendously stupid month it has been.

As time marches on and we memory-hole July 2024, we will lie to ourselves and pretend that everything that happened was a fringe thing. We’re still in the moment, at least for now, so we can tell the truth. None of this was fringe. A lot of liberals convinced themselves very quickly and actively promoted the view that the attempted assassination of Donald Trump was staged. A lot of conservatives convinced themselves that Joe Biden was on his deathbed. Maybe he was already dead.

It is over a week later now, and MSNBC is still hosting breathless segments on whether it was a bullet or a shard of glass or shrapnel that sliced the ear of the former president. Forensically interesting, I suppose. Maybe it was shrapnel! But it’s only really relevant for continued discussion if you suspect or would rather like people to speculate that it might be evidence of a staged or exaggerated attack. And many suspected exactly that. More than a third of Democrats, which is a bit more than the popularity of Q-Anon theories among Republicans. If you happened to push back ever so lightly on the idea that there is any evidence of a staged assassination, the responses you would have received were where things really started to go off the rails. Don’t you think it’s strange that they won’t release more information about how Mr. Trump was injured? Wasn’t it convenient for him to have such a perfect opportunity for that triumphant fist pump? With everything that has happened, do you really think it is above his backers to stage a false flag to get the Civil War they clearly want? Did you see that picture of him golfing the next day (weeks before, but who’s counting?) without a bandage? Are you really aligning yourself with the people showing up at rallies with matching bandages? Are you really going to cede the moral high ground on political violence after January 6th and years of Trumpian “rid me of this turbulent priest” rhetoric? Are you really going to take their side?

Meanwhile, mainstream conservatives continue to do their best Columbo routines, zooming in on President Biden’s watch as if it were a new damned Zapruder film, all to conjure explanations for why the White House might be lying about how live his address to the nation really was. If you didn’t learn your lesson the first time and pushed back just as lightly on the notion that there’s any evidence of Biden being in hospice care or literally dead, you’d have been struck once again by non sequitur responses. Wait, you’re really saying that you don’t think the media has been propping up Biden and pretending he wasn’t in decline for years, only to change their tune when the lie stopped being believable? You’re saying you don’t think it’s weird at all that he posted his withdrawal from the race on Twitter with a weird underlined signature and disappeared for three days? You’re saying we haven’t been gaslit? That the media isn’t willing to cover for him again and again? That Democratic money that pulls the strings isn’t effectively staging a coup to replace the will of primary voters with the aims of moneyed globalist elites? Are you really going to take their side?

It is almost as if “Biden is on his deathbed!” and “It wasn’t even a bullet!” aren’t really statements that can be evaluated on their own. It is almost as if – in a matter of days! hours! – we transformed them into symbols, imbued with all manner of other beliefs, statements, ideas, relationships, and associations.

We can look at all of this critically, of course. We can say that believing that Biden is just old and probably sleeping off COVID doesn’t say anything about our views on media, political violence, politics, or anything else. We can say that thinking this was a normal, scary attempted assassination doesn’t say anything about our views on media, political violence, politics, or anything else. But if we do, we’ll quickly find ourselves on an island. But this is insane, isn’t it? Are simple statements of fact uttered in a public setting all now destined to be post-processed into complicated symbols we are hopelessly incapable of unraveling without the Rosetta Stone of a political tribe to translate them? Are we losing our minds?

Yes, every public statement is now an inscrutably complex symbol. And yes, we are losing our minds, figuratively and literally.

The reason for both is the same.

[Image: President Biden's watch]

Our Symbolic Selves

Symbol is one of those words that can mean a lot of different things. Most of the time, I think we use it to describe a thing which stands for another thing to somebody. To the patriot, the flag might stand for a country or the ideals it represents. To the protester, a raised fist might stand for solidarity and the power of joint action. To Dan Brown, literally anything on the planet might represent the sacred feminine. And yes, all of those are symbols. Still, the idea of a symbol I mean in the context of the bizarre cultural moment we are enduring is just a bit different. I mean a thing which stands for another thing to somebody, the meaning of which is derived from cultural reinforcement. It’s a minor distinction, but an important one. It helps to explain why you and I are so drawn to symbolic representations, why we can acquire them so quickly, and how different modes of communication change the way that they affect how and what we think.

The best book ever written about symbols in this sense isn’t really about symbols at all. It’s a book about the co-evolution of the human animal with human language. The Symbolic Species by Terrence Deacon is not an easy read, but it is extraordinary, with far-reaching implications. The most important of those implications is a convincing case not only for why humans developed their peculiar capacity to acquire and use language, symbols, and stories, but for how language, symbols, and stories themselves evolved to become easier for us to acquire, understand, and use1.

Deacon’s text is also especially useful in that it (correctly) frames language as a symbolic system. That’s a highfalutin way of saying that the symbols from which language is built – which include things like words, letters, and grammatical structures – cannot be understood in isolation and cannot be acquired from the bottom up through brute force. Their meanings are functions of a network of relationships with other symbols. It is this structure which permits Deacon to propose an alternative to the “universal grammar” so beloved of Chomsky and others. He suggests that humans are uniquely capable of acquiring language not because hard rules of recursion and syntax are literally genetically coded into our brains, but because the plastic childhood mind is uniquely capable of perceiving the structure of a network of relationships among symbols that would be unlearnable by brute force memorization of rules and conventions.

It is an especially compelling argument when we consider the evolutionary imperative for systems to evolve to become acquirable over much shorter horizons than those over which biological evolution takes place. This is why that pedantic little modification to the definition of symbol was necessary. A lot of things which represent other things don’t really need cultural reinforcement for us to understand them. It doesn’t take cultural reinforcement to know that a nude statue of a voluptuous woman – a figure which unsurprisingly accounts for almost all of humanity’s oldest discovered symbols – means “a woman” or “women.” It looks like what we say it means. Depending on our experience with human activities which result in babies, we probably don’t even need cultural reinforcement to know that such a statue means “fertility.” It looks like a thing which does a thing we say it means. These signs, as Charles Sanders Peirce called them, will survive us no matter how blithely we waltz into the indifferent buzzsaw of nature. On the other hand, the signs which require human society to mediate their meaning – symbols – are always a single generation away from dying forever. When a generation of children failed to acquire the structure or symbols of a human protolanguage, it died forever. When the last member of a tribe to remember a story died, the story died forever too. Again, given the massive advantages of structured language and stories, the evolutionary pressure on language and stories to evolve to become more easily acquired and more easily spread would have been remarkable.

That capacity to be acquired and spread would have been aided by similar pressures on human biological evolution, even if the adaptations of an animal emerge over geologic timescales in comparison to the rapid adaptations of language. One of the most powerful results of the co-evolution of your brain and human communication, for example, was a disproportionately influential prefrontal cortex relative to other apes. The prefrontal cortex is one of the most important brain structures responsible for executive function, working memory, and inhibitory control. In terms of symbolic systems like language and story, it is the part of our brain that is responsible for pushing the pause button on seeing things literally so that we might consider what else things might mean. Nothing about the human brain is really this simple, but you would not be far wrong to say that you owe much of your symbolic nature to your very special prefrontal cortex.


[Image: the prefrontal cortex]

And it is special. In the million and a half years or so it took to evolve from Homo erectus into the behaviorally modern Homo sapiens, an already highly encephalized brain grew and became more complex. The frontal lobe, including the prefrontal cortex, grew much more rapidly and changed its shape more than the rest of the brain2. On size alone, the prefrontal cortex of Homo sapiens is about 12.5% larger than would be expected based on the size of the human brain in comparison to other great apes3. But the neurons in the prefrontal cortex also exhibit some of the densest dendritic branching, among the highest dendritic spine densities, and some of the most disparate and extensive connections to other brain regions4. Axonal branching patterns – the neural connections sent from neurons in the prefrontal cortex to other areas – demonstrate similarly exaggerated density. What’s more, there is some evidence that these traits were among those which grew the most during human evolution5. In simpler terms, when your brain is at its most plastic, the neurons in the enlarged part of your brain that most influences your cognition to imagine the possible meanings of symbols work desperately to connect with as many other functional areas of the brain as possible.

Your brain evolved to be shaped and influenced by a functional region which imagines the possible meanings of things.

The systems of symbols, abstractions, words, and stories to which you are subjected every day evolved to be as easily acquired and spread by that brain as possible.

And yes, language does mutate in the direction of simpler acquisition. Irregular verbs regularize6. That’s why you say thrived instead of throve. Phonology simplifies7. That’s why you pronounce wine and whine the same way. Grammar simplifies8. That’s why me, him, her and whom are basically all that remains of our vestigial case system. Vocabulary borrows efficiently from other languages to fill semantic gaps9. That’s why you know what sushi, schadenfreude, karma, taboos, and entrepreneurs are. Words become more flexible in their uses10.  Word orders become more fixed and predictable11. Morphology becomes more cognitively efficient12. And much of this happens in diffuse ways – sometimes word by word – allowing the mutations which cause language to be more easily acquirable to be more easily acquirable still13.

The way we tell stories is similarly adaptive, often responding to the specific ways that our brain seeks out, interrogates, and incorporates those stories into our conscious thoughts. We know, for example, that humans exhibit a tendency toward and preference for more heavily abstracted and more deeply symbolic forms of communication when they are available to us. That is especially true when we are facing certain types of excessive cognitive load. In other words, when things get complicated, we instinctively resort to abstract and symbolic expressions like increasingly complicated stories to reduce the number of independent ideas we have to manage in our heads at one time14.  The more distant something is from us, whether in terms of time and space, its familiarity, our affection for it, or any other dimension which separates us, the more likely we are to think about it and communicate about it at higher levels of abstraction15. We are drawn to the use and development of metaphors, which are simply a special kind of story symbol, to achieve greater cognitive efficiency16.

The complex symbols we call “stories” integrate themselves into our consciousness more automatically and unconsciously when they make us feel more powerful emotions17. When they are useful to us18. When they are structured in predictable ways19. When they access more fundamental and shared features of human experience which require less mediation of meaning20. When they reinforce and are reinforced by cultural norms, taboos, and common knowledge21. When they are adaptable to different cultural environments and storytelling modalities22. When they engage directly with our autobiographical memory – our own stories about who we are and who we want to be23. When we are exposed to them often24, regardless of whether we initially ‘agreed’ with them or not25. When the depth of that exposure leads us to believe that other people believe the story26. When that exposure is from multiple directions and multiple modalities27. And for the avoidance of doubt, yes, stories integrate themselves into our consciousness more automatically and unconsciously when we think they might help us have sex28.

Each of these factors influencing story symbol acquisition might be thought of as the expression of neurobiological, cultural, and story-specific adaptations which make it easier for stories to spread. To become, to paraphrase Richard Dawkins, a virus leaping from one mind to the next, infecting yet another human consciousness. If we were to categorize these factors, we might first say that some reflect how stories would have adapted linguistic and morphological features which increase their likelihood of spreading. Think Joseph Campbell and his Hero’s Journey, the way that human cultures have converged on motifs, story arcs, and familiar narrative elements. We might also identify some factors which imply that story and culture have co-evolved to facilitate more ready adaptation of new stories to work within imperfectly overlapping symbolic systems – say, from a distinctive culture to a very different one. Finally and most importantly, we would identify factors which indicate that the modality of storytelling – how we tell them, how we hear them, from what direction, with what frequency, using what conventions, and by whom – shapes our tendency to involuntarily incorporate external voices into something that looks, feels, and seems like the internal voice of our own mind.

These changes in storytelling modalities come very infrequently – only a few times in the history of our species, really – and play a big role in why this month has been so very stupid.

[Image: Donald Trump at a teleprompter]

The Modality is the Message

Canadian Jesuit priest, literary scholar, and philosopher Walter Ong once wrote that the invention of writing as a technology – and it is a technology – “restructured human consciousness.” If you’ll forgive the expression, there is a literal way to think about this claim. Written language restructured human consciousness by creating external memory. In the same way that a computer’s operating system would necessarily be rewritten to accommodate the incorporation of permanent memory, our way of thinking cannot avoid adapting to a world in which stories and ideas are not dependent on our capacity to remember and reproduce them faithfully.

I know this seems obvious, but it bears repeating anyway: much of what we think of as human invention would have been all but impossible without written language! That is true for both the scientific and the artistic, I think, and on a couple of slightly different dimensions. The more obvious is that writing externalizes humanity’s memory in a way that permits scientists and artists to build on an existing base of creation and discovery that doesn’t go away or devolve into a perverted game of telephone when the creators die. Special relativity doesn’t happen without Maxwell’s work on electromagnetism, Michelson-Morley, Lorentz, or Riemann (ok, maybe more general than special relativity in this case). Half of modern cinema doesn’t happen without Hamlet. But writing also permits the individual human not to think about certain things. The ability to treat certain conclusions as given means that working memory and long-term memory that would have been employed in retaining a full understanding of fundamental ideas can instead be repurposed for generating new ideas, new insights, new perspectives, and new aesthetics.

Still, while the embedded external memory of writing is perhaps its more obvious feature, it is important to think about all the ways that a storytelling modality like writing might differ from the one that came before it. I think a lot of people get tripped up thinking only about the medium of communication. If your goal is to understand the influence of human communication on science, technology, and civilizations at scale – a perfectly laudable goal – that framework is suitable enough. Marshall McLuhan wasn’t wrong, he was just answering a different question. In that context, we can talk about the printing press, the telegraph, the radio, over-the-air television, and the internet and come to pretty good sweeping conclusions about their influence on human society. But if our goal is to understand the relationship between the individual human brain and how we communicate with one another, the medium is not nearly enough. What matters in this context is the adoption of a technology capable of transforming multiple dimensions of how the human brain interrogates, internalizes, and adopts new symbols into an existing symbolic system. I think there are ten such dimensions worthy of our attention.


[Image: a mind surrounded by the aspects of a storytelling age]

Beyond its native memory, I think there are four other features of written language which changed the nature of the stories we tell and how we interact with symbols. In the same way that writing creates the potential for external memory of science and cultural knowledge, it creates the potential for externalizing more complex symbols, too. For that reason, written communication tends toward greater abstraction than oral. Written language is more easily subjected to the imposition of meaning by powerful cultural institutions and individuals. Let’s call that suasion. It more readily facilitates the intersection and even convergence of disparate cultural interpretations, which we might call its syncretism. I also think that there is a dimension we might call intentionality which describes the tendency of the written word to be consciously used for its symbolic effect rather than to simply tell a descriptive story. That is, different storytelling modalities have different native tendencies to intend an effect in the mind of the person receiving them.

Beyond the differences in what stories we tell, there are also differences in how we tell them. For example, there are inherent differences in the frequency with which we can be exposed to stories in a written or oral modality. Encountering a person to talk to relies on physical and spatial realities that the written word does not. The audience for the typical written story – and to which we would direct its construction – is often vastly larger than that of an oral recounting. The medium through which written stories were communicated would evolve, too, emerging over time into various new forms, all of which would follow conventions, styles, and forms unlike those of orality. There are also differences in who or what is delivering the story and its symbols to us which affect how we interpret, internalize, and understand them29. In an oral tradition, we receive the communication almost universally from a human we can see and hear. In written storytelling the embodiment of the communication is removed from its human source30. Similarly, when we engage in oral storytelling, the embedded expectation is much more frequently that the conversation will be bi-directional. The directionality of written storytelling, by contrast, more often flows in one direction – from author to reader.

All of these differences could not help but to change the way we must communicate and tell stories, simply by existing in a society which has embraced writing as a dominant storytelling modality. That, in turn, could not help but transform the way that we think in order to tell ourselves the stories that will become the stories we tell to others. Most importantly, however, it could not help but to change how we receive and experience all the stories and symbols with which we are presented, and how they influence our own thinking.

But you and I, we don’t live in an oral OR a written storytelling modality. We haven’t since about 2008.

[Image: Joe Biden at the CNN debate]

How Social Networks Make Us Lose Our Minds

It is not as if the printing press wasn’t a big deal for the relationship between stories, symbols, and the mind. It certainly would have expanded the audience any writer might have had in mind, for example. Neither was television simply an extension of the written modality. It changed the embodiment, medium, suasion, frequency, and audience of a wide range of stories. If you wanted to think of it as a sort of transitional modality between written language and where we are today, that’s fine. But the reality of television was that for most members of society, it was only ever a medium for receiving stories and symbols. Few of us have ever had to think about how to restructure our arguments, narrative devices, symbolic references, and the structure of our stories to adapt to the medium of television. Social networks, on the other hand, are utterly transformational. They leave no one untouched.

In short, social networks represent the single largest change in how humanity produces, accesses, and comprehends symbolic communication since the invention of protocuneiform in Sumer over 5,000 years ago.

However much you think social networks have transformed human communication, you are probably understating it. Consider those ten aspects I’ve used to generalize the properties of a storytelling modality. Then consider how they affect the way that you think about stories, structure how you will tell them, and prepare to consume them.

If writing is capable of externalizing stories into a form of memory, social networks do so in an almost comically exaggerated fashion. The internet is forever. It is searchable. It is indexed. With a few logistical exceptions, it is practically infinitely accessible to everyone on the planet. We are not all so influential that such memory will be directed often toward the stories we tell, but a simple glance at modern journalism’s practice of plucking random tweets should give us pause on that assumption. Social networking creates an environment in which we construct stories while taking into account their potential permanence.

Social networks also naturally create a deep and rapidly accelerating metanarrative vastly outside the scope of anything that came before. As anyone familiar with internet meme culture will attest, stories and symbols often self-reference that they are stories and symbols. When combined with their astoundingly accessible memory and the huge volume of communication taking place on them, social networks naturally undergo a sort of abstraction arms race. It is a process through which symbols become increasingly complex and enmeshed with layers of other symbols. It is practically impossible to engage in any kind of communication on social media without considering a shifting network of symbolic relationships which evolve at a frenetic pace.

The networked modality also natively reduces the distance between the cultures responsible for mediating the meaning of symbols while simultaneously facilitating the emergence of discrete sub-cultures. What I mean is that social networks create a world in which vanishingly few symbols are truly foreign, and yet one in which those who prefer to promote or accept a particular set of symbolic relationships or interpretations have the tools at their disposal to create resilient, self-reinforcing communities outside the historical bounds of national cultures. Social networks utterly transform the syncretism of storytelling.

I think that social networking’s feedback mechanisms – how we know immediately how people liked and responded to a particular framing – have the natural feature of transforming the intentionality of storytelling, too. What I mean is that it is very hard to resist making our communication more explicitly about how we believe we are affecting the minds of others, rather than achieving that as a side effect of simply transmitting information. These mechanisms make storytelling in this modality more inherently about scoring points, generating in-group affirmation, and producing common knowledge of a “side” having “won” a discussion than in any that came before. The message of a story in social networking is often less about the informational content of the story and more about the signal that the selection of symbols might send.

As large as these effects are, I think it is the epimorphic qualities of social networks – how the environment changes HOW we tell stories rather than which stories we tell – which may explain more about our present moment. Probably the most obvious distinction is the frequency with which we are bombarded with stories in a social networking context, to say nothing of the frequency with which we feel compelled by intellect or dopamine machinery to stay “part of the conversation.” Much of this, of course, has to do with a novel medium which is our constant companion, resting in our pockets, in our hands at dinner, on our table during meetings, constantly letting us know when it is time to subject ourselves to a new story, a new narrative, a new framing of something happening in the world.

Unlike in any communication tool ever available to humanity, the audience to whom we might convey symbols and stories through social networking is practically limitless. The stories we consume, in turn, are more often those which were intended not to convince us as much as they were to speak to such an expansive audience. At the same time, much of socially networked storytelling takes place through video and identity-rich avatars who communicate and interact with us directly. Unlike the impersonal and indirect venue of written communication, social networking much more often gives us the impression of the embodiment of story in an authentic human interaction – which changes a great deal about how we internalize the stories we receive. And yet in the networked modality, we experience these stories we are more likely to perceive as authentic and human in front of a live studio audience of our peers. We construct stories not just for the purported audience but for the audience overhearing us. We receive stories intended for us and intended for us to overhear. The direction of storytelling in the networked modality is omnidirectional. We are at all times speakers, receivers, and overhearers of story and symbol.

So where does this leave us? Your brain evolved to acquire complex symbols like stories as quickly as possible. Human language evolved to make it easier for you to acquire it as quickly as possible. The structure and narrative elements of story evolved to make it easier to acquire as quickly as possible. Your brain internalizes such symbols involuntarily in proportion to mere exposure, their emotional affect, their connection to the story you tell about your own identity, and what you believe that everyone else believes, all of which may take place completely independently of whether you even believe what you are hearing is true. Social networks rapidly created a world in which this brain is systematically exposed to precisely what it is predisposed on all of these dimensions to internalize: an unrelenting, omnidirectional, authentic-feeling, eternal, infinitely accessible barrage of rich symbols.

When I say that social networks quite literally cause us to lose our minds, this is what I mean. Simply living in our world means that the abstractions inherent in complex symbolic relationships will replace what once would have been the result of a pattern of consideration and private cognition. Why expend the energy to think at all when we can ingest conclusions mindlessly through a straw, fully chewed and already half-digested?

But it isn’t just a matter of generally outsourcing our consciousness to a hive mind. Understanding why this month was so specifically stupid requires a clear-eyed view of how social networks create a strange alliance between the transformed abstraction and suasion embedded in human symbolic communication.


[Image: Global Press tweet]

The Systematization of Nudge

The nature of social networks, as I have argued, is to create a steady increase in the native depth of abstraction of human stories. Stories now exist within a vastly expanded set of accessible symbols and a constantly evolving meta. That sense you have that people are reading way more into what you said than what you intended, or that they want to make it about something else entirely, is a reflection of what I mean here. There is a crossover point, I think, where the role of abstraction in helping manage cognitive load becomes a hindrance. In other words, eliminating detail from a thing in favor of a simplified symbolic expression frees working memory. Once we have eliminated so much detail from a thing that we can say that it is or means a multitude of different things, however, it can become a more taxing cognitive load trying to parse the nature of its implied symbolic relationships. At times, I think it may even represent a cognitive load outside the realistic capacity of most humans.

Imagine for a moment the compression of a computer file. For those of you old enough to remember listening to music on compact discs, the CD standard encodes audio in a stereo format at a 16-bit sample size and a 44.1 kHz sampling rate. Each second of standard CD audio requires approximately 176.4 kilobytes to store. By contrast, the most common LAME MP3 codec typically transforms that second of audio to a file size of somewhere between 16 and 24 kilobytes, a compression ratio of between 7:1 and 11:1. If we presume that the qualities of the song which made it identifiable and worth listening to were wholly preserved, then that compression ratio would be a good proxy for how much denser with information the MP3 file was.
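The arithmetic here can be checked in a few lines. This is a minimal sketch using only the figures from the text (16-bit stereo at 44.1 kHz, and typical MP3 bitrates of 128-192 kbps, which work out to 16-24 kilobytes per second):

```python
# Back-of-the-envelope check of the CD-vs-MP3 compression arithmetic.
SAMPLE_RATE_HZ = 44_100   # CD sampling rate
BYTES_PER_SAMPLE = 2      # 16-bit samples
CHANNELS = 2              # stereo

# Uncompressed CD audio: bytes required to store one second
cd_bytes_per_second = SAMPLE_RATE_HZ * BYTES_PER_SAMPLE * CHANNELS
print(cd_bytes_per_second)  # 176400 bytes, i.e. 176.4 kilobytes

# Typical MP3 bitrates of 128-192 kbps correspond to 16,000-24,000 bytes/sec
for mp3_bytes_per_second in (16_000, 24_000):
    ratio = cd_bytes_per_second / mp3_bytes_per_second
    print(f"{mp3_bytes_per_second // 1000} KB/s MP3 -> about {ratio:.1f}:1 compression")
```

The two ratios land at roughly 11:1 and 7.35:1, matching the 7:1 to 11:1 range cited above.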

The question of how much that process degraded the underlying audio is both a quantifiable and subjective one. Codecs – the algorithms which compress and decompress such files – are tools of abstraction. They reduce the fidelity of the audio signal. They reduce its dynamic range, introduce artifacts and distortion, tend to trim frequencies outside of normal human auditory ranges, and sometimes quantize timing in ways that create odd smearing effects31. For most non-audiophile listeners, the vast efficiency gains pay for the tradeoff many times over. For those who live for the moment in the third movement of Mahler’s Symphony No. 5 in which pizzicato strings and a single bassoon exchange a simple woody melody, the loss of tone, sustain, sonic range, and fidelity might be a bridge too far. But for any listener, there is a level of compression which would begin to reduce its utility. There’s a point at which you care less about the space you saved on the MP3 file and more about the fact that you can’t quite tell whether you’re listening to Slayer or Chappell Roan.

Perversely, however, the loss in real-world informational utility of a symbol or story doesn’t necessarily reduce its attractiveness to us in the way that it might with music. Remember, like anything else, language and story elements evolved to be acquirable and more easily spread from brain to brain. One of Deacon’s greatest insights is that symbolic systems grant greater resilience to symbols whose meaning is derived and mediated from a broader range of relationships. The more relationships a symbol has to other symbols, the more resilient and the more easily acquired it will be. Therefore, we may also think of that “crossover point” I mentioned as the point where our attraction to a symbol and the practical information it conveys begin moving in opposite directions.

But just as nature abhors a vacuum, symbol abhors unmediated meaning. In every storytelling age there will be those who are more than happy to provide this meaning. Inevitably these parties also have a vested interest in ensuring that certain symbols and stories are interpreted in a particular way. In a correct way. These individuals and cultural institutions also tend to have a vested interest in which stories ought to be told and which should not. This effect is not just exaggerated beyond that crossover point between the attractiveness of symbols and our ability to understand all of the meanings being attributed to them. It is transformed.

Indeed, the creation of institutions whose specific purpose was to exploit this transformation was almost simultaneous with the dawn of social networking. For example, in 2008, a professor of behavioral economics and a legal scholar co-authored a book called Nudge. Its premise was that by tailoring the way governments and other institutions told stories about their policies to reflect the way that humans would process the patterns of those stories, the institutions could ensure that people made the Right Decisions. Nobel Prize winner Richard Thaler and Cass Sunstein, the authors in question, called this idea ‘choice architecture.’ The basic idea of choice architecture is to transform each instance of mediation and interpretation of symbols with relation to decision making in the individual mind into a Hobson’s Choice – a perception of choice without the accompanying reality.

It is not a coincidence that Nudge was released right about the time that social networks exploded onto the scene. Nor was it coincidence that in November 2008 the Federal Reserve – the central bank of the United States – launched a massive expansion of communications explicitly designed to ‘influence the public’s expectations of the future.’32 Nor that just over a year later, the UK government launched its Behavioural Insights Team to ‘improve compliance with government policy.’ Nor that in subsequent months similar offices were opened in Singapore, France, Australia, Canada, and the United States. It is not coincidence that news outlets specializing in ‘explaining the news’ popped up everywhere during this period, from Huffington Post to Vox to Vice News. Neither is it a coincidence that ‘narrative journalism’ emerged out of nowhere in 2008, that the Literary Journalism Studies journal was established in 2009, nor that the number of scholarly journal articles written about ‘narrative journalism’ increased by more than 80% from the period spanning 1998-2008 to 2009-2017.33 It is not a coincidence that so-called fact checking organizations, which engage in enforcing correct interpretations of stories as often as they identify outright false information, sprang into existence almost as a twin to social networks. FactCheck.org was established in the six-month period between the births of MySpace and Facebook. PolitiFact, only a few months after the birth of Twitter. The Washington Post’s Fact Checker column, only a month after that.

All these developments emerged within five years of the seminal events of the social networking age: the rapid expansion of Twitter and the acquisition of Instagram by Facebook. None were built simply to leverage social networking as a tool. They were built to leverage social networking as a permanent and fundamental change in how humans would tell and interact with stories forever. They were built as an unprecedented opportunity to systematically exploit the storyseeking human brain beyond the crossover point, awash in story and symbols too biologically attractive to be resisted and too deeply abstracted to be interrogated consistently. They were built in response to the way that social networks not only permit storytellers to witness the interrogation of meaning taking place in real time, but to intervene.

Neither is it a coincidence, I think, that political polarization hit an inflection point and accelerated rapidly in the aftermath of more broadly available social networking applications. Such things are overdetermined. They have many causes which might explain much that has happened. And yet as we observe this polarization, we may also observe the simultaneous emergence of influencers and individuals whose social function is effectively to define what every symbol or story means. They stake claims on symbols, on meanings, on connections between symbols. Ever wonder why everything today is politically coded, from the beer you drink, to the movies you like, to the sports you watch, to the celebrities you admire, to the kind of church you go to, to the very words, language, and imagery you favor? This is why. The symbolic barrage of social networks creates a vacuum of meaning, and the narrative missionaries of political subcultures fill it by erecting Rosetta Stones which provide thoroughly mediated meanings and relationships of any symbols one might encounter. And when new public statements of any scale emerge, like “Biden is on his deathbed!” or “It wasn’t even a bullet!”, this machinery quickly and organically assembles the emergent network of symbolic relationships which it will be useful to associate with those statements.

The reality, of course, is that in this form of syncretism, the cultures which mediate meaning in this way are rarely coterminous with nations or ethnicities. Many exist on these polarized political dimensions. The result is the increasing feeling you probably have of not understanding where many of your countrymen are coming from at all. That feeling you have that you live in different worlds with different facts and different realities. You feel that way because, in the sense that we experience facts and realities through stories and symbols, you DO live in different worlds. This is the machinery of the Widening Gyre.

We are figuratively losing our minds because of the cognitive dissonance we encounter from our collective acknowledgement of the stupidity of months like this, and our collective willingness to keep making these same things happen over and over again anyway. We are literally losing our minds because social networks deliver us symbols and stories in ways that our brains have evolved to internalize, abstracted to such a degree that fully parsing their meanings is too weighty a cognitive load. We are compelled instead to rely on the Rosetta Stones of party and political identity, or else on the ‘helpful’ Nudging state or nudging oligarchy, simply to make sense of most stories that play out in the public sphere.

Want to know why a third of us or more said very stupid things about the attempted assassination of Trump or the ‘hospice care’ of Biden, then loaded them down with absurd symbolic connections to other values, principles, political positions, perspectives, and opinions? This is why.

Want to know how we stop? We don’t. Sorry. There is no top-down answer. You can stir this sort of thing together, but you can’t stir it apart. What you can do and what I can do relies on a collective, bottom-up commitment to treat one another as principals, as humans possessed of autonomy of mind. As consumers of stories, that means resisting our impulse to impose on the words of others the additional symbolic meanings we perceive. As producers of stories, it means embracing the very thing which is widely (and not coincidentally) considered the greatest sin in the age of social networking: earnestness.

We can say what we mean and mean what we say. We give others the benefit of the same. Then we prepare to be jeered mercilessly for it.


References

  1. Deacon, T. W. (1997). The symbolic species: The co-evolution of language and the brain. W.W. Norton. 
  2. Bruner, E., Manzi, G., & Arsuaga, J. L. (2003). Encephalization and allometric trajectories in the genus Homo: Evidence from the Neandertal and modern lineages. Proceedings of the National Academy of Sciences, 100(26), 15335-15340. 
  3. Semendeferi, K., Lu, A., Schenker, N., & Damasio, H. (2002). Humans and great apes share a large frontal cortex. Nature Neuroscience, 5(3), 272-276. 
  4. Elston, G. N., Benavides-Piccione, R., & DeFelipe, J. (2001). The pyramidal cell in cognition: A comparative study in human and monkey. Journal of Neuroscience, 21(17), RC163-RC163. 
  5. Jacobs, B., Schall, M., Prather, M., Kapler, E., Driscoll, L., Baca, S., … & Treml, M. (2001). Regional dendritic and spine variation in human cerebral cortex: a quantitative Golgi study. Cerebral Cortex, 11(6), 558-571. 
  6. Lieberman, E., Michel, J. B., Jackson, J., Tang, T., & Nowak, M. A. (2007). Quantifying the evolutionary dynamics of language. Nature, 449(7163), 713-716. 
  7. Hay, J., & Bauer, L. (2007). Phoneme inventory size and population size. Language, 83(2), 388-400. 
  8. McWhorter, J. H. (2011). Linguistic simplicity and complexity: Why do languages undress? De Gruyter Mouton. 
  9. Haspelmath, M., & Tadmor, U. (Eds.). (2009). Loanwords in the world’s languages: A comparative handbook. De Gruyter Mouton. 
  10. Traugott, E. C., & Dasher, R. B. (2001). Regularity in semantic change. Cambridge University Press. 
  11. Givón, T. (1979). On understanding grammar. Academic Press. 
  12. Blevins, J. P., & Blevins, J. (Eds.). (2009). Analogy in grammar: Form and acquisition. Oxford University Press. 
  13. Wang, W. S. Y. (1969). Competing changes as a cause of residue. Language, 45(1), 9-25.
  14. Roßnagel, C. (2000). Cognitive load and perspective‐taking: applying the automatic‐controlled distinction to verbal communication. European Journal of Social Psychology, 30(3), 429-445. 
  15. Trope, Y., & Liberman, N. (2010). Construal-level theory of psychological distance. Psychological Review, 117(2), 440. 
  16. Gibbs Jr, R. W. (2006). Metaphor interpretation as embodied simulation. Mind & Language, 21(3), 434-458. 
  17. Berger, J., & Milkman, K. L. (2012). What makes online content viral? Journal of Marketing Research, 49(2), 192-205. 
  18. Heath, C., & Heath, D. (2007). Made to stick: Why some ideas survive and others die. Random House. 
  19. Zak, P. J. (2015). Why inspiring stories make us react: The neuroscience of narrative. Cerebrum: The Dana Forum on Brain Science, 2015, 2. 
  20. Norenzayan, A., & Atran, S. (2004). Cognitive and emotional processes in the cultural transmission of natural and nonnatural beliefs. The psychological foundations of culture, 149-169. 
  21. Mesoudi, A., Whiten, A., & Dunbar, R. (2006). A bias for social information in human cultural transmission. British Journal of Psychology, 97(3), 405-423. 
  22. Shifman, L. (2014). Memes in digital culture. MIT Press. 
  23. Mar, R. A. (2011). The neural bases of social cognition and story comprehension. Annual Review of Psychology, 62, 103-134. 
  24. Zajonc, R. B. (2001). Mere exposure: A gateway to the subliminal. Current Directions in Psychological Science, 10(6), 224-228. 
  25. Fazio, L. K., Brashier, N. M., Payne, B. K., & Marsh, E. J. (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144(5), 993. 
  26. Cialdini, R. B., & Goldstein, N. J. (2004). Social influence: Compliance and conformity. Annual Review of Psychology, 55, 591-621. 
  27. Harkins, S. G., & Petty, R. E. (1987). Information utility and the multiple source effect. Journal of Personality and Social Psychology, 52(2), 260. 
  28. Berger, J. (2014). Word of mouth and interpersonal communication: A review and directions for future research. Journal of Consumer Psychology, 24(4), 586-607.  
  29. Burgoon, J. K., Guerrero, L. K., & Floyd, K. (2010). Nonverbal communication. Boston: Allyn & Bacon. 
  30. Mir, A. (2022). Digital future in the rearview mirror: Media archaeology as an agent of cultural reform. De Gruyter Oldenbourg. 
  31. Bosi, M., & Goldberg, R. E. (2002). Introduction to Digital Audio Coding and Standards. Springer. 
  32. Kliesen, K. L., Levine, B., & Waller, C. J. (2018). Gauging the evolution of monetary policy communication before and after the financial crisis. Economic Synopses, 27. 
  33. van Krieken, K., & Sanders, J. (2021). What is narrative journalism? A systematic review and an empirical agenda. Journalism, 22(6), 1393-1412. 

 
