The Age of Unreason

The Reptilian’s cloaking field breaks down and begins to phase shift, its inhuman visage briefly visible through a haze of holographic error. Slowed down and set to music, it is an eerie, emotive, and strangely beautiful sight. Our alien slavemasters the Anunnaki are getting sloppy, not even caring whether their true forms are visible to us any more. Wake up, sheeple, wake up and see what is before your eyes!

Or at least, this is what some followers of David Icke and other reptilian “researchers” seem to think. According to this video, which at the time of writing has over 155,000 views, some of his disciples are so seduced by this strange worldview that they see trans-dimensional shapeshifters where others see video glitches or interference errors. A new face for an ancient malevolence, hitherto visualised in dragon statues or crummy drawings of lizard-men. YouTuber MKirkbll comments: “Finally! A legitimate shapeshifting video! I so badly wanted to believe. Now I can. Thank you.” Like an X-Files-era cliché, MKirkbll “wants to believe”. And he is so desperate to believe in something that he is willing to believe in anything, as long as it fits together to tell an understandable story and gives him a sense of belonging.

It is easy to look at such nonsense and laugh, but the existence of such beliefs tells us something much deeper about human psychology and our need to make sense of the world. Since the earliest times humans have together woven complex and colourful mythologies to explain the world around them, and today is no different. During our evolution, our brains’ storytelling ability acted as a form of data compression, keeping track of the information it deemed useful by tying sensory prompts to emotional and behavioural responses. One consequence of using language and stories to keep track of environmental information was the gradual development of a narrative Self. Through studying psychology, we also know how identity construction within a social environment leads to emergent group behaviours, which in turn tell us how group narratives are formed.

Some of those lessons are particularly relevant to the online realm, where a breezy brand of digital utopianism has fostered a belief that the free flow of information will bring an end to ignorance and the triumph of reason. Instead, we see bizarre new ideologies and ideas spreading virally across the web, ushering in not a New Enlightenment, but an Age of Unreason.

Emergent Hierarchies

Group psychology has been extensively studied over the last half century, with theories supported by strong experimental evidence and predictive ability. Leon Festinger’s famous 1956 study of a flying saucer cult documented the moments in which the group’s ideology evolved in light of a failed doomsday prophecy. Cult leader Marian Keech had told her followers the world would end at midnight while they, the chosen few, would be whisked away to safety in the comfort of a spacecraft. However, as armageddon failed to materialise and the minutes ticked awkwardly by, the cult members began to wonder what was going to happen next. Eventually Keech concocted an absurd excuse to explain why the world had not ended: their prayers had averted the apocalypse!

The study, a precursor to his theory of Cognitive Dissonance, is famous for predicting which members of the group would drift away and which would rationalise away the failure, turning it into something that strengthened rather than weakened their beliefs. But it is also interesting that, as Festinger reported, the rest of the cult essentially brainstormed stories to explain away the failure prior to Keech’s revelation. What this tells us is that even if Keech had failed to concoct a “plausible” reason for the botched prophecy, her dedicated followers would have dreamt one up between them. It would not have led to a spontaneous outbreak of reason.

The Asch conformity experiments were another classic study of the psychology of group behaviour. The simple experiment involved small groups of people who were shown a line on a board and asked to match it with one of three lines of differing lengths on a separate board. In normal circumstances, less than 1% of participants got the task wrong; when the experiment was rigged so that confederates chose the wrong line, the real subjects answered wrongly 36% of the time, and overall 70% of participants went along with the apparent emerging consensus at least once. This shows how susceptible people are to going along with the group, even when, as interviews later revealed, many knew that the answer they were giving was wrong. In an overview of the Asch experiments, Cass Sunstein noted that “significantly, the existence of at least one compatriot, or voice of sanity, mattered a great deal” to the percentage of the group that went along with the consensus. “When just one other person made an accurate match, errors were reduced by three quarters, even if there was a strong majority the other way”. Skepticism works.

From both of these experiments we know that humans are willing both to tie up personal identity with group identity at the expense of objective reality, and to forgo their own better judgement for the sake of group conformity. But it is Group Polarisation Theory that really gives us an insight into precisely how social interaction generates new narratives and mythologies at odds with objective facts, and how this strained relationship with empirical reality leads to a drift towards extremism. Summarising the phenomenon, Cass Sunstein wrote that “group polarisation arises when members of a deliberating group move towards a more extreme point in whatever direction indicated by the members’ predeliberation tendency” (emphasis his). He continues:

“Two principal mechanisms underlie group polarisation. The first points to social influences on behaviour; the second emphasises limited “argument pools” and the direction in which those limited pools lead group members.”

The mechanisms work broadly like this. When certain individuals look to position themselves as dominant figures within a group, they project exaggerated opinions that are likely to increase their notoriety and reputation. As they become more certain of their opinions, they become bolder in their assertions. When there are struggles for dominance the stakes are continually raised, and less dominant members either shift their opinions into line with the dominant members or remain silent to maintain their in-group reputation. Because such reinforcement leads to increased certainty, group polarisation has the further characteristic of making members of the group more likely to engage in risky behaviour: risk-taking individuals will gamble on taking more extreme versions of positions in order to be viewed as thought-leaders, and so on.
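To make the dynamic concrete, here is a minimal toy simulation – entirely my own illustrative sketch, not anything from Sunstein’s paper, with all the parameters (pool size, exaggeration factor, update rate) chosen arbitrarily. Agents hold opinions on a scale from -1 to +1; each round they hear only the loudest, most extreme voices (a limited “argument pool”) and shift part-way towards a point slightly more exaggerated than that pool’s average:

```python
import random

random.seed(1)
# Twenty agents with only a mild initial lean in one direction.
opinions = [random.uniform(0.0, 0.4) for _ in range(20)]

for round_no in range(1, 11):
    # The "argument pool": only the five most extreme voices get heard.
    pool = sorted(opinions, key=abs, reverse=True)[:5]
    # Dominant members overstate the pool's position to gain reputation.
    target = sum(pool) / len(pool) * 1.2
    target = max(-1.0, min(1.0, target))  # opinions are bounded
    # Everyone shifts half-way towards the exaggerated target.
    opinions = [o + 0.5 * (target - o) for o in opinions]
    mean = sum(opinions) / len(opinions)
    print(f"round {round_no:2d}: mean opinion = {mean:+.2f}")
```

Run it and the mean opinion ratchets steadily towards the +1.0 extreme over the ten rounds, even though no agent started anywhere near it. The limited pool and the reputational incentive to exaggerate do all the work.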

An important facet of the process is the limited “argument pools” a group draws from. Interestingly, one might think that in an increasingly connected world the internet would make people more likely to come across contradictory arguments, stunting the process of polarisation. Internet guru Clay Shirky argues as much, citing data showing that those with extreme views are more likely to be exposed to information from their “other”: visitors of moveon.org, for instance, are more likely to visit foxnews.com. But Shirky should spend more time in the trenches if he thinks that visitors of moveon.org are ever going to be convinced by articles on Fox, and vice versa. Far more likely, they are indulging confirmation bias: trawling for reasons to confirm their pre-existing beliefs and report back to the group. This is a cumulative process; by gaining new adherents, ideologies can crowdstorm away their ideological holes, making the narrative seem more and more tangible. More and more “real”.

It is true that many previously closed groups are opening up online and reaping the benefits of “the crowd”. As technologist David Weinberger writes, networks are smarter than individuals because their combined expertise trumps the expertise of any one member, or in his slightly more pithy terms, “the smartest person in the room is the room itself”. This sounds like the usual sunny digital utopianism, but Weinberger qualifies it, saying it is far from a rule: “Some networks are dumber – and more insistent about their dumb views – than their smartest members are.” He adds that on the internet:

“We can now see every idiotic idea put forward seriously and every serious idea treated idiotically [...] it is hard to avoid at least some level of despair as the traditional authorities lose their grip and before new tools and types of authority have fully settled in. The internet may not be making me and you stupid, but it sure looks like it is making a whole bunch of other people stupid.”

A central problem, according to Weinberger, is that the internet has caused a cultural shift in what we count as a “fact”.

“The Internet’s abundant capacity has removed the old artificial constraints on publishing, including getting our content checked and verified. The new strategy of publishing everything we find out thus results in an immense cloud of data, free of theory, published before verified, and available to anyone with an internet connection. And this is changing the role facts have played in the foundation of knowledge [...] we have so many facts at such ready disposal that we lose the ability to nail conclusions down, because there are always other facts supporting other interpretations.”

The age of Big Data makes it easier for Polarised Groups to mine for “facts” that appear to support whatever conclusion they want. For those who know no better and just want something to believe in, it can be an intoxicating illusion. Such stories fill a certain vacuum. The abundance of free-flowing information and the collapse of the traditional gatekeepers have eroded authorities’ ability to control their respective narratives. Such narratives can either evolve to accommodate new information, as with the Catholic Church’s adoption of heliocentrism and evolution, or reject critical new information and succumb to polarising effects, as has happened to evangelical Christians in the United States with their spiral into creationism and fundamentalism. But even as old narratives adapt to the new information ecology, the general trend (in the West at least) is that religious and political ideologies are collapsing in on themselves, unable to control the flow of information. This has fed a hunger for contraband information that further erodes established authorities: a belief that somewhere in the secret archives of governments and corporations lies information that will make the rest of it make sense. The sentiment was articulated in a speech by Julian Assange in 2011:

“This generation is burning the mass media to the ground. [...] We are reclaiming our rights to world history. We are ripping open secret archives from Washington to Cairo. We don’t know yet exactly where we are. But we can see where we are going. The change in perspective that has happened over the last year is what this generation is going to use to find our lighthouse.”

And as these archives empty into the petabytes of other information freely available online, and from there filter through into our everyday lives, the world begins to resemble less a robust story and more a dadaist wasteland of contradiction and confusion. For many, not even the powers of cognitive dissonance can save their beliefs from crumbling. So we look for signals in the noise, new ways of constructing information structures from the ruins of the old. Amidst such chaos and uncertainty, Polarised Groups offer just that: something to believe in. A certainty that, as we saw in Asch’s experiments, cannot help but draw in new acolytes even as it flies in the face of empirical reality.

But as you will recall, the Asch experiments also showed that the presence of skeptical voices has a powerful dampening effect on irrational group behaviours. So do facts. In his paper “The Law of Group Polarisation”, Sunstein cites “persuasive argument” via the injection of accurate empirical information as a key factor in the “depolarisation” of groups. The bad news is that the narratives constructed by Polarised Groups have inbuilt defence mechanisms to circumvent the unwelcome presence of facts and skeptical inquiry, geared towards adopting only favourable facts and making unwelcome ones taboo. Interestingly, the tactics used by many such groups are eerily familiar, and all seem to make abundant use of the word “troll”.

War of the Trolls

“Real” trolls – those out to provoke emotional responses from people through button-pushing and outright harassment – are undoubtedly a ubiquitous phenomenon online. On occasion, ideological opponents of Polarised Groups, such as atheists haranguing some clueless Christian fundamentalist on YouTube, do set out to “troll” and cause emotional harm, whether through indifference or malice. What we see in Polarised Groups, however, is the lumping of all ideological opponents in with “real” trolls, while at the same time aggressively advertising their own narratives in a manner that itself amounts to harassment.

When Polarised Groups look to aggressively promote their narrative, it comes with a set of morals that dictate what can and cannot be questioned. In doing so, they are essentially looking to enforce reputational boundaries: pushing a worldview in which holding certain opinions, or promoting facts that threaten the group, carries negative social consequences. Those consequences vary widely: being “dogpiled” on an internet forum controlled by the group for bringing up information or views from outside the limited “argument pool”; being banned from the forum outright; having group members downvote social media content or flag YouTube videos to have them removed; “googlebombing” to damage social reputations; “doxxing”, the revealing of one’s real-life identity; harassing and petitioning employers to jeopardise people’s income and reputation; and even calls to people’s houses and death threats.

Such an atmosphere has the effect of “silencing” without the need for outright censorship, leading to the loss of moderate voices and an acceleration of group polarisation both within the core and among the undecided periphery. When dialogue withers away and banning, blocking and bullying prevail, it creates the illusion that the narrative is cohesive, and plays on the audience’s need for certainty.

Often, such groups employ a rhetorical armoury of “cut and paste” argumentation and a specialised vocabulary to stop new information and empirical facts from infiltrating the argument pools. Young-earth creationists, for instance, have a set of off-the-shelf arguments for dismissing carbon dating that bear no relation to empirical reality. Conspiracy theorists often cry “we have the official documents” when what they have are cherry-picked quotes or historical curios dredged from the deep web; criticism of such supposed evidence is proof that you are a “shill”, “stooge” or, of course, a “troll”. Feminist groups that have suffered from group polarisation repeat statistics from gender studies that have not been verified beyond reasonable doubt, while framing those who question the validity of such statistics as “anti-women”, “hyperskeptical” or, again, a “troll”. Proponents of alternative medicine, when reacting to thorough debunkings, invoke conspiracies, accusing their opponents of being in the pay of “big pharma”, or of being “bullies” and, once again, “trolls”.

In all cases, the goal is to marginalise opposing views by making criticism emotionally and socially expensive, and in all cases it feeds the group further into polarisation via the self-censorship of moderates. When unanalysed assumptions thrive, it can lead to a situation known as an “informational cascade”, where, in the words of Sunstein, “large groups of people end up believing something – even if that something is false – simply because other people seem to believe it too”. One example is the 9/11 conspiracy theories that have moved into the mainstream over the last decade, popular no doubt because they come attached to an overarching mythology and alternate history to prop up unchallenged “facts”, along with aggressive adherents who look to paint critics as CIA stooges.
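The logic of an informational cascade is easy to see in a toy model. The sketch below is my own illustration of the standard textbook setup (usually credited to Bikhchandani, Hirshleifer and Welch), not anything from Sunstein’s paper, and the numbers are arbitrary. Each person receives a noisy private signal about the truth but chooses publicly, after seeing every choice made before them; once the public tally tips far enough one way, it becomes “rational” to ignore one’s own signal and follow the crowd:

```python
import random

random.seed(42)
TRUTH = True            # the actual state of the world
SIGNAL_ACCURACY = 0.6   # private signals are right only 60% of the time

def private_signal():
    """A noisy, mildly informative hint about the truth."""
    return TRUTH if random.random() < SIGNAL_ACCURACY else not TRUTH

choices = []
for person in range(15):
    signal = private_signal()
    # Public lead: prior choices saying True minus those saying False.
    lead = sum(1 if c else -1 for c in choices)
    if lead > 1:
        choice = True       # the crowd outweighs any single private signal
    elif lead < -1:
        choice = False
    else:
        choice = signal     # the private signal still decides
    choices.append(choice)

print("public choices:", "".join("T" if c else "F" for c in choices))
print("group settled on the truth?", choices[-1] == TRUTH)
```

Rerun it with different seeds and the group sometimes converges on the truth, sometimes on the falsehood: it takes only two unlucky early signals to start a cascade that every later, better-informed person “rationally” joins.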

By standing up to group pressure and analysing claims on the basis of evidence rather than unquestioned assumption, we can work towards a position better supported by empirical reality. The hostile political environment in the United States, for instance, has led to emotionally polarised narratives with highly questionable relationships to empirical facts, whether it is the Right’s position on climate change or the Left’s position on GMOs and nuclear power. Unfortunately, the goal of clear thinking is confounded by a further factor: groups who would frame the world for their own political benefit rather than for truth, taking advantage of the mental failings outlined above for their own ends.

Deductive reasoning and critical thinking are hard work, and in a sense require that we go to war with our own nervous systems, fighting their every quirk and bias and staying mindful of unsubstantiated beliefs creeping into our thinking. It is not easy, and few truly attain such clarity of thought. Indeed, we are at our worst when we mistake our biases for facts, to the point that we dismiss opposing opinions and create, almost without knowing it, the ideal environment for Group Polarisation to occur. In a nightmare scenario, one might become part of a hostile echo chamber and not even realise it, mistaking one’s limited argument pools for diversity of thought. To the point where one mistakes easily explainable phenomena for evidence of exotic belief systems, and sees shapeshifting aliens from Draco where others just see noise.
