Fuzzy Link between Extreme Views and Violent Actions
We live in a world where extreme views are the ‘norm’ on almost everything. Even the current American President, Donald Trump, regularly tweets extreme views on just about anything!
Extreme views, whether they be from the Far Right or the Far Left, are normalised by both ends of the political spectrum and given hyped-up expression via social media in our technologically-driven society.
The central and critical question in all of this for society is, what does this embedding, feeding, and pushing of extreme views do to the individual? Does a person with extreme views become more extreme in their actions? If so, does this make society a more dangerous place for all of us?
The answer, like much in our society, is more complex than the simple question suggests. This is where the ‘fuzziness’ of the link between ‘cognition’ (extreme view) and ‘behaviour’ (extreme action) comes into play.
The fact of the matter is that not everyone with an extreme view on a topic or social issue they ‘perceive’ as wrong will act on it. The majority of individuals with extreme and dogmatic beliefs about a particular ‘perceived’ grievance or injustice may passionately debate, or even argue with, others who hold a contrary view, but will stop short of taking any extreme action, such as becoming aggressive or violent towards others.
However, there are always a few who hold extreme views and will go on to act on those dangerous beliefs in extremely violent ways. Usually, some triggering incident or event (e.g., loss of a job, discrimination, racism, humiliation, injury and so forth) pushes a person who nurtures extreme views to the point of becoming ‘behaviourally’ motivated to lash out violently at others they ‘perceive’ as responsible for their grievances.
The complicated bit is that a person who holds an extreme view is ‘cognitively more susceptible’ to becoming behaviourally motivated than one with a more moderate or mainstream view. Hence, while the ‘link’ between holding an ‘extreme view’ and taking ‘violent action’ is not a direct one, it is, nonetheless, an indirect or fuzzy one.
The sad reality we live with daily in this algorithm-driven world is that there are many opportunities for ‘susceptible’ individuals (i.e., people at risk of extreme views) to be infected, and hence left ‘open’ to being cognitively primed and radicalised further if the ‘right’, or more precisely ‘wrong’, set of circumstances (i.e., a triggering incident or event) occurs in their world.
For instance, the social media mega-giant ‘YouTube’ uses ‘personalised’ playlists that not only amplify extreme views but also actively recommend biased and misleading content to users.
A recent story in the Wall Street Journal (Jack Nicas, San Francisco – reprinted in The Australian, Friday, February 9, 2018, p. 3), pointed out that, “YouTube is the new television, with more than 1.5 billion users …. People cumulatively watch more than a billion YouTube hours daily worldwide, a 10-fold increase from 2012 … Behind this growth is an algorithm that creates personalised playlists. YouTube says these recommendations drive more than 70 percent of its viewing time, making the algorithm among the single biggest deciders of what people watch. Recommendations by YouTube often lead users to channels that feature conspiracy theories, partisan viewpoints and misleading videos, even when those users haven’t shown interest in such content. When users show a political bias in what they choose to view, YouTube typically recommends videos that echo those biases, often with more extreme viewpoints.”
YouTube’s personalised algorithm acts as an ‘echo chamber’ to reinforce your own opinions and beliefs. The more extreme your viewpoint the more extreme content will be served up to you. While this is worrying enough when you choose extreme content in the first instance, what is more alarming is YouTube continues to ‘push’ divisive, biased, false and misleading content even when you didn’t choose it in the first place.
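The ‘echo chamber’ dynamic described above can be illustrated with a toy simulation. The sketch below is purely a hypothetical model for illustration, not YouTube’s actual algorithm: content items carry an ‘extremity’ score from 0 (mainstream) to 1 (extreme), and the recommender is biased towards suggesting items at least as extreme as the most extreme item a user has already watched.

```python
import random

def recommend(history, catalog, bias=0.7):
    """Toy recommender (illustrative assumption, not a real platform's
    algorithm): with probability `bias`, suggest an item whose extremity
    score is at least the most extreme item already watched; otherwise
    pick any item at random."""
    peak = max(history) if history else 0.0
    if random.random() < bias:
        pool = [item for item in catalog if item >= peak] or catalog
    else:
        pool = catalog
    return random.choice(pool)

# Catalog of items with extremity scores from 0.0 (mainstream) to 1.0 (extreme)
catalog = [round(i / 10, 1) for i in range(11)]

history = [0.3]  # the user starts with one mildly partisan video
for _ in range(20):
    history.append(recommend(history, catalog))

print(history)  # viewing history tends to drift towards higher extremity
```

Because the ‘peak’ extremity in the history can never decrease, the biased pool ratchets upward over time — a crude analogue of the feedback loop whereby choosing one extreme video makes further extreme recommendations more likely.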
As the Wall Street Journal feature notes, “Unlike Facebook and Twitter sites, where users see content from accounts they choose to follow, YouTube takes an active role in pushing information to users they likely wouldn’t have otherwise seen.” To underscore this point, the Wall Street Journal ran an experiment, entering the same search term, ‘FBI memo’, into YouTube and Google. The ‘FBI memo’ refers to the Republican Party’s release of a memo on how the FBI had sought a warrant authorising covert surveillance of Carter Page, a former Donald Trump adviser. The search results were ‘strikingly divergent’, according to the Wall Street Journal. They found, “On YouTube, after small thumbnails from mainstream news sources, the top result came from BPEarthWatch, which describes itself as dedicated to watching the end time events that lead to the return of Jesus. There were also videos from Styhexenhammer666, whose informational page simply says, ‘I am God’, and from AlexJones, the founder of Infowars, a site that often promotes conspiracy theories. In contrast, a Google search led users to only mainstream news sources.”
Even taking into account this ‘echo bubble’ effect of the YouTube algorithm, in the way it automatically skews content towards extreme viewpoints, the trend of giving voice to extreme views is well entrenched in social media, and to an extent in mainstream media and political discourse. For example, the mainstream political commentator and Associate Editor of the Weekend Australian newspaper, Chris Kenny, makes the case that “Radical views from the far left are now everyday fare on social media” to such an extent that “One of the consequences of the creeping advance of political correctness that constrains debate in academia, bureaucracy, politics and the media is that the extreme left is normalised. In the polite society of the political/media class, overt condemnation is reserved for the hard right while even the most anarchic or obscene contributions from the green left are tolerated, apparently because their intentions might be pure.” (Kenny, C., ‘The far left imperils economy, alliances and world standing’, feature in The Weekend Australian, February 10-11, 2018, p. 24).
In the final analysis, the ‘fuzzy’ link means ‘extreme views’ are correlated with ‘extreme behaviours’ to varying degrees in any given individual. Therefore, the risk of a susceptible individual engaging in violent actions must always be considered an ‘open’ question, requiring a number of triage screening tools, such as CATCH (Cognitive Assessment of Threat Capacity for Harm), to have any realistic hope of ‘finding the few amongst the many’ who will do harm to us in our society.