Digital Enclaves Under the Logic of Accumulation
Jiayin Luo
In the past few years, conspiracy theories’ growing influence on mainstream political discourse has become far more noticeable. Pointing to recent examples such as the far-right conspiracy group QAnon and the anti-vaccination movement, Jeff Tollefson has commented that previous research was “turned upside down” when the US president brought fringe conspiracies into mainstream discourse (192). As the prevalence of conspiratorial thinking poses significant threats, such as adverse health and social outcomes for believers in conspiracies surrounding COVID-19 (Van Prooijen et al. 4), scholars have sought to describe the structures of the kind of deliberative process that a functioning democracy requires.
For Afsoun Afsahi, there exist different types of digital enclaves, namely the democratic counter-public and the anti-democratic counter-public, with the latter being responsible for the degradation of public discourse (Afsahi 2). Here, Afsahi uses the term counter-public to refer to a group of people who habitually dissent from the larger public narrative. Among anti-democratic counter-publics, a further distinction is made between an inward-looking (isolated) counter-public and an outward-looking (inquiring) counter-public, with only the latter actively engaging with the larger public sphere.
I will focus on Afsahi’s account of the two anti-democratic counter-publics to explore the underlying mechanism that leads to the creation of conspiracy theories, which have been defined as “attempts to explain the ultimate causes of significant social and political events and circumstances with claims of secret plots by two or more powerful actors” (Douglas et al. 4). I will argue that the popularization of conspiratorial thinking is a product of the anti-democratic counter-public’s structure. Moreover, I will show that digital platforms not only permit the creation of the anti-democratic digital enclave but also incentivize its creation through the power of their algorithms. To do this, I will use Shoshana Zuboff’s notion of the logic of accumulation in the scheme of surveillance capitalism, which can be understood as an entity’s need to constantly grow and expand in order to survive under capitalism.
One of the primary functions of digital enclaves — such as Reddit communities or Facebook groups — vis-à-vis counter-publics is to “sustain counterhegemonic discourse, challenging established systems of domination and legitimating and publicizing political claims by the powerless and marginalized” (Warf and Grimes 260). In a sense, digital enclaves serve as spaces of empowerment for marginalized social groups; examples of this include online forums for LGBTQ+ communities (Afsahi 12). Additionally, researchers have found a negative correlation between conspiratorial thinking and the perception of socio-political control (Bruder et al. 11), suggesting that conspiracy theories may serve the same psychological function of empowering individuals who perceive themselves as powerless and marginalized. Given this shared function, digital enclaves are, in another sense, ideal spaces for discourse surrounding conspiratorial thinking.
Unified by a shared sense of marginalization and powerlessness, members of anti-democratic digital enclaves become susceptible to psychologically empowering conspiratorial thinking, since it offers a sense of affirmation. A democratic enclave, on the other hand, is able to empower its members without them gravitating towards conspiratorial narratives, thanks to one of the criteria that defines a democratic counter-public: according to Afsahi, “the mechanism of democratic enclaves do not make it impossible for publicity, listening, or justification to take place” (Afsahi 13). In other words, the structure of a democratic enclave allows healthy epistemic practices to take place, and this factor prevents conspiracy theories from taking root in a democratic counter-public. The mechanism of anti-democratic enclaves, by contrast, reinforces their members’ belief in prescribed narratives. In the case of an inward-looking digital enclave, members are actively discouraged from engaging with the larger public sphere; the counter-public’s narrative frames itself as the “sole accurate view of the world” (Afsahi 14) and presents out-group members as complicit in the members’ marginalization.
The nature of anti-democratic counter-publics can make it difficult to interact with individuals who hold facts or ideas that contrast with entrenched narratives. This photograph depicts an in-person incident on March 27, 2021, in Denver, Colorado, where individuals denying the results of the 2020 US presidential election came face-to-face with counter-protesters trying to minimize the demonstration’s effect and redirect attention towards unresolved systemic issues such as police brutality and a corresponding lack of accountability. Photo by Colin Lloyd on Unsplash.
When structural mechanisms invalidate voices coming from outside the digital enclave, it effectively becomes an echo chamber, whose effect is “to amplify and corroborate a message, often uncritically, through as many channels as possible” (Carver 1056). This distrust of the world external to the anti-democratic digital enclave does not erode its members’ capacity for rational thinking; instead, it forces them to misplace their trust. Thi Nguyen suggests that “[m]embers of an echo chamber are not irrational but misinformed about where to place their trust” (Nguyen 23). Here, he draws a comparison between the members of an echo chamber and the members of a cult, in that both structures work to erode one’s trust in any resources outside their bounds. In this sense, an inward-looking anti-democratic digital enclave can be understood as an epistemic structure that guides one’s functioning rationality away from objective reality. This structural factor may explain one’s seemingly irrational acceptance of conspiratorial beliefs; however, it does not account for how such conspiracies are spread.
On the other hand, an outward-looking anti-democratic counter-public, such as the white supremacist groups highlighted in Afsahi’s example, “seeks to solidify and promote a particular worldview,” with its members adopting tactics such as “doublespeak and dog whistle” (Afsahi 18), which are language choices that purposefully disguise the true intention of words, whether through ambiguity or through coded meaning. These language choices allow members to avoid accountability when engaging with the larger public sphere in a duplicitous manner, which erodes trust in the larger public sphere, undermining “the possibility of publicity, listening, and justification” (Afsahi 17). These practices also further exacerbate a “crisis of legitimacy of authoritative institutions,” institutions that have served as an anchor for evidence claims and norms of reasoned debate (Bennett and Livingston 4). Thus, it becomes clear that the outward-looking anti-democratic digital enclave acts as an instrument for the propagation of a prescribed narrative and is unconcerned with quality epistemic practices or with facilitating or engaging in constructive public discourse. But what can we discern about the agency of the individuals in this type of digital enclave? Do they simply lack the cognitive faculties that allow one to make sound epistemic judgements?
To explain the behaviour of these actors, we can turn to Michael Huemer’s account of political irrationalities. Huemer frames one’s disposition towards conspiracies and other irrational political beliefs as an abandonment of epistemic rationality, which differs from instrumental rationality. While epistemic rationality is concerned with the evidence and logic of one’s argument, instrumental rationality is concerned with one’s prudence “in choosing the correct means to attain one’s actual goals” (Huemer 6). In the case of political irrationalities, one essentially favours instrumental rationality over epistemic rationality. This preference reflects a prioritization of self-interest over the quality of the epistemic practices in which one is engaged, whether consciously via deception or subconsciously via cognitive biases and logical fallacies.
I believe this is the case for the individuals in the outward-looking anti-democratic counter-public. The notion of self-interest here can be interpreted as financial or social gains, which may manifest as paychecks for hosting an alt-right radio show, campaign contributions for a career politician, or growing follower counts for a Twitter troll. Although the roles of radio show host and career politician predate digital platforms, the Twitter troll is only made possible by the existence of digital platforms and forums. This is particularly troublesome, as the sheer number of such actors, enabled by social media, makes it increasingly difficult to identify doublespeak and dog whistles and to hold the speakers accountable (Afsahi 21). When incentivized and enabled, the individuals of these outward-looking anti-democratic counter-publics will inevitably become deeply invested in agitating behaviours, such as trolling or dog whistling, in the larger public sphere. This engagement with the larger public sphere further spreads their conspiratorial beliefs and subsequently contaminates the public sphere with conspiratorial thinking. Moreover, the incentivizing mechanism displayed here is also enabled by another aspect of digital platforms: their algorithmic capacities.
When discussing digital enclaves, it is important to remember that they are constructed spaces that mediate discourse within the framework of digital platforms; the technologies that enable the existence of digital enclaves are an inseparable aspect of them. Further, as Shoshana Zuboff puts it, “technologies are constituted by unique affordances, but the development and expression of those affordances are shaped by the institutional logics in which technologies are designed, implemented and used” (Zuboff 85). Thus, digital enclaves will also express the same institutional logic that has shaped the technologies enabling them, with those technologies taking the form of data extraction and algorithms. On this point, Alexander Galloway provides an insightful addition: “there is no essential difference between data and algorithm” (Galloway 33). Algorithms are simply the means of interfacing with the extracted data; they sit on the boundary between raw data and their humanistic interpretations, forcing us to consider data and algorithm as an inseparable whole. For our purposes, I will use the term algorithms to refer broadly to the process of extracting, analyzing, and applying data.
These algorithms are developed with the explicit purpose of advertising; as Zuboff states, “Google’s business is the auction business, and its customers are advertisers” (Zuboff 79). Digital platforms are for-profit actors operating primarily in the market sphere, within which the logic of accumulation has become deeply entrenched. As a result, the algorithms developed and used by digital platforms also embody this logic of accumulation. According to Zuboff, the logic of accumulation commodifies reality itself: data about every facet of reality are collected and operationalized by digital platforms for the explicit purpose of capital accumulation. One of the principal sources of information about reality is “everydayness,” one’s day-to-day activities on digital platforms, which stem from one’s need for “self-expression, voice, influence, information, learning, empowerment, and connection” (Zuboff 79).
The activities of a digital enclave’s members constitute a representation of this everydayness, and data surrounding these activities are subsequently collected by digital platforms under the logic of accumulation. This collection of data on individual behaviours entails an informational asymmetry enabled by one-sided data extraction practices. In essence, digital platforms have gained the ability to conduct surveillance on their users. Operating not unlike the physical structure of a panopticon, surveillance capitalism replaces existing powers with its own structural power, subjugating the users of digital platforms to its authority (Zuboff 79). Furthermore, by utilizing informational technologies under the logic of accumulation, and aided by the asymmetrical power dynamic between the digital platform and its users, platforms are able to reinforce this informational asymmetry, covertly decode users’ behavioural patterns as they interact with digital enclaves, and carry out continuous experiments to modify user behaviours in ways that serve the overarching logic of accumulation (Zuboff 85).
Based on the logic of accumulation ingrained in the structures of surveillance capitalism, we can also reasonably deduce some of the specific objectives of digital platforms’ algorithms. Most notably, these platforms extract data from user engagement; hence, to accumulate as much data as possible, the algorithms will seek to maximize user engagement on their platforms. On a digital platform, such a maximizing principle may manifest itself in the favouring of a particular type of narrative or a particular kind of rhetorical practice. I will explore the possible effects of this maximizing principle by examining some of the features that define the different types of digital enclaves.
One feature shared by the two subtypes of anti-democratic counter-publics is that both seek to promote or reinforce the adoption of a singular narrative as their end objective. In contrast, a democratic counter-public does not seek to do the same; instead, it aims to promote healthy epistemic practices, eliminating the potential for an unexamined dominant narrative. The anti-democratic counter-public’s dogmatic adoption of a singular narrative can be exploited to serve the algorithm’s goal of maximizing user engagement. A singular narrative entails far less complexity in the algorithm’s recommendation of articles and resources appropriate to the group’s view; such content can attract higher levels of attention, as it will be relatable and agreeable to all who see it, allowing it to be amplified uncritically.
When it comes to user engagement maximization in relation to outward-looking anti-democratic counter-publics, one key feature stands out. As previously discussed, this subtype of anti-democratic counter-public engages with the larger public sphere via dog whistles and doublespeak, allowing its members to avoid accountability in their speech. Another key feature of such practices, however, is the direct user engagement they induce on digital platforms. Dog whistles and doublespeak allow individuals to disseminate highly controversial views in the public sphere, either to provoke others or to identify with members of the same enclave. While both practices can elicit user responses, the engagement is far more active when the doublespeak is identified by an out-group member. For example, when US House Representative Paul Gosar posted a tweet that referenced Holocaust denial, articles had to be written, first to highlight the doublespeak and then to debunk the underlying message; even then, Gosar was able to approach such allegations with some plausible deniability, sparking further controversy while occupying more of the public’s attention (Schwenk). This behaviour, which may also be called trolling, produces hostile online arguments that are not constructive and are often damaging to the wider public discourse. As Zuboff points out, digital platforms hold a “formal indifference” to a user’s actions or speech, insofar as it can be converted into data (Zuboff 79). For digital platforms, these rhetorical practices perfectly serve the user engagement maximizing principle.
One criterion yet to be mentioned is the use of a bonding narrative by a democratic digital enclave versus the use of a bridging narrative by an anti-democratic digital enclave. For Afsahi, a bonding narrative aims to generate solidarity, as in the case of the online LGBTQ+ groups (Afsahi 12), while a bridging narrative aims to foster understanding with those who hold different dispositions, with the latter being more difficult to utilize (Afsahi 6). By virtue of being more difficult to carry out, enclaves that utilize bridging narratives are placed at a disadvantage when competing with the internal bonding narratives of other enclaves. When competing in the larger public sphere, individuals using bonding narratives will remain entrenched in their enclave, while bridging narratives may see less success in attaining their prospective goals. Furthermore, as bonding narratives also function to elicit strong emotions from their users (Afsahi 6), and such emotions are sometimes reflected in more intensive use of digital platforms, bonding narratives allow the algorithm to further maximize user engagement.
By examining the algorithm’s relationship with the distinguishing features of digital enclaves under the assumptions of the logic of accumulation, we can see that the features of anti-democratic digital enclaves are structurally favoured by algorithms that hold maximizing user engagement as one of their operating principles. Moreover, due to their structure, anti-democratic counter-publics are particularly susceptible to being corrupted by conspiratorial thinking and misinformation. Thus, I put forward the following claim: the algorithms of all digital platforms render those platforms susceptible to conspiratorial thinking and misinformation, since they favour anti-democratic digital enclaves, which serve as a kind of septic tank for conspiracies and misinformation, over their democratic counterparts.
To tackle the epistemic challenges of conspiratorial thinking and misinformation, a few directions can be taken. First, one can examine the common function of counter-publics and conspiratorial thinking. While both serve to psychologically empower their subscribers, they do so in different ways. A democratic counter-public may seek to empower its members by equipping them with tools for constructive democratic discourse, while a conspiracy theory seeks to provide the perception of control via a prescribed narrative. This is a feature we must keep in mind when combating conspiratorial thinking and misinformation, as mere criticism, however constructive, may reinforce one’s belief that they are being disempowered by out-group members. Second, digital platforms can adjust their algorithms to avoid favouring anti-democratic digital enclaves; however, as we have established, such an adjustment runs contrary to the logic of accumulation deeply entrenched in these platforms. This brings us to my final point: to bring about meaningful systemic change, one may need to tackle the logic of accumulation directly by freeing digital platforms from their capitalistic aspirations.
This academic essay is licensed under Creative Commons License CC-BY-NC 4.0.
References
Afsahi, Afsoun. “Enclave Deliberations in the Public: A Typology of Enclaves in the Digital Age.” 2021, University of British Columbia.
Bennett, W. Lance, and Steven Livingston. “A Brief History of the Disinformation Age: Information Wars and the Decline of Institutional Authority.” The Disinformation Age: Politics, Technology, and Disruptive Communication in the United States, edited by W. Lance Bennett and Steven Livingston, Cambridge University Press, 2020, pp. 3–40. SSRC Anxieties of Democracy.
Bruder, M., P. Haffke, N. Neave, N. Nouripanah, and R. Imhoff. “Measuring Individual Differences in Generic Beliefs in Conspiracy Theories across Cultures: Conspiracy Mentality Questionnaire.” Frontiers in Psychology, vol. 4, 2013, https://doi.org/10.3389/fpsyg.2013.00225
Carver, Nico. “Media Echo Chamber.” The SAGE International Encyclopedia of Mass Media and Society. Edited by Debra L. Merskin, SAGE Publications, Inc., 2020, pp. 1056-57. https://dx.doi.org/10.4135/9781483375519.n406
Douglas, Karen M., Joseph E. Uscinski, Robbie M. Sutton, Aleksandra Cichocka, Turkay Nefes, Chee Siang Ang, and Farzin Deravi. “Understanding Conspiracy Theories.” Supplement: Advances in Political Psychology, vol. 40, no. S1, 2019, pp. 3–35.
Galloway, Alexander. The Interface Effect. Polity, 2012.
Huemer, Michael. “Why People Are Irrational About Politics.” 2016, https://rintintin.colorado.edu/~vancecd/phil3600/Huemer1.pdf
Nguyen, C. Thi. “Why It’s as Hard to Escape an Echo Chamber as It Is to Flee a Cult.” Aeon, 2018, https://aeon.co/essays/why-its-as-hard-to-escape-an-echo-chamber-as-it-is-to-flee-a-cult
Schwenk, Katya. “Once Again, Gosar Posts White Nationalist Dog Whistle, Holocaust Denial Reference.” Phoenix New Times, 2 May 2022, www.phoenixnewtimes.com/news/gosar-attracts-numbers-on-far-right-with-latest-dark-message-13530174.
Tollefson, Jeff. “Tracking QAnon: how Trump turned conspiracy-theory research upside down.” Nature, vol. 590, no. 7845, 11 Feb. 2021, pp. 192-193.
Van Prooijen, Jan-Willem, et al. “Conspiracy Beliefs Prospectively Predict Health Behavior and Well-Being during a Pandemic.” Psychological Medicine, 2021, pp. 1–25, doi:10.1017/s0033291721004438.
Warf, B., and J. Grimes. “Counterhegemonic Discourses and the Internet.” Geographical Review, vol. 87, no. 2, 1997, pp. 259. https://doi.org/10.2307/216008
Zuboff, Shoshana. “Big Other: Surveillance Capitalism and the Prospects of an Information Civilization.” Journal of Information Technology, vol. 30, no. 1, 2015, pp. 75–89. https://doi.org/10.1057/jit.2015.5
Jiayin Luo is an undergraduate student majoring in philosophy at UBC. He is interested in social and political philosophy.