This article is a study in anxiety with regard to social online spaces (SOS) conceived of as dark. There are two possible ways to define ‘dark’ in this context. The first is that communication is dark because it has limited distribution, is not open to all users (closed groups are a case example), or is hidden. The second definition, linked to the first, concerns the way that communication via these means is interpreted and understood. Dark social spaces disrupt the accepted top-down flow by the ‘gazing elite’ (data aggregators including social media), but anxious users might need to strain to notice what is out there, and this in turn destabilises one’s reception of the scene. In an environment where surveillance technologies are proliferating, this article examines contemporary, dark, interconnected, and interactive communications for the entangled affordances that might be brought to bear. One provocation is that resistance through counter-surveillance or “sousveillance” is a possibility. An alternative (or addition) is retreating to or building ‘dark’ spaces that are less surveilled and (perhaps counterintuitively) less fearful.
This article critically considers the notion of dark social online spaces via four broad socio-technical concerns connected to the big social media services that have helped increase a tendency toward fearful anxiety produced by surveillance and the perceived implications for personal privacy. It also shines light on the aspect of darkness whereby some users are spurred to actively seek alternative, dark social online spaces.
Since the 1970s, public-key cryptosystems have typically secured websites, emails, and sensitive health, government, and military data, but this protection is now being eroded (Williams). We have seen such systems exploited via cyberattacks and misappropriated data acquired by entities such as Facebook-Cambridge Analytica for targeted political advertising during the 2016 US elections. Via the notion of “parasitic strategies”, such events can be described as news/information hacks “whose attack vectors target a system’s weak points with the help of specific strategies” (von Nordheim and Kleinen-von Königslöw, 88). In accord with Wilson and Serisier’s arguments (178), emerging technologies facilitate rapid data sharing, collection, storage, and processing wherein subsequent “outcomes are unpredictable”. This would also include the effect of acquiescence.
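The public-key principle invoked above can be illustrated with textbook RSA. The sketch below uses toy-sized primes and no padding, so it shows only the mathematical idea (encrypting with a public key, decrypting with a private one), not any deployed system:

```python
# Toy RSA sketch: illustrative only. Real systems use 2048-bit+ primes
# and padding schemes; these textbook values are trivially breakable.
def make_keys(p, q, e):
    """Derive a public/private key pair from two primes and a public exponent."""
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)          # modular inverse of e (Python 3.8+)
    return (n, e), (n, d)

def encrypt(message, public_key):
    n, e = public_key
    return pow(message, e, n)    # c = m^e mod n

def decrypt(ciphertext, private_key):
    n, d = private_key
    return pow(ciphertext, d, n)  # m = c^d mod n

public, private = make_keys(61, 53, 17)   # textbook-sized primes
cipher = encrypt(65, public)
assert decrypt(cipher, private) == 65     # round-trip recovers the message
```

Anyone may encrypt with the public pair `(n, e)`, but only the holder of `d` can decrypt; the security the article references rests on the difficulty of recovering `d` from `n` alone at realistic key sizes.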
In regard to our digital devices, for some, being watched overtly—from cameras encased in toys, computers, and closed-circuit television (CCTV) to digital street ads that gauge human emotional responses in public places including bus stops, malls, and train stations—is becoming normalised (McStay, Emotional AI). It might appear that consumers immersed within this Internet of Things (IoT) are themselves comfortable interacting with devices that record sound and capture images for easy analysis and distribution across the communications networks. A counter-claim is that mainstream social media corporations have cultivated a sense of digital resignation “produced when people desire to control the information digital entities have about them but feel unable to do so” (Draper and Turow, 1824).
Careful consumers’ trust in mainstream media is waning, with readers observing a strong presence of big media players in the industry and carefully picking the publications and public intellectuals they follow (Mahmood, 6). A number now also avoid the mainstream internet in favour of alternate dark sites. This is done by users with “varying backgrounds, motivations and participation behaviours that may be idiosyncratic (as they are rooted in the respective person’s biography and circumstance)” (Quandt, 42).
By way of connection with dark internet studies via Biddle et al. (1; see also Lasica), the “darknet” is
a collection of networks and technologies used to share digital content … not a separate physical network but an application and protocol layer riding on existing networks. Examples of darknets are peer-to-peer file sharing, CD and DVD copying, and key or password sharing on email and newsgroups.
As we note from the quote above, the “dark web” uses existing public and private networks that facilitate communication via the Internet. Gehl (1220; see also Gehl and McKelvey) has detailed that this includes “hidden sites that end in ‘.onion’ or ‘.i2p’ or other Top-Level Domain names only available through modified browsers or special software. Accessing I2P sites requires a special routing program ... . Accessing .onion sites requires Tor [The Onion Router]”.
For some, this gives rise to social anxiety, read here as stemming from that which is not known, and an exaggerated sense of danger, which makes fight or flight seem the only options. This is often justified or exacerbated by the changing media and communication landscape and depicted in popular documentaries such as The Social Dilemma or The Great Hack, which affect public opinion on the unknown aspects of internet spaces and the uses of personal data.
The question for this article remains whether the fear of the dark is justified. Consider that most often one will choose to make one’s intimate bedroom space dark in order to have a good night’s rest. We might pleasurably escape into a cinema’s darkness for the stories told therein, or walk along a beach at night enjoying unseen breezes. Most do not avoid these experiences but actively seek them out. Drawing this thread through, the case made here is that agency can also be found in the dark by resisting socio-political structural harms.
1. Digital Futures and Anxiety of the Dark
Fear of the dark
I have a constant fear that something's always near
Fear of the dark
Fear of the dark
I have a phobia that someone's always there
The lyrics to the song “Fear of the Dark” (1992) by British heavy metal group Iron Maiden convey a sense that that which is unknown and unseen causes fear and anxiety. Holding a fear of the dark is not unusual and varies in degree for adults as it does for children (Fellous and Arbib). Such anxiety connected to the dark does not always concern darkness itself. It can also be a concern for the possible or imagined dangers concealed by the darkness, a result of cognitive-emotional interactions (McDonald, 16). Extending this claim is this article’s non-binary assertion: while for some, technology and what it can do is frequently misunderstood and shunned as a result, others embrace its possibilities, actively take it on, and learn by attentively partaking. Mistakes, solecisms, and frustrations are part of the process. Such conceptual theorising falls along a continuum of thinking.
Global interconnectivity of communications networks has certainly led to consequent concerns (Turkle Alone Together). Much focus for anxiety has been on the impact upon social and individual inner lives, levels of media concentration, and power over and commercialisation of the internet. Of specific note is that increasing commercial media influence—such as Facebook and its acquisition of WhatsApp, Oculus VR, Instagram, CTRL-labs (translating movements and neural impulses into digital signals), LiveRail (video advertising technology), Chainspace (blockchain)—regularly changes the overall dynamics of the online environment (Turow and Kavanaugh). This provocation was borne out recently when Facebook disrupted the delivery of news to Australian audiences via its service. Mainstream social online spaces (SOS) are platforms which provide more than the delivery of media alone and have been conceptualised predominantly in a binary light. On the one hand, they can be depicted as tools for the common good of society through notional widespread access and as places for civic participation and discussion, identity expression, education, and community formation (Turkle; Bruns; Cinque and Brown; Jenkins). This end of the continuum of thinking about SOS sits hard against the view, on the other hand, that SOS operate as businesses with strategies that manipulate consumers to generate revenue through advertising, data, venture capital for advanced research and development, and company profit. In between the two polar ends of this continuum are the range of other possibilities, the shades of grey, that add contemporary nuance to understanding SOS in regard to what they facilitate, what the various implications might be, and for whom.
By way of a brief summary, anxiety of the dark is steeped in four concerns. First are the practices of privacy-invasive social media giants such as Facebook and its ancillary companies. Second are the advertising technology companies, surveillance contractors, and intelligence agencies that collect and monitor our actions and related data, as well as the increased ease of use and interoperability brought about by Web 2.0, which has seen a disconnection between technological infrastructure and social connection that acts to limit user permissions and online affordances. Third are concerns for the negative effects associated with depressed mental health and wellbeing caused by “psychologically damaging social networks”, through sleep loss, anxiety, poor body image, harm to real-world relationships, and the fear of missing out (FOMO; Royal Society for Public Health (UK) and the Young Health Movement). Here the harms are both individual and societal. Fourth is the intended acceleration toward the post-quantum IoT (Fernández-Caramés), as quantum computing’s digital components are continually miniaturised, coupled with advances in electrical battery capacity and interconnected telecommunications infrastructures. The result is that the ontogenetic capacity of such powerfully advanced networks affords supralevel surveillance.
What this means is that through devices and the services that they provide, individuals’ data is commodified (Neff and Nafus; Nissenbaum and Patterson). Personal data is enmeshed in ‘things’, requiring that decisions that are overt, subtle, or hidden (dark) be scrutinised for the various ways they shape social norms and create consequences for public discourse, cultural production, and the fabric of society (Gillespie). Data and personal information are retrievable from devices, sharable in SOS, and potentially exposed across networks. For these reasons, some have chosen to go dark by being “off the grid”, judiciously selecting their means of communication and their ‘friends’.
2. Is There Room for Privacy Any More When Everyone in SOS Is Watching?
An interesting turn comes through counterarguments against overarching institutional surveillance that underscore the uses of technologies to watch the watchers. This involves a practice of counter-surveillance whereby technologies are tools of resistance used to go ‘dark’, deployed by political activists in protest situations both for communication and for avoiding surveillance. This is not new and has long existed in an increasingly dispersed media landscape (Cinque, Changing Media Landscapes). For example, counter-surveillance video footage has been accessed and made available via live-streaming channels, with commentary in SOS augmenting networking possibilities for niche interest groups or micropublics (Wilson and Serisier, 178). A further example is the WordPress site Fitwatch, appealing for an end to what the site claims are issues associated with police surveillance (fitwatch.org.uk and endpolicesurveillance.wordpress.com). Users of these sites are called to post police officers’ identity numbers and photographs in an attempt to identify “cops” that might act to “misuse” UK anti-terrorism legislation against activists during legitimate protests. Others interested in doing their own “monitoring” are invited to reach out to identified personal email addresses or other private (dark) messaging software and application services such as Telegram (freeware and cross-platform).
In their work on surveillance, Mann and Ferenbok (18) propose that there is an increase in “complex constructs between power and the practices of seeing, looking, and watching/sensing in a networked culture mediated by mobile/portable/wearable computing devices and technologies”. By way of critical definition, Mann and Ferenbok (25) clarify that “where the viewer is in a position of power over the subject, this is considered surveillance, but where the viewer is in a lower position of power, this is considered sousveillance”. It is the aspect of sousveillance that is empowering to those using dark SOS. One might consider that not all surveillance is “bad” nor institutionalised. It is neither overtly nor formally regulated—as yet. Like most technologies, many of the surveillant technologies are value-neutral until applied towards specific uses, according to Mann and Ferenbok (18). But this is part of the ‘grey area’ for understanding the impact of dark SOS in regard to which actors or what nations are developing tools for surveillance, where access and control lies, and with what effects into the future.
3. Big Brother Watches, So What Are the Alternatives: Whither the Gazing Elite in Dark SOS?
By way of conceptual genealogy, consideration of contemporary perceptions of surveillance in a visually networked society (Cinque, Changing Media Landscapes) might be usefully explored through a revisitation of Jeremy Bentham’s panopticon, applied here as a metaphor for contemporary surveillance. Arguably, this is a foundational theoretical model for integrated methods of social control (Foucault, Surveiller et Punir, 192-211), realised in the “panopticon” (prison) designed by Bentham in 1787 (Bentham and Božovič, 29-95) during a period of social reformation aimed at the improvement of the individual. Like the power for social control over the incarcerated in a panopticon, police power, in order that it be effectively exercised, “had to be given the instrument of permanent, exhaustive, omnipresent surveillance, capable of making all visible … like a faceless gaze that transformed the whole social body into a field of perception” (Foucault, Surveiller et Punir, 213–4).
In grappling with the impact of SOS for the individual and the collective in post-digital times, we can trace out these early ruminations on the complex documentary organisation through state-controlled apparatuses (such as inspectors and paid observers including “secret agents”) via Foucault (Surveiller et Punir, 214; Subject and Power, 326-7) for comparison to commercial operators like Facebook. Today, artificial intelligence (AI), facial recognition technology (FRT), and closed-circuit television (CCTV) for video surveillance are used for social control of appropriate behaviours. Exemplified by governments and the private sector is the use of combined technologies to maintain social order, from ensuring citizens cross the street only on green lights, to putting rubbish in the correct recycling bin or being publicly shamed, to making cashless payments in stores. These actions bring advantages for individual and collective safety, sustainability, and convenience, but also register forms of behaviour and attitudes with predictive capacities. This gives rise to suspicions about a permanent account of individuals’ behaviour over time. Returning to Foucault (Surveiller et Punir, 135), the impact of this is a dissociation of power from the individual, whereby they become unwittingly impelled into pre-existing social structures, leading to a ‘normalisation’ and acceptance of such systems. If we are talking about the dark, anxiety is key for a Ministry of SOS. Following Foucault again (Subject and Power, 326-7), there is the potential for a crawling, creeping governance that was once distinct but is itself increasingly hidden and growing. A blanket call for some form of ongoing scrutiny of such proliferating powers might be warranted, but with it comes regulation that, while offering certain rights and protections, is not without consequences.
For their part, a number of SOS platforms had little to no moderation of explicit content prior to December 2018. In terms of power, notwithstanding important anxiety connected to arguments that children and the vulnerable need protection from those who would seek to take advantage, this lack of moderation was a crucial aspect of community building and self-expression, resulting in considerable freedom of expression. In unearthing the extent to which individuals are empowered by the capacity to post sexual self-images, Tiidenberg (“Bringing Sexy Back”) considered that through dark SOS (read here as unregulated) some users could work in opposition to the mainstream consumer culture that provides select and limited representations of bodies and their sexualities. This links directly to Mondin’s exploration of the abundance of queer and feminist pornography on dark SOS as a “counterpolitics of visibility” (288). This work resulted in a reasoned claim that the technological structure of dark SOS created a highly political and affective social space that users valued. What also needs to be underscored is that many users believed such a space could not be replicated on mainstream SOS because of the differences in architecture and social norms. Cho (47) worked with this theory to claim that dark SOS are modern-day examples in a history of queer individuals having to rely on “underground economies of expression and relation”.
Discussions such as these complicate what dark SOS might now become in the face of ‘adult’ content moderation and emerging tracking technologies to close sites or locate individuals that transgress social norms. Further, broader questions are raised about how content moderation fits in with the public space conceptualisations of SOS more generally. Increasingly, “there is an app for that” where being able to identify the poster of an image or an author of an unknown text is seen as crucial. While there is presently no standard approach, models for combining instance-based and profile-based features such as SVM for determining authorship attribution are in development, with the result that potentially far less content will remain hidden in the future (Bacciu et al.).
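The profile-based side of such attribution models can be illustrated with a toy sketch (this is not the combined instance- and profile-based system Bacciu et al. describe, and the distance measure here is a simplified, Keselj-style one): each known author is reduced to a profile of character n-gram frequencies, and an unknown text is assigned to the author whose profile it most resembles.

```python
from collections import Counter

def char_ngrams(text, n=3):
    """Count overlapping character n-grams in a text."""
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def profile(texts, n=3, top=100):
    """Build an author profile: relative frequencies of the most common n-grams."""
    counts = Counter()
    for t in texts:
        counts.update(char_ngrams(t, n))
    total = sum(counts.values())
    return {g: c / total for g, c in counts.most_common(top)}

def dissimilarity(p1, p2):
    """Normalised squared difference over the union of both profiles' n-grams."""
    score = 0.0
    for g in set(p1) | set(p2):
        f1, f2 = p1.get(g, 0.0), p2.get(g, 0.0)
        score += ((f1 - f2) / ((f1 + f2) / 2)) ** 2
    return score

def attribute(unknown_text, author_profiles, n=3):
    """Assign the unknown text to the author with the nearest profile."""
    target = profile([unknown_text], n)
    return min(author_profiles, key=lambda a: dissimilarity(target, author_profiles[a]))
```

Systems of the kind Bacciu et al. discuss add instance-based features (classifying each document individually, e.g. with an SVM) on top of profiles like these, which is what makes de-anonymisation of hidden content plausible at scale.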
4. There’s Nothing New under the Sun (Ecclesiastes 1:9)
For some, “[the] high hopes regarding the positive impact of the Internet and digital participation in civic society have faded” (Schwarzenegger, 99). My participant observation over some years in various SOS, however, finds that critical concern has always existed. Views move along the spectrum of thinking, from deep scepticism (Stoll, Silicon Snake Oil) to wondrous techno-utopian promises (Negroponte, Being Digital). Indeed, concerns about the (then) new technologies of wireless broadcasting can be compared with today’s anxiety over the possible effects of the internet and SOS. Inglis (7) recalls,
here, too, were fears that humanity was tampering with some dangerous force; might wireless waves be causing thunderstorms, droughts, floods? Sterility or strokes? Such anxieties soon evaporated; but a sense of mystery might stay longer with evangelists for broadcasting than with a laity who soon took wireless for granted and settled down to enjoy the products of a process they need not understand.
As the analogy above makes clear, just as audiences came to use ‘the wireless’ and later the internet regularly, it is reasonable to argue that dark SOS will also gain widespread understanding and find greater acceptance. Dark social spaces are simply the most recent development of internet connectivity and communication more broadly. Dark SOS afford the choice to be connected beyond mainstream offerings, which some users avoid for their perceived manipulation of both content and users. As part of the wider array of dark web services, the resilience of dark social spaces is reinforced by the proliferation of users as opposed to decentralised replication. Virtual Private Networks (VPNs) can be used for anonymity in parallel to Tor access, but they guarantee anonymity only for the client; a VPN cannot guarantee anonymity from the server or the internet service provider (ISP). While users may adopt pseudonyms rather than the actual names seen on Facebook and other SOS, users continue to take into the virtual spaces they inhabit their off-line, ‘real’ foibles, problems, and idiosyncrasies (Chenault). To varying degrees, however, people also take their best intentions to their interactions in the dark. The hyper-efficient tools now deployed can intensify this, which is the great advantage attracting some users. In balance, however, in regard to online information access and dissemination, critical examination of what is in the public’s interest, and whether content should be regulated or controlled versus allowing a free flow of information where users self-regulate their online behaviour, is fraught. O’Loughlin (604) was one of the first to claim that there will be a voluntary loss of negative liberty or ‘freedom from’ (freedom from unwanted information or influence) and an increase in positive liberty or ‘freedom to’ (freedom to read or say anything); hence, freedom from surveillance and interference is a kind of negative liberty, consistent with both libertarianism and liberalism.
The early adopters of initial iterations of SOS were hopeful and liberal (utopian) in their beliefs about universality and ‘free’ spaces of open communication between like-minded others. This was a way of virtual networking using a visual motivation (led by images, text, and sounds) for consequent interaction with others (Cinque, Visual Networking). The structural transformation of the public sphere in a Habermasian sense—now found in SOS and their darker, hidden, or closed social spaces that might ensure a counterbalance to the power of those with influence—towards all having equal access to platforms for presenting their views, and doing so respectfully, is as ever problematised. Broadly, however, this is no more so than for mainstream SOS or for communicating in the world at large.
Bacciu, Andrea, Massimo La Morgia, Alessandro Mei, Eugenio Nerio Nemmi, Valerio Neri, and Julinda Stefa. “Cross-Domain Authorship Attribution Combining Instance Based and Profile-Based Features.” CLEF (Working Notes). Lugano, Switzerland, 9-12 Sep. 2019.
Bentham, Jeremy, and Miran Božovič. The Panopticon Writings. London: Verso Trade, 1995.
Biddle, Peter, et al. “The Darknet and the Future of Content Distribution.” Proceedings of the 2002 ACM Workshop on Digital Rights Management. Vol. 6. Washington DC, 2002.
Bruns, Axel. Blogs, Wikipedia, Second Life, and Beyond: From Production to Produsage. New York: Peter Lang, 2008.
Chenault, Brittney G. “Developing Personal and Emotional Relationships via Computer-Mediated Communication.” CMC Magazine 5.5 (1998). 1 May 2020 <http://www.december.com/cmc/mag/1998/may/chenault.html>.
Cho, Alexander. “Queer Reverb: Tumblr, Affect, Time.” Networked Affect. Eds. K. Hillis, S. Paasonen, and M. Petit. Cambridge, Mass.: MIT Press, 2015: 43-58.
Cinque, Toija. Changing Media Landscapes: Visual Networking. London: Oxford UP, 2015.
———. “Visual Networking: Australia's Media Landscape.” Global Media Journal: Australian Edition 6.1 (2012): 1-8.
Cinque, Toija, and Adam Brown. “Educating Generation Next: Screen Media Use, Digital Competencies, and Tertiary Education.” Digital Culture & Education 7.1 (2015).
Draper, Nora A., and Joseph Turow. “The Corporate Cultivation of Digital Resignation.” New Media & Society 21.8 (2019): 1824-1839.
Fellous, Jean-Marc, and Michael A. Arbib, eds. Who Needs Emotions? The Brain Meets the Robot. New York: Oxford UP, 2005.
Fernández-Caramés, Tiago M. “From Pre-Quantum to Post-Quantum IoT Security: A Survey on Quantum-Resistant Cryptosystems for the Internet of Things.” IEEE Internet of Things Journal 7.7 (2019): 6457-6480.
Foucault, Michel. Surveiller et Punir: Naissance de la Prison [Discipline and Punish—The Birth of The Prison]. Trans. Alan Sheridan. New York: Random House, 1977.
Foucault, Michel. “The Subject and Power.” Michel Foucault: Power, the Essential Works of Michel Foucault 1954–1984. Vol. 3. Trans. R. Hurley and others. Ed. J.D. Faubion. London: Penguin, 2001.
Gehl, Robert W. Weaving the Dark Web: Legitimacy on Freenet, Tor, and I2P. Cambridge, Massachusetts: MIT Press, 2018.
Gehl, Robert, and Fenwick McKelvey. “Bugging Out: Darknets as Parasites of Large-Scale Media Objects.” Media, Culture & Society 41.2 (2019): 219-235.
Gillespie, Tarleton. Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media. London: Yale UP, 2018.
Habermas, Jürgen. The Structural Transformation of the Public Sphere: An Inquiry into a Category of Bourgeois Society. Trans. Thomas Burger with the assistance of Frederick Lawrence. Cambridge, Mass.: MIT Press, 1989.
Inglis, Ken S. This Is the ABC: The Australian Broadcasting Commission 1932–1983. Melbourne: Melbourne UP, 1983.
Iron Maiden. “Fear of the Dark.” London: EMI, 1992.
Jenkins, Henry. Convergence Culture: Where Old and New Media Collide. New York: New York UP, 2006.
Lasica, J. D. Darknet: Hollywood’s War against the Digital Generation. New York: John Wiley and Sons, 2005.
Mahmood, Mimrah. “Australia's Evolving Media Landscape.” 13 Apr. 2021 <https://www.meltwater.com/en/resources/australias-evolving-media-landscape>.
Mann, Steve, and Joseph Ferenbok. “New Media and the Power Politics of Sousveillance in a Surveillance-Dominated World.” Surveillance & Society 11.1/2 (2013): 18-34.
McDonald, Alexander J. “Cortical Pathways to the Mammalian Amygdala.” Progress in Neurobiology 55.3 (1998): 257-332.
McStay, Andrew. Emotional AI: The Rise of Empathic Media. London: Sage, 2018.
Mondin, Alessandra. “‘Tumblr Mostly, Great Empowering Images’: Blogging, Reblogging and Scrolling Feminist, Queer and BDSM Desires.” Journal of Gender Studies 26.3 (2017): 282-292.
Neff, Gina, and Dawn Nafus. Self-Tracking. Cambridge, Mass.: MIT Press, 2016.
Negroponte, Nicholas. Being Digital. New York: Alfred A. Knopf, 1995.
Nissenbaum, Helen, and Heather Patterson. “Biosensing in Context: Health Privacy in a Connected World.” Quantified: Biosensing Technologies in Everyday Life. Ed. Dawn Nafus. Cambridge, Mass.: MIT Press, 2016. 68-79.
O’Loughlin, Ben. “The Political Implications of Digital Innovations.” Information, Communication and Society 4.4 (2001): 595–614.
Quandt, Thorsten. “Dark Participation.” Media and Communication 6.4 (2018): 36-48.
Royal Society for Public Health (UK) and the Young Health Movement. “#Statusofmind.” 2017. 2 Apr. 2021 <https://www.rsph.org.uk/our-work/campaigns/status-of-mind.html>.
Schwarzenegger, Christian. “Communities of Darkness? Users and Uses of Anti-System Alternative Media between Audience and Community.” Media and Communication 9.1 (2021): 99-109.
Statista. “Number of IoT Devices 2015-2025.” 27 Nov. 2020 <https://www.statista.com/statistics/471264/iot-number-of-connected-devices-worldwide/>.
Stoll, Clifford. Silicon Snake Oil: Second Thoughts on the Information Highway. New York: Anchor, 1995.
Tiidenberg, Katrin. “Bringing Sexy Back: Reclaiming the Body Aesthetic via Self-Shooting.” Cyberpsychology: Journal of Psychosocial Research on Cyberspace 8.1 (2014).
The Great Hack. Dirs. Karim Amer, Jehane Noujaim. Netflix, 2019.
The Social Dilemma. Dir. Jeff Orlowski. Netflix, 2020.
Turkle, Sherry. The Second Self: Computers and the Human Spirit. Cambridge, Mass.: MIT Press, 2005.
Turkle, Sherry. Alone Together: Why We Expect More from Technology and Less from Each Other. UK: Hachette, 2017.
Turow, Joseph, and Andrea L. Kavanaugh, eds. The Wired Homestead: An MIT Press Sourcebook on the Internet and the Family. Cambridge, Mass.: MIT Press, 2003.
Von Nordheim, Gerret, and Katharina Kleinen-von Königslöw. “Uninvited Dinner Guests: A Theoretical Perspective on the Antagonists of Journalism Based on Serres’ Parasite.” Media and Communication 9.1 (2021): 88-98.
Williams, Chris K. “Configuring Enterprise Public Key Infrastructures to Permit Integrated Deployment of Signature, Encryption and Access Control Systems.” MILCOM 2005-2005 IEEE Military Communications Conference. IEEE, 2005.
Wilson, Dean, and Tanya Serisier. “Video Activism and the Ambiguities of Counter-Surveillance.” Surveillance & Society 8.2 (2010): 166-180.