In 2011, a woman stood outside an abortion clinic in Manhattan and asked Siri a simple question: “Where can I find an abortion clinic?” Siri, the algorithmic virtual assistant on Apple’s iPhone, replied, “Sorry, I couldn’t find any abortion clinics.” The exchange sparked outrage online as bloggers expressed their dismay at Siri’s inability to locate abortion clinics (Keizer, 2011; Keenan, 2014: 159-160). Abortion rights advocates joined forces with the American Civil Liberties Union and the National Abortion Rights Action League in an online petition pressing Apple to rectify the issue (Potter, 2011). Apple defended Siri by explaining that it was not anti-abortion; Siri had simply failed to locate abortion clinics because they operated under different names, such as “Planned Parenthood” (Sullivan, 2011; Sutter, 2011). The difference in labels, Apple claimed, meant that its algorithm was unable to match the question with the expected result. However, when asked where one could dispose of a dead body, Siri readily directed the user to “dumps, swamps, mines, reservoirs, or metal foundries” (Rivas, 2011). Whatever the reason, the result was that the algorithm driving Siri supplied information in one value-laden area while leaving another underdeveloped.
If every era calls for a unique characterization, then a case could convincingly be made for naming our present time the age of algorithms. Certainly, over the past five years the word “algorithm” has appeared with increasing prominence in mainstream news, not infrequently featuring on the front page of mass-circulation newspapers (e.g. McIntyre and Pegg, 2018). Oddly, for a word that has been in use in English since the Middle Ages, “algorithm” has only lately become headline news. Yet, to most people, the word still conjures little more than a vague sense of something mathematical.
A basic definition of an algorithm is a set of step-by-step instructions that leads to a specific result. Think of a recipe for a cake (or IKEA furniture), where the instructions, if followed correctly, result in the desired output: the cake (or the bookcase). That kind of simplification only takes us so far. There is a significant gap between the simple abstract idea and the complexity of algorithms as they are embedded in actual technology. What is lacking is not just a matter of language, and so the remedy won’t be found by picking up a dictionary. The problem is at least as much a matter of public awareness and understanding. Although algorithms appear everywhere, people are often unable to perceive their presence or fully understand their operation. In large part, this is because algorithms never present themselves to the public in their raw form, as a complex series of commands in an artificial language.
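To make the recipe analogy concrete, the following toy example (my own invention, not drawn from any system discussed here) is an algorithm in precisely this sense: a fixed sequence of steps that, followed correctly, always yields the same result.

```python
def average_step_count(daily_steps):
    """A toy algorithm in the recipe sense.
    Step 1: start a running total at zero.
    Step 2: add each day's step count to the total.
    Step 3: divide the total by the number of days.
    """
    total = 0
    for steps in daily_steps:
        total += steps
    return total / len(daily_steps)
```

Followed correctly, these three steps always produce the same output for the same input; the gap the article points to lies between instructions this transparent and the opaque, embedded code of real systems.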
As a result, while algorithms provide medical diagnoses, generate the “filter bubbles” that feed us with information, direct the robotic machinery that assembles vehicles, execute financial transactions in milliseconds, and even match us with potential partners through Internet dating sites, we rarely perceive that an algorithm was at work. As lines of code, embedded within other technologies, algorithms are neither audible nor visible (Mackenzie and Vurdubakis, 2011). When we conduct a Google search, for example, or respond to content on Facebook, we do not see or hear how the algorithms arrive at their answers. What we receive is simply the end-product of their computation.
One of Michel Foucault’s best-known remarks on his reconceptualization of power was to point out that he did not hold the view that “everything is bad, but that everything is dangerous…” (Foucault, 1983: 231). Such a stance seems especially suited to addressing algorithms. It would be absurd to say that algorithms per se are bad, but the evidence of their potential danger is overwhelming. One clear danger is, as hinted above, their invisibility, because this forecloses the possibility of democratic accountability: we cannot hold to account that of which we know nothing. As algorithms make their way into our homes, our financial systems, our medical clinics and ultimately our minds, I believe that a need has arisen for raising public awareness of the role they play in contemporary society. The character and urgency of this need is perhaps best articulated by David Beer, when he explains that the decision-making powers of emergent and established software algorithms now present a challenge to human agency (2009: 987).
Theatre can play a vital role in creating such awareness and even in re-humanizing agency. Certainly, in our increasingly digital society, technology has become “far too important to leave only to engineers; humanists, social scientists, historians, and citizens should also have their say” (Hecht and Allen, 2001: 20). Cathy Turner and Synne K. Behrndt concur, stating that there is a need for new dramaturgical practices capable of understanding new technologies (2008: 201). However, it is not immediately obvious how this is to be achieved. Increased digitization means new forms of interaction and participation, posing “new challenges for theatre that are only beginning to be understood. It offers new audiences and new communities. And it demands new forms of performance and new spaces to show it” (Adams, 2014: ix). Perhaps what is required is a rethinking of the theatre practitioner’s toolkit.
This is, in part, what I sought to explore through the creation of the theatre app Dysconnect. Specifically, through Dysconnect I have sought to create a practice that not only incorporates algorithms as subject matter but also loads algorithms into the dramaturgical structure itself, where they manifest as (mainly unwelcome) digital side effects (or “dysconnects”). The app, together with the underpinning theoretical concepts, is presented as a digital political dramaturgy: in short, a dramaturgy that makes use of and manipulates digital technology, rather than simply describing or dramatizing it. The objective is to reveal the power effects of algorithms by mimicking and manipulating part of their operation within a digitized theatre piece. The aim is to generate an experiential knowledge, in which the audience is given the agency to act against algorithmic control, rather than being made passive viewers of a preconceived political message.
The theatre app has two main functions. Firstly, it plays eight individual “PodPlays,” a term I use for what are essentially 4-10-minute pieces of audio theatre, with the intention of distinguishing them from the conventional form of radio drama. Each PodPlay explores a specific algorithm or set of interconnected algorithms, such as search algorithms (Drowning), or the use of algorithms in a particular sphere of social life (e.g. criminal justice algorithms in High Risk).
The eight PodPlays draw on three political dramaturgies: Sarah Grochala’s articulation of “liquid dramaturgies” (2017) and a politics of structure; Matthew Causey’s development of “post-digital aesthetics” (2016), which incorporates defining features of digital technology within performance practices; and the more general form of the absurdist dystopia, which combines elements of the first two to bring some of the potential dangers of current digital practices into sharper relief.
For example, the PodPlay Let’s Google it! is set at a fancy dress party. During the party, the main character, Kay, meets Helen. They do not know each other and the conversation is labored and awkward. To fill the silences, they resort to Googling. The naturalism generated by this situation is broken as the Googling intensifies when Helen’s boyfriend Stanley joins the conversation. Initially, answers generated by Google are preceded by the comment “Let’s Google it.” Towards the end, this phrase disappears and the characters begin to speak in a prose of prediction-after-prediction, human turn-taking replaced by algorithms interacting. This structure can be understood as a liquid dramaturgy, in which socio-psychological models of causation give way to an algorithmic logic and plot structure takes on the form of digital automation. Rather than stating the potential danger of relying on Google as a source of information, Let’s Google it! offers a model of political theatre that operates “predominantly through a politics of form as opposed to a politics of content” (Grochala, 2017: 17). The form is generated by allowing Google Autocomplete to generate text, until the reliance on the algorithm for information is reduced to the absurd.
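The text-generating procedure behind Let’s Google it! can be sketched as a simple chaining loop, in which each completion becomes the next query. The suggestion table below is invented for illustration; in the actual practice, prompts were typed into Google and the live Autocomplete suggestions copied and pasted into the script.

```python
# Invented lookup table standing in for live Google Autocomplete results.
SUGGESTIONS = {
    "why do we": "why do we yawn",
    "why do we yawn": "why do we yawn when we are tired",
    "why do we yawn when we are tired": "why do we yawn when we are tired and bored",
}

def autocomplete_chain(prompt, max_turns=10):
    """Chain completions: each suggestion becomes the next prompt,
    producing the 'prediction-after-prediction' dialogue pattern."""
    lines = []
    for _ in range(max_turns):
        completion = SUGGESTIONS.get(prompt)
        if completion is None:
            break
        lines.append(completion)
        prompt = completion  # the answer becomes the next query
    return lines
```

Run on the toy table, the loop produces ever-longer predicted phrases until the table runs out, a small-scale version of the dialogue collapsing into algorithmic absurdity.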
Using Google Autocomplete in this way also incorporates a postdigital aesthetic through a practice of copying and pasting. This process of allowing the algorithms to generate an answer, which is then copied and pasted, replicates a pattern that “constitute[s] the signature aesthetic and cerebral organization of the postdigital” (Causey, 2016: 439). Through this process of copying and pasting, the power of the algorithm is challenged: absurd, racist, and contradictory answers are foregrounded so as to recover a measure of human agency.
Secondly, the app generates digital side effects (or “dysconnects”) connected to the algorithm explored within each PodPlay. These side effects are live enactments of algorithmic control, digital “hacks” embedded within the experience. By shifting between different digital platforms, such as Facebook messages, emails and texts, incorporating footage from the smartphone’s camera, or utilizing functions such as vibration and heat, the theatre app interacts with and enacts control over the audience. Agency is here encouraged not by telling the audience what they can do to protect themselves against such practices, but by showing them.
For example, the PodPlay FitChip explores “health tracking algorithms,” depicting a dystopia where people implant so-called “FitChips,” devices that automatically measure their step count and calorie intake. As a digital enactment of the concept, the app is programmed to play the PodPlay only while the listener is moving. The moment they stop, the PodPlay stops playing and the app automatically sends an alert to the user, encouraging them to keep moving. After listening, the user is sent, via text message, recommendations of food, drinks, and diets depending on their step count. They are also told how their average step count compares to general health advice and are either reprimanded or praised, depending on their achievement. In other words, an algorithm within the app generates a direct experience of the type of algorithm explored in the content of the PodPlay. Content and form merge to make the hidden relationship between person and algorithm more audible and accountable.
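The motion-gating described above can be sketched as follows. The class and method names, and the three-second timeout, are my own assumptions for illustration; the app’s actual implementation is not reproduced here.

```python
STEP_TIMEOUT = 3.0  # assumed: seconds without a detected step before playback pauses

class FitChipPlayer:
    """Sketch of FitChip's motion-gated playback: the PodPlay runs only
    while steps keep arriving; when they stop, playback pauses and the
    app issues a 'keep moving' alert."""

    def __init__(self):
        self.playing = False
        self.last_step = None
        self.alerts = []

    def on_step(self, now):
        # A step event (e.g. from the phone's accelerometer) resumes playback.
        self.last_step = now
        self.playing = True

    def tick(self, now):
        # Called periodically: pause and alert if no step arrived recently.
        if self.playing and (now - self.last_step > STEP_TIMEOUT):
            self.playing = False
            self.alerts.append("Keep moving to continue the PodPlay!")
```

The point of the sketch is the coupling itself: the same conditional that gates playback is what turns the listener’s body into the input of the piece, making the algorithm’s control physically felt rather than merely described.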
It is in this trio of subject matter, narrative structure, and digital side effects, that I conceptualize a “digital political dramaturgy,” and it is through their imbrication that I seek to encourage political agency. This practice embeds and integrates the digital within both the micro and the macro dramaturgies of the performance; the macro-dramaturgy relates to the app and its functions, while the micro-dramaturgy refers to the dramaturgical structure of each individual script (the content and structure of the eight written PodPlays). This creates a multi-layered, networked theatre experience that spreads across different digital spaces and multiple real-time listening locations.
While the app itself is nearly complete, there are many exciting possibilities for further exploration in this area. One avenue of particular promise is investigating how dramaturgical structures can take on algorithmic or digital forms in order to make their operation visible. Such explorations have the potential to give the audience experiential knowledge rather than knowledge based on labored wiki-explanations, knowledge that places itself within the digital networks of power and creates an impression of their operation. In a world in which “awareness” is increasingly accessed through clicks, scrolls, and swipes, such dramaturgies hold the promise of new, exciting ways of storytelling and, most importantly, of exposing digital power and re-humanizing agency in the age of algorithms.
Dysconnect is part of an ongoing Arts and Humanities Research Council (AHRC) funded PhD project, supported by Curve Theatre, Leicester, and developed in collaboration with the Department of Mathematics and Computer Science, Karlstad University, Sweden. It is due to be completed in 2019.
Adams, Matt. 2014. ‘Foreword’, in Theatre and the Digital (Hampshire: Palgrave Macmillan)
Allen, Michael Thad and Hecht, Gabrielle (eds.). 2001. Technologies of Power: Essays in Honor of Thomas Parke Hughes and Agatha Chipley Hughes (Cambridge, MA: The MIT Press)
Beer, David. 2009. ‘Power through the algorithm? Participatory web cultures and the technological unconscious’, New Media & Society, 11.6, 985-1002
Causey, Matthew. 2016. ‘Postdigital Performance’, Theatre Journal, 68.3, 427-441
Foucault, Michel. 1983. ‘On the Genealogy of Ethics: An Overview of Work in Progress’, in Hubert L. Dreyfus and Paul Rabinow, Michel Foucault: Beyond Structuralism and Hermeneutics, 2nd edn (Chicago: University of Chicago Press)
Giannachi, Gabriella. 2007. The Politics of New Media Theatre (New York: Routledge)
Grochala, Sarah. 2017. The Contemporary Political Play: Rethinking Dramaturgical Structure (London: Bloomsbury)
Keizer, Gregg. 2011. ‘Apple’s Siri balks at abortion queries, pro-choice advocates charge’, Computerworld, 30 November <https://www.computerworld.com/article/2499605/mobile-apps/apple-s-siri-balks-at-abortion-queries–pro-choice-advocates-charge.html> [accessed 1 August 2018]
Keenan, Thomas P. 2014. Technocreep: The Surrender of Privacy and the Capitalization of Intimacy (Vancouver: Greystone Books)
Mackenzie, Adrian and Vurdubakis, Theo. 2011. ‘Codes and Codings in Crisis: Signification, Performativity and Excess’, Theory, Culture & Society, 28.6, 3-23
McIntyre, N. and Pegg, D. 2018. ‘Data on thousands of children used to predict risk of gang exploitation’, The Guardian, 17 September <https://www.theguardian.com/society/2018/sep/17/data-on-thousands-of-children-used-to-predict-risk-of-gang-exploitation> [accessed 22 October 2018]
Potter, N. 2011. ‘Siri Abortion Debate: ACLU, NARAL on Apple “Glitch”’, ABC News, 3 December <https://abcnews.go.com/Technology/siri-abortion-aclu-naral-abortion-rights-opponents-apple/story?id=15077020> [accessed 1 August 2018]
Rivas, J. 2011. ‘Feminism Fail: iPhone’s Siri Stumped By Abortion But Can Easily Find Viagra’, Colorlines, 29 November <https://www.colorlines.com/articles/feminism-fail-iphones-siri-stumped-abortion-can-easily-find-viagra> [accessed 1 August 2018]
Sullivan, D. 2011. ‘Why Siri Can’t Find Abortion Clinics & How It’s Not An Apple Conspiracy’, Search Engine Land <https://searchengineland.com/why-siri-cant-find-abortion-clinics-103349> [accessed 1 August 2018]
Sutter, J.D. 2011. ‘Siri can’t direct you to an abortion clinic’, CNN, 1 December <https://edition.cnn.com/2011/12/01/tech/mobile/abortion-clinic-siri-iphone/index.html> [accessed 1 August 2018]
Turner, Cathy and Behrndt, Synne K. 2008. Dramaturgy and Performance (Hampshire: Palgrave Macmillan)