During the War on Terror, the United States military has been conducting an increasing number of foreign campaigns by remote control using drones—also called unmanned aerial vehicles (UAVs) or remotely piloted vehicles (RPVs)—to extend the reach of military power and augment the technical precision of targeted strikes while minimizing bodily risk to American combatants. Stationed on bases throughout the American Southwest, operators fly weaponized drones over the Middle East. Viewing the battle zone through a computer screen that presents them with imagery captured from a drone-mounted camera, these combatants participate in war from a safe distance via an interface that resembles a video game. Increasingly, this participation takes the form of targeted killing.
Despite their relative physical safety, reports began mounting in 2008 that, like boots-on-the-ground combatants, many drone operators seek the services of chaplains or other mental health professionals to deal with the emotional toll of their work (Associated Press; Schachtman). Questions about the nature of the stress or trauma that drone operators experience have become a trope in news coverage of drone warfare (see Bumiller; Bowden; Saletan; Axe). This was exemplified in May 2013, when former Air Force drone pilot Brandon Bryant became a public figure after speaking to National Public Radio about his remorse for participating in targeted killing strikes and his subsequent struggle with post-traumatic stress (PTS) (Greene and McEvers). Stories like Bryant’s express American culture’s struggle to understand the role screen-mediated, remotely controlled killing plays in shifting the location of combatants’ sense of moral agency, that is, their sense of their ability to act based on their own understanding of right and wrong. Historically, one of the primary ways psychiatry has conceptualized combat trauma has been as combatants’ psychological response to losing their sense of moral agency on the battlefield (Lifton).
This article uses the popular science fiction novel Ender’s Game as an analytic lens through which to examine how screen-mediated warfare may result in combat trauma by compromising moral agency. The goal of this analysis is not to describe the present state of drone operators’ experience (see Asaro), but rather to compare and contrast contemporary public discourses on the psychological impact of screen-mediated war with the way it is represented in one of the most influential science fiction novels of all time. (The book won the Nebula Award in 1985 and the Hugo Award in 1986, and appears on both the Modern Library 100 Best Novels and the American Library Association’s “100 Best Books for Teens” lists.) In so doing, the paper aims to counter prevalent modes of critical analysis of screen-mediated war that cannot account for drone operators’ trauma.
For decades, critics of postmodern warfare have denounced how fighting from inside tanks, from the cockpits of planes, or at office desks has removed combatants from the experiences of risk and endangerment that historically characterized war (see Gray; Levidow and Robins). They suggest that screen-mediation enables not only physical but also cognitive and emotional distance from the violence of war-fighting by circumscribing it in a “magic circle.” Virtual worlds scholars adopted the term “magic circle” from cultural historian Johan Huizinga, who described it as the membrane that separates the time and space of game-play from those of real life (Salen and Zimmerman). While military scholars have long recognized that only 2% of soldiers can kill without hesitation (Grossman), critics of “video game wars” suggest that screen-mediation puts war in a magic circle, thereby creating cyborg human-machine assemblages capable of killing in cold blood. In other words, these critics argue that screen-mediated war distributes agency between humans and machines in such a way that human combatants do not feel morally responsible for killing. In contrast, Ender’s Game suggests that even when militaries use video game aesthetics to create weapons-control interfaces, screen-mediation alone ultimately cannot blur the line between war and play and thereby psychically shield cyborg soldiers from combat trauma.
Orson Scott Card’s 1985 novel Ender’s Game—and its 2013 film adaptation—tells the story of a young boy at an elite military academy. Set several decades after a terrible war between humans and an alien race called the buggers, the novel follows Andrew “Ender” Wiggin, whom recruiters take from his family at age six to begin military training. He excels in all areas and eventually enters officer training. There he encounters a new video game-like simulator in which he commands spaceship battalions against increasingly complex configurations of bugger ships. At the novel’s climax, Ender’s mentor, war hero Mazer Rackham, brings him to a room crowded with high-ranking military personnel to take his final test on the simulator. In order to win, Ender opts to launch a massive bomb, nicknamed the “Little Doctor,” at the bugger home world. The image on his screen of a ball of space dust where the enemy planet once sat is met by victory cheers. Mazer then informs Ender that ever since he began officer training, he has been remotely controlling real ships. The video game war was “Real. Not a game” (Card 297); Ender has exterminated the bugger species. But rather than join the celebration, Ender is devastated to learn he has committed “xenocide.” Screen-mediation, the novel shows, can enable people to commit acts that they would otherwise find heinous.
US military advisors have used the story to set an agenda for research and development in augmented media. For example, Dr. Michael Macedonia, Chief Technology Officer of the Army Office for Simulation, Training, and Instrumentation, told a reporter for the New York Times that Ender’s Game “has had a lot of influence on our thinking” about how to use video game-like technologies in military operations (Harmon; Silberman; Mead). Many recent programs to develop and study video game-like military training simulators have been directly inspired by the book and its promise of turning even a six-year-old into a competent combatant through well-structured human-computer interaction (Mead). The novel’s moral regarding the psychological impact of actual screen-mediated combat, however, did not dissuade military investment in drone warfare. The Air Force began using drones for surveillance during the Gulf War; during the Global War on Terror, it began equipping them with weapons. By 2010, the US military operated over 7,000 drones, including over 200 weapons-ready Predator and Reaper drones, and it now invests upwards of three billion dollars a year in the drone program (Zucchino).
While there are significant differences between contemporary drone warfare and the plot of Ender's Game—including the fact that Ender is a child, that he alone commands a fleet, that he thinks he is playing a game, and that, except for a single weapon of mass destruction, he and his enemies are equally well equipped—for this analysis, I will focus on their most important similarities: both Ender and actual drone operators work on teams for long shifts using video game-like technology to remotely control vehicles in aerial combat against an enemy.
After he uses the Little Doctor, Mazer and Graff, Ender's long-time training supervisors, first work to circumvent his guilt by reframing his actions as heroic.
“You're a hero, Ender. They've seen what you did, you and the others. I don't think there's a government on Earth that hasn't voted you their highest medal.”
“I killed them all, didn't I?” Ender asked.
“All who?” asked Graff. “The buggers? That was the idea.”
Mazer leaned in close. “That's what the war was for.”
“All their queens. So I killed all their children, all of everything.”
“They decided that when they attacked us. It wasn't your fault. It's what had to happen.”
Ender grabbed Mazer's uniform and hung onto it, pulling him down so they were face to face. “I didn't want to kill them all. I didn't want to kill anybody! I'm not a killer! […] but you made me do it, you tricked me into it!” He was crying. He was out of control. (Card 297–8)
The novel up to this point has led us to believe that Ender at the very least understands that what he does in the game will be asked of him in real life. But his traumatic response to learning the truth reveals that he was in the magic circle. When he thinks he is playing a game, succeeding is a matter of ego: he wants to be the best, to live up to the expectations of his trainers that he is humanity’s last hope. When the magic circle is broken, Ender reconsiders his decision to use the Little Doctor. Tactics he could justify to win the game, reframed as real military tactics, threaten his sense of himself as a moral agent. Being told he is a hero provides no solace.
Card wrote the novel during the Cold War, when computers were coming to play an increasingly large role in military operations. Historians of military technology have shown that during this time human behavior began to be defined in machine-like, functionalist terms by scientists working on cybernetic systems (see Edwards; Galison; Orr). Human skills were defined as components of large technological systems, such as tanks and anti-aircraft weaponry: a human skill was treated as functionally the same as a machine one. The only issue of importance was how all the components could work together in order to meet strategic goals—a cybernetic problem. The reasons that Mazer and Graff have for lying to Ender suggest that the author believed that as a form of technical augmentation, screen-mediation can be used to evacuate individual moral agency and submit human will to the command of the larger cybernetic system. Issues of displaced agency in the military cyborg assemblage are apparent in the following quote, in which Mazer compares Ender himself to the bomb he used to destroy the bugger home world: “You had to be a weapon, Ender. Like a gun, like the Little Doctor, functioning perfectly but not knowing what you were aimed at. We aimed you. We're responsible. If there was something wrong, we did it” (298). Questions of distributed agency have also surfaced in the drone debates. Government and military leaders have attempted to depersonalize drone warfare by assuring the American public that the list of targets is meticulously researched: drones kill those who we need killed. Drone warfare, media theorist Peter Asaro argues, has “created new and complex forms of human-machine subjectivity” that cannot be understood by considering the agency of the technology alone because it is distributed between humans and machines (25). 
While our leaders’ decisions about whom to kill are central to this new cyborg subjectivity, the operators who fire the weapons nevertheless experience at least a retrospective sense of agency. As phenomenologist John Protevi notes, in the wake of wars fought by modern military networks, many veterans diagnosed with PTS still express guilt and personal responsibility for the outcomes of their participation in killing (Protevi).
Mazer and Graff explain that the two qualities that make Ender such a good weapon also create an imperative to lie to him: his compassion and his innocence. For his trainers, compassion means a capacity to truly think like others, friend or foe, and understand their motivations. Graff explains that while his trainers recognized Ender's compassion as an invaluable tool, they also recognized that it would preclude his willingness to kill.
It had to be a trick or you couldn't have done it. It's the bind we were in. We had to have a commander with so much empathy that he would think like the buggers, understand them and anticipate them. So much compassion that he could win the love of his underlings and work with them like a perfect machine, as perfect as the buggers. But somebody with that much compassion could never be the killer we needed. Could never go into battle willing to win at all costs. If you knew, you couldn't do it. If you were the kind of person who would do it even if you knew, you could never have understood the buggers well enough. (298)
In learning that the game was real, Ender learns that he was not merely coming to understand a programmed simulation of bugger behavior but their actual psychology. His compassion has therefore not only helped him understand the buggers’ military strategy but also led him to identify with them.
Like Ender, drone operators spend weeks or months following their targets, getting to know them and their routines from a God’s-eye perspective, and they too watch the repercussions of their missions on screen. Unlike fighter pilots, who drop bombs and fly away, drone operators fly much closer to the ground and use high-resolution cameras both while carrying out strikes and while assessing their results. As one drone operator interviewed by the Los Angeles Times explained, “When I flew the B-52, it was at 30,000 to 40,000 feet, and you don't even see the bombs falling … Here, you're a lot closer to the actual fight, or that's the way it seems” (Zucchino). Brookings Institution scholar Peter Singer has argued that in this way screen-mediation actually enables a more intimate experience of violence for drone operators than for airplane pilots (Singer).
The second reason Ender’s trainers give for lying is that they need someone not only compassionate but also innocent of the horrors of war. “And it had to be a child, Ender,” the war veteran Mazer explains. “You were faster than me. Better than me. I was too old and cautious. Any decent person who knows what warfare is can never go into battle with a whole heart. But you didn't know. We made sure you didn't know” (298). When Ender discovers what he has done, he loses not only his innocence but also his sense of himself as a moral agent. After such a trauma, his heart is no longer whole.
Actual drone operators are, of course, not kept in a magic circle, innocent of the repercussions of their actions. Nor do they otherwise feel as though they are playing, as several have publicly stated. Instead, they report finding drone work tedious, and some even play video games for fun (Asaro). However, Air Force recruitment advertising draws clear analogies between the skills it seeks and those of video game play (Brown). Though the first generations of drone operators were pulled from the ranks of aircraft pilots, in 2009 the Air Force began training operators with no prior flight experience. Many drone operators, then, enter the role with no other military service and may come to it believing, on some level, that their work will be play.
Recent military studies have raised doubts about whether drone operators really experience high rates of trauma, suggesting that the stresses they report are seated instead in occupational issues like long shifts (Ouma, Chappelle, and Salinas; Chappelle and Salinas). But several critics of these studies have pointed out that there is a taboo against speaking about feelings of regret and trauma in the military in general and among drone operators in particular. A PTS diagnosis can end a military career; given the Air Force’s career-focused recruiting emphasis, it makes sense that few would come forward (Dao). It is therefore still important to take drone operator PTS seriously and to try to understand how screen-mediation augments their experience of killing.
While critics worry that warfare mediated by a screen and joystick leads to a “‘Playstation’ mentality towards killing” (Alston 25), Ender's Game presents a theory of remote-control war wherein this technological redistribution of the act of killing does not, in itself, create emotional distance or evacuate the killer’s sense of moral agency. In order to kill, Ender must be distanced from reality as well. While drone operators do not work shielded by the magic circle—and therefore do not experience the trauma of its dissolution—every day when they leave the cyborg assemblage of their work stations and rejoin their families they still have to confront themselves as individual moral agents and bear their responsibility for ending lives. In both these scenarios, a human agent’s combat trauma serves to remind us that even when their bodies are physically safe, war is hell for those who fight.
This paper has illustrated how a science fiction story can be used as an analytic lens for thinking through contemporary discourses about human-technology relationships. However, the US military is currently investing in drones that are increasingly autonomous from human operators. This redistribution of agency may reduce the incidence of PTS among operators by decreasing their role in, and therefore their sense of moral responsibility for, killing (Axe). Reducing mental illness may seem a worthwhile goal, but in a world wherein militaries distribute the agency for killing to machines in order to reduce the burden on humans, societies will have to confront the fact that combatants’ trauma cannot be a compass by which to measure the morality of wars. Too often in the US media, the primary stories Americans are told about the violence of their country’s wars are those of their own combatants—not only their deaths and physical injuries but also their suicides and PTS. To understand war in such a world, we will need new, post-humanist stories in which the cyborg assemblage, not the individual, is held accountable for killing, and morality is measured in lives taken, not rates of mental illness.
Alston, Phillip. “Report of the Special Rapporteur on Extrajudicial, Summary, or Arbitrary Executions, Addendum: Study on Targeted Killings.” United Nations Human Rights Council (2010).
Asaro, Peter M. “The Labor of Surveillance and Bureaucratized Killing: New Subjectivities of Military Drone Operators.” Social Semiotics 23.2 (2013): 196–224.
Associated Press. “Predator Pilots Suffering War Stress.” Military.com 2008.
Axe, David. “How to Prevent Drone Pilot PTSD: Blame the ’Bot.” Wired June 2012.
Bowden, Mark. “The Killing Machines: How to Think about Drones.” The Atlantic Sep. 2013.
Brown, Melissa T. Enlisting Masculinity: The Construction of Gender in US Military Recruiting Advertising during the All-Volunteer Force. London: Oxford University Press, 2012.
Bumiller, Elisabeth. “Air Force Drone Operators Report High Levels of Stress.” New York Times 18 Dec. 2011: n. pag.
Card, Orson Scott. Ender’s Game. New York: Tom Doherty Associates, 1985.
Chappelle, Wayne, and Amber Salinas. “Psychological Health Screening of Remotely Piloted Aircraft (RPA) Operators and Supporting Units.” Paper presented at the Symposium on Mental Health and Well-Being across the Military Spectrum, Bergen, Norway, 12 April 2011: 1–12.
Dao, James. “Drone Pilots Are Found to Get Stress Disorders Much as Those in Combat Do.” New York Times 22 Feb. 2013: n. pag.
Edwards, Paul N. The Closed World: Computers and the Politics of Discourse in Cold War America. Cambridge: MIT Press, 1997.
Galison, Peter. “The Ontology of the Enemy: Norbert Wiener and the Cybernetic Vision.” Critical Inquiry 21.1 (1994): 228–266.
Gray, Chris Hables. “Posthuman Soldiers in Postmodern War.” Body & Society 9.4 (2003): 215–226.
Greene, David, and Kelly McEvers. “Former Air Force Pilot Has Cautionary Tales about Drones.” National Public Radio 10 May 2013.
Grossman, David. On Killing. Revised. Boston: Back Bay Books, 2009.
Harmon, Amy. “More than Just a Game, But How Close to Reality?” New York Times 3 Apr. 2003: n. pag.
Levidow, Les, and Kevin Robins, eds. Cyborg Worlds: The Military Information Society. London: Free Association Books, 1989.
Lifton, Robert Jay. Home from the War: Vietnam Veterans: Neither Victims nor Executioners. New York: Random House, 1973.
Mead, Corey. War Play: Video Games and the Future of Armed Conflict. Boston: Houghton Mifflin Harcourt, 2013.
Orr, Jackie. Panic Diaries: A Genealogy of Panic Disorder. Durham: Duke University Press, 2006.
Ouma, J.A., W.L. Chappelle, and A. Salinas. Facets of Occupational Burnout among US Air Force Active Duty and National Guard/Reserve MQ-1 Predator and MQ-9 Reaper Operators. Air Force Research Labs Technical Report AFRL-SA-WP-TR-2011-0003. Wright-Patterson AFB, OH: Air Force Research Laboratory. 2011.
Protevi, John. “Affect, Agency and Responsibility: The Act of Killing in the Age of Cyborgs.” Phenomenology and the Cognitive Sciences 7.3 (2008): 405–413.
Salen, Katie, and Eric Zimmerman. Rules of Play: Game Design Fundamentals. Cambridge, MA: MIT Press, 2003.
Saletan, William. “Ghosts in the Machine: Do Remote-Control War Pilots Get Combat Stress?” Slate.com Aug. 2008.
Schachtman, Nathan. “Shrinks Help Drone Pilots Cope with Robo-Violence.” Wired Aug. 2008.
Silberman, Steve. “The War Room.” Wired Sep. 2004: 1–5.
Singer, P.W. Wired for War: The Robotics Revolution and Conflict in the Twenty-First Century. New York: Penguin Press, 2009.
Zucchino, David. “Drone Pilots Have Front-Row Seat on War, from Half a World Away.” Los Angeles Times 21 Feb. 2010: n. pag.