The pandemic in the United States has allowed for an experiment, at scale, in how a visually obsessed culture orients itself, frantically, toward the unseen—the virus, its transmission. Capture of the bodies suspected to be infected, or about to be infected, by this unseen, has intensified. The theater of security and quarantine has made legible the precise kind of psychological space that surveillance depends on. Surveillance depends on an initial buy-in, whether begrudging or unwitting, of millions of users and subjects of technology. But the buy-in is entrenched and secured beyond vision. It continues through rhetoric, through persuasion, through training in the play of unseeing and seeing, such that each of us using technology internalizes the logics of capture.
Over the past decade, both critical activism and legal and academic advocacy have helped cultivate widespread awareness of the trade-offs we make in using consumer technology. In fact, there is more widely available history, research, and ongoing theorizing than ever before on surveillance, on the calm design of consumer technology, and on the history of dark patterns. “Users” seem to largely understand how they—we—relinquish our civil liberties and privacy for the convenience and ease of elite design. Business school scholars like Shoshana Zuboff publish international bestsellers on the foundations and strategies of companies pursuing surveillance capitalism, and ploddingly examine each new buttress in their architecture.1 Interface designers and programmers are more wary of their own role. Tech activists and anti-spying coalitions educate teens about the surveillance state. As users, we seem to share an intellectual understanding of being surveilled. Users, citizens, accept corporate and governmental surveillance in exchange for use of a host of platforms, infrastructures, and software tools. Users adjust to a baseline knowledge of how their profiles and movements are tracked, their data lifted.
Less evident are the small shifts in public rhetoric, which continue to ensure collective buy-in for surveillance. Through these changes in narrative, which ask that we surveil ourselves and each other, we learn to inhabit the role of the surveilling eye. We sympathize with the surveillant and fail to interrupt our capture of the surveilled. We begin to relate to each other through the act of policing language, expression, bodily movement, intention, motivation, and presence. Taking the world as ours to consume, define, hedge, label, watch, and rewatch, in endless loops, we become police.
As forms of colonizing, imperialist seeing continue online, one begins to internalize the logics of capture. Everyone that falls across one’s screen belongs to one, and every movement is one’s to possess. Even within the growing spectacle of contactless pickup and no-touch sociality, in which one avoids overt touch or bodily intimacy, and produces visual evidence of this avoidance, the more subtle logics of capture persist.
Latent elements of the surveillance state have been activated and expanded rapidly in the current economic, epidemiological, and bio-political crisis. Particularly oppressive surveillance has targeted essential workers, who are especially vulnerable, living at the intersection of manifold socioeconomic, gender, and racial inequities. In April and May of 2020, police across the country echoed the spirit of punitive eighteenth-century “lantern laws”; they detained and arrested individuals going to work, to second and third jobs, or home to nap in between, for violating “curfew,” despite their carrying papers of excuse under an arbitrary, overnight declaration. In June of 2020, police departments across the country scanned digital images of protestors, individuals critical of the police, and then hunted for their faces on social media to track them down. Journalists from the New York Times to popular podcasts reported and report such insidious efforts breathlessly, as though a massive surveillance architecture has not unfolded around us, and in our hands, for over a decade.
Each event is a new head of a growing hydra. A crisis reveals hidden workings of this Leviathan, a many-eyed, glittering apparatus that flashes in full view, before sinking below the surface of the water. But for the theater of capture to be enacted, a groundwork had to be dug very methodically and slowly, its tiles placed and laid in iterations. For the apparatus of surveillance to become easily acknowledged, visible to us, for it to really take root in our enforced distance from one another, it had to become part of our own, active seeing. It is no longer hidden solely in third-party apps, in black-boxed machine learning systems reading our images for life signatures. We are daily, steadily inculcated, through narrative and media consumption, through public health initiatives and tool updates, to internalize the logics of surveillance, so that we become surveillant of a bio-political landscape that consists entirely of at-risk bodies. In a form of scrying, we mine our screen’s display of crowds near and far, sussing out, discerning their hidden intentions.
This article’s title, “Where Eyes Can’t Follow: Internalizing the Logics of Capture,” can be further appended with: “as a Dream of Sovereigns.” Sovereigns dream of capture, of the mechanics of capture being hidden. They dream of this hidden capture becoming ritualized, internalized, done out of the sight of those who would protest. I take up as my focus the many ways that surveillance logics are internalized by us, watching spectacular life unfold, consuming feeds, images, and media about violence, often in real time, online. Caught within the theater of algorithmic capture, itself fueled by theaters of true, active, physical capture, we risk experiencing a slow conversion to the embrace of the logics of policing. Seduced by the power of identifying with its logics, we naturalize them, and then lose sight of our own tendencies to police.
I am inspired by thinkers like Simone Browne, Jackie Wang, and Safiya Noble, who level their criticism at the design of the carceral state, and the technologies that support it. In their wake, I take up how foundational methods of algorithmic capture are to society. Earlier forms of coding, of measuring movement, through the census, lantern laws, and branding, have transformed into current algorithmic supremacy, which fuels predictive capture, entrenching injustice at scales we can barely conceive of.
Mapping Psychological Space and Groundwork: Setting the Stage
As the U.S. first grappled at scale with the emergency of the pandemic, a shift of the “burden” of policing, from the police to citizens themselves, started to take place. Communities were harnessed, overnight, to police themselves, and other communities, using technology already at hand. Such practices are supported and encouraged by the mediation of images through surveillant digital infrastructure. I map how we become part of the disciplinary eye, take our place in concentric rings of internalized surveillance, turned into lifestyle.
The digital surveillance infrastructure we work within was always ready for self-surveillance through apps, platforms, and access to databases. It was always driven by an ethic of predictive community policing that seeks out and marks potential “risks.” And this structure was primed perfectly for the mass internalizing of the logics of capture, policing, and oblique social management. Threats to the public good, like a virus, are easily solved by the “affordances” of technological solutionism.
As a number of writers, such as Kim Stanley Robinson, have written, the pandemic’s impact has been equally “abstract and internal. It was [and is] a change in the way we were looking at things” before.2 This internal, abstract change is my subject. As the social pandemic began to unravel alongside the actual one, public focus shifted to a view from overhead, to consider the designs of social systems and resource distribution. Pundits discussed urban design as it perpetuates inequality, through racist housing laws, redlining, zoning, and anti-houseless public architecture. The widely uneven infrastructures of public support surfaced, and a lack of preparedness left citizens further dependent on devices—their pervasive infrastructures of techno-surveillance—for guidance, strategies, information. Each person, left to themself, seized on a digital semblance of stability, reliant on its offerings.
The conscientious, cognitive laborer, in enforced solitude, scrolls her feeds. She zooms in on virus simulations, does her own amateur epidemiological modeling. She researches Sweden and herd immunity, flirts with conspiracy, and gets more and more frustrated about her isolation as a state directly linked to the select actions of individual strangers. Who are these people making it harder for her to leave? From her couch, tucked into a blanket, she scans images of poor people on the subway, crammed shoulder to shoulder, and shudders. People in photos start to exist for her within the language and metrics of the virus; are these people sick, are they immune? Are they asymptomatic or are they potentially spreading their disease everywhere?3
Keeping abreast of the “rules” required near-constant surveilling of media and surveillance of self. A week offline—assuming one had the internet—could mean missing curfew, or not knowing public sentiment on masks had changed in one’s community and state. Here, an invocation of “We” seems tempting, and would work if we existed within a clear bio-political order, in which all people lived and experienced the world in predictable ways. “We” are instructed to wear masks. “We” are told how we all should navigate urban spaces, approach each other, and stay at bay. “We” are told to strive to be contactless, to avoid touch. There is a dissolution of “We” when the bio-political order is unclear. Simulations, algorithms, and predictions all change, and so create rapidly changing rules. We all certainly heard a lot of chiding, mostly of the many who flout rules made in this cybernetic stronghold, even as those rules change daily. This is not an expression of sympathy for those who actively put others at risk, but a note on the natural disorientation caused by rules handed down by public health “interpreters” of rough, emerging, and competing simulations of a poorly understood, unseen virus.4
The theater of pandemic lit up with fear, anger, and critique of those who do not practice social distancing. Police without masks prompted a referendum on police abuses and disdain for the public, especially the overtaxed poor.5 From the vantage point of the interior, it became difficult to know precisely what was happening outside of news and social media. Every action and event came mediated. It was, and is, up to the reader to sort out what is real and what is exaggerated, to the best of their ability within algorithmic bubbles. Small news items are amplified, concentrated, and made ubiquitous within the span of an hour.
And outside, in the confusion of received information, citizens began to monitor others, on alert for violations of social distance. “We” quickly enforced a kind of gaze on each other, a kind of gentle surveillance, a relationship of power that immediately posits the viewer as correct. In the emptied streets, the police prowled with new reasons to enact violence. News of New York City police’s choices dominated headlines as the city became a pandemic epicenter. Areas that were already over-policed experienced more policing. Bias against communities of color intensified. Little empathy, more aggression. As NYC police officer Jorge Trujillo described early days of the pandemic, “So on top of the unprecedented problem that we’re dealing with, the police can come and escalate and make things worse.”6 Many pointed out the inflexibility of policing, how the absence of oversight meant little to no change in the abusive tactics of the police. I was struck by how deeply important citizen countersurveillance, the vitality of it as a tactic, would become. Who would see back?
In May of 2020, protests erupted in dozens of cities across America in response to the viral, filmed extrajudicial murder of George Floyd at the hands of Minneapolis police officer Derek Chauvin. In late May, Minnesota Public Safety Commissioner John Harrington announced that protest arrestees would be “contact traced” to determine their associations, political affiliation, levels of organization, platforms, to then build “an information network.”7 That “contact tracing,” a concept used for passively finding sickness, would then be used to trace “unseen” sentiments, like anti-police activism, anti-fascism, or anti-racism, and to criminalize them much as sickness is, was not an ambiguous move. Keen technology critics like Adrian Chen swiftly noted how the “war on COVID is normalizing surveillance in a bad way.”8 The language tracked.
Kentaro Toyama, a surveillance expert at the University of Michigan, notes that “the normal ethics of surveillance might not apply” in a time of crisis. “Normal ethics” here means a shared understanding grounded in civil liberties, in which individual privacy should be protected, and surveillance is seen as primed for misuse.9
Theaters of Policing: Spectacular Aims
Surveillance serves the American state, but it also serves civilians whose interest is in maintaining the violent, bankrupting, extractive economic system the American state relies on. Their lives, families, and communities are only “safe” if the state is steadied under law and order. Being part of the eye, then, having the ability to capture and evade, is a distinct power within this regime. Being able to see “danger,” name it, and call for punishment of it, is rewarded. The spectacle of the weak, the suffering, and the aggrieved fuels an algorithmic economy that cycles, feeds on, and amplifies rage, grief, and other strong emotions.
Pandemic theater has been not only a great amplifier, but a wide-scale simulation of logics of capture, seeing, and unseeing. Never in history have we had access to so much information about the iterative, compounding injustices in the world; further, never in history have we had the ability to wall ourselves off from them so smoothly. In isolation, the protection of the body—the healthy body—becomes a site of contestation. And in sovereign systems comparable to America’s, with weak or absent social welfare systems, the individual must carry the burden of her self-healing. It is your individual responsibility to keep your health intact, and so, to map and chart possible threats around you from your terminal. Every healthy body must protect itself and has the right to do so with any tools available. Quarantine has become a mark of the healthy in the absence of good data. And so, while the pandemic has begun long-overdue public discussions about structural inequality, the incompetence with which it has been handled has given free rein for many to prioritize personal safety and sovereignty over the rights of others. Policing of the most marginalized intensifies, out of fear of death.
The management of sickness and plague is where our commitments to relative senses of community, if they existed before, are tested. Paul B. Preciado describes the pandemic as a “great laborator[y] of social innovation, the occasion for the large-scale reconfiguration of body procedures and technologies of power,” along with a “new way of understanding sovereignty.”10 Sovereignty is not just waged but made concrete through networks, through gaming of algorithmic calculus. Some are deemed outside the realm of community, dispensable by definition. Understanding how these definitions are augmented, concentrated, and distributed through algorithmic media is critical, as we are experiencing the pandemic through the framework of computation.
In domestic isolation, videos and streams and feeds pour in. People inside, cut off, turn even further inward. Online, fake news proliferates through armies of bots. The play unfolds. A cast of characters emerges: the unmasked suburban libertarian; the anti-vaxxer; the working, undocumented immigrant in farm fields; the essential worker, the hero-nurse, the ones clapping at dusk; the zealous moralizer. Each character type, each position, becomes part of the story of the bio-political state. Surveilling others’ crises, others’ grief, and the narratives of pain and loss, death, becomes a form of ritual. Through the algorithmic and digital media economy, outrageous narratives, each an “example” of political positions, become naturalized.
Through this piece, I want to argue for capture as a pervasive, seductive cognitive tendency, a practice that is honed through media consumption. The isolated eagerly take on this work out of a sense of urgency, self-protection, and survival. Capture becomes a matter of life and death. This tendency toward a logic of capture—grown, cultivated through uninterrupted consumption of mediated, violent acts of capture—becomes itself a lifestyle, an undertaking with civic weight and import. Seeking answers in the pandemic within screens, one finds unlimited freedom to surveil, download, and zoom in on the bodies of others with a clarity that comes from a many-tiered remove. This tendency leans toward an unreflective adoption of an authoritarian stance toward streams of videos and images, and the objects and bodies represented therein.
The viewer and feed are co-constitutive. The viewer flows through her feed, moving from inhabiting one surveilling eye to the next, scanning, watching, aggressively framing, moving right along with the feed’s flow of violence. This digital movement with algorithmic spectacle compounds and entrenches one’s political position, and affirms it. Though a critical reader of digital media might suggest everyone “reflect” and always make space for critical consumption, it does not seem tenable, at scale, to mobilize the mind and spirit to thoughtfully read every video and image in the feed.
Further, we read images and videos of violence, violently, and do so by technological design. Machine learning-driven visual consumption has its own spectacular time. It depends on capture of bodies and objects to function; its time is collapsed, fueled through sorting, bucketing, and the separation of protestor, vigilante, worker, hero, scourge.
As spectacular time unfolds through the media, we see historical structures of relating, capturing, naming, all bleed through.11 One is cognitively disposed to recognize, or mis-recognize, spectacular violence as violence. Ritualized forms and patterns of viewing violent capture—the police capturing, a vigilante, capturing—as an event that happens in a blur, far away, or far below, become commodified. Inevitably, when our media time is dominated by images of capture, the spectacle of consuming violence becomes, itself, a subject of focus.
Sociologist and scholar Zeynep Tufekci has written at length about the dangers of “pandemic theater,” in which security measures and police harassment are likely amplified in any vacuum of information.12 As was well covered in New York, police harassed, fined, and arrested black and brown citizens on camera for not socially distancing, while their cohorts handed out virus “protection” kits of masks and wipes, with care, to sunbathers in Domino and McCarren Parks.13 Mimicking extant “stop and frisk” practices meant that 35 of 40 people arrested in New York City were black.14 Similarly, in many countries, police data for COVID-19 arrests and fines followed predictable, comparable patterns. In Sydney, Australia, for instance, the highest rates of social-distancing infringements were recorded among Indigenous people, despite their making up only .04% of the population, reflecting systemic bias and over-policing of Indigenous locals.
Algorithmic and digital media economies deputize civilians to be purveyors of violent spectacle. And these economies mask the ways in which being a spectator makes one more vulnerable to capture, to disciplinary tracking, targeting, and training. In 1998, Wendy Hui Kyong Chun richly theorized the cyberflâneur who navigates online space as a kind of “detached observer who remain[s] hidden from the world while at the center,” invested in an illusion of control. Not much has changed within today’s algorithmically driven digital media landscape. Today’s cyberflâneur maps seamlessly onto the captor, the surveilling state. She is instantiated and trained into a position, any stance of detachment and distance impossible. The consumer of spectacular algorithmic time and life is shaped by her relationship to the spectacle, even though she might imagine herself as merely a gawker, a passive seer. In this cycle, violent spectatorship is naturalized, exacerbated.
In 2020, in the new age of the cyberflâneur, the algorithmic citizen might still believe herself in a position to judge, assess, analyze, close-read with objectivity. She might fancy herself a critical reader, capable, with the tools of visual analysis, of understanding the story of any image. What if this critical position is totally collapsed, compromised, hacked? The reality of how computational systems today actively shape, crop, highlight, direct, and nudge image readings makes a critical close-reading very difficult. The algorithmic spectacle of crisis embeds the tussle of divergent, wildly opposed ideologies into its framing, harnessing more watchers. I read a static news headline, topping an image of a nurse in a mask, standing proudly in an intersection before protestors. I then watch her in a video; she pushes and is pushed by a woman draped in an American flag. As a viewer, I immediately sympathize with the nurse. I identify with this overworked, underpaid woman who looks like me. I cannot find a scrap of critical generosity for the woman in the flag, even as I learn she has gone bankrupt, as her small business closed. In eight seconds, I am pulled into bad reading, conclusive judgments.
In introducing Chun’s essay, Nicholas Mirzoeff notes how Chun’s cyberflâneur is further a kind of “rubbernecker,” who lightly colonizes with her gaze, takes ownership of the subjects in the image. “For all the futuristic talk associated with the internet,” Mirzoeff writes, “the dominant models of internet use rely on nineteenth-century ideas of the colonizing subject, and skate over the implications of the largely independent ways computers actually exchange data.”15 Today, each spectator is still “also a spectacle, given that everyone automatically produces traces”16—literal traces, data generated with each click, each site visited, each uploaded photo. She leaves her tracks everywhere online, and she is also colonized in the process of viewing.
From Seeing With the State → Learning to Look Again
My goal here is a quick portrait of how users of technology are being encouraged to adopt a logic of seeing with the state, and through its eyes, offline. Algorithmically guided scanning and capture becomes a method of scanning, reading, and capture in the real world. Increased surveillance through streams and feeds can accelerate structures of literal capture. The technical space embeds and amplifies the most authoritarian tendencies that one might harbor. An inculcation into the logics of platforms exacerbates a tendency to surveil, to identify with the state, with police, to identify with the entity with outsized power to see. The inconceivable becomes normalized. And considered, incisive critique of the grounding conditions that allow this distributed authoritarian lens becomes harder.
This seeing with the state is both the product of a nameable, immediate crisis, like the pandemic, and an expression of an ongoing, slow training in our collective sympathies with the logics of policing, which we privatize, internalize, and take on ourselves as duty. Some public response has clamored for a status quo to be restored. A return to normal. Overnight, a new culture of informants has emerged, informing on social-distance violators. Apps like Neighbors and Nextdoor are used to deliver notes to the public, and the police, about crowds and gatherings of suspicious people. These rough networks of ambient capture and potential punishment, which enforce pre-existing power relationships (the healthy versus the sick, the protected and socially distanced versus the precariously housed, more at-risk), reveal a relationship to discipline that takes place outside the body but now depends on digital capture. Some remain untouched, always the center of the story, while other bodies are doomed, subject to disease, death—a story of foretold dispensability, now mediated.
In theorizing the pandemic, Preciado reminds us of Foucault’s useful framework of biopolitics as one in which “the techniques of biopolitical government spread as a network of power that goes beyond the juridical spheres to become a horizontal, tentacular force, traversing the entire territory of lived experience and penetrating each individual body.”17 How might we understand this tentacular force, shaping the state of exception ushered in by COVID-19? How does the state of exception allow for individual surveillance in isolation to flourish? If we understand technology and the digital as not just mirroring, but actively framing and magnifying existing structural narratives, how do we square the centrality of technology and smart devices as portals to services, money, and work, with the specific ideological worldviews they espouse? Further, how might critical and academic studies move much more swiftly with these tides of socio-technical change? While it is unorthodox to write about such world-historical events in real time, my analysis of the past three months is rooted in identifying patterns and models central to emerging surveillance studies, criticism of software ideology, and the philosophy of cybernetics. Theorists need to move with the changing tactics of surveillance on all the fronts that it moves, so users, consumers, can actively interrogate their role and responsibility.
March 2020 to June 2020: it took four months to transition from a crisis fueled by fear of viral infection to a secondary crisis of “necessary” surveillance, fueled by fear of people protesting police violence, by the state’s fear of Black Americans as a political force, by a fear of “antifa.” And a crisis of countersurveillance, fueled by citizen fear of police brutality and, for many, of white supremacy. It all depends on how you are conditioned to see.
Critiques of vision and its role as the ultimate conduit of racial surveillance presuppose that one is able to see the surveilling mechanism, the apparatus. However, just as race exceeds the visible to denote the invisible, so too does racialized surveillance function along unseen byways. Preciado, for instance, describes the “subaltern vertical workers, racialized, and feminized bodies” as they are “condemned” to work outside, over-policed. In this work they are also, perversely, unseen.18 I’d posit that the work of moving away from algorithmic surveillance and its unseeing is to first imagine, and then articulate, all the different modes in which feminized and racialized bodies are unseen: through shared, collective imperatives to narrate, honor, and elevate unseen experience; to navigate the world beyond dominant modes of vision; to think along vectors of risk, safety, erasure, and power.
Surveillance has many modes of unseeing, because seeing involves choices to unsee, or not see. Clear surveillance can be ignored or not seen. Or, the surveilling apparatus goes offline. What the surveilling eye chooses not to see can be both a death sentence and a lost opportunity for justice. Police shut off body cameras before enacting extrajudicial murder. In June of 2020, police brutalized protestors with tear gas and rubber bullets, riot gear shields, tasers, and batons, before cameras, rolling, in clear view of the world. But the demands of protestors were as much about police violence past and ongoing as about the abuse that happens in prisons, and behind closed doors, domestic and institutional. The violence that is seen is the exception; unseen, systemic abuses are now, wonderfully, at the fore of collective consciousness. What happens in the absence of footage fills the shared imagination. Activists and community organizers share statistics on missing women, on the missing dead of Black and immigrant communities. We are asked to imagine all the violence people are subject to, that goes beyond the sight of the world. The illegible violence. Without citizens to track, name, film, and complicate the state seeing apparatus, one can only imagine what is not filmed and shown to the world.
Activists have raised widespread awareness that increased countersurveillance, whether by civilians working individually in protests or in groups, helps document abusive, terrorizing acts, the murders by police. It remains to be seen whether increased countersurveillance can prevent deaths and harm, or does, instead, the equally important work of broadcasting nationally and internationally the fact of violence, which would otherwise be buried, hidden, or lost. Or, whether the evidence of countersurveillance only feeds into the algorithmic thirst for capture of violence.
In the U.S., from state to state, we have moved at a snail’s pace toward practical, logical measures that would help develop a vaccine, manage social spread, and facilitate access to care. Instead, we have a swift activation of corporate and state control of medical data without firm plans or insight on how this data will or won’t be folded into extant forms of algorithmic oppression, which punish one in insurance claims, loan applications, and other opportunities based on one’s personal metadata. All these active technological interventions are part of the tentacular spread: the ambient style of mental capture encouraged through algorithmic media, the normalization of tracing presented as necessary for public health. All the information we receive about COVID-19 is technologically mediated through computational and statistical simulations, received as information from above.
Within these spectacular, swift activations of surveillance infrastructure, it is critical to also recognize how surveillance is slow, how it is introduced in tiny waves, over years. This understanding helps us reframe debates about technological surveillance as more than a matter of individual choice and civil liberties alone. In such advocacy, being surveilled or not is presented as a clear choice, one that can be attained through just enough civic engagement, resistance, and rejection. The pandemic, as Elvia Wilk describes, is a form of “slow violence, resulting not only from sudden, invasive ‘foreign,’ nonhuman threats, but also from ongoing, pervasive, systemic power imbalances inside and outside the arbitrary borders we draw around places, people, and concepts.”19 And within this negotiation of power imbalance, surveillance architectures are the ultimate example of a slow violence which “is hard to identify, hard to describe, and hard to resist.”20 Understanding how we are surveilled requires that we recognize our own practices of surveillance, and how we have accepted this role over time. The debate over surveillance demands that we examine our motives, that we interrupt our ownership.
De-Normalizing the Trace
Two years ago, living in Detroit’s North End, I wrote about the psychological effect of Project Green Light’s live-feed cameras on the sidewalks and buildings of Detroit, as they introduced a new type of paranoia about the machine eye reading one badly, intervening without one’s knowledge. Indeed, this past month, Project Green Light is, as many speculated, being activated for contact tracing to enforce social distancing.21 The logic of surveillance is never surprising: opening the door to surveillance ensures more surveillance. A slow movement is made from the cusp of possibility (a confusion about what such cameras will be used for) to active, hunting surveillance—an eye that follows, that punishes, that notes deviation. Increased policing with a layer of specialized cameras is now justified as essential for keeping the collective safe. Black neighborhoods in Detroit like the North End have been hit the hardest, making the city a hot spot for COVID-19.22
By normalizing surveillance of one another and of communities for health reasons, we have experienced a truly unprecedented turn: a slide in what counts as acceptable public surveillance, as enforced through consumer technologies. A new chime, a new interface, a new system of tracking.23 A day before the protests in Minneapolis took form, Apple joined with Google to announce a new system update introducing technical tools to “help combat the virus and save lives.” They insisted that user privacy and security were central—with new framework APIs, cryptography, and Bluetooth tracing protocols woven into phone updates.24 These “exposure notifications” would, apparently, be made on millions of phones to “aid in the fight against the pandemic.” They would be made “available to states’ public health agencies, and governments” to build a host of virus-tracing apps that would help one know whether one had had contact with an infected, positive-testing person.25 As of late May, states like North Dakota and South Carolina had already signed up to roll it out statewide.26 By August, few states had rolled out “Covid Watch”; as students geared up to return to their “hybrid” campuses, states anticipated better data from the app.27
Closing the loop of algorithmic supremacy, we see and sort the world algorithmically and so become part of the sovereign’s dream. Machine vision dominates; the cameras, computers, and screens that are eyeless and sightless, that see without eyes, reify and enforce an algorithmic seeing, processing, and sorting within us. Algorithms, so often described within technological criticism as “not neutral” (meaning man-made, imbued with human values, human-crafted from the dataset up), have their own form of seeing, sorting, and understanding. We are inculcated and drawn into their logics of capture without end. The apparatus of surveillance becomes distributed, an apparatus we carry within us.
Preciado suggests we need only change “the relationship between our bodies and bio-vigilant machines of biocontrol”—our phones, the internet, our computers—by “altering” them, or making for a “re-appropriation of bio-political techniques and their pharmacopornographic devices.”28 It seems he is suggesting a kind of hacking or intervention. I appreciate this throwback impulse to old-school activism. But as we have discussed at length, because the “de-collectivization and tele-control” of online work catalyzes so much internally, the work of undoing surveillance must also take place in our minds. The tool is but one wave; the brick wall of supremacy that goes hand in hand with capture is far harder to hack.
We can discuss mismanagement of the crisis as a type of open door to technocratic mass control, yes. But as the medical surveillance apparatus activates at scale, as policing intensifies or is privatized, we must be careful not to frame each slide as just another expansion, a new arm or tentacle of the surveillance state, but instead as a historical continuation of noting, tracking, and marking bodies with unseen “pre-existing conditions” as reasons to be abandoned by the state, diminished, and discarded.
We should remember the eagerness our communities displayed in this crisis to trade civic freedoms for more phone surveillance, more police. We should guard against the casually dangerous impulse to embrace tracking and tracing of who is inside or outside, and reserve our energy and critique for systems, for governments that were wholly unprepared. Can we name this widespread desire for techno-authoritarian oversight, the scolds hoping for more police, more photos, more tracing? How much space will we leave for a serious self-critique of the comforts afforded by our relative positions? How much space in cultural discourse do we make for assessing our role in continuing state surveillance, in expressing its logics? At this moment, calls to redesign or redefine surveillance—in some cases, embracing it as a potential good, or advocating for more “trained” systems for deeper tracking of health—ignore how the current infrastructure of surveillance is working perfectly, just as designed. Surveillance depends on people in power identifying with the police.
Crisis, the state of exception, and the protests that followed all held and hold radical potential for a methodical revision of the history of this moment. If the writing of history is a site of struggle, and especially so within sites of the digital, then the present and future of surveillance must account for the collective, internalized efforts we take to surveil each other, algorithmically and in the flesh. To recognize the individual walking across our feeds is not to account for the absence of programs of security, justice, and restorative care, or for the poor systemic distribution of resources. Our insecurity about them comes from a lack of safety; our desire to police comes from a lack of imagination about safety not predicated on policing. Turning away from each other, we can turn back to the state.
- Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York: Public Affairs, 2019). I draw on Zuboff’s framework and investigation lightly throughout. Her breakdown of Google as a case study provides keen insight into the methods of obfuscation, opacity, and law-breaking that tech companies undertake to gather the data and information of their users. But we might also note that Zuboff’s methods keep her from arguing that the economic system that allows Google to thrive should not exist at all. Almost never does the ethical tragedy of wholesale abuse and theft of citizen data that she outlines in detail lead her to cast aspersions on capitalism, or to examine how surveillance is historically endemic to, and essential for, capitalism’s function. Further, it is critical to frame the internet’s evolution into the ultimate surveillance tool as more than an assemblage of case studies of a few powerful protagonists who shaped platforms.[↩︎]
- Kim Stanley Robinson, “The Coronavirus Is Rewriting Our Imaginations,” New Yorker, May 1, 2020, https://www.newyorker.com/culture/annals-of-inquiry/the-coronavirus-and-our-future.[↩︎]
- The early pandemic will be narrated as a suspension of time in which the virus was only seeable once it was historical; it could not be seen until it registered in the body.[↩︎]
- One outcome of the pandemic, mediated in this way, could be that more people will inadvertently train in how models and simulations produce reality, and further, in interpreting their inputs, their assumptions, their lack of clarity and points of necessary revision.[↩︎]
- Christopher Robbins, “NYPD Makes Arrests For Social Distance Violations As More Officers Call Out Sick,” Gothamist, April 3, 2020, https://gothamist.com/news/nypd-makes-arrests-social-distance-violations-more-officers-call-out-sick.[↩︎]
- Alice Speri, “NYPD’s Aggressive Policing Risks Spreading the Coronavirus,” The Intercept, April 3, 2020, https://theintercept.com/2020/04/03/nypd-social-distancing-arrests-coronavirus/.[↩︎]
- NBC News, Twitter, May 30, 2020, https://twitter.com/NBCNews/status/1266758240018276352.[↩︎]
- Adrian Chen, Twitter, May 30, 2020, https://twitter.com/adrianchen/status/1266859149612072960.[↩︎]
- Toyama spoke of the situation in Kelly House, “Violating Michigan social distancing orders? Big Brother may be watching,” Bridge, May 5, 2020, https://www.bridgemi.com/michigan-government/violating-michigan-social-distancing-orders-big-brother-may-be-watching.[↩︎]
- Paul B. Preciado, “Learning From the Virus,” Artforum (May/June 2020), https://www.artforum.com/print/202005/paul-b-preciado-82823. Paul Preciado’s text is one of the few strong theoretical analyses of the politics and technological dimensions of this pandemic within the history of past pandemics.[↩︎]
- Guy Debord, “Spectacular Time,” in The Society of the Spectacle (New York: Zone Books, 1995). Guy Debord described spectacular time as the time spent consuming images and, in the broader sense, “as image of the consumption of time,” of vacations and time portrayed, “like all spectacular commodities, at a distance.”[↩︎]
- Zeynep Tufekci, “Keep the Parks Open,” The Atlantic, April 7, 2020, https://www.theatlantic.com/health/archive/2020/04/closing-parks-ineffective-pandemic-theater/609580/.[↩︎]
- Ron Lee, “How New Crowd Controls at Some City Parks Worked Out this Weekend,” ny1.com, May 11, 2020, https://www.ny1.com/nyc/all-boroughs/news/2020/05/11/new-social-distance-enforcement-at-city-parks.[↩︎]
- Ashley Southall, “Scrutiny of Social Distance Policing as 35 of 40 Arrested Are Black,” New York Times, May 29, 2020, https://www.nytimes.com/2020/05/07/nyregion/nypd-social-distancing-race-coronavirus.html.[↩︎]
- Nicholas Mirzoeff, The Visual Culture Reader (New York: Routledge, 1998), 166.[↩︎]
- Wendy Hui Kyong Chun, “Othering Space” in The Visual Culture Reader, ed. Nicholas Mirzoeff (New York: Routledge,1998), 244.[↩︎]
- Ibid. Preciado takes time to walk us through Foucault’s conception of biopolitics as foremost a way “to speak of the relationship that power establishes with the social body in modernity. Foucault described the transition from what he calls a sovereign society, in which sovereignty is defined in terms of commanding the ritualization of death, to a ‘disciplinary society,’ which oversees and maximizes the life of populations as a function of national interest.” Preciado asks us to consider how discipline and punishment are enacted in the anesthetized technological theater of quarantine.[↩︎]
- Elvia Wilk, “What’s Happening? Or: How to name a disaster,” Bookforum (Summer 2020), https://www.bookforum.com/print/2702/what-s-happening-24019.[↩︎]
- Ibid. [↩︎]
- Kelly House,“Violating Michigan social distancing orders? Big Brother may be watching,” Bridge, May 5, 2020, https://www.bridgemi.com/michigan-government/violating-michigan-social-distancing-orders-big-brother-may-be-watching.[↩︎]
- Makada Henry-Nickie and John Hudak, “Social Distancing in Black and white neighborhoods in Detroit,” Brookings Institution, May 19, 2020, https://www.brookings.edu/blog/fixgov/2020/05/19/social-distancing-in-black-and-white-neighborhoods-in-detroit-a-data-driven-look-at-vulnerable-communities/.[↩︎]
- As Wendy Chun describes masterfully in Updating to Remain the Same, software updates introduce and obfuscate their content to such a degree, through insidious and opaque dark patterns, that users rarely notice, read, or have the space to analyze what is being introduced. Over time, this relationship to software, in which users are actively incentivized to scroll past, has made the force of the click and swipe, the need to keep one’s phone moving, a matter of powerful design working as intended.[↩︎]
- “Privacy-Preserving Contact Tracing,” Apple and Google, https://www.apple.com/covid19/contacttracing.[↩︎]
- “Exposure Notification API launches to support public health agencies,” Google, https://blog.google/inside-google/company-announcements/apple-google-exposure-notification-api-launches/.[↩︎]
- Kif Leswing, “Three states will use Apple-Google contact tracing technology for virus tracking apps,” cnbc.com, May 20, 2020, https://www.cnbc.com/2020/05/20/three-states-commit-to-apple-google-technology-for-virus-tracking-apps.html.[↩︎]
- Mohana Ravindranath and Amanda Eisenberg, “Contact Tracing apps have been a bust,” Politico, August 19, 2020, https://www.politico.com/news/2020/08/19/contact-tracing-apps-have-been-a-bust-states-bet-college-kids-can-change-that-398701.[↩︎]
- Preciado, “Learning From the Virus.”[↩︎]