The recent episode of Black Mirror (Season 4, Episode 2), entitled Arkangel, has lots of people chirping. As someone who focuses a lot of energy on the vagaries and harms of electronic monitoring, I think much of this chirping could draw much broader connections.
Like many Black Mirror episodes, Arkangel is set in an incredibly grey, futuristic, no-name town. The triggering incident for the plot is a young mother losing her young daughter in the local park for a couple of hours. This precipitates panic, and she heads for Arkangel, a local firm that injects a device into children's heads that enables their parents to watch them 24/7. Actually, the device does more than watch them: it live-streams what the child actually sees for remote viewing by the parent on a special Arkangel tablet. The added dimension of the technology is the capacity to pixelate negative images (violence, growling dogs, foul language, etc.), so the growing child never really sees these things. Long story short: as the girl grows up, the mother relaxes, puts the tablet in the attic, and life carries on until the teen years. One night when her daughter stays out late, the mother reactivates the device and sees her daughter snorting drugs and having sex with a young man. This precipitates a total meltdown and ultimately a violent fight between mother and daughter.
Most of the commentary on this episode focuses on the implications for parent-child relations and the ways in which surveillance technology may negatively impact these dynamics, epitomized by the phenomenon of "helicopter parenting."
As with much of the analysis of surveillance, the boundaries remain fixed within the realm of white, middle-class suburban life, rather than connecting to the surveillance state, or what Hamid Khan refers to as the "stalker state." The reality of surveillance technology for vulnerable populations, particularly poor people of color, is that such devices are neither far-fetched nor are their impacts confined to family relations. Rather, existing invasive technologies already connect the "criminalized" to the broad spectrum of databases where they appear: school expulsions, family services, foster care, mental health, substance abuse, not to mention the court and prison systems. These databases ultimately deprive people of liberty, blocking them from access to employment, housing, education, and other opportunities. Tracking technology presently appears in crude forms in electronic monitoring devices, typically ankle shackles, which record a person's location in real time.
Though not yet Arkangel, these devices are extending their reach. Telmate's cellphone-based monitoring system gives supervising authorities the power to access a monitored person's online interactions, as well as the capacity to phone them and order them to turn on the camera so the supervisor can see where they are and who they are with. Marketed under the slogan DON'T GIVE THEM AN ANKLE BRACELET, GIVE THEM A HAND, the Guardian can make use of facial recognition technology to create virtual group guilt by association.
In another twist, researchers at the University of Massachusetts Lowell recently won a research grant to develop a new monitoring technology known as BEACON, or Behavioral Economics Application with Correctional Opportunities Notification.
As with Telmate, the rhetoric accompanying BEACON is that of what I call "carceral humanism," packaging punishment as if it were a social service, a "helping hand" as it were. "Think of it as a personal coach, like a weight-loss program, to keep probationers motivated and accountable," said one of the researchers in the project media release. They also say their proposed technology could identify when probationers are at risk by tracking altered behavior, for example, changes in sleep and movement patterns that might indicate substance abuse. In such situations, "Responders would then have a chance to intervene." This raises at least two issues. First, what other biometrics would be monitored besides sleep patterns, and to what end? Second, what type of response would be forthcoming? While academic researchers may idealize intervention as medical treatment or much-needed psychotherapy, the gentle hand of state intervention is not typically the response in over-policed communities of color. SWAT teams or goon squads are at least as likely a prospect.
The point is, people experience surveillance technology in different ways. For the privileged, being surveilled has often become a consumer choice they make with some knowledge of the tradeoffs. The popularity of Fitbits and other biometric-based devices, even their use by professional sporting teams, is ample evidence that many layers of the population have no problem giving up their data to the corporate cloud. But for those living in the hot spots targeted by predictive policing, Fusion Centers, and Stingrays, those who live in communities where ankle shackles are more plentiful than college degrees, Arkangel in some form will likely be watching over them soon. Five years ago I interviewed a group of youth in Los Angeles about their experience of electronic monitoring devices. I asked them where they thought this technology was headed. They all agreed without hesitation: "they're gonna stick chips in everybody." For them, Arkangel is not a fantasy or a future episode of The Twilight Zone; it's a question of when this is coming to their neighborhood. Unless, of course, more people start reading between the class and race lines of Black Mirror.