The Anxiety of Being Watched by Machines: There's an App for That
Ai Weiwei and Jacob Appelbaum, in a still from "Surveillance Machine" by Laura Poitras. Image: Madison McGaw/BFA


Modern surveillance has both artists and technologists on edge.

What does it feel like to be under surveillance? For Laura Poitras, the filmmaker who documented Edward Snowden in Citizenfour, it feels like a ringing in your ears. "Depending on the stress level it would get louder or quieter," Poitras told the data scholar Kate Crawford during the New Museum's annual "Seven on Seven" conference earlier this month. That sound—ringing—is the effect of tinnitus, but it was also a fitting sign for the anxiety that, we know now, is otherwise represented by our phones.


As if we all weren't anxious enough already, a panel of federal judges recently deemed the National Security Agency's indiscriminate collection of telephone metadata totally illegal. It's just the latest reminder of how widespread systemic surveillance and its seemingly innocuous cousin, data collection, have become. And because we can't quite know how much the machines "see," more and more we internalize the effect of being watched as if it's happening at all times, which it very well might be. At least it can feel that way.

That anxiety was a recurring theme at this year's "Seven on Seven," in which the New Museum's new media wing Rhizome pairs seven artists and seven technologists, imposing on them the reality TV-style challenge of making something in 24 hours and presenting it the next day. The theme of this year's event, the seventh in as many years, was "sympathy and disgust."

In addition to appearing in the keynote conversation, Poitras traveled to Beijing to document one of the seven pairs: the dissident artist Ai Weiwei and the independent security researcher Jacob Appelbaum, an early partner of WikiLeaks and one of the journalists and consultants who reported the disclosures by Edward Snowden.

Neither Ai nor Appelbaum was able to travel to New York for the event due to restrictions imposed by their respective governments. Because of the lengthy travel to Beijing, the pair were given an additional 24 hours to complete their project. Fusion reporter Kashmir Hill, who accompanied them to Beijing, took to calling them "the global dissident elite."


"Close your eyes," Poitras says, and asks Ai and Appelbaum to each recount their "worst experience with the state." Underneath footage of the pair preparing their project for Rhizome's challenge—they shredded once-secret NSA documents and stuffed the paper, along with mysterious SD cards, inside plush panda toys—we hear each of them disclose their own memories of interrogation.

"There's always two soldiers, military type with uniforms standing 80 centimeters away," Ai says quietly, recalling the room where he was kept for months on charges of tax evasion, a physical space he has mined in recent work.

Appelbaum describes his detention at a New York City airport in the wake of WikiLeaks' disclosures. "At one point you could hear this conversation right before the door closed. Someone said, 'so that's what a terrorist looks like these days,'" he remembers.

The two bear their encounters with power differently—Ai strikes a generally more peaceful, sagacious attitude than Appelbaum—while the banal footage of Ai's Beijing studio suggests the ubiquity of these anxious spaces.

Digital Divination: A Mystical Kind of Self-Surveillance

While the "dissident elite" represented the anxieties of surveillance in the objects they made and the recounting of their travails, some of the event's other pairs conceived of strategies for coping with the general weather of surveillance.

French artist Camille Henrot and activist and programmer Harlo Holmes looked toward timeless methods of relieving anxiety as a way of dealing with the stresses of our post-Snowden era. Taking inspiration from ancient modes of divination like the I Ching, they created an oracle-style application called "Desktopmance," which algorithmically answers a user's personal questions with a poetic response and a series of images, built from the files stored on the user's desktop.


Camille Henrot and Harlo Holmes. Image: Madison McGaw/BFA

"Desktopmance" is, in part, a replacement for typing your burning questions into a search engine like Google, where your most secret queries—"will I ever find love?" or "do I have cancer?"—are stored along with the rest of your data. The result is a scrambled, imaginative form of self-surveillance and predictive modeling, built from some of our most disparate content and metadata: a way of exorcising, perhaps, the anxiety of indecision, information overload, and hidden processes that make up life online.

Holmes explained how they built a template based on some poems they liked and then designed the app to do natural language processing on the fly, finding random words from one's desktop files to insert into the template. The result is an answer that's both lyrically opaque and eerily familiar. (Henrot and Holmes underscored their choice to keep their app offline and, befitting a divine power, to keep the code closed to scrutiny.) The divinatory method, Henrot explained, provides relief from anxiety because the answer is never straightforward. The act of interpreting is what relieves anxiety, they suggested, shifting our attention away from the burning question and toward a contemplative analysis of the situation.

Computer Visions

While Henrot and Holmes considered the ways computers might relieve some of the anxieties they generate as they gather our data, another presentation asked what it means for computers to literally watch us. Trevor Paglen, the artist who's made much out of the imagery of surveillance, and Mike Krieger, who, as co-founder of Instagram, has made much out of images, found a third collaborator along the way in artist Adam Harvey. Harvey, whose art centers on surveillance technologies and various responses to them, came to meet Paglen for lunch and ended up spending the next dozen hours helping Paglen and Krieger begin to make some sense of exactly how computers see images. They analyzed several famous images, from the "Napalm Girl" photo to Manet's "Olympia," to understand how artificial intelligence uses things like facial recognition, edge detection, and a formula for nudity that relies on a certain ratio of white skin.
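The exact nudity formula the trio examined isn't specified, but classic skin-detection heuristics work roughly as a toy sketch like this one suggests: classify each pixel by a fixed RGB rule, then flag the image if the skin fraction crosses a threshold. The rule and threshold below are illustrative assumptions, and the RGB test is a textbook heuristic that is notoriously biased toward lighter skin—the same kind of bias the team observed:

```python
def looks_like_skin(r, g, b):
    """Crude RGB skin-tone test (a common textbook heuristic, not the
    formula the team studied). Famously skews toward lighter skin."""
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b
            and (r - min(g, b)) > 15)

def skin_ratio(pixels):
    """Fraction of pixels classified as skin; `pixels` is a list of (r, g, b)."""
    if not pixels:
        return 0.0
    hits = sum(looks_like_skin(*p) for p in pixels)
    return hits / len(pixels)

def flag_as_nude(pixels, threshold=0.30):
    """Toy classifier: flag the image once skin pixels exceed the threshold."""
    return skin_ratio(pixels) >= threshold
```

A rule this blunt makes the failure modes in the article easy to see: darker skin tones simply fail the pixel test, so the people wearing them can vanish from the classifier's view entirely.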


Adam Harvey, Trevor Paglen, and Mike Krieger. Image: Madison McGaw/BFA

Paglen, Krieger, and Harvey's presentation reduced some anxiety: as it turns out, machines are actually still pretty bad at seeing. (In a telling anecdote about the limits and biases of these sophisticated algorithms, the black servant attending to Manet's model didn't even register to the computer as a person.) One tool they used, a University of Toronto project that interprets the image it sees in text, described the Mona Lisa as "a woman in a black dress taking a selfie" and the horrific "Napalm Girl" photo as "a large group of people walking around on the beach." Paglen said we should see it as a snapshot of nascent AI vision right now. Another reason for anxiety crept in, with his reminder that bombs are sometimes dropped based on such grossly imperfect technology. (While humans are still involved in executing US drone strikes, the technology for autonomous killer robots is not science fiction.)

The conference's final presentation reinforced the sense that the machines that are watching and engaging us can't see us or understand us. The artist Hannah Black and Twitter bot maker Thricedotted staged a performance in which Black attempted to have a conversation on the topic of labor and mechanization with a bot. Following each of her diatribes, in which she lyrically addressed urgent questions about technology, intimacy, and violence, she'd ask, "Do you understand?" "No," the bot would answer.

Hannah Black and Thricedotted. Image: Madison McGaw/BFA

"In theory," Hannah told the bot, "there is something utopian about the NSA's total recording of all language. Of every word said in love or anger or boredom… Reading and misreading are among the first operations of love." The robot didn't get it.

According to last week's ruling, Black is not far off when she describes this "total recording." The 97-page decision details the NSA's metadata collection program: "The orders at issue here contain no such limits. The metadata concerning every telephone call made or received in the United States using the services of the recipient service provider are demanded, for an indefinite period extending into the future." Every word, forever. The machines don't understand the implications of that, and the humans are still struggling to as well.