16 October 2019

Trevor Paglen: the artist who created ImageNet Roulette thinks AI is taking us backwards

For his latest exhibition at the Barbican, the US artist spent years plumbing the depths of a uniquely disturbing database. 

By Hettie O'Brien

When ImageNet Roulette first appeared online, people posted pictures of their faces framed by thin neon-green squares, accompanied by words: “swot, nerd, nun, hatmaker”. Scrolling down my newsfeed, I saw what happened when those with darker skin uploaded their photos: “negroid, black person, wrongdoer”. The tool, developed by the US artist Trevor Paglen and researcher Kate Crawford, exposed the labels inside ImageNet, a database widely used to train artificial intelligence, and showed how algorithms classify our images using racist tropes.

ImageNet is a vast collection of real pictures used to train machine learning systems (the process commonly known as AI). It houses images for a thousand nouns – apples, oranges, skiing, limousine, Sussex Spaniel – and for people: a man in a suit is an “entrepreneur”; a woman in a bikini, smiling, is a “slattern, slut”. Its combinations of words and pictures often feel like the crowded insides of a feverish brain.

At the Curve gallery in the Barbican, London, Paglen, 45, has pinned thousands of pictures taken from ImageNet to the wall, classified according to its schema. He starts with an apple, a deliberately benign object, represented by the numerous apples and apple-like objects that an algorithm has been trained to identify. The categories quickly morph and collide: “Investor” shows a group of suited men; “Picker” groups rural figures with fruit crops and straw baskets; “Crazy” depicts humans through a reductive set of horror-movie tropes.

Paglen describes himself as a landscape artist – but he bears little relation to Turner or Constable. His landscapes stretch to the edges of the internet and the bottom of the sea. Over the last 20 years, Paglen has dug into the political infrastructures that govern our lives, photographing the offshore prisons pivotal to the “war on terror”, capturing the cables used by the US government to spy on the web and documenting the code names of NSA and GCHQ surveillance programmes. Before Google Maps was invented, he was scouring maps in search of places the state didn’t want you to see.

It requires indomitable effort. Paglen trained as a scuba diver to travel 100ft towards the ocean bed to capture the broadband cables that transport human knowledge across the world, and spent months in remote deserts photographing secretive military bases. Now, the artist has turned his gaze to AI. His recent exhibition at the Barbican, From Apple to Anomaly, is an extension of his collaboration with Kate Crawford, an artificial intelligence researcher and co-founder of the AI Now Institute at New York University. “This stuff takes forever,” he tells me. “I probably know what’s in the ImageNet database better than the people who made it.” 

From Apple to Anomaly distorts scale, and I find myself oscillating between peering closely at the images and standing back to take them in. The categories on display – “divorce lawyer”, “wine lover”, “traitor” – could have been otherwise: there’s no reason why an investor should be represented as a man rather than a woman, other than that it reflects our existing prejudices. The group marked “drug addicts” is disproportionately people of colour; “artist models” are almost all female and Asian.

For Paglen, this arbitrary categorisation is part of how technology encodes political assumptions (“you have to have a worldview that has a concept of a divorce lawyer, or whatever”). In replicating meanings that people already attach to images, tools like ImageNet reproduce existing social prejudices and deepen the grooves that contain us. “The whole endeavour of collecting images, categorising them and labelling them,” Crawford and Paglen write in a paper about the dataset, “is itself a form of politics, filled with questions about who gets to decide what images mean”.


Trevor Paglen, From Apple to Anomaly (2019), photo courtesy of the Barbican. 

I meet Paglen at the Barbican and we sit on a bench in front of the exhibit. He’s visiting London from his studio in Berlin, where he lives and works. Paglen’s ascetic appearance – black jumper, dark jeans – is at odds with his cheerful demeanour. There’s an irony to our meeting spot; the Barbican was among a handful of cultural spaces in London that recently installed a face-scanning CCTV system, a move for which it has been widely criticised (facial recognition, incidentally, relies on training data like the ImageNet photos). 

The apple, Paglen tells me, is both a deliberately non-controversial object to begin with, and a reference to the Belgian surrealist artist René Magritte, whose 1929 painting Ceci N’est Pas une Pomme (This Is Not an Apple) was painted in the context of phrenology and criminology, and defied the Victorian certainty that people’s characters could be deduced from how they looked. With AI, are we seeing systems of categorisation that move back towards that certainty?

“If I say ‘let’s look at a picture of an oligarch’, everyone in different cultures would have different ideas of what an oligarch might look like,” Paglen says. Yet AI imposes a particular set of cultural and political assumptions on the digital rules that govern our lives and shape our perceptions. Rather than greater objectivity, he thinks, we’re walking towards cultural colonialism. 

Paglen resists biographical interpretations of his work. “For me, one project always morphs into the next,” he tells me evasively. You could still draw a line between his past and present, though; the artist grew up on American airbases between the US and Germany (his father was an ophthalmologist for the air force). He studied fine art before taking a PhD in geography at the University of California, Berkeley, where he completed a project on the architecture of America’s prisons – work that drew his attention to the places that authorities wish to keep secret.

From Apple to Anomaly is less concerned with state secrets than with private sector algorithms whose internal workings are as opaque as a black box. “When you’re looking at an outfit like the NSA or GCHQ, one of the things you realise [is] that there are these much bigger outfits called Google, which are pretty similar in terms of the kinds of data they collect,” he says. “You begin thinking about what the politics of information are in general”.

Paglen and Crawford’s work with the ImageNet database exposes how the urge to collect and categorise information is never neutral. As historians of science have pointed out, the phrenologists and criminologists of the early 20th century, who used skull measurements to classify and reinforce artificial racial categories, didn’t think of themselves as political reactionaries – rather, they thought that modern science would lead to greater clarity. 

For Paglen, the scientific urge to create less biased algorithms by collecting and categorising ever more information may instead lead humanity down a familiar path: one where racial categories and social prejudices are reinforced. “The entire way classification is conceived of in machine learning systems is always going to be discriminatory. That is a feature of the system, and not a bug”.
