26 September 2019

“The New Jim Code” – Ruha Benjamin on racial discrimination by algorithm

The Princeton sociologist and author of Race After Technology on how new technologies encode old forms of segregation – and how we might build something better. 

By Hettie O'Brien

When the author Ruha Benjamin was selecting a name for her newborn son, shortly after the 9/11 attacks, she had a choice: an Arabic name, in honour of her maternal bloodline, or something safer and less provocative? Her son was a black male and, for that alone, would be profiled. Did she want to add another mark to his file? Benjamin, a professor of African American Studies at Princeton University, resolutely accepted it as a dare. “I gave the child an Arabic first and middle name and noted on his birth announcement: ‘This guarantees he will be flagged anytime he tries to fly,’” she writes.

As Benjamin identifies in her new book Race After Technology, to name something (or someone) is to code a piece of information and make it legible to the authorities’ eyes. This has real-world effects. In the UK, job seekers with white-sounding names receive more call-backs from prospective employers. Though it might seem like a matter of personal detail, a name is a defining statement and we take for granted the social messages it encodes. 

“If this is what’s going on with names, what’s happening with more complex forms of categorisation?” Benjamin asks when we meet at a cafe in south London. She’s on a short visit to the UK from New York, and is about to deliver a lecture at Goldsmiths, University of London. Today, information gathered from our digital activity is fed into computer code to create the algorithms that increasingly govern where public resources are spent, which advertisements people see, and what types of loan they can apply for.

The prejudices that persist in new data can be just as specious as the assumptions made about a baby’s name. Benjamin names this paradox “The New Jim Code” (in reference to the Jim Crow laws that enforced US segregation), describing how algorithms can extend – rather than erase – racial discrimination. Two days after we met, I encountered a real-life version of the New Jim Code online. An AI tool developed by researchers at Stanford and Princeton began to circulate on Twitter and was quickly seized upon by journalists, who uploaded pictures of themselves to the site, revealing the assumptions that computers make when categorising human faces.

I uploaded mine, and was disappointed by the results: “mortal soul, religious person, nun, sister” – a wounding jab at my earnest avatar, yes, but also an innocuous mistake. When my colleague Stephen Bush uploaded a series of suited portraits, the results were more striking: “negroid, black, black person, blackamoor” – it went on – “wrongdoer, offender, convict”. Users of Asian descent were “gook, slant-eye” – the former a disparaging term used by US soldiers during the Vietnam War.

“How are you supposed to react,” the journalist Julia Carrie Wong wrote of her experience using the tool, “when a computer calls you gook?” Her observation captured the difficulty of holding an algorithm to account. In giving primacy to the power of technology, the prophets of Silicon Valley ignore how existing social discrimination is encoded into the tools that supposedly pave the way to a better future. “They get away with it,” Benjamin says, “because we assume that technology itself is a do-gooding field”.

Benjamin moved to Los Angeles at the age of three, and spent part of her childhood living with her grandmother off Crenshaw Boulevard, a 23-mile stretch of road at the heart of LA’s African-American neighbourhood. Some of her most vivid memories as a child involved the police – the non-stop rumbling of helicopters overhead that would shake the roof as she tried to sleep. “Like anyone who lives in an over-policed neighbourhood, I grew up with a keen sense of being watched. Family, friends, and neighbours – all of us caught up in a carceral web, in which other peoples’ safety and freedom are predicated on our containment,” she would later write in Race After Technology.


She first started working on Race After Technology in 2016 – before the election of Donald Trump, or the Cambridge Analytica scandal. “At the time it felt like I had to tread carefully – everyone loves their iPhones,” she laughs. The mood couldn’t be more different now. Data, we’ve come to understand, is power. The logic underpinning its collection is that more information is always better. But better for whom?

“The underside of data accumulation is extraction. You have to get it from somewhere – and so historically, if you’ve been on the side of those from whom things were taken… your relationship to knowledge is different,” Benjamin says. Silicon Valley is deaf to history; its infatuation with social sorting and data collection neglects the fact that these technologies frequently emerge in countries divided by racial and economic inequalities. As the author Virginia Eubanks notes in her book Automating Inequality, the serial numbers tattooed on the forearms of Auschwitz inmates began as punch-card identification numbers developed by IBM for the Nazi regime. Collecting information to measure and sort a population is far from benign.

Formal segregation may have long been outlawed in the US, but one of Benjamin’s key insights is how data collection on an unprecedented scale allows new, more insidious forms of segregation to take place – albeit hidden from clear view. Technologies that falter – such as automated soap dispensers that fail to recognise dark skin – are often presented as glitches, friction yet to be erased on an otherwise smooth trajectory. 

But discrimination by computer isn’t merely a “glitch” in the system, Benjamin warns; it’s the central architecture. “As soon as you start thinking about it as one mistake, the idea becomes, ‘let’s fix it with better data’ – rather than stepping back and thinking: is it really possible to create fair and just algorithms in an unfair and unjust society?” 
