13 September 2023

The rise of the new tech right

How the cult of IQ became a toxic ideology in Silicon Valley and beyond.

By Quinn Slobodian

In 1958, the British sociologist Michael Young published what would become a celebrated dystopian novel – and a novel dystopia – with an enduring coinage in its title: The Rise of the Meritocracy, 1870-2033. Writing in 2034, an imaginary Michael Young describes an earlier era when “intelligence was distributed more or less at random. Each social class was, in ability, the miniature of society itself; the part the same as the whole.” This began to change around 1963 when “schools and industries were progressively thrown open to merit, so that the clever children of each generation had opportunity for ascent” into the class of “brain workers”. A new social hierarchy began to emerge – based on intelligence.

Introducing the “unpleasant term” that made him famous, Young wrote that, by the 2030s, “we frankly recognise that democracy can be no more than aspiration, and have rule not so much by the people as by the cleverest people; not an aristocracy of birth, not a plutocracy of wealth, but a true meritocracy”. Prefiguring a half-century of discussions about artificial intelligence and automation, Young described how “cyberneticists” modelled machines on the minds of men, achieving a breakthrough in 1989 when a computer called Pamela, with an IQ of 100, became the national yardstick, a gold standard of brainpower.


Yet the meritocracy had a problem. By definition, only those deemed elite had the capacity to grasp fully the necessity of their own elevated status. Resisters to the new paradigm included the religious and the socialists, who made common cause with those “just intelligent enough to be able to focus their resentment on some limited grievance”. There were also “intellectual egalitarians… so much afraid of being envied that they identify themselves with the underdog, and speak for him”. In a twist ending, the narrator, so confident that the lower classes have internalised their position, is killed in a general strike and armed insurrection of “Populists” led by women on May Day 2034.

Young’s vision of a world divided into cognitive classes has had periodic moments of resurgence. Check Your Own IQ by Hans Eysenck was a bestseller in the 1960s for those curious about how they would fare in the “knowledge economy”, a term coined by Austrian economist Fritz Machlup in the same decade. The tech right in Silicon Valley (not for nothing is the main IQ assessment called the Stanford-Binet test) expresses faith in the idea that “traits like intelligence and work ethic… have a strong genetic basis”.


The author of that line, the American writer Richard Hanania, is one of the specific reasons for the current conjuncture. Hailed by his publisher, HarperCollins, as “one of the most talked-about writers in the nation”, Hanania – who on Twitter has referred to black people as “animals” – was also exposed in July as the pseudonymous author of even more openly racist articles for the website AlternativeRight.com, founded by the white supremacist Richard Spencer. His articles, posted between 2008 and 2012, included calls to forcibly sterilise everyone with an IQ under 90 and claimed that Hispanics “don’t have the requisite IQ to be a productive part of a first-world nation”.

For Hanania, whose guests on his podcast and YouTube channel have included high-profile intellectuals such as Steven Pinker and Tyler Cowen (whose Mercatus Center at George Mason University (GMU) contributed $50,000 to him), high IQ in individuals and nations leads to success, libertarianism and the appreciation of markets. He frets about “dysgenic fertility”, as measured in the decline of average IQ in the American population, and suggests that “the real source of class difference is traits like IQ and intellectual curiosity”.

Where does this obsession with IQ come from? IQ fetishism on the US right has a history that goes back a century, to advocates of immigration restriction and eugenicists such as Madison Grant and Henry Goddard. Its more recent revival, however, can be dated to the 1990s, when culture-war campaigns against affirmative action (or positive discrimination) and civil rights legislation hitched themselves to the rising stars of neuroscience and genomics. When The Rise of the Meritocracy was republished in 1994, the Fortune columnist and amateur intelligence researcher Daniel Seligman wrote that “what would earlier have read as a wickedly provocative speculation about the future now looks a lot like mere reality”. The occupational pyramid of the US had, he wrote, become an “intelligence hierarchy”, and he cited the work of the psychologist Linda Gottfredson.

Talk of the “information age” and the “information economy” took off in the 1990s. In 1991, the then Harvard academic Robert Reich described an economy dominated by a new class of “symbolic analysts… analysing and manipulating symbols – words, numbers or visual images”. The vision of the economy as a collection of nodes transmitting packets of information, overseen by analytical, left-brained engineers, had been the basis of cyberpunk fiction since William Gibson coined the term “cyberspace” in his novel Neuromancer (1984). In 1988, the American sci-fi novelist Bruce Sterling depicted the fictional leader of Singapore announcing to the crowd: “This is an Information Era, and our lack of territory – mere topsoil – no longer restrains us.”

The following year, Tim Berners-Lee brought this closer to reality when he combined hypertext with the internet from his office at Cern in Geneva to create the World Wide Web. “We think of ‘creative’ work as a series of abstract mental operations performed in an office, preferably with the aid of computers,” the historian and cultural critic Christopher Lasch wrote in 1994, the year Young’s book was reissued. This creative work was becoming the métier of a new elite who “live in a world of abstractions and images, a simulated world that consists of computerised models of reality”.

The new economy promised to inaugurate a new reign of the intelligent and the incarceration of those deemed imbeciles. “Cognitive deficit” theories linking low IQ to weak impulse control – and, thus, criminality – also had a breakthrough. The George HW Bush administration declared the 1990s the “Decade of the Brain”, and the launch of the Human Genome Project at its start accelerated what Nature called, in 1995, “the rise of neurogenetic determinism”. “There is no doubt,” Seligman wrote, “that the literate public has been assimilating a few large truths: that genes play a greater role in human behaviour than previously posited; that human beings are somewhat less malleable than had been assumed; that human nature is making something of a comeback.” He proclaimed: “Hereditarianism is on the march. Nature is clobbering nurture.”

IQ fetishism was given its biggest boost in 1994 with the release of The Bell Curve, by the Harvard psychologist Richard J Herrnstein and the libertarian think-tanker Charles Murray – an 800-page treatise that became a surprise hit, with over 400,000 copies sold. Intelligence, the authors argued, was highly heritable, and group differences in intelligence did not change greatly over time. In their book they introduced a category that would have a long afterlife: the so-called cognitive elite.

On paper, the authors did not mean “cognitive elite” as a compliment. In later books Murray would even more pointedly criticise the cognitive elite for their literal and figurative aloofness from the rest of the population. But the moral charge was always ambiguous. The authors were themselves part of this elite. Both Herrnstein and Murray attended Harvard and lived in centres of wealth and privilege – Greater Boston and Greater Washington DC. Wasn’t membership of a cognitive elite being presented as an aspiration?


In the early 2000s, what one journalist called “the revenge of the nerds” continued, as the West Coast tech sector – far from traditional centres of American power – began to emerge as the engine of the digital economy. Seattle, a city known for its rain, coffee shops and grunge bands, became the home of Amazon, and the Portland area a major base for Intel. Further south, the fruit-growing valley around Palo Alto, California, which defence contractors had used as a launchpad for many small companies, became the epicentre of what were called, by the 1980s, “start-ups”.

In the process, what had been a critical term became a self-congratulatory one. The apparent world-historical inversion, whereby the smart kids were also the richest and most powerful ones, was celebrated on iconic blogs and forums such as Slate Star Codex and LessWrong (where users self-reported implausibly high IQ scores), as well as Econlib and Marginal Revolution. The latter two featured regular contributions from the GMU economics professors Bryan Caplan and Cowen respectively. (Yet another GMU economist, Garett Jones, wrote a book on IQ called Hive Mind and defended gender differences in cognitive reasoning.) Writers and commenters on these sites revelled in arcane detail, the visual language of statistics and graphs, and the impression of academic rigour. Some, like the former National Review columnist Steve Sailer, were open proponents of genetic determinism and group differences in intelligence based on race, or what he dubbed “human biodiversity”, shortened online to HBD.

Illustration by Klawe Rzeczy

Another high-profile member of what was called the “neo-reactionary” movement, or nascent tech right, was Curtis Yarvin, who blogged under the pseudonym Mencius Moldbug. As a teenager, Yarvin had been part of the Study of Mathematically Precocious Youth established by the Johns Hopkins University psychology professor Julian Stanley to identify high-IQ youngsters.

Still attached as an adult to the idea of the cognitive elite, he condemned democracy for spoiling the coexistence of “high-IQ” and “low-IQ” people and proposed a “psychometric qualification” for voting in South Africa, disenfranchising everyone with an IQ below 120.

IQ fetishism has British roots too. One of the most prominent psychologists of race and intelligence, Richard Lynn, who died in July, was unwelcome in his own profession but had collaborated with the free-market think tank the Institute of Economic Affairs since the 1960s. More recently, he was a speaker at a controversial eugenics conference that ran for three years at UCL (though the university was unaware of the themes under discussion).

When Dominic Cummings started blogging in 2014 he created another platform for the British derivative of a Silicon Valley-based intellectual milieu, with prolix posts on “genomic prediction” and “polygenic scores”. Like Hanania after him, Cummings revealed a special admiration for the physicist Stephen Hsu, whose primary enterprise at this time involved gathering saliva from thousands of high-IQ individuals in China for the Beijing Genomics Institute in the hope of finding the “gene variants that are associated with intelligence”. Meanwhile, at the Future of Humanity Institute at Oxford, Nick Bostrom – who on an email forum in the 1990s wrote that “blacks are more stupid than whites” (a comment he has apologised for), and who would later become a core thinker of effective altruism – thanked Hsu in the acknowledgements of a paper about using embryo selection to increase human intelligence.

To followers of the neo-reactionary ideology, the internet and its affiliated communities were offering an alternative public sphere where a new elite could arise by virtue of their brains, their genes or frequently both.

In the run-up to Donald Trump’s election in 2016, the intelligence question emerged again in the ecosystem of what was now being called the alt-right. Charles Murray’s work on the supposed “forbidden knowledge” of intelligence research resurfaced for another round of controversies, claims and counterclaims, as the “intellectual dark web” earned excited profiles in the New York Times. Trump seemed fixated on IQ, referring frequently to his own apparently high score. The tone was captured well in a tweet from 2013 that read: “Sorry losers and haters, but my IQ is one of the highest – and you all know it! Please don’t feel so stupid or insecure, it’s not your fault.”

This time around, the discourse was less about criticising the detachment of the creative elite or praising new leaders of economic innovation. It had taken a graver turn towards the potential need to escape from the drag of surplus members of society, or even exclude them from equal status.

In Germany, an influential 2010 book by the politico and polemicist Thilo Sarrazin, Germany Abolishes Itself, included a defence of restricting immigration from specific communities based on their supposedly low IQ. Murray had proposed something similar. Arguing against continued immigration, the right-wing YouTuber Stefan Molyneux said, “You cannot run a high-IQ society with low-IQ people.” From the loftier perch of Georgetown University, the philosopher Jason Brennan has made a case for testing citizens’ political knowledge before allowing them to vote, in what he has called an “epistocracy”.


The return of IQ fetishism did not happen spontaneously. It was aided by considerable financial support from a few wealthy men. One is Harlan Crow, heir to a real-estate fortune whose holding company has $29bn under management. Charles Murray dedicated his two most recent books on race science to him. Murray is a regular guest at Crow’s house – along with the Supreme Court justice Clarence Thomas – and the kinship between the two is captured in an unforgettable photorealistic portrait, titled Contemplation, showing the duo gazing into the distance together.

Crow funds the Salem Center for Policy at the University of Texas at Austin, where Hanania was a fellow. This is not to be confused with the non-accredited start-up University of Austin (UATX), funded by Peter Thiel’s business partner Joe Lonsdale, where Hanania is also a lecturer in its “Forbidden Courses” summer programme. While Hanania is not a product of Silicon Valley, he notes himself that most of his readership comes from the world of the tech right. He concedes that his writing is haunted by “the ghost of Yarvin”.

How have things changed since the publication of Michael Young’s book in 1958? Whereas Young’s dystopia portrayed a meritocracy working too well, the complaint of many IQ fetishists on today’s tech right is that it is not working well enough. Even after the Supreme Court ruling against affirmative action in June – a long-time goal of conservatives like Murray – they fear that admissions officers at top universities and hiring committees at top firms still assemble cohorts based on criteria other than true ability. As Dominic Cummings put it in his inimitable style, in a blog post in 2020, “People in SW1 talk a lot about ‘diversity’ but they rarely mean ‘true cognitive diversity’. They are usually babbling about ‘gender identity diversity blah blah’. What SW1 needs is not more drivel about ‘identity’ and ‘diversity’ from Oxbridge humanities graduates but more genuine cognitive diversity.”

The opposition to so-called diversity, equity and inclusion (DEI) efforts is especially virulent. In his blurb to Hanania’s forthcoming book, The Origins of Woke: Civil Rights Law, Corporate America, and the Triumph of Identity Politics, Peter Thiel uses violent rhetoric, writing: “DEI will never d-i-e from words alone – Hanania shows we need the sticks and stones of government violence to exorcise the diversity demon.”

Declaring membership of the cognitive aristocracy must bring a narcissistic buzz. It could even be harmless if it stayed in the comments section. But IQ fetishism has pernicious social effects. In North America, the UK and Europe, it draws racial lines, placing Caucasians, East Asians and Ashkenazi Jews on one side, with other Asians, Hispanics and people of African descent on the other.

The IQ fetishists like to think they are living in a near future where they – the pure creative information workers imagined in the 1990s – have been elevated through their high intelligence and innate ability. They were not simply in the right place at the right time, bobbing along in a sea of liquidity in an era of zero interest rates. They were, like the staff at the Apple Store, geniuses.

With the dream of cryptocurrency – and its many related fantasies, from non-fungible tokens (NFTs) to decentralised autonomous organisations (DAOs) – deflated, and the figurehead of the avenging nerd, the FTX founder Sam Bankman-Fried, behind bars, a gullible tech right has sought refuge in its consistent creed. “[Bankman-Fried] always seemed low-IQ to me,” wrote one tech CEO in a post hoc justification.

Perhaps the darkest direction that tech-right thinking could go was foreshadowed by Yarvin in 2008. Speculating about the transformation of San Francisco into a private entity called “Friscorp”, he wondered what could be done with the city’s unproductive residents. After considering and then dismissing the idea of pulping surplus “hominids” into biodiesel for city buses, he suggested “the best humane alternative to genocide” was “not to liquidate the wards… but to virtualise them”.

Yarvin envisioned the incarceration of the knowledge economy’s underclass in “permanent solitary confinement, waxed like a bee larva into a cell which is sealed except for emergencies”. Against fears this would seed an insurrection like the one imagined by Michael Young a half-century earlier, Yarvin turned to the new right’s fountain of meaning: technology. The captive’s cell would not be bare. It would include “an immersive virtual-reality interface which allows him to experience a rich, fulfilling life in a completely imaginary world”. In Yarvin’s future – and perhaps ours – May Day 2034 passes without notice. The metaverse saves the meritocracy. The cognitive elite governs undisturbed.



This article appears in the 13 Sep 2023 issue of the New Statesman, The Revenge of the Trussites