If the rise of computing is one of the biggest stories of the twentieth century, then the failure of the nation that invented the electronic computer to capitalise on it is undoubtedly one of history’s most important cautionary tales.
In 1944, Britain led the world in electronic computing. The top-secret codebreaking computers deployed by the British at Bletchley Park worked round the clock to ensure the success of D-Day and the Allies’ victory in Europe. At a time when the best electronic computing technology in the United States was still only in its testing phase, British computers literally changed the world.
After the war, British computing breakthroughs continued, and British computers seemed poised to succeed across the board, competing with US technology on a global scale. But by the 1970s, a mere 30 years later, the country’s computing industry was all but dead.
What happened? The traditional history of computing would have you understand this change through the biographies of great men, and the machines they designed. It would gesture towards corporations’ grand global strategies, and the marketing that those companies pushed to try to define what computers were for an entire generation of workers. It would not, however, focus on the workers themselves. And by ignoring them, it would miss the reasons for this catastrophic failure—a failure that remains a cautionary tale for many other countries today, particularly the United States.
Collective failures
When we talk about computing history, we rarely talk about failure. Narratives of technological progress are so deeply ingrained into our ways of seeing, understanding, and describing high tech that to focus on failure seems to miss the point. If technology is about progress, then what is the point of focusing on failure? Up until recently, we also rarely talked about women in relation to computing.
The first silence is related to the second. Women, after all, were seen as having largely failed in computing until recent historians’ attempts to correct that assumption. But as it turns out, technological failure and women’s erasure are intimately related in more than one way. When we put these facts together—our avoidance of failure, our neglect of women in computing, and our tendency to see women’s contributions as less important—curious patterns start to emerge.
The failure of one unnamed and ignored post-war computer worker is a good place to start. In 1959, this particular computer programmer faced a very hectic year. She needed to program, operate, and test all of the computers in a major government computing centre that was doing critical work. These computers didn’t just crunch numbers or automate low-level office work—they allowed the government to fulfil its duties to British citizens. Computers were beginning to control public utilities like electricity, and the massive and growing welfare state, which included the National Health Service, required complex, calculation-dense taxation systems. Though the welfare state was created by policy, it was technology that allowed it to function.
In addition to doing all of her normal work, our programmer also had to train two new hires. These new hires didn’t have any of the required technical skills. But once she trained them, which took about a year, they stepped up into management roles. Their trainer, meanwhile, was demoted into an assistantship below them. She succeeded at her job, only to fail in her career.
That the trainer was a woman, and that her trainees were both young men, was no coincidence. Nor was it coincidental that, as a woman, she had the technical skills for a job like this while they did not. That’s because before computing became electronic, women were seen as ideal for what was considered mundane calculation work. Though this work often required advanced mathematics knowledge, it was perceived as unintellectual. Before a computer was a machine, it was a job classification—these women workers were literally called “computers.”
Even when electromechanical and then electronic computers came in, women continued to do computing work. They programmed, operated, troubleshot, tested, and even assembled these new machines. In fact, IBM UK measured the manufacturing of computers in “girl hours” (which were less expensive than “man hours”) because the people who built the machines were nearly all women. Meanwhile, the British government, the largest computer user in the nation, called their computer workers the “machine grades” and later, the “excluded grades”—excluded from the equal pay measures brought into the Civil Service in the 1950s. Because the work was so feminised, the government declined to raise these workers’ pay to the men’s rate, on the grounds that the men’s wage was almost never used; the lower, women’s wage therefore became the default market rate for the work. Women were so concentrated in machine work that the majority of women working in government did not gain equal pay.
By the mid-to-late 1960s, however, the low value assigned to computing work was starting to change. Not because the work itself was changing, but because the perception of the work was. Instead of being seen as intimidating behemoths that were only good for highly technical tasks, computers were now becoming widely integrated into government and industry. Their great power and potential were growing more apparent. Suddenly, low-status women workers were no longer seen as appropriate for this type of work—even though they had the technical skills to do the jobs.
So the UK faced a major problem: all of the workers who could do this work were no longer the type of workers that management wanted doing the work. Instead, managers wanted people who would eventually become managers themselves to control these newly important machines and all of the decision-making that was being programmed into them. That excluded women. In this era, women were not supposed to be in positions of power over men. Both implicit and explicit prohibitions prevented women from managing men or mixed-gender workforces.
Moving out, moving up
Around the same time that the woman programmer trained two men to replace her, a young woman named Stephanie Shirley embarked on a technical career at the prestigious Post Office Research Station in Dollis Hill—the same government agency where the Colossus codebreaking computers had been created during World War II. Shirley had been a child during the war, born in Germany, and she was Jewish. She was evacuated out of Nazi-occupied Europe with 10,000 other children on the Kindertransport, a humanitarian refugee program designed to take Jewish children to England. By comparison, the United States allowed in little more than 1,000 children through a similar program.
Grateful for the chances afforded by her adoptive country, Shirley set out to make the most of them. Yet early in her career she began to chafe at the confines of British culture. With a degree in math, a good work record, and a master’s degree on the way, Shirley was the perfect candidate for promotion—or so she thought. As she was denied promotion after promotion, she started to understand that her role was being defined by things other than her technical skill and education—that the much-vaunted “meritocracy” of the government service was anything but.
“What shocked me was the discovery that, the more I became recognised as a serious young woman who was aiming high—whose long-term aspirations went beyond a merely subservient role—the more violently I was resented and the more implacably I was kept in my place,” she wrote in her memoir.
After being denied another promotion, one that she’d earned several times over, she eventually learned that the men evaluating her were resigning from the promotions board rather than making a decision on her case. “They disapproved on principle of women holding managerial posts,” she found out, so they would rather resign than consider her for a promotion. “I was devastated by this: it felt like a very personal rejection,” she recalled.
After hitting the glass ceiling first in government and then in industry, Shirley did what women were supposed to do—she got married and resigned from her position. But she wasn’t happy about it. She still had the skills, the intelligence, and the drive to work in computing, and she knew many other women who were in the same situation—being stymied in their careers not because they weren’t good enough, but because they were women.
Because she saw the need for computers growing, she knew that people who could program would be essential. Only they could figure out how to unlock the potential of the new mainframes that so few managers understood, even as those managers earmarked hundreds of thousands of pounds to buy them. So although Shirley got married and started a family, she continued to work. In 1962, she started her own software company, Freelance Programmers, out of her home. When she had stationery made for her new company she half-jokingly put the name all in lowercase, because “we had no capital at all.”
Nevertheless, she began to recruit women who had similarly been forced into “early retirement” by having children or getting married. Once her business began to grow, she published an ad in the classified section of the Times of London seeking people for full-time programmer positions. It read: “Wonderful chance, but hopeless for anti-feminists.” In other words, the company had a woman boss. The ad also announced that there were “opportunities for retired programmers (female) to work from home.” In 1964, this was revolutionary.
Shirley’s freelance programmers worked from home in an era when computer time was so expensive that most programming was done on paper before being punched onto cards and then tested on an actual machine. Programming from home was therefore not a problem, as long as you had a telephone to collaborate with your co-workers. Indeed, one of her programmers was once chastised by a company for using too much computer time to debug a program. Programming without a computer was cheaper—and preferred.
Programming from home also allowed women to simultaneously take care of their young children and fulfil their domestic responsibilities. To make things seem more professional, Shirley played a tape recording of typewriter sounds in the background when she answered the phone at her house, in order to drown out the sounds her young son might make. And when she was unable to get contracts early on, she took her husband’s suggestion that she start signing her letters with her nickname instead: “Steve.” With Steve Shirley as the public face of the company, business began to take off.
Ann in electronics
Like many start-up founders, Shirley was meeting a consumer need that was still barely understood. In the early 1960s, most software came packaged with the computer itself or was written in-house after a company purchased a mainframe. Software was not considered a product in its own right—and few people expected that customers would actually pay for it separately after spending so much money on a computer.
Shirley realised that they would. She had seen the need in both government and industry for programmers who could unleash the potential of expensive hardware with good software. Without software, after all, computers didn’t do anything, and with poor software they couldn’t fulfil their potential or justify their cost. Shirley also knew that British industry and government were getting rid of most of the people who had programming skills and training because they were women, thereby starving the entire country of the critical labour that it needed to modernise effectively.
Shirley scooped up this talent pool by giving women a chance to fulfil their potential. Offering flexible, family-friendly working hours and the ability to work from home, her business tapped into a deep well of discarded expertise. Because people who could do this work were an absolute necessity, the government and major British companies hired her and her growing team of women programmers to do mission-critical computer programming for projects ranging from payroll and accounting to cutting-edge projects like programming the “black box” flight recorder for the first commercial supersonic jet in the world: the Concorde.
A woman named Ann Moffatt led the Concorde programming team. She kept the credit for her work. Working from home, Moffatt managed a team of women who also worked from their homes. In fact, this was the first time that Freelance Programmers had undertaken a project managed and staffed exclusively by remote workers—rather than being overseen by one of the four full-time managers who operated out of the small office space Shirley had rented a few years after starting the company out of her house. The arrangement of using remote workers to manage projects worked so well that Ann would go on to become technical director at the company, in charge of more than 300 home-based programmers.
Moffatt sits at her kitchen table in 1966, writing the code for the Concorde, while her baby looks on.
Much like Stephanie Shirley, Ann had begun working in technical roles in the 1950s, but had encountered a roadblock once she had children. The feminist business practices of Freelance Programmers let Ann continue her career and take care of her home and children, all the while contributing to Britain’s high-tech economy. In addition to being a major project for the company, the Concorde was a symbol of British high-technology pride and prestige—it flew successfully for decades, one of only two supersonic aircraft ever to carry commercial passengers. And Ann’s concurrent project functioned for even longer: the baby in the photograph is now 53 years old.
Losing the lead
While Shirley, Moffatt, and hundreds of other women programmers created software that helped Britain advance further into the accelerating digital age, British industry and government struggled to hire, train, and retain their computer workers. Women had the technical skills, but were not supposed to be managers. Even the fact that women were wielding more power by controlling computers was viewed as dangerously out-of-bounds.
As computing became increasingly interwoven with all of the functions of the state—from the Bank of England to the Atomic Energy Authority—computer workers grew indispensable. By the late 1960s, the government began to fear losing control of the machines that allowed the state to function because they did not have a well-trained, permanent, reliable core of technical experts. The women who had the technical skills were judged unreliable because they were not aligned with management. They were seen as liminally working-class, temporary workers who should not rise above their current station. To elevate women further would upend the hierarchies of both government and industry, pushing low-status workers into high-status positions.
Ministers within government were so determined to create a cadre of male, management-oriented technocrats that, counterintuitively and in desperation, they began to lower the standards of technical skill needed for the jobs. Lowering standards of technical proficiency to create an elite class of male computer workers didn’t work, however. In fact, it made the problem worse, by producing a devastating labour shortage.
Well-heeled young men tapped for the positions often had no interest in derailing their management-bound careers by getting stuck in the “backwater” of computer work, which had still not fully shaken its association with low-level, feminised labour. Machine work in general was viewed as unintellectual and working-class, ensuring that men of the desired background had little interest in being swept up in the “industrialisation of the office.” Most men who were trained for these positions, at great employer expense, left to take better, non-computing jobs within a year. As a result, the programming, systems analysis, and computer operating needs of government and industry went largely unmet. Although there were plenty of women who had the required skills, the government all but refused to hire them, and private industry largely refused to promote them.
Stephanie Shirley, Ann Moffatt, and their co-worker Dee Shermer.
Soon, the most powerful people within the Civil Service had become convinced that the government could no longer function by trying to get more young men into computing: the numbers simply weren’t there. The shortage of “suitable” computer labour had risen to the level of a national security issue in the eyes of the state. Even low-level women computer workers held great power: when the all-women punching staff went on strike for better pay and working conditions, the massive new VAT system ground to a halt, derailing months of planning, to the horror of the men at the top. So they decided to approach the problem from a different angle: if there weren’t enough men for computer jobs, the number of these jobs needed to be reduced. They needed to find a way to do the same amount of computing work with fewer computer workers.
This meant ever more massive, powerful mainframes that could be run by centralised control and command. On the advice of the minister of technology, the UK decided to force the largest remaining British computer companies to merge into one huge firm that could provide government and industry with the sort of massive, centralised mainframe technologies they needed. In 1968, International Computers Limited (ICL) was born, and ordered to produce the machines that would allow Britain to meet its digital needs with its newly minimised and masculinised computer labour force.
Unfortunately, this change occurred right as the mainframe was on its way out, in a period when smaller and more decentralised systems were becoming the norm. This meant that by the time ICL delivered the product line they had been tasked with creating, in the mid-1970s, the British government no longer wanted it, and neither did any other potential customers. As the government realised their mistake—though not the underlying sexism that had caused it—they quickly withdrew their promised support for ICL, leaving the company in the lurch and finishing off what was left of the British computer industry.
Hiding tech’s mistakes
Stephanie Shirley’s company succeeded by taking advantage of the sexism intentionally built into the field of computing to exclude talented and capable technical women. At the same time, the rest of the British labour market discarded the most important workers of the emerging computer age, damaging the progress of every industry that used computers, the modernisation projects of the public sector, and, most strikingly, the computer industry itself.
By utilising just a small portion of this wasted talent, Shirley rescued many women’s skills from being discarded entirely and helped British industry and government fulfil some of the promise of computerisation. But for every woman Shirley employed there were always several more applicants she could not. The massive waste of human talent rippled upward, eventually destroying the British lead in computing and the British computer industry.
In computing, discrimination is as old as the field itself. And discrimination has shaped the field in ways we are only now coming to understand and admit. The technical labour shortage in the UK was produced by sexism—it did not represent a natural evolution of the field, nor a reflection of women’s talents, goals, or interests.
Computing history shows us that the “computer revolution” was never really meant to be a revolution in any social or political sense. People who were not seen as worthy of wielding power were deliberately excluded, even when they had the required technical skills. To a great extent, that process continues today. Now, as then, hierarchies are constructed through high tech to preserve powerful social and political structures.
That we have historically ignored the impact of women on computing—both the women who stayed in the field and the many more who were pushed out and shaped the field through their absence—shows how narratives of technological progress hide the mistakes of the past. Often, failures teach us more. But by assuming the tautology that technology always leads to progress, we become blinded to all of the situations in which the opposite has occurred.
In twentieth-century Britain, computers helped re-institutionalise ideas about women’s second-class status in society. They took away women’s ability to participate in the digital economy under the pretext that they should not be in charge of powerful machines even if they had the technical know-how.
Though these attitudes may seem antiquated today, a closer look at the technological landscape of the United States reveals that societies continue to ignore the role of women and other minority groups in technology fields—and the impact of technology on them. When Twitter or Facebook is accused of doing something that hurts women, it is seen as a niche concern.
Technology’s alignment with actual progress has a long and uneven history, and its effects are rarely straightforward or fully foreseen. Real progress isn’t synonymous with building another app—it involves recognising the problems in our society and confronting the uncomfortable fact that technology is a tool for wielding power over people. Too often, those who already hold power, those who are least able to recognise the flaws in our current systems, are the ones who decide our technological future.
Marie Hicks is a historian of technology and professor at the Illinois Institute of Technology in Chicago. This piece draws on their book, Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing.
A longer version of this piece first appeared in the fifth issue of Logic Magazine, a print publication about technology.