The Policy Ask with Carly Kind: “Don’t wait for permission to do the things you want”

The director of the Ada Lovelace Institute on Big Tech, disinformation, and the UK’s retreat from international human rights laws.

By Spotlight

Carly Kind is the director of the Ada Lovelace Institute, a research organisation dedicated to ensuring data and artificial intelligence are employed ethically and benefit people and society. She is also a human rights lawyer and an expert in the intersection between technology and human rights. She was formerly legal director at the charity Privacy International, has advised industry and government on digital rights, and has worked with the European Commission, Council of Europe and multiple UN bodies.

How do you start your working day?

My working day begins many hours into my actual day, which jolts to a start around 5am to the demands of some small children. By the time I sit at my desk, I’m ready for a silent moment of meditation (aka scrolling through Twitter) before I check my calendar and jump into the first of many meetings. I’m enjoying the return to some face-to-face meetings and conferences, but also appreciate virtual meetings (which seem to be persisting beyond the pandemic era) and the equal opportunity of participation they bring.

What has been your career high?

Taking the Ada Lovelace Institute from the bare beginnings of an organisation in 2019 to a well-established actor in the institutional landscape today has, undeniably, been the most rewarding endeavour of my career. Among other things, the experience has enabled me to explore different strategic mechanisms for leveraging research to shape policymaking. It has also been exciting to think creatively about how to influence the often impenetrable tech sector.

What has been the most challenging moment of your career?

Building a collaborative, diverse and interdisciplinary team culture is one of the most difficult parts of leading an organisation, and the overnight shift to a fully remote workplace that occurred in March 2020 made that job even harder, particularly for a growing team that had worked together for less than nine months. Now, the task of imagining what the workplace looks like in the post-pandemic world poses new challenges, just as it promises exciting new opportunities to reimagine policy and research.

If you could give your younger self career advice, what would it be?

Don’t wait for anyone to give you permission to do the things you want to do. You are the only person for whom your career is a priority, so don’t wait around for someone to bring you the opportunities you need. I would also congratulate my younger self on finding the right balance between generality of experience and subject matter expertise – both are critical as you forge a path in your chosen profession.

Which political figure inspires you?

It’s hard to be a young(ish) woman working in tech policy and not be inspired by Alexandria Ocasio-Cortez – not only because her frank, unapologetic and outspoken brand of politics is so refreshing, but because it is matched by clear intelligence and an ability to grasp incredibly complex tech policy and regulatory issues. Her Congressional interrogation of Mark Zuckerberg concerning political advertising on social media platforms set a new standard for politicians challenging Big Tech.

What UK policy or fund is the government getting right?

I applaud the UK’s aspirations to be a world leader when it comes to algorithmic transparency. Most people won’t be aware that decision-making by algorithms has become increasingly common in the delivery of public services, as well as in hiring, the financial sector and insurance, and that these systems often have problems of bias or other inaccuracies. The UK was one of the first countries to develop a national algorithmic transparency standard, and I’m hoping it will soon go further to place obligations on public sector organisations to publish understandable details about algorithms that significantly affect people.

And what policy should the UK government ditch?

I remain unpersuaded that the post-Brexit deregulation strategy being pursued by the government will deliver the benefits claimed, and as a human rights lawyer I’m particularly concerned by the proposal to replace the Human Rights Act with a British Bill of Rights. Not only does it degrade the historical role Britain played in drafting the European Convention on Human Rights, but it undermines many decades of work by the judiciary to carefully weigh and refine the content of rights and the relationships between them. At a time when other nations are stress-testing the international order, I also worry that the UK’s retreat from international consensus could have an unforeseen and regrettable domino effect.

What piece of international government policy could the UK learn from?

The European Union’s ambitious approach to regulating data and technology is still the gold standard, but post-Brexit the UK has an opportunity to use regulation to go further in encouraging responsible and ethical innovation in data and artificial intelligence, rather than further entrenching the “move fast and break things” ideology, which has fostered an online ecosystem rife with filter bubbles, disinformation and trolling. The UK could take inspiration from the draft EU AI Act – the first attempt to comprehensively regulate AI in a cross-sectoral manner – to develop a domestic regulatory framework.

If you could pass one law this year, what would it be?

Prompted by clear public concern about the rise of facial recognition technology everywhere from policing to the supermarket, the Ada Lovelace Institute has spent the past three years researching and analysing the use and governance of biometrics technologies. We will be calling on government to adopt new primary legislation to govern the use of biometric data, fill gaps in the existing framework, and ensure a social licence exists for the adoption of new technology in the private and public sectors.
