Over the weekend, Facebook revealed it had struck a partnership with the Technical University of Munich – one of the world’s leading centres for artificial intelligence research – to establish a new AI ethics institute.
Facebook has pledged to give the centre $7.5m (£5.8m) over the next five years, in an attempt to “advance the growing field of ethical research on new technology and explore fundamental issues affecting the use and impact of AI”.
While some may be pleased to see Facebook using the spoils of its global advertising machine to fund research into the ethical implications of technology, others might be concerned about academics partnering with the very companies whose products they are employed to evaluate.
Both Facebook and the university were keen to stress that the centre – which is destined to become one of the best-funded and largest AI ethics institutes in the world – will operate independently. In a blogpost about the announcement, the social media giant’s director of applied machine learning, Joaquin Quiñonero Candela, uses the word “independent” five times, including in the first sentence.
“Drawing on expertise across academia and industry, the Institute will conduct independent, evidence-based research to provide insight and guidance for society, industry, legislators and decision-makers across the private and public sectors,” Quiñonero Candela states later in the post. “The Institute will address issues that affect the use and impact of artificial intelligence, such as safety, privacy, fairness and transparency.”
A spokesperson for Facebook told NS Tech that the centre would maintain full academic independence and would not be required to report to Facebook in any way regarding the selection or publication of its work. It is also free to explore other funding streams.
But while it may be true that researchers will not be officially required to report back to Facebook, Quiñonero Candela does claim that the company will have input into one of the most influential stages of the research process. He writes: “To help meet the need for thoughtful and groundbreaking academic research in these areas, Facebook looks forward to supporting the Institute and help offer an industry perspective on academic research proposals, rendering the latter more actionable and impactful.”
A spokesperson for the university clarified that the researchers would be free to ignore the company’s guidance and that Facebook would not be sitting on any of its advisory boards. But if Facebook has some influence – however benign – on the subjects researchers are exploring, it may threaten the centre’s independence. That the centre will be so dependent on one company for funding may also concern some observers.
In 2017, a team of researchers at a Google-funded thinktank in Washington were jettisoned from the organisation after a senior member of staff wrote a blogpost praising the EU’s decision to fine the search giant, allegedly drawing the ire of its then executive chairman Eric Schmidt. The incident sent shockwaves through the university sector, and served as a warning to any research centre taking funding from those it’s expected to critique.
Julia Powles, an associate professor of technology law and policy at the University of Western Australia, expressed concern about Facebook and TUM’s partnership. “Quite frankly, some of the most challenging ethical quandaries in current and proposed deployments of A.I. are directly with Facebook,” she told NS Tech. “There is more need than ever for the independence of the academy in debunking the myth-making around what boils down to probabilistic targeting based on surveilling every aspect of our daily lives.”
“That, of course, is the business model of Facebook, so funding research that helps insulate it is smart strategy. But you’ve got to ask: is it ethical?”
While some universities have refused to take funding from tech firms that could compromise their research, it’s not uncommon for Silicon Valley giants to fund academic studies. Nor is it uncommon for academics to tell journalists they are unwilling to comment on particular issues because they concern firms that fund them.
If the experts paid to scrutinise these companies cannot afford to criticise them, then something has gone very wrong with the system. Universities must ensure that the funding they hope will liberate their researchers does not end up gagging them instead.