Sandra Wachter


Sandra Wachter is an Associate Professor and Senior Research Fellow in data ethics, artificial intelligence, robotics and internet regulation at the Oxford Internet Institute, University of Oxford. She is also a Fellow at The Alan Turing Institute.

Early life and education

Wachter grew up in Austria. She studied law at the University of Vienna. Wachter has said that she was inspired to work in technology because of her grandmother, who was one of three women admitted to the Austrian technical university. She completed her Master of Law in 2009, before starting as a legal counsel in the Austrian Federal Ministry of Health. During this time she joined the faculty at the University of Vienna, where she started a doctoral degree in technology, intellectual property and regulation. She completed her PhD in 2015, and simultaneously earned a master's degree in social sciences at the University of Oxford. After earning her doctorate Wachter joined the Royal Academy of Engineering where she worked in public policy. She returned to the University of Vienna where she worked on the ethical aspects of innovation.

Research and career

Her work covers legal and ethical issues associated with Big Data, AI, algorithms and data protection. She believes that there needs to be a balance between technical innovation and personal control of information. Wachter was made a Research Fellow at the Alan Turing Institute in 2016, where she has evaluated the ethical and legal aspects of data science. She has argued that artificial intelligence should be more transparent and accountable, and that people have a "right to reasonable inferences". She has highlighted cases where opaque algorithms have produced racist and sexist outcomes, such as discrimination in admissions to St George's Hospital and Medical School in the 1970s and the overestimation of reoffending rates for black defendants by the program COMPAS. While Wachter recognises that it is difficult to eliminate bias from training data sets, she believes it is possible to develop tools to identify and mitigate it. She has looked at ways to audit artificial intelligence to tackle discrimination and promote fairness. She has also argued that Facebook should continue to use human moderators.
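The kind of audit mentioned above can be illustrated with a toy fairness check. This is a minimal sketch only: the group names, decision data and the choice of metric (demographic parity, one of several fairness criteria discussed in the algorithmic-fairness literature) are hypothetical and not drawn from Wachter's own tooling.

```python
# Toy fairness audit: compare a model's rate of favourable decisions
# across demographic groups (demographic parity). All data here is
# invented for illustration.

def positive_rate(decisions):
    """Fraction of decisions that were favourable (1)."""
    return sum(decisions) / len(decisions)

def demographic_parity_gap(decisions_by_group):
    """Largest difference in favourable-decision rates between any two groups."""
    rates = [positive_rate(d) for d in decisions_by_group.values()]
    return max(rates) - min(rates)

# Hypothetical binary decisions (1 = favourable) for two groups.
outcomes = {
    "group_a": [1, 1, 0, 1, 1, 0, 1, 1],  # 6/8 favourable
    "group_b": [1, 0, 0, 0, 1, 0, 0, 1],  # 3/8 favourable
}
gap = demographic_parity_gap(outcomes)  # 0.375
```

A gap of this size between groups would typically flag the system for closer human review, though which metric is appropriate depends heavily on context.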
She has argued that the General Data Protection Regulation is in need of reform: while it pays considerable attention to the input stage, when data is collected, it devotes far less to how that data is subsequently assessed and used. She believes that privacy must mean more than data protection, focussing on data evaluation and on ways for people to control how information about them is stored and shared.
Working with Brent Mittelstadt and Chris Russell, Wachter proposed counterfactual explanations: statements of how the inputs to a decision would have had to differ for a different outcome to result. When decisions are made by an algorithm it can be difficult for the people affected to understand why, especially when a full explanation would reveal trade secrets about the algorithm. Counterfactual explanations permit the interrogation of an algorithm without the need to reveal such secrets. The approach was adopted by Google in the What-If Tool, a feature of TensorBoard, Google's open-source web application for visualising machine learning models. The paper "Counterfactual Explanations Without Opening the Black Box: Automated Decisions and the GDPR", written by Wachter, Mittelstadt and Russell, has been featured by the press and is widely cited in scholarly literature.
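The counterfactual idea can be sketched for the special case of a linear classifier, where the closest counterfactual has a closed form along the weight vector. The loan scenario, feature names and numbers below are invented, and Wachter, Mittelstadt and Russell's method is formulated as a more general optimisation over arbitrary models; this is an illustrative simplification, not their implementation.

```python
# Minimal counterfactual explanation for a linear classifier
# score(x) = w.x + b, decision favourable iff score > 0.
# For a linear model, the smallest (Euclidean) change that flips the
# decision is the projection of x onto the decision boundary.

def counterfactual(x, w, b, eps=1e-6):
    """Return the point closest to x that lies just on the other side
    of the hyperplane w.x + b = 0, i.e. the minimal change of inputs
    needed to flip the classifier's decision."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    norm_sq = sum(wi * wi for wi in w)
    # Step along w to the boundary, plus a tiny margin eps to cross it.
    step = score / norm_sq + eps * (1 if score > 0 else -1)
    return [xi - step * wi for wi, xi in zip(w, x)]

# Hypothetical loan model: features = (income in £1000s, debt ratio).
w = [0.05, -2.0]       # income raises the score, debt lowers it
b = -1.0
applicant = [30, 0.4]  # denied: score = 0.05*30 - 2.0*0.4 - 1.0 = -0.3
cf = counterfactual(applicant, w, b)
# cf is the nearest profile the model would approve, yielding an
# explanation of the form "your loan would have been granted if your
# income and debt ratio had been these values instead".
```

The returned counterfactual is the basis of the explanation: it names concrete changes to the person's circumstances rather than exposing the model's internals.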

Academic service

She was made an Associate Professor at the University of Oxford in 2019 and has been a Visiting Professor at Harvard University since spring 2020. Wachter is a member of the World Economic Forum Council on Values, Ethics and Innovation, an affiliate at the Bonavero Institute of Human Rights and a member of the European Commission Expert Group on Autonomous Cars.

Awards and honours