Ethics Isn’t Enough - Designing Equitable Technology: Growing Digital Ethics in Practice

In this episode, Mutale encouraged listeners to move away from the concept of ‘ethics’, which can be too slippery to design policy around, and to focus instead on reducing harm to the most vulnerable groups and reducing inequality overall. By designing for the consumer with the least systemic power, we can all benefit. Mutale’s prior research includes a framework on racial literacy designed to empower technologists to reduce the harms of racism through three foundations: acknowledging that structural racism has a bearing on technological outcomes, developing the emotional intelligence to have these conversations rather than avoid them, and committing to take action for harm reduction.


A powerful example of recognizing a problem and moving quickly to address it happened in the week we recorded this podcast: Twitter users noticed that the algorithm that crops images in the news feed was repeatedly prioritizing white faces. Community pressure grew, and Twitter eventually committed to reviewing the system again; although the company had tested it for racial and gender bias before deployment, clearly more work was needed. Mutale highlighted this as an example of people affected by a biased system creating accountability where there would otherwise have been none. But many technology systems produce invisible unequal outcomes, and there is still work to do to ensure technologists recognize the harms they might cause and have mechanisms in place to reduce them. One of the most powerful mechanisms is involving the stakeholders who might be disadvantaged by a technological choice in designing the systems that affect them.

As well as the racial literacy framework, Mutale pointed to two other major levers for improving technological outcomes for all: introducing better technology legislation with stronger penalties for violating consumer protection and human rights law, and developing more imaginative visions of what technology could do for us, covering both its benefits and its possible harms. Many of the technologies that are regular parts of our lives today started not as a research experiment or commercial project, but as an imagined piece of science fiction eventually brought to life by passionate and committed engineers. Mutale recommended the work of Octavia Butler as a great starting point for those seeking a more imaginative and egalitarian vision of the future.

You can find out more about Mutale’s work at, and her work for Data & Society on racial literacy, including a comprehensive framework for harm reduction, at





