Published 18 Jun 2019

Position Paper

Stemming Sinister Tides: Sustainable Digital Ethics Through Evolution

Key messages

  • In a professional setting, ethics is a shared conversation about what behaviours among our group – and so what impacts we collectively have on others – are acceptable.
  • As it becomes more and more difficult to opt out of technologically mediated life, the ethics of how to do right with technology become more and more urgent. The ethics of business (and the business of ethics) are nothing new; this is a conversation that spans thousands of years. What has changed is the potential harms and benefits of new technologies.
  • Organizations risk four key harms if they fail to address digital ethics:
    • Loss of customer base to more ethical service providers
    • Loss of talent to more ethical employers
    • Public harm through exacerbating systemic inequalities
    • Loss of competitive edge against unethical actors
  • Developing digital ethics is like having a digital strategy: you cannot have a digital strategy without a business strategy, and you cannot have digital ethics without business ethics. Digital ethics, and being ethically digital, is about managing the specific ethical concerns that emerge through technological ubiquity.
  • Ethical frameworks and codes are a means for defining our expectations. Turning these from ideals into ethical impacts requires an ethics of action. The tools and techniques in this paper get ethics off the page and bring them into practice.
  • I propose an approach that matches ethical schools of thought, and the pragmatic tools developed within them, to stages of technological maturity, as illustrated in Figure 1. This enables ethical decision-making from the earliest genesis of an idea through to its commodification. This is a sustainable approach to digital ethics.

Figure 1 – The evolutionary qualities of ethics

There is a tide in the affairs of men,
Which, taken at the flood, leads on to fortune;
Omitted, all the voyage of their life
Is bound in shallows and in miseries.
On such a full sea are we now afloat,
And we must take the current when it serves,
Or lose our ventures.

– Julius Caesar, Act 4, Scene 3

The United States Industrial Alcohol (USIA) company, in the guise of its subsidiary the Purity Distilling Company, had a big problem. A 2.3-million-gallon problem1. It had built the problem, a massive molasses storage tank, in 1915 to keep up with increased wartime demand for alcohol used in the production of munitions, cleaning products and so on. From the day it was built, it leaked. It leaked so badly that the company painted it brown. It leaked so badly that one of the maintenance men used to race out in a panic from his home in the middle of the night to check on it. It leaked so badly that children from the North End neighbourhood, at the time one of the most densely populated patches of ground in the United States, collected buckets of the sweet brown syrup to enjoy2.

Then one day in January 1919, the tank exploded. A 40-foot wave of molasses washed over the neighbourhood, sweeping away buildings, crushing part of an elevated railway line and sucking people under. Twenty-one people were killed and 150 were injured. Later, over a thousand witnesses testified in a five-year investigation.

This is a story about corporate social responsibility, public image and greed. It is a story about unequal balances of power. It is a story about one of the first class-action lawsuits against a major corporation in the United States. It is a story about the challenges in correctly assessing the risks of complex systems. And it is a story about a young industry getting regulated because it had failed to protect the public interest in the pursuit of its aims. Today’s technologists, both in well-established sectors and emergent industries, could learn a lesson or two from this story about creating a scalable ethical technology practice – and what happens when ethics are on the back burner.

What makes this a story about ethics? Isn’t the real story about risk assessment and compliance? Is ethics more than that? Few people wake up in the morning and think, “Today’s a great day to be unethical!” But all of us make decisions each day that require us to weigh competing interests. Sometimes this involves assessing impacts on different groups, including potential impacts that are not clear at the point of the decision.

Ethical concepts

  • Ethics: in a professional context, this is a guiding framework outlining acceptable standards of behaviour within a professional body such as a company, a regulatory body within a field of practice, a guild or an association. While ethics can include an implicit set of cultural expectations, aspirations, habits and attitudes, more typically professional ethics consists of an explicit body of knowledge including a set of principles, expectations, guiding behavioural rules and consequences of non-compliance.
  • Morals: personal beliefs about right and wrong guided by individual or shared values and social expectations.
  • Etiquette: in a professional setting, this typically refers to unwritten, implicit behavioural expectations. A sort of ‘ethics lite’.
  • Values: qualities, principles or beliefs that describe morals on a personal or collective level. Values can also be used as broad-stroke principles to shape specific ethical expectations and codes.
  • Social expectations/social norms: informal shared understandings that influence group behaviour. The ‘shoulds’ that go without saying – until something happens that requires them to be made explicit, like a mismatch in cultural expectations.

In the Boston molasses flood case, USIA took a very considered approach to the tank’s potential impact on the different communities where it might end up being built – or rather, to the possibility of pushback against a potential impact. One reason USIA chose the site for its enormous tank, one of the largest ever built, was that Boston’s North End neighbourhood was a tightly knit community of Italian immigrants who were not very politically active. At the time, Italian immigrants were viewed with suspicion and contempt throughout America, with incidents of violence and discrimination against them extremely common. By comparison, Irish immigrant communities in Boston and beyond tended also to be close-knit but comparatively outward-looking, participating actively in local elections and forming influential voting blocs that could impact government policies, including regulations about things like building codes. There would be little public opposition to the tank in the primarily Italian immigrant neighbourhood, unlike other sites USIA considered.

As with most disasters, there was more than one point of failure. Throughout this paper I will use examples of these failures to demonstrate how an ethical approach to building technologies can avoid or greatly mitigate ‘big problems’ similar to the ones USIA built. The blame lies not with USIA alone, but also with faulty regulatory frameworks: at the time, the tank itself did not require a building permit because it was an unoccupied structure, so only the concrete pad underneath the tank was assessed. This building code requirement changed in the aftermath of the disaster, a foreshadowing of Boston’s other major impact on national building codes, the Cocoanut Grove nightclub fire, which set the national standard for American building safety codes for years to come3. When building the tank there was such pressure to complete it before the first delivery of molasses that the construction schedule was extended to round-the-clock shifts during freezing December weather. The rushed workmanship was later found to be a major contributing factor to the tank’s structural collapse. On completion, the supervising manager for the construction project, not an engineer but a treasurer for USIA, ordered work crews to put just six inches of water in the 50-foot-high tank to test for leaks. Despite repeated warnings from its own staff about the volume of molasses leaking daily from the tank, USIA did no follow-up analysis to look for structural weaknesses.

For many organizations today, developing ethical approaches to technology means automation projects: learning to navigate ethical handling and processing of vast amounts of data in a technoscape that increasingly requires everyone’s digital participation, whether in their private lives as consumers, their civic participation with citizen services and government functions, or their economic participation as employees.

Figure 2 – The scope of the digital ethics challenge

It becomes more and more difficult to opt out of technological solutions in our day-to-day lives, as attested by the work of my LEF colleagues: David Moschella on the prevalence of innovation in both vertical technology stacks and horizontal cross-industry platforms4, and Glen Robinson on leveraging these capabilities by adopting the Matrix Mindset5. Where a technology is so ubiquitous that participation is a requirement for operating at all, there must be effective mechanisms of representation and redress for the interests of everyone impacted by that technology. Because so much of ‘digital’ is underpinned by data – its storage, combination and use as training sets for automated decision systems – much of digital ethics rests on the pillars of data ethics. Today, fairness, transparency, accountability and explainability are the core principles of nearly all AI ethics codes specifically and technology ethics codes more broadly (Figure 3). There is an ever-increasing multitude of ethical codes from which to choose, but these principles are common to them all. Jointly they can provide a framework that is not only about virtuous ideals, but also a commitment to putting ethics into action by providing mechanisms for oversight.
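
To make ‘mechanisms for oversight’ concrete, here is a minimal sketch in Python of how the fairness principle might be operationalized as an automated check on a decision system’s outputs. The demographic-parity measure, the check_demographic_parity function and the sample data are illustrative assumptions on my part, not drawn from any specific code of ethics discussed here.

    # A minimal sketch of one way to turn the 'fairness' principle into a
    # concrete oversight mechanism: checking an automated decision log for
    # demographic parity. All names, data and thresholds are hypothetical.

    def check_demographic_parity(decisions, groups, threshold=0.1):
        """Flag a decision system whose approval rates diverge across groups.

        decisions: 0/1 outcomes per subject (e.g. loan approved = 1)
        groups:    parallel list of group labels for each subject
        threshold: maximum acceptable gap between group approval rates
        """
        counts = {}
        for decision, group in zip(decisions, groups):
            approved, total = counts.get(group, (0, 0))
            counts[group] = (approved + decision, total + 1)

        rates = {g: approved / total for g, (approved, total) in counts.items()}
        gap = max(rates.values()) - min(rates.values())
        return gap <= threshold, rates, gap

    # A hypothetical decision log, split by neighbourhood.
    decisions = [1, 1, 0, 1, 0, 1, 1, 1, 1, 0]
    groups = ["north_end"] * 5 + ["back_bay"] * 5

    ok, rates, gap = check_demographic_parity(decisions, groups)
    print(f"approval rates: {rates}, gap: {gap:.2f}, within threshold: {ok}")

Demographic parity is only one of several competing definitions of fairness, and a passing check is evidence rather than proof of fair treatment; the point is simply that principles like these can be wired into routine, automated oversight rather than left as ideals on a page.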

Take a look at how these principles might have prevented USIA’s big problem over a century ago: in 1915, USIA chose a site where it knew the general public would not object, and even if members of the neighbourhood did kick up a fuss, no one else would care. It did not test its new structure and when things went wrong, it simply painted over the problem. USIA’s actions were unfair, opaque, unaccountable and inexplicable.

In my commentary on the subject, I outlined four key harms organizations risk doing to themselves and others if they fail to care about digital ethics:

  • Loss of customer base to more ethical service providers
  • Loss of talent to more ethical employers
  • Public harm through exacerbating systemic inequalities
  • Loss of competitive edge against unethical actors

That covers the why; the rest of this paper delves more deeply into how to create and sustain a culture of ethics.

Figure 3 – Making ethical ... digital

1. This is about 3.5 Olympic swimming pools, or 7,500 bathtubs.

2. For information regarding the Boston Molasses Flood I am grateful to: Stephen Puleo, Dark Tide: The Great Boston Molasses Flood of 1919, Beacon Press, 2004; Cara Giaimo, “The Boston Molasses Flood Is Worth Taking Seriously”, Atlas Obscura, 15 January 2019, www.atlasobscura.com/articles/boston-molasses-flood-100-year-anniversary; and Sarah Betancourt, “The Great Boston Molasses Flood: why the strange disaster matters today”, Guardian, 13 January 2019, www.theguardian.com/us-news/2019/jan/13/the-great-boston-molasses-flood-why-it-matters-modern-regulation

3. Nik De Kosta-Klipa, “The deadliest disaster in Boston’s history happened 75 years ago. Some worry the city is forgetting”, Boston Globe, 28 November 2017 https://www.boston.com/news/history/2017/11/28/cocoanut-grove-fire-memorial-75-years

4. David Moschella, https://leadingedgeforum.com/research/embracing-the-matrix-the-machine-intelligence-era/ and Seeing Digital: A Visual Guide to the Industries, Organizations and Careers of the 2020s, DXC Technology, 2018

5. Glen Robinson, https://leadingedgeforum.com/research/the-matrix-mindset/

 

