Presentation: A Young Profession Coping With Ethical Debt

Track: Tech Ethics in Action

Location: St James, 4th floor

Duration: 10:35am - 11:25am

Day of week: Wednesday

Level: Intermediate


What You’ll Learn

  • Learn why software developers should take ethics into consideration
  • Hear what can be done to incorporate ethics in a software product
  • Understand why professional ethics is more important than ever in software engineering

Abstract

The concept of professional ethics is not new to the world, but it is fairly new to computing because computing itself is so new. Ethics as a requirement of professionalism started with monks and quickly spread to the three occupations of importance: religious service, law, and medicine. Today most professional trades have a code of ethics that is uncontested. Computing does not. How did we get here, why must we change, and how?

Question: 

What is the motivation for your talk?

Answer: 

If you look around our industry, the vast majority of people working today, writing code and making decisions that impact users, haven't had an intense ethics course in their lives: they haven't taken an ethics course in high school, they haven't taken one in college. It doesn't mean that they don't know ethics; ethics are pretty innate in human beings, and most of the time they hold no surprises. But having a rigorous model for understanding whether or not what you're doing is ethical, and what the consequences of those actions are, is something that is learned. It's not something that is innate, and I can propose a rigorous model for asking yourself questions, not for getting answers. There's a playbook for discussing ethics; there's a playbook for contemplating them; there's not a playbook for answering them. Ethics are very open ended. The vast majority of hard ethical questions have two or more answers, none of which is cleanly ethical or unethical: you don't get to choose between a right and a wrong answer. The question is what your mental model is for making yourself answer those questions instead of just avoiding them and pretending they don't exist.

We have 30 years of software development, and the last 10 to 15 of those have been hyper-accelerated software development. We have software all over the place that influences the behavior of human beings, and we didn't create that software with strong ethical constructs around it. So we have a ton of software that we reuse and deploy today whose authors didn't ask themselves hard questions. Even in a greenfield project, when you pull in a library or some other modeling system, especially in the world of ML, which luckily is somewhat new, there's a good chance that the system you are adopting didn't have ethics as part of its initial construct. And that is going to result in very tragic consequences for society. We are not ahead of the game. We need to get back in a position where we can cope with this stuff.

There's a YouTube video called Slaughterbots. The scariest part of that video is that every piece of technology it shows, taken in isolation, already exists today and works. The only thing in it that's a little unbelievable is the battery life on the drones. It's incredibly disturbing. And all of it is a long chain of ethical lapses in software development. I don't want to paint a doom-and-gloom picture. What I really want to do is get people to incorporate ethical considerations into the work that they do. I think software developers have come a long way, and they try to think about security as they develop code now. I don't think they have done a great job of it, but the security community, through various mechanisms (fear, uncertainty, doubt, liability), has caused that conversation to come up more often than not during the cycles of software development. Is this secure? Should we be concerned about that? Just those two simple questions are fantastic. Sometimes we don't have the right answers, but at least software developers ask themselves that. They're not asking themselves: what are the ethical consequences of this? Who could get hurt by this? Who does this enable over another person? Who does this disadvantage or advantage? They're not asking those questions. My goal is to make them part of the natural sequence of developing software. Someone should be asking those questions. Coming to your other question of who this is for: every line-level developer who writes a line of code should have the capacity to incorporate those questions into what they do, and it should go all the way up the stack to the grand vision.

Question: 

How do you scope applying ethics to software? As a developer, what are the boundaries? (Are there boundaries?)

Answer: 

I think that in some respects there are no bounds, but it's pretty easy to frame the discussion. The ethical consequences of software development are those that affect humans. Humans don't consume JavaScript libraries. They don't consume C APIs or Kubernetes clusters. But humans incorporate those things to deliver services to humans. So the ethical concerns about building a software package that is a library or a framework are really limited in scope in many ways. You have two responsibilities there. The first is to take into account the ethical considerations of the consumers of your product, which is a very limited group of people: software developers. A great example: you probably don't want to write documentation that disenfranchises a group of developers. That's easy. People don't think about it, but it's not a complicated ethical question to contemplate. The surface area of what you're building has very limited exposure: it reaches a relatively small group of people and doesn't have a wide impact on society by itself. How it gets used can. So the second ethical concern is making sure that you adequately disclose the things you have contemplated in your software, so that the person incorporating it can make good decisions. Those decisions are ultimately theirs, but you want to enable them to make the right ones.
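To make that disclosure idea concrete, here is a minimal sketch in Python of what it might look like. Everything in it (the KNOWN_LIMITATIONS structure, the detect_hand function) is hypothetical, not any real library's API:

    # Hypothetical example: ship a machine-readable disclosure of what the
    # authors did and did not contemplate alongside the code itself, so the
    # integrator can make informed decisions.
    KNOWN_LIMITATIONS = {
        "training_data": "hand images from volunteers in one region; "
                         "skin-tone diversity not verified",
        "evaluated_on": ["indoor lighting", "adult hands"],
        "not_evaluated_on": ["dark skin tones", "gloves", "prosthetics"],
    }

    def detect_hand(frame) -> bool:
        """Return True if a hand is detected in the frame.

        Disclosure: accuracy varies with skin tone and lighting; review
        KNOWN_LIMITATIONS before deploying in user-facing hardware.
        """
        ...  # detection logic elided in this sketch

The exact format matters less than the principle: the considerations the authors contemplated (and the ones they didn't) travel with the library instead of staying in their heads.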

Question: 

Can you give us an example of ethical considerations that a developer might face?

Answer: 

Sure. There are automatic soap dispensers in airports that detect when your hand is under them, unless you're African-American, because they don't see your hand. You have to hold a white piece of paper under the soap dispenser to get it to fire. That's not there because a group of engineers thought it would be really funny to disenfranchise African-American people and people with dark skin. It's a layer cake of missing ethical questioning: people simply chose not to ask the questions, not because they asked them and said "let's do evil." There are a lot of ethical concerns there. That system is the product of a whole bunch of chip integrators, product designers, and software writers, and they adopted software from other libraries and they adopted models. There's a tree of software development, and a lot of the people who wrote software for that dispenser didn't know what machine it was going into.

I am specifically considering ethics in software development, not in staffing or hiring. I think all ethical concerns share a common fabric, but those are very different. It will always be the case, everywhere, that software is developed by people who cannot represent everyone in society. There are so many different veins: people who are visually impaired, who are hearing impaired, who are of every different "race", different heights, different weights, different genders. The idea that to develop a piece of software for a wide community I need one of everyone on the team is an intractable problem. What's important is that we have a framework for asking questions, open-ended questions; ethical questions are almost always open-ended. Who does this affect? What could go wrong with the software? Not in bugs, not in lines of code, not in crashing, but in: who will this not serve? Those are hard questions, and we're going to screw it up. What I want people to do is ask the questions and make that part of their software development cycle. In the case of the soap dispenser, I don't know what technology is in it at all. But if you have a simple computer vision system, you're probably using a computer vision library, and perhaps there should be a small ethical discussion in that library about the consequences of training it on the wrong data. You could also detect cookies with rat poop in them as if they were chocolate chips; there are other consequences. And you might disenfranchise a minority if you don't train it on a diverse set of human beings.
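As a minimal sketch of the kind of check such a discussion might prompt, here is a toy Python example. The detector and data are entirely made up (a stand-in that thresholds on a reflectance value); the point is only that breaking detection rates out per group surfaces a failure that a single aggregate number would hide:

    from collections import defaultdict

    def detection_rate_by_group(samples, detector):
        # samples: iterable of (features, group_label); detector: features -> bool
        hits, totals = defaultdict(int), defaultdict(int)
        for features, group in samples:
            totals[group] += 1
            hits[group] += detector(features)
        return {g: hits[g] / totals[g] for g in totals}

    # Toy stand-in for a hand detector that thresholds on reflectance,
    # so it silently fails for darker skin.
    samples = [(0.9, "light skin")] * 50 + [(0.2, "dark skin")] * 50
    print(detection_rate_by_group(samples, lambda reflectance: reflectance > 0.5))
    # {'light skin': 1.0, 'dark skin': 0.0} -- a 50% "overall" rate hides who fails

A test suite that only reports overall accuracy would pass this detector at 50% and never say which half of the population it fails for.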

Question: 

Who are you talking to in this talk?

Answer: 

I'm talking across the stack; they all play a part. An individual contributor needs to think about the ethical context and how best to communicate that there might be consequences to using their software. They need to know what their questions should be so that they can ask and answer them. They need to be part of that machine. That machine needs to exist and be driven from the product, where the product interfaces with society. But if you don't have that context below, in the tree of dependencies, it's too easy to ignore that your dependencies haven't had ethical considerations taken into account.

Question: 

What type of questions should developers be asking?

Answer: 

It's highly dependent on what they do and what type of software they work on. The one thing they can absolutely take away is a simple question: how could this harm someone? Honestly, asking the question is so valuable. Just asking it will, some percentage of the time, result in a corrected behavior or a consideration that wasn't otherwise there. That's what they can do: look at the code they're writing and ask questions. I would argue that no one ever writes software without that software eventually interfacing with a human being. We are the ultimate consumers of software. We build software to make human lives better. It's not like we're going to make robot lives better; if we do, it's because robots make human lives better. There's a chain there that always ends with a human as the beneficiary of the technology. So just ask the question: how is my code going to be used by humans?

Question: 

Should there be consequences for unethical software?

Answer: 

Yes, I would argue there should be the possibility of consequences, of justice, when you do unethical things. But ethics are driven not by punishment but by mutual societal benefit. This is one of the worries I have in general, working with a lot of people who have a thin background in ethics, often stemming from no liberal arts background at all. I feel like everyone should read some books. At university I learned how to ask questions, even the hard ones, and to be comfortable with questions I couldn't answer yet. Ethical questions should be part of that.

Speaker: Theo Schlossnagle

Founder and CEO @Circonus, Editorial board of ACM's ‘Queue’

Theo founded Circonus in 2010, and continues to be its principal architect. After earning undergraduate and graduate degrees from Johns Hopkins University in computer science with a focus on graphics and randomized algorithms in distributed systems, he went on to research resource allocation techniques in distributed systems during four years of post-graduate work. A widely respected industry thought leader, Theo is the author of Scalable Internet Architectures (Sams), has contributed to several other published works, and is a frequent speaker at worldwide IT conferences. Theo is a computer scientist in every respect. Theo is a member of the IEEE and a distinguished member of the ACM. He serves on the editorial board of the ACM's Queue Magazine and ACM's Practitioner Board.

