Computing Professionalism: Do Good and Avoid Evil...and Why It Is Complicated to Do that in Computing

For context, please make sure you've attended the webinar, presented by Don Gotterbarn.

Q: Why are there so many different ethics codes? If right is right, why isn't there only one ethics code?
A: I agree that there is a common set of core values across cultures. Why, then, is there more than one code? On one level it may have to do with pride: everyone thinking they can say it better than the next guy. But it may also be that different organizations and different professions have specific responsibilities that they need to emphasize. The basic issue for us as computing professionals is that, in addition to subscribing to a common set of core values, we have special obligations related to our profession, which get stated in profession-specific codes of ethics such as the Software Engineering Code of Ethics and Professional Practice of the IEEE-CS and the ACM. A medical professional is obliged to respond to an accident in their presence; by the same token, given my skills and my belief in doing no harm, I am obliged to quickly find knowledgeable help.

Q: Computing folks see things in black and white (1 and 0). Is this an advantage in understanding ethics, or do we need to know grey?
A: Ethical issues, like everything else, are on a sliding scale of clarity: clearly good; highly likely the right thing to do; I haven't the slightest idea, so I will seek help. Black-and-white (1 and 0) thinking (sometimes called the all-or-nothing fallacy) has two significant problems. First, when confronted with a possible problem we only think of responses at either extreme, and we miss all of the reasonable options in between. Second, this fallacy gets converted into an excuse for doing the wrong thing. Too often we hear, "Do exactly what the boss says or quit/get fired." If we take a moment and step outside of the "0 1" thinking, we might propose a 'good' solution to the boss, one that reduces the problems with the initial request, saves the boss some money, and actually gets you rewarded for your suggestion. Something good comes out of it.

Q: Do you advocate that a software developer quit his/her job if the developer discovers that requirements will lead to software that will cause social harm? How about the harm to his/her career, family finances, etc... What is the ethical balance?
A: As I indicated in response to Q2, I think there is a range of responses. On different occasions I have quietly (making clear that I had no intent to "one up" the person) pointed out to a customer or boss, out of concern for THEM, that doing it this way might open the company to legal suits, and gently suggested that it could be done another way that saved money, kept more accurate records, whatever they liked (and did not cause the harm I was concerned about). On other occasions, simply pointing out the possible harm will be enough. Often, the all-or-nothing approach is assumed prematurely; the person has missed a creative possibility for a win/win situation. Unfortunately, there are sometimes cases that require whistle blowing, because no creative solution can be found, and the professional recognizes that blowing the whistle is the only effective way to protect the public.

Q: Keith, greetings from Keith Olson, formerly of Montana Tech, Butte Montana. Good to hear you again.
A: Great hearing from you, too. I hope all is well. --Keith Miller.

Q: What does the NSA/Snowden event say about professionalism in the field going forward?
A: Snowden has become a negative icon for some, and a hero for others. I taught at NSA and I think most of the people I met are competent computing professionals. My talk was an attempt to get us to focus back on positive professionalism. I can tell you about the thousands of volunteer hours spent by computing professionals trying to improve the quality of what they do and the quality of the people their work affects. If what he did offends you, use your talents on the good side of the force. If you think Snowden did the right thing (or, on balance, more right than wrong), then be grateful for his decisions.

Q: What percentage of developers are "Competent Slumlords"?
A: I have no idea. I do have a level of optimism because most of the people I have worked with really want to do quality, socially positive work. When I use the phrase “slumlord” I mean someone who knowingly tries to skimp on quality to maximize profits.

Q: Regarding certification, what about the ethics of the body that provides the certification? I mean for instance, a certification that is more indoctrination to a series of products or a branded methodology rather than focusing on practical standards?
A: There are many certifications that are vendor certifications, which unfortunately don’t carry any professionalism component.  I think they are training courses in a narrow domain, which may be useful in some employment situations. Clearly, as you have noticed, there is a great deal of difference between being a professional and getting a narrow certification in a particular technology. We often use the word “technician” to designate someone who has facility with a technology, but does not act as a fully differentiated professional.

Q: My experience has been that 80-90% of the developers out there would fall into the "competent slumlord" camp, both knowingly and unknowingly.
A: I have been luckier than you.

Q: What is "justified harm"?
A: I have defined harm as anything that limits and fails to promote core values, so the incarceration of Freddy Krueger (A Nightmare on Elm Street) would be a justified harm to him. (See James Moor on core values, listed on the resources slide.)

Q: Can you comment on the topic of certification? I'm bothered seeing organizations sponsoring certification (PMP, ScrumMaster, etc.) require a $1000-$2000 course before being eligible for the certification test ignoring life experience. How can we resolve this?
A: I don’t think there is a necessary conflict. I was a teacher for a number of years, so I tend to think training courses are useful to those without experience, and also useful for the experienced to gain new knowledge—this is a continual need in our field. I also did software development before there were software engineering courses, so I think we can learn a lot from experience, and I hired people based on legitimate training they had received. Employers like courses because they help address the problem of "inflated" experience descriptions. Of course, whether a course is worth what you paid for it depends on the course.

Q: How do we better celebrate the good, while degrading the bad? Or what is the best approach?
A: Some companies have ethics offices, which unfortunately focus on the bad. I know of other companies that make ethics part of the performance review. Managers sometimes do it on an individual level: “Mary spotted this problem with our release that would have hurt people. Thank you, Mary.” They also make announcements like this in their management groups (but avoid posting them on the web because they fear bad corporate publicity).

Q: While there is indifference by the corporations that fulfill contracts and as long as the harm is not very "visible," highly professional persons stand no chance and will keep losing jobs and promotions.
A: I think even as individuals we can make positive steps. The person choosing the keys on the Therac-25 could have made a difference without putting their job at risk. The person designing the Python interface could have made a difference by using directions in addition to the color references.

Q: "What the customer wants" may be incorrect and create ethical issues, and "what the customer needs" can be different. Thoughts?
A: Yup. This is where the fiduciary model comes into play. Most of my best inside salespeople were those who thanked me for asking whether they needed something different from what they had requested.

Q: From Twitter: If 90% of computing professionals are "good guys" who "do the right thing," is professionalization of computing needed?
A: Clarification: I said 90% of us want to be good guys. I then defined what is involved in computing professionalism, gave some reasons that interfere with our desire, and suggested some ways to address those distracters. Professionalization is one way to address those distracters and to educate those who aren’t aware of them.

Q: The problem is that people who try to test an existing product or service may end up exposing the vulnerability of the systems. Will this be a case of bad guy situation, especially if the press/media put their hands on the critical reviews?
A: If the intention of someone is good, then I hesitate to call them “bad guys.” However, when bad things happen, we think it has ethical significance, and we hope a computing professional can head off the worst harms.

Q: How is "positive" and "negative" societal impact determined? Isn't it possible that what's positive for one population may be negative for another?
A: Yes, it can be. And sometimes there are difficult tradeoffs. We need to make professional judgments, sometimes in consultation with others.

Q: I believe that the "Paternalism" model is fading with the era of open source platforms that facilitate "copy, paste, and tweak it."
A: This is answered in the presentation.

Q: Doesn't the traditional separation of analyst (requirements development) and developer (code production) roles distance both from responsibility for systems as delivered? Yet large projects seem to mandate some silos of this sort. Solution?
A: This is answered in the presentation.

Q: Are all Ethics issues big? When I mention small things like use of office paper supplies, people get annoyed with policy instead of interested. They think they just need to do quality work and the big ethical issues will be resolved in lawsuits.
A: I think significant ethical issues eventually get addressed in law, but our profession moves so fast that the technology outdistances the law. There is discussion about the ethics of using RFID chips in people as trackers. That discussion is overtaken by the tracking function of cell phones, the arrival of Google Glass and Google contact lens.

Q: FMEA can be a very powerful tool for revealing these sorts of issues. How does society best ethically address or follow up on the research now underway about the digital effect on the human brain?
A: There is fascinating work being done on the effect of digital devices on brain development, and there is another body of work that uses digital imaging to explore brain activity relevant to ethical decision making. How to best follow up on this research is a wonderful challenge, one that we haven’t yet figured out.

Q: I am disappointed with the subjective language relied upon without providing structure to the problem. See Christopher Alexander's A Pattern Language and the "Collected Works of Bernard Lonergan" as extreme but useful approaches.
A: Thanks for the pointers.

Q: 20/20 hindsight makes some issues obvious. How do you identify issues that are unanticipated? Is there something that can help?
A: Consider potential stakeholders (using questions on the slide). Look at the Software Engineering Code of Ethics as a "Computing Professional's Conscience" and see if it helps you think of some additional issues.

Q: It's not possible to think of everything. Does this risk additional "analysis paralysis"?
A: One of my points was that we can’t think of everything and should not use that human limitation as a cop-out. Work on what you can identify, and deal effectively with that. That should yield incremental progress in most cases.

Q: Observation about slides: In light of the NSA stuff, are there any lessons learned there? What should people coding systems for the NSA have done?
A: If those people thought there was significant harm in what they were doing, then they would have had difficult choices: refusing to work on the system, requesting changes to reduce the harms, and blowing the whistle would be three possible options. Perhaps several people tried the first two options before Snowden blew the whistle.

Q: I notice from the anecdotal cases, my attention goes immediately to problem-solving that situation. I realize this is just adding more technique and not looking at the broader context, and yet I see this everywhere along with my own automatic tendency.
Q: How do we break the temptation to just add more technology/technique?
A (to both): Sometimes the answer is technical: text directions added to the Python interpreter. Sometimes the technical answers are a disaster; check with the stakeholders.

Q: Awesome presentation, Don!
A: Thank you.

Q: Thank you, I found your presentation informative and provocative (in a good way)!
A: Thank you so much (in a good way).

Q: Thank you, amazing presentation and examples.
A:  Thank you for being part of the event.

Q: So a higher order of care is a bit like defensive and considerate driving (weak example?).
A: You need to add liberal use of the horn to warn and keep others out of trouble:)

Q: I did not hear any mention of UX designers. Would it not make sense (as regards many of the examples in the session) to empower UX designers to work with the programmers and the managers to include consideration of who is impacted, and how and why?
A: The emphasis of UX Design on focusing on the user needs could have major benefits, many with ethical significance. I think the designer will have to have the ethical implications in mind to maximize the effect.

Q: How does one balance harms and benefits? A technological solution might cost jobs but might lead to better accuracy or use of resources, both good things for a different group.
A: LIFE IS NOT EASY. Some people are working on automating ethics, therefore providing “ethical assistants” that are machines. But so far, that work has not produced machines that are as good as human ethical experts.

Q: Is writing software that can be hacked ethical? For example, not encrypting passwords or sensitive information in a database... We know many do it, and it is hard to work on avoiding it...
A: You’re right; it is hard work to “harden” software in order to discourage hackers. However, since getting hacked can have seriously harmful consequences, at least SOME work to discourage hacking is warranted. Exactly how much effort is warranted is, it seems to me, application specific.
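The question mentions storing unencrypted passwords in a database. The standard precaution, not covered in the presentation itself, is never to store plaintext (or reversibly encrypted) passwords at all, but a salted, deliberately slow hash. A minimal sketch using only Python's standard library (the salt size and iteration count here are illustrative assumptions, not a security recommendation):

```python
import hashlib
import secrets
from typing import Optional, Tuple

ITERATIONS = 600_000  # illustrative; tune to your hardware and threat model

def hash_password(password: str, salt: Optional[bytes] = None) -> Tuple[bytes, bytes]:
    """Return (salt, digest); store both, never the plaintext password."""
    if salt is None:
        salt = secrets.token_bytes(16)  # a fresh random salt for each password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time to resist timing attacks."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return secrets.compare_digest(candidate, digest)
```

Because each stored password gets its own random salt, identical passwords produce different digests, which blunts precomputed (rainbow-table) attacks.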

Q: Regarding 20-20 hindsight, do we have a professional responsibility to share our experiences? (e.g., most programmers I know are not members of IEEE/ACM). Or even share experiences within a company (or maybe I've been in dysfunctional companies?).
A: I have been luckier than you: The professional organizations have only a small percentage of computing professionals as members but most of the computing professionals I knew shared their experiences (sometimes even when I didn’t want them to!).

Q: Privacy as an ethical concern?
A: Yes, privacy is an important ethical issue. In fact, privacy is historically one of the first and most persistent issues discussed in the literature of computer ethics scholarship.

Q: The "ethical issues" in this presentation are covered by a field called "UX Design." UX experts are in high demand these days by digital agencies. Why did you not mention UX at all?
A: The emphasis of UX design on focusing on the user needs could have major benefits, many with ethical significance. I think the designer will have to have the ethical implications in mind to maximize the effect.

Q: Should the naiveté of developers be addressed (because a developer may not have experienced or be aware of these issues) by educating the professional early in their professional education or certification (SWEBOK)?
A: Yes, I am trying to do just that and have contributed to the professionalism section of SWEBOK v 3 (just approved).

Q: A single design may not make all customers happy. The software company may have to make a call to build more than one product based on different requirements for different stakeholders. A small company may not be able to afford building multiple products in Rel. 1. I think the order of "Avoid Evil and Do Good" should be reversed (Do Good, and Avoid Evil).
A: THANK YOU—You are right! Great idea. Next time, we’ll do it the way you suggest.

Q: What were/are the ethical issues with releasing software that has security issues?
A: There are lots of things we could say about this. There were serious consequences because this software didn't work well, although it has improved. There are also significant concerns about privacy and security with respect to this system. Overall, whenever people are affected by software (either positively or negatively), we see ethical responsibilities being either fulfilled or not fulfilled.

Q: We celebrate good ethical behavior by allowing the market to reward great UX. This is why Apple is so wealthy; they create better and more empathic user experiences.
A: The emphasis of UX design on focusing on the user needs could have major benefits, many with ethical significance. I think the designer will have to have the ethical implications in mind to maximize the effect.

Q: Airlines report many more "near misses" than accidents to the FAA. Perhaps the software profession should institute such a system in order to get "good" lessons learned.
A: One of the problems with near misses is that they are treated NEGATIVELY (“Boy I almost screwed up,” “He/She was dumb and almost...”). Because they are viewed as purely negative, they don’t get recorded/rewarded. Imagine if we paid programmers for every error they found in their code. (We sort of do this for some software testers.) It is a positive thing that you were smart enough to avoid a problem, but avoidances are hard to record.

Q: Is there a software engineering ethics code?
A: Yes. It is on the resource slide, and it is the only code adopted by both the ACM and the IEEE-CS.

Q: Are there not also ethical questions which we cannot realistically expect an individual to be able to address, rather than an institution?
A: I think some situations require an institution to address them, but the institution sometimes requires an individual or three working to motivate it.

Q: I disagree about that open-source observation. The meritocracy models followed are often extremely paternalistic.
A: Perhaps you are right. We’ll have to think about that.

Q: SWE ethics is sometimes treated as being applicable only within the context of SWE projects (no software, no need to be so "professionally ethical"). In your opinion, how acceptable is this point of view?
A: I think that in spite of all of our labels we are a complex computing profession, as complex as the medical profession, and that ethical responsibility needs to be exercised by everyone involved in it.  I had the privilege of leading the development of the Software Engineering Code of Ethics, but I think that those who worked the hardest on it saw it as a statement of the conscience of computing. Read the preambles and you will probably see that.

Q: I believe humility is a rare characteristic in folks who do computer programming.
A: Of course you are right.

Q: I like the idea that having the courage to raise a concern is a moral issue
A: Yes, and it often does require courage.

Q: How can we know whether proprietary software is ethically right when people cannot see the code?
A: This limited access to code is a problem. After 2002, companies used the proprietary claim to prevent review of the fairness and correctness of voting machine software.

Q: Comment: One of the difficult problems that hasn't been mentioned is how we become aware of the limitations of our competence; how am I supposed to recognize that I need to obtain additional knowledge?
A: One problem is that computing folk love a challenge. Honestly answering a simple question, "Do I have the skill to do that?", will do. Students tend to overestimate their skills, but they recognized their limits when a biology professor requested that they write a genetic mutation tracking program.

Q: Dr. Gotterbarn, very much enjoyed your presentation. I was a student of yours at ETSU and now work for a grocer. We'll have to rethink our checkout process. I do try to use the golden rule, but it's hard when you don't/can't understand some challenges. Outstanding.
A: Thanks for joining us today.

Due to the volume of questions received during the January 23, 2014 ACM Learning Webinar, not all audience questions were answered during the live event. Presenter Don Gotterbarn and Moderator Keith Miller answered the questions listed here offline.



