CS285L Practice Midterm Exam, F12
NAME: Holly Student
Closed book. Show all work on these pages, using backs of pages if needed. Total points = 100, equally divided among numbered problems.

1. Consider expressing an ethical theory as a modern ethical framework. Examples include the utilitarianism of Bentham and Mill expressed as Act Utilitarianism in Quinn, and Kant's ethics expressed as Kantianism in Quinn. Because of the increasing use of computers in all aspects of society and government, we can characterize the procedures of a government group as mandating Act Utilitarianism. Neglect any distributive-justice aspects, and imagine that the decisions of the board look only at utility numbers. (This goes a bit past Quinn, who relies on lists of harms and benefits.) Pretend you are a new member. Your first meeting is next week. The case is to evaluate whether it is a good idea to stop validating the content of the NCIC database (if it is a good idea, stop). The issue is privacy and whether data about individuals has to be accurate. Sometimes false arrests occur because of bad data; sometimes criminals can be caught using data supplied by other agencies (without validation). Today is your training. You are asked to explain the concepts of utilitarianism, as found in the group's written procedures, as they will be applied to the case.

a. What is utility? Utility is the tendency of something to produce happiness or reduce unhappiness for an individual or community (see p. 75, Quinn).

b. Who invented it? Quinn credits Jeremy Bentham and John Stuart Mill with inventing the theory of utilitarianism (see p. 75, Quinn).

c. How can it be used to make a decision that is ethical, using the modern technique called Act Utilitarianism? (one sentence) An action is considered ethical if it increases the total happiness of those affected (see p. 75, Quinn).

d. Suppose there is a list of three harms (H1, H2, H3) that could be consequences of not validating the data; list them and give possible, different utility numbers. Harms (a scale of -1 to -100 was selected; others are possible): H1 -20 (slight harm), H2 -100 (very serious harm), H3 -50.

e. Suppose there is a list of three benefits (B1, B2, B3) that could be consequences of not validating the data; list them and give possible, different utility numbers. Benefits (a scale of 1 to 100 was selected; others are possible): B1 30 (some benefit), B2 100 (great benefit), B3 60.

f. Explain how to aggregate the data on the two lists to determine the utility of the decision (one sentence). To determine the utility of the action, we compare harm to benefit: if the numbers are all positive, we add up the harms, add up the benefits, and express the result as a ratio of harm to benefit, deciding that the action is ethical if the harm is less than the benefit and unethical otherwise; if the numbers include negatives for harms and positives for benefits, we net them out and decide that the action is ethical if the result is positive and unethical otherwise.

g. Perform the aggregation and get a result. Give the result and say what it means. To aggregate the utility numbers given above (negatives and positives) into a decision, we sum the harms (-170), sum the benefits (190), net them out (20), and conclude that the decision not to validate the NCIC data is ethical. (See p. 358, Quinn, for an analysis focusing on stolen automobiles only.)

h. Explain to the "losing" side why they lost (one sentence). We would say that the decision weighed both the factors raised by the FBI and the factors raised by the privacy advocates, and used utilitarianism to determine that the benefits of no longer validating the NCIC data outweighed the harms noted by the privacy advocates, so the change was an ethical one to make on behalf of the entire public.

2. Adapt the above question about theory to apply to Kant. Write the question with subparts only; you do not need to give the answers.

Consider expressing the ethical theory of Kant as a modern ethical framework such as Kantianism in Quinn. Pretend you are a new member. Your first meeting is next week. The case is to evaluate whether it is a good idea to stop validating the content of the NCIC database (if it is a good idea, stop). The issue is privacy and whether data about individuals has to be accurate. Sometimes false arrests occur because of bad data.
Sometimes criminals can be caught using data supplied by other agencies (without validation). Today is your training. You are asked to explain the concepts of Kantianism, as found in the group's written procedures, as they will be applied to the case.

a. What is good will? What is the categorical imperative?

b. Who invented it?

c. How can it be used to make a decision that is ethical, using the modern technique called Kantianism? (one sentence)

d. Suppose there is a list of instances of good will shown by the different people in the scenario for the Justice Department not validating the data; list them and give possibilities and assessments.

e. Suppose there is a list of rules that could be universalized without causing logical inconsistencies with good will for not validating the data; list them and say what happens when they are universalized; also look at whether the different people in the scenario are using other people as means to their ends rather than respecting them.

f. Explain how to aggregate the data on the two lists to determine whether the decision is ethical (one sentence).

g. Perform the aggregation and get a result. Give the result and say what it means.

h. Explain to the "losing" side why they lost (one sentence).

3. Perform the Act Utilitarian assessment. This means you are to apply the above theory of utilitarianism, as expressed in the modern technique Act Utilitarianism, to a specific scenario to get a decision.

a. Draw the decision-maker with the "not validate" input and the decision "ethical" or "not ethical" as output.

input: not validate -----> Act Utilitarian assessor -----> output: ethical or not

b. Label the decision-maker "Act Utilitarian assessor".

c. Perform your analysis. List several harms and several benefits with utility numbers. Aggregate the numbers and write the answer as a harm/benefit ratio.

Harms from not validating NCIC data (see pp. 356-358, Quinn):
H1 - possibility of false arrests - utility -100
H2 - a growing database means a growing amount of erroneous data collected by the government - utility -50
H3 - the government has to pay people damages - utility -80

Benefits from not validating NCIC data (see pp. 356-358, Quinn):
B1 - validation had become impractical; it was taking too much work - utility 50
B2 - the database is larger and more criminals get caught - utility 30
B3 - FBI agents can use their own judgment about the data - utility 10

d. Add some factors to the diagram to make it reflect your decision-making.

input: not validate -----> Act Utilitarian assessor -----> output: ethical or not
Benefits: B1 B2 B3
Harms: H1 H2 H3

e. Explain the decision to the public by writing three sentences (based on applying the Act Utilitarian ethical framework as established by rule of law in our procedures: our decision is ...; the privacy advocates will want to know ...; the Justice Department will want to know ...; thank you). Good afternoon, ladies and gentlemen. The Justice Department has reached a utilitarian decision that it is ethical to stop validating the data it receives for the National Crime Information Center databases; we took into account the benefits that the FBI expressed relative to catching criminals and the harms that the privacy advocates expressed relative to protecting the privacy of each individual, and we calculated the utility of our decision, which tells us that the benefit to society of catching more criminals outweighs the cost of the loss of some individual privacy. Thank you.
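The act-utilitarian aggregation described above (sum the harms, sum the benefits, and net them out; or, with all-positive magnitudes, compare a harm/benefit ratio) can be sketched in a few lines of Python. This is an illustrative sketch of the procedure, not anything from Quinn; the function names are my own, and the demo uses the illustrative utility numbers from question 1's harm and benefit lists.

```python
def act_utilitarian_net(harms, benefits):
    """Net aggregation: harms carry negative utility, benefits positive.
    The action is judged ethical when the net utility is positive."""
    net = sum(harms) + sum(benefits)
    return net, ("ethical" if net > 0 else "not ethical")

def act_utilitarian_ratio(harms, benefits):
    """Ratio aggregation for all-positive magnitudes: the action is
    judged ethical when total harm is less than total benefit."""
    total_harm, total_benefit = sum(harms), sum(benefits)
    verdict = "ethical" if total_harm < total_benefit else "not ethical"
    return total_harm / total_benefit, verdict

# Illustrative utilities from question 1:
harms = [-20, -100, -50]   # H1, H2, H3
benefits = [30, 100, 60]   # B1, B2, B3

net, verdict = act_utilitarian_net(harms, benefits)
print(net, verdict)  # 20 ethical

ratio, verdict2 = act_utilitarian_ratio([abs(h) for h in harms], benefits)
print(round(ratio, 2), verdict2)  # 0.89 ethical
```

Either mode gives the same verdict on the same data; the ratio form just expresses how close the call was (a ratio near 1 means harms nearly offset benefits).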

4. Perform the Kantian assessment.

input: not validate -----> Kantian assessor -----> output: ethical or not

The people in the scenario include several groups. The first is the FBI, which is having trouble keeping up with the privacy law's requirement to validate the data in the National Crime Information Center databases. Most likely the cost of this has risen a great deal over the years as the databases have grown; clearly, a change must be made, and someone came up with the idea of dropping the validating. The second is the Justice Department, which is tasked with figuring out whether the FBI's idea is a good solution. (Normally it would be deciding what is legal, but here Quinn uses the factors to do an ethical assessment; for this question, we'll make it a Kantian assessment.) The third group is the privacy advocates, who are very aware that false arrests do occur, that people are harmed by them, that the government has to pay damages, and that more unvalidated data could well mean more false arrests, harming more people.

Regarding good will, it is clear that all three groups have it. The FBI wants to catch criminals, and to do so on a reasonable budget, and the validating is costing too much. The Justice Department wants to be fair and make an assessment that benefits the public. The privacy advocates consider the rights of individuals, so they are sensitive to the harm of even one citizen.

Regarding a rule to be universalized that is related to the action ("don't validate the data any longer"), we could use: it is unwise to let any component of the work of carrying out a law get too costly without trying to reduce it (a watchdog rule). If everyone used this rule, we would keep the cost of government from skyrocketing in areas that grow more costly in ways the law did not anticipate. Looking at it in terms of using people as means to an end, the FBI is not at fault. A different rule could give a different result; in this case, the decision is judged ethical.
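The Kantian assessor above effectively checks three things: good will on the part of those involved, a related rule that universalizes without logical inconsistency, and that no one is treated merely as a means to an end. As a rough structural sketch (my own structuring for illustration, not Quinn's), the assessment is a conjunction of those three tests:

```python
from dataclasses import dataclass

@dataclass
class KantianCase:
    """One candidate action under Kantian assessment (illustrative structure)."""
    action: str
    all_parties_show_good_will: bool
    rule: str
    rule_universalizes: bool              # no logical inconsistency if everyone follows it
    treats_anyone_merely_as_means: bool

def kantian_assessor(case: KantianCase) -> str:
    """The action is judged ethical only if every Kantian test passes."""
    ok = (case.all_parties_show_good_will
          and case.rule_universalizes
          and not case.treats_anyone_merely_as_means)
    return "ethical" if ok else "not ethical"

# The NCIC scenario as assessed in question 4, under the watchdog rule:
ncic = KantianCase(
    action="stop validating NCIC data",
    all_parties_show_good_will=True,
    rule="watchdog: don't let any component of carrying out a law get too costly",
    rule_universalizes=True,
    treats_anyone_merely_as_means=False,
)
print(kantian_assessor(ncic))  # ethical
```

As the text notes, a different rule could flip the result: a case with rule_universalizes=False (or any party treated merely as a means) makes the assessor return "not ethical".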

Adding more factors:

input: not validate -----> Kantian assessor -----> output: ethical or not
good will: all parties
rule: watchdog; universalizes without conflict? yes

5. As a planet, we seek a good life for everyone. As nations, we seek a good life for citizens. As states, we seek a good life for people. As communities, we seek a good life for residents, including education. As families, we seek a good life for members, including sharing. As individuals, we seek a good life for ourselves. The internet has wrought changes at all levels, and there are feedback effects that result in changes to our social fabric. New products are adopted because we want to use them, but sometimes new products create new problems. State whether you are a techno-optimist, a techno-pessimist, or a don't-know-yet. Then write one paragraph about how you see your moral duty in these changing times, mentioning at least three of the above levels to show your concern for others.

I am a techno-optimist in that I work to bring on new software systems because I believe they have value to the people who pay for them and to society; however, I am well aware that technological change (like any kind of change) always has another side, because someone somewhere will have been benefiting from the old system. My moral duty starts at home: I value the talents I have and I seek to use them productively. I help my children, I help my friends, and I contribute to my community, but I do all these things in a manner that avoids being self-sacrificing or intrusive. On national and international issues, I learn about the issues, I vote, and I follow the laws, while recognizing that some people will not always agree with me. I am currently fascinated by the problem of ethics in computing and the societal change resulting from solutions to that problem; it is fundamentally (to me) a networking issue, so I show my concern by studying it and teaching it.