Module 1: Digital Information and Privacy

Ethics for Computer Scientists and Software Engineers

You will be taking ENGR 482 (Engineering Ethics) at some point. The course is hosted by the Philosophy Department but is oriented toward engineers; its principles span all engineering disciplines. Dr. Ed Harris is an expert on engineering ethics and literally wrote the book on it.

Why do computer scientists have to learn about ethics?

1. Employers require that students understand the ethical implications of design and decision-making (including how to deal with gray areas).
2. There are many interesting related topics, such as intellectual property (e.g., copyrights), privacy, and security.
3. We trade in digital information (from images and music to source code); this raises many questions, such as what can be copied freely.
4. Software has a real impact on people. The software we design and the programs we write have consequences based on how they affect users, as well as others (e.g., in terms of safety, privacy, ...).

In ENGR 482, you will learn about different ethical frameworks.

- Religion/morality: for some people, morals are grounded in their religious beliefs.
- Philosophical: some researchers argue that there are common, innate ideals that most humans instinctively share, such as fairness, reciprocity, and individual rights and dignity.
- Utilitarian: driven by cost-benefit analysis.

In ENGR 482, you will also learn about sources of ethical conflicts (laziness, greed, pressure from a boss, group-think, ...) and techniques for evaluating difficult situations and making ethical decisions.

We live in a world of digital information:

- icons, clipart, photos, videos, music
- scanned, searchable documents
- software
- emails, tweets

Question: What is free (public)? What is protected (copyrighted)? What is private?

What information is public? Images, clip art, icons, music, text, software, engineering designs? Can you post pictures of your friends? Of a famous building like the Empire State Building? What if the Coca-Cola logo is in the background? Do you have to get permission?

- When in doubt, cite your sources; this includes online material (e.g., the URL).
- Suggestion for PowerPoint slides: you can reuse images/icons from Wikimedia Commons (their Creative Commons licenses allow this).
- Reminder: plagiarism is serious. Do not copy material off the web (text, images, code) and present it as your own.

The Legal Side

The concept of Intellectual Property (IP):

- Rationale: a time-limited monopoly to incentivize creativity.
- What counts as IP? Inventions, methods, processes, and use patents (e.g., new uses for old medications); all valuable to employers.
- What about design/style (look-and-feel)? Yes: just ask Apple, which sued Microsoft over graphical OS interfaces and Samsung over smartphone designs.
- However, you can't copyright or patent an idea, only the expression of that idea.

Copyrights vs. patents:

- Copyrights protect an original work of authorship: the expression of an idea, like a book or a song.
- Patents protect inventions, methods for doing/making something, or improvements of such. To qualify, an invention must be:
  1. novel
  2. useful
  3. non-obvious
  Patents also now go to the first to file (a recent policy change in the US).

What is an adequate modification?

- Any derivative work is protected (including translations into other languages, etc.).
- Is there a minimal clip? What about mashups/remixes, or incidental depiction in a photo? This is often subjective; the extent of the clip counts, as does whether it affects profit potential.

The concept of fair use: you can use some material for personal, non-profit, or educational purposes.

- Examples: use the Nike swoosh as a symbol on your personal calendar; make a mix tape; print out a quote from your favorite author as an inspiration; ...
- You can mention a brief quote in a review article.
- You can make personal copies of software, e.g., backups.

Can you patent an algorithm?

- Richard Stallman argues that algorithms are like ideas and shouldn't be patented (and that doing so damages the industry).
- There are major problems with patent trolls, who look for intentional or accidental re-use of methods and try to extort fees.
- The US courts and the USPTO have held that abstract algorithms by themselves cannot be patented. Yet there have been patents for algorithms, such as IDEA (the International Data Encryption Algorithm) and LZW compression, used in GIF images; both patents are now expired.

Copyrighting of software:

- Software doesn't quite fit the definition of either a copyright or a patent.
- You can't copyright an algorithm (it is like an idea), but you can copyright its expression/implementation as a program in a particular language (like a book).
- A program can be viewed as a blueprint for making executables.
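Since LZW comes up above as a famously patented algorithm, it is worth seeing how little code "patentable subject matter" can be. Below is a minimal textbook sketch of LZW in Python; note this is the generic algorithm, not the exact variable-width-code variant used inside GIF files.

```python
def lzw_compress(text):
    """Textbook LZW: emit the dictionary code for the longest known prefix,
    learning a new phrase (prefix + next char) at each step."""
    dictionary = {chr(i): i for i in range(256)}  # start with all single bytes
    current, output = "", []
    for ch in text:
        candidate = current + ch
        if candidate in dictionary:
            current = candidate          # keep extending the match
        else:
            output.append(dictionary[current])
            dictionary[candidate] = len(dictionary)  # learn the new phrase
            current = ch
    if current:
        output.append(dictionary[current])
    return output

def lzw_decompress(codes):
    """Inverse transform: rebuild the same dictionary while reading codes."""
    if not codes:
        return ""
    dictionary = {i: chr(i) for i in range(256)}
    previous = dictionary[codes[0]]
    result = [previous]
    for code in codes[1:]:
        if code in dictionary:
            entry = dictionary[code]
        else:
            # Special case: the code refers to the phrase being defined right now
            entry = previous + previous[0]
        result.append(entry)
        dictionary[len(dictionary)] = previous + entry[0]
        previous = entry
    return "".join(result)
```

On repetitive input the code stream is shorter than the text, which is the whole point: for "TOBEORNOTTOBEORTOBEORNOT" the compressor emits fewer codes than there are characters, and decompression recovers the original exactly.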

- Putting a copyright notice in a piece of code (a source file) is sufficient, though you should register the copyright if you want to protect it. Example notice: Copyright © 2013. Thomas Ioerger. All rights reserved.
- Software is often designed using libraries/components/plugins, so you have to be careful about what can be re-used. See the Google Drive example: fonts are actually licensed as software.

Alternative types of software licensing:

- Public domain/freeware: generally unrestricted; you can even sell it.
- Shareware
- Open source: the GNU FSF (Free Software Foundation) promotes unrestricted sharing of code to benefit society. Copyleft: you can freely use, modify, and re-distribute the code, but only if your software also propagates this policy; this effectively prevents commercial profit.

The Ethical Side

Aspirational ethics: do more than just minimally follow the rules, i.e., go beyond abiding by restrictions and making decisions based on potential liability.

Principles:

- Respect for persons (the Golden Rule: you wouldn't want somebody stealing your ideas).
- Give credit; acknowledge the source (even if it might mean sharing profit).
- When in doubt: cite it, or re-write it in your own words.

Privacy and Security

What information is private?

- medical/financial records
- academic/employment/voting records
- but not Google searches or tweets

Interestingly, the right to privacy is not in the US Constitution or the Bill of Rights, though the Supreme Court has interpreted it as an extension of other rights. Also relevant: FOIA, the Freedom of Information Act.

Emails: caution, they're not as private as you think, especially in employer accounts; you might as well assume they could become public.

Privacy vs. social utility is a tradeoff: there are cases where we justify the collection of private data for the public good.

- Examples: health insurers have to collect info to set premiums; credit card companies have to collect info to decide whether to issue credit.
- Digital info can be aggregated and cross-referenced in large databases, which creates risks to privacy and the potential for identity theft.
- Solutions: fair information practices; de-identification.

The Legal Side

Who is responsible for security? You? Your employer? The IT person who runs the firewall?

- Note: don't share passwords!
- What must you keep password-protected? Social security numbers, credit card numbers, ...
- Security decisions are often driven by liability: legal/financial consequences, and the risk of enabling identity theft.
- Companies have a responsibility to inform affected parties of security breaches. Example: in 2007, hackers stole 45 million credit card numbers from TJ Maxx servers. A more recent example: Target (Nov. 2013), where card numbers were stolen at the point of sale.

The Ethical Side

Privacy and security represent a tradeoff. There are many levels of security with different costs: passwords, firewalls, encryption (is it warranted, and at what bit level?). You could take a utilitarian view and balance costs against inconvenience. The problem is, people often disagree on:

- perceived risk (the probability of being hacked)
- the relative weight/importance of privacy
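The utilitarian view above can be made concrete as an expected-value calculation. Every number below is made up purely for illustration; the point is that the answer hinges on the assumed probability and costs, which is exactly where people disagree.

```python
# Toy utilitarian cost-benefit sketch for a security decision.
# All figures are hypothetical.

def expected_loss(p_breach, cost_of_breach):
    """Expected loss = probability of a breach times its cost."""
    return p_breach * cost_of_breach

# Assumed inputs (the contested part):
p_breach = 0.05              # perceived chance of being hacked this year
cost_of_breach = 2_000_000   # estimated damage: liability, reputation, ...
cost_of_mitigation = 50_000  # e.g., encryption, audits, training
p_breach_after = 0.01        # residual risk after mitigation

baseline = expected_loss(p_breach, cost_of_breach)         # 100,000
mitigated = expected_loss(p_breach_after, cost_of_breach)  # 20,000

# Mitigate if the reduction in expected loss exceeds what mitigation costs:
worth_it = (baseline - mitigated) > cost_of_mitigation
print(worth_it)  # True: an 80,000 reduction for a 50,000 spend
```

Halve the perceived probability of a breach, or weight the privacy harm differently, and the conclusion flips; that sensitivity is the slide's point.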

Aspirational ethics: don't make decisions based only on legal or financial considerations (which tend to emphasize the negative); instead, aspire to protect users' rights. Ethical principles: individuals' dignity and right to privacy.

Ethics in Software Engineering

There are many design decisions in software that can have significant consequences...

Famous software bugs:

- Therac-25 (1985): radiation-therapy equipment; a software bug in the beam-shield controller caused radiation overdoses and multiple deaths.
- AT&T long-distance network (1990): crashed due to a missing break in a switch statement, which caused one switch to send a failure/congestion message to another, making it reset and self-test; this cascaded, crashing other switches.
- The floating-point math co-processor bug in Intel chips.

Responsibility for documenting software, testing it, and fixing bugs: basically, common engineering ethics apply here; it is a shared responsibility of the programmer, the team, the manager, the company, ...

ACM Code of Ethics

The ACM (Association for Computing Machinery), like most professional societies, has a code of ethics.

- It defines professionalism: taking responsibility for your work, keeping informed, honoring laws, confidentiality, privacy, etc.
- It emphasizes the positive aspects of ethical behavior over the negative: what you should do, instead of prohibitions.
- It emphasizes the safety of the public over the interests of the employer. So, for example, if your manager asks you to do something risky or dangerous, such as release code you know has bugs in it, you shouldn't just blindly do it.

1. GENERAL MORAL IMPERATIVES.
1.1 Contribute to society and human well-being.
1.2 Avoid harm to others.
1.3 Be honest and trustworthy.

1.4 Be fair and take action not to discriminate.
1.5 Honor property rights, including copyrights and patents.
1.6 Give proper credit for intellectual property.
1.7 Respect the privacy of others.
1.8 Honor confidentiality.

2. MORE SPECIFIC PROFESSIONAL RESPONSIBILITIES.
2.1 Strive to achieve the highest quality, effectiveness and dignity in both the process and products of professional work.
2.2 Acquire and maintain professional competence.
2.3 Know and respect existing laws pertaining to professional work.
...

3. ORGANIZATIONAL LEADERSHIP IMPERATIVES.
3.2 Manage personnel and resources to design and build information systems that enhance the quality of working life.
3.3 Acknowledge and support proper and authorized uses of an organization's computing and communication resources.
...

Ethics in Interface Design

The design of software must match users' cognitive structures: how people think about a system. We can't go into detail here (take a class on it), but this is a significant aspect of software design that must be considered, especially for safety.

- Consider how to display state info and status indicators so users really understand them.

- Consider how to make actions/options and their effects clear (example: does this action erase data?).

An example: early air-traffic-control systems had a GUI with lights that blinked twice per second to indicate that everything was OK. This had to be changed to once per second, because controllers were getting confused: the human brain interprets a light blinking twice per second as "Alert!"

Case: Autopilot or Human Error?

The following is excerpted from Olson, W.A. (2001). Risks of Cockpit Automation. ing_and_mitigating_risks.pdf

On 24 April 1994, an Airbus 300-600 crashed while on approach to Nagoya, Japan. During the approach the copilot inadvertently engaged the aircraft's go-around mode, which caused the automated systems to attempt to fly away from the ground using the aircraft's pitch trim system, while the pilots attempted to continue the landing approach via input to the elevator. The pilots were unable to determine that the pitch trim input of the autopilot system was causing the difficulties in controlling the aircraft. Additionally, the design of the A300 autopilot (at that time) did not allow the pilots to override the autopilot by use of opposing control stick pressure. Thus, the pilots and automated systems continued to struggle for control, with the aircraft eventually pitching up to near vertical, stalling, and crashing on the approach end of the runway, killing 264 passengers and crew.

This case illustrates that a technological advance (the autopilot) that was supposed to help human operators actually got in the way. It didn't malfunction; the problem stemmed from the crew not fully understanding what it was doing and how that interacted with what they wanted to do.

There are downsides to technology, the Internet, and social networking, which can make us...

- ...less sociable in personal interactions: e.g., children who spend all day playing video games or texting; loss of interpersonal skills and expression; alienation and loneliness (think of online shopping vs. face-to-face).
- ...reliant on and dependent on computers: loss of math skills, grammar/spelling, and memory ("I don't know how my car works anymore, and I can't fix it"); GPS and Google Maps for navigation.
- ...gullible and uncritical:

  "It must be true - I saw it on the Internet." Blaming the technology: "I didn't fix it because the red light wasn't blinking."
- ...desensitized to violence (video games), pornography, and copying.

The design of software features should promote human well-being. What if you can implement something that is bad or disruptive? Examples of abuse:

- disassembly, hacking
- web-bots, spam
- a script that queries a web site so often that it overloads it (e.g., checking every millisecond for a book at the library)
- the 1988 Morris Worm, which exploited a loophole in a Unix daemon to spread from machine to machine. Its author didn't intend it to be mischievous (it was just an experiment), but a mistake in the implementation generated many more copies than intended.

Cases related to hacking: Kevin Mitnick gained remote access to corporate data using phones. He was sentenced to 5 years in prison and forbidden to use computers afterwards. Even though he didn't profit, did his punishment befit his crime, or were the feds unduly trying to make an example of him (perhaps out of ignorance or fear)? See also the 1986 Computer Fraud and Abuse Act.

Ethical Principles

- Don't harm or hinder others.
- Respect for persons: if someone sets up a server, respect its intended use. RSS is an example where the provider intends to stream content to you and has anticipated the load.
- Even if you figure out how somebody implemented something, don't publish it (that violates their rights).
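The library-polling abuse above is easy to avoid with a simple client-side rate limit. The sketch below is a minimal illustration, not a library API; the class name and interval are made up, and the injectable clock exists only so the logic can be tested without actually waiting.

```python
import time

class RateLimiter:
    """Allow at most one request per `interval` seconds.

    The caller wraps each outgoing query in allow(); a False return means
    "too soon: skip or sleep", never "retry in a tight loop".
    """
    def __init__(self, interval, clock=time.monotonic):
        self.interval = interval
        self.clock = clock   # injectable for testing
        self.last = None     # time of the last permitted request

    def allow(self):
        now = self.clock()
        if self.last is None or now - self.last >= self.interval:
            self.last = now
            return True
        return False
```

The design point is that the client, not the server, absorbs the cost of impatience: checking for a library book once an hour serves the user just as well as once a millisecond, without hammering someone else's machine.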

So, in summary...

- Try to make decisions guided by ethical principles.
- Keep in mind the ACM Code of Ethics.
- Consider the consequences of the software you design, on users and others.
- Design software features that promote human well-being.
- Take responsibility for testing your code.
- Respect copyrights, and don't re-use material unless you give due credit.

Food for thought: Google's corporate motto is ...?

Food for thought: Google's corporate motto is "Don't be evil." Why do you think that is? Because they have the power to do some really bad things with their technology and data.
