Towson University Disruptive Technologies Case Study Discussion


Writing

Towson University

Description

Need a 3-4 page case study about the case below. The assignment instructions and another resource are attached.

https://www.mckinsey.com/business-functions/digital-mckinsey/our-insights/disruptive-technologies

Unformatted Attachment Preview

2nd Case Study Assignment

Steps in Ethical Case Analysis
1. Get the facts straight. Review the case. Briefly recap the details of the case at the beginning of your paper.
2. Identify the central stakeholders in the case.
3. Identify the technical/professional problem in the case.
4. Identify the ethical problem or problems in the case.
5. Solve the technical and ethical problems using both technical and ethical standards. Analyze the case from 3 stakeholder perspectives, using 1 ethical principle for each perspective. The main perspective from which you analyze the case should be a business point of view.
6. Anticipatory ethical analysis. Analyze the case from 3 stakeholder perspectives, using 1 of Miller's 5 rules for each perspective. What can you anticipate about the development of your technology? Apply 3 of Keith Miller et al.'s 5 rules to your subject.
7. Will your solution to the problem withstand criticism from the perspectives of both a variety of ethical principles and professionals in your field?
8. What recommendations can you make about the problems in the case based upon your ethical analysis?

When you construct your analysis, remember that this is a 3-5 page assignment. With this length limitation, you will probably only be able to look at the problems in the case from 3 stakeholder perspectives; if you try to analyze every stakeholder perspective, you will probably exceed the length limit.

Grading and Evaluation of Individual Papers

1. What are the objectives of the papers?
A. i. To become sensitized to the ethical issues in engineering and information technology.
   ii. Learn how to analyze a case.
   iii. Learn how to identify the major stakeholders in a case.
   iv. Learn how to identify the technical problems in a case.
   v. Learn how to identify the ethical problems in a case.
   vi. Learn how to apply ethical principles to a case.
   vii. Learn how to make recommendations in a case based upon ethical analysis.
B. a. Critical thinking
   i. Did you identify and focus on the crucial material and facts in the case?
   ii. Did you support the claims you make about the case with facts?
   iii. Did you think about the case from a variety of stakeholder perspectives?
   b. Ethical analysis
   i. Did you identify the central ethical problem(s)?
   ii. Did you conduct a well-thought-out ethical analysis by applying 3 ethical principles?
   iii. Did you justify ethical judgments with accurate facts and ethical principles?
   iv. Did you think about the case from a variety of ethical perspectives?
   v. Did you base your analysis upon ethical principles that cannot be easily criticized?

How will your papers be assessed?

Evaluation of Papers (percentage weights)
A. The Case Recap (10%)
   i. Do you highlight the key points in the case?
      a. Are the facts in the case accurate?
      b. Did you focus on the crucial aspects of the case?
      c. Did you avoid including nonessential or superfluous information?
B. Stakeholders (10%)
   i. Who are the primary and secondary stakeholders in the case?
   ii. Did you clearly identify from which stakeholder perspective you are analyzing the material in the case?
      a. From whose perspective are you analyzing the case?
C. What is the central technical problem? (15%)
   i. How is the central technical problem related to the ethical problems in the case?
D. What is/are the ethical problem(s)? (15%)
   i. What ethical problems do you see in the case?
   ii. What is the central ethical problem?
   iii. Why is this the central ethical problem?
E. What ethical principles apply to the central ethical problem? (30%)
   i. Do you correctly define the ethical principles you use?
   ii. Have you correctly applied the ethical principles you use?
   iii. Do the principles you use withstand obvious criticisms from other ethical perspectives?
F. What are your recommendations? (20%)
   i. Are your recommendations based upon your ethical analysis?
   ii. Do your recommendations link to your ethical analysis?
   iii. Rather than stating the obvious (e.g., this problem could have been easily solved if ...), what do you recommend for similar cases in the future?

IT Ethics

Moral Responsibility for Computing Artifacts: "The Rules"
Keith W. Miller, University of Illinois at Springfield

In March 2010, the Poynter Center for the Study of Ethics and American Institutions held a workshop sponsored by the US National Science Foundation. An interdisciplinary group of philosophers, computer scientists, practitioners, and lawyers gathered to discuss "ethical guidance for the research and application of pervasive and autonomous information technology" (http://poynter.indiana.edu/pait). During the workshop, we started to develop a short statement about the ethics of developing computer systems, and this statement has since evolved into a document about moral responsibility. It's not a wiki, but 50 people, including academics and IT professionals, have already contributed to it. An early working title was "Principles Governing Moral Responsibility for Computing Artifacts," but most of the time, we just call it "The Rules."

The Rules

Currently, the document includes a preamble, some definitions, five rules, and explanations, though it could change before this column is published. Anyone can volunteer to join the Ad Hoc Committee on Responsible Computing and suggest changes for The Rules by emailing the coordinator. As the current coordinator, it's my job to circulate new suggestions and incorporate the accepted changes. Here, I present a condensed version of the latest document, "Moral Responsibility for Computing Artifacts: Five Rules, Version 27." The reason the document has gone through 27 versions is that the signers have taken a great deal of care with the words in The Rules, so the excerpts here don't constitute an official version. Please visit https://edocs.uis.edu/kmill2/www/TheRules/moralResponsibilityForComputerArtifactsV27.pdf for the full version, which includes more detailed explanations for the definitions and rules.

Preamble

As computing artifacts become increasingly complex, some have suggested that such artifacts greatly complicate issues of responsibility. In order to help deal with these complexities, we propose five rules as a normative guide for people who design, develop, deploy, evaluate, or use computing artifacts. Our aim is to reaffirm the importance of moral responsibility for these artifacts, and to encourage individuals and institutions to carefully examine their own responsibilities with respect to computing artifacts. We do not claim that these rules are exhaustive; professionals, individuals, and organizations may choose to take on more responsibility than we describe here.

A Working Definition of "Computing Artifacts." We use "computing artifact" for any artifact that includes an executing computer program.
[This includes] software applications running on a general-purpose computer, programs burned into hardware and embedded in mechanical devices, robots, phones, webbots, toys, and programs distributed across more than one machine... . We [include] software that's commercial, free, open source, recreational, an academic exercise, or a research tool.

A Working Definition of "Moral Responsibility." We use "moral responsibility for computing artifacts" to indicate that people are answerable for their behavior when they produce or use computing artifacts, and that their actions reflect on their character.1 "Moral responsibility" includes an obligation to adhere to reasonable standards of behavior, and to respect others who could be affected by the behavior. We do not address legal liability in this document.

A Working Definition of "Sociotechnical Systems." Each computing artifact should be understood in the context of "sociotechnical systems." A sociotechnical system includes people, relationships between people, other artifacts, physical surroundings, customs, assumptions, procedures, and protocols.2

We acknowledge the importance of sociotechnical systems to the issue of moral responsibility for computing artifacts. For example, a GPS navigator is a computing artifact, but in isolation from the satellites it uses to ascertain location, it can't perform its function... [Ignoring] the sociotechnical systems in which a computing artifact is embedded is folly, [but] including all relevant sociotechnical system components in every discussion of moral responsibility involving a computing artifact will make it impractical to assign meaningful responsibility to the people most directly involved with that specific artifact. To negotiate this tension, we first discuss moral responsibility for computing artifacts in a more focused sense (Rules 1, 2, and 3), and then place this discussion into a broader context that explicitly includes sociotechnical systems (Rules 4 and 5).

Rule 1

The people who design, develop, or deploy a computing artifact are morally responsible for that artifact, and for the foreseeable effects of that artifact. This responsibility is shared with other people who design, develop, deploy, or knowingly use the artifact as part of a sociotechnical system.

Rule 2

The shared responsibility of computing artifacts is not a zero-sum game. The responsibility of an individual is not reduced simply because more people become involved in designing, developing, deploying, or using the artifact. Instead, a person's responsibility includes being answerable for the behaviors of the artifact and for the artifact's effects after deployment, to the degree to which these effects are reasonably foreseeable by that person.

... By using the word "foreseeable," we acknowledge that the people who design, develop, deploy, and use artifacts cannot reasonably be expected to foresee all the effects of the artifacts, for all time. However, implicit in our use of this word is the expectation that people make a good faith effort to predict the uses, misuses, and effects of the deployment, and to monitor these after deployment. Willful ignorance, or cursory thought, is not sufficient to meet the ethical challenges of Rules 1 and 2...

Rule 3

People who knowingly use a particular computing artifact are morally responsible for that use.
The word "knowingly" is problematic in Rule 3, but we think it is, on balance, appropriate. People who "use" a particular computing artifact might not be aware of this use. For example, a driver might not have any knowledge of a computing artifact embedded in the car that records data for analysis in case of a crash. It seems counterintuitive to us to assign moral responsibility to the driver for the use of that artifact.

However, when someone knowingly and intentionally uses a particular computing artifact, that person takes on moral responsibility attached to that use. A dramatic example is when someone launches a cruise missile at an enemy target; a more mundane example is when someone searches the Web for information about a prospective employee. The moral responsibility of a user includes an obligation to learn enough about the computing artifact's effects to make an informed judgment about its use for a particular application. It is not our intent to absolve the users of computing artifacts from moral responsibility if they are willfully ignorant about artifacts or their effects...

Rule 4

People who knowingly design, develop, deploy, or use a computing artifact can do so responsibly only when they make a reasonable effort to take into account the sociotechnical systems in which the artifact is embedded.

Sociotechnical systems are increasingly powerful. If people thoughtlessly produce and adopt these systems, they are, in our opinion, being morally irresponsible. Ignorance is not a justification for harms associated with sociotechnical systems and the computing artifacts embedded in those systems. Security issues that occur when computing artifacts are deployed via the Internet are an example of the interaction of an artifact and a sociotechnical system.

Rule 4 is intended to be a progressively heavy burden. It requires an honest effort to identify and understand relevant systems, commensurate with one's ability and depth of involvement with the artifact and system. Thus, the burden is heavier for those with more expertise and more influence over the artifact's effects and over the system's effects. Those in design and development cannot shift their burden to the users (see Rule 2), and users cannot shift the burden to developers when users' local knowledge is critical to appropriate ethical action...

Rule 5

People who design, develop, deploy, promote, or evaluate a computing artifact should not explicitly or implicitly deceive users about the artifact or its foreseeable effects, or about the sociotechnical systems in which the artifact is embedded.

Morally responsible use of computing artifacts and sociotechnical systems requires reliable information about the artifacts and systems. People who design, develop, deploy, or promote a computing artifact should provide honest, reliable, and understandable information about the artifact, its effects, possible misuses, and, to the extent foreseeable, about the sociotechnical systems in which they think the artifact will be embedded...

Computing Artifacts That Are Not Exceptions to the Rules

No matter how sophisticated computing artifacts become, the rules still apply. For example, if an artifact uses a neural net, and the designers subsequently are surprised by the artifact's effects, the rules hold. If a computing artifact is self-modifying and eventually becomes quite different from the original artifact, the rules still hold.
If a computing artifact is a distributed system or an emerging system, the rules still hold for the people associated with the pieces that are distributed, for the people associated with the organization of the overall system, and for the people responsible for the system from which the new system emerged...

A New Community

On 4 March 2011, at the annual meeting of the Association for Practical and Professional Ethics, a panel of people who helped write The Rules discussed what the document means.

Michael Davis (Illinois Institute of Technology) discussed both similarities and differences between The Rules and codes of ethics. For example, the subject matters are similar, but codes of ethics are usually aimed at organizations; The Rules are aimed at people from different professions. Organizations adopt codes of ethics, but individuals sign up for The Rules.

Chuck Huff (St. Olaf College) compared The Rules to a prophetic voice. Although most of the five rules are stated as explanations, Huff views them as inspirational, not just descriptive. The word "should" appears explicitly in only one rule, but the document itself is in the spirit of "should." By signing up for The Rules, people are embracing the responsibilities given there. The Rules can be viewed as an attempt both to challenge computing professionals and users to embrace their responsibilities and to support those who do.

Ken Pimple (Poynter Institute) emphasized that a community has begun to form around The Rules through the document's cooperative development. The community already contains academics and practitioners, computer scientists and philosophers, and people from nine different countries, united by their willingness to publicly assert their support for The Rules.

I think people are interested in The Rules in part because the pace of technological change and global reach of computing and telecommunications systems are unsettling. We're hungry for more clarity about who is responsible for what in these increasingly important sociotechnical systems, and The Rules are one attempt to reason together about these difficult issues.

I hope you're sufficiently intrigued to read The Rules in their entirety (available at https://edocs.uis.edu/kmill2/www/TheRules), and I invite you to get involved. If you like The Rules, you can sign on by emailing me at miller.keith@uis.edu. If you don't like The Rules, you can also get involved by suggesting changes. Perhaps the next version of The Rules will include some of your ideas about our moral responsibility in designing, developing, and employing computing artifacts.

References

1. M. Davis, "'Ain't No One Here But Us Social Forces': Constructing the Professional Responsibility of Engineers," to be published in Science and Engineering Ethics; www.springerlink.com/content/33338u607x251074/.
2. C. Huff, "Why a Sociotechnical System?" ComputingCases.org, http://computingcases.org/general_tools/sia/socio_tech_system.html.
3. H. Nissenbaum, "Computing and Accountability," Comm. ACM, Jan. 1994, pp. 72-80.

Keith W. Miller is the Schewe Professor in Liberal Arts and Sciences at the University of Illinois at Springfield. His research areas are computer ethics and software testing. Contact him at miller.keith@uis.edu.

Explanation & Answer

Please find the attached document, and feel free to reach out in case you have any questions. Thank you and goodbye...

Running Head: DISRUPTIVE TECHNOLOGIES

Disruptive Technologies
Name
Institutional Affiliation


Disruptive Technologies

The world of technology is constantly advancing, and people are continually creating new technologies to improve existing ones. Disruptive technology is changing how people do business, receive medical treatment, and even move from one place to another. Technologies such as the mobile internet, advanced genomics, and autonomous vehicles can reshape the world and how people live. It is therefore necessary for businesses and governments to prepare for the impacts such technologies will have. The growing number of disruptive technologies should also be accompanied by moral and ethical responsibilities that govern and guide their use.
The first stakeholder in the case is the person developing the technology. These individuals are responsible for creating the program that is used to solve a particular problem. The computer scientists are responsible for identifying a problem and then formulating an artifact to solve it. They are thus responsible for ensuring that the technology operates in a morally responsible manner (Manyika, Chui...

