Internet of Things Summary

User Generated

nyvd966

Other

Description

I need a summary of 8-10 pages (or more, if needed) of the attached papers and the book chapter for my capstone project on the topic of IoT security and machine learning. This will be a starting point; if the supervisor accepts it, we will continue the project together to the end. If possible, please also help me identify the research gap and explain how the project's work contributes to the advancement of human knowledge. From the attached book, only Chapter 14 is needed. APA style.


Unformatted Attachment Preview

Texts in Computer Science

Joseph Migga Kizza
Ethical and Social Issues in the Information Age
Sixth Edition

Series editors: David Gries, Cornell University, Ithaca, NY, USA; Orit Hazzan, Technion—Israel Institute of Technology, Haifa, Israel; Fred B. Schneider, Cornell University, Ithaca, NY, USA

More information about this series at http://www.springer.com/series/3191

Joseph Migga Kizza, University of Tennessee at Chattanooga, Chattanooga, TN, USA

ISSN 1868-0941; ISSN 1868-095X (electronic)
Texts in Computer Science
ISBN 978-3-319-70711-2; ISBN 978-3-319-70712-9 (eBook)
https://doi.org/10.1007/978-3-319-70712-9
Library of Congress Control Number: 2017957974

© Springer International Publishing AG 2017

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, express or implied, with respect to the material contained herein or for any errors or omissions that may have been made.
The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Printed on acid-free paper. This Springer imprint is published by Springer Nature. The registered company is Springer International Publishing AG. The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland.

Preface to the Sixth Edition

In the fifth edition of this book, I made the following statement as an opener to the Preface of that edition: "We may have experienced the fastest growth of technology in the last ten years than ever before." I am going to make the same, but bolder, statement in this sixth edition, because literally nothing has changed to prove otherwise. We may have experienced the fastest growth of technology in the last ten years than ever before. Technology has grown even faster, and become more enchanting and perplexing, since the writing of that statement. Amazing and complex new technological advances have been registered across the broad spectrum of computing and telecommunication, with jaw-dropping developments in networking and internet connectivity creating the long-expected convergence that is leading to new communications and computing platforms. These platforms are reaching into all remote corners of the world, bringing devices big and small, in the house and in the automobile, to talk to each other, and covering more of the poor and less affluent, bringing them closer than ever before to a par with the rich and powerful. Along the way, these new technological developments have created new communities and ecosystems that are themselves evolving, in flux, and difficult to secure, with questionable, if not still-evolving, ethical systems that will take us time to learn. Because of these rapid and unpredictable changes, my readers across the world have been contacting me to revise the contents of a book that has so far withstood the currents for 22 years.
The frequency of new editions of this book is a testimony to these rapid and tremendous technological changes in the fields of computer and telecommunication sciences. First published in 1995, the book has rapidly gone through five editions already, and now we are in the sixth. During that time, we have become more dependent on computer and telecommunication technology than ever before, and computer technology has become ubiquitous as Internet of Things (IoT) technologies blanket the world we live in. Since I started writing on social computing, I have been anticipating a time when we, as individuals and as nations, will become totally dependent on computing technology. That time is almost upon us. Evidence of this is embodied in the rapid convergence of telecommunication, broadcasting, computing, and mobile devices; the miniaturization of these devices; the ever-increasing storage capacity; the speed of computation; and the ease of use. These qualities have been a big pulling force, drawing in millions of new users every day, sometimes even the unwilling. Other appealing features of these devices are the increasing number of applications, or apps, as they are becoming known, and their being wireless and easily portable. Whether small or big, these new gizmos have become a centerpiece of an individual's social and economic activities and the main access point for all information. Individuals aside, computing technology has also become the engine that drives nations' strategic and security infrastructures, controlling power grids, gas and oil storage facilities, transportation, and all forms of national communication, including emergency services. These developments have elevated cyberspace to be among the most crucial economic and security domains of nations.
The US government, and indeed other national governments, has classified cyberspace security and cyber threats as among the most serious economic and national security challenges the nation is facing [1]. This, in particular, makes the country's computer networks a national security priority. What led to this has been a consistent and growing problem of cyber threats. In his article "New Security Flaws Detected in Mobile Devices," Byron Acohido [2] reports on two research reports by Cryptography Research. In one study, Cryptography Research showed how it is possible to eavesdrop on any smartphone or tablet as it is being used to make a purchase, conduct online banking, or access a company's virtual private network. Also, McAfee, an anti-virus software company and a division of Intel, showed ways to remotely hack into Apple iOS, steal secret keys and passwords, and pilfer sensitive data, including call histories, e-mail, and text messages. What is more worrying is the reported fact that the device under attack would not in any way show that an attack is underway. Almost every mobile system user, security expert, and law enforcement official anticipates, as recent attack events have shown, that cybergangs will accelerate attacks as consumers and companies begin to rely more heavily on mobile devices for shopping, banking, and working. Making this even more complicated are the growing geographical sources of such cybergangs, now spanning the whole globe across patchworks of geopolitical laws that are, in reality, unenforceable. So there is an urgent need for broader security awareness, at a global scale, among communities, and for actions by these communities to assist in providing all users the highest level of protection. In April 2009, the US government admitted, after reports that the nation's power grid had been infiltrated by foreign spies, that the grid is vulnerable to cyber attack.
According to reports, there is a pretty strong consensus in the security community that SCADA (Supervisory Control And Data Acquisition), an industrial control system used to monitor and control industrial, infrastructure, or facility-based processes, and similar critical control platforms and systems are not keeping pace with the rapidly growing pace of cyber attacks and the rapid changes in technology. The rising trend in cyber attacks, many of them with lightning speed, affecting millions of computing and mobile devices worldwide and in the process causing billions of dollars in losses to individuals and businesses, may be an indication of how unprepared we are to handle such attacks, not only now but also in the future. It may also be a mark of the poor state of our cyber security posture and policies, and of the lack of will to implement these policies and to develop protocols and build facilities that will diminish the effects of these menacing activities, if not eliminate them altogether. It is encouraging, though, to hear and indeed see that at long last governments and private enterprise around the globe have started to act. There is a growing realization that the next big war may well be fought in cyberspace. One hopes, though, that as governments prepare defensive stances, they also take steps to protect individual citizens. As we look for such protective and defensive strategies, the technological race is picking up speed, with new technologies rendering our efforts, and the existing technologies on which these strategies have been based, obsolete in ever shorter periods.

1. "US 'concerned' over cyber threat." http://news.bbc.co.uk/2/hi/americas/8126668.stm.
2. Byron Acohido, "New Security Flaws Detected in Mobile Devices." http://www.enterprise-securitytoday.com/news/Mobile-Devices-Vulnerable-to-Attack/story.xhtml?story_id=0010003FAI65, April 10, 2012.
All these illustrate the speed at which the computing and telecommunication environments are changing, and demonstrate a need for continuous review of our defensive strategies and, more importantly, a need for a strong ethical framework in our computer, information, and engineering science education. This has been and continues to be the focus of this book, and remains so in this edition.

What is New in this Edition

There have been considerable changes in the contents of the book to bring it in line with the new developments discussed above. In almost every chapter, new content has been added, and we have eliminated what looked outdated and what seemed to be repeated material. Because of the bedrock moral values and the enduring core ethical values of our community, the content in some chapters has not changed since the first edition. Because of the popularity of Issues for Discussion, a series of thought-provoking questions and statements meant to make the reading of chapters more interactive, this series has been kept in this edition. But of more interest to our readers, and in recognition of the rapidly changing computing and telecommunication ecosystem, two new chapters, on Cyberbullying and the Internet of Things (IoT), have been added. The addition of these chapters has been driven by technology advances that have seen an almost ubiquitous use of internet-ready mobile devices, making cyberspace access easy and yet still anonymous, thus creating fertile ground for abuse. Quick advances in technology have also brought the appearance of new and increasingly miniature smart devices, in homes, in cars, and everywhere else, that can self-organize and connect to the internet, creating a new internet interface whose proposals and policies are either incompatible with the current internet protocols, policies, and standards or yet to be defined, debated, and accepted.
This state of the newly defined internet interface is, in its present form, a security quagmire. The discussion throughout the book is candid and intended to ignite students' interest and participation in class discussions of the issues and beyond.

Chapter Overview

The book is divided into eighteen chapters, as follows:

Chapter 1—History of Computing gives an overview of the history of computing science in hardware, software, and networking, covering prehistoric (prior to 1946) computing devices and computing pioneers since the abacus. It also discusses the development of computer crimes and the current social and ethical environment. Further, computer ethics is defined, and the need to study computer ethics is emphasized.

Chapter 2—Morality and the Law defines and examines personal and public morality, identifying their underlying assumptions and values, and examines the law, looking at both conventional and natural law, and the intertwining of morality and the law. Together with Chap. 3, it gives the reader the philosophical framework needed for the remainder of the book.

Chapter 3—Ethics and Ethical Analysis builds upon Chap. 2 in setting up the philosophical framework and analysis tools for the book, discussing moral theories and problems in ethical relativism. Based on these, and in light of the rapid advances in technology, the chapter discusses the moral and ethical premises and their corresponding values in the changing technology arena.

Chapter 4—Ethics and the Professions examines the changing nature of the professions and how they cope with the impact of technology on their fields. An ethical framework for decision making is developed. Professional and ethical responsibilities based on community values and the law are also discussed, and social issues including harassment and discrimination are thoroughly covered.
Chapter 5—Anonymity, Security, Privacy, and Civil Liberties surveys the traditional ethical issues of privacy, security, and anonymity and analyzes how these issues are affected by computer technology. Information gathering, databasing, and civil liberties are also discussed.

Chapter 6—Intellectual Property Rights and Computer Technology discusses the foundations of intellectual property rights and how computer technology has influenced and changed the traditional issues of property rights, in particular intellectual property rights.

Chapter 7—Social Context of Computing considers the three main social issues in computing, namely the digital divide, workplace issues like employee monitoring, and health risks, and how these issues are changing with the changing computer technology.

Chapter 8—Software Issues: Risks and Liabilities revisits property rights, responsibility, and accountability with a focus on computer software. The risks and liabilities associated with software, and risk assessment, are also discussed.

Chapter 9—Computer Crimes surveys the history and examples of computer crimes, their types, their costs to society, and strategies of detection and prevention.

Chapter 10—New Frontiers for Computer Ethics: Artificial Intelligence discusses the new frontiers of ethics in the new intelligent technologies and how these new frontiers are affecting the traditional ethical and social issues.

Chapter 11—New Frontiers for Computer Ethics: Virtualization and Virtual Reality discusses the new developments and consequences of virtualization technology, its implications for our participation, and how the technology informs our behavior based on our traditional moral and ethical values.

Chapter 12—New Frontiers for Computer Ethics: Cyberspace discusses the new frontiers of ethics in cyberspace and the Internet, and how these new frontiers are affecting the traditional ethical and social issues.
Chapter 13—Cyberbullying (New) discusses the growing threat and effects of repeated, deliberate harm or harassment of other people by means of electronic technology, which may include devices and equipment such as cell phones, computers, and tablets, as well as communication tools including social media sites, text messages, chat, and Web sites.

Chapter 14—New Frontiers for Computer Ethics: Internet of Things (IoT) (New) discusses the new frontiers of ethics in the new and developing Internet-user interface whose protocols, policies, and standards are yet to be defined, discussed, and accepted by the scientific and user community. We explore how this new interface has created a security quagmire and how it is affecting our traditional ethical and social systems.

Chapter 15—Ethical, Privacy, and Security Issues in the Online Social Network EcoSystem discusses the new realities of global computer social network ecosystems; global linguistic, cultural, moral, and ethical dynamisms; and their impact on our traditional and cherished moral and ethical systems.

Chapter 16—Ethical, Privacy, and Security Issues in the Mobile Ecosystems begins by presenting the rather frightening and quickly evolving mobile telecommunication and computing technologies, their unprecedented global reach and inclusion, their unparalleled social, financial, and cultural prowess, and their yet-to-be-defined social, moral, and ethical value systems.

Chapter 17—Computer Crime Investigations and Ethics discusses what constitutes digital evidence, the collection and analysis of digital evidence, the chain of custody, the writing of the report, and the possible appearance in court as an expert witness. The ethical implications of these processes, the role of the legal framework, and the absence of an ethical framework are discussed in depth.

Chapter 18—Biometrics Technologies and Ethics starts by discussing the different techniques in access control.
Biometric technologies and techniques are then introduced and contrasted with the other known techniques. Several biometrics and biometric technologies and their ethical implications are discussed.

Audience

This book satisfies the following new curricula standards (http://www.acm.org/education/curricula-recommendations):

Computer Engineering
• CE2016: Computer Engineering Curricula 2016 (English)

Computer Science
• CS2013: Curriculum Guidelines for Undergraduate Programs in Computer Science (English)

Information Systems
• IS2010 Curriculum Update: The Curriculum Guidelines for Undergraduate Degree Programs in Information Systems is complete and approved.

Information Technology
• IT 2008: The Computing Curricula Information Technology Volume is complete and approved.

Software Engineering
• SE2014: Curriculum Guidelines for Undergraduate Degree Programs in Software Engineering

Associate-Degree Computing Curricula
• Associate-Degree Computing Curricula
• Information Technology Competency Model
• Computer Science Transfer
• Computer Engineering Transfer
• Software Engineering Transfer

Kindergarten through 12th Grade
• CSTA K-12 CS Standards, 2011 Edition

These curricula focus on the need for any computer-related undergraduate program to cover the basic cultural, social, legal, and ethical issues inherent in the disciplines of the computing sciences. To do this, students need to:

• understand where the discipline has been, where it is, and where it is heading;
• understand their individual roles in this process, as well as appreciate the philosophical questions, technical problems, and esthetic values that play an important part in the development of the discipline;
• develop the ability to ask serious questions about the social impact of computing and to evaluate proposed answers to those questions;
• be aware of the basic legal rights of software and hardware vendors and users, and appreciate the ethical values that are the basis for those rights.

Students in related disciplines, like computer information and information management systems and library sciences, will also find this book informative. The book is also good for computing sciences practitioners, who must practice the principles embedded in those curricula based on an understanding of:

• the responsibility that they bear and the possible consequences of failure;
• their own limitations as well as the limitations of their tools.

The book is also good for anyone interested in knowing how ethical and social issues like privacy, civil liberties, security, anonymity, and workplace issues like harassment and discrimination are affecting the new computerized environment. In addition, anybody interested in reading about computer networking, social networking, information security, and privacy will also find the book very helpful.

Acknowledgements

I appreciate all the help I received from colleagues who offered ideas and criticism, sometimes harsh, and the suggestions from anonymous reviewers over the years. Special thanks to my dear wife, Dr. Immaculate Kizza, who offered a considerable amount of help in proofreading, constructive ideas, and wonderful support.

Chattanooga, TN, USA
2017
Joseph Migga Kizza

Contents

1 History of Computing
   1.1 Historical Development of Computing and Information Technology
      1.1.1 Before AD 1900
      1.1.2 After AD 1900
      1.1.3 The Development of the Microprocessor
      1.1.4 Historical Development of Computer Software and the Personal Computer (PC)
   1.2 Development of the Internet
   1.3 Development of the World Wide Web
   1.4 The Emergence of Social and Ethical Problems in Computing
      1.4.1 The Emergence of Computer Crimes
      1.4.2 The Present Status: An Uneasy Cyberspace
   1.5 The Case for Computer Ethics Education
      1.5.1 What Is Computer Ethics?
      1.5.2 Why You Should Study Computer Ethics
   References

2 Morality and the Law
   2.1 Introduction
   2.2 Morality
      2.2.1 Moral Theories
      2.2.2 Moral Decision Making
      2.2.3 Moral Codes
      2.2.4 Moral Standards
      2.2.5 Guilt and Conscience
      2.2.6 Morality and Religion
   2.3 Law
      2.3.1 The Natural Law
      2.3.2 Conventional Law
      2.3.3 The Purpose of Law
      2.3.4 The Penal Code
      2.3.5 Morality and the Law
      2.3.6 Issues for Discussion
   2.4 Morality, Etiquettes, and Manners
      2.4.1 Issues for Discussion
   References

3 Ethics and Ethical Analysis
   3.1 Traditional Definition
   3.2 Ethical Theories
      3.2.1 Consequentialism
      3.2.2 Deontology
      3.2.3 Human Nature
      3.2.4 Relativism
      3.2.5 Hedonism
      3.2.6 Emotivism
   3.3 Functional Definition of Ethics
   3.4 Ethical Reasoning and Decision Making
      3.4.1 A Framework for Ethical Decision Making
      3.4.2 Making and Evaluating Ethical Arguments
   3.5 Codes of Ethics
      3.5.1 Preamble
      3.5.2 Objectives of Codes of Ethics
   3.6 Reflections on Computer Ethics
      3.6.1 New Wine in an Old Bottle
   3.7 Technology and Values
      3.7.1 Issues for Discussion
   References

4 Ethics and the Professions
   4.1 Introduction
   4.2 Evolution of Professions
      4.2.1 Origins of Professions
      4.2.2 Requirements of a Professional
      4.2.3 Pillars of Professionalism
   4.3 The Making of an Ethical Professional: Education and Licensing
      4.3.1 Formal Education
      4.3.2 Licensing Authorities
      4.3.3 Professional Codes of Conduct
   4.4 Professional Decision Making and Ethics
      4.4.1 Professional Dilemma in Decision Making
      4.4.2 Guilt and Making Ethical Decisions
   4.5 Professionalism and Ethical Responsibilities
      4.5.1 Whistle-Blowing
      4.5.2 Harassment and Discrimination
      4.5.3 Ethical and Moral Implications
   References

5 Anonymity, Security, Privacy, and Civil Liberties
   5.1 Introduction
   5.2 Anonymity
      5.2.1 Anonymity and the Internet
      5.2.2 Advantages and Disadvantages of Anonymity
      5.2.3 Legal View of Anonymity
   5.3 Security
      5.3.1 Physical Security
      5.3.2 Physical Access Controls
      5.3.3 Information Security Controls
      5.3.4 Operational Security
   5.4 Privacy
      5.4.1 Definition
      5.4.2 Types of Privacy
      5.4.3 Value of Privacy
      5.4.4 Privacy Implications of Database System
      5.4.5 Privacy Violations and Legal Implications
      5.4.6 Privacy Protection and Civil Liberties
   5.5 Ethical and Legal Framework for Information
      5.5.1 Ethics and Privacy
      5.5.2 Ethical and Legal Basis for Privacy Protection
   References

6 Intellectual Property Rights and Computer Technology
   6.1 Definitions
   6.2 Computer Products and Services
   6.3 Foundations of Intellectual Property
      6.3.1 Copyrights
      6.3.2 Patents
      6.3.3 Trade Secrets
      6.3.4 Trademarks
      6.3.5 Personal Identity
   6.4 Ownership
      6.4.1 The Politics of Ownership
      6.4.2 The Psychology of Ownership
   6.5 Intellectual Property Crimes
      6.5.1 Infringement
      6.5.2 The First Sale Doctrine
      6.5.3 The Fair Use Doctrine
   6.6 Protection of Ownership Rights
      6.6.1 Domain of Protection
      6.6.2 Source and Types of Protection
      6.6.3 Duration of Protection
      6.6.4 Strategies of Protection
   6.7 Protecting Computer Software Under the IP
      6.7.1 Software Piracy
      6.7.2 Protection of Software Under Copyright Laws
      6.7.3 Protection of Software Under Patent Laws
      6.7.4 Protection of Software Under Trademarks
      6.7.5 Protection of Software Under Trade Secrets
   6.8 Transnational Issues and Intellectual Property
      6.8.1 Issues for Discussion
   References

7 Social Context of Computing
   7.1 Introduction
   7.2 The Digital Divide
      7.2.1 Access
      7.2.2 Technology
      7.2.3 Humanware (Human Capacity)
      7.2.4 Infrastructure
      7.2.5 Enabling Environments
   7.3 Obstacles to Overcoming the Digital Divide
   7.4 ICT in the Workplace
      7.4.1 The Electronic Office
      7.4.2 Office on Wheels and Wings
      7.4.3 The Virtual Workplace
      7.4.4 The Quiet Revolution: The Growth of Telecommuting
      7.4.5 Employee Social and Ethical Issues
   7.5 Employee Monitoring
      7.5.1 Workplace Privacy and Surveillance
      7.5.2 Electronic Monitoring
   7.6 Workplace, Employee, Health, and Productivity
      7.6.1 Ergonomics
   References

8 Software Issues: Risks and Liabilities
   8.1 Definitions
      8.1.1 Standards
      8.1.2 Reliability
      8.1.3 Security
      8.1.4 Safety
      8.1.5 Quality
      8.1.6 Quality of Service
   8.2 Causes of Software Failures
      8.2.1 Human Factors
      8.2.2 Nature of Software: Complexity
   8.3 Risk
      8.3.1 Risk Assessment and Management
      8.3.2 Risks and Hazards in Workplace Systems
      8.3.3 Historic Examples of Software Risks
   8.4 Consumer Protection
      8.4.1 Buyer and Provider Rights
      8.4.2 A Service Provider–User Contract
      8.4.3 The Tort Option
   8.5 Improving Software Quality
      8.5.1 Techniques for Improving Software Quality
   8.6 Producer Protection
   References
. . . . . . . . . . . . . . . . . . . . . . xvii . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 168 169 169 170 170 170 171 172 173 174 175 181 182 184 185 187 187 188 189 Computer Crimes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9.2 History of Computer Crimes . . . . . . . . . . . . . . . . . . . . . . . . 9.3 Types of Computer Systems Attacks . . . . . . . . . . . . . . . . . 9.3.1 Penetration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9.3.2 Denial of Service . . . . . . . . . . . . . . . . . . . . . . . . . . 9.4 Motives of Computer Crimes . . . . . . . . . . . . . . . . . . . . . . . 9.5 Costs and Social Consequences . . . . . . . . . . . . . . . . . . . . . 9.5.1 Lack of Cost Estimate Model for Cyberspace Attacks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9.5.2 Social and Ethical Consequences . . . . . . . . . . . . . . 9.6 Computer Crime Prevention Strategies . . . . . . . . . . . . . . . . 9.6.1 Protecting Your Computer . . . . . . . . . . . . . . . . . . . 9.6.2 The Computer Criminal . . . . . . . . . . . . . . . . . . . . . 9.6.3 The Innocent Victim . . . . . . . . . . . . . . . . . . . . . . . References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191 192 193 195 195 197 197 199 . . . . . . . . . . . . . . . . . . . . . . . . . . . . 202 203 204 204 205 206 207 10 New Frontiers for Computer Ethics: Artificial Intelligence . . . 10.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10.2 Artificial Intelligence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10.2.1 Advances in Artificial Intelligence . . . . . 
. . . . . . . . 10.2.2 Artificial Intelligence and Ethics . . . . . . . . . . . . . . 10.2.3 The Future Role of Autonomous Agents . . . . . . . . References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 211 212 213 214 215 217 219 9 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xviii Contents 11 New Frontiers for Computer Ethics: Virtualization and Virtual Reality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11.1 Virtualization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11.2 Different Aspects of Virtualization . . . . . . . . . . . . . . . . . . . . . . . 11.3 Virtualization of Computing Resources . . . . . . . . . . . . . . . . . . . 11.3.1 History of Computing Virtualization . . . . . . . . . . . . . . . 11.3.2 Computing Virtualization Terminologies . . . . . . . . . . . . 11.3.3 Types of Computing System Virtualization . . . . . . . . . . 11.3.4 The Benefits of Computing Virtualization . . . . . . . . . . . 11.4 Virtual Reality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11.4.1 Different Types of Virtual Reality . . . . . . . . . . . . . . . . . 11.4.2 Virtualization and Ethics . . . . . . . . . . . . . . . . . . . . . . . . 11.5 Social and Ethical Implication of Virtualization . . . . . . . . . . . . . 11.6 Virtualization Security as an Ethical Imperative . . . . . . . . . . . . . 11.6.1 Hypervisor Security . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11.6.2 Securing Communications Between Desktop and Virtual Environment . . . . . . . . . . . . . . . . . . . . . . . . 11.6.3 Security of Communication Between Virtual Environments. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
11.6.4 Threats and Vulnerabilities Originating from a Virtual Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12 New Frontiers for Computer Ethics: Cyberspace . . 12.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . 12.2 Cyberspace and the Concepts of Telepresence and Immersion . . . . . . . . . . . . . . . . . . . . . . . . . 12.3 Securing Cyberspace . . . . . . . . . . . . . . . . . . . . . 12.3.1 Detecting Attacks in Cyberspace . . . . . 12.3.2 Cyberspace Systems Survivability . . . . 12.4 Intellectual Property Rights in Cyberspace . . . . 12.4.1 Copyrights . . . . . . . . . . . . . . . . . . . . . . 12.4.2 Patents . . . . . . . . . . . . . . . . . . . . . . . . . 12.4.3 Trade Secrets . . . . . . . . . . . . . . . . . . . . 12.4.4 Trademarks . . . . . . . . . . . . . . . . . . . . . 12.4.5 Personal Identity . . . . . . . . . . . . . . . . . 12.5 Regulating and Censoring Cyberspace . . . . . . . 12.6 The Social Value of Cyberspace . . . . . . . . . . . . 12.7 Privacy in Cyberspace . . . . . . . . . . . . . . . . . . . 12.7.1 Privacy Protection . . . . . . . . . . . . . . . . 12.8 Global Cyberethics . . . . . . . . . . . . . . . . . . . . . . 12.9 Cyberspace Lingua Franca . . . . . . . . . . . . . . . . 12.10 Global Cyber Culture . . . . . . . . . . . . . . . . . . . . References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 221 221 222 222 223 224 225 228 231 232 233 235 236 237 237 237 238 239 ............. ............. 241 242 . . . . . . . . . . . . . . . . . . 243 244 244 247 248 251 252 252 253 254 255 257 258 259 259 260 261 263 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Contents xix . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 265 265 266 266 267 267 267 267 268 268 268 268 268 269 269 270 270 271 271 272 272 272 273 273 275 14 Internet of Things (IoT): Growth, Challenges, and Security . . 14.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14.2 Overview and Growth of Internet of Things . . . . . . . . . . . . 14.3 Architecture and Networking of IoT . . . . . . . . . . . . . . . . . . 14.3.1 Architecture and Protocol Stack of IoTs . . . . . . . . 14.3.2 Challenges of Using TCP/IP Architecture Over the IoT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14.4 IoT Governance, Privacy, and Security Challenges. . . . . . . 14.4.1 Governance and Privacy Concerns . . . . . . . . . . . . 14.4.2 Security Challenges . . . . . . . . . . . . . . . . . . . . . . . . 14.4.3 Autonomy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14.4.4 Computational Constraints . . . . . . . . . . . . . . . . . . . 14.4.5 Discovery . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14.4.6 Trust Relationships . . . . . . . . . . . . . . . . . . . . . . . . References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 277 277 279 280 281 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 283 286 286 287 288 289 289 289 291 13 Cyberbullying . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13.1 Definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13.1.1 Legal Definition. . . . . 
. . . . . . . . . . . . . . . . . 13.1.2 Cyberstalking. . . . . . . . . . . . . . . . . . . . . . . . 13.1.3 Cyber Harassment . . . . . . . . . . . . . . . . . . . . 13.2 Types of Cyberbullying . . . . . . . . . . . . . . . . . . . . . . 13.2.1 Harassment . . . . . . . . . . . . . . . . . . . . . . . . . 13.2.2 Flaming . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13.2.3 Exclusion. . . . . . . . . . . . . . . . . . . . . . . . . . . 13.2.4 Outing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13.2.5 Masquerading . . . . . . . . . . . . . . . . . . . . . . . 13.3 Areas of Society Most Affected by Cyberbullying . . 13.3.1 Schools . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13.3.2 Cyberbullying in the Workplace . . . . . . . . . 13.4 Legislation Against Cyberbullying . . . . . . . . . . . . . . 13.4.1 Federal Laws . . . . . . . . . . . . . . . . . . . . . . . . 13.4.2 State Laws . . . . . . . . . . . . . . . . . . . . . . . . . . 13.4.3 International Laws . . . . . . . . . . . . . . . . . . . . 13.5 Effects of Cyberbullying . . . . . . . . . . . . . . . . . . . . . . 13.6 Dealing with Cyberbullying . . . . . . . . . . . . . . . . . . . 13.6.1 Awareness . . . . . . . . . . . . . . . . . . . . . . . . . . 13.6.2 Legislations . . . . . . . . . . . . . . . . . . . . . . . . . 13.6.3 Community Support . . . . . . . . . . . . . . . . . . 13.7 Resources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xx Contents 15 Ethical, Privacy, and Security Issues in the Online Social Network Ecosystems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15.1 Introduction . . . . . . . . . . . . . . . . . . . . . . 
. . . . . . . . . . . . . . . . . . 15.2 Introduction to Computer Networks . . . . . . . . . . . . . . . . . . . . . . 15.2.1 Computer Network Models . . . . . . . . . . . . . . . . . . . . . . 15.2.2 Computer Network Types . . . . . . . . . . . . . . . . . . . . . . . 15.3 Social Networks (SNs) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15.4 Online Social Networks (OSNs) . . . . . . . . . . . . . . . . . . . . . . . . . 15.4.1 Types of Online Social Networks . . . . . . . . . . . . . . . . . 15.4.2 Online Social Networking Services . . . . . . . . . . . . . . . . 15.4.3 The Growth of Online Social Networks . . . . . . . . . . . . 15.5 Ethical and Privacy Issues in Online Social Networks . . . . . . . . 15.5.1 Privacy Issues in OSNs . . . . . . . . . . . . . . . . . . . . . . . . . 15.5.2 Strengthening Privacy in OSNs . . . . . . . . . . . . . . . . . . . 15.5.3 Ethical Issues in Online Social Networks . . . . . . . . . . . 15.6 Security and Crimes in Online Social Networks . . . . . . . . . . . . . 15.6.1 Beware of Ways to Perpetuate Crimes in Online Social Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15.6.2 Defense Against Crimes in Online Social Networks . . . 15.7 Proven Security Protocols and Best Practices in Online Social Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15.7.1 Authentication . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15.7.2 Access Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15.7.3 Legislation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15.7.4 Self-regulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15.7.5 Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15.7.6 Recovery . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
. . . . . . . . 16 Mobile Systems and Their Intractable Social, Ethical and Security Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16.2 Role of Operating Systems in the Growth of the Mobile Ecosystem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16.2.1 Android . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16.2.2 iOS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16.2.3 Windows mOS . . . . . . . . . . . . . . . . . . . . . . . . . . . 16.2.4 BlackBerry mOS . . . . . . . . . . . . . . . . . . . . . . . . . . 16.2.5 Other Smaller mOS . . . . . . . . . . . . . . . . . . . . . . . . 16.3 Ethical and Privacy Issues in Mobile Ecosystems . . . . . . . . 16.4 Security Issues in Mobile Ecosystems . . . . . . . . . . . . . . . . 16.4.1 Application-Based Threats . . . . . . . . . . . . . . . . . . . 16.4.2 Web-Based Threats . . . . . . . . . . . . . . . . . . . . . . . . 16.4.3 Network Threats . . . . . . . . . . . . . . . . . . . . . . . . . . 293 293 293 294 296 297 299 299 300 301 303 303 306 307 310 311 313 317 317 317 318 318 318 318 319 .... .... 321 321 . . . . . . . . . . . 322 323 324 324 324 325 326 327 328 329 330 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Contents 16.4.4 Physical Threats . . . . . . . . . . . . . 16.4.5 Operating System-Based Threats . 16.5 General Mobile Devices Attack Types . . . 16.6 Mitigation of Mobile Devices Attacks . . . 16.6.1 Mobile Device Encryption . . . . . . 16.6.2 Mobile Remote Wiping . . . . . . . . 16.6.3 Mobile Passcode Policy . . . . . . . . 16.7 Users’ Role in Securing Mobile Devices . References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xxi . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
330 330 331 334 335 336 336 337 337 17 Computer Crime Investigations and Ethics . . . . . . . . . . . . . . . . 17.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17.2 Digital Evidence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17.2.1 Looking for Digital Evidence . . . . . . . . . . . . . . . . 17.2.2 Digital Evidence: Previewing and Acquisition . . . . 17.3 Preserving Evidence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17.4 Analysis of Digital Evidence . . . . . . . . . . . . . . . . . . . . . . . 17.4.1 Analyzing Data Files . . . . . . . . . . . . . . . . . . . . . . . 17.4.2 Analysis Based on Operating Systems . . . . . . . . . . 17.4.3 Analysis Based on Digital Media . . . . . . . . . . . . . 17.5 Relevance and Validity of Digital Evidence . . . . . . . . . . . . 17.6 Writing Investigative Reports . . . . . . . . . . . . . . . . . . . . . . . 17.7 Ethical Implications and Responsibilities in Computer Forensic Investigations . . . . . . . . . . . . . . . . . . . . . . . . . . . . References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 339 339 340 341 341 344 344 345 346 347 350 350 .... .... 351 353 18 Biometric Technologies and Ethics. . . . . . . . . . 18.1 Introduction and Definitions . . . . . . . . . . . 18.1.1 Definitions . . . . . . . . . . . . . . . . . . 18.2 The Biometric Authentication Process . . . 18.3 Biometric System Components . . . . . . . . . 18.3.1 Data Acquisition . . . . . . . . . . . . . 18.3.2 Enrollments . . . . . . . . . . . . . . . . . 18.3.3 Signal Processing . . . . . . . . . . . . 18.3.4 Decision Policy . . . . . . . . . . . . . . 18.4 Types of Biometric Technologies . . . . . . . 18.4.1 Finger Biometrics . . . . . . . . . . . . 18.4.2 Hand Geometry . . . . . . . . . . . . . . 18.4.3 Face Biometrics . . . . . . . . . . . 
. . . 18.4.4 Voice Biometrics . . . . . . . . . . . . . 18.4.5 Handwriting Analysis . . . . . . . . . 18.4.6 Iris Biometrics . . . . . . . . . . . . . . . 18.4.7 Retina . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 355 356 357 358 359 359 359 360 360 360 360 363 363 364 365 365 366 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xxii Contents 18.5 Ethical Implications of Biometric Technologies 18.5.1 Issues for Discussion . . . . . . . . . . . . . . 18.6 The Future of Biometrics . . . . . . . . . . . . . . . . . References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 366 367 367 369 Appendix A: The Digital Millennium Copyright Act . . . . . . . . . . . . . . . . 371 Appendix B: The Federal False Claims Act . . . . . . . . . . . . . . . . . . . . . . . 383 Appendix C: Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 405 Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 409 1 History of Computing Learning Objectives After reading this chapter, the reader should be able to 1. 
Learn about the contributions of several pioneers in the computing field
2. Compare life before and after the advent of personal computers and the Internet
3. Identify significant trends in the history of the computing field

1.1 Historical Development of Computing and Information Technology

1.1.1 Before AD 1900

From time immemorial, human beings have tried to make their lives easier and worth living through the invention of gadgets. The invention of the computer, and therefore the history of computing, has followed the same track. The timeline of the development of computing stretches back like the recorded history of humanity itself. Besides tools that make life easier, human beings have always been fascinated with numbers, so it should be no surprise that the first utility tools recorded in history dealt with numbers. For example, it is believed that the first prime numbers were recorded on animal bones and rocks, the only available and durable storage devices of the time, between 20,000 BC and 30,000 BC [1]. By 1800 BC, the first place-value number system was in place.

Merchants trading goods needed a device to quickly calculate what they bought and sold, and their gains and losses. This need led, between 1000 BC and 500 BC, to the invention of the abacus, the device many believe was the mother of the digital computer as we know it today. Without performing actual calculations, the abacus helps its user keep track of the results of calculations he or she has done mentally. Zero and negative numbers were first used between 300 BC and 500 AD.

The period between 1500 and 1900 saw a surge of activity in the development of computing devices.
Many of these developments were driven by the commerce of the time. In 1500, the computing community got a boost when Leonardo da Vinci invented a mechanical calculator. This was followed by the invention of the slide rule in 1621. Leonardo da Vinci's mechanical calculator was followed by Wilhelm Schickard's mechanical calculator in 1625 and by Blaise Pascal's arithmetic machine 15 years later. The major breakthrough in speed came in 1800 with the invention of the punched card by Joseph Marie Jacquard, a French silk weaver [1]. Jacquard's punched card revolutionized computing in the sense that it quickly spread to other fields, where it was used not only to speed up computation but also to store information.

The period after 1830 was an exciting one in the history of computing because it brought a run of history-making inventions, starting with Charles Babbage's analytical engine in 1830 and Georg and Edvard Scheutz's difference engine. Within a decade, these were followed by one of the milestone inventions in computing and mathematics: George Boole's development of Boolean algebra. Boolean algebra opened the fields of mathematics, engineering, and computing to new frontiers in logic, where the possibilities were boundless. Sir Charles Wheatstone's invention of paper tape to store information in 1857 created new excitement in the computing community of the time. With paper tape, huge amounts of data could be fed into the computing device, and similar quantities could be stored. This invention brought computing to a new level and into a new era.

From the mid-1850s through the turn of the century, computing made enormous progress with various inventions, including the logic machine by William Stanley Jevons in 1869, the first keyboard by Christopher Latham Sholes around 1874, and the rectangular logic diagrams by Allan Marquand in 1881.
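Boolean algebra later became the mathematical language of switching circuits: two switches wired in series realize AND, and two wired in parallel realize OR. A minimal sketch in Python (the function names are illustrative, not part of any historical notation):

```python
# Series wiring realizes logical AND; parallel wiring realizes logical OR.
# This correspondence is what let Boolean algebra describe switch circuits.

def series(a: bool, b: bool) -> bool:
    """Current flows only if both switches are closed (AND)."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Current flows if either switch is closed (OR)."""
    return a or b

# De Morgan's law holds for every possible switch setting:
for a in (False, True):
    for b in (False, True):
        assert (not series(a, b)) == parallel(not a, not b)
```

The same correspondence, extended by Claude Shannon in the late 1930s, underlies the switching algebras mentioned below in connection with Venn diagrams.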
Starting around 1890, a burst of major inventions similar to those of the 1850s began all over again. In 1886, Charles Peirce first linked Boolean algebra to circuits based on switches, a major breakthrough in mathematics, engineering, and computing science. In 1890, John Venn created the Venn diagrams now used extensively in switching algebras in both hardware and software development. Finally, in 1890, Herman Hollerith invented the tabulating machine. Hollerith's invention utilized Jacquard's punched card to read the presence or absence of holes. The data read were collated using an automatic electrical tabulating machine with a large number of clocklike counters that summed up and accumulated the results in a number of selected categories.

1.1.2 After AD 1900

The inventions before AD 1900 were all crucial building blocks of the computing industry. The period created a child, but the child did not grow until the second period of development that started around the turn of the twentieth century. The century began with a major milestone in computing history: the invention of the vacuum tube by John Ambrose Fleming. This was a major development, as the vacuum tube played a central role in computing for the next half-century; all digital computers of that period ran on vacuum tubes. The next 20 years saw a variety of inventions, including the triode by Lee de Forest in 1906. Another major milestone was recorded during this period, although it did not come into full use for some time: the year 1926 saw the invention of the first semiconductor transistor, which would come to dominate the computing industry from that point forward. Many smaller and less significant inventions were made during the next 10 years, including the complex number calculator by George Robert Stibitz in 1937.
That same year also saw another key advance in the history of computing. The invention of the Turing machine by Alan Turing in 1937 was as revolutionary as it was exciting. Turing, an English mathematician, showed with this abstract computer that some problems do not lend themselves to algorithmic representation and are therefore not computable. This was a major development in computing. Seven years later, Turing would work on the design of COLOSSUS, one of the first working programmable digital computers.

Two years after Turing, in 1939, the world saw the first digital computer, developed by John Vincent Atanasoff, a lecturer at Iowa State College (now University). Atanasoff's computer was the first special-purpose electronic digital computer. Working with his graduate assistant Clifford Berry, Atanasoff designed a device that used capacitors to store electric charge representing the binary digits 0 and 1 used by the machine in calculations, a major breakthrough in computing history [1]. Both input and output data were on punched cards; Atanasoff's key contribution was a storage representation for intermediate data inside the machine while it performed calculations, before the results were output on punched cards and tape. There is doubt, however, whether Atanasoff's model ever worked.

Around the same time Atanasoff and Berry were working on their model in 1939, Howard Aiken, a graduate of Harvard University, was developing the first large-scale automatic digital computer. Aiken's computer came to be known as the Harvard Mark I (also known as the IBM Automatic Sequence Controlled Calculator, ASCC) [1].

The following decade saw the development of an actual working model of the digital computer as we know it today. In 1943, Alan Turing, working as a cryptographer, constructed the COLOSSUS, considered by many the world's earliest working programmable electronic digital computer.
The COLOSSUS, designed to break the German Lorenz cipher, used about 1,800 vacuum tubes to execute a variety of routines.

Around the time COLOSSUS was being developed by Turing, the team of John William Mauchly and J. Presper Eckert, Jr., was working at the University of Pennsylvania to develop another vacuum tube-based general-purpose electronic digital computer. Their model, named the electronic numerical integrator and computer (ENIAC), was 10 ft high, weighed 30 t, occupied 1,000 ft², and used about 70,000 resistors, 10,000 capacitors, 6,000 switches, and 18,000 vacuum tubes [1, 2]. After ENIAC went into use, the team encountered a number of problems, the main one being that it had no internal program memory: it was hardwired and had to be reprogrammed by setting switches and diodes. This problem had to be addressed in the next model.

From 1944 through 1952, the team developed a new computer called the electronic discrete variable automatic computer (EDVAC). This is believed to be the first truly general-purpose digital computer. EDVAC was a stored-program computer with internal read–write memory to store program instructions. The stored-program concept gave the device the capability to branch from the current program instruction under execution to alternative instruction sequences elsewhere in the stored program. When it was completed in 1956, EDVAC was still a massive machine, with 4,000 vacuum tubes and 10,000 crystal diodes.

Although most of these activities were taking place in the USA, there were efforts in other countries as well. For example, around the time EDVAC was being developed, an experiment based on the same stored-program concept was being conducted at the University of Manchester in the UK. By 1948, the Manchester team had produced a machine working with 32 words of memory and a five-instruction set.
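The stored-program idea, instructions kept in the same read–write memory as data, with the ability to branch, can be sketched as a toy machine. The instruction set and program below are invented for illustration; they are not the actual instructions of EDVAC or the Manchester machine:

```python
# A toy stored-program machine: instructions and data share one memory
# list, and a conditional jump (JNZ) rewrites the program counter,
# giving the branching ability the stored-program concept made possible.

def run(memory):
    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        op, arg = memory[pc]
        if op == "LOADI":               # load an immediate value
            acc = arg
        elif op == "LOADM":             # load from a memory cell
            acc = memory[arg]
        elif op == "ADDM":              # add a memory cell to the accumulator
            acc += memory[arg]
        elif op == "STOREM":            # store the accumulator into memory
            memory[arg] = acc
        elif op == "JNZ":               # branch if the accumulator is nonzero
            if acc != 0:
                pc = arg
                continue
        elif op == "HALT":
            return acc
        pc += 1

# Program: compute 4 * 3 by repeated addition. Cells 0-10 hold
# instructions; cells 11-14 hold data (counter, result, addend, -1).
program = [
    ("LOADI", 0), ("STOREM", 12),                  # result = 0
    ("LOADM", 12), ("ADDM", 13), ("STOREM", 12),   # result += 4
    ("LOADM", 11), ("ADDM", 14), ("STOREM", 11),   # counter -= 1
    ("JNZ", 2),                                    # loop while counter != 0
    ("LOADM", 12), ("HALT", 0),
    3, 0, 4, -1,
]
result = run(program)
```

Because the loop's jump target is itself just a memory address, the same mechanism lets a program branch anywhere in the stored program, exactly the capability the text attributes to EDVAC.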
Also in England, at Cambridge University, the electronic delay storage automatic calculator, EDSAC, was produced in 1949. In 1951, the universal automatic computer, UNIVAC I, became the first commercially available computer. From that point, the general-purpose computer took on a momentum of its own, becoming bigger and more powerful. Companies sprang up in both the USA and Europe to manufacture these wonder machines. Among the leaders were International Business Machines (IBM), Honeywell, and Control Data Corporation (CDC) in America and International Computers Limited (ICL) in England.

These companies, and a number of others, built what came to be known as the mainframe: a huge computer that consisted of a 4–5 ft by 8 ft tape drive, a huge central processing unit, a huge printer, several huge fixed disks, a large card reader, and a paper punch. These components usually filled a large room or two. Because these computers were big, expensive, and difficult to use, computer users could only use them through an operator. The operator fed jobs to the computer via a card or tape reader; the jobs were submitted to the card reader as decks of punched cards. Because these computers were big, expensive, and, as we have seen, difficult to use, only large companies and institutions were able to use them.

Around the mid- to late 1960s, a movement to make computers less expensive and more affordable started gathering momentum. This movement led to a number of developments. First, it led to the manufacture of a less expensive and smaller computer: the medium-range computer commonly referred to as a minicomputer. Second, it started a mode of computing that later led to networking: time-sharing, where one computer could be used by a number of users who would remotely connect to the mainframe.
Third and most important, between 1971 and 1976, it led to a milestone in the history of computing: the development of the first microprocessor. A microprocessor is an integrated circuit with many transistors on a single chip. Before the birth of the microprocessor, computer technology had developed to a point where vacuum tubes and diodes were no longer used; computers were constructed from thousands of transistors. The demand for more powerful computers necessitated machines with many thousands more transistors, but it was not possible at the time simply to pack in more transistors and create a working, more powerful computer. A way forward had to be found.

1.1.3 The Development of the Microprocessor

That way was found by Ted Hoff, an engineer at Intel. Hoff designed the world's first microprocessor, the 4004. The last 4 in 4004 indicated that the device had a 4-bit data path. The 4004 microprocessor was a four-chip system consisting of a 256-byte ROM, a 32-bit RAM, a 4-bit data path, and a 10-bit shift register. It used 2,300 transistors to execute 60,000 operations per second, a top speed at the time [3]. The development of the first microprocessor caught the world off guard. Even Busicom, the company that had commissioned Hoff, did not understand the potential of the 4004, so it asked him to design the 12-chip set it had originally wanted [3].

In 1972, Intel introduced the 8008, an 8-bit microprocessor based on the 4004. The 8008 used 3,300 transistors and was the first microprocessor supported by a compiler, a system program that translates programs written in a high-level language into machine code; the 8008's compiler was for the language PL/M. Both the 4004 and the 8008 were application-specific microprocessors. The truly general-purpose microprocessor came out in 1974: the 8080, an 8-bit device with 4,500 transistors, performing an astonishing 200,000 operations per second.
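A data-path width caps what a register can carry in one step: on a 4-bit device such as the 4004, arithmetic results wrap around modulo 2^4, and wider values must be handled in several 4-bit pieces. A small illustrative sketch:

```python
# 4-bit arithmetic: only the low 4 bits of a result survive, so the
# carry out of the top bit is lost and values wrap modulo 16.
WIDTH = 4
MASK = (1 << WIDTH) - 1        # 0b1111 == 15

def add_nbit(a: int, b: int) -> int:
    """Add two values as a WIDTH-bit register would, discarding overflow."""
    return (a + b) & MASK

assert add_nbit(9, 9) == 2     # 18 mod 16
assert add_nbit(15, 1) == 0    # overflow wraps to zero
```

Moving from a 4-bit to an 8-bit data path, as the 8008 and 8080 did, doubles what a single operation can carry, one reason wider data paths translated into faster machines in practice.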
From 1974 forward, the development of microprocessors exploded as companies like Motorola developed the 6800 in 1974, MOS Technology developed the 6502 in 1975, and Zilog developed the Z80 in 1976. Since then, many new companies have sprung up, and the speed, transistor density, and functionality of microprocessors have been on the rise.

1.1.4 Historical Development of Computer Software and the Personal Computer (PC)

Up until the mid-1970s, the development of computing science was led by hardware. Computers were first designed, and then software was designed to fit the hardware. The development of software to run the computers was in the hands of the companies that designed the hardware. The break from this routine came from two fronts: In 1976, the Apple I and Apple II microcomputers were unveiled, and in 1981, IBM joined the PC wars. These two developments started a new industry, the personal computing industry. Perhaps the PC industry would not be the way it is today were it not for the development of the personal computer operating system (OS). It involved three players: IBM; Gary Kildall, the developer of CP/M, which many believe to be the first PC operating system; and Bill Gates, the developer of the disk operating system (DOS). The story behind these players, part legend, is the story of the beginning of the PC. The legend has it that when IBM developed the personal computer based on Intel's 8088 microprocessor, in 1981, it needed an operating system. It is alleged that IBM approached both Kildall and Gates. However, Kildall was out flying and failed to attend to IBM's request before Gates did [2, 4]. Gates developed the first DOS and a version of the BASIC programming language for IBM, and the rest is history. Two dominant brands of chips for the first PCs were Intel and Motorola.
The IBM/DOS combination, which later led to the Windows brand, prevailed over Apple's Motorola-based model because of Intel's policy of backward compatibility—that is, developing new products based on the previous versions—which Apple never embraced. Microsoft took the lead over other software developers by creating an operating system and application software for both standards. It has since gained an overwhelming dominance of the marketplace, although Apple has taken the lead with the coming of mobile technology.

1.2 Development of the Internet

The Internet, a global network of computers, owes its development to the invention of four technologies: telegraph, telephone, radio, and computers. History has it that the Internet originated from the early work of J.C.R. Licklider of the Massachusetts Institute of Technology (MIT) on "galactic networks." Licklider conceptualized a globally interconnected set of computers with communication channels between them through which programs and data could be accessed quickly by any computer from any computer [5, 6]. This networking concept envisioned by Licklider would support communication between network nodes using a concept of packets instead of circuits, thus enabling computers to talk to one another. Licklider left MIT to head the computer research program at the Department of Defense's Defense Advanced Research Projects Agency (DARPA) in 1962. A year before, at MIT, researcher Leonard Kleinrock had written what is believed to be the first published work on packet-switching theory [5]. This work created the momentum for the concept of a packet-switching network. However, it was not the only work on the concept. There were two additional independent projects on this same topic: that of Donald Davies and Roger Scantlebury at the British National Physical Laboratory (NPL), which later was credited with coining the term "packet," and that of Paul Baran at RAND.
In 1965, Lawrence Roberts at MIT, who had been collaborating with Licklider, and Thomas Merrill connected the TX-2 computer in Boston to the Q-32 computer in Los Angeles with a low-speed dial-up telephone line. This test experiment created the first working wide area network (WAN). This experiment opened up the doors to all computer network communications as it is known today. In 1966, Roberts left MIT for DARPA to develop the computer network concept, publishing the first plan for ARPANET in 1967 [5, 7]. In 1968, a go-ahead was given by DARPA for the development of the packet switches called interface message processors (IMPs). As the team, led by Frank Heart and including Bob Kahn, developed the IMP, a team consisting of Roberts and Howard Frank designed the network topology and economics. The network measurement system was created by Kleinrock and his team [5, 7]. The work of these teams led to the testing of the first IMP at UCLA in 1969, connected to a second node at the Stanford Research Institute (SRI). After these tests, more nodes were added to ARPANET, and by the end of 1969, four nodes formed the ARPANET [5]. From this point on, the Internet started to grow. However, more work was needed to incorporate the host-to-host protocol into ARPANET. The first host-to-host protocol, called the network control protocol (NCP), was developed by the Network Working Group (NWG) in 1970. But NCP did not have "the ability to address networks further downstream than a destination IMP on the ARPANET" [5, 7]. Kahn then developed what later became the Transmission Control Protocol/Internet Protocol (TCP/IP). The first day of January 1983 was the transition day from NCP to TCP/IP. By this time, ARPANET was being used by a significant number of users, both military and nonmilitary.
As the number of nodes increased, more universities joined the exclusive club, and ARPANET became not only a research facilitator but also a free, federally funded postal system of electronic mail. In 1984, the US National Science Foundation (NSF) joined ARPANET in starting its own network, code-named NSFNET. NSFNET set a new pace in nodes, bandwidth, speed, and upgrades. This NSF-funded network brought the Internet within the reach of many universities throughout the country, as well as around the world, which would not otherwise have been able to afford the costs, and many government agencies joined in. At this point, other countries and regions were establishing their own networks. With so much success and fanfare, ARPANET ceased to exist in 1989. As the number of nodes on the Internet climbed into the hundreds of thousands worldwide, the role of sponsoring agencies like ARPA and NSF became more and more marginalized. Eventually, in 1994, NSF also ceased its support of the Internet. The Internet by now needed no helping hand since it had assumed a momentum of its own.

1.3 Development of the World Wide Web

The World Wide Web, as we know it today, had its humble beginning in concepts contained in Tim Berners-Lee's 1989 proposal to physicists calling for comments. Berners-Lee, a physicist-researcher at the European high-energy particle physics lab—the Conseil Européen pour la Recherche Nucléaire (CERN), Switzerland—wrote the proposal, called HyperText and CERN, to enable collaboration between physicists and other researchers in the high-energy physics research community. Three new technologies were incorporated. They were (1) HyperText Markup Language (HTML), based on hypertext concepts, to be used to write Web documents; (2) HyperText Transfer Protocol (HTTP), a protocol to be used to transmit Web pages between hosts; and (3) a Web browser client software program to receive and interpret data and display results.
His proposal also included a very important concept for the user interface. This browser-supported interface was based on the concept that it would be consistent across all types of computer platforms to enable users to access information from any computer. The line-mode interface was developed and named at CERN in late 1989. It came to be known as the World Wide Web, or WWW [6]. By 1991, the concept developed only 2 years earlier was put into practice on a limited network at CERN. From the central computer at CERN with only a few Web pages, the number of servers started to grow, from the only one at CERN in 1991 to 50 worldwide by 1992, to 720,000 by 1999, and to over 24 million by 2001 [6]. In the USA, in 1993, Marc Andreessen, a student at the University of Illinois at Urbana-Champaign, and his team, while working for the National Center for Supercomputing Applications (NCSA), developed another graphical user interface browser, which they named Mosaic. The graphical user interface (GUI) popularized the Web and fueled the growth of the World Wide Web to bring it to the point where it is today.

1.4 The Emergence of Social and Ethical Problems in Computing

1.4.1 The Emergence of Computer Crimes

The known history of computer crimes is not as old as computing is. One can perhaps say that the history of computer crimes started with the invention of the computer virus. Thinking along these lines, therefore, we will track the development of the computer virus. The term virus is a Latin word which means poison. For generations, even before the birth of modern medicine, the term had remained mostly in medical circles, meaning a foreign agent injecting itself into a living body, feeding on it to grow and multiply.
As it reproduces itself in the new environment, it spreads throughout the victim's body, slowly disabling the body's natural resistance to foreign objects, weakening the body's ability to perform needed life functions, and eventually causing serious, sometimes fatal, effects to the body. A computer virus, defined as a self-propagating computer program designed to alter or destroy a computer system resource, follows almost the same pattern, but instead of using a living body, it uses software to attach itself, grow, reproduce, and spread in the new environment. As it spreads in the new environment, it attacks major system resources that include the surrogate software itself, data, and sometimes hardware, weakening the capacity of these resources to perform the needed functions. Eventually, it brings the system down. The word "virus" was first assigned a nonbiological meaning in the 1972 science fiction stories about the G.O.D. machine, which were compiled in the book When HARLIE Was One by David Gerrold (Ballantine Books, New York, 1972). In the book, according to Karen Forcht, the term was first used to describe a piece of unwanted computer code [8]. Later association of the term with a real-world computer program was made by Fred Cohen, then a graduate student at the University of Southern California. Cohen first presented his ideas to a graduate seminar class on information security in 1983. His seminar advisor, Len Adleman, was the first to assign the term "virus" to Cohen's concept. During his student days at the University of Southern California, Cohen did more theoretical research and practical experiments regarding viral-type programs. As part of these experiments, Cohen wrote five programs, actually viruses, to run on a VAX 11/750 running Unix—not to alter or destroy any computer resources but for class demonstration.
During the demonstration, each virus obtained full control of the system within an hour [8]. From that simple beginning, computer viruses, and hence computer crimes, have been on the rise. To many, the growth of the Internet, together with massive news coverage of virus incidents, has caused an explosion of all types of computer viruses [9].

1.4.2 The Present Status: An Uneasy Cyberspace

As the level of computer crimes increases on the one hand and our reliance and dependence on computer and telecommunications technology increase on the other, we are becoming more and more susceptible and exposed to cyberspace evils and insecurity. In addition, all critical components of the national infrastructure such as telecommunication, electrical power grids, gas and oil storage, water supply systems, banking and finance, transportation, and emergency services that include medical, police, fire, and rescue—all of which are connected to cyberspace in some form—are becoming unreliable and vulnerable as well. This makes cyberspace an important security concern not only to those in government and those charged with the security of the nation but to all of us, for our personal individual security and well-being, because the potential for a cyberspace attack, a kind of "cyber Pearl Harbor," is high. If the recent trend in cyber attacks is any indication, we are in for an avalanche of cyber vandalism as society becomes more dependent on computer networks and as more people jump on the cyber train. The rate of cyber vandalism, both reported and unreported, is on the rise. This rise is an indication of the poor state of our cyberspace security and the vulnerability of all cyberspace resources. Yet, there are no signs on the horizon to indicate a slowdown in these acts. Indeed, all predictions are that they are likely to continue because of the following reasons [10]:

• Cyberspace infrastructure and communication protocols are inherently weak.
• The average user in cyberspace has very limited knowledge of the computer network infrastructure, its weaknesses and gaping loopholes.
• Society, as a whole, is increasingly becoming irreversibly dependent on an infrastructure and technology that it does not fully understand.
• There are no long-term, let alone immediate, plans or mechanisms in place to better educate the public.
• There is a high degree of complacency in a society that still accords a "Wiz Kid" status to cyberspace vandals.
• The only known and practiced remedies are patching loopholes after an attack has occurred.
• The price of this escalating problem is not yet known.
• Reporting is voluntary and haphazard.
• The nation has yet to understand the seriousness of cyber vandalism.

If we as a society are concerned about individual as well as collective security, privacy, and civil liberties, we need to start finding solutions. A good national cyberspace security policy is needed to [10]:

1. Make everyone aware of the vulnerability and consequences of a cyberspace attack on their well-being.
2. Ensure that everyone is well equipped to safely deal with a cyber attack in this technology-driven and fast-changing society.
3. Help put in place a set of mechanisms to detect, prevent, and handle any cyber attack.
4. Devise a legal and regulatory framework to handle cyberspace's social consequences.

1.5 The Case for Computer Ethics Education

1.5.1 What Is Computer Ethics?

According to James H. Moor, who is believed to have first coined the phrase "computer ethics," computer ethics is the analysis of the nature and social impact of computer technology and the corresponding formulation and justification of policies for the ethical use of such technology [11]. Moor's definition focuses on the human actions that are rooted in computer technology or influenced by computer technology. In other words, it is an analysis of the values of human actions influenced by computer technology.
Computer influence on human actions is widespread throughout the decision-making process preceding an action. In the previous sections of this chapter, we discussed the many problems we are facing today as a result of computer technology. We are looking for a way to deal with these problems, probably through education. So, the definition of computer ethics, as outlined by Moor, gives us a starting point on this long journey.

1.5.2 Why You Should Study Computer Ethics

Moor's contention is that the central task of computer ethics in decision-making processes that involve computer technology should be to "determine what should be done" whenever there is a policy vacuum. Moor first observed that there are times when policy vacuums are created in the decision-making processes, especially those that involve processes in which computer technology is "essentially involved." It is difficult to fully explain the cause of these vacuums, but one can say that they are mainly caused by the "confusion" between the known policies and what is presented. Moor tries to explain these muddles with a software example. As we will see in Chap. 6, software can either be a product, in which case patent laws apply, or it can be a service, where no intellectual property laws apply. The multiplicity of choices like this, presented to a decision maker by computer technology, can result in policy vacuums. Several other factors contribute to the creation of these muddles. It is likely that computer users, especially computer professionals, may be unprepared to deal effectively with the ethical issues that arise in their places of work and everywhere else computers and computer-related technology are used.
So, naturally, one would come to the conclusion that since we cannot stop the computer technology that causes these muddles, we need a plan of action that will work with the changing computing technology and at the same time deal with the ethical issues that do arise. We need computer ethics education. There are two schools of thought on this subject. One school believes in the study of computer ethics as remedial moral education. The other school believes in computer ethics education not as a moral education but as a field worthy of study in its own right. But for it to exist as a separate independent field of study, there must be a unique domain for computer ethics distinct from the domain for moral education, distinct even from the domains of other kinds of professional and applied ethics [12]. In his paper "Is Computer Ethics Unique?" Walter Maner explains the existence of the two schools with two views:

1. Certain ethical issues are so transformed by the use of computers that they deserve to be studied on their own, in their radically altered form.
2. The involvement of computers in human conduct can create entirely new ethical issues, unique to computing, which do not surface in other areas.

According to Maner, there are six levels of justification for the two views: the first two for the first school and the last four for the second school [12]:

1. We should study computer ethics because doing so will make us behave like responsible professionals.
2. We should study computer ethics because doing so will teach us how to avoid computer abuse and catastrophes.
3. We should study computer ethics because the advance of computing technology will continue to create temporary policy vacuums.
4. We should study computer ethics because the use of computing permanently transforms certain ethical issues to the degree that their alterations require independent study.
5.
We should study computer ethics because the use of computing technology creates, and will continue to create, novel ethical issues that require special study.
6. We should study computer ethics because the set of novel and transformed issues is large enough and coherent enough to define a new field.

Whatever school one falls in, there is enough justification to study computer ethics.

Exercises

1. Give and discuss two reasons why it is good to study computer ethics.
2. Walter Maner believes that computer ethics education should not be given purely as a remedial moral education. Do you agree? Discuss.
3. Give support to the argument that computer ethics must be taken as a remedial moral course.
4. Computer ethics education taken as a remedial education does not provide an adequate rationale. Discuss.
5. Discuss each of Walter Maner's six levels of justification for the study of computer ethics.
6. Write a chronology of the history of computers, listing the milestones in a timeline.
7. List and discuss the major categories of computers based on processing power.
8. Discuss three reasons that led to the development of PCs.
9. What government agencies underwrote the development of the Internet? Why did this support stop? Was it good for the Internet?
10. How is the Internet governed today? Discuss the governing structure.
11. What is Mosaic? When was it developed and by whom? Why is it not popular today?

References

1. The history of computers. http://www.ptc.dcs.edu/moody/comphistory/comphistory_print.html
2. Baron RJ, Higbie L (1992) Computer architecture. Addison-Wesley, Reading
3. Mackenzie I. The man who invented the microprocessor. http://www.bbc.com/news/technology-13260039
4. Miller MJ (2011) The rise of DOS: how Microsoft got the IBM PC OS contract. PC, 10 Aug 2011. http://forwardthinking.pcmag.com/software/286148-the-rise-of-dos-how-microsoft-got-the-ibm-pc-os-contract
5. Sterling B. Short history of the internet. Internet Society.
http://www.internetsociety.org/internet/what-internet/history-internet/short-history-internet
6. Gribble C (2001) History of the web beginning at CERN. Hitmill, 18 Jan
7. Kizza JM (1998) Civilizing the internet: global concerns and efforts towards regulation. McFarland, Jefferson
8. Forcht K (1994) Computer security management. Boyd & Fraser, Danvers
9. Carnegie Mellon University, Software Engineering Institute. http://www.cert.org/stats/cert_stats.html#incidents
10. Kizza JM (2001) Computer network security and cyber ethics. McFarland, Jefferson
11. Bynum TW (ed) (1985) Computers & ethics. Basil Blackwell, New York
12. Maner W (1996) Is computer ethics unique? Sci Eng Ethics 2(2):137–154

2 Morality and the Law

Learning Objectives

After reading this chapter, the reader should be able to:

1. Learn to make sound moral reasoning
2. Learn about moral values and ideals in a person's life
3. Learn about the relationship between morality and religion
4. Distinguish between morality and etiquette, law, and professional codes of conduct
5. Learn what it means to have moral principles, the nature of conscience, and the relationship between morality and self-interest.

Scenario 1: With Stem Cell Research We Can Grow Just About Anything Human!

The parliament of the Republic of Kazini passed legislation, and the president signed it into law, authorizing its citizens and scientists working on Kazini territory to carry out stem cell research to the fullest extent possible, limited only by physical resources. Scientists in Kazini have spearheaded such research and have made major breakthroughs in recent years. Stem cells abound in bodies, but as human bodies age, the number of these cells and their potential and functions start to diminish as well.
Embryonic stem cells, which are found in the early stages of the body's development, have the ability to divide indefinitely in culture and can therefore, at least in the laboratory, develop into virtually any cell type in the body. The scientists in Kazini and their counterparts from around the world believe in the great benefits of stem cell research, especially research on embryonic stem cells. Many newspapers and scientific journals, not only in Kazini but also from other countries, have written stories of limitless benefits, the most immediate being the replacement of insulin-producing cells in the pancreas, damaged muscle cells, and dead nerve cells due to strokes, spinal injury, and degenerative diseases that include Alzheimer's and Parkinson's. It may also lead to the development and replacement of liver cells destroyed by hepatitis and other liver diseases. Dr. Don Rogan, a brilliant young scientist, is the director of Kazini Clinical Research Laboratory, the leading research nerve center in Kazini. Rogan is convinced that the legislature's action is morally wrong. However, his laboratory has been chosen for funding, and his dedicated scientists and staff are excited by the legislature's actions. They had lobbied hard for the passage of the bill. Now they see a ray of hope for millions of people not only in Kazini but also around the world. Rogan is facing a personal dilemma.

Discussion Questions

1. What options does Rogan have?
2. If you were Dr. Rogan, what would you do?
3. Is Dr. Rogan bound by the legislation?

2.1 Introduction

Whether you believe in a supreme being or you are an atheist, you acknowledge the existence of human life because you are alive. You are alive because someone nurtured you and protected you from all adversities. Whoever did so followed a set of rules of conduct that kept both of you alive. Such shared rules, written or not, play a vital role in all human existence. Human beings do not live randomly.
We follow a script—a life script. In that script are hundreds of subscripts we follow both for survival (e.g., eating and sleeping) and for specific tasks. For example, when you meet a stranger, you follow a subscript different from the one you follow when you meet a long-lost friend. If you are hungry, the subscript you follow is different from the one you use to overcome anger. Within each subscript are variations we introduce to suit the situation. For example, when meeting an old friend, some people cry and others jump up and down, but both responses remain within the same subscript of meeting an old friend. The most important purpose of all these subscripts is human life, our own as well as that of others. Believing in human life implies that we also believe life has a purpose. And because no one wants to live a life of pain, every human being believes in happiness as a purpose for life. To be happy, we need those conditions that create happiness, namely, life, liberty, and property. Each condition is embodied in one of the three basic human survival subscripts: morality, ethics, and law. In this chapter, we discuss morality and law, and in Chap. 3, we discuss ethics.

2.2 Morality

Morality is a set of rules for right conduct, a system used to modify and regulate our behavior. It is a quality system for human acts by which we judge them right or wrong, good or bad. This system creates moral persons who possess virtues like love for others, compassion, and a desire for justice; thus, it builds character traits in people. In particular, morality is a survival script we follow in our day-to-day living. According to Wikipedia [1], morality has three different definitions:

• A descriptive definition, according to which morality means a set of rules (code) of conduct that governs human behavior in matters of right and wrong.
An example of the descriptive usage could be "common conceptions of morality have changed significantly over time."
• A normative and universal definition, which is more prescriptive and refers to an ideal code of conduct that would be observed by all rational people, under specified conditions. An example is a moral value judgment such as "murder is immoral."
• A definition of morality that is synonymous with ethics. Ethics is the systematic philosophical study of the moral domain. We will define and discuss ethics in the coming chapter.

In each one of these definitions, morality concerns itself with a set of shared rules, principles, and duties, independent from religion, applicable to all in a group or society, and having no reference to the will or power of any one individual, whatever his or her status in that group or society. Although moral values are generally shared values in a society, the degree of sharing these values varies greatly. We may agree more on values like truth, justice, and loyalty than on others. To paraphrase Shakespeare, life is but a stage on which there is continuous acting from the subscript of morality. Every time we interact in a society or group, we act the moral subscript that was developed by that society or group for its members over time. Because morality is territorial and culturally based, as long as we live in a society, we are bound to live within that society's guidelines. The actions of individuals in a society only have moral value if taken within the context of this very society and the culture of the individual. A number of factors influence the context of morality, including time and place.

2.2.1 Moral Theories

If morality is a set of shared values among people in a specific society, why do we have to worry about justifying those values to people who are not members of that society? In other words, why do we need moral theories? What do moral theories have to do with the moral subscripts?
If you write a script for a play, you want both the audience and the cast to understand the message of the play. If you can find a way to help them get that message and believe it, then you have put credibility in the script. This is where moral theories come in. According to MacDonnell, moral theories "seek to introduce a degree of rationality and rigor into our moral deliberations" [1]. They give our deliberations plausibility and help us to better understand those values and the contradictions therein. Because many philosophers and others use the words moral and ethical synonymously, we delay the discussion of moral theories until we discuss ethics.

2.2.2 Moral Decision Making

Every human action results from a decision process. Because every human action follows a subscript, the decision-making process follows a subscript as well. A decision is morally good if the result from it is good. A good moral decision embodies nearly all moral theories and usually takes into consideration the following:

1. All the facts surrounding the situation, taking into account the interests of all parties involved.
2. The moral principles involved and how they will affect all others involved.

Combining 1 and 2 implies there must be reasoning and impartiality in any moral decision. Moral and ethical theorists have outlined four ways of ensuring reason and impartiality in moral decision making:

1. The use of rational intuition of moral principles, which helps us perceive moral principles such as the notion of justice and deciding what is good.
2. The use of reason to determine the best way to achieve the highest moral good.
3. The ability to distinguish between primary and secondary moral principles. Primary moral principles are more general; secondary principles are more specific and are generally deduced from the primary ones.
4. The rational calculation of the consequences of our actions.
The calculation should tell us whether the action is good or bad depending on the consequences [2]. Nearly all moral theories embody one or more of these themes.

2.2.3 Moral Codes

The Internet Encyclopedia of Philosophy defines moral codes as rules or norms within a group for what is proper behavior for the members of that group [2]. The norm itself is a rule, standard, or measure against which we compare something else whose qualities we doubt. Moral codes are often complex definitions of right and wrong that are based upon a well-defined group's value systems. In a way, moral codes are shared behavioral patterns of a group. These patterns have been with us since the beginning of human civilization and have evolved mainly for the survival of the group or society. Societies and cultures survive and thrive because of the moral codes they observe. History has shown failures of societies and cultures, like the once mighty civilizations and great empires of the Babylonians, the Romans, and the Byzantines, probably because their codes failed to cope with the changing times. Although different cultures have different codes, and we have established that morality is relative to time, there have been some timeless and culture-free (moral) codes that have been nearly universally observed. Such codes include this partial list created by the astronomer Sagan [3]:

1. The Golden Rule: "Do unto others as you would have them do unto you."

Versions of the Golden Rule in different religions (source: http://web.engr.oregonstate.edu/~mjb/cs419h/Handouts/VisEthics/visethics.pdf):

BUDDHIST: Hurt not others in ways that you would find hurtful.
CHRISTIAN: All things whatsoever ye would that men should do to you, do ye even so to them.
CONFUCIAN: Do not do unto others what you would not have them do unto you.
HINDU: This is the sum of duty; do naught unto others which if done to thee would cause thee pain.
ISLAMIC: No one of you is a believer until he desires for his brother that which he desires for himself.
JAIN: In happiness and suffering, in joy and grief, we should regard all creatures as we reg...

Explanation & Answer

Attached.

Surname 1
Name
Professor
Course
Date
The Imperatives of Citizenship for the Ainu Population in the Japanese Empire
Introduction
Empires are large multi-ethnic units that result from the conquest of smaller nations and territories by a dominant society. Historically, monarchs preferred to divide their empires into groups that were supervised by subordinates but ultimately controlled by the monarchs themselves, despite the great distances involved in their conquests. Empires are also responsible for imperialism, which is the political approach by which the dominant unit creates actions and laws that ensure the maintenance of its dominance and control over the acquired smaller nations and territories. An additional concept that is useful for understanding the issues discussed in this paper is settler colonialism, which describes the complete dispossession of the earlier inhabitants of a location through the creation and passage of laws that gradually disadvantage the indigenous population. The leaders of the Japanese Empire in the early twentieth century used these concepts to shift the empire's geographical boundaries and increase its political and economic interests while maintaining the historical identity of an ethnically homogenous entity. The analysis of the political and economic policies used by the Japanese Empire shows that they prevented the Ainu population from developing the capacity to imbibe the imperialist ideology. Finally, the implication of these policies is the failure of the Ainu people to enjoy the benefits of citizenship despite their willingness to assimilate and the incompatibility of the ideologies with their ethnic identities.
Analysis of the Citizenship of the Ainu Population
Japan's adoption of the emperor system after the Meiji Restoration of 1868 introduced ideologies designed to transform the diverse, multiethnic structure of the colonial empire into a homogenous nationalistic one. This political perspective was believed to be the most appropriate strategy for making the empire a community whose members were subjects dedicated to the common bond that resulted from imperialism. According to Andrew Gordon, the emperors of nineteenth-century Japan were interested in making the "empire a potent symbol of the identity and unity of the Japanese people." He further stated that it is "these ways that reflected imperialism and also contributed to a changed relationship of Japanese subjects to their state" (115). This system affected the multiethnic structure of the society that had supported its growth and development for centuries, making the colonial subjects of the Japanese state members of a single community. Therefore, the colonization of the Ainu population made them Japanese citizens, which included the adoption of the language and customs of their colonial masters.
Meanwhile, the imperatives for the Emperor to maintain domestic order and the status of the Empire on the global stage meant that the citizenship and assimilation of the Ainu should result in changes to their identities. This ideology was necessary for preventing resistance by the subjects and creating a common system that could be controlled easily. However, the process of citizenship for the Ainu was not easy, and it was not different from that of the Okinawans, because of the need to discard cultural practices that could prevent assimilation and portray the people as rebellious to the Emperor. Tami Sakiyama wrote in Island of Confinement that "depopulation was already a serious problem, casting a cloud of uncertainty over people's lives" (113), which served as evidence that the indigenous people who opposed their subjugation needed to leave since they did not have many roles in the community. Unlike the Okinawans, who realized the adverse effect of colonization on their survival, the Ainu were used to Japan's subjugation efforts and accepted the imperialist ideologies because Japanese citizenship appeared to be the means to end the exploitation of their economic activities.
Furthermore, citizenship of a nation or empire means that the people must contribute to the development and growth of the country through economic and political contributions that strengthen the capacity of the government to support the people. Also, the integration of the Ainu into the family-state system of the emperor required the adoption of Japanese cultural and social identities, since this promoted the state's policy of a monolithic society. David Howell argued that the integration and assimilation of the Ainu into Japanese society embedded Japanese cultural identities in them. The scholar further stated that "as imperial subjects first and indigenes second, while hardly empowering, the acceptance of the state's ethnic negation policy allowed for the possibility of lasting cultural diversity in the imperial Japanese state" (6). Thus, the desire of the Ainu to enjoy economic advancement and social inclusion necessitated the acceptance of citizenship of a state that was known to have exploited them for centuries.
The Implications of the Citizenship of the Ainu People
In spite of the willingness of the Ainu people to integrate into the broader Japanese society in response to the family-state ideology, and despite the indigenous people's desire for economic advancement and social acceptance, citizenship did not meet these needs. One of the implications of citizenship for the Ainu was the need to accept the government's industrial development and agricultural policies, which resulted in the migration of mainlanders to the island. These policies were part of the Emperor's desire to exploit the fertile land of Hokkaido and turn the indigenous people from hunters and commercial fishers into farmers, an occupation that was considered critical to their integration into the broader Japanese population. According to David Howell, while the government considered the transformation of the Ainu into farmers through the protection law as the right path to their economic advancement and growth, "the program did little good for the Ainu as a whole and left many decidedly worse off than they had been before its implementation" (7). He added that "aside from their lack of experience or interest in farming, agriculture was generally less rewarding economically than wage" (7).
A further implication of citizenship for the Ainu, one that reflected the failure of the family-state ideology to integrate and assimilate the people into the larger Japanese community, is the loss of cultural identity and social exclusion. As part of a minority ethnic group regarded as commoners, the Ainu experienced implicit oppressive practices from the dominant groups that they could neither protest against nor bring to an end. Evidence for this assertion can be found in Kyle Ikeda's introduction to Islands of Protest. While describing the experience of the Okinawans after becoming Japanese citizens, the author wrote that "Okinawa was deemed Japanese territory, yet disregard for the new prefecture's denizens reached an extreme despite their best efforts to assimilate to Japanese culture" (3). Similarly, the Ainu were compelled to suppress their cultural heritage and learn the customs and traditions of Japan as part of the implications of citizenship. Thus, the consequences of citizenship were not limited to the continuation of oppressive economic practices but also included the loss of individual and group cultural identities.
Conclusion
In conclusion, citizenship is designed to provide opportunities for economic and social advancement for the population, and it should eliminate all forms of oppressive practices against ethnic groups. Although the objective of the Meiji Restoration of 1868 was to integrate and assimilate all colonial subjects for Japan's political and economic growth, as well as to sustain its recognition by Western powers, the impact of this agenda on the Ainu and other minority ethnic groups was negative. As Medoruma Shun stated in his story titled "Hope," despite Okinawa being called "a peace-loving, healing island," the government considered the colonial subjects as "maggots who clustered around the shit of land rents and subsidy monies splattered by the bases" (22). Finally, the desire of the Ainu to assimilate and become Japanese was futile because the government policies prevented them from developing the capacity to adopt the imperialist ideology that promoted community integration.

Works Cited
Gordon, Andrew. A Modern History of Japan: From Tokugawa Times to the Present. Oxford University Press, 2003.
Howell, David L. "Making 'Useful Citizens' of Ainu Subjects in Early Twentieth-Century Japan." The Journal of Asian Studies, vol. 63, no. 1, 2004, pp. 5-29.
Ikeda, Kyle. "Introduction." Islands of Protest: Japanese Literature from Okinawa, 2017, pp. 1-18.
Medoruma, Shun. "Hope." Islands of Protest: Japanese Literature from Okinawa, 2017.
Sakiyama, Tami. "Island of Confinement." Islands of Protest: Japanese Literature from Okinawa, 2017.

Attached.

Internet of Things: Thesis

Thesis: The Internet of Things refers to the integration of different networks; this involves the analysis of the services offered, the need to enhance the protection of the networks, and the promotion of ideas that are important for enhancing communication.

I. Model Attacks
   A. Network independence
   B. Monitoring
   C. System capabilities

II. Challenges Faced in IoT
   A. Dynamic environments
   B. Information uniformity

III. Security Constraints
   A. Technology expression
   B. Sources of data
   C. Trust relationships
   D. Networking


Running head: INTERNET OF THINGS

Internet of Things
Name
Institution

Internet of Things

The Internet of Things (IoT) refers to the integration of different networks, which involves analyzing the services offered, enhancing the protection of those networks, and promoting ideas that improve communication. The conceptual framework applied here focuses on innovation in users' choices, which helps set the roadmap toward the unification of networks. It is important to focus on the environment in ways that increase the efficiency of data analysis and of the projects that manage connections on the internet. The impact of these connections matters because it helps in observing computing needs and understanding the different dynamics of the internet world. The strategies developed should focus on how changes in the operation of the internet open up different opportunities and distinguish the challenges that are likely to arise. In this case, the Internet of Things creates the need to secure information, which is important in analyzing the risks that are likely to arise in the environment.
The security of the internet is one of the challenges faced in the integration of the Internet of Things, and this creates the need to focus on the efficiency of communication methods. Attention must be paid to the attacks that are likely to take place in the system, such as denial-of-service and spoofing attacks. These attacks compromise the privacy of users, creating the need to protect users against them. The assessment of growth measures is important, as it helps in focusing on the different paradigms and the orientation of the knowledge that grounds the communication process. The interconnection of the devi...

