INFORMATION, COMPUTER AND ROBOETHICS
COMPUTER AND INFORMATION ETHICS: HISTORY
- Information ethics is the broader field, of which computer ethics is a subfield
WIENER
- Norbert Wiener (1948): Cybernetics
- A second industrial revolution: the automatic age
- The effects of information technology on life, health, happiness, abilities, knowledge, freedom, security, and opportunities
- The physical structure of the human being and the potential for learning and creativity
- Cybernetics takes the view that the structure of the machine or of the organism is an index of the performance that may be expected from it
- The mechanical fluidity of the human being provides for his almost indefinite intellectual expansion
- The purpose of human life: to flourish as the kind of information organisms that humans naturally are
- Living organisms, including human beings, are patterns of information that persist through an ongoing exchange of matter and energy
- “The individuality of the body is that of a flame… of a form rather than of a bit of substance”
- For human beings to flourish, they must be free to engage in creative and flexible action
- Society is called to maximize this ability
- The Great Principles of Justice:
  1. The Principle of Freedom: “the liberty of each human being to develop in his freedom the full measure of the human possibilities embodied in him.”
  2. The Principle of Equality: “the equality by which what is just for A and B remains just when the positions of A and B are interchanged.”
  3. The Principle of Benevolence: “a good will between man and man that knows no limits short of those of humanity itself.”
- Plus the Principle of Minimum Infringement of Freedom: “What compulsion the very existence of the community and the state may demand must be exercised in such a way as to produce no unnecessary infringement of freedom.”
- Diversity of cultures can provide a context in which humans can flourish
- But no ethical relativism: the great principles of justice are kept as an objective, cross-cultural foundation for ethics
- Wiener’s methodology:
  1. Identify an ethical question or case regarding the integration of information technology into society. Focus: technology-generated possibilities.
  2. Clarify any ambiguous or vague ideas or principles that may apply to the case or issue in question.
  3. If possible, apply already existing, ethically acceptable principles, laws, rules, and practices (the “received policy cluster”) that govern human behavior in the given society.
  4. If ethically acceptable precedents, traditions, and policies are insufficient to settle the question or deal with the case, use the purpose of a human life plus the great principles of justice to find a solution that fits as well as possible into the ethical traditions of the given society.
- Walter Maner (1976): computer ethics concerns wholly new ethical problems that would not have existed if computers had not been invented
- Deborah Johnson (1985): textbook Computer Ethics; not ethically new problems, but a new twist given by computers to traditional problems
- James Moor (1985): What Is Computer Ethics?
- Computers as logically malleable
- No laws or standards of good practice: policy vacuums and conceptual muddles
- Core human values (life, health, happiness, security, resources, opportunities, and knowledge), without which a community cannot survive
- Combining deontology and consequentialism: constraints on consequentialist evaluations (e.g., I can try to realize the goals of core human values, but I have to rule out those actions, regardless of their possible good consequences, that infringe what every rational and impartial person would regard as unjust)
- Donald Gotterbarn (1991): Computer Ethics: Responsibility Regained
- Developing a professional ethics of responsibility for those involved with computers
- Krystyna Górniak-Kocikowska (1995): The Computer Revolution and the Problem of Global Ethics
- The “Górniak hypothesis”: computer ethics will evolve into a global ethic applicable in every culture on earth (replacing “local” ethical systems)
- Its global character: 1. covering the entire globe; 2. recognizing no borders; 3. encompassing the totality of human actions and relations
FLORIDI
- Luciano Floridi (1995): Information Ethics
- Everything that exists is treated as ‘informational’ objects or processes
- Informational systems as such, rather than just living systems, are raised to the role of agents and patients of any action, with environmental processes, changes, and interactions equally described informationally
- Everything that exists is an informational object or process: the INFOSPHERE
- Damage to the infosphere: ENTROPY, an evil that should be avoided or minimized
- Everything in the infosphere has at least a minimum worth that should be ethically respected: a right to persist in its own status and a constructionist right to flourish
- Preserving and enhancing the infosphere: a “patient-based”, non-anthropocentric ethical theory, to be used in addition to traditional “agent-based”, anthropocentric theories (such as utilitarianism, deontology, and virtue ethics)
EXAMPLE TOPICS IN COMPUTER ETHICS: WORKPLACE
- Computers pose a threat to traditional jobs
- But they also generate new kinds of jobs (hardware engineers, software engineers, systems analysts, webmasters…)
- Even if not eliminated, a job can be radically altered by computers: de-skilling of workers
- Problems for health and safety
EXAMPLE TOPICS IN COMPUTER ETHICS: COMPUTER CRIME
- VIRUSES (not working on their own, but inserted into computers or programs)
- WORMS (moving from machine to machine across networks)
- TROJAN HORSES (appearing as one sort of program while doing damage behind the scenes)
- LOGIC BOMBS (activated only under particular conditions)
- BACTERIA or RABBITS (multiplying rapidly and filling up computers’ memories)
- Plus the issue of HACKERS: benevolent defenders of the freedom of cyberspace, or criminals?
EXAMPLE TOPICS IN COMPUTER ETHICS: PRIVACY AND ANONYMITY
- “Big Brother”: government collecting data on citizens (from George Orwell’s novel 1984)
- Data-mining and data-matching
- Redefinition of privacy: from control over personal information to restriction of access to personal information, plus an idea of privacy in public spaces
- The question of anonymity on the internet: advantages and risks
EXAMPLE TOPICS IN COMPUTER ETHICS: INTELLECTUAL PROPERTY
- Intellectual property rights in software ownership
- Patenting a computer algorithm, and mathematicians’ complaints that doing so removes algorithms from the public domain and threatens the development of science
EXAMPLE TOPICS IN COMPUTER ETHICS: GLOBALIZATION
- Global laws: state laws on the internet do not apply to the rest of the world, although websites cross borders
- Global cyberbusiness: the problem of the technological infrastructure gap
- Global education: is internet access a source of information or of disinformation?
ROBOETHICS
- Isaac Asimov’s Laws of Robotics (1942; collected in I, Robot, 1950):
  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
- Plus the later-added and highly problematic Zeroth Law: A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
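The laws above form a strict priority ordering: each law yields to the ones before it. As an illustration only (not from the source; all names and the boolean action model are hypothetical simplifications), the ordering can be sketched as a lexicographic action filter:

```python
# Toy sketch of Asimov's Three Laws as a lexicographic filter:
# an action is permissible only if no law, checked in priority
# order (First > Second > Third), forbids it.

def permissible(action):
    """Return True if the action violates none of the Three Laws."""
    if action["harms_human"]:
        return False  # First Law: never harm a human
    if action["disobeys_order"] and not action["order_conflicts_first_law"]:
        return False  # Second Law: obey orders, unless obeying would harm a human
    if action["endangers_self"] and not (
        action["required_by_order"] or action["prevents_human_harm"]
    ):
        return False  # Third Law: self-preservation yields to Laws 1 and 2
    return True

# Example: self-sacrifice to save a human is allowed (Third Law yields)
print(permissible({
    "harms_human": False,
    "disobeys_order": False,
    "order_conflicts_first_law": False,
    "endangers_self": True,
    "required_by_order": False,
    "prevents_human_harm": True,
}))  # True
```

The sketch also makes the slide's later point concrete: Asimov's hierarchy presupposes that predicates like "harms a human" can be reliably evaluated, which is exactly what battlefield robots cannot do.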
ROBOETHICS: BETRAYING ASIMOV (ROBOTS IN WAR)
- From Asimov’s idea that robots should be designed not to harm humans to the idea of using robots in war (and the justifications added for it)
- 56 nations are developing robotic weapons
- Human-controlled robots: drones (Predator aircraft), mine detectors, sensing devices
- Autonomously operating robots: the robot makes its own decisions about the use of force on the field, without requiring human consent at the moment. Examples: landmines (?), cruise missiles (relatively), Patriot missiles (a defensive system), spying micro-robots, drone aircraft, nighttime sentries that shoot automatically, automatic machine-gun and grenade launchers, automatic attack or counter-attack missiles (?), ‘intelligent’ robots with the ability to decide when to shoot
ROBOTS IN WAR
- Three reasons are given by the military for using robots in war (as of 2007; source: Arkin):
  1. Force multiplication (reducing the number of soldiers needed)
  2. Expanding the battle space (conducting combat over larger areas)
  3. Extending war fighters’ reach (allowing individual soldiers to strike farther)
- The use of robots for reducing ethical infractions does not appear anywhere among them
ROBOTS IN WAR
- Can robots behave more ethically than soldiers on the battlefield?
- Ronald Arkin (a university researcher under contract with the US Army): robots can be designed to have no instinct for self-preservation (not lashing out in fear) and can be built to show no anger or recklessness (vs. soldiers, whose human reactions can lead to mistreatment of enemies and civilians)
- Incorporating existing battlefield and military protocols (the Geneva Conventions, Rules of Engagement, Codes of Conduct)
- According to Arkin, this respects the “spirit”, if not the letter, of Asimov’s laws (?!)
- “It is not my belief that an unmanned system will be able to be perfectly ethical in the battlefield, but I am convinced that they can perform more ethically than human soldiers are capable of.”
ROBOTS IN WAR
- Problems of human-controlled robots:
- Lack of real experience of war makes waging it more probable
- No risk for operators: this lowers the barriers to warfare, potentially leading to a new arms race
- Civilians at greater risk: it is already difficult to distinguish them on the battlefield, and it is even worse when the device is remotely operated
- Is one killing a human being, or deleting a shadow from a screen thousands of kilometers away?
ROBOTS IN WAR
- Problems of autonomous robots:
- Recognizing civilians
- Recognizing wounded soldiers and soldiers willing to surrender
- Deciding when to shoot
- No emotions: lack of empathy
- Undermining human responsibility