Cyberethics - JSNE Group

CIS 200: Professional and Ethical Issues in Computing
Edited by Rawan T. Khasawneh
[Diagram: Cyberethics encompasses computer ethics, information ethics, and Internet ethics, all concerning cybertechnology]
What Is Cyberethics?
• Cyberethics is the study of moral, legal, and social issues involving cybertechnology.
• As a field of applied ethics, it:
  • examines the impact that cybertechnology has on our social, legal, and moral systems;
  • evaluates the social policies and laws that we frame in response to issues generated by the development and use of cybertechnology.
What Is Cybertechnology?
• Cybertechnology refers to a wide range of computing and communications devices – from stand-alone computers, to "connected" or networked computing and communications technologies, to the Internet itself.
• Cybertechnologies include:
  • digital electronic devices;
  • networked computers (including servers, desktops, laptops, etc.);
  • stand-alone computers.
Cybertechnology (Continued)
• Networked devices can be connected directly to the Internet.
• They can also be connected to other devices through one or more privately owned computer networks.
• Privately owned networks include both:
  • Local Area Networks (LANs);
  • Wide Area Networks (WANs).
Why the term cyberethics?
• Cyberethics is a more accurate label than computer ethics, which can suggest the study of ethical issues limited either to:
a) computing machines, or
b) computing professionals.
• Cyberethics is also more accurate than Internet ethics, which is limited to ethical issues affecting only networked computers and devices.
The Evolution of Cybertechnology and Cyberethics: Four Phases
• 1940s
  • Before World War II, a "computer" was a person who calculated numbers.
  • After World War II, "computer" came to mean a calculating machine.
• 1980s
  • The computer came to be seen as more than a machine: a new kind of medium for communication.
• Computer technology emerged in the late 1940s, when some analysts confidently predicted that no more than six computers would ever need to be built.
• At first, the related ethical and social issues were noted only informally.
Phase #1 (1950s and 1960s)
• Huge mainframe computers: stand-alone, unconnected machines.
• Artificial intelligence (AI)
  • Computers were described as "giant brains."
  • Can machines think? If so, should we invent thinking machines?
  • If machines can be intelligent entities, what does that mean for our sense of self?
• Privacy threats and the fear of "Big Brother"
  • Proposals for a national database of citizens' electronic records raised the question of how citizens' personal information would be used.
  • It might be used to monitor and control the actions of ordinary citizens.
• The ARPANET emerged at the end of this period.
Phase #2 (1970s and 1980s)
• Connected computers: privately owned computer networks (LANs and WANs) used for information exchange.
• Personal privacy
  • Much greater worries about the amount of personal information collected, the ways it is used, and the commercial databases of the private sector.
• Intellectual property
  • The exchange of information and the duplication of proprietary software programs.
• Computer crime
  • Through remote computer terminals, computer systems in large organizations can be disrupted.
Phase #3 (1990s-present)
• Availability of the Internet, the World Wide Web, and web-based technologies.
• Free speech
  • Can Internet users post any message they wish on publicly accessible websites or on their own personal web pages?
  • Will they be protected by free speech or freedom of expression?
• Anonymity
  • Should Internet users be permitted to post anonymous messages on web pages?
  • Should they be allowed to navigate the web anonymously or under the cover of a pseudonym?
• Jurisdiction
  • The Internet has no clear national or geographical boundaries.
  • Where will Internet crime be prosecuted?
• Trust
  • E-commerce: doing business online involves sharing personal and financial information.
  • Public vs. private aspects of personal information that has become increasingly available on the Internet, e.g., on social networking sites and interactive web-based forums.
Phase #4 (present to near future)
• Computing devices will soon be part of our clothing and even our bodies.
• Pervasive/ubiquitous computing and biotechnology, including:
  • Ambient intelligence
  • Wireless technology
  • RFID
• Web 2.0
• Computers are becoming less visible as distinct entities, miniaturized and integrated into ordinary objects.
The Evolution of Cybertechnology and Cyberethics (Continued)
• In Phase 4, computers are becoming less visible as distinct entities, as they:
a) continue to be miniaturized and integrated into ordinary objects, and
b) blend unobtrusively into our surroundings.
• Cybertechnology is also becoming less distinguishable from other technologies as the boundaries that previously separated them begin to blur because of convergence.
The Evolution of Cybertechnology and Cyberethics (Continued)
• Additional ethical/social concerns associated with Phase 4 include controversies made possible by the following kinds of technologies:
  • autonomous machines and sophisticated robots (used in warfare, transportation, care for the elderly, etc.);
  • nanocomputing and nano-scale devices;
  • artificial agents (including "soft bots") that act on behalf of humans and corporations;
  • AI-induced bionic chip implants (which can cause us to question what it means to be human vs. cyborg).
Table 1-1: Summary of Four Phases of Cyberethics

Phase 1 (1950s-1960s)
  Technological features: stand-alone machines (large mainframe computers).
  Associated issues: artificial intelligence (AI), database privacy ("Big Brother").

Phase 2 (1970s-1980s)
  Technological features: minicomputers and PCs interconnected via privately owned networks.
  Associated issues: issues from Phase 1 plus concerns involving intellectual property and software piracy, computer crime, privacy and the exchange of records.

Phase 3 (1990s-present)
  Technological features: Internet and World Wide Web.
  Associated issues: issues from Phases 1 and 2 plus concerns about free speech, anonymity, legal jurisdiction, virtual communities, etc.

Phase 4 (present to near future)
  Technological features: convergence of information and communication technologies with nanotechnology research and bioinformatics research, etc.
  Associated issues: issues from Phases 1-3 plus concerns about artificial agents ("bots") with decision-making capabilities, AI-induced bionic chip implants, nanocomputing, pervasive computing, etc.
Are Any Cyberethics Issues Unique Ethical Issues?
Two opposing opinions:
• Traditionalists:
  • Nothing is new: crime is crime, and murder is murder.
• Uniqueness proponents:
  • Computers have brought in new issues.
  • Cybertechnology has created (at least some) new and unique ethical issues that could not have existed before computers.
What is wrong with their views?
• Traditionalists underestimate the role of scope and scale:
  • e.g., cyberbullies can bully multiple victims simultaneously (scale) and globally (because of the scope or reach of the Internet), and they can do so without ever having to leave the comfort of their homes.
• Uniqueness proponents overstate the effect of the technology on ethics:
  • e.g., Maner's claim that computers are uniquely fast, uniquely malleable, etc. There may indeed be some unique aspects of computer technology, but that does not by itself make the ethical issues unique.
What is right with their views?
• Traditionalists:
  • No new ethical issues have been introduced by computers.
• Uniqueness proponents:
  • Cybertechnology has complicated our analysis of traditional ethical issues.
The Uniqueness Debate (Continued)
• Proponents of the uniqueness thesis tend to confuse unique features of computer technology with unique ethical issues.
• Their argument is based on a logical fallacy:
  Premise 1. Cybertechnology has some unique technological features.
  Premise 2. Cybertechnology generates some ethical issues.
  Conclusion. (At least some of the) ethical issues generated by cybertechnology must be unique.
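
As a rough, illustrative formalization (not from the textbook), the argument can be written in predicate-logic form, with c standing for cybertechnology and with invented predicate names. The property "unique" is asserted of technological features in the premises but of ethical issues in the conclusion, and nothing in the premises licenses that transfer:

% Illustrative formalization only; predicate names are invented for clarity.
\begin{align*}
\text{P1:} &\quad \exists f\,\bigl(\mathrm{Feature}(f,c) \wedge \mathrm{Unique}(f)\bigr)\\
\text{P2:} &\quad \exists i\,\bigl(\mathrm{EthicalIssue}(i) \wedge \mathrm{Generates}(c,i)\bigr)\\
\text{C:}  &\quad \exists i\,\bigl(\mathrm{EthicalIssue}(i) \wedge \mathrm{Generates}(c,i) \wedge \mathrm{Unique}(i)\bigr)
\end{align*}

A technology with one unusual feature but only ordinary ethical issues would make both premises true and the conclusion false, which is why this argument form is invalid.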
The Uniqueness Debate (Continued)
• So, in analyzing the issues involved in this debate, it is useful to distinguish between any:
  • unique technological features;
  • (alleged) unique ethical issues.
Alternative Strategy for Analyzing the Uniqueness Issue
• Moor (2000) argues that computer technology generates "new possibilities for human action" because computers are logically malleable.
• Logical malleability means that computers can be molded in ways that allow for many different kinds of uses.
• Some of the unanticipated uses of computers have introduced policy vacuums.
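
A minimal, hypothetical Python sketch of what logical malleability means in practice: the same general-purpose machine becomes an arithmetic tool, a text tool, or a crude message filter simply by being given different logic and inputs. The function run_machine and the example uses are illustrative inventions, not anything from Moor or the textbook.

# Illustrative sketch of "logical malleability": the same general-purpose
# machine is molded into different tools purely by changing the logic
# and data supplied to it.
from typing import Callable, Iterable, List

def run_machine(logic: Callable[[object], object], inputs: Iterable[object]) -> List[object]:
    """Apply an arbitrary piece of logic to arbitrary inputs (illustrative only)."""
    return [logic(x) for x in inputs]

# Molded into an arithmetic tool ...
print(run_machine(lambda n: n * n, [1, 2, 3]))                # [1, 4, 9]
# ... into a text-processing tool ...
print(run_machine(str.upper, ["privacy", "free speech"]))     # ['PRIVACY', 'FREE SPEECH']
# ... or into a crude message filter -- the kind of unanticipated use
# that creates the policy vacuums discussed next.
print(run_machine(lambda msg: "anonymous" not in msg, ["anonymous post", "signed post"]))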
Policy Vacuums and Conceptual Muddles
• Policy vacuums are "voids" or gaps in our laws and policies.
• One solution might seem to be simply to fill the voids with new or revised policies.
• Some policy vacuums cannot easily be filled because of conceptual muddles.
• In these cases, the conceptual muddles first need to be elucidated before clear policies can be formulated and justified.
A Policy Vacuum in Duplicating Software
• Consider again Scenario 1-5 (in the textbook) involving the duplication of software.
• In the early 1980s, there were still no clear laws regarding the duplication of software programs, which had been made easy by the availability of personal computers.
• Because there were no clear rules for copying programs, a policy vacuum arose.
• Before the policy vacuum could be filled, a conceptual muddle had to be elucidated: what, exactly, is software?
Three distinct perspectives of applied ethics (as applied to cyberethics):
1. Professional ethics
   • The purpose of cyberethics is to identify and analyze issues of ethical responsibility for computer/information technology (IT) professionals.
2. Philosophical ethics
   • Cyberethics is a field of philosophical analysis and inquiry that goes beyond professional ethics.
3. Sociological/descriptive ethics
   • Descriptive (and sociological) investigations report about "what is the case."