AI: Fact or Fiction? - Department of Computer Science and Electrical Engineering

Artificial Intelligence:
Human vs. Machine
Professor Marie desJardins
CMSC 100
Fall 2008
Memory is at the Core (Literally)
 Remember HAL?
 “Open the pod bay doors, HAL.”
 “My mind is going...”
 Memory is at the core of our being (and a computer’s)
The first magnetic core memory [www.columbia.edu/acis/history]
thebrain.mcgill.ca
 ...but our memories look very different!
Overview
What is AI? (and why is it so cool?)
AI: Past and Present
 History of AI
 AI Today
Computational vs. Biological Memory
The Skeptics Speak
What is AI?
AI: A Vision
 Could an intelligent agent living on your home computer manage your email, coordinate your work and social activities, help plan your vacations… even watch your house while you take those well-planned vacations?
Main Goals of AI
Represent and store knowledge
Retrieve and reason about knowledge
Behave intelligently in complex environments
Develop interesting and useful applications
Interact with people, agents, and the environment
Foundations of AI
 AI sits at the center of many disciplines: Mathematics, Economics, Psychology, Computer Science & Engineering, Cognitive Science, Philosophy, Biology, and Linguistics
Big Questions
 Can machines think?
 If so, how?
 If not, why not?
 What does this say about human
beings?
 What does this say about the
mind?
 And if we can make machines
think, should we?
History of AI
Early AI: Eliza
 ELIZA: A program that simulated a psychotherapist interacting with a patient, often said to have informally passed the Turing Test
 Implemented at MIT during 1964-1966 by Joseph Weizenbaum
 First script was DOCTOR
 Simple pattern-matching
 Each pattern had an associated reply, which might include bits of the input (after simple transformations, e.g., my → your)
 Weizenbaum was shocked at reactions
 Psychiatrists thought it had potential
 People unequivocally anthropomorphized: “Let me tell you my problems...”
 Many thought it solved the natural language problem!!
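The pattern-matching idea can be sketched in a few lines of Python. The patterns, reply templates, and pronoun table below are invented for illustration and are far simpler than Weizenbaum’s actual DOCTOR script; they only show the mechanism (match a pattern, transform the captured fragment, splice it into a canned reply).

```python
import random
import re

# Pronoun swaps applied to the captured fragment (my -> your, etc.),
# mimicking ELIZA's "simple transformations".
SWAPS = {"my": "your", "your": "my", "i": "you", "you": "i", "am": "are"}

# Illustrative (pattern, reply-templates) pairs -- NOT Weizenbaum's
# actual DOCTOR script.  "%s" is filled with the transformed fragment.
RULES = [
    (r"i'?m (.*)", ["Why do you say you are %s?",
                    "How long have you been %s?"]),
    (r"i need (.*)", ["Why do you need %s?"]),
    (r"my (.*)", ["Tell me more about your %s."]),
    (r"(.*)", ["Please go on.", "Why do you say that?"]),  # catch-all
]

def swap_pronouns(fragment):
    return " ".join(SWAPS.get(w, w) for w in fragment.split())

def respond(user_input):
    text = user_input.lower().strip().rstrip(".!?")
    for pattern, replies in RULES:
        match = re.match(pattern, text)
        if match:
            reply = random.choice(replies)
            return reply % swap_pronouns(match.group(1)) if "%s" in reply else reply

print(respond("I'm having a hard time with my students"))
```

The catch-all rule at the end is what gives ELIZA-style programs their resilience: when nothing specific matches, a content-free prompt keeps the conversation going.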
Eliza Speaks
I am the psychotherapist. Please, describe your problems.
i'm having a hard time with my students
Why do you say that?
well, they keep asking me for money
Maybe your plans have something to do with this.
what plans?
Is it because of your life that you say what plans?
No, it's because I didn't know what you were talking about
Is it because you didn't know what I was talking about that you
came to me?
no, it's because i wanted to see if you could pass the turing test
Does it bother you that you wanted to see if I could really pass the
turing test?
…..
How do you reconcile problems at school?
This is a waste of time. Good bye
My secretary will send you a bill.
Early AI: Parry
 Kenneth Colby modeled a paranoid patient using techniques similar to Eliza’s, around 1968
 PARRY has basic emotions; if it gets angry, its replies become more hostile
 In a Turing-style test, expert psychiatrists were unable to distinguish PARRY’s ramblings from those of real paranoid patients
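A toy sketch of this kind of affect model, assuming a single anger level that rises on sensitive topics and gates reply selection. This is NOT Colby’s actual model (which tracked fear, anger, and mistrust with far richer rules); the topic set and most replies are invented, though two calm replies echo the transcript below.

```python
# Toy PARRY-style affect: anger rises when the input touches a
# sensitive topic; high anger selects more hostile replies.
SENSITIVE_TOPICS = {"mafia", "police", "bookies", "racetrack"}

CALM_REPLIES = ["All right, I guess.", "People get on my nerves sometimes."]
HOSTILE_REPLIES = ["Why are you asking me that?",
                   "I don't have to tell you anything."]

class Parry:
    def __init__(self):
        self.anger = 0.0  # 0.0 = calm, 1.0 = maximally angry

    def respond(self, text):
        words = set(text.lower().strip("?!. ").split())
        if words & SENSITIVE_TOPICS:
            self.anger = min(1.0, self.anger + 0.4)  # touchy subject: anger rises
        else:
            self.anger = max(0.0, self.anger - 0.1)  # neutral turn: anger cools
        replies = HOSTILE_REPLIES if self.anger > 0.5 else CALM_REPLIES
        return replies[hash(text) % len(replies)]

p = Parry()
p.respond("How are you?")                   # calm
p.respond("Know anything about bookies?")   # anger rises
print(p.respond("Tell me about the police"))  # now hostile
```

The key difference from ELIZA is the persistent internal state: the same question can draw a calm or hostile answer depending on what came before.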
PARRY Speaks
Good afternoon. My name is Doctor Jones.
Pleased to meet you.
How are you?
All right I guess.
Could you tell me your name?
Pat Smith.
I thought you might have some problems.
I've been upset lately.
By what?
People get on my nerves sometimes.
Tell me more.
Do you know anything about bookies?
...
PARRY Meets ELIZA
P: I don't understand your motives.
E: Why don't you understand my motives?
P: Shouldn't I be?
E: Why do you say that just now?
P: That's just the way it is.
E: Suppose that's the way it were not.
P: So?
E: I am not sure I understand you fully.
P: You harp on the same questions too much.
E: What does that similarity suggest to you?
P: I don't get you.
E: Do you wish that you did get I?
Turing Test
 Three rooms contain a person, a computer, and an
interrogator
 The interrogator can communicate with the other two by
“teleprinter” (or, say, AIM)
 The interrogator tries to determine which is the person and
which is the machine
 The machine tries to fool the interrogator into believing that
it is the person
 If the machine succeeds, then we conclude that the
machine can think
The Loebner Contest
 A modern version of the Turing Test, held annually, with a
$100,000 cash prize
 Hugh Loebner was once director of UMBC’s Academic
Computing Services (née UCS, lately OIT)
 http://www.loebner.net/Prizef/loebner-prize.html
 Participants include a set of humans, a set of computers,
and a set of judges
 Scoring
 Rank from least human to most human
 Highest median rank wins $2000
 If better than a human, win $100,000 (Nobody yet…)
 2008 winner: Elbot
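The median-rank scoring rule can be illustrated with a toy example. The judges’ rankings and the second bot’s name below are hypothetical (only Elbot is from the slide): each judge ranks every entrant from least human (1) to most human, and the annual prize goes to the program with the highest median rank.

```python
from statistics import median

# Hypothetical rankings: one list per entrant, one number per judge,
# least human (1) to most human (5).
rankings = {
    "Elbot":     [4, 5, 3, 4],
    "Chatbot-B": [2, 1, 2, 3],
    "Human-1":   [5, 4, 5, 5],
}

programs = ["Elbot", "Chatbot-B"]  # the human confederate isn't competing

# Highest median rank among the programs wins the annual prize.
winner = max(programs, key=lambda name: median(rankings[name]))
print(winner)  # Elbot: median 4.0 beats Chatbot-B's 2.0
```

Using the median rather than the mean means one judge’s outlier score can’t swing the result.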
What’s Easy and What’s Hard?
 It’s been easier to mechanize many of the high-level tasks
we usually associate with “intelligence” in people
 e.g., symbolic integration, proving theorems, playing
chess, medical diagnosis
 It’s been very hard to mechanize tasks that lots of animals can do
 walking around without running into things
 catching prey and avoiding predators
 interpreting complex sensory information (e.g., visual, aural, …)
 modeling the internal states of other animals from their behavior
 working as a team (e.g., with pack animals)
 Is there a fundamental difference between the two categories?
AI Today
Who Does AI?
 Academic researchers (perhaps the area of computer science producing the most Ph.D.s in recent years)
 Some of the top AI schools: CMU, Stanford, Berkeley, MIT, UIUC,
UMd, U Alberta, UT Austin, ... (and, of course, UMBC!)
 Government and private research labs
 NASA, NRL, NIST, IBM, AT&T, SRI, ISI, MERL, ...
 Lots of companies!
Applications
 A sample from the 2008 International Conference on
Innovative Applications of AI:
Event management (for Olympic equestrian competition)
Language and culture instruction
Public school choice (for parents)
Turbulence prediction (for air traffic safety)
Heart wall abnormality diagnosis
Epilepsy treatment planning
Personalization of telecommunications services
Earth observation flight planning (for science data)
Crop selection (for optimal soil planning)
What Can AI Systems Do Now?
Here are some example applications:
 Computer vision: face recognition from a large set
 Robotics: autonomous (mostly) automobile
 Natural language processing: simple machine translation
 Expert systems: medical diagnosis in a narrow domain
 Spoken language systems: ~2000 word continuous speech
 Planning and scheduling: Hubble Telescope experiments
 Learning: text categorization into ~1000 topics
 User modeling: Bayesian reasoning in Windows help (the infamous
paper clip…)
 Games: Grand Master level in chess (world champion), checkers,
backgammon, etc.
Breaking news (8/7/08) - MoGo beats professional Go player
Robotics
 SRI: Shakey / planning sri-shakey.ram
 SRI: Flakey / planning & control sri-Flakey
 UMass: Thing / learning & control
umass_thing_irreg.mpeg
umass_thing_quest.mpeg
umass-can-roll.mpeg
 MIT: Cog / reactive behavior
mit-cog-saw-30.mov
mit-cog-drum-close-15.mov
 MIT: Kismet / affect & interaction
mit-kismet.mov
mit-kismet-expressions-dl.mov
 CMU: RoboCup Soccer / teamwork & coordination
cmu_vs_gatech.mpeg
DARPA Grand Challenge
 Completely autonomous vehicles (no human guidance)
 Several hundred miles over varied terrain
 First challenge (2004) – 142 miles
 “winner” traveled seven(!) miles
 Second challenge (2005) – 131 miles
 Winning team (Stanford) completed
the course in under 7 hours
 Three other teams completed the
course in just over 7 hours
 Onwards and upwards (2007)
 Urban Challenge
 Traffic laws, merging, traffic
circles, busy intersections...
 Six finishers (best time: 2.8 miles in 4+ hours)
Art: NEvAr
 Use genetic algorithms to evolve aesthetically interesting
pictures
 See http://eden.dei.uc.pt/~machado/NEvAr
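A minimal genetic algorithm, to show the evolve-select-breed loop these systems share. NEvAr itself evolves image-generating expression trees and uses a human’s aesthetic judgment as the fitness function; in this sketch the genome is just a bit string and fitness is a trivial stand-in (count of 1-bits), so everything else is an assumption made for illustration.

```python
import random

random.seed(0)  # deterministic run for illustration

GENOME_LEN = 20

def fitness(genome):
    # Stand-in objective: count of 1-bits.  NEvAr instead renders the
    # genome as an image and asks a human to judge it.
    return sum(genome)

def crossover(a, b):
    # Single-point crossover: splice a prefix of one parent onto a
    # suffix of the other.
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.05):
    # Flip each bit independently with small probability.
    return [1 - bit if random.random() < rate else bit for bit in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(30)]
for _ in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]  # elitism: keep the fittest
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)]
    population = parents + children

best = max(population, key=fitness)
print(fitness(best))  # typically at or near GENOME_LEN
```

Swapping the fitness function for a human clicking “I like this one” is essentially what turns this generic loop into an interactive art system.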
ALife: Evolutionary Optimization
 MERL: evolving ‘bots
Human-Computer Interaction: Sketching
 Step 1: Typing
 Step 2: Constrained handwriting
 Step 3: Handwriting recognition
 Step 4: Sketch recognition (doodling)!
 MIT sketch tablet
Driving: Adaptive Cruise Control
 Adaptive cruise control and precrash safety system (ACC/PCS)
 Offered by dozens of makers,
mostly as an option (~$1500) on
high-end models
 Determines appropriate speed for
traffic conditions
 Senses impending collisions and reacts (brakes, seatbelts)
 Latest AI technology: automatic parallel parking!
AxonX
 Smoke and fire monitoring system
Rocket Review
 Automated SAT essay grading system
What Can’t AI Systems Do (Yet)?
 Understand natural language robustly (e.g., read
and understand articles in a newspaper)
 Surf the web (or a wave)
 Interpret an arbitrary visual scene
 Learn a natural language
 Play Go well √ (but see the MoGo news above)
 Construct plans in dynamic real-time domains
 Refocus attention in complex environments
 Perform life-long learning
Computational vs. Biological Memory
How Does It Work? (Humans)
 Basic idea:
 Chemical traces in the neurons of the brain
 Types of memory:
 Primary (short-term)
 Secondary (long-term)
 Factors in memory quality:
 Distractions
 Emotional cues
 Repetition
How Does It Work? (Computers)
 Basic idea:
 Store information as “bits” using physical processes (stable
electronic states, capacitors, magnetic polarity, ...)
 One bit = “yes or no”
 Types of computer storage (from fastest/smallest to slowest/largest):
 Primary storage (RAM or just “memory”)
 Secondary storage (hard disks)
 Tertiary storage (optical jukeboxes)
 Off-line storage (flash drives)
 Factors in memory quality:
 Power source (for RAM)
 Avoiding extreme temperatures
Measuring Memory
 Remember that one yes/no “bit” is the basic unit
 Eight (2^3) bits = one byte
 1,024 (2^10) bytes = one kilobyte (1K)*
 1,024K (2^20 bytes) = one megabyte (1M)
 1,024M (2^30 bytes) = one gigabyte (1G)
 1,024G (2^40 bytes) = one terabyte (1T)
 1,024T (2^50 bytes) = one petabyte (1P)
 ... 2^80 bytes = one yottabyte (1Y?)
* Note that external storage is usually measured in decimal rather than binary (1,000 bytes = 1K, and so on)
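The binary units above, and the decimal-vs-binary footnote, can be checked directly:

```python
# The binary units from the slide, computed as powers of two.
KB = 2 ** 10   # kilobyte
MB = 2 ** 20   # megabyte
GB = 2 ** 30   # gigabyte
TB = 2 ** 40   # terabyte

assert MB == 1024 * KB and GB == 1024 * MB and TB == 1024 * GB

# External storage is usually labeled in decimal units, so a "750 GB"
# disk holds fewer binary gigabytes than its label suggests:
disk_bytes = 750 * 10 ** 9
print(round(disk_bytes / GB, 1))  # 698.5
```

The roughly 7% gap between decimal and binary gigabytes is why a new disk always seems smaller than the box claimed.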
Moore’s Law
 Computer memory (and processing speed, resolution, and just about everything else) has been growing exponentially
 Strictly, Moore’s Law says the number of transistors on a chip doubles roughly every two years
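The compounding is easy to work out. The two-year doubling period below is the commonly cited Moore’s-Law figure, an assumption added here rather than a number from the slide:

```python
# Growth factor after a given number of years, assuming capacity
# doubles every `doubling_period` years (two years is the commonly
# cited Moore's-Law figure -- an assumption, not from the slide).
def growth(years, doubling_period=2.0):
    return 2 ** (years / doubling_period)

print(growth(10))  # 32.0   (one decade ~ 32x)
print(growth(20))  # 1024.0 (two decades ~ 1000x)
```

This is why exponential trends feel slow at first and then overwhelming: each decade multiplies capacity by the same factor the previous decade did.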
Showdown
 Computer:
 Primary storage: 64GB
 Secondary storage: 750GB (~10^12 bytes)
 Tertiary storage: 1PB? (10^15 bytes)
 Retrieval speed: primary 10^-7 sec; secondary 10^-5 sec
 Computing capacity: 1 petaflop (10^15 floating-point operations per second), very special purpose
 Digital; extremely reliable; not (usually) parallel
 Human:
 Primary storage: 7 ± 2 “chunks”
 Secondary storage: 10^8432 bits?? (or maybe 10^9 bits?)
 Retrieval speed: primary 10^-2 sec; secondary 10^-2 sec
 Computing capacity: possibly 100 petaflops, very general purpose
 Analog; moderately reliable; highly parallel
More at movementarian.com
It’s Not Just What You “Know”
 Storage
 Indexing
 Retrieval
 Inference
 Semantics
 Synthesis
 ...So far, computers are good at storage, OK at indexing and retrieval, and humans win on pretty much all of the other dimensions
 ...but we’re just getting started
 Electronic computers were only invented 60 years ago!
 Homo sapiens has had a few hundred thousand years to evolve...
The Skeptics Speak
Mind and Consciousness
 Many philosophers have wrestled with the question:
 Is Artificial Intelligence possible?
 John Searle: most famous AI skeptic
 Chinese Room argument
 Is this really intelligence?
What Searle Argues
 People have beliefs; computers and machines don’t.
 People have “intentionality”; computers and machines
don’t.
 Brains have “causal properties”; computers and machines
don’t.
 Brains have a particular biological and chemical structure;
computers and machines don’t.
 (Philosophers can make claims like “People have
intentionality” without ever really saying what
“intentionality” is, except (in effect) “the stuff that
people have and computers don’t.”)
Let’s Introspect For a Moment...
 Have you ever learned something by rote that you didn’t
really understand?
 Were you able to get a good grade on an essay where you
didn’t really know what you were talking about?
 Have you ever convinced somebody you know a lot about
something you really don’t?
 Are you a Chinese room??
 What does “understanding” really mean?
 What is intentionality? Are human beings the only entities
that can ever have it?
 What is consciousness? Why do we have it and other
animals and inanimate objects don’t? (Or do they?)
Just You Wait...