Turing machine
CDT314
FABER
Formal Languages, Automata
and Models of Computation
Lecture 15
School of Innovation, Design and Engineering
Mälardalen University
2011
1
Content
Church-Turing Thesis
Other Models of Computation
Turing Machines
Recursive Functions
Post Systems
Rewriting Systems
Matrix Grammars
Markov Algorithms
Lindenmayer-Systems
Natural/Physical Computation
Biological Computing
Quantum Computing
2
THE CONCEPT OF COMPUTABILITY
“I explore the conceptual foundations of Alan Turing’s analysis of computability, which
still dominates thinking about computability today. I argue that Turing’s account
represents a last vestige of a famous but unsuccessful program in pure mathematics,
viz., Hilbert’s formalist program.
It is my contention that the plausibility of Turing’s account as an analysis of the
computational capacities of physical machines rests upon a number of highly
problematic assumptions whose plausibility in turn is grounded in the formalist stance
towards mathematics.
More specifically, the Turing account conflates concepts that are crucial for
understanding the computational capacities of physical machines. These concepts
include the idea of an “operation” or “action” that is “formal,” “mechanical,” “well-defined,”
and “precisely described,” and the idea of a “symbol” that is “formal,” “uninterpreted,” and “shaped”.
When these concepts are disentangled, the intuitive appeal of Turing’s account is
significantly undermined. This opens the way for exploring models of
hypercomputability that are fundamentally different from those currently entertained in
the literature.” Carol E. Cleland
http://www.sciencedirect.com/science/journal/03043975
http://www.damtp.cam.ac.uk/events/strings02/dirac/hawking Stephen Hawking: Gödel and
the end of Physics
3
Church-Turing Thesis*
*Source: Stanford Encyclopaedia of Philosophy
4
A Turing machine is an abstract
representation of a computing device.
It is more like a computer program (software)
than a computer (hardware).
5
LCMs [Logical Computing Machines:
Turing’s expression for Turing machines]
were first proposed by Alan Turing,
in an attempt to give
a mathematically precise definition
of "algorithm" or "mechanical procedure".
6
The Church-Turing thesis
concerns an
effective or mechanical method
in logic and mathematics.
7
A method, M, is called ‘effective’ or ‘mechanical’ just in case:
1. M is set out in terms of a finite number of exact
instructions (each instruction being expressed by means
of a finite number of symbols);
2. M will, if carried out without error, always produce the
desired result in a finite number of steps;
3. M can (in practice or in principle) be carried out by a
human being unaided by any machinery except for
paper and pencil;
4. M demands no insight or ingenuity on the part of the
human being carrying it out.
8
Turing’s thesis: LCMs [logical computing
machines; TMs] can do anything that could
be described as "rule of thumb" or "purely
mechanical". (Turing 1948:7)
He adds: This is sufficiently well established that it is
now agreed amongst logicians that "calculable by
means of an LCM" is the correct accurate rendering
of such phrases.
http://www.alanturing.net/turing_archive/pages/Reference%20Articles/The
%20Turing-Church%20Thesis.html AlanTuring.net (Jack Copeland)
9
Turing introduced this thesis in the course of
arguing that the Entscheidungsproblem, or
decision problem, for the predicate calculus - posed by Hilbert (1928) - is unsolvable.
10
Church’s account of the Entscheidungsproblem
By the Entscheidungsproblem of a system of
symbolic logic is here understood the problem
to find an effective method by which, given any
expression Q in the notation of the system, it
can be determined whether or not Q is provable
in the system.
11
The truth table test is such a method for
the propositional logic.
Turing showed that, given his thesis,
there can be no such method for the
predicate logic.
Propositional logic (also called sentential calculus or sentential logic) is a formal system in which formulas of a formal language
may be interpreted as representing propositions. A system of inference rules and axioms allows theorems to be derived as true
propositions. http://en.wikipedia.org/wiki/Propositional_calculus
In mathematical logic, predicate logic refers to symbolic formal systems such as first-order logic, second-order logic, or many-sorted
logic. Such a formal system is distinguished from others in that its formulae contain variables which can be quantified.
Two common quantifiers are the existential ∃ ("there exists") and universal ∀ ("for all") quantifiers.
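As a concrete illustration of such an effective method, here is a minimal Python sketch of the truth table test (illustrative only; encoding the formula as a Python predicate is a simplification made for brevity).

```python
from itertools import product

def is_tautology(formula, variables):
    """Truth-table test: try every assignment of True/False to the variables."""
    return all(
        formula(dict(zip(variables, values)))
        for values in product([False, True], repeat=len(variables))
    )

# Example: (p -> q) <-> (not q -> not p), encoded as a plain Python predicate.
f = lambda v: ((not v["p"]) or v["q"]) == ((not (not v["q"])) or (not v["p"]))
print(is_tautology(f, ["p", "q"]))   # True: 2^2 = 4 rows suffice
```

The point of the example: finitely many rows, checked mechanically, settle the question for propositional logic; Turing's result is that no analogous procedure exists for predicate logic.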
12
http://en.wikipedia.org/wiki/Predicate_logic
Turing proved formally that there is no TM
which can determine, in a finite number of
steps, whether or not any given formula of the
predicate calculus is a theorem of the
calculus.
So, given his thesis that if an effective method
exists then it can be carried out by one of his
machines, it follows that there is no such
method to be found.
13
Church’s thesis: A function of positive
integers is effectively calculable only if
recursive.
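To give a concrete sense of "recursive" here, the following minimal Python sketch (illustrative only) builds addition and multiplication on the natural numbers by primitive recursion from the successor function.

```python
# Primitive recursion over natural numbers, sketched with plain Python integers.
def succ(n):              # the basic successor function S(n) = n + 1
    return n + 1

def add(x, y):            # add(x, 0) = x ; add(x, S(y)) = S(add(x, y))
    return x if y == 0 else succ(add(x, y - 1))

def mul(x, y):            # mul(x, 0) = 0 ; mul(x, S(y)) = add(mul(x, y), x)
    return 0 if y == 0 else add(mul(x, y - 1), x)

print(add(3, 4), mul(3, 4))   # 7 12
```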
14
Misunderstandings of the Turing Thesis
Turing did not show that his machines can solve any
problem that can be solved "by instructions, explicitly
stated rules, or procedures", nor did he prove that
a universal Turing machine "can compute any
function that any computer, with any architecture, can
compute".
15
Turing proved that his universal machine can
compute any function that any Turing machine can
compute; and he put forward, and advanced
philosophical arguments in support of, the thesis here
called Turing’s thesis.
16
A thesis concerning the extent of effective methods -
procedures that a human being unaided by
machinery is capable of carrying out - has no
implication concerning the extent of the procedures
that machines are capable of carrying out, even
machines acting in accordance with ‘explicitly stated
rules’.
17
Among a machine’s repertoire of atomic
operations there may be those that no human
being unaided by machinery can perform.
18
Turing introduces his machines as an
idealised description of a certain human
activity, the tedious one of numerical
computation, which until the advent of
automatic computing machines was the
occupation of many thousands of people in
commerce, government, and research
establishments. (Jack Copeland)
19
Turing’s "Machines". These machines are
humans who calculate. (Wittgenstein)
A man provided with paper, pencil, and
rubber, and subject to strict discipline, is in
effect a universal machine. (Turing)
20
The Entscheidungsproblem is the problem of
finding a humanly executable procedure of a
certain sort, and Turing’s aim was precisely to
show that there is no such procedure in the
case of predicate logic.
21
Other Models of
Computation
Equivalent to
Turing Machines
22
Equivalent Models of Computation
• Turing Machines
• Recursive Functions
• Post Systems
• Rewriting Systems
23
Turing’s Thesis
A computation is mechanical if and only if
it can be performed by a Turing Machine.
Church’s Thesis (extended)
All models of computation are equivalent.
24
Post Systems
• Axioms
• Productions
Similar to unrestricted grammars
A Post canonical system, as created by Emil Post, is a string-manipulation system that starts with finitely many
strings and repeatedly transforms them by applying a finite set of specified rules, thus generating a formal language.
Today they are mainly of historical relevance, because every Post canonical system can be reduced to a string
rewriting system.
25
Example: Unary Addition
Axiom: 1 + 1 = 11
Productions:
V1 + V2 = V3  →  V1 1 + V2 = V3 1
V1 + V2 = V3  →  V1 + V2 1 = V3 1
26
Axiom: 1 + 1 = 11
Productions:
V1 + V2 = V3  →  V1 1 + V2 = V3 1
V1 + V2 = V3  →  V1 + V2 1 = V3 1
Derivation: 1 + 1 = 11  ⇒  11 + 1 = 111  ⇒  11 + 11 = 1111
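A minimal Python sketch of this Post system (illustrative only; the pattern matching is simplified to splitting the string on '+' and '=', which is enough for this particular system).

```python
# Post system for unary addition, sketched in Python.
# Axiom: 1 + 1 = 11
# P1: V1 + V2 = V3  ->  V1 1 + V2 = V3 1
# P2: V1 + V2 = V3  ->  V1 + V2 1 = V3 1

def split(s):
    left, v3 = s.split("=")
    v1, v2 = left.split("+")
    return v1, v2, v3

def p1(s):
    v1, v2, v3 = split(s)
    return f"{v1}1+{v2}={v3}1"

def p2(s):
    v1, v2, v3 = split(s)
    return f"{v1}+{v2}1={v3}1"

s = "1+1=11"          # the axiom
s = p1(s)             # 11+1=111
s = p2(s)             # 11+11=1111
print(s)              # every generated string is a true unary addition
```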
27
Rewriting Systems
They convert one string to another
• Matrix Grammars
• Markov Algorithms
• Lindenmayer-Systems (L-Systems)
Very similar to unrestricted grammars.
28
Matrix Grammars
Example:
P1 : S → S1S2
P2 : S1 → aS1, S2 → bS2c
P3 : S1 → λ, S2 → λ
Derivation:
S ⇒ S1S2 ⇒ aS1bS2c ⇒ aaS1bbS2cc ⇒ aabbcc
A set of productions is applied simultaneously.
29
P1 : S → S1S2
P2 : S1 → aS1, S2 → bS2c
P3 : S1 → λ, S2 → λ
L = { a^n b^n c^n : n ≥ 0 }
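A minimal Python sketch of matrix rewriting (illustrative only; a matrix is applied by firing each of its productions once, and the naive string replacement used here is good enough for this example).

```python
# Matrix grammar sketch: a matrix is a list of productions applied together.
P1 = [("S", "S1S2")]
P2 = [("S1", "aS1"), ("S2", "bS2c")]
P3 = [("S1", ""), ("S2", "")]          # empty string stands for lambda

def apply_matrix(string, matrix):
    for lhs, rhs in matrix:            # every production in the matrix fires once
        string = string.replace(lhs, rhs, 1)
    return string

s = "S"
for m in [P1, P2, P2, P3]:             # derive aabbcc
    s = apply_matrix(s, m)
    print(s)                           # S1S2, aS1bS2c, aaS1bbS2cc, aabbcc
```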
30
Markov Algorithms
Grammars that produce λ (the empty string)
Example:
ab → S
aSb → S
S → λ
Derivation:
aaabbb ⇒ aaSbb ⇒ aSb ⇒ S ⇒ λ
31
ab → S
aSb → S
S → λ
L = { a^n b^n : n ≥ 0 }
In general:
L = { w : w ⇒* λ }
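A minimal Python sketch of the usual Markov-algorithm control structure - apply the first rule in the list whose left side occurs, at its leftmost occurrence, and halt when no rule applies (illustrative only).

```python
# Markov algorithm sketch: ordered rules, leftmost occurrence, first applicable rule.
rules = [("ab", "S"), ("aSb", "S"), ("S", "")]   # S -> empty string (lambda)

def run_markov(s, rules, max_steps=1000):
    for _ in range(max_steps):
        for lhs, rhs in rules:
            if lhs in s:
                s = s.replace(lhs, rhs, 1)       # leftmost occurrence only
                break
        else:
            break                                # no rule applies: halt
    return s

print(repr(run_markov("aaabbb", rules)))   # '' : aaabbb reduces to lambda, so it is in {a^n b^n}
print(repr(run_markov("aab", rules)))      # 'a': leftover symbol, so aab is not in the language
```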
32
Lindenmayer-Systems
They are parallel rewriting systems
Example: a → aa
Derivation: a ⇒ aa ⇒ aaaa ⇒ aaaaaaaa
L = { a^(2^n) : n ≥ 0 }
33
Lindenmayer-Systems are not as general as
recursively enumerable languages.
Extended Lindenmayer-Systems: ( x, a, y ) → u
where x and y are the context of a
34
L-System Example: Fibonacci numbers
Consider the following simple grammar:
variables : A B
constants : none
start: A
rules: A → B
B → AB
35
This L-system produces the following sequence
of strings ...
start: A
rules: A → B
B → AB
Stage 0 : A
Stage 1 : B
Stage 2 : AB
Stage 3 : BAB
Stage 4 : ABBAB
Stage 5 : BABABBAB
Stage 6 : ABBABBABABBAB
Stage 7 : BABABBABABBABBABABBAB
36
If we count the length of each string, we obtain
the Fibonacci sequence of numbers :
1 1 2 3 5 8 13 21 34 ....
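A minimal Python sketch of this parallel rewriting (illustrative only); the same function also runs the algal-growth system of the following slides.

```python
# Parallel (L-system) rewriting: every symbol is replaced simultaneously each stage.
def l_system(axiom, rules, stages):
    strings = [axiom]
    for _ in range(stages):
        axiom = "".join(rules.get(c, c) for c in axiom)   # symbols with no rule stay
        strings.append(axiom)
    return strings

fib_rules = {"A": "B", "B": "AB"}
for i, s in enumerate(l_system("A", fib_rules, 7)):
    print(i, s, len(s))    # the lengths 1, 1, 2, 3, 5, 8, 13, 21 are Fibonacci numbers

# The algal-growth system from the next slides works the same way:
# l_system("A", {"A": "DB", "B": "C", "C": "D", "D": "E", "E": "A"}, 11)
```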
37
Example - Algal growth
The figure shows the pattern of cell lineages found in
the alga Chaetomorpha linum.
To describe this pattern, we must let the symbols
denote cells in different states, rather than different
structures.
38
This growth process can be generated from an
axiom A and growth rules
A → DB
B → C
C → D
D → E
E → A
39
Here is the pattern generated by this model.
It matches the arrangement of cells in the original alga.
Stage 0 :  A
Stage 1 :  D B
Stage 2 :  E C
Stage 3 :  A D
Stage 4 :  D B E
Stage 5 :  E C A
Stage 6 :  A D D B
Stage 7 :  D B E E C
Stage 8 :  E C A A D
Stage 9 :  A D D B D B E
Stage 10 : D B E E C E C A
Stage 11 : E C A A D A D D B
(rules: A → DB, B → C, C → D, D → E, E → A)
40
Example - a compound leaf (or branch)
Leaf1 {            ; Name of the l-system, "{" indicates start
                   ; Compound leaf with alternating branches
  angle 8          ; Set angle increment to (360/8)=45 degrees
  axiom x          ; Starting character string
  a=n              ; Change every "a" into an "n"
  n=o              ; Likewise change "n" to "o" etc ...
  o=p
  p=x
  b=e
  e=h
  h=j
  j=y
  x=F[+A(4)]Fy     ; Change every "x" into "F[+A(4)]Fy"
  y=F[-B(4)]Fx     ; Change every "y" into "F[-B(4)]Fx"
  F=@1.18F@i1.18   ; Scale the segment length up by 1.18, draw F, scale back down
}                  ; final } indicates end
41
http://www.xs4all.nl/~cvdmark/tutor.html
L-systems tutorial with animations
http://nptel.iitm.ac.in/courses/106106049/39 Lecture on L systems by Prof.
Kamala Krithivasan
http://www.youtube.com/watch?v=l3Txk_difKA&feature=related rose
http://www.youtube.com/watch?v=ZVyi4ykVHJM&feature=related apple
tree generated by L system
42
Here is a series of forms created by slowly changing the angle parameter.
lsys00.ls
Check the rest of the Gallery of L-systems:
http://home.wanadoo.nl/laurens.lapre
43
[Figure: plant-environment feedback loop. In both the plant and its environment, reception feeds internal processes, which produce a response acting on the other side.]
A model of a horse chestnut tree
inspired by the work of Chiba
and Takenaka.
Here branches compete for light
from the sky hemisphere. Clusters
of leaves cast shadows on branches
further down. An apex in shade
does not produce new branches. An
existing branch whose leaves do not
receive enough light dies and is
shed from the tree. In such a
manner, the competition for light
controls the density of branches in
the tree crowns.
44
45
Apropos adaptive reactive systems:
"What's the color of a chameleon put onto a mirror?"
Stewart Brand
(It must be possible to verify experimentally, mustn't it?)
http://research.microsoft.com/pubs/144550/CACM_11.pdf Biology as Reactivity From
Communications of ACM (Association for Computing Machinery)
46
Physical (Natural)
Computation
47
Feynman, R. The Character of Physical Law, page 57:
"It always bothers me that, according to the laws as we
understand them today, it takes a computing machine an infinite
number of logical operations to figure out what goes on in no
matter how tiny a region of space, and no matter how tiny a
region of time. How can all that be going on in that tiny space?
Why should it take an infinite amount of logic to figure out what
one tiny piece of space/time is going to do? So I have often
made the hypotheses that ultimately physics will not require a
mathematical statement, that in the end the machinery will be
revealed, and the laws will turn out to be simple, like the
chequer board with all its apparent complexities".
48
Biological Computing
This part of the lecture can be made into a whole course, see
http://web.eecs.utk.edu/~mclennan
http://web.eecs.utk.edu/~mclennan/Classes/420/ Biologically-Inspired Computation
49
DNA Computing
DNA computing is a form of computing which uses
DNA, biochemistry and molecular biology, instead of
the traditional silicon-based computer technologies.
DNA computing, or, more generally, biomolecular
computing, is a fast developing interdisciplinary area.
Research and development in this area concerns
theory, experiments and applications of DNA
computing.
http://en.wikipedia.org/wiki/DNA_computing
50
DNA
51
DNA Computing
Despite their respective complexities, biological and
mathematical operations have some similarities:
• The very complex structure of a living being is the
result of applying simple operations to initial
information encoded in a DNA sequence (genes).
• All complex math problems can be reduced to
simple operations like addition and subtraction.
52
For the same reasons that DNA was presumably
selected (by natural selection) for living organisms as
a genetic material, its stability and predictability in
reactions, DNA strings can also be used to encode
information for mathematical systems.
54
The Hamiltonian Path Problem
The objective is to find a path from start to end going
through all the points only once.
This problem is difficult for conventional (serial logic)
computers because they must try each path one at a
time. It is like having a whole bunch of keys and
trying to see which fits a lock.
55
Conventional computers are very good at math, but
poor at "key into lock" problems. DNA based
computers can try all the keys at the same time
(massively parallel) and thus are very good at key-into-lock problems, but much slower at simple
mathematical problems like multiplication.
The Hamiltonian Path problem was chosen because
every key-into-lock problem can be solved as a
Hamiltonian Path problem.
56
Solving the Hamiltonian Path Problem
1. Generate random paths through the graph.
2. Keep only those paths that begin with the start city
(A) and conclude with the end city (G).
3. Because the graph has 7 cities, keep only those
paths with 7 cities.
4. Keep only those paths that enter all cities at least
once.
5. Any remaining paths are solutions.
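A brute-force software analogue of these five steps, as a minimal Python sketch (the 7-city edge list is an invented example, since the slides do not give Adleman's actual graph).

```python
# Software analogue of the filtering steps on a small directed graph.
# The edge list below is an invented 7-city example, not Adleman's original graph.
from itertools import product

cities = "ABCDEFG"
edges = {("A","B"), ("B","C"), ("C","D"), ("D","E"), ("E","F"), ("F","G"),
         ("A","D"), ("C","F"), ("B","E")}

def candidate_paths(length):
    """Step 1: 'generate' paths; here we simply enumerate all edge sequences."""
    for seq in product(cities, repeat=length):
        if all((a, b) in edges for a, b in zip(seq, seq[1:])):
            yield seq

solutions = [
    p for p in candidate_paths(7)
    if p[0] == "A" and p[-1] == "G"        # step 2: starts at A, ends at G
    and len(p) == 7                        # step 3: exactly 7 cities
    and set(p) == set(cities)              # step 4: visits every city
]                                          # step 5: whatever remains is an answer
print(solutions)    # [('A', 'B', 'C', 'D', 'E', 'F', 'G')] for this edge list
```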
57
Solving the Hamiltonian Path Problem
The key to solving the problem was using DNA to
perform the five steps in the above algorithm.
These interconnecting blocks can be used to model
DNA:
58
DNA tends to form long double helices:
The two helices are joined by "bases", represented
here by coloured blocks. Each base binds only one
other specific base. In our example, we will say that
each coloured block will only bind with the same
colour. For example, if we only had red blocks, they
would form a long chain like this:
Any other colour will not bind with red:
59
Programming with DNA
Step 1: Create a unique DNA sequence for each city
A through G. For each path, for example, from A to B,
create a linking piece of DNA that matches the last
half of A and first half of B:
Here the red block represents city A, while the
orange block represents city B. The half-red, half-orange block connecting the two other blocks
represents the path from A to B.
In a test tube, all the different pieces of DNA will
randomly link with each other, forming paths through
the graph.
60
Programming with DNA
Step 2: Because it is difficult to "remove" DNA from
the solution, the target DNA - the DNA which started
at A and ended at G - was copied over and over again
until the test tube contained a lot of it relative to the
other random sequences.
This is essentially the same as removing all the other
pieces. Imagine a sock drawer which initially contains
one or two coloured socks. If you put in a hundred
black socks, chances are that all you will get if you
reach in is black socks!
61
Programming with DNA
Step 3: Going by weight, the DNA sequences which
were 7 "cities" long were separated from the rest.
A "sieve" was used which allows smaller pieces of
DNA to pass through quickly, while larger segments
are slowed down. The procedure used actually allows
you to isolate the pieces which are precisely 7 cities
long from any shorter or longer paths.
62
Programming with DNA
Step 4: To ensure that the remaining sequences went
through each of the cities, "sticky" pieces of DNA
attached to magnets were used to separate the DNA.
The magnets were used to ensure that the target
DNA remained in the test tube, while the unwanted
DNA was washed away. First, the magnets kept all
the DNA which went through city A in the test tube,
then B, then C, and D, and so on. In the end, the only
DNA which remained in the tube was that which went
through all seven cities.
63
Programming with DNA
Step 5: All that was left was to sequence the DNA,
revealing the path from A to B to C to D to E to F to G.
64
Advantages of DNA computing
The above procedure took approximately one week
to perform. Although this particular problem could be
solved on a piece of paper in under an hour, when
the number of cities is increased to 70, the problem
becomes too complex for even a supercomputer.
While a DNA computer takes much longer than a
normal computer to perform each individual
calculation, it performs an enormous number of
operations at a time (massively parallel).
65
DNA computers also require less energy and space
than normal computers. 1000 litres of water could
contain DNA with more memory than all the
computers ever made, and a pound of DNA would
have more computing power than all the computers
ever made.
66
The Future of DNA computing
DNA computing was initially developed by Leonard
Adleman of the University of Southern California in
1994, and it is a research area under intense
development. It is too early for either great optimism
or great pessimism.
Early computers such as ENIAC filled entire rooms,
and had to be programmed by punch cards. Since
that time, computers have become much smaller and
easier to use.
67
DNA computers will probably become more common
in the future for solving very complex problems.
Just as DNA cloning and sequencing were once
manual tasks, DNA computers will also become
automated.
Read more at:
http://research.microsoft.com/en-us/projects/dna/
http://www.casi.net/D.BioInformatics1/D.Fall2000ClassPage/DC2/dc2.htm
http://www.alexpetty.com/2010/09/11/vortex-math-based-computing/
http://news.bbc.co.uk/2/hi/technology/8184033.stm
68
Quantum Computing
This part of the lecture can also be made into a whole
course, see courses in Quantum Computing:
http://www.qubit.org/courses.html
69
Today: fraction-of-a-micron (10^-6 m) wide logic
gates and wires on the surface of silicon
chips.
Soon they will yield even smaller parts and
inevitably reach a point where logic gates are
so small that they are made out of only a
handful of atoms.
1 nm = 10^-9 m; helium atom ≈ 10^-10 m
70
On the atomic scale matter obeys the rules of
quantum mechanics, which are quite different
from the classical rules that determine the
properties of conventional logic gates.
So if computers are to become smaller in the
future, new, quantum technology must
replace or supplement what we have now.
71
What is quantum mechanics?
The deepest theory of physics; the framework
within which all other current theories, except
the general theory of relativity, are
formulated. Some of its features are:
Quantisation (which means that observable
quantities do not vary continuously but come
in discrete chunks or 'quanta'). This is the one
that makes computation, classical or
quantum, possible at all.
72
Interference (which means that the outcome
of a quantum process in general depends on
all the possible histories of that process).
This is the feature that makes quantum
computers qualitatively more powerful than
classical ones.
73
Entanglement (Two spatially separated and
non-interacting quantum systems that have
interacted in the past may still have some
locally inaccessible information in common –
information which cannot be accessed in any
experiment performed on either of them
alone.)
This is the one that makes quantum
cryptography possible.
74
The discovery that quantum physics allows
fundamentally new modes of information
processing has required the existing theories
of computation, information and cryptography
to be superseded by their quantum
generalisations.
75
Let us try to reflect a single photon off a half-silvered
mirror, i.e. a mirror which reflects exactly half of the
light which impinges upon it, while the remaining half
is transmitted directly through it.
It seems that it would be sensible to say that the
photon is either in the transmitted or in the reflected
beam with the same probability.
76
Indeed, if we place two photodetectors behind the half-silvered mirror in direct lines of the two beams, the
photon will be registered with the same probability
either in the detector 1 or in the detector 2.
Does it really mean that after the half-silvered mirror
the photon travels in either reflected or transmitted
beam with the same probability 50%?
No, it does not! In fact the photon takes 'two paths at
once'.
77
This can be demonstrated by recombining the two beams
with the help of two fully silvered mirrors and placing
another half-silvered mirror at their meeting point, with two
photodetectors in direct lines of the two beams.
With this set up we can observe a truly amazing quantum
interference phenomenon.
78
If it were merely the case that there were a 50%
chance that the photon followed one path and a 50%
chance that it followed the other, then we should find
a 50% probability that one of the detectors registers
the photon and a 50% probability that the other one
does.
However, that is not what happens. If the two
possible paths are exactly equal in length, then it
turns out that there is a 100% probability that the
photon reaches the detector 1 and 0% probability that
it reaches the other detector 2. Thus the photon is
certain to strike the detector 1!
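A short worked calculation reproduces these numbers, under the common idealization of a half-silvered mirror as the unitary (1/√2)[[1, 1], [1, -1]] acting on the two path amplitudes (a sketch, not part of the slides).

```python
# Two-path photon interference, sketched with a 2x2 unitary per half-silvered mirror.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # idealized 50/50 beam splitter
photon = np.array([1, 0])                      # photon enters along path 0

after_one = H @ photon
print(np.abs(after_one) ** 2)   # [0.5 0.5]: one mirror alone gives 50/50 detection

after_two = H @ H @ photon
print(np.abs(after_two) ** 2)   # [1. 0.]: with both paths open, detector 1 (index 0) always fires

blocked = after_one * np.array([1, 0])         # absorbing screen removes one path
blocked = H @ blocked
print(np.abs(blocked) ** 2)     # [0.25 0.25]: blocking a path makes the two detectors equally likely
```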
79
It seems inescapable that the photon must, in some
sense, have actually travelled both routes at once, for if an
absorbing screen is placed in the way of either of the two
routes, then it becomes equally probable that detector 1
or 2 is reached.
80
Blocking off one of the paths actually allows detector
2 to be reached. With both routes open, the photon
somehow knows that it is not permitted to reach
detector 2, so it must have actually “felt out” both
routes.
It is therefore perfectly legitimate to say that between
the two half-silvered mirrors the photon took both the
transmitted and the reflected paths.
81
Using more technical language, we can say that the
photon is in a coherent superposition of being in the
transmitted beam and in the reflected beam.
In much the same way an atom can be prepared in a
superposition of two different electronic states, and in
general a quantum two state system, called a
quantum bit or a qubit, can be prepared in a
superposition of its two logical states 0 and 1. Thus
one qubit can encode at a given moment of time both
0 and 1.
82
In principle we know how to build a quantum
computer; we can start with simple quantum logic
gates and try to integrate them together into quantum
circuits.
A quantum logic gate, like a classical gate, is a simple
computing device that performs one elementary
quantum operation, usually on two qubits, in a given
period of time.
Of course, quantum logic gates are different from their
classical counterparts because they can create and
perform operations on quantum superpositions.
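A minimal numerical sketch of such a circuit, assuming the standard Hadamard and CNOT matrices (illustrative only): a Hadamard puts one qubit into superposition, and a CNOT then entangles it with a second qubit.

```python
# Two-qubit gate sketch: Hadamard on qubit 0, then CNOT, starting from |00>.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)       # single-qubit Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                    # flips qubit 1 if qubit 0 is 1

state = np.array([1, 0, 0, 0])                     # |00>
state = np.kron(H, I) @ state                      # qubit 0 into superposition
state = CNOT @ state                               # entangle the two qubits
print(state)   # approximately [0.707 0 0 0.707]: the Bell state (|00> + |11>)/sqrt(2)
```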
83
So the advantage of quantum computers
arises from the way they encode a bit, the
fundamental unit of information.
The state of a bit in a classical digital
computer is specified by one number, 0 or 1.
An n-bit binary word in a typical computer is
accordingly described by a string of n zeros
and ones.
84
A quantum bit, called a qubit, might be
represented by an atom in one of two
different states, which can also be denoted as
0 or 1.
Two qubits, like two classical bits, can attain
four different well-defined states (0 and 0, 0
and 1, 1 and 0, or 1 and 1).
85
But unlike classical bits, qubits can exist
simultaneously as 0 and 1, with the
probability for each state given by a
numerical coefficient.
Describing a two-qubit quantum computer
thus requires four coefficients. In general, n
qubits demand 2^n numbers, which rapidly
becomes a sizable set for larger values of n.
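A minimal Python sketch of this bookkeeping (illustrative only): an n-qubit state needs 2^n complex coefficients, and for n = 50 that is already about 10^15 numbers.

```python
# State-vector bookkeeping: n qubits require 2**n complex amplitudes.
import numpy as np

def uniform_superposition(n):
    """All 2**n basis states with equal amplitude."""
    dim = 2 ** n
    return np.full(dim, 1 / np.sqrt(dim))

psi = uniform_superposition(3)
print(len(psi), np.sum(np.abs(psi) ** 2))   # 8 amplitudes, probabilities sum to 1

for n in (10, 30, 50):
    print(n, "qubits ->", 2 ** n, "coefficients")   # 2**50 is about 1e15
```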
86
For example, if n equals 50, about 10^15 numbers are
required to describe all the probabilities for all the
possible states of the quantum machine - a number
that exceeds the capacity of the largest conventional
computer.
A quantum computer promises to be immensely
powerful because it can be in multiple states at once
(superposition) - and because it can act on all its
possible states simultaneously.
Thus, a quantum computer could naturally perform
myriad operations in parallel, using only a single
processing unit.
87
The most famous example of the extra power of a
quantum computer is Peter Shor's algorithm for
factoring large numbers.
Factoring is an important problem in cryptography; for
instance, the security of RSA public key cryptography
depends on factoring being a hard problem.
Despite much research, no efficient classical
factoring algorithm is known.
88
However if we keep on putting quantum gates
together into circuits we will quickly run into some
serious practical problems.
The more interacting qubits are involved the harder it
tends to be to engineer the interaction that would
display the quantum interference.
Apart from the technical difficulties of working at
single-atom and single-photon scales, one of the
most important problems is that of preventing the
surrounding environment from being affected by the
interactions that generate quantum superpositions.
89
The more components the more likely it is that
quantum computation will spread outside the
computational unit and will irreversibly dissipate
useful information to the environment.
This process is called decoherence. Thus the race is
to engineer sub-microscopic systems in which qubits
interact only with themselves but not with the
environment.
90
But, the problem is not entirely new!
Remember STM?
(Scanning Tunnelling Microscope )
The STM was a Nobel Prize-winning invention by Binnig and
Rohrer at the IBM Zurich Laboratory in the early 1980s.
91
Title : Quantum Corral
Media : Iron on Copper (111)
92
Scientists discovered a new method for confining
electrons to artificial structures at the nanometer
length scale.
Surface state electrons on Cu(111) were confined to
closed structures (corrals) defined by barriers built
from Fe atoms.
The barriers were assembled by individually
positioning Fe atoms using the tip of a low
temperature scanning tunnelling microscope (STM).
A circular corral of radius 71.3 Angstrom was
constructed in this way out of 48 Fe atoms.
93
The standing-wave patterns in the local density of states of the Cu(111)
surface. These spatial oscillations are quantum-mechanical interference
patterns caused by scattering of the two-dimensional electron gas off the Fe
atoms and point defects.
94
What will quantum computers be good at?
The most important applications currently known:
• Cryptography: perfectly secure communication.
• Searching, especially algorithmic searching (Grover's
algorithm).
• Factorising large numbers very rapidly
(Shor's algorithm).
• Simulating quantum-mechanical systems efficiently
Read more:
http://www.howstuffworks.com/quantum-computer.htm (How stuff works)
http://www.sciencedaily.com/news/computers_math/quantum_computers/
95
WHAT IS COMPUTATION?
Theoretical Computer Science 317 (2004)
Burgin, M., Super-Recursive Algorithms, Springer
Monographs in Computer Science, 2005, ISBN: 0-387-95569-0
Minds and Machines (1994, 4, 4) “What is Computation?”
Journal of Logic, Language and Information (Volume 12 No 4
2003) What is information?
96
NATURAL COMPUTING RESOURCES
http://en.wikipedia.org/wiki/Natural_computing
http://nptel.iitm.ac.in/courses/106106049/39 Theory of Automata,
Formal Languages and Computation. Video Lectures by Prof.
Kamala Krithivasan
http://nptel.iitm.ac.in/courses/106106049
http://www.nature.com/nature/journal/v419/n6905/full/41
9343a.html Cellular abstractions: Cells as computation
http://www.wisdom.weizmann.ac.il/~biospi CONCURRENT
BIOLOGICAL PROCESSES
97
NATURAL COMPUTING RESOURCES
http://www.amazon.co.uk/Fundamentals-Natural-ComputingApplicationsInformation/dp/1584886439/ref=sr_1_4?ie=UTF8&q
id=1326058908&sr=8-4#reader_1584886439
Fundamentals of Natural Computing: Basic Concepts (book)
http://www.springerlink.com/content/1567-7818/10 Natural
Computing Journal, Springer
http://www.oldcitypublishing.com/IJUC/IJUC.html International
Journal of Unconventional Computing
http://www.gcn.us.es Research Group on Natural Computing
98
NATURAL COMPUTING RESOURCES
http://www.amazon.co.uk/Fundamentals-Natural-Computing-ApplicationsInformation/dp/1584886439/ref=sr_1_9?ie=UTF8&qid=1326119722&sr
=8-9#reader_1584886439 Leandro Nunes de Castro,
Fundamentals of Natural Computing: Basic Concepts,
Algorithms, and Applications (book)
http://web.eecs.utk.edu/~mclennan Bruce MacLennan
http://bio.informatics.indiana.edu/bio/index.php
Bioinformatics Program at Indiana University
https://sites.google.com/site/naturalcomputingaisbiacap
2012 NATURAL/UNCONVENTIONAL COMPUTING AND ITS
PHILOSOPHICAL SIGNIFICANCE @ AISB/IACAP, 2nd - 6th
July 2012
99
Theoretical Computer Science, 2004 Volume:317 issue:1-3
Hypercomputation
Three aspects of super-recursive algorithms and hypercomputation or finding black swans
Burgin, Klinger
Toward a theory of intelligence Kugel
Algorithmic complexity of recursive and inductive algorithms Burgin
Characterizing the super-Turing computing power and efficiency of classical fuzzy Turing
machines Wiedermann
Experience, generations, and limits in machine learning Burgin, Klinger
Hypercomputation with quantum adiabatic processes Kieu
Super-tasks, accelerating Turing machines and uncomputability Shagrir
Natural computation and non-Turing models of computation MacLennan
Continuous-time computation with restricted integration capabilities Campagnolo
The modal argument for hypercomputing minds Bringsjord, Arkoudas
Hypercomputation by definition Wells
The concept of computability Cleland
Uncomputability: the problem of induction internalized Kelly
Hypercomputation: philosophical issues Copeland
100
COURSE WRAP-UP
Highlights
101
REGULAR LANGUAGES
Mathematical Preliminaries
Languages, Alphabets and Strings
Operations on Strings
Operations on Languages
Regular Expressions
Finite Automata
Regular Grammars
102
CONTEXT-FREE LANGUAGES
Context-Free Languages, CFL
Pushdown Automata, PDA
Pumping Lemma for CFL
Selected CFL Problems
103
TURING MACHINES AND DECIDABILITY
Unrestricted Grammars
Turing Machines
Decidability
OTHER MODELS OF COMPUTATION
104
EXAM
1. DFA & RE ... minimal DFA
2. REGULAR OR NOT?
3. PUSHDOWN AUTOMATON (PDA)
4. CONTEXT-FREE LANGUAGES
5. PRIMITIVE RECURSION
6. TURING MACHINES
7. DECIDABLE OR NOT? JUSTIFY!
105
The final exam is open-book/open-notebook
which means you can have one book or
notebook of your choice with you.
If you have passed all midterms, you do not
need to write the final exam.
If you did not pass one of them (Regular,
Context-free or Turing Machines) you write
only that part.
106
THAT’S ALL FOLKS!
…so you should now be well on your way to finishing the course…
good luck!
107
REFERENCES
1. Lennart Salling, Formella språk, automater och
beräkningar
2. Peter Linz, An Introduction to Formal Languages
and Automata
3. http://www.cs.rpi.edu/courses/
Models of Computation, C Busch
4. http://www.cs.duke.edu/~rodger
Mathematical Foundations of Computer Science;
Susan H. Rodger; Duke University
5. Elaine Rich, Automata, Computability and
Complexity: Theory and Applications, Prentice Hall
108