
IT 691 Final Presentation
Pace University, Seidenberg School of Computer Science and Information Systems

Created by:
Robert M. Gust
Mark Lee
Samir Hessami

Project Description

• Research the Numenta Platform for Intelligent Computing (NuPIC), which attempts to harness brain-like function to solve problems
• Brainchild of Jeff Hawkins, inventor of the Palm Pilot

Artificial Intelligence

• Two schools of thought:
  • Understand how the human brain works and build models that behave in the same way (model the whole brain as a single entity)
  • Examine the interconnections of neurons in the brain and produce similar activity by leveraging its structure

*Picton, P. (2000). Neural Networks. Palgrave.

Artificial Intelligence

• Based around logic, usually in the form of a set of rules
  • Expert systems (medical diagnosis, game playing)
  • Fuzzy logic
• Mathematical models attempt to leverage the organization of neurons (nodes are arranged in various configurations to emulate brain function)
  • Neural networks (pattern recognition)

Artificial Intelligence

• What is a neural network?
  • “Neural Networks are an attempt to create machines that work in a similar way to the human brain by building these machines using components that behave like biological neurons”

*Picton, P. (2000). Neural Networks. Palgrave.

Examples of Neural Networks

• The perceptron is a type of artificial neural network invented in 1957 at the Cornell Aeronautical Laboratory by Frank Rosenblatt.
• The perceptron is a single-layer neural network whose weights and biases can be trained to produce a correct target *vector when presented with the corresponding input vector.
  • The training technique used is called the perceptron learning rule. Perceptrons are especially suited for simple problems in pattern classification. (A sketch of the rule follows below.)

*Vector: an ordered list of numerical values
*Source: http://en.wikipedia.org/wiki/
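
A minimal sketch of the perceptron learning rule described above. The AND-gate data, learning rate of 1, and epoch count are illustrative assumptions and do not come from the slides:

```python
# Perceptron trained with the perceptron learning rule.
# The AND-gate data, learning rate of 1, and epoch count are
# illustrative assumptions; they do not come from the slides.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # input vectors
t = np.array([0, 0, 0, 1])                      # target vector (AND gate)

w = np.zeros(2)   # weights
b = 0.0           # bias

for epoch in range(10):
    for x, target in zip(X, t):
        y = 1 if np.dot(w, x) + b > 0 else 0    # hard-threshold activation
        error = target - y
        w = w + error * x                       # perceptron learning rule:
        b = b + error                           # move weights by the error

print(w, b)   # a separating line for the AND problem
```

Because AND is linearly separable, the rule converges here in a few epochs; that separability requirement is exactly why perceptrons suit only simple classification problems.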

Examples of Neural Networks

• The best-known example of a neural network training algorithm is backpropagation (sketched below).
  • *Backpropagation is a supervised learning technique used for training artificial neural networks. The term is an abbreviation for “backwards propagation of errors”, which requires that the transfer function used by the artificial neurons (or “nodes”) be differentiable.

*Source: http://en.wikipedia.org/wiki/
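
A minimal sketch of backpropagation with one hidden layer, using the sigmoid as the differentiable transfer function the slide calls for. The XOR data, layer sizes, learning rate, and step count are assumptions made for the sake of a runnable example:

```python
# One-hidden-layer network trained with backpropagation. The sigmoid
# is used because backprop requires a differentiable transfer function,
# as the slide notes. XOR data, layer sizes, learning rate, and step
# count are assumptions.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)     # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)     # hidden -> output

for step in range(5000):
    h = sigmoid(X @ W1 + b1)            # forward pass
    y = sigmoid(h @ W2 + b2)
    # backwards propagation of errors via the chain rule:
    d2 = (y - t) * y * (1 - y)          # error signal at the output layer
    d1 = (d2 @ W2.T) * h * (1 - h)      # error signal at the hidden layer
    W2 -= 0.5 * h.T @ d2                # gradient-descent weight updates
    b2 -= 0.5 * d2.sum(axis=0)
    W1 -= 0.5 * X.T @ d1
    b1 -= 0.5 * d1.sum(axis=0)

print(y.round(2))   # approaches the XOR targets [0, 1, 1, 0]
```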

Hierarchical Temporal Memory (HTM)

• Hierarchical
  • Subdivide the problem so it may be addressed in a hierarchy
• Temporal
  • Include time in the pattern recognition problem
• Memory
  • Spatial (stored images used for comparative purposes in pattern recognition)

Numenta HTM Image Recognition

• The image is fed into the network
• The static image is moved within the field of view during both the learning phase and the identifying phase

Numenta HTM Network Level 1

• Level 1 – 64 nodes
• Level 1 is a direct mapping of the input image – A’s receptive field is 4 x 4 (a tiling sketch follows below)
• Examining a single node from Level 1
• This is unsupervised learning
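
A sketch of how Level 1 could map directly onto the input. The slide gives 64 nodes with 4 x 4 receptive fields; a 32 x 32 input image (an 8 x 8 grid of patches) is an assumption chosen so those numbers fit:

```python
# 64 Level 1 nodes, each with a 4 x 4 receptive field, mapped directly
# onto the input. A 32 x 32 image (an 8 x 8 grid of 4 x 4 patches) is
# assumed; the slides state the node count and receptive-field size
# but not the image size.
import numpy as np

image = np.arange(32 * 32).reshape(32, 32)   # stand-in for the input image

patches = (image
           .reshape(8, 4, 8, 4)     # split rows and columns into blocks of 4
           .transpose(0, 2, 1, 3)   # order as an 8 x 8 grid of patches
           .reshape(64, 4, 4))      # one 4 x 4 patch per Level 1 node

print(patches.shape)                # (64, 4, 4)
```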

HTM Node in Inference Mode

• The node is attempting to name the given input. In this case the pattern is identified as the binary 0100, which is passed up to Level 2 (see the sketch below).
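
A sketch of the naming step, assuming the node picks its closest stored pattern by pixelwise distance and reports the winner’s index in binary; the stored patterns themselves are placeholders:

```python
# A node in inference mode: compare the input against the node's stored
# spatial patterns pixelwise and pass the index of the best match to
# Level 2 in binary. The stored patterns are placeholders; the slides
# do not specify what a node has memorized.
import numpy as np

stored = [np.eye(4) * i for i in range(8)]     # 8 placeholder 4 x 4 patterns
x = np.eye(4) * 4                              # incoming pattern

dists = [np.abs(p - x).sum() for p in stored]  # pixelwise distance
winner = int(np.argmin(dists))                 # best-matching stored pattern

print(format(winner, "04b"))                   # "0100", passed up to Level 2
```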

Single HTM Node’s Set of Static Images

• Step 1: Spatial – form sets of images using pixelwise similarity. This creates a finite set of images for temporal analysis (sketched below).
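
A sketch of the spatial step, assuming that “forming sets by pixelwise similarity” can be approximated with a distance threshold; the threshold value and the random input stream are illustrative:

```python
# Step 1 (spatial): collapse incoming patterns into a finite set by
# pixelwise similarity. A pattern is stored only if it differs from
# every stored pattern in more than `threshold` pixels; the threshold
# and the random input stream are illustrative assumptions.
import numpy as np

def spatial_pool(patterns, threshold=2):
    codebook = []                           # the finite set of static images
    for p in patterns:
        if not any(np.abs(p - c).sum() <= threshold for c in codebook):
            codebook.append(p)              # pixelwise-novel pattern
    return codebook

rng = np.random.default_rng(0)
stream = [rng.integers(0, 2, (4, 4)) for _ in range(100)]
codebook = spatial_pool(stream)
print(len(codebook))   # finite set handed on to the temporal step
```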

Single HTM Node’s Set of Sequences

• Step 2: Temporal – form sets of images using their temporal proximity to one another (is pattern a frequently followed by pattern b?). A sketch follows below.
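
A sketch of the temporal step, assuming transition counting over the finite set produced in Step 1; the example sequence and the “frequently followed by” threshold are assumptions:

```python
# Step 2 (temporal): count how often pattern a is followed by pattern b,
# then group patterns that frequently follow one another. The example
# sequence of codebook indices and the count threshold are assumptions.
from collections import Counter

sequence = [0, 1, 0, 1, 2, 3, 2, 3, 2, 3]           # indices over time
transitions = Counter(zip(sequence, sequence[1:]))  # (a, b) -> count

groups = []                       # temporal groups of pattern indices
for (a, b), count in transitions.items():
    if count >= 2:                # "frequently followed by"
        for g in groups:
            if a in g or b in g:  # extend an existing group
                g.update((a, b))
                break
        else:
            groups.append({a, b})

print(groups)   # [{0, 1}, {2, 3}] -- pooled by temporal proximity
```

Grouping by temporal proximity is what lets a node pool images that look nothing alike pixelwise, which is the point made on the next slide.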

HTM Node in Inference Mode

• Spatial:
  • Remove noise
• Temporal:
  • Capable of pooling together patterns that are very dissimilar from a pixelwise perspective
  • Must have a finite set of points to use

Numenta HTM Network Levels 1 and 2

• Level 2 – 16 nodes
• Level 2 receives its input from the output of 4 Level 1 nodes – C & D’s receptive fields are 8 x 8

Numenta HTM Network Structure

• Level 1 – 64 nodes: a direct mapping of the input image – A’s receptive field is 4 x 4
• Level 2 – 16 nodes: receives its input from the output of 4 Level 1 nodes – C & D’s receptive fields are 8 x 8
• Level 3 – the invariant form (label / name)

A sketch of this geometry follows below.

*George, D. & Jaros, B. (2007). The HTM Learning Algorithms. Numenta.
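
A small sketch tying the slide’s numbers together. The node counts and receptive-field sizes for Levels 1 and 2 come from the slide; treating Level 3 as a single node with a full-image 32 x 32 receptive field is an assumption:

```python
# The network geometry from this slide: node counts and receptive-field
# sizes for Levels 1 and 2 come from the slides; a single Level 3 node
# with a full-image 32 x 32 receptive field is an assumption.
levels = [
    {"name": "Level 1", "nodes": 64, "rf": 4},   # direct image mapping
    {"name": "Level 2", "nodes": 16, "rf": 8},   # pools 4 Level 1 outputs
    {"name": "Level 3", "nodes": 1,  "rf": 32},  # invariant form (label)
]

for lower, upper in zip(levels, levels[1:]):
    fan_in = lower["nodes"] // upper["nodes"]
    print(f"{upper['name']}: each node pools {fan_in} {lower['name']} "
          f"outputs; receptive field {upper['rf']} x {upper['rf']}")
```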

Prediction

• What is prediction?
  • A statistically based assumption
  • “Neuroanatomists have known for a long time that the brain is saturated with feedback connections. For example, in the circuit between the neocortex and a lower structure called the thalamus, connections going backward (toward the input) exceed the connections going forward by almost a factor of ten! That is, for every fiber feeding information forward into the neocortex, there are ten fibers feeding information back toward the senses.”

*Hawkins, J. (2004). On Intelligence, p. 25.

Further Research

• Not just for image recognition
• Unsupervised learning opens up the possibility of determining causality for novel problems
• Hawkins: weather pattern example