Transcript talk

Physics and Machine Learning
“All the tricks that physicists use eventually end up in machine learning”
Energy – Physics Definitions
Energy - A measure of being able to do work. There are many forms of energy, such as heat, mechanical, electrical, radiant, chemical, and nuclear energies. Energy is measured in such units as the joule (J), erg, kilowatt-hour (kW-hr), kilocalorie (kcal), foot-pound (ft-lb), electron-volt (eV), and British thermal unit (BTU). –NASA.gov
"It is important to realize that in physics today, we have no
knowledge of what energy is. We do not have a picture that
energy comes in little blobs of a definite amount." -Richard
Feynman "Lectures on Physics"
Energy – Physics II
• Magnetic Arrays and Spin
Images: http://meso.phys.northwestern.edu/research/magneticarrays.html
Energy – Machine Learning
• H = -½ Σij wij Si Sj
[Figure: two spins Si and Sj coupled by the weight wij]
Energy – Machine Learning II
• Energy is the difference between the total weight of node pairs that agree and the total weight of node pairs that disagree.
• The more weights there are, the greater the energy.
• The “closer” the call, the lower |H| (a sketch of this energy follows below).
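Below is a minimal sketch (not from the talk) of the Hopfield-style energy H = -½ Σij wij Si Sj for binary spins Si in {-1, +1}; the function name and the toy weight matrix are illustrative assumptions.

import numpy as np

def hopfield_energy(S, W):
    # Energy H = -1/2 * sum_ij W[i, j] * S[i] * S[j] for spins S in {-1, +1}.
    return -0.5 * S @ W @ S

# Toy example: two spins coupled by a positive weight (symmetric matrix, zero diagonal).
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(hopfield_energy(np.array([+1, +1]), W))   # agreeing spins -> lower energy (-1.0)
print(hopfield_energy(np.array([+1, -1]), W))   # disagreeing spins -> higher energy (+1.0)

Agreement between connected spins lowers H and disagreement raises it, which is the sense in which the energy measures the weighted balance of agreeing and disagreeing nodes.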
Energy Minima
• Retrieval States – attractors at the stored patterns (retrieval sketch below)
• Mixture States – linear combinations of an odd number of attractors
• Spin Glass States – states uncorrelated with the attractors
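As an illustrative sketch of the retrieval-state idea (not from the talk): store one pattern with a Hebbian weight matrix, corrupt it, and let asynchronous spin updates fall back into the attractor. All names and sizes below are assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Store one pattern with the Hebbian rule: W = outer(pattern, pattern) / N, zero diagonal.
N = 50
pattern = rng.choice([-1, +1], size=N)
W = np.outer(pattern, pattern) / N
np.fill_diagonal(W, 0.0)

# Start from a noisy copy of the pattern: flip 10 of the 50 spins.
S = pattern.copy()
flipped = rng.choice(N, size=10, replace=False)
S[flipped] *= -1

# Asynchronous deterministic updates: each spin aligns with its local field.
for _ in range(5):
    for i in rng.permutation(N):
        S[i] = 1 if W[i] @ S >= 0 else -1

print(np.array_equal(S, pattern))   # True: the corrupted state falls into the retrieval attractor

With only one stored pattern the retrieval attractor (and its mirror image) is the only deep minimum; mixture and spin-glass states appear once several patterns are stored.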
Ferromagnetics
Energy Metaphor
Imagine the atomic magnets as movable objects that are free to flip, while you control their positions. Each iteration of learning is like forcing all of the magnets closer together, so the network energy plays the role of potential energy and the flipping of spins is the expression of kinetic energy.
Temperature – Physics I
• Extending the ferromagnetic example
• As temperature increases, the impact of other atomic magnets’ spins is decreased.
• At absolute zero, temperature has no impact: the spins alone determine the state.
• At the critical temperature (Tc), spin has no impact: thermal noise destroys the ordering.
Temperature – Ferromagnetics
• Si = +1 with probability g(hi); else -1
• g(h) = 1/(1+exp(-2βh))
• β = 1/(kB T)
• kB = Boltzmann’s constant
• T = temperature
• Fβ(±hi) = 1/(1+exp(∓2βhi))
• Fβ(±hi) is a logistic function (sampled in the sketch below)
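A minimal sketch of this stochastic spin rule, assuming units where kB = 1; the function names are illustrative.

import numpy as np

def g(h, beta):
    # Probability that a spin is set to +1 given its local field h: g(h) = 1/(1 + exp(-2*beta*h)).
    return 1.0 / (1.0 + np.exp(-2.0 * beta * h))

def sample_spin(h, beta, rng):
    # Si = +1 with probability g(hi); else -1.
    return 1 if rng.random() < g(h, beta) else -1

rng = np.random.default_rng(0)
for T in (0.1, 1.0, 10.0):       # temperature, in units where kB = 1
    beta = 1.0 / T               # beta = 1/(kB*T)
    print(T, g(1.0, beta), sample_spin(1.0, beta, rng))

At low temperature the rule is nearly deterministic (g ≈ 1 for a positive field); at high temperature g approaches ½ and the spin is essentially random noise.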
Temperature – Machine Learning
• Logistic function
• Noise
• Used in the elimination of spurious local minima (sketch below)
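One common way to use this noise, sketched here as an assumption since the talk does not name a specific procedure, is a simulated-annealing-style schedule: run the stochastic update while gradually lowering T, so the state can hop out of shallow, spurious minima at high temperature and then settle into a deep minimum as the temperature drops.

import numpy as np

def anneal(W, S, T_start=2.0, T_end=0.05, sweeps=200, seed=0):
    # Stochastic (logistic) spin updates with a slowly decreasing temperature.
    rng = np.random.default_rng(seed)
    S = S.copy()
    N = len(S)
    for T in np.linspace(T_start, T_end, sweeps):
        beta = 1.0 / T
        for i in rng.permutation(N):
            h = W[i] @ S                                  # local field on spin i
            p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))  # g(h) from the previous slide
            S[i] = 1 if rng.random() < p_up else -1
    return S

# Usage on an arbitrary symmetric weight matrix (illustrative only).
rng = np.random.default_rng(1)
N = 30
W = rng.normal(size=(N, N)); W = (W + W.T) / 2.0; np.fill_diagonal(W, 0.0)
S0 = rng.choice([-1, +1], size=N)
S_final = anneal(W, S0)
print(-0.5 * S_final @ W @ S_final)   # energy after annealing, typically far below a random start

The schedule (T_start, T_end, number of sweeps) is arbitrary here; the point is only that the logistic noise is strong early and negligible late.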
Mean Field Theory – Physics
• The individual measurement and summation of every member of a magnetic array is too expensive.
• Physicists look to average values as an inexpensive way to extract further truth from a complex combinatorics problem.
Mean Field Theory – Physics
• hi = Σj wij Sj + hext
• <hi> = Σj wij <Sj> + hext
• <Si> = tanh(β<hi>) = tanh(β(Σj wij <Sj> + hext))
• <S> = tanh(βJ<S>) for a uniform coupling J (see the sketch below)
[Plot: <S> vs. temperature T, equal to 1 at T = 0 and falling to 0 at the critical temperature Tc]
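A minimal sketch of solving the self-consistency equation <S> = tanh(βJ<S>) by fixed-point iteration (the method is an assumption; units with kB = 1 and J = 1, so Tc = 1):

import numpy as np

def mean_spin(T, J=1.0, iters=500):
    # Solve <S> = tanh(beta * J * <S>) by fixed-point iteration, with beta = 1/T (kB = 1).
    beta = 1.0 / T
    s = 1.0                        # start from the fully ordered guess
    for _ in range(iters):
        s = np.tanh(beta * J * s)
    return s

for T in (0.5, 0.9, 1.1, 1.5):     # with J = 1 the critical temperature is Tc = J = 1
    print(T, round(mean_spin(T), 3))

Below Tc the iteration settles on a nonzero average spin; above Tc the only solution is <S> = 0, which is the transition the plot above indicates.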
Mean Field Theory – Stochastic Model
• <Si> = tanh((β/N) Σj Σμ ζμi ζμj <Sj>)
• We allow the assumption that <Si> is proportional to one of the stored patterns ζνi:
• <Si> = m ζνi
• <Ncorrect> = ½N(1+m)
[Plot: m vs. temperature T, with the transition at Tc]
Mean Field Theory
• <Ncorrect> = ½N(1+m) (see the sketch below)
• There is a point at which noise overcomes the ability of a network to make an informed decision.
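Continuing the sketch above (again an assumption about method, not the talk's own code): once the overlap m between the network state and the stored pattern is known, the expected number of correctly retrieved bits follows directly.

def n_correct(m, N):
    # Expected number of bits that match the stored pattern: <Ncorrect> = 1/2 * N * (1 + m).
    return 0.5 * N * (1 + m)

N = 1000
print(n_correct(0.95, N))   # low temperature: nearly every bit is recalled correctly (975.0)
print(n_correct(0.0, N))    # above the transition m = 0 and retrieval is pure chance (500.0)

When the noise drives m to zero, half the bits are right purely by chance, which is exactly the point at which the network can no longer make an informed decision.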
Conclusions
• All of these metaphors pulled from physics are very tightly linked to energy.
• All of the metaphors concentrate on atomic-scale events (the exception that proves the rule: mean field theory).
Extra Time?
Extra Topics!
Entropy – Physics
• The inevitable progression toward disorder
• The motion of energy and matter away from an organized state
Entropy – Machine Learning
• S = -PlogP
• For S = -Plog2P (the binary case)
this is the average amount of additional
information required to specify one of the
staties.
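A minimal sketch of this quantity (illustrative; the function name and example distributions are assumptions):

import numpy as np

def entropy_bits(p):
    # Shannon entropy S = -sum_i p_i * log2(p_i), in bits; terms with p_i = 0 contribute nothing.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

print(entropy_bits([0.5, 0.5]))    # 1.0 bit: one yes/no answer pins down the state
print(entropy_bits([0.9, 0.1]))    # ~0.469 bits: a skewed distribution needs less information
print(entropy_bits([0.25] * 4))    # 2.0 bits: four equally likely states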
Quantum Mechanics
Fits within the context of our expectation for where to look for Physics crossover:
• Atomic – discrete and binary
• Energy specific
Last Class Lecture – use of Dyads.