
Physical Limits of Computing
Dr. Mike Frank
Slides from a Course Taught at
the University of Florida
College of Engineering
Department of Computer &
Information Science & Engineering
Spring 2000, Spring 2002, Fall 2003
Overview of First Lecture
• Introduction: Moore’s Law vs. Known Physics
• Mechanics of the course:
– Course website
– Books / readings
– Topics & schedule
– Assignments & grading policies
– Misc. other administrivia
Physical Limits of Computing
Introductory Lecture
Moore’s Law vs. Known Physics
Moore’s Law
• Moore’s Law proper:
– The number of transistors per integrated circuit doubles roughly
every 18 (later 24) months
• First observed by Gordon Moore in 1965 (see readings)
• “Generalized Moore’s Law”
– Various trends of exponential improvement in many
aspects of information processing technology (both
computing & communication):
• Storage capacity/cost, clock frequency, performance/cost,
size/bit, cost/bit, energy/operation, bandwidth/cost …
[Figure: Moore's Law – transistors per chip. Devices per IC (log scale, 1 to 1,000,000,000) vs. year (1950–2010), from early Fairchild ICs through the Intel µP line: 4004, 8086, 286, 386, 486DX, Pentium, P2, P3, P4, and Itanium 2 (Madison). Average increase: ~57%/year.]
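As a sanity check on the chart's annotation, a doubling time converts directly to an equivalent annual growth rate. A minimal sketch (the doubling times below are illustrative inputs, not data from the chart):

```python
def annual_growth(doubling_months: float) -> float:
    """Annual growth rate implied by a doubling time given in months."""
    return 2 ** (12 / doubling_months) - 1

# Doubling every ~18.5 months corresponds to the chart's ~57%/year average:
for months in (18, 18.5, 24):
    print(f"doubling every {months} months -> {annual_growth(months):.1%}/year")
# 18 -> 58.7%/year, 18.5 -> 56.8%/year, 24 -> 41.4%/year
```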
Microprocessor Performance Trends
[Figure: Microprocessor performance over time. Source: Hennessy & Patterson, Computer Architecture: A Quantitative Approach, with added performance analysis based on data from the ITRS 1999 roadmap. Raw technology performance (gate ops/sec/chip): up ~55%/year.]
Super-Exponential Long-Term Trend
[Figure: Ops/second per $1,000 (log scale) over the 20th century, spanning five technology eras: mechanical, electromechanical relays, vacuum tubes, discrete transistors, and integrated circuits. Source: Kurzweil '99.]
Known Physics:
• The history of physics has
been a story of:
– Ever-increasing precision,
unity, & explanatory power
• Modern physics is very
nearly perfect!
– All accessible phenomena are
exactly modeled, as far as we
know, to the limits of
experimental precision, ~11
decimal places today.
• However, the story is not
quite complete yet:
– There is no experimentally
validated theory unifying GR
& QM (yet)
• String theory? M-theory? Loop quantum gravity? Other?
Fundamental Physical Limits of Computing
[Diagram: Thoroughly confirmed physical theories (the Theory of Relativity and Quantum Theory) imply universal facts (the speed-of-light limit, the uncertainty principle, the definition of energy, reversibility, the 2nd law of thermodynamics, the adiabatic theorem, and gravity), which in turn affect quantities in information processing: communications latency, information capacity, information bandwidth, memory access times, processing rate, and energy loss per operation.]
ITRS Feature Size Projections
[Figure: Feature size in nanometers (log scale, 0.1 to 1,000,000) vs. year of first product shipment (1955–2050), plotting µP channel length, DRAM half-pitch, and min/max oxide thickness (Tox). Reference scales are marked for human hair thickness, a eukaryotic cell, a bacterium, a virus, a protein molecule, DNA molecule thickness, and an atom, with a "we are here" marker at the present.]
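The overall shape of these curves can be roughed out from the classic scaling rule of thumb. A back-of-envelope sketch, where the starting point, shrink factor, and atomic size are illustrative assumptions rather than ITRS data:

```python
import math

# Illustrative assumptions (not ITRS figures): ~180 nm DRAM half-pitch
# around 2000, shrinking ~0.7x per 3-year technology generation.
start_year, start_nm = 2000, 180.0
shrink_per_gen, years_per_gen = 0.7, 3.0
atom_nm = 0.3  # rough atomic diameter

gens = math.log(atom_nm / start_nm) / math.log(shrink_per_gen)
print(f"Atomic scale reached around {start_year + gens * years_per_gen:.0f}")
# ~2054 under these assumptions, broadly consistent with the chart
```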
ITRS Feature Size Projections
[Figure: The same projections zoomed in. Feature size in nanometers (log scale, 0.1 to 1,000) vs. year of first product shipment (1995–2050), again plotting µP channel length, DRAM half-pitch, and min/max Tox against the sizes of a bacterium, a virus, a protein molecule, DNA molecule thickness, and an atom, with a "we are here" marker at the present.]
A Precise Definition of Nanoscale
[Scale diagram, log10 of length in meters:]
– Microscale (characteristic length scale in microcomputers): from 10−4.5 m ≈ 31.6 µm down to 10−7.5 m ≈ 31.6 nm, centered on 10−6 m = 1 µm
– Nanoscale (characteristic length scale in nanocomputers): from 10−7.5 m ≈ 31.6 nm down to 10−10.5 m ≈ 31.6 pm, centered on 10−9 m = 1 nm (~atom size); the "near nanoscale" lies above 1 nm and the "far nanoscale" below it
– Picoscale (characteristic length scale in picocomputers, if possible): centered on 10−12 m = 1 pm
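The boundary values follow from splitting each decade of length scale at its geometric midpoint. A worked check (standard arithmetic, nothing beyond the definitions above):

```latex
% Half-decade boundary of the nanoscale band:
10^{-7.5}\,\mathrm{m} = 10^{-8} \times 10^{0.5}\,\mathrm{m}
                      \approx 3.16 \times 10^{-8}\,\mathrm{m} = 31.6\ \mathrm{nm}
% The nanoscale band spans the three decades from 10^{-7.5} m down to
% 10^{-10.5} m, whose geometric mean is
\sqrt{10^{-7.5} \cdot 10^{-10.5}}\,\mathrm{m} = 10^{-9}\,\mathrm{m} = 1\ \mathrm{nm}
```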
Trend of minimum transistor switching energy
[Figure: Minimum transistor switching energy in units of kT (the ½CV² gate energy calculated from ITRS '99 geometry/voltage data), on a log scale from 1 to 1,000,000, vs. year of first product shipment (1995–2035), with high and low curves and a trend line, all declining over time.]
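To see how a gate energy lands in these units, here is a minimal sketch of the ½CV² calculation; the capacitance and voltage values are illustrative assumptions for a late-1990s process, not ITRS figures:

```python
# Illustrative ~180 nm-era values (assumptions, not ITRS data):
C = 2e-15          # gate capacitance, farads (2 fF)
V = 1.8            # supply voltage, volts
k = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # room temperature, K

E_switch = 0.5 * C * V**2   # 1/2 C V^2 gate energy, joules
print(f"switching energy: {E_switch:.2e} J = {E_switch / (k * T):.1e} kT")
# ~3.2e-15 J, i.e. ~8e5 kT, near the top of the chart's range
```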
What is entropy?
• First characterized by Rudolf Clausius in 1850.
– Originally defined simply as heat ÷ temperature.
– Noted never to decrease in thermodynamic processes.
– Its significance and physical meaning were mysterious.
• In the ~1880s, Ludwig Boltzmann proposed that entropy S is proportional
to the logarithm of the number N of possible states: S = k ln N
– What we would now call the information capacity of a system
– Holds for systems at equilibrium, i.e. in their maximum-entropy state
• The modern consensus that emerged from 20th-century
physics is that entropy is indeed the amount of unknown
or incompressible information in a physical system.
– Important contributions to this understanding were made by
von Neumann, Shannon, Jaynes, and Zurek.
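As a worked example of the S = k ln N relation (the 64-bit register is an arbitrary choice for illustration, not something from the slide):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K

# A system of 64 independent two-state subsystems (e.g. an idealized
# 64-bit register) has N = 2^64 equally likely states:
N = 2 ** 64
S = k * math.log(N)    # Boltzmann entropy, J/K
bits = math.log2(N)    # the same quantity expressed as information capacity
print(f"S = {S:.2e} J/K = {bits:.0f} bits (at k ln 2 per bit)")
# S ~ 6.1e-22 J/K, i.e. exactly 64 bits of capacity
```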
Landauer’s 1961 Principle from basic quantum theory
[Diagram: Before bit erasure there are 2N distinct states: s0, …, sN−1 with the bit = 0, and s′0, …, s′N−1 with the bit = 1. Because quantum evolution is unitary (one-to-one), distinct states cannot merge; so after erasure the bit reads 0, but the rest of the system must span 2N distinct states s″0, …, s″2N−1.]
Increase in entropy: ΔS = log 2 = k ln 2. Energy lost to heat: ΔS·T = kT ln 2.
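Plugging in room temperature gives the oft-quoted Landauer limit per erased bit:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # room temperature, K

E_landauer = k * T * math.log(2)   # minimum heat dissipated per erased bit
print(f"kT ln 2 at {T:.0f} K = {E_landauer:.2e} J "
      f"= {E_landauer / 1.602176634e-19:.3f} eV")
# ~2.87e-21 J, ~0.018 eV per bit erased
```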
Adiabatic Cost-Efficiency Benefits
[Figure: Bit-operations per US dollar (log scale, 10²² to 10³³) vs. year (2000–2060). Scenario: a $1,000-per-3-years, 100-watt conventional computer vs. reversible computers with the same capacity. Annotations mark ~1,000× and ~100,000× cost-efficiency advantages for the reversible designs. All curves would go to 0 if leakage were not reduced.]
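The scenario's baseline can be roughed out with Landauer-limit arithmetic. A back-of-envelope sketch that only illustrates the units on the chart's y-axis, not the chart's actual model:

```python
import math

k, T = 1.380649e-23, 300.0          # Boltzmann constant, room temperature
seconds_3yr = 3 * 365.25 * 86400    # ~9.5e7 seconds
energy_J = 100.0 * seconds_3yr      # 100 W for 3 years ~ 9.5e9 J
dollars = 1000.0                    # hardware budget over the same 3 years

# If every bit-operation dissipated the full Landauer limit kT ln 2:
ops = energy_J / (k * T * math.log(2))
print(f"{ops / dollars:.1e} bit-ops per dollar")
# ~3e27 under these assumptions, within the chart's 1e22-1e33 range
```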