CS 300 – Lecture 1
Intro to Computer Architecture
/ Assembly Language
Welcome Back!
Calling All Seniors
Come see me about capstone projects!
The Text
We're using "Patterson / Hennessy". You
should have this by now. You should also
have the CD. If not, I'll help you out.
Note that some chapters are on the CD only!
We'll use Appendix B from the CD.
Our Goals
* Understand how computing hardware works
* Investigate the differences between hardware
and software and why they are important
* Understand low level programming (C and
assembler)
* Find out how high level programs are translated
into machine level programs
* Understand the performance of programs
* Look at parallelism
Course Outline
* Introduction (.5 wks)
* History (.5 wks)
* Logic circuits and digital design (2 wks)
* Basic Machine language and the C programming
language (2 wks)
* Computer Arithmetic (2 wks)
* Understanding performance (1 wk)
* Datapath and control (1 wk)
* Pipelining (2 wks)
* Memory issues (2 wks)
* Storage and IO (1 wk)
* Multi-processing (1 wk)
Work Required
* Quizzes: 1 per chapter in the book, about
20 minutes each
* Homework assignments: 1 per week
(approximately)
* Comprehensive final
* Classroom participation (come prepared!)
* No "big projects" but some homeworks will
involve small amounts of software
Danger!
I've chosen a "classic" textbook for this
class. It's normally used in a 2 or 3
semester sequence of classes.
I plan to cherry-pick the good stuff and avoid
getting into too much depth. Please let me
know if the book goes over your heads.
I'll also be doing some units using a simple
embedded processor like the PIC that will be
easier to understand.
What is a computer?
A computer is really just a collection of 4 basic
things:
•Computational elements that perform reasoning
•Storage elements that retain data
•Communication elements that move data from
one computer to another
•Sensing and controls that allow the computer to
interact with its environment
What are some examples of these elements?
History
Each element of a computer has evolved
separately. What are the ancestors of these
elements?
•Computing
•Storage
•Communication
•Sensing and control
A Historical Approach
We’re going to dive into the innards of
computing devices from a historical
perspective. Why?
•Older devices were a lot simpler yet
addressed the same problems
•It is interesting to address the context of
each element of a computer from an
evolutionary perspective
•History is cool
The Computing Element
The original computing element was the human brain. But
eventually mechanical devices were created to speed up
the calculation process.
The apex of mechanical computing was Babbage’s
“analytical engine”, a device too complex to ever work.
This early computing was mathematical – building tables of
numbers for navigation and engineering purposes.
John Von Neumann, one of the pioneers of
computing, used the word “Organ” to describe
these elements. The biological metaphors started
from day 1 …
Historical Computing Devices
Electronic Computing
The big innovation in computing was the
replacement of mechanical computing
devices by purely electronic ones.
A gear or relay is too big / slow / unreliable
to use in large quantities.
An electronic switch has no moving parts – it
operates by pushing electrons around.
The original electronic computers used
vacuum tubes – later transistors took over.
Electronic Gates
A gate is a device in which one signal
controls another. In a vacuum tube, the grid
could block or allow flow from input to
output. So this is just like a relay.
Transistors are very similar – just a lot
smaller.
Silicon
The “computer revolution” came
about when VLSI technology
allowed a single chip to contain
LOTS of transistors. A Pentium
has about 50 million transistors.
That would have been a lot of
vacuum tubes. Manufacturing
cost is something like $0.000001
per transistor.
Assessing Computation
How can we assess a computational technology?
This turns out to be REALLY HARD! Knowing how
fast a device can do one task doesn’t tell us a lot
about other tasks.
Approaches:
•Clock rate (not very accurate)
•MFLOPS (millions of floating-point operations per second; only helps for numeric calculations)
•Specific benchmarks
Units: tasks / second
Information Storage
Storing information is as important as processing it.
This all started with written language:
Important ideas:
• Precise relationship between spoken and written
languages
• Ability to make a “perfect copy” of a document
• A medium (clay, paper, …) is used to preserve
information over time
Information Access
A large information repository is much more useful if it can
be accessed quickly via mechanical means.
Punch cards predate computers (by a long shot!) and were
used to store and process large volumes of information.
A key insight was that alphabetic information can be
processed as if it were numeric.
Herman Hollerith patented a system in which needles
sensed the presence or absence of holes in a card.
This converted information into electric impulses.
His machine was used for the 1890 census.
What company did he start?
Storage Media
Assessing Storage Technology
•Read/write or read-only
•Latency (time it takes to find what you want)
(time)
•Transfer rate (how fast you get the
information) (bits / second)
•Capacity (bits)
•Cost / bit ($)
•Error rate (errors / bit)
•Durability (time)
Interfacing
Getting (electronic) information from or to
the real world is another BIG part of
computing.
The first big breakthrough was a loom
controlled by punched cards.
Interface Technology
The big idea here is converting between
electronic representation and human
sensing for audio and video objects.
Other interface technology includes pointing
(mouse), typing (keyboard), and even GPS.
Interfaces usually come in two “layers” – one
that is specific to the device and one
“general” part like USB or Firewire.
Babbage’s Insight
Instead of programming a computer mechanically,
use the storage to encode the program.
That is, instead of building a machine to
accomplish just one task, build a general machine
that could be programmed to do any task (a
“stored program” computer).
The same data that a program manipulates can
also be the program that controls the machine.
Building Electronic Computers
So what is it that makes a computer go?
As you peel back the layers (circuit boards,
chips, memory, processing, …) you finally
get to the “bottom” – the indivisible atoms
that a computer is composed of: Logic
Gates
Logic Gates
The basic building block of a digital computer is the “logic
gate”, a hardware device that calculates a very simple
function on 1’s and 0’s.
Logic gates have been constructed with mechanical relays,
vacuum tubes, and (mainly) transistors on an integrated
circuit. Logic gates use power as they switch – that’s why
you have a fan in your computer.
We assess the “quality” of a logic gate in a number of ways:
speed, size, energy use
Moore’s Law
Technology improves at an exponential rate.
Integrated circuit capacity (number of gates)
doubles every 18 months or so.
Problems: some systems improve much faster
than others (disk speed doesn’t increase as fast as
processor speed).
Circuit size can’t keep getting smaller forever.
It’s very hard to make all of the transistors on a
chip useful – doubling the number of transistors
does not double the effective speed of a processor
How Logic Gates Work
A gate has “inputs” and “outputs”. The outputs are
determined by the inputs.
For a relay: input is voltage on the coil, outputs are
connections between terminals
Transistors are similar except a lot smaller
Key characteristics:
• Delay: how soon the output has the “right”
answer
• Power dissipation: how much heat is generated
• Size: how much silicon is needed on the chip
Building Logic Gates
Many different sorts of
logic gates are used in
computing
A Pentium has about 50,000,000 transistors
A Universal Logic Gate
Nand (1 = True, 0 = False):

X Y | Output
0 0 |   1
0 1 |   1
1 0 |   1
1 1 |   0

Let's create a "Circuit" using these gates …
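NAND is "universal": every other gate can be built out of it. Here is a minimal sketch of that idea in C (the function names are illustrative, not from the book):

    #include <stdio.h>

    /* A NAND gate: the output is 0 only when both inputs are 1. */
    static int nand_gate(int x, int y) { return !(x && y); }

    /* NOT, AND, and OR built purely out of NAND gates. */
    static int not_gate(int x)        { return nand_gate(x, x); }
    static int and_gate(int x, int y) { return nand_gate(nand_gate(x, y), nand_gate(x, y)); }
    static int or_gate(int x, int y)  { return nand_gate(nand_gate(x, x), nand_gate(y, y)); }

    int main(void) {
        printf("x y | NAND AND OR NOT(x)\n");
        for (int x = 0; x <= 1; x++)
            for (int y = 0; y <= 1; y++)
                printf("%d %d |  %d    %d   %d   %d\n",
                       x, y, nand_gate(x, y), and_gate(x, y), or_gate(x, y), not_gate(x));
        return 0;
    }

Running it prints the NAND truth table above alongside the AND, OR, and NOT gates built from NAND.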
Boolean Algebra
Logic designers use Boolean algebra to
understand their circuits. There is a lot of
math in the Boolean world. Writing + for OR,
* for AND, and ' for NOT:
a*(b*c) = (a*b)*c
a*(b+c) = a*b + a*c
a + a' = 1
Truth tables are used to specify Boolean
functions
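Because every variable is only 0 or 1, identities like these can be checked by brute force over all input combinations. A minimal sketch in C (illustrative only; | stands for OR, & for AND, ! for NOT):

    #include <stdio.h>

    /* Exhaustively verify a*(b+c) = a*b + a*c and a + a' = 1
       over all 0/1 values. */
    int main(void) {
        int ok = 1;
        for (int a = 0; a <= 1; a++) {
            if ((a | !a) != 1) ok = 0;                         /* a + a' = 1 */
            for (int b = 0; b <= 1; b++)
                for (int c = 0; c <= 1; c++)
                    if ((a & (b | c)) != ((a & b) | (a & c)))  /* distributivity */
                        ok = 0;
        }
        printf("identities %s\n", ok ? "hold" : "fail");
        return 0;
    }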
Adding Two Numbers
Truth table for addition (x and y are the inputs; Sum and Carry are the outputs):

x y | Sum Carry
0 0 |  0    0
0 1 |  1    0
1 0 |  1    0
1 1 |  0    1
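This table is the "half adder": Sum is the XOR of the inputs and Carry is the AND. A minimal sketch in C (the helper name is illustrative):

    #include <stdio.h>

    /* Half adder: adds two 1-bit values.
       Sum = x XOR y, Carry = x AND y, exactly the truth table above. */
    static void half_add(int x, int y, int *sum, int *carry) {
        *sum   = x ^ y;
        *carry = x & y;
    }

    int main(void) {
        printf("x y | Sum Carry\n");
        for (int x = 0; x <= 1; x++)
            for (int y = 0; y <= 1; y++) {
                int sum, carry;
                half_add(x, y, &sum, &carry);
                printf("%d %d |  %d    %d\n", x, y, sum, carry);
            }
        return 0;
    }

Running it reproduces the table above.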
Memory
Consider this circuit:
[circuit diagram with labeled signal values 1, 0, 1, 1]
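The diagram isn't reproduced in this transcript, but the point of the slide is that gates wired in a feedback loop can remember a bit. As an assumption (the circuit on the slide may differ), here is a minimal C sketch of an SR latch built from two cross-coupled NAND gates:

    #include <stdio.h>

    /* Assumed circuit: an SR latch made of two cross-coupled NAND gates.
       s_n and r_n are active-low "set" and "reset"; q is the stored bit. */
    static int q = 1, q_bar = 0;

    static void latch_step(int s_n, int r_n) {
        /* Iterate the two gates a few times until the feedback loop settles. */
        for (int i = 0; i < 4; i++) {
            int new_q     = !(s_n && q_bar);
            int new_q_bar = !(r_n && q);
            q = new_q;
            q_bar = new_q_bar;
        }
    }

    int main(void) {
        latch_step(0, 1); printf("after set:   q = %d\n", q);   /* q becomes 1 */
        latch_step(1, 1); printf("hold:        q = %d\n", q);   /* q keeps its value */
        latch_step(1, 0); printf("after reset: q = %d\n", q);   /* q becomes 0 */
        return 0;
    }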
Wires, Clocks, and Words
A clock is the "heartbeat" of a computer. The signal
on a wire carries a new value at each clock. The
clock determines the flow rate of information along
a wire.
If you want information to flow faster you use more
wires (bits).
A “word” is fixed number of bits/wires used in
calculations and storage.
A “byte” is an 8 bit word (often used to store
characters)
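You can see word and byte sizes directly from C. A minimal sketch (the exact numbers printed depend on the machine and compiler):

    #include <stdio.h>

    /* Print how many bytes (8-bit units) some common C types occupy.
       On a typical 32-bit or 64-bit machine, int is usually one 4-byte word. */
    int main(void) {
        printf("char:  %zu byte(s)\n", sizeof(char));   /* always 1 */
        printf("int:   %zu byte(s)\n", sizeof(int));
        printf("long:  %zu byte(s)\n", sizeof(long));
        printf("void*: %zu byte(s)\n", sizeof(void *));
        return 0;
    }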
More About Hardware
All the gates can work in parallel: hardware
is not restricted to "one step at a time" the
way programs generally are.
Wire length is just as big a problem as gate
speed: signals move at close to the speed of
light, but at gigahertz clock rates that is not
very far per clock cycle.
Scalability
Remember that sizes don’t mean much. A
64 bit machine doesn’t run 2x faster than a
32 bit one; a 50M transistor chip isn’t twice
as fast as a 25M one, and a 4 GHz machine
won’t execute programs twice as fast as a
2GHz one (why???).
Will a task get done twice as fast if you use
twice as many workers?
Stored Programs
The key idea in a computer is to execute
arbitrary programs for the user.
Computers use "machine language", the language
understood by the hardware, to control what the
computer does.
Different processors have different machine
languages.
Somehow, all of our programs need to get
turned into this machine language.
Instructions
Machine language breaks the computational
process into instructions. An instruction is
just data (0s and 1s) that is executed.
Instructions have two representations:
* Binary (in the computer)
* Textual (the assembler or textbook)
The translation between binary and textual
representations is done using an assembler
(or disassembler).
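As a concrete illustration, here is a small C program that assembles one R-format "add" instruction for MIPS, the architecture used in Patterson / Hennessy; treat the specific field layout as a worked example rather than something covered on these slides:

    #include <stdio.h>
    #include <stdint.h>

    /* Assemble the MIPS R-format instruction "add $t0, $t1, $t2"
       (rd = $t0 = 8, rs = $t1 = 9, rt = $t2 = 10) into its 32-bit binary form.
       R-format fields: opcode(6) rs(5) rt(5) rd(5) shamt(5) funct(6). */
    int main(void) {
        uint32_t opcode = 0x00, rs = 9, rt = 10, rd = 8, shamt = 0, funct = 0x20;
        uint32_t word = (opcode << 26) | (rs << 21) | (rt << 16)
                      | (rd << 11) | (shamt << 6) | funct;

        printf("textual: add $t0, $t1, $t2\n");
        printf("binary:  ");
        for (int bit = 31; bit >= 0; bit--)
            printf("%u", (unsigned)((word >> bit) & 1u));
        printf("  (hex 0x%08x)\n", (unsigned)word);
        return 0;
    }

The same 32 bits are what the processor executes; the textual form exists only for humans, and the assembler/disassembler translates between the two.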
Data
Data in an executing program can live in
many different places:
* Memory
* Registers
* Disk
* Cache
An instruction mutates this data (the
"processor state") in some predefined way
Programming
* Bad old days: Program by rewiring the computer
logic
* Early computers: program written in binary
* Back in my day: program written in assembly
language
* Low level languages (Fortran, C, Pascal)
* High level languages (Java, Haskell, C#, …)
What's the difference between low and high level
languages?
About This Course
Why are we here?
* Systems are composed of both "hard" and "soft"
components. We need to understand what "hard"
components can do
* Hardware is fundamentally different from software: it is
inherently parallel. We'll learn about exploiting parallelism
in computation
* All "soft" programs are executed on hardware – we can't
understand how a program performs without knowing a lot
about hardware
* Many issues in hardware design are also present in
software systems
A Plethora of Systems
* High performance computing
* Servers
* Desktop systems
* Highly functional embedded applications
(cell phones, PDAs, cameras)
* Minimally functional embedded
applications (calculators, toys, controllers)
Understanding architecture helps a lot at
both ends of this spectrum
A Short History of Computer
Architecture
* 40's, 50's: Simple architectures, emphasis on building
hardware capable of carrying out basic math and control
* 60's: Development of "tricks" in the processor to make
execution go faster
* 70's: mini-computers (PDP8), super computers (Cray 1)
* 80's: RISC vs CISC debate, Language driven architecture
* 90's: Increasing clock speed, multi-processor
architectures, memory concerns, networking
* Recent: clock speed limitations: parallel processing,
cheap embedded systems, pervasive use of computing.
Networking becomes very important.