CS 110 – Lecture 2


CS 300 – Lecture 2
Intro to Computer Architecture
/ Assembly Language
History
Help!
190 needs your help! Can you be a TA?
Homework 1
Due next Friday.
Check the wiki.
Stored Programs
The key idea in a computer is to execute
arbitrary programs for the user.
Computers use "machine language," a language
understood by the hardware, to control what the
computer does.
Different processors have different machine
languages.
Somehow, all of our programs need to get
turned into this machine language.
Instructions
Machine language breaks the computational
process into instructions. An instruction is
just data (0s and 1s) that is executed.
Instructions have two representations:
* Binary (in the computer)
* Textual (in assembly source and textbooks)
The translation between binary and textual
representations is done using an assembler
(or disassembler).
Data
Data in an executing program can live in
many different places:
* Memory
* Registers
* Disk
* Cache
An instruction mutates this data (the
"processor state") in some predefined way.
Programming
* Bad old days: Program by rewiring the computer
logic
* Early computers: program written in binary
* Back in my day: program written in assembly
language
* Low level languages (Fortran, C, Pascal)
* High level languages (Java, Haskell, C#, …)
What's the difference between low and high level
languages?
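One concrete way to see the difference: a single high-level statement usually expands into many low-level instructions. Below, a small C loop is shown next to a hand-written, RISC-flavored assembly rendering; the mnemonics and register conventions are hypothetical, not real compiler output:

```c
/* High-level: a few statements describe *what* to compute. */
int sum_to(int n) {
    int total = 0;
    for (int i = 1; i <= n; i++)
        total += i;
    return total;
}

/* Low-level (hypothetical RISC-style assembly): many instructions
 * describe *how*, one register move and branch at a time.
 *
 *        li   r1, 0        ; total = 0
 *        li   r2, 1        ; i = 1
 * loop:  bgt  r2, r0, done ; if i > n, exit (n assumed in r0)
 *        add  r1, r1, r2   ; total += i
 *        addi r2, r2, 1    ; i++
 *        j    loop
 * done:  ret               ; result in r1
 */
```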
The "Von Neumann" Computer
John Von Neumann was one of the first to
describe in detail the architecture of a computer.
Ideas:
* Memory to hold data / program
* Various "wired-in" functionality (math, for the
most part)
* Instructions that perform calculations and move
data around
* A single thread of control (instruction at a time
processing)
* Instructions for conditional branching
A Short History of Computer
Architecture
1940s and 50s: Simple architectures, based
on the "Von Neumann" model. Emphasis
was on building hardware capable of
carrying out basic math and control.
Programming in assembly language and
some primitive high level languages.
A Short History of Computer
Architecture
1960s: Commercialization of computing – the "Mainframe"
era. Development of "tricks" in the processor to make
execution go faster or programming easier:
* Instruction pipelining / Prefetch
* Virtual memory / Security issues
* Microcode
High level languages began to take over
My first computer!!
A Short History of Computer
Architecture
The 1970s was an era of much architectural
diversity as the computing field expanded.
* Mini-computers: PDP-8, PDP-11, Data General Nova
* Supercomputers: Cray-1
* Unusual mainframes: B6700
Seymour Cray
Cray was one of the true geniuses
of computer architecture
Studying his machines reveals an essential
simplicity that is missing from many current
architectures.
Cray died in a traffic accident in Colorado
Springs in 1996.
Burton Smith carried Cray's legacy
on after his death.
A Short History of Computer
Architecture
1980s: the RISC vs CISC debate
Patterson postulated that execution would be easier
to optimize if the instruction set were simplified.
He demonstrated a processor (RISC-I) that
outperformed contemporary CPUs using ¼ the
transistors.
This led to SPARC and a host of other chip
designs.
VLSI begins to take over.
C becomes the language of choice for low level
programming.
A Short History of Computer
Architecture
1990s:
* Increasing clock speed
* Dominance of Intel / Pentium
* multi-processor architectures
* memory concerns
* networking
* Embedded systems become a big
market
A Short History of Computer
Architecture
Architecture today:
* clock speed limitations hit
* power issues become more critical
* parallel processing / Multicore chips
* embedded systems are everywhere
* more special purpose processors
* pervasive use of computing
* Custom chips become cheaper and cheaper
* Devices like FPGAs sit between hardware and
software
About This Course
Why are we here?
* Systems are composed of both "hard" and "soft"
components. We need to understand what "hard"
components can do
* Hardware is fundamentally different from software: it is
inherently parallel. We'll learn about exploiting parallelism
in computation
* All "soft" programs are executed on hardware – we can't
understand how a program performs without knowing a lot
about hardware
* Many issues in hardware design are also present in
software systems
Starting at the Bottom
We're going to go bottom-up through this
material, so we'll start with logic gates and
work our way up.
Gates use electrons (voltage) to represent
information. Low = 0, High = 1. We can use
a probe (light) to see the current state of the
voltage in a circuit.
How Logic Gates Work
A gate has “inputs” and “outputs” (wires). The
outputs are determined by the inputs.
For a relay: the input is the voltage on the coil;
the outputs are connections between its terminals
Transistors are similar except a lot smaller
Key characteristics:
• Delay: how soon the output has the “right”
answer
• Power dissipation: how much heat is generated
• Size: how much silicon is needed on the chip