CS2 (Java) Exam 1 Review
Week 1 Lecture Material – F'13 Revision
Doug Hogan
Penn State University
CMPSC 201 – C++ Programming for Engineers
CMPSC 202 – FORTRAN Programming for Engineers
Hardware vs. Software
Hardware
essentially, things you can touch
input, output, storage devices
memory
Software
essentially, what the computer knows
data, 0s and 1s
programs
(this is a software course)
Categories of Memory
Read-only memory (ROM)
can only read data
Random-access memory (RAM)
can read and write information
primary storage: the computer’s main memory
volatile
Sequential Access vs. Random Access Memory
Sequential Access: must access each location in memory in order
Random Access: can access memory locations using addresses, in any order
[Figure: Track 1 and Track 2 of a sequential storage medium]
Speed implications?
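The same trade-off shows up in Java's own data structures. Here is a hedged sketch (not from the slides) contrasting an array, which is random access, with a linked list, which must be walked sequentially:

import java.util.LinkedList;

public class AccessDemo {
    public static void main(String[] args) {
        int[] array = new int[1000];              // random access
        LinkedList<Integer> list = new LinkedList<>();
        for (int i = 0; i < 1000; i++) {
            array[i] = i;
            list.add(i);
        }
        // An array jumps straight to an address: one step.
        int fromArray = array[750];
        // A linked list must follow links node by node; get(750)
        // internally walks the chain (Java's LinkedList starts from
        // whichever end is closer, but the cost still grows with
        // the distance traveled).
        int fromList = list.get(750);
        System.out.println(fromArray + " " + fromList);
    }
}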
Measuring Memory
base unit: 1 bit = binary digit, 0 or 1
8 bits = 1 byte (B)
1000 bytes ≈ 1 kilobyte (KB)
1000 KB ≈ 1,000,000 B ≈ 1 megabyte (MB)
1000 MB ≈ 1,000,000,000 B ≈ 1 gigabyte (GB)
1000 GB ≈ 1,000,000,000,000 B ≈ 1 terabyte (TB)
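As a quick worked example (not from the slides), a short Java sketch using these conversions to ask how many 1.44 MB floppy disks one 4.7 GB DVD replaces (capacities taken from the next slide):

public class MemoryUnits {
    public static void main(String[] args) {
        // Decimal units, matching the slide: 1 KB = 1000 B, etc.
        final long KB = 1_000L;
        final long MB = 1_000L * KB;
        final long GB = 1_000L * MB;
        double floppyBytes = 1.44 * MB;
        double dvdBytes = 4.7 * GB;
        System.out.printf("Floppies per DVD: %.0f%n", dvdBytes / floppyBytes);
        // prints: Floppies per DVD: 3264
    }
}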
Storage Device Capacities
Floppy disk (the old 3.5” ones)
1.44 MB
Compact disc (CD)
650-700 MB
Digital Versatile/Video Disc (DVD)
4.7 GB
Hard disks, flash drives
typically sizes in GB
Software Overview
System software
Controls basic operations of computer
The operating system
manages memory, files, application software
File management tasks – deleting, etc.
Software Overview, ctd.
Application software
Not essential to system running
Enables you to perform specific tasks
Ex:
Office software
Web browsers
Media players
Games
Algorithms and Languages
An algorithm is a set of instructions to solve a problem.
Think recipes.
Many algorithms may solve the same problem.
How do we choose?
We use a programming language to explain our algorithms to the computer and write programs.
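To make "many algorithms, one problem" concrete, here is a minimal Java sketch (not from the slides) of one algorithm for finding the largest value in an array; sorting the array and taking the last element would solve the same problem at a different cost:

public class MaxFinder {
    // One algorithm for "find the largest value":
    // scan once, remembering the biggest seen so far.
    public static int max(int[] values) {
        int best = values[0];
        for (int v : values) {
            if (v > best) {
                best = v;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        System.out.println(max(new int[] {3, 9, 4, 7}));  // prints 9
    }
}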
Programming Paradigms/Models
Imperative Programming: specify steps to solve the problem; use methods; methods could get long
Object-Oriented Programming (OOP): create objects to model real-world phenomena; send messages to objects; typically shorter methods
Event-Driven Programming: create methods that respond to events like mouse clicks, key presses, etc.
Others: functional, logic, etc.
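Since this course uses Java, an OOP language, here is a tiny hedged sketch (the class and its names are invented for illustration) of what "objects modeling real-world phenomena" and "sending messages" look like:

// Hypothetical example class, invented for illustration.
public class BankAccount {
    private double balance;          // the object's state

    public BankAccount(double opening) {
        balance = opening;
    }

    // "Sending a message": asking the object to do something.
    public void deposit(double amount) {
        balance += amount;
    }

    public double getBalance() {
        return balance;
    }

    public static void main(String[] args) {
        BankAccount acct = new BankAccount(100.0);
        acct.deposit(25.0);                     // message to acct
        System.out.println(acct.getBalance());  // prints 125.0
    }
}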
Compiled vs. Interpreted Languages
Interpreted Language
Requires software called an interpreter to run the code
Code is checked for errors as it runs (erroneous code: do the best we can…)
Examples: HTML, JavaScript, PHP
Compiled Language
Requires software called a compiler to run the code
Code must be compiled into an executable before running (and thus free of syntax errors)
Examples: C, C++, Pascal, Fortran, BASIC
Compiling Process
Source Code (C++, Fortran, …) → compiler → Object Code
Object Code + Object Code from Libraries → linker → Executable Program
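Java itself is a hybrid of these two models, worth flagging since the examples above are all pre-Java languages: javac compiles source code to bytecode, and the JVM then interprets (and just-in-time compiles) that bytecode. A minimal sketch:

// Hello.java
// Compile: javac Hello.java   (produces Hello.class, the bytecode)
// Run:     java Hello         (the JVM executes the bytecode)
public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello, CS2!");
    }
}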
Errors
Syntax Errors
Misuse of the language, much like using incorrect punctuation in English
The compiler reports them; the program won't run until they're resolved
Logic Errors
Program doesn't solve the problem at hand correctly
Runtime Errors
Errors that occur while the program is running, e.g. problems accessing memory and files, divide by zero
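A hedged Java illustration (not from the slides) of all three kinds:

public class ErrorKinds {
    public static void main(String[] args) {
        // Syntax error (commented out so the file compiles):
        // the compiler would reject the missing semicolon.
        // int x = 5

        // Logic error: compiles and runs, but computes the
        // wrong thing -- this "average" forgets to divide.
        int a = 4, b = 8;
        int badAverage = a + b;         // should be (a + b) / 2
        System.out.println(badAverage); // prints 12, not 6

        // Runtime error: integer division by zero throws
        // an ArithmeticException while the program runs.
        int zero = 0;
        System.out.println(a / zero);
    }
}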
Abstraction
Poll:
Who can use a CD player?
Who can explain how a CD player works?
Who can drive a car?
Who is an auto mechanic?
Abstraction
Principle of ignoring details that allows us to use complex devices
Focus on the WHAT, not the HOW
Fundamental to CS
Other examples?
Levels of Abstraction
0. Digital Logic
1. Microprocessor
2. Machine Language
3. Operating System
4. Assembly Language
5. High-Level Language
6. Application Software
Binary Numbers
Use two symbols: 0 and 1
Base 2
Compare with decimal number system
Uses symbols 0, 1, 2, 3, 4, 5, 6, 7, 8, 9
Base 10
At the lowest level of abstraction, everything in a computer is expressed in binary.
Binary Numbers, ctd.
Counting in binary: 0, 1, 10, 11, 100, 101, 110, 111, 1000, 1001, 1010, 1011, 1100, 1101, 1110, 1111, 10000 (decimal 0 through 16)
Places:
Decimal: 1s, 10s, 100s, etc.
Binary: 1s, 2s, 4s, 8s, etc.
Converting binary to decimal: add up the powers of 2 whose places hold 1s. Converting decimal to binary: repeatedly divide by 2 and read off the remainders.
“There are 10 kinds of people in the world…”
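A small Java sketch (not from the slides) showing both directions of the conversion via the standard library's Integer helpers:

public class BinaryDemo {
    public static void main(String[] args) {
        // Decimal to binary: 13 = 8 + 4 + 1 -> 1101
        System.out.println(Integer.toBinaryString(13));   // 1101
        // Binary to decimal: 1*8 + 1*4 + 0*2 + 1*1 = 13
        System.out.println(Integer.parseInt("1101", 2));  // 13
        // The joke on this slide: "10" in base 2 is 2 in base 10.
        System.out.println(Integer.parseInt("10", 2));    // 2
    }
}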
Other Number Systems
Any positive integer could be the base of a number system. (Big topic in number theory.)
Others used in computer science:
Octal: Base 8
Hexadecimal: Base 16, with new symbols A, B, C, D, E, F for the values 10 through 15
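The same Integer helpers cover these bases too; again a hedged sketch, not from the slides:

public class BasesDemo {
    public static void main(String[] args) {
        System.out.println(Integer.toOctalString(255));  // 377
        System.out.println(Integer.toHexString(255));    // ff
        System.out.println(Integer.parseInt("FF", 16));  // 255
        // Hex literals work directly in Java source, too:
        System.out.println(0xFF);                        // 255
    }
}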
ASCII
Every character on a computer -- letters, digits, symbols, etc. -- is represented by a numeric code behind the scenes.
This system of codes is called ASCII, short for American Standard Code for Information Interchange.
We’ll learn more in lab…
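Ahead of lab, a quick hedged Java peek at the codes, using char-to-int casts:

public class AsciiDemo {
    public static void main(String[] args) {
        System.out.println((int) 'A');   // 65
        System.out.println((int) 'a');   // 97
        System.out.println((int) '0');   // 48
        System.out.println((char) 66);   // B
    }
}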
# Transistors on a Processor
Data for Intel processors:
Data from Section 4.1 of: Yates, Daniel S., David S. Moore, and Daren S. Starnes. The Practice of Statistics. 2nd ed. New York: Freeman, 2003.
Processor     Date   Number of Transistors
4004          1971                   2,250
8008          1972                   2,500
8080          1974                   5,000
8086          1978                  29,000
286           1982                 120,000
386           1985                 275,000
486 DX        1989               1,180,000
Pentium       1993               3,100,000
Pentium II    1997               7,500,000
Pentium III   1999              24,000,000
Pentium 4     2000              42,000,000
A Graphical View
Pay attention to the units on the axes…
Graph from Intel's web site (http://www.intel.com/technology/mooreslaw/index.htm); Retrieved 9/24/2006
Moore’s Law
Prediction from Gordon Moore of Intel in 1965.
Implication: The speed of processors doubles roughly every 12 to 18 months.
Exponential relationship in the data.
For the curious: the regression equation from the data two slides back is ŷ = 1648.16 · (1.393)^x.
Can this go on forever?
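To see the model's exponential growth concretely, a hedged Java sketch (not from the slides); it assumes x counts years since 1970, an interpretation chosen because it roughly reproduces the transistor counts in the table above:

public class MooresLaw {
    public static void main(String[] args) {
        // Regression model from the slide: y-hat = 1648.16 * 1.393^x,
        // with x assumed to be years since 1970 (not stated in the slides).
        int[] years = {1971, 1993, 2000};
        for (int year : years) {
            double predicted = 1648.16 * Math.pow(1.393, year - 1970);
            System.out.printf("%d: about %,.0f transistors%n", year, predicted);
        }
        // A growth factor of 1.393 per year implies counts double
        // every log(2)/log(1.393) years, roughly 2.1.
        System.out.printf("Doubling time: %.1f years%n",
                Math.log(2) / Math.log(1.393));
    }
}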