Transcript Slides01x
The “Nothing is Obvious” Story
Imagine a young boy in the Amazon jungles.
This boy has always lived in the jungle without
any modern conveniences. He has never been
in a city; he has never seen a television or a book.
Now imagine that for some unknown reason this young
boy travels to Colorado in the wintertime. The little boy
stands in a yard somewhere and watches the snow with
bewilderment. He is astonished; he does not understand
what is falling from the sky.
Another little boy, about the same age, from Colorado,
watches the first boy's behavior. The Colorado boy is
confused: why is the boy acting so odd?
Obviously it is snowing, so what is the big deal?
Corn Flakes & Iced Tea
Most Americans consider it
"obvious" that cold milk is poured
on corn flakes. However, in
Europe, everybody knows you put
warm milk on your cereal.
Most Europeans consider it
"obvious" that tea is to be served
warm, preferably hot. They are
completely baffled when Texans
actually put ICE in their tea.
The Exposure Equation
Bewilderment + Exposure = Obvious
Exposure in
Extracurricular Activities
• Drill team performance
• Half-time band show
• Football Team blocking
• Basketball free throws
• Baseball batting
The Curious
Exposure Discrepancy
Students recognize that only continuous
practice will result in a good showing at a brief
performance or competition.
Yet many of the same students barely read or
practice a topic once for an academic subject.
It appears that preparation for a known, short
performance requires practice, but preparation
for life receives only minimal effort from many
students.
Computer Fundamentals
Getting started with computer science is none too easy.
The course that you are taking assumes that this is your
first formal computer science course.
Furthermore, it is also assumed that you have no
knowledge of programming.
If you do know some programming, that is fine, but it is not
any kind of prerequisite.
This means that we should start at the very beginning.
Three Ways
Computers Beat People
• Computers are faster.
• Computers are more accurate.
• Computers do not forget.
Motherboard & Computer Chips
motherboard
The main board with all the primary computer
components. Has several computer chips attached:
Read Only Memory (ROM)
This chip stores permanent information for the
computer.
Random Access Memory (RAM)
This chip stores temporary information for the
computer.
Central Processing Unit (CPU)
This chip is the “brains” of the computer.
Measuring Memory
KB   Kilo Byte    1 thousand bytes       1,000
MB   Mega Byte    1 million bytes        1,000,000
GB   Giga Byte    1 billion bytes        1,000,000,000
TB   Tera Byte    1 trillion bytes       1,000,000,000,000
PB   Peta Byte    1 thousand terabytes   1,000,000,000,000,000
EB   Exa Byte     1 million terabytes    1,000,000,000,000,000,000
ZB   Zetta Byte   1 billion terabytes    1,000,000,000,000,000,000,000
YB   Yotta Byte   1 trillion terabytes   1,000,000,000,000,000,000,000,000
Note: Technically, a kilobyte is exactly 2¹⁰ or 1,024 bytes.
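Since the course will use Java, here is a minimal sketch (not from the original slide) that contrasts the decimal units in the table above with the binary 2¹⁰-style units in the note; the class name MemoryUnits is just an invented example.

// MemoryUnits.java -- comparing decimal and binary memory prefixes.
public class MemoryUnits {
    public static void main(String[] args) {
        long decimalKB = 1_000;        // "kilobyte" in the table above: 10^3 bytes
        long binaryKB  = 1L << 10;     // 2^10 = 1,024 bytes, as in the note
        long decimalMB = 1_000_000;    // 10^6 bytes
        long binaryMB  = 1L << 20;     // 2^20 = 1,048,576 bytes

        System.out.println("Decimal KB = " + decimalKB + ", binary KB = " + binaryKB);
        System.out.println("Decimal MB = " + decimalMB + ", binary MB = " + binaryMB);
    }
}

The gap between the two meanings grows with each prefix, which is why the note matters once you reach gigabytes and terabytes.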
Secondary Storage Devices
Since RAM is lost when the computer is
turned off, files must be saved to some
secondary storage device for later use.
Hardware
Hardware refers to physical pieces of computer equipment.
This includes the main computer system unit, as well as all
of the peripherals (things that plug into the computer).
Software
Software provides instructions to a computer.
The most important aspect of this course is to learn how to give correct and
logical instructions to a computer with the help of a programming language.
Software falls into two categories:
• System Software
• Application Software.
Application Software refers to the instructions that the computer
requires to do something specific for you.
Word Processors and Electronic Spreadsheets are the two most
common applications for a computer.
System Software refers to the instructions that the computer requires
to operate properly.
The major operating systems are Windows, UNIX, Linux & the Mac OS.
The First Era
Counting Tools
A long time ago some cavemen must
have realized that counting on fingers
and toes was very limiting.
They started making marks on rocks,
carving notches in bones and tying
knots in rope.
Eventually, mankind found more
practical ways to not only keep track
of large numbers, but also to perform
mathematical calculations with them.
The Abacus
3000 B.C.
The Abacus was originally invented in the Middle
East. This rather amazing computing device
is still very much used in many Asian countries today.
Napier's "Bones"
A.D. 1617
John Napier used some bones marked with special scales
to simplify arithmetic by using addition for multiplication.
The Slide Rule
1622
William Oughtred created the slide rule.
This device allowed sophisticated mathematical
calculations and was used for centuries until it was
replaced by the scientific calculator in the 1970s.
The Second Era
Gear Driven Devices
More complicated tasks required
more complicated devices. These
devices would have rotating gears.
Since they did not use electricity,
they would require some form of
manual cranking in order to
function.
One problem with devices that
have moving parts is that they
wear and break.
Numerical Calculating
Machine 1642
Blaise Pascal built the Pascaline. This was the first
numerical calculating machine. The inner workings of this
device are similar to the tumbler odometers found in old
cars. It could perform addition and subtraction.
A version of this device was still being used in the 1970s!
Jacquard's Loom
1805
Joseph Jacquard invented a special
loom that would accept flexible
cards punched with information in
such a manner that it is possible to
program how cloth will be woven.
It is one of the first examples of
programming.
Analytical Engine
1833
Charles Babbage designed a machine that could read
instructions from a sequence of punched cards. This
would have been the first general purpose computing
machine. Unfortunately, he was not able to complete
the device due to lack of funding.
Programming
1842
Ada Byron, the Countess of Lovelace, was Charles
Babbage’s assistant. She knew his device would require
instructions – what we would today call programs or
software. So, over 170 years ago, she started designing
computer programs. In so doing she developed certain
programming ideas and techniques that are still used in
modern programming languages today.
The Third Era
Electro-Mechanical Devices
The term electro-mechanical device means the
device uses electricity, but still has moving parts.
These devices are not yet “fully electronic”.
Since they do use electricity, the manual cranking
is no longer needed. Since they still have moving
parts, they still break down easily.
Tabulating Machine 1889
Herman Hollerith invented a tabulating machine that recorded
statistics for the U.S. Census Bureau. The 1880 census
took 8 years to tabulate by hand. With this punch-card
tabulating machine, the 1890 census was tabulated in just
1 year. In 1896, Herman Hollerith founded the Tabulating
Machine Company.
After merging with some other
companies, Hollerith’s company was eventually known as
International Business Machines.
Turing Machine
1936
Alan Turing described a device which he called an “automatic
machine”. This hypothetical machine has the ability to read
instructions and information from a tape and can write to the
tape as well.
In theory, this could solve any computational problem provided
you had enough tape and a lot of patience.
Eventually, these “devices” were called “Turing Machines”.
This led to the term “Turing Complete”. A computer is
considered Turing Complete if it can solve any computational
problem provided memory and execution time are not issues.
Technically, Babbage’s Analytical Engine would have been the
first Turing Complete computer, had it been built at the time it
was designed.
Bombe
1939
Based on the Polish Bomba, Alan
Turing designed the improved Bombe
in 1939 at the UK Government Code
and Cypher School at Bletchley Park.
Like its predecessor, it was designed
to decrypt coded messages from the
Nazis’ Enigma machine, which was
now far more complex than it was
before World War II.
NOTE: In the recent movie
The Imitation Game this machine
was erroneously called Christopher.
This was one of several examples of
“artistic license” used in the movie.
Z3 1941
Konrad Zuse built an electro-mechanical computer capable
of automatic computations in Germany during World War II.
It was the first functional,
programmable, fully
automatic digital computer.
The Z3 was destroyed in
1943 during the Allied
bombing of Berlin.
Some people argue that the Z3 was Turing Complete and
therefore proclaim Konrad Zuse as the “inventor of the
computer”.
Mark-I 1944
This electro-mechanical calculator was 51 feet long and 8
feet tall. It was the first machine that could execute long
computations automatically. These computations could
involve several numbers, each up to 23 digits in length.
Grace Hopper
Grace Hopper, then a Navy Lieutenant, was one of the
first programmers of the Mark-I. She would make so
many contributions to the world of computer science
that the United States Congress allowed her to stay in
the Navy past mandatory retirement age. She finally
retired as an Admiral in 1986 at the age of 79.
Von Neumann Architecture 1945
John von Neumann describes the design for an electronic
digital computer. Like the Turing machine, this is a description
of a Stored Program Computer, which means the computer
stores both the program’s instructions and its data in RAM.
Mark-II 1947
On September 9, 1947 the Mark-II stopped working.
A technician found and removed a moth from one of its
relays. This was the first literal computer bug.
The actual moth is currently in the collection of the
Smithsonian’s National Museum of American History.
The Fourth Era
Fully Electronic Computers
with Vacuum Tubes
This is often referred to as
“The First Generation of Computers”.
Fully electronic computers do not
rely on moving parts. This makes
them faster and more reliable.
The vacuum tubes used at the time
still had their share of drawbacks.
• They were big and bulky.
• They would get hot and burn out.
ABC
1940
The very first electronic digital
computer, the ABC, was invented
by John Atanasoff and Clifford
Berry at Iowa State University.
This was not a “general purpose
computer”, nor was it
programmable (so it was not
Turing Complete).
It was specifically designed to
solve systems of linear equations.
Colossus
1943
The Nazis had a machine called the
Lorenz SZ-40 Cypher which was even
more complicated than the Enigma.
The Colossus was the first computer
that could reliably decrypt the Lorenz
Cypher.
This was also the first electronic digital
computer that was somewhat programmable.
It was designed by an engineer named Tommy Flowers.
A total of 10 Colossus computers were made. At the end of the war, all Colossus
computers were destroyed for secrecy. The British code breakers at Bletchley
Park would not receive any credit until these events were declassified in 2012.
ENIAC 1946
The ENIAC (Electronic Numerical
Integrator and Computer) was the
first electronic general purpose
computer.
It was invented by John Mauchly
and J. Presper Eckert. This
computer was twice the size of the
Mark-I, contained 17,468 vacuum
tubes, and was programmed by
rewiring the machine.
ENIAC 1946
The ENIAC was capable of performing 385
multiplication operations per second.
In 1949, it was the first computer used to calculate pi.
The press called it “The Giant Brain.”
EDVAC
1949
The EDVAC (Electronic Discrete
Variable Automatic Computer)
was the successor to the ENIAC.
The main improvement was that
it was a Stored Program Computer.
This meant it could store a program
in electronic memory
(about 5½ kilobytes).
UNIVAC I
1951
The UNIVAC I (UNIVersal Automatic
Computer) was the world’s first
commercially available computer.
The computer became famous when
it correctly predicted the results of
the 1952 presidential election.
The Fifth Era
Computers with Transistors
or Integrated Circuits
The invention of the transistor began
“The Second Generation of Computers”.
The University of Manchester made
the first transistor computer in 1953.
Transistors have certain key
advantages over vacuum tubes.
• They are much smaller.
• They do not get hot and burn out.
Integrated Circuit 1958
Jack Kilby, of Texas Instruments, in Richardson, Texas, developed
the first integrated circuit, which has multiple transistors on a tiny,
thin piece of semiconductor material, called a chip. Jack Kilby used germanium.
Six months later Robert Noyce came up with his own idea for an
improved integrated circuit which uses silicon.
He is now known as “The Mayor of Silicon Valley”.
Both gentlemen are credited
as co-inventors of the
integrated circuit.
This began “The Third
Generation of Computers”.
As technology improved,
we developed the ability
to put billions of transistors
on a tiny microchip.
Video Games
1958/1962
The first video game was called Tennis for Two.
It was created by William Higinbotham and played on a
Brookhaven National Laboratory oscilloscope.
Since this game did not use an
actual computer monitor, some
give credit for the first video game
to SpaceWar! written by
Stephen Russell at MIT in 1962.
IBM System/360
1964
IBM creates a family of computers which cover a complete range of
applications. These computers work for both the math community and the
business community. All computers in this series are compatible, but sell
for different prices based on their speed. System/360 is responsible for
establishing a number of industry standards, including the 8-bit byte.
Apple II
1977
The Apple Computer
Company was created
and introduced the
Apple II Personal
Computer.
It became the first
commercially
successful personal
computer.
VisiCalc
1979
Dan Bricklin created VisiCalc, a spreadsheet program,
which became the first widespread software application to be sold.
WordStar
1979
MicroPro releases WordStar, which became the most
popular word processing program.
IBM PC
1981
IBM's entry into the
personal computer
market gave the
personal computer a
serious image as a
true business
computer and not
some sophisticated
electronic game
playing machine.
MS-DOS
1981
Microsoft, an
unknown little
company run by Bill
Gates, agreed to
produce the
operating system
for the IBM
Personal Computer
and became a
company larger
than IBM.
Compaq Portable 1982
The Compaq Portable is known for two things. It was the
first portable computer, though by today’s standards it was
nothing like a modern laptop: the 28-pound computer
was the size of a small suitcase. It was also the
first computer to be 100% compatible with the IBM PC.
The Macintosh
1984
The "Mac" was the first
commercially successful
computer with the
mouse/windows technology.
The mouse technology had
already been developed earlier by
the Xerox Corporation.
Windows 1.0
1985
The first Windows operating system was actually an
operating environment which acted as a front end while
MS-DOS was running in the background.
Windows 95
1995
Microsoft introduces Windows 95, which uses a graphical
interface similar to that of the Macintosh computer.
Windows Versions 1985-2015
Version | Home Editions | Professional / Power User Editions
1 | Windows 1.0 – 3.1 | Windows NT 3.1
2 | Windows 95 | Windows NT 3.51
3 | Windows 98 | Windows NT 4.0
4 | Windows Millennium | Windows 2000
5 | Windows XP Home Edition, Windows Home Server | Windows XP Professional Edition, Windows Server 2003, Windows Server 2003 R2
6 | Windows Vista | Windows Server 2008
7 | Windows 7, Windows Home Server 2011 | Windows Server 2008 R2
8 | Windows 8, 8.1, Windows Phone 8, Windows RT | Windows Server 2012, Windows Server 2012 R2
9 | Windows 10?
Tianhe-2 Supercomputer 2013
As of June 2013, China’s Tianhe-2 is the fastest computer
in the world. It can perform 33,860,000,000,000,000
floating point operations in 1 second.
That is 88 TRILLION times as fast as the ENIAC!
Program Definition
A program is a sequence of instructions
that makes a computer perform a desired
task.
A programmer is a person who writes a
program for a computer.
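To make the definition concrete, here is a minimal Java sketch (an invented example, not part of the original slides); each statement is one instruction in the sequence the computer carries out.

// Greeter.java -- a program is a sequence of instructions.
public class Greeter {
    public static void main(String[] args) {
        System.out.println("Hello!");          // instruction 1: print a greeting
        int sum = 2 + 3;                       // instruction 2: perform a calculation
        System.out.println("2 + 3 = " + sum);  // instruction 3: print the result
    }
}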
Programming in Machine Code
Programming in Machine Language a.k.a. Machine
Code means you are directly manipulating the 1s and
0s of the computer’s binary language.
In some cases, this means
you are manipulating the
wires of the machine.
In other cases, you are flipping switches on and off.
Even if you had the ability to “type” the 1s and 0s,
machine language would still be incredibly tedious.
Assembly Language & the EDSAC 1949
Assembly Language was first
introduced by the British with
the EDSAC (Electronic Delay
Storage Automatic Computer).
EDSAC had an assembler
called Initial Orders which
used single-letter mnemonic
symbols to represent different
series of bits.
While still tedious, entering a
program took less time and
fewer errors were made.
“Amazing Grace”
In the 1940s, Grace Hopper did not like the way we were programming
computers. There had to be a better way. The whole reason computers were
invented in the first place was to do tedious work for us. It should be possible to
program a computer using English words instead of 1s and 0s.
Grace Hopper wrote the first compiler (a type of translator) in 1952 for the
language A-0. This paved the way for the other languages that followed.
Many of these were also created in part or in whole by Grace Hopper.
Her immeasurable contributions to computer science have earned her the
nickname “Amazing Grace”. The Cray XE6 Hopper supercomputer and the
USS Hopper Navy destroyer are also named after her.
Types of Languages
Low-Level Languages
Languages that function at, or very close to, the level of 1s and 0s.
Powerful, but very difficult.
Examples: Machine Language, Assembly Language
High-Level Languages
Languages that use English-like words as instructions.
Easier, but less powerful.
Examples: BASIC, Pascal, FORTRAN, COBOL, LISP, Java, Python
Very High-Level Languages
Languages that use clickable pictures as instructions.
Example: Lego Mindstorms NXT
Computer Translators
A translator (compiler or interpreter)
translates a high-level language into
low-level machine code.
A compiler translates the entire program
into an executable file before execution.
An interpreter translates one program
statement at a time during execution.
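As a hedged sketch of the compiled workflow (the file and class names are invented): with a compiler, the whole program below is translated before anything runs, while an interpreter would translate and execute it one statement at a time. Strictly speaking, javac compiles Java source into bytecode that the Java Virtual Machine then executes, so Java blends both ideas.

// Translate.java -- illustrating "translate the whole program first, run it afterwards".
// Step 1 (compile everything):       javac Translate.java   -> produces Translate.class
// Step 2 (run the translated form):  java Translate
public class Translate {
    public static void main(String[] args) {
        for (int i = 1; i <= 3; i++) {
            System.out.println("Executing translated statement " + i);
        }
    }
}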
FORTRAN
1957
The first successful high-level programming
language.
FORTRAN stands for FORmula TRANslation
Developed by a team of IBM programmers
for mathematicians, scientists and
engineers.
While good for number crunching,
FORTRAN could not handle the record
processing required for the business world.
LISP
1958
LISP (LISt Processing) was designed by John McCarthy at MIT.
It is known for being one of the languages specifically designed
to help develop artificial intelligence.
LISP introduced several important programming concepts
which are used in modern programming languages today.
NOTE: The question
“Can computers think?”
was first proposed by Alan
Turing in his 1950 paper
Computing Machinery
and Intelligence.
The first section of his paper was titled The Imitation Game.
Eventually, Alan Turing would become known as “The Father
of Modern Computer Science and Artificial Intelligence”.
COBOL
1959
Created (largely by Grace Hopper) for the
business community and the armed forces.
COBOL stands for
COmmon Business Oriented Language.
COBOL became extremely successful when the
Department of Defense adopted COBOL as its
official programming language.
PL/I
1964
PL/I stands for Programming Language 1.
After IBM standardized hardware with System/360, they
set out to standardize software as well by creating PL/I.
This language combined all of the number crunching
features of FORTRAN with all of the record handling
features of COBOL.
The intention was that this language would be
“everything for everyone”.
The reality was that the FORTRAN programmers did not
like the COBOL features, the COBOL programmers did
not like the FORTRAN features, and new programmers
found the language too complex to learn.
BASIC
1964
Tom Kurtz and John Kemeny created BASIC (Beginner’s All-purpose
Symbolic Instruction Code) at Dartmouth College.
Their intention was that a simple language would give non-math
and non-science majors the ability to use computers.
The use of BASIC became
widespread when personal
computers hit the market.
The first was the Altair in 1975.
BASIC required little memory, and it was the only language that
could initially be handled by the first personal computers.
The Altair was shipped with Altair BASIC a.k.a. Microsoft BASIC.
Pascal
1969
College professors did not like BASIC because it did not teach
proper programming structure.
Instead, it taught quick-and-dirty programming.
Niklaus Wirth decided to create a language specifically for the
purpose of teaching programming.
He named this new language Pascal after Blaise Pascal.
Unlike PL/I, Pascal is a very lean language.
It has just enough of both the math features of FORTRAN and
the record handling features of COBOL to be functional.
In 1983, the College Board adopted Pascal as the first official
language for the AP® Computer Science Exam.
C
1972
In 1966, BCPL (Basic Combined Programming Language)
was designed at the University of Cambridge by Martin
Richards.
This language was originally intended for writing compilers.
In 1969, Ken Thompson, from AT&T Bell Labs, created a
slimmed down version of BCPL which was simply referred
to as B.
In 1972, an improved version of B was released.
This was called C.
In 1973, C was used to rewrite the kernel for the UNIX
operating system.
ABC
1975
ABC (the programming language, not the
Atanasoff Berry Computer) was developed at
Centrum Wiskunde & Informatica (CWI) in the
Netherlands by Leo Geurts, Lambert Meertens
and Steven Pemberton.
Their intention was to make a powerful,
readable, intuitive language more “modern”
than BASIC and less “wordy” than Pascal.
C++
1983
As computer programs grew more complex a new,
more powerful, and more reliable type of programming
was needed. This led to the development of
Object Oriented Programming (OOP).
Bjarne Stroustrup wanted to create a new language
that uses OOP, but did not want programmers to have
to learn a new language from scratch. He took the
existing, very popular language C and added OOP to it.
This new language became C++.
In 1997, C++ replaced Pascal as the official language
for the AP® Computer Science Exam.
HTML and the World Wide Web 1990
In 1990, Tim Berners-Lee created the first web
server and the first web browser.
No, he did not “invent the Internet,” but he did
invent the World Wide Web.
That same year, he also developed HTML
(HyperText Markup Language).
This is the language used by all web browsers.
Medium-Level Languages
C and C++ are sometimes considered to
be medium-level languages.
This is because they have the English
commands of a high-level language as
well as the power of a low-level language.
This made C, and later C++, very popular
with professional programmers.
Python
1994
In 1989, Guido van Rossum started implementing
Python as a successor to ABC with new features.
In 1994, Python version 1.0 was
released. Several backward-compatible
versions followed.
Starting with version 3.0, Python
eliminated redundant ways
to perform the same task.
This means Python 3.0 is
NOT backward compatible with any earlier version
of Python. In 2015, Python 3.5 was released.
Java
1995
Released in 1995 by Sun Microsystems.
Java is a Platform Independent Language.
Platform Independence means that the language does
not cause problems as programs are transported
between different hardware and software platforms.
Unlike C++, where OOP is optional, Java requires you to
use OOP, which caused many universities to adopt it.
For this reason, in 2003, Java replaced C++ as the
official language for the AP® Computer Science Exam.
In 2010, Oracle acquired Sun Microsystems.
Java has continued to improve in the same manner as
it did when Sun Microsystems owned the language.
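Since the slides note that Java requires OOP, here is a minimal hedged sketch of what that means in practice: even the smallest Java program lives inside a class. The Student class and its field are invented purely for illustration.

// Student.java -- in Java, all code must live inside a class (OOP is required).
public class Student {
    private String name;                 // data (a field) stored inside each object

    public Student(String name) {        // constructor: builds a Student object
        this.name = name;
    }

    public String greet() {              // behavior (a method) attached to the object
        return "Hi, I am " + name + ".";
    }

    public static void main(String[] args) {
        Student s = new Student("Ada");  // create an object from the class
        System.out.println(s.greet());
    }
}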
Lego Mindstorms NXT
2006
A new kind of programming has
come about that is very high-level.
In this style of programming, the programmers can click
on different blocks. Each block performs a different task. By creating a
sequence of these blocks, you can program a computer.
In 1998, the Lego Corporation created their first point-and-click language,
RCX (Robotic Command eXplorers) for use with their Lego Mindstorms
robots. In 2006, they released their next language, and decided to call it NXT.
In 2013, Lego released EV3 which refers to the language’s “3rd Evolution.”
What will you be learning?
During this school year, you will primarily be
learning Java. There will also be one unit
on Python
and another unit on HTML.
SneakerNet
Early personal computers were not networked at all.
Every computer was a stand-alone computer.
Some computers were hooked up to printers and many
others were not.
If you needed to print something, and you were not
directly connected to a printer, you saved your work to a
floppy disk, put on your sneakers, and walked to the
printing computer.
Sharing files was done in the same way.
Peer-To-Peer Networks
The first practical networks for personal computers were peer-to-peer
networks.
These are small groups of computers with a common purpose all
connected to each other.
These types of networks were frequently called Local Area Networks
or LANs.
Initially, the networks were true peer-to-peer networks.
This means that every computer on the network was equal.
Client-Server Networks
A server is a special computer that is connected to the
LAN for one or more purposes.
It services the other computers in the network which are
called clients.
Servers can be used for printing, logon authentications,
permanent data storage and communication.
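To make the client/server idea concrete, here is a minimal hedged sketch in Java, the course’s language; the port number, messages, and class name are invented for illustration, and real printing or logon servers are far more elaborate. One copy of the program plays the server, a second copy plays the client.

// EchoPair.java -- a tiny client/server demonstration using sockets.
import java.io.*;
import java.net.*;

public class EchoPair {
    public static void main(String[] args) throws IOException {
        if (args.length > 0 && args[0].equals("server")) {
            // Server: wait for one client, read its request, and send a reply.
            try (ServerSocket server = new ServerSocket(5000);
                 Socket client = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(client.getInputStream()));
                 PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                out.println("Server got: " + in.readLine());
            }
        } else {
            // Client: connect to the server, send a request, and print the reply.
            try (Socket socket = new Socket("localhost", 5000);
                 PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(socket.getInputStream()))) {
                out.println("Hello from the client");
                System.out.println(in.readLine());
            }
        }
    }
}

Run it twice from two terminals: first java EchoPair server, then java EchoPair.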
The Department of Defense
The Internet has existed since the 1960s and has its origins in the
"Cold War." During the Cold War there was a major concern about
the country being paralyzed by a direct nuclear hit on the Pentagon.
A means of communication had
to be created that was capable
of continuing to work regardless of
damage created anywhere.
This was the birth of the Internet.
The Internet has no central
location where all the control
computers are located.
Any part of the Internet can be
damaged and all information will
then travel around the damaged area.
The Modern Internet
Normally, businesses and schools have a series of LANs that all
connect into a large network called an Intranet.
An Intranet behaves like the Internet on a local business level.
This promotes security, speed and saves cost.
The moment a school, a business, or your home wants to be
connected to the outside world and the giant worldwide network
known as the Internet, you gain access to millions of lines of
telecommunications.
This costs money, and every person, school, or
business that wants this access needs to use an Internet
Service Provider, or ISP.
You pay a monthly fee to the ISP for the Internet connection.
The amount of money you pay depends on the speed of your
Internet connection.