
ES-103 Computers and Information Systems







Instructor: Gürhan Küçük
Email: [email protected]
Room: A409 (1st Floor)
Recommended Book: Computer Tools for an Information Age, H.L. Capron, Addison-Wesley, 8th Ed., 2003
Course Web Page : http://cse.yeditepe.edu.tr/~gkucuk/es103
Attendance: 80% theoretical, 80% lab
Evaluation:
• 30% Midterm
• 40% Final
• 30% Quizzes & Homeworks
1
Material to be Covered
Week 2: Computing today, historical thoughts, introductory terms. Lab: Windows, Explorer, Internet, email, ftp, …
Week 3: Data storage and memory, hardware, software, specialty items. Lab: DOS
Week 4-6: Hardware Part I: Numbering systems and codes, CPU, character-based peripheral devices. Lab: Word
Week 7: Hardware Part II: Data storage and organization, magnetic storage. Lab: Unix
Week 8: Data communication hardware, local area networks. Lab: Unix
Week 9: SUMMARY AND EXAM. Lab: Summary
2
Material to be Covered
Week 10: Networking. Lab: Excel
Week 11: Operating systems. Lab: Excel
Week 12: Application software. Lab: Powerpoint
Week 13: The art of programming, programming languages, flowcharts. Lab: Powerpoint
Week 14: Database Management Systems. Lab: Access
Week ??: Expert Systems, Robotics and Virtual Reality. Lab: Programming
3
Introduction
What is a computer?

Computer as a useful tool:
• Wide application area: companies, schools, airports, hospitals, banks, the military, ...
• Quite new: a product of the information age.
  • Industrial age: electricity, telephones, radio, automobiles, planes.
  • Information age: computers, the Internet, mobile communication.

Why a Computers and Information Systems course?
• Computer literacy:
  • Awareness of computers in society.
  • Knowledge about computers and how they work.
  • Benefit from the use of the applications and software provided.
  • Benefit from using the World Wide Web.
4
Introduction
Another dimension:
• Scientific curiosity for understanding intelligence, the nature of computation and logic.
• The computer is an end product:
  • A product of the desire to understand intelligence and to build machines that can automatically perform calculations, computation and other intelligent tasks.
• An old problem:
  • Goes back to ancient times.
5
Calculation and Computation
Calculation:
• Determining something by mathematical or logical methods.
• Transforming one or more inputs into one or more results.
  • Example: multiply 7 and 8.

Computation:
• Can be defined as finding a solution to a problem from given inputs by means of an algorithm.
• Denotes a more general process involving data and algorithms.
  • Algorithm: a well-defined set of instructions to perform a certain task.
  • Example: a program that keeps records of the students in a school and answers queries about the data it keeps.
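To make the distinction concrete, here is a short sketch in Python (the student records, field names and query are invented for illustration): calculation is a single transformation such as 7 * 8, while computation applies an algorithm to stored data.

```python
# Calculation: one transformation of inputs into a result.
print(7 * 8)  # -> 56

# Computation: data plus an algorithm that answers queries about it.
# (Hypothetical records and fields, purely for illustration.)
students = [
    {"name": "Ayse", "year": 2},
    {"name": "Mehmet", "year": 1},
    {"name": "Deniz", "year": 2},
]

def students_in_year(records, year):
    """Algorithm: scan every record and keep the names that match the query."""
    return [s["name"] for s in records if s["year"] == year]

print(students_in_year(students, 2))  # -> ['Ayse', 'Deniz']
```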
6
Calculation and Computation
Scientific research on the theory of calculation, computation and logic began before the modern electronic computers were built.
• It involves the work of mathematicians, philosophers and engineers.

To understand computer technology today, we need to go back in history:
• Make a tour and find the milestones that led human beings to today's modern technology.
• Three dimensions:
  • Mathematics.
  • Philosophical issues: developments in logic and in understanding intelligence.
  • Progress in engineering and technology: the ability to build machines and tools that can perform certain tasks.
7
Timeline of Computer Technology
1900 – 1800 BC
The first use of a place-value number system
(e.g. the decimal system: the value of a number depends both on the digit itself and on the position of the digit)

1000 – 500 BC
The invention of the abacus: the first actual calculating mechanism known to man

300 – 600 AD
The first use of the number 0, and of negative numbers
(both first appeared in India)
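The place-value idea above can be sketched in a few lines of Python (the function name is our own; the code just applies the rule "digit times base raised to the power of its position"):

```python
# Place-value (positional) notation: each digit contributes
# digit * base**position, counting positions from the right.
def positional_value(digits, base=10):
    total = 0
    for position, digit in enumerate(reversed(digits)):
        total += digit * base ** position
    return total

print(positional_value([3, 0, 7]))     # 3*100 + 0*10 + 7*1 = 307
print(positional_value([1, 0, 1], 2))  # binary 101 = 5
```

The same three digits give a different value in base 10 and base 2, which is exactly the point: value depends on position, not on the digit alone.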
8
Logic
Aristotle (384 – 322 BC):
• The first comprehensive work on formal reasoning.
• The roots of formal logic go back to Aristotle.
• Aristotle's logic forms the base of the mathematical or symbolic logic that we use today.

Leibniz (1646 – 1716):
• Alphabet of human thought: a symbolic approach.
• He tried to represent all fundamental concepts using symbols, and combined these symbols to represent more complex thoughts.
• Calculus and logic came into play for the first time.
9
Blaise Pascal
In 1640 Pascal started developing a device to help his father add sums of money. The Arithmetic Machine (1642) could only add and subtract, while multiplication and division operations were implemented by performing a series of additions or subtractions.

Gottfried von Leibniz
Leibniz developed Pascal's ideas and, in 1671, introduced the Step Reckoner, a device which, as well as performing additions and subtractions, could multiply, divide, and evaluate square roots by series of stepped additions.
Pascal's and Leibniz's devices were the forebears of today's desk-top computers, and
derivations of these machines continued to be produced until their electronic equivalents
finally became readily available and affordable in the early 1970s.
10
In the early 1800s, a French silk weaver called Joseph-Marie Jacquard invented a way of automatically controlling a silk loom by recording patterns of holes in a string of cards.
In the years to come, variations on Jacquard's punched cards would find a variety of uses, including representing the music to be played by automated pianos and storing programs for computers.
IBM 80-column punched card format
11
Charles Babbage
The first device that might be considered to be a computer in
the modern sense of the word was conceived in 1822 by the
eccentric British mathematician and inventor
Charles Babbage.
The Difference Engine, which was reconstructed in the early 1990s from cast iron, bronze and steel, consisted of 4,000 components, weighed three tons, and was 10 feet wide and 6½ feet tall.
In Babbage's time, mathematical tables, such as logarithmic and trigonometric
functions, were generated by teams of mathematicians working day and night on
primitive calculators.
Due to the fact that these people performed computations they were referred to as
"computers."
In fact the term "computer" was used as a job description (rather than referring to the
machines themselves) well into the 1940s.
This term later became associated with machines that could perform the computations
on their own.
12
The device performed its first sequence of calculations in the early 1990s and returned results to 31 digits of accuracy, far more accurate than a standard pocket calculator.
However, each calculation requires the user to turn a crank hundreds, sometimes thousands
of times…
… so anyone employing it for anything more than the most rudimentary calculations is
destined to become one of the fittest computer operators on the face of the planet!
13
The Difference Engine was actually only partially completed when Babbage
conceived the idea of another, more sophisticated machine called an Analytical
Engine (around 1830).
The Analytical Engine was intended to use loops of Jacquard’s punched cards to
control an automatic calculator, which could make decisions based on the results of
previous computations. This machine was also intended to employ several features
subsequently used in modern computers, including sequential control, branching, and
looping.
Working with Babbage was Augusta Ada Lovelace, the
daughter of the English poet Lord Byron. Ada, who was a
splendid mathematician and one of the few people who fully
understood Babbage's vision, created a program for the
Analytical Engine.
Had the Analytical Engine ever actually worked, Ada's
program would have been able to compute a mathematical
sequence known as Bernoulli numbers. Based on this work,
Ada is now credited as being the first computer programmer
and, in 1979, a modern programming language was named
Ada in her honor.
14
George Boole
Boole made significant contributions in several areas of
mathematics, but was immortalized for two works in 1847 and
1854, in which he represented logical expressions in a
mathematical form now known as Boolean Algebra.
Boole's work was all the more impressive because, with the
exception of elementary school and a short time in a
commercial school, he was almost completely self-educated.
Boole's work was studied only by philosophy and logic
students until, in 1938, Claude E. Shannon published an
article based on his master's thesis at MIT, in which he
showed how Boole's concepts of TRUE and FALSE could
be used to represent the functions of switches in
electronic circuits.
Claude Shannon,
Creator of information theory
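Shannon's bridge between Boole's algebra and circuits can be illustrated with a minimal sketch (in Python; the two-switch circuit is invented for illustration): switches wired in series behave like logical AND, switches wired in parallel like logical OR.

```python
# Modelling switches with Boolean values, as Shannon proposed:
# True = switch closed (current can flow), False = switch open.

def series(a, b):
    # Current flows through two switches in series only if BOTH are closed.
    return a and b

def parallel(a, b):
    # Current flows through two switches in parallel if EITHER is closed.
    return a or b

# Print the full truth table for both circuits.
for a in (False, True):
    for b in (False, True):
        print(a, b, "series:", series(a, b), "parallel:", parallel(a, b))
```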
15
1888
The US Census Bureau brings in Herman Hollerith and his
tabulating machines (including a card punch, a card
reader and a tabulator, or electric adding machine).
1900s
Hollerith's machines are a success, and he forms the
Computing-Tabulating-Recording (CTR) Company,
selling tabulating machines, time clocks and meat scales.
1914
The CTR Company hires Thomas J. Watson, Sr., as president. He
renames the company:
International Business Machines Corporation (IBM)
16
Alan Turing
In 1937 Turing invented a theoretical computer as an abstract
"paper exercise."
This theoretical model, which became known as a Turing
Machine, was both simple and elegant, and subsequently
inspired many "thought experiments."
During World War II, Alan Turing worked as a cryptographer, breaking codes and ciphers at
one of the British government's top-secret establishments.
Turing was a key player in the breaking of the Germans' now-famous ENIGMA code.
In 1943 Turing and colleagues began constructing COLOSSUS to decode the German
Geheimfernschreiber cipher.
COLOSSUS was one of the world's earliest working programmable electronic digital computers.
17
1943 – 1947
ENIAC – Electronic Numerical Integrator and Calculator:
• 10 feet tall,
• occupied 1,000 square feet of floor space,
• weighed approximately 30 tons,
• used more than 70,000 resistors, 10,000 capacitors, 6,000 switches, and 18,000 vacuum tubes.
The final machine required 150 kilowatts of power, which was enough to light a small town.
ONE OF THE MAIN PROBLEMS: it did not have any internal memory as such, but needed to be physically programmed by means of switches and dials.
18
1944 – 1952
EDVAC – Electronic Discrete Variable Automatic Computer
(EDSAC – Electronic Delay Storage Automatic Calculator)
EDVAC's average error-free up-time was approximately 8 hours.
A machine based on the EDVAC concept was operating at Manchester University, England, by June 1948, consisting of 32 words of memory and a 5-instruction instruction set.
19
Johann von Neumann
In June 1944, the Hungarian-American mathematician Johann (John) von
Neumann first became aware of ENIAC.
Von Neumann, who was a consultant on the Manhattan Project, immediately
recognized the role that could be played by a computer like ENIAC in solving
the vast arrays of complex equations involved in designing atomic weapons.
20
In 1945, he published a paper titled "First Draft of a Report on the EDVAC", which called for:
• A memory containing both data and instructions, allowing both data and instruction memory locations to be read from, and written to, in any desired order.
• A calculating unit capable of performing both arithmetic and logical operations on the data.
• A control unit, which could interpret an instruction retrieved from the memory and select alternative courses of action based on the results of previous operations.

The computer structure resulting from the criteria presented in the "First Draft" is popularly known as a von Neumann machine, and virtually all digital computers from that time forward have been based on this architecture.
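The three criteria above can be sketched as a toy von Neumann machine (a hypothetical 3-instruction machine, not any real architecture): a single memory holds both instructions and data, and a control loop fetches, decodes and executes instructions in sequence.

```python
# Toy von Neumann machine: instructions and data share one memory.
# The instruction set (LOAD, ADD, HALT) is invented for illustration.
memory = [
    ("LOAD", 6),   # address 0: copy memory[6] into the accumulator
    ("ADD", 7),    # address 1: add memory[7] to the accumulator
    ("HALT", 0),   # address 2: stop and return the result
    None, None, None,
    40,            # address 6: data
    2,             # address 7: data
]

def run(memory):
    acc, pc = 0, 0                 # accumulator and program counter
    while True:
        op, addr = memory[pc]      # FETCH the next instruction
        pc += 1
        if op == "LOAD":           # DECODE and EXECUTE
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "HALT":
            return acc

print(run(memory))  # -> 42
```

Running it loads 40, adds 2, and halts with 42 in the accumulator; replacing the instructions in memory (rather than rewiring switches, as on ENIAC) changes the program.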
21
1951
Mauchly and Eckert deliver the first UNIVAC
(UNIVersal Automatic Computer).
This was the first computer sold for commercial, non-military
purposes.
22
SMIL, one of the first Swedish computers, built at Lund University in the mid-fifties.
The original SMIL consisted of about 2000 vacuum tubes.
SMIL was the main university computer for more than 15 years and wasn't
decommissioned until 1972.
This picture shows SMIL as it looked in 1956.
23
Two of the greatest inventions of the 20th century:
Transistors and Integrated Circuits
Formed from materials known as semiconductors, which were
not very well understood until the 1950s.
They would be much smaller and lighter, and would require less power, than the vacuum
tubes that were being used until that time.
The world's first transistor,
invented at Bell Labs in 1947
Dr. John Bardeen, Dr. William Shockley, and
Dr. Walter Brattain, inventors
24
Transistors, which may range in number from two to more than 100,000, are integrated
together on pieces of silicon to produce Integrated Circuits that perform more
complex functions.
An integrated circuit contains transistors, capacitors, resistors and other parts packed at
high density on one chip.
The transistors, resistors, and capacitors are formed very small, and at high density, on a
foundation of silicon.
25
1955
IBM introduces 704 series of computers, the first large-scale
systems using transistors
IBM 610 Auto-Point Computer (1957) was described as being "IBM's first personal
computer" on the premise that it was intended for use by a single operator…
but this machine was not based on the stored program concept and it cost $55,000!
1958
IBM introduces 1401 series of computers, bringing card-based
data processing to the average company
1964
IBM introduces the System/360, using microtransistors and
mass-produced core storage devices, and the idea of a non-dedicated, microprogrammed system
26
Other contenders include MIT's LINC (1963), CTC's Datapoint 2200 (1971), the
Kenbak-1 (1971), and the Xerox Alto (1973)…
… but all of these machines were either cripplingly expensive, relatively unusable, or
only intended as experimental projects.
"Personal computer", for all our purposes, will refer to:
an affordable, general-purpose, microprocessor-based computer intended for the
consumer market.
Microprocessor: an integrated circuit semiconductor chip that performs the bulk of the
processing and controls the parts of a system;
"a microprocessor functions as the central processing unit of a microcomputer".
27
Microprocessors
Computers were somewhat scarce in the 1960s, used only by large institutions and
operated by "super-smart" heroes and heroines…
… but there was a large and growing market for electronic desktop calculators.
In 1970, the Japanese calculator company Busicom approached Intel with a request to
design a set of twelve integrated circuits for use in a new calculator.
The first microprocessor, developed by Hoff, contained approximately 2,300 transistors
and could execute 60,000 operations per second.
This design was so radically different from what Busicom had requested that they
politely said they weren't really interested…
In 1974 Intel presented the first true general-purpose microprocessor, which contained
around 4,500 transistors, could perform 200,000 operations per second, and was destined to
be the central processor of many of the early home computers.
28
1974
The Intel Corporation delivers the first integrated circuit
capable of executing a fully usable program, the Intel 8080.
The microprocessor is born.
1977
The Apple Computer Company is started by two college dropouts, Steve Jobs and Steve Wozniak, in their garage.
The machine uses inexpensive parts and a home color television.
A BASIC interpreter is written by Bill Gates of Microsoft.
1981
Microsoft provides the Disk Operating System (DOS) for
the IBM Personal Computer
Late 1980s
The Windows operating shell produced by Microsoft
provides a Graphical User Interface (GUI) for users
29
Computers as we know them:
A computer is a programmable machine.
The two principal characteristics of a computer are:
• It responds to a specific set of instructions in a well-defined manner.
• It can execute a prerecorded list of instructions (a program).
Computers can generally be classified by size and power,
though there is considerable overlap:
30
personal computer : A small, single-user computer based on a microprocessor. In
addition to the microprocessor, a personal computer has a keyboard for entering data, a
monitor for displaying information, and a storage device for saving data.
workstation : A powerful, single-user computer. A workstation is like a personal
computer, but it has a more powerful microprocessor and a higher-quality monitor.
minicomputer : A multi-user computer capable of supporting from 10 to hundreds of
users simultaneously.
mainframe : A powerful multi-user computer capable of supporting many hundreds or
thousands of users simultaneously.
supercomputer : An extremely fast computer that can perform hundreds of millions of
instructions per second.
31
The actual machinery -- wires, transistors, and circuits -- is called the hardware;
the instructions and data are called the software.
All general-purpose computers require the following hardware components:
memory : Enables a computer to store, at least temporarily, data and programs.
mass storage device : Allows a computer to permanently retain large amounts of data.
Common mass storage devices include disk drives.
input device : Usually a keyboard and mouse, the input device is the conduit through
which data and instructions enter a computer.
output device : A display screen, printer, or other device that lets you see what the
computer has accomplished.
central processing unit (CPU): The heart of the computer, this is the component that
actually executes instructions. (If the CPU is built around a microprocessor device, it is
also referred to as a Microprocessor Unit, MPU)
32
Information
in any business or science is the core of the operations.
Information in the computer business is usually called data, a Latin word
meaning "things given".
In addition, you have:
• Temporary memory, Random Access Memory (RAM): where the temporary information needed to carry out the immediate commands is stored.
• Data buses: where the information flows between different parts of the mainboard and the storage devices.
• Expansion slots/ports: where you can connect your extra TV card.
• ROM (acronym for Read-Only Memory): memory that can only be read from and not written to.
33
Storage Devices:
• disk: a spinning platter made of magnetic or optically etched material on which data can be stored.
• disk drive: the machinery that reads the data from a disk and/or writes data to a disk.
• Hard Disk Drive (HDD): where you save your results permanently after you finish your work.
• CD-ROM Drive: where you can read the data stored on a CD-ROM (Compact Disc Read-Only Memory).
• DVD Drive: where you can read the data stored on a DVD-ROM (much larger capacity than a CD-ROM).
• Floppy Drive (FDD): where you save the information you hand in to your instructor after your lab sessions.
• bug: a programming error that causes a program to behave in an unexpected way.
34
Where are we today with computers???
35