sec_l10_2004student - Department of Computer Science


Introduction to Computer Security
Lecture 10
Dr. Richard Spillman
Pacific Lutheran University
Summer 2004
1
Last Lecture
• Computer Crimes
• History
• More Digital Signatures
– Blind Signatures
– DSS
• Certificates
• Quantum Cryptography
– Quantum Factoring
– Quantum Key Management
2
Review – Quantum Properties
• There are four quantum phenomena that
make quantum computing weird
– Interference
– Superposition
– Entanglement
– Non-clonability
3
Outline
• Computer Crimes
• History
• Introduction to Operating System Security
• Access Control
• Biometrics
4
CyberAttacks
Results of the Symantec Internet Security Threat Report: Attack
Trends for Q3 and Q4 of 2002
• One of the most intriguing and challenging questions about cyber attacks is that of intent: was the attacker targeting a specific organization, or simply scanning the Internet in search of an opportunity to exploit vulnerable systems?
– Analysis over the past six months revealed that only 24% of attacks appeared to be targeted
5
Crimes 1
• Trainee hacked into London’s Subway
computers
• Modified text for dot-matrix displays at Piccadilly, Elephant & Castle and Regent's Park subway stations on 16 Aug
• "All signalmen are w***ers" (??)
• Message went unnoticed by tube staff for >12 hours
• Message reappeared on tube station displays on 29 August, possibly due to a technical error
6
Crimes 2
Nov 1995:
• A student at Monmouth University caused a 5-hour disruption of the school's e-mail system by sending 24,000 e-mail messages to 2 administrators
– Student's computer access had been terminated 9 Nov 1995 for posting advertising and business venture solicitations to USENET
– It took 44 hours to trace the source of the attack through an ISP in Atlanta, GA
7
Crimes 3
• PayPal users are under attack by an increasingly
sophisticated series of e-mail worms.
– Since the beginning of the year, at least four e-mail messages
disguised as security upgrade announcements from the financial
service have hit users' inboxes.
– The most recent mailing, sent Feb. 10, was full of spelling errors. With the subject line "PayPal Account Manager," it read: "PayPal has just finish our lastest breakthrough in customer server. The PayPal Account Manager. With this program, you can now have LIVE 24/7 support with aPayPal Tech Support Operator. We hope this increases your PayPal experience."
– Attached to the message was a compressed file called
AccountManager.zip, which contained an executable file that
installed a program to surreptitiously intercept and log keystrokes
on the user's computer (in order to steal passwords and other
confidential information).
8
Early US Codes
• Between WWI and WWII the story of US efforts in
communications intelligence involves both good
news and bad news
• The good news is that prior to WWII we were able to break Purple, the Japanese diplomatic cipher, and while working with the British at Bletchley Park (BP) we continued to break both Japanese and German ciphers
• The bad news is that up until the middle of WWII, we
used some of the weakest ciphers of any nation.
9
Pre-WWI
• Prior to WWI, the US named their diplomatic codes by
the color of their binding
– we had two codes, RED and BLUE - both used 5 figure groups
• In 1912, the President's Commission on Economy and Efficiency asked the State, War, and Navy departments to look into the use of a standard code and less expensive ciphers
– Their choice was the V-cipher with small key words - we called
it the Larrabee
– In 1917, the State department used PEKIN and POKES for
keys
– We used this code throughout WWI
10
What could go wrong?
• The American ambassador to Rumania found it easier to
keep his copy of Larrabee under his mattress rather
than in a locked safe
• One day, it disappeared (it ended up in Moscow), but the
ambassador never reported the loss
• Rather, he let the messages pile up and when it became
necessary he would hop a train to our embassy in
Vienna where he would decode the messages and
encode his reply
• It wasn't until WWI broke out that he admitted to the loss
11
US Cryptanalysis
• During WWII US military codes improved greatly over the
codes and ciphers of the State department
• Cryptanalysis, however, was always strong - in fact it
was often better than agents in the field.
• For example, up to 1943, the Army’s Signal Security
Agency was reading messages of Japanese military
attaches around the world
– In 1943, the OSS penetrated the offices of the Japanese
embassy in Portugal without telling the SSA
– Of course, the SSA never told the OSS that they had broken the
Japanese codes
– The result was that the Japanese detected evidence of a break-in, decided that their codes might have been compromised, and changed them - cutting off the SSA's source of information
12
The Operating System
• Provides an interface between the user and the hardware
• Manages and allocates system resources
[Figure: layered view of "The Machine" - physical devices and hardware, microprogramming, and machine language form the hardware; the operating system sits above them; then the system programs (compilers, command interpreter, window manager); then the application programs and user interface (Word, Excel, the Windows 95 desktop); and finally the user]
13
OS Functions 1
• Files
– maintain file directory
– create, delete, move files
• Processes (programs)
– allocate memory for program and data
– start and stop process
– switch between processes
14
OS Functions 2
• User interface
– commands to copy files, list directories,
etc., invoke operating system processes
– graphical desktop environment
• Input/output
– interface to monitor, printers, disks, etc.
15
OS Structures
• Most CPUs have two modes:
– user mode
– kernel mode
• Kernel mode allows access to all
instructions
• For security, user mode allows only some
instructions
• Normal programs run in user mode
• The operating system runs in kernel mode
16
What is the Kernel?
• The Operating System ‘program’
– Offers services to ‘userland’
• Creates and maintains processes
• Separation of privileges and memory
• Access to devices
•…
– Extensible: network protocols,
filesystems
– No internal privilege levels
17
User Access to the Kernel
• ‘Userland’ can
– inquire about kernel state
– change kernel state
– For example: state of network devices
• Through:
– System calls
– /dev devices (e.g. /dev/kmem)
– /proc filesystem
Most OS Attacks are against the Kernel
18
Multiprogramming OS Functions
Most of these
functions are
security based
• Identification of users
• Authentication of users
• Protection of memory
• File and I/O device access control
• Allocation of and access control to general objects: concurrency and synchronization
• Enforcement of sharing
• Guarantee of fair service
• Interprocess communication and synchronization
• Logging of user actions
• Report of actual (and potential?) security violations
19
Need for a Secure OS
• All applications are executed on the OS
• Hostile software runs with user’s privileges
and permissions
• Distinction between data and code is
vanishing
• Malicious code is unknowingly introduced
20
OS Vulnerabilities
• OS Vulnerabilities discovered in 2000
21
Trusted OS
• The new commercial market is asking for a
TOS, Trusted Operating System
• In general, it involves three components:
1) a mandatory access policy
2) a least privilege model (admin control)
3) independent validation (assurance)
There are several available or under
development
22
Levels of OS Security
• No protection (OK when sensitive procedures are run at separate times)
• Isolation (each process has its own address space, files, and other objects)
• Share all or share nothing (everything is either public or private to its owner)
• Share via access limitation (whether a user can access an object is on a list)
• Share by capabilities (dynamic creation of sharing rights for objects)
• Limit use of an object (protects use as well as access, e.g., view but don't print)
23
OS Isolation
• A typical OS will isolate users and the OS
in memory:
[Figure: memory layout from address 0 to High - Operating System Space, followed by User 1 Space, User 2 Space, ..., User N Space]
24
Address Protection
• Fence confines users to one side of a boundary
• Fixed fence confines operating system to fixed
addresses and users to all other addresses
[Figure: memory from 0 to High - the Operating System occupies addresses 0 to n; the fence at n separates it from the User Program Space, whose addressing range runs from n+1 to High]
25
Variable Fence Register
[Figure: two versions of a variable fence - in Version 1 the Address Limit Register holds n, so the Operating System occupies 0 to n and the user program's addressing range is n+1 to High; in Version 2 the register holds p, so the Operating System occupies 0 to p and the user program's addressing range is p+1 to High]
Can protect operating system from users, but
can’t protect users from each other
26
Base/Bound Register
• A variable fence register is generally known as a base
register.
• But these only provide lower bounds (starting
addresses)
• So a second address register, a bounds register, can be
used.
• This catches user errors like an out-of-range subscript, which may inadvertently clobber someone else's program if not detected; it will not, however, protect the user's own address space.
• This can be solved by using two pairs of base/bounds registers: one for program code (instruction fetches), the other for data accesses.
• This also allows a program to be split and the parts relocated separately.
27
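The base/bounds check described above can be sketched in a few lines. This is an illustrative software model (the class and register names are my own, not from the lecture): every logical address is relocated by the base register and rejected if it falls outside the bounds register.

```python
# Hypothetical sketch of base/bounds address checking (names illustrative).
class BoundsFault(Exception):
    pass

class AddressChecker:
    """Relocate a process's logical addresses with a base register and
    reject anything at or past the bounds register (exclusive end)."""

    def __init__(self, base: int, bound: int):
        self.base = base      # first physical address the process may use
        self.bound = bound    # first physical address past its region

    def translate(self, logical: int) -> int:
        physical = self.base + logical
        if not (self.base <= physical < self.bound):
            raise BoundsFault(f"logical address {logical:#x} out of range")
        return physical

# Two pairs, as on the slide: one for code fetches, one for data accesses.
code = AddressChecker(base=0x1000, bound=0x2000)
data = AddressChecker(base=0x8000, bound=0x9000)

print(hex(code.translate(0x10)))   # 0x1010, inside the code segment
```

An out-of-range subscript in user code would show up here as a `BoundsFault` rather than silently clobbering another user's memory.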
Example
[Figure: two memory maps - on the left, "controlling program only": a single base/bounds register pair (program base n+1, program bounds p) brackets User A's program space among the Operating System and the program spaces of Users B and C; on the right, "controlling program and data fetches": separate base/bounds pairs for the program (n+1 to p) and for data (p+1 to q) let each user's program space and data space be placed and protected independently]
28
Security Features
• Base/Bound Registers combined with OS
software control features form the basis of OS
security
• It is a combination of software and hardware
working together that provides an acceptable
level of security
• The primary issue is user access to system
resources
29
Definition
• Generally speaking, a security policy
describes how people may access
documents or other information.
• A computer’s version of a security
policy consists of a precise set of rules
for determining authorization as a
basis for making access control
decisions.
30
Access Policy
• Access to systems based upon user
identification.
• Access to objects (such as files, directories,
etc.) based upon user identification, where
owners of objects can, at their discretion,
grant access to other users.
• Access to objects (such as files, directories,
etc.) based upon the clearance level of the
user.
31
Access Control
• Access Control is a two-step process
– Telling the system who you are: Identification
– Proving to the system that you are who you
say you are: Authentication
• Benefits of I and A are:
– Can provide a complete log of access and
attempted accesses.
– Access privileges granted/removed quickly
32
Authentication
• There are 3 classical ways to prove
you are who you say you are:
– by something you know
– by something you have
– by something you are
• The first one involves knowledge of a
password
33
Something you are . . .
• Biometrics are increasingly
common as identification rates
improve.
– fingerprints
– retinal scan, iris scan
– facial heat
– voice recognition
– signatures (handwriting)
34
Something you have . . .
• This one is very similar to the “something you
know” technique - in order to implement it there
needs to be:
– an object which may or may not be unique, but to which access is limited to "authorized" users or other subjects (subjects are for now defined as "active participants which operate on objects")
– a way to present this object to the entity which
requires the subject to provide proof
– A way to determine if the object as presented is the
one which was expected
35
Something you know . . .
• a word (password)
• an algorithm (pass-algorithm)
• a phrase (pass-phrase)
• a picture (pass-picture)
• a certificate
• a combination or sequence of the above
• One can ask the user to produce the secret, or to select the secret from a set. This is called the challenge-response method, and it may occur more than once throughout a session; it is used to validate the identity claimed by the unknown party.
– Challenge = Demand identification
– Response = Exhibited/chosen secret
36
Challenge/Response
37
Example
• Smart cards for challenge-response
[Figure: challenge-response exchange between the User, a PassPort smart card, the Authentication Server, and the Service Provider - 1. User enters ID; 2. Host looks up the ID and fetches the sequence number; 3. Server issues a challenge; 4. User enters a PIN on the PassPort; 5. PassPort computes the response using its secret key; 6. Response is sent back; 7. Authentication results are passed to the service provider]
38
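The core of the exchange on the slide can be sketched with a keyed MAC. This is an illustrative reconstruction, not the actual PassPort protocol: the key handling, message format, and function names are assumptions.

```python
import hashlib
import hmac
import secrets

# Hypothetical sketch of a challenge-response round with a shared secret.
def issue_challenge() -> bytes:
    return secrets.token_bytes(16)     # step 3: server's random challenge

def card_response(secret_key: bytes, challenge: bytes) -> bytes:
    # step 5: the card computes a keyed MAC over the challenge
    return hmac.new(secret_key, challenge, hashlib.sha256).digest()

def verify(secret_key: bytes, challenge: bytes, response: bytes) -> bool:
    # step 7: server recomputes the MAC and compares in constant time
    expected = hmac.new(secret_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

key = secrets.token_bytes(32)          # shared with the card at enrollment
c = issue_challenge()
r = card_response(key, c)
print(verify(key, c, r))   # True
```

Because the challenge is fresh each round, replaying an old response fails verification, which is the point of the method.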
Next Step
• Once a user is identified and confirmed, the OS must control that user's access to system objects
• Objects that need protection:
– Memory
– Files on storage devices
– An executing program in memory
– A directory of files
– A hardware device
– A data structure, such as a stack
– Parts of the operating system
– Instructions
– Passwords
– The protection mechanism
39
Goal
• The OS should check every access to an object
• The OS should operate on the principle of least
privilege
– a process should have access to the smallest number
of objects necessary to accomplish a given task
• The OS should always verify acceptable usage
40
Method One – Directory
• Works like a file directory.
• Easy to implement: one list per user, naming all
the objects the user is allowed to access.
• Sharing is possible: several users can have a
pointer (a name) to the same object in their
directory.
• Problem: revocation of access
– To change the access of a group of users the system
must search through all lists for the object.
• Too simple for most object protection situations.
41
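The revocation problem above is easy to see in a sketch. The dictionaries and names below are illustrative: with one list per user, removing all access to an object means scanning every user's directory.

```python
# Hypothetical sketch of Method One: one directory per user, naming the
# objects that user may access and with what rights.
directories = {
    "alice": {"file1": {"read", "write"}, "file2": {"read"}},
    "bob":   {"file1": {"read"}},          # sharing: bob also names file1
}

def revoke_object(obj):
    # No per-object list exists, so every user's directory must be searched
    # to revoke access - the drawback the slide points out.
    for rights in directories.values():
        rights.pop(obj, None)

revoke_object("file1")
print(directories)   # file1 is gone from every user's directory
```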
Method Two – Access Control
• One list for each object.
• The list shows all subjects who should
have access to the object and what their
access is.
• Groups can also be used as subjects.
• Often used, e.g. in Windows NT.
42
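By contrast, keeping one list per object makes revocation a single lookup. A minimal sketch, with illustrative object and subject names:

```python
from collections import defaultdict

# Hypothetical sketch of Method Two: one access control list per object,
# mapping each subject (user or group) to its set of rights.
acl = defaultdict(dict)   # object -> {subject: set of rights}

def grant(obj, subject, *rights):
    acl[obj].setdefault(subject, set()).update(rights)

def revoke(obj, subject):
    acl[obj].pop(subject, None)   # one list per object: revocation is easy

def check(obj, subject, right) -> bool:
    return right in acl[obj].get(subject, set())

grant("payroll.txt", "alice", "read", "write")
grant("payroll.txt", "staff", "read")          # groups can be subjects too
print(check("payroll.txt", "alice", "write"))  # True
revoke("payroll.txt", "alice")
print(check("payroll.txt", "alice", "read"))   # False
```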
Access Control Policies
• In general, users can and should have the
ability to decide who has access to their files,
programs, . . .
• There are three general policies for allowing access in this fashion
– Discretionary Access Control, DAC
– Mandatory Access Control, MAC
– Role Based Access Control, RBAC
43
DAC Concept
• Discretionary Access Control (DAC) is a data
access control policy that allows users to grant
or deny other users access to their files.
• Common implementations
– Permission Bits
– Password Schemes
– Capability Lists
– Access Control Lists (ACLs)
44
Permission Bits
• Used by Unix, VMS and other systems.
– A user is specified as the owner of each file or directory.
– Each file or directory is associated with a group.
– At any specific time each user is associated with a group.
• Example: [figure not reproduced]
• Drawbacks
– Insufficient granularity (how does Alice give ONLY Bob read access to file1?)
– Cannot deny access to a single user
45
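The owner/group/other scheme can be illustrated by decoding a mode value with Python's standard `stat` constants. This is a sketch of how the nine permission bits map to the familiar `rwx` triplets, not part of the lecture:

```python
import stat

# Decode the Unix owner/group/other permission bits of a mode value.
def rwx(mode: int) -> str:
    bits = [
        (stat.S_IRUSR, "r"), (stat.S_IWUSR, "w"), (stat.S_IXUSR, "x"),
        (stat.S_IRGRP, "r"), (stat.S_IWGRP, "w"), (stat.S_IXGRP, "x"),
        (stat.S_IROTH, "r"), (stat.S_IWOTH, "w"), (stat.S_IXOTH, "x"),
    ]
    return "".join(ch if mode & bit else "-" for bit, ch in bits)

# 0o640: owner read/write, group read, others nothing. Note the coarse
# granularity the slide criticizes: there is no way to single out one user.
print(rwx(0o640))   # rw-r-----
```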
DAC Weakness
• Suppose you have a system that:
–
–
–
–
–
Correctly enforces an I and A policy
Correctly enforces a DAC policy
Stores both Unclassified and Secret information
Has both Unclassified and Secret users
Also suppose that all Secret users act in accordance
with the procedures for handling classified
information in that they do not set access
permissions on files containing Secret information
such that Unclassified users can view them
• What can go wrong?
– Malicious software
46
Example
• An unclassified user, Ivan, brings a great Star Trek game into work. The game becomes very popular. Unbeknownst to users, the program surreptitiously copies users' files into Ivan's directories with permissions such that Ivan can read them.
– This type of program is called a Trojan Horse program. It performs a useful function so that users will use it, but it secretly performs other actions.
• When Alice, a Secret user, runs programs, those programs (text editors, etc.) are able to access all files accessible by Alice, because those programs are running on behalf of Alice.
• When Alice runs the Star Trek program, it too runs on her behalf and can access all files accessible by Alice.
– Thus, the game program can read all files readable by Alice and make copies of them in Ivan's directory with permissions on the files set such that they are readable by Ivan.
– So when Alice runs the game program (or any malicious software) it can do anything that Alice can do.
– Conclusion: DAC mechanisms have an inherent weakness. They are vulnerable to Trojan Horse attacks.
47
Risks of Malicious Software
• Consider this:
– How much software on your own system did you write?
– How much software on your system can you absolutely vouch
for?
– More and more software is written overseas these days.
– It only takes one bad engineer in a group of a thousand good
engineers to embed a Trojan Horse in a product.
– If you store information that is worth stealing, the Trojan Horse
attack is very attractive
– Are you running a browser that downloads and executes Java
applets?
48
MAC Policy
• A Mandatory Access Control policy is a policy in
which people do not have control over the
authorization of access to information.
– As a user, you can not set access authorizations
• Why do we need a MAC?
– To plug the loophole in DAC that allows malicious
software to invade an OS
49
MAC Implementation
• In an implementation of a MAC policy
– Each subject has a label (or access class).
– Each object has a label (or access class).
– The ability of a subject to access an object is based upon a comparison of the subject's label and the object's label.
– Two labels are compared using the "dominance" operator: if label A dominates label B, we write A ≥ B.
• As an example, consider the set of military classification levels {Top Secret, Secret, Confidential, Unclassified}, where:
• Top Secret ≥ Secret
• Top Secret ≥ Confidential
• Top Secret ≥ Unclassified
• Secret ≥ Confidential
• Secret ≥ Unclassified
• Confidential ≥ Unclassified
50
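For the hierarchical military levels on the slide, dominance is just a numeric comparison. A minimal sketch (the level-to-number mapping is an illustrative encoding):

```python
# Sketch of label dominance for a strictly hierarchical label set.
LEVELS = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

def dominates(a: str, b: str) -> bool:
    """True if label a dominates label b (a >= b in the ordering)."""
    return LEVELS[a] >= LEVELS[b]

print(dominates("Top Secret", "Secret"))     # True
print(dominates("Confidential", "Secret"))   # False
```

Real MAC labels may also carry compartments, in which case dominance becomes a partial order rather than a simple numeric one.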
Bell & LaPadula Model
• Let S be the set of all subjects in a system and O be the set of all objects in a system.
– For each subject s in S there exists a label or access class for s called C(s).
– For each object o in O there exists a label or access class for o called C(o).
No Read Up
• Simple Security Property
– A subject s may have read access to an object o only if C(s) ≥ C(o)
(You shall only view objects which are classified at the same level or lower than the level for which you are cleared)
No Write Down
• *-Property or Confinement Property
– A subject s may have write access to an object o only if C(s) ≤ C(o)
(You shall not talk to anyone who is cleared at a level below you)
51
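The two Bell-LaPadula rules can be written directly from the dominance ordering. A sketch for the hierarchical levels used in the lecture (the numeric encoding is an assumption of the sketch):

```python
# Sketch of the Bell & LaPadula access rules for hierarchical levels.
LEVELS = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

def can_read(subject_level: str, object_level: str) -> bool:
    # Simple Security Property: no read up -- requires C(s) >= C(o)
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level: str, object_level: str) -> bool:
    # *-Property (confinement): no write down -- requires C(s) <= C(o)
    return LEVELS[subject_level] <= LEVELS[object_level]

print(can_read("Secret", "Unclassified"))   # True: reading down is allowed
print(can_write("Secret", "Unclassified"))  # False: no write down
```

Note that `can_write("Unclassified", "Secret")` is True: the model permits blind write-ups, which the "Questions 1" slide below also points out.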
Confinement Property
• Why have the confinement property?
– Recall the Star Trek game that contained a Trojan Horse
program. If a Secret user uses the program on a system that
does not enforce the *-Property, the Trojan Horse could read
Secret files and write them to Unclassified files, where Ivan (the
person who wrote the Star Trek game) (who is an Unclassified
user) can read them.
– If, however, a system enforces the *-Property, a Trojan horse can
not write down.
• Thus:
– In a computer system, a mandatory policy can protect
information in objects from unauthorized access even in the face
of malicious software.
52
Questions 1
• Can an Unclassified user blindly write to Secret?
– Yes. The model allows it, but most implementations prohibit
arbitrary blind write ups.
• Who puts the access class label values on objects
(files)?
– When an object (a file) is created (e.g., with a text editor), its
access class value is specified as part of the creation process.
– When files are imported into a system (off a floppy disk, from the
network, etc.), they are labeled appropriately.
– If a file is downloaded from an Unclassified network, it is labeled
as Unclassified.
– If a file is downloaded from a Secret network, it is labeled
Secret.
– If a file is imported off an Unclassified Floppy Disk, it is labeled
as Unclassified.
53
Questions 2
• How does Alice, a Secret user, write information to an
Unclassified file?
– Systems that support MAC policies, must also support the
notion of a session level.
– When a user logs on they request a session level, which can be
any level up to their clearance level.
– If Alice logs on and requests a session level of Secret, a Secret
level subject is created on her behalf. This subject can read
files at or below Secret and can write files at or above Secret.
– While Alice is logged in, she can re-negotiate a new session
level to any other level that she is allowed to operate at. This
means if she needs to write an Unclassified file, she must
negotiate an Unclassified session.
54
Trojan Horse in a MAC
• Note that a Trojan Horse can write information
between objects at the same security level.
– For example a Trojan horse can read one Secret file
and copy it to another Secret file.
– This is not a problem because:
• This scenario would require a bad guy (e.g., Ivan) to have a
Secret clearance. (So you need personnel security too.)
• He brings in his killer Star Trek game (with an embedded
Trojan Horse).
• Sue, a Secret user, plays the Star Trek game and the Trojan
Horse copies her Secret files into Ivan’s directory.
• But Ivan is already cleared for Secret information so the
Trojan Horse does not get him any information he is not
already cleared to see.
55
Breaking a MAC
• Covert Channels can still leak information from high to
low in spite of a MAC policy.
• Covert channels are flows of information between
access class levels counter to a MAC policy but
which are allowed by an implementation.
– Covert channels are a means of leaking information from high
to low, one bit at a time.
– If the rate of transmitting bits across the channel (the channel
baud rate) is great, this threat is significant.
– Covert channels involve two programs, of which one must be a
Trojan Horse. Covert channels are a little complicated to
implement.
56
Covert Storage Channel - Setup
• Ivan, (a low user) introduces a Trojan Horse program
(e.g., Star Trek game) into the system and somehow
gets a high user to execute it.
– When the high user plays the Star Trek game a sub-program is
spawned and goes to sleep. The sub-program contains the
Trojan Horse and wakes up and starts running at a time when
activity on the system is low (e.g., at 0100).
– Ivan starts another program (a low program) that will wake up
at 0105, (5 minutes later than the high program). This allows
the high program time to initialize the channel.
– The high program finds a high file to copy (fileA).
– The high program initializes the channel by repeatedly creating
files until the "disk full" exception is returned.
57
Covert Storage Channel - Payoff
• The two programs will synchronize with each other by reading a
system clock. The high program will signal bits on every even
millisecond and the low program will receive bits on every odd
millisecond.
– The high program starts reading the bits out of FileA. The following steps
are repeatedly performed until the high program is through reading the
file.
– The high program does: (on even milliseconds)
• If a bit is a 0, the high program deletes one file. (Creating room on the disk for
a file to be created.)
• If a bit is a 1, the high program does not delete a file. (So there is no room on
the disk to create a file).
– The low program does: (on odd milliseconds)
• The low program always tries to create a file. If there is room on the disk, the
create file call is successful. If the call is successful, the low program writes a 0
into a destination file.
• If there is no room on the disk, the create file call will fail, with the "disk full"
exception. If the call is unsuccessful, the low program writes a 1 into the
destination file.
58
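The disk-full signalling above can be simulated in a single process. This toy sketch replaces the real filesystem with a slot counter and runs the high and low steps inline, so it omits the clock synchronization and the two separate programs of the real channel; all names are illustrative.

```python
# Toy simulation of the covert storage channel: a "disk" with a fixed
# number of file slots stands in for the shared filesystem.
class Disk:
    def __init__(self, slots: int):
        self.free = slots

    def create(self):
        if self.free == 0:
            raise OSError("disk full")   # the observable shared state
        self.free -= 1

    def delete(self):
        self.free += 1

def leak(bits, disk):
    """High signals each bit; low recovers it via create-file success."""
    received = []
    for bit in bits:                 # one bit per (even, odd) time slot
        if bit == 0:
            disk.delete()            # high: free one slot to signal a 0
        try:                         # low: always try to create a file
            disk.create()
            received.append(0)       # success -> high must have deleted
        except OSError:
            received.append(1)       # "disk full" -> high signalled a 1
    return received

disk = Disk(slots=8)
for _ in range(8):
    disk.create()                    # high initializes: fill the disk
print(leak([1, 0, 1, 1, 0, 0, 1, 0], disk))   # [1, 0, 1, 1, 0, 0, 1, 0]
```

No file contents ever cross the security boundary; the information flows entirely through the shared "disk full" condition, which is why MAC label checks do not stop it.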
Covert Timing Channel
• Covert timing channels exploit a mechanism
where a high subject can affect the timing of a
low subject.
– A potential timing channel, which exists on single processor
systems, uses the fact that both the high subject and the low
subject use the same physical processor.
– To signal a 1, the high subject performs a lengthy operation
(e.g., disk I/O) and signals a 0 by performing a short operation.
– When the high subject finishes its operation, the low subject is
scheduled to run.
– When the low subject gets scheduled, it reads the system clock
and determines how long the high subject operation took.
59
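The timing variant can be sketched the same way. This single-process toy runs the high subject's operation inline and times it, standing in for the scheduler handing the CPU to the low subject afterwards; the durations are illustrative.

```python
import time

# Toy simulation of the covert timing channel: the high side encodes each
# bit as a long or short busy period; the low side recovers it by timing.
LONG, SHORT = 0.08, 0.005   # seconds; stand-ins for disk I/O vs. a no-op

def high_operation(bit):
    time.sleep(LONG if bit else SHORT)   # simulated lengthy/short operation

def channel(bits):
    received = []
    for bit in bits:
        start = time.monotonic()
        high_operation(bit)              # low is scheduled when high finishes
        elapsed = time.monotonic() - start
        # low reads the clock and thresholds the elapsed time
        received.append(1 if elapsed > (LONG + SHORT) / 2 else 0)
    return received

print(channel([1, 0, 0, 1]))
```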
Covert LED Channel
• Researchers have demonstrated the possibility
of reproducing streams of data by observing the
LED indicator lights on computer equipment
– Modems
– Network routers
– Hard drives
• More than a third of the devices tested were susceptible to optical eavesdropping at distances of up to at least 20 meters and at data rates of up to at least 56 Kbps
60
Optical Eavesdropping
• Optical Eavesdropping is possible because
LED lights are very bright and respond quickly
– in 10’s of nanoseconds to applied voltage
– As a result they mirror the data signal
• In addition the RS-232 communications
protocol has features such as its start and stop
bits that simplify the analysis of observed LED
light patterns
– A hacker armed with a telescope, an optical detector and a fairly simple analysis code could read an entire data stream of many network devices with very little difficulty
61
Other Eavesdropping
• The researchers found other opportunities to
exploit this covert channel
– Even in a room filled with flashing modems and other
light sources, it would be possible to reconstruct data
from a single modem
– CRT displays may also be vulnerable to similar
methods
– They also discovered an exploit involving special
software installed on various platforms that would
allow a remote spy to read computer information
directly from keyboard LEDs
62
RBAC Concept 1
• A user’s permissions are determined by the
user’s roles
– rather than identity (DAC) or clearance (MAC)
– roles can encode arbitrary attributes
• Facilitates
– administration of permissions
– articulation of policy
• ranges from very simple to very sophisticated
63
RBAC Concept 2
• Regulate the access of users to the information
on the basis of the activities that the users
execute in the system
• Use organizational roles as an origin
• Access control is not centered around
– “who is allowed to do what”
• But instead
– “which job (role) is allowed to do what” and
– “who is currently filling which job (role)”
64
Users and Roles
• Users are
– human beings or
– other active agents
– Each individual should be known as exactly one user
• A role brings together
– a collection of users and
– a collection of permissions
• These collections will vary over time
– A role has significance and meaning beyond the particular users and permissions brought together at any moment
65
Roles vs. Groups
• Groups are often defined as a collection of users
• A role is
– a collection of users and
– a collection of permissions
• Individuals are assigned to roles
• A role is activated to get permissions for
performing a task
66
RBAC Model
[Figure: User 1 and User 2 are assigned roles (Role 1, Role 2); roles are assigned permission to access objects (Object 1, Object 2, Object 3)]
67
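The users → roles → permissions indirection on the slide can be sketched with two mappings. The role and object names below are illustrative:

```python
# Hypothetical sketch of the RBAC model: permissions attach to roles,
# and users acquire permissions only through their assigned roles.
user_roles = {"alice": {"payroll_clerk"}, "bob": {"auditor"}}
role_perms = {
    "payroll_clerk": {("payroll.db", "read"), ("payroll.db", "write")},
    "auditor":       {("payroll.db", "read")},
}

def allowed(user, obj, action) -> bool:
    return any((obj, action) in role_perms.get(role, set())
               for role in user_roles.get(user, set()))

print(allowed("alice", "payroll.db", "write"))  # True
print(allowed("bob", "payroll.db", "write"))    # False: auditors only read
```

When Bob changes jobs, the administrator edits one `user_roles` entry instead of touching every object, which is the administrative benefit RBAC claims.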
Biometrics
• Depend on unique physiological and behavioral
characteristics that can be examined and
quantified
• Obtains data from you, converts to digital
representation, and compares to stored sample
• Should be used in a two-factor authentication
system
– that is, with passwords
68
Why Biometrics?
• Passwords cost too much
• 20-50% of corporate help desk calls are password related
– 24/7 help desk support costs about $150/yr per user
– At the New York Times web site, about 1,000 people per week forget their passwords
– Lost productivity from password lockout
69
Popular Biometric Techniques
• Fingerprint verification
• Hand geometry
• Voice verification
• Retinal scanning
• Iris scanning
• Signature verification
• Facial recognition
• "Bleeding edge" biometrics
– Gait, odor, ear, hand vein, thermography, ...
70
Biometric Companies
71
Biometric Process 1
• The first step is an enrollment process in which
the initial biometric information is gathered and
associated with a specific individual
[Figure: Biometric Capture produces an Image; Image Processing converts it to a binary Template (e.g., 101000 011010 101110 110001), which is placed in Template Storage]
72
Biometric Process 2
• The second step is a verification process
[Figure: Biometric Capture and Image Processing produce a Live Template; Template Extraction retrieves the Stored Template from Template Storage; Biometric Matching compares the two and outputs a Matching Score (e.g., 95%)]
73
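The matching step can be sketched with a toy similarity measure. Real systems use feature-specific matchers, so treat this as an illustration of the score-and-threshold idea only; the templates and threshold are made up.

```python
# Toy sketch of biometric verification: compare a live template to the
# stored one and accept if the matching score clears a threshold.
def match_score(live: str, stored: str) -> float:
    """Fraction of template bits that agree (templates as bit strings)."""
    assert len(live) == len(stored)
    agree = sum(a == b for a, b in zip(live, stored))
    return agree / len(live)

stored = "101000011010101110110001"   # template captured at enrollment
live   = "101000011010101110110101"   # freshly captured template
score = match_score(live, stored)
print(round(score, 3))                # 23 of 24 bits agree here
accepted = score >= 0.95              # threshold trades FRR against FAR
print(accepted)
```

Raising the threshold rejects more imposters but also more genuine users, which is exactly the FRR/FAR trade-off discussed on the next slides.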
Biometric Evaluation 1
• A decision made by a biometric system is either a
genuine individual type of decision or an imposter
individual type of decision.
• There are two types of decision outcomes: true or false. Given these two types of decisions and the two decision outcomes, there are 4 possible combined outcomes:
1. A genuine individual is accepted.
2. A genuine individual is rejected.
3. An imposter is rejected.
4. An imposter is accepted.
• Outcomes 1 & 3 are correct, whereas outcomes 2 & 4 are incorrect.
74
Biometric Evaluation 2
• In principle we can use the following to assess systems
– False (genuine individual) Rejection Rate (FRR) (also called
Type I error), and
– The False (imposter) Acceptance Rate (FAR) (also called Type II
error),
– The equal error rate (rate where FAR and FRR are equal)
• These are test population and system configuration
dependent and can not be generalized even for the
same system under different populations or test
conditions!
• Statistical methods are used to assess system
performance
75
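The equal error rate can be illustrated by sweeping the decision threshold over score samples. The genuine and imposter scores below are made-up toy data, not measurements:

```python
# Illustrative sketch: find the threshold where FRR and FAR cross (the EER)
# for small, made-up samples of genuine and imposter matching scores.
genuine  = [0.91, 0.88, 0.95, 0.80, 0.97, 0.85, 0.92, 0.78]
imposter = [0.40, 0.55, 0.62, 0.30, 0.70, 0.81, 0.45, 0.50]

def rates(threshold):
    frr = sum(s < threshold for s in genuine) / len(genuine)     # Type I
    far = sum(s >= threshold for s in imposter) / len(imposter)  # Type II
    return frr, far

# Sweep thresholds 0.00 .. 1.00 and keep the one with the smallest gap.
best = min((abs(f - a), t, f, a)
           for t in (i / 100 for i in range(101))
           for f, a in [rates(t)])
_, threshold, frr, far = best
print(threshold, frr, far)   # FRR == FAR == 0.125 at the crossover here
```

As the slide warns, these rates depend on the test population and system configuration, so a published EER does not transfer between deployments.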
Fingerprint
• False rejection rate: 1 - 5% (three tries).
• False acceptance rate: 0.01 - 0.0001 % (three tries).
• Vulnerability: Dummy fingers and dead fingers
• Ease of use: Easy to use, but “suspect”
• Speed: 2 seconds
• Storage: 800–1203 bytes
76
Steal a Fingerprint
• Lifting a fingerprint can go unnoticed and takes less than a minute
• A dental technician could prepare a silicone finger in 1 to 2 days
77
Hand Geometry
• False rejection rate: 0.2 % (one-try)
• False acceptance rate: 0.2 % (one-try)
• Vulnerability: difficult without cooperation
• Speed: < 3 seconds
• Storage: 9 bytes
78
Retinal Scan
• False rejection rate: 12.4 % (one-try) 0.4 % (three-try);
• False acceptance rate: 0
• Vulnerability: None;
• Ease of use: difficult, socially unacceptable
• Speed: 1.5 seconds;
• Storage: 40 bytes
79
Performance Observation
• At Newark airport, an average of 70,000
passengers pass through daily. If all of these
used biometric-authenticated smart cards for
identification, there would be 140 falsely
rejected (and inconvenienced) passengers per
day for fingerprints, and 10,500 for face or
voice.
– Lawrence O’Gorman, “Seven Issues with Human
Authentication Technologies”, AutoID 2002
80
Privacy & Legal Questions
• These questions must be considered to the
satisfaction of the user, the law, and society.
– Is the biometric data like personal information (e.g. such
as medical information) ?
– Can medical information be derived from the biometric
data?
– Does the biometric system store information enabling a
person’s “identity” to be reconstructed or stolen?
– Is permission received for any third party use of
biometric information?
– What happens to the biometric data after the intended
use is over?
– How is a theft detected and “new” biometric
recognized?
– Notice of Biometric Use. Is the public aware a biometric
system is being employed?
81
Possible Quiz
• Remember that even though each quiz is
worth only 5 to 10 points, the points do
add up to a significant contribution to your
overall grade
• If there is a quiz it might cover these
issues:
– What is a covert channel? Give an example.
– What is the primary issue in OS Security?
– Why are Biometrics necessary?
82
Summary
• Computer Crimes
• History
• Introduction to Operating System Security
• Access Control
– DAC
– MAC
– RBAC
• Biometrics
83