Information Theory
Information Theory and Games
(Ch. 16)
Information Theory
• Information theory studies information flow
• In information theory, "information" has no meaning
– as opposed to its daily usage, and
– as opposed to semiotics
• Component 1 passing information to component 2:

  [Component 1] --information--> [Component 2]

– How much information did component 2 gain?
– Was there any distortion ("noise") while passing the information?
• The measure of information gained from a single yes/no event is a number in the [0,1] range:
– 0 bits: gained no information
– 1 bit: gained the most information
Recall: Probability Distribution
•The events E1, E2, …, Ek must meet the following conditions:
– Exactly one always occurs
– No two can occur at the same time
•The probabilities p1, …, pk are numbers associated with these events, such that 0 ≤ pi ≤ 1 and p1 + … + pk = 1
A probability distribution assigns probabilities to events such that the two properties above hold
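The two conditions above can be checked mechanically; a minimal sketch (the helper name is my own, not from the slides):

```python
def is_probability_distribution(probs, tol=1e-9):
    """True if every p is in [0, 1] and the probabilities sum to 1."""
    return (all(0.0 <= p <= 1.0 for p in probs)
            and abs(sum(probs) - 1.0) <= tol)

print(is_probability_distribution([0.5, 0.5]))    # fair coin -> True
print(is_probability_distribution([0.99, 0.01]))  # unfair coin -> True
print(is_probability_distribution([0.7, 0.7]))    # sums to 1.4 -> False
```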
Information Gain versus Probability
•Suppose that I flip a "totally unfair" coin (it always comes up heads):
What is the probability that it will come up heads? 1
How much information do you gain when it lands? 0
•Suppose that I flip a "fair" coin:
What is the probability that it will come up heads? 0.5
How much information do you gain when it lands? 1 bit
Information Gain versus Probability (2)
•Suppose that I flip a "very unfair" coin (99% of the time it comes up heads):
What is the probability that it will come up heads? 0.99
How much information do you gain when it lands? A fraction of a bit
• Imagine a stranger, "JL". Which of the following questions, once answered, will provide more information about JL:
– Did you have breakfast this morning?
– What is your favorite color?
Information Gain versus Probability (3)
•If the probability that an event occurs is high, I gain less
information when the event actually occurs
•If the probability that an event occurs is low, I gain
more information when the event actually occurs
•In general, the information provided by an event decreases as the probability of that event increases.
Information gain of an event e (Shannon and Weaver, 1949):
I(e) = log2(1/p(e))
Don't be scared. We'll come back to this soon
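The formula can be tried directly on the three coin examples above; a minimal sketch:

```python
import math

def information_gain(p):
    """I(e) = log2(1 / p(e)), in bits; assumes 0 < p <= 1."""
    return math.log2(1.0 / p)

print(information_gain(1.0))   # totally unfair coin: 0.0 bits
print(information_gain(0.5))   # fair coin: 1.0 bit
print(information_gain(0.99))  # very unfair coin: a fraction of a bit
```

Note how the rare event carries more information: the 99%-heads coin landing heads yields only about 0.0145 bits.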
Information, Uncertainty, and Meaningful
Play
• Recall the discussion of the relation between uncertainty and games
– What happens if there is no uncertainty at all in a game
(both at macro-level and micro-level)?
• What is the relation between uncertainty and information
gain?
If there is no uncertainty, information gained is 0
Let's Play Twenty Questions
• I am thinking of an animal:
• You can ask “yes/no” questions only
• Winning condition:
– You guess the animal correctly after asking 20 questions or fewer, and
– you don't make more than 3 attempts to guess the right animal
What is happening?
(Constitutive Rules)
• We are building a binary (two-children) decision tree: each node is a question with a "yes" branch and a "no" branch

  # levels | # nodes
  ---------|--------
      0    |   2^0
      1    |   2^1
      2    |   2^2
      3    |   2^3

The number of questions asked equals the number of levels, which is log2(# nodes)
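The doubling pattern and the log2 relation can be checked in a few lines of Python (the million-animal figure below is my own illustration, not from the slides):

```python
import math

# Each level of yes/no questions doubles the number of nodes,
# so #levels = log2(#nodes).
for levels in range(4):
    nodes = 2 ** levels
    print(levels, nodes, math.log2(nodes))  # log2(#nodes) recovers #levels

# Conversely, telling apart N animals needs ceil(log2(N)) yes/no
# questions, so 20 questions are enough for about a million animals:
print(math.ceil(math.log2(1_000_000)))  # -> 20
```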
Noise and Redundancy
• Noise: affects component-to-component communication
– Example in a game? Charades: playing with noise

  [Component 1] --information--> [Component 2]

– Noise: distortion in the communication
• Redundancy: counterbalance to noise
– Making sure information is communicated properly
– Example in a game? Crossword puzzle

  [Component 1] --information--> [Component 2]
                --information-->   (same information, second channel)

– Redundancy: passing the same information by two or more different channels
• Balancing act:
– Too much structure and the game becomes overdetermined
– Too little structure and the game becomes chaos
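As a sketch of how redundancy counterbalances noise, the simulation below sends each bit over three independent noisy channels and decodes by majority vote. The channel model, 10% error rate, and function names are my own assumptions, not from the slides:

```python
import random

def noisy_channel(bit, flip_prob, rng):
    """Flip the bit with probability flip_prob (the 'noise')."""
    return bit ^ (rng.random() < flip_prob)

def send_with_redundancy(bit, flip_prob, rng, copies=3):
    """Send the same bit over several channels; decode by majority vote."""
    received = [noisy_channel(bit, flip_prob, rng) for _ in range(copies)]
    return sum(received) > copies // 2

rng = random.Random(0)
trials = 10_000
single_errors = sum(noisy_channel(1, 0.1, rng) != 1 for _ in range(trials))
triple_errors = sum(send_with_redundancy(1, 0.1, rng) != 1 for _ in range(trials))

# One channel fails at roughly the flip probability; three channels
# with majority voting fail only when at least two of them flip.
print("single channel error rate:", single_errors / trials)
print("triple redundancy error rate:", triple_errors / trials)
```

With flip probability p = 0.1, a single channel errs about 10% of the time, while the three-channel majority vote errs about 3p²(1−p) + p³ ≈ 2.8% of the time.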
Open Discussion
• Course so far
• First time taught
• What would you change/continue?
• Remember: I never remember your names!