Transcript of the second lecture on IJCAI 99 (originally in PowerPoint format)

Artificial Intelligence
More IJCAI 99
Ian Gent
[email protected]
Artificial Intelligence
Three more papers from IJCAI
Part I: SAT for Data Encryption
Part II: Automated Discovery in Maths
Part III: Expert level Bridge player
SAT for data encryption
 “Using Walk-SAT and Rel-SAT for cryptographic key search”
 Fabio Massacci, Univ. di Roma I “La Sapienza”
 Proceedings IJCAI 99, pages 290-295
 Challenge papers section
 Rel-SAT? A variant of Davis-Putnam with added “CBJ” (conflict-directed backjumping)
 Walk-SAT? A successful incomplete SAT algorithm
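For context, a minimal sketch of the Walk-SAT style local search loop (my own illustration; the clause format and parameter values are assumptions, not taken from the paper):

import random

# Minimal sketch of the Walk-SAT idea: incomplete local search for SAT.
# Clauses are lists of integer literals; a negative number means negation.
def walksat(clauses, num_vars, max_flips=10000, noise=0.5, rng=random):
    assignment = {v: rng.choice([True, False]) for v in range(1, num_vars + 1)}
    lit_true = lambda lit: assignment[abs(lit)] == (lit > 0)
    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(lit_true(l) for l in c)]
        if not unsat:
            return assignment                  # found a satisfying assignment
        clause = rng.choice(unsat)             # pick an unsatisfied clause
        if rng.random() < noise:
            var = abs(rng.choice(clause))      # random walk move
        else:
            # greedy move: flip the variable that leaves fewest clauses unsatisfied
            def broken(v):
                assignment[v] = not assignment[v]
                n = sum(not any(lit_true(l) for l in c) for c in clauses)
                assignment[v] = not assignment[v]
                return n
            var = min((abs(l) for l in clause), key=broken)
        assignment[var] = not assignment[var]
    return None                                # incomplete: may fail to find one

# (x1 or x2) and (not x1 or x2) and (x1 or not x2): only x1 = x2 = True satisfies it
print(walksat([[1, 2], [-1, 2], [1, -2]], num_vars=2))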
Cryptography background
 Plaintext P, Cyphertext C, Key K
 (can encode each as sequence of bits)
 Cryptographic algorithm is function E
 C = E_K(P)
 If you don’t know K, it is meant to be hard to calculate
 P = E_K^-1(C)
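As a toy illustration of the E_K / E_K^-1 relationship (a trivial XOR cipher, nothing to do with DES):

# Toy illustration only: a one-time-pad-style XOR "cipher", not DES.
# E_K and its inverse are the same operation here because XOR is self-inverse.
def xor_encrypt(plaintext_bits, key_bits):
    # C = E_K(P): XOR each plaintext bit with the matching key bit
    return [p ^ k for p, k in zip(plaintext_bits, key_bits)]

def xor_decrypt(cipher_bits, key_bits):
    # P = E_K^-1(C): easy if you know K, since (p ^ k) ^ k == p
    return [c ^ k for c, k in zip(cipher_bits, key_bits)]

P = [1, 0, 1, 1, 0, 0, 1, 0]
K = [0, 1, 1, 0, 1, 0, 1, 1]
C = xor_encrypt(P, K)
assert xor_decrypt(C, K) == P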
Data Encryption Standard
 Most widely used encryption standard by banks
 Predates more famous “public key” cryptography
 DES encodes blocks of 64 bits at a time
 Key is 56 bits long
 Loop 16 times (sketched below):
 break the plaintext in 2
 combine one half with the key using “clever function” f
 XOR combination with the other half
 swap the two parts
 Security depends on the 16 iterations and on f
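A minimal sketch of the 16-round loop just described; toy_f below is a stand-in, not the real DES round function, and the key schedule, subkeys and initial/final permutations are all omitted:

# Sketch of the Feistel loop described above (NOT real DES).
MASK32 = (1 << 32) - 1

def toy_f(half, key):
    # placeholder "clever function": mix the 32-bit half-block with the key
    return ((half * 2654435761) ^ key) & MASK32

def feistel_encrypt(block64, key, rounds=16):
    left = (block64 >> 32) & MASK32            # break the plaintext in 2
    right = block64 & MASK32
    for _ in range(rounds):
        # XOR one half with f of the other, then swap the two parts
        left, right = right, left ^ toy_f(right, key)
    return (left << 32) | right

def feistel_decrypt(block64, key, rounds=16):
    left = (block64 >> 32) & MASK32
    right = block64 & MASK32
    for _ in range(rounds):
        # undo one encryption round per iteration
        left, right = right ^ toy_f(left, key), left
    return (left << 32) | right

c = feistel_encrypt(0x0123456789ABCDEF, key=0x1F2E3D4C)
assert feistel_decrypt(c, key=0x1F2E3D4C) == 0x0123456789ABCDEF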
Aim of Paper
 Answer the question “Can we encode cryptographic key search as a SAT problem so that AI search techniques can solve it?”
 Provide benchmarks for SAT research
 help to find out which algorithms are best
 failures and successes help to design new algorithms
 Don’t expect to solve full DES
 extensive research by special purpose methods
 aim to study use of general purpose methods
DES as a SAT problem
 Use encoding of DES into SAT
 Each bit of C, P, K is a propositional variable
 Operation of f is transformed into Boolean form (see the clause sketch below)
 CAD tools used separately to optimise this
 Formulae corresponding to each step of DES
 This would be huge and unwieldy, so
 “clever optimisations” inc. some operations precomputed
 Result is a SAT formula over (P, K, C)
 remember the bits are variables, so this encodes the algorithm
 not a specific plaintext
 set some bits (e.g. bits of C) for specific problem
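As a rough illustration of how a single bit-level operation becomes propositional clauses (my own example, not the paper's actual DES encoding), the constraint c = a XOR b turns into four clauses:

# Rough illustration of turning one bit operation into SAT clauses.
# Variables are integers; a negative integer is a negated literal (DIMACS style).
def xor_clauses(a, b, c):
    # Clauses equivalent to: c <-> (a XOR b)
    return [
        [-a, -b, -c],   # a and b        -> not c
        [ a,  b, -c],   # not a, not b   -> not c
        [ a, -b,  c],   # not a and b    -> c
        [-a,  b,  c],   # a and not b    -> c
    ]

def satisfied(clauses, assignment):
    # assignment maps variable number -> True/False
    return all(any(assignment[abs(l)] == (l > 0) for l in clause)
               for clause in clauses)

clauses = xor_clauses(1, 2, 3)                               # a=1, b=2, c=3
assert satisfied(clauses, {1: True, 2: False, 3: True})      # 1 XOR 0 = 1
assert not satisfied(clauses, {1: True, 2: True, 3: True})   # 1 XOR 1 != 1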
Results
 We can generate random keys, plaintext
 unlimited supply of benchmark problems
 problems should be hard, so good for testing algorithms
 Results
 Walk-SAT can solve 2 rounds of DES
 Rel-SAT can solve 3 rounds of DES
 compare with specialist methods, which solve up to 12 rounds
 Have not shown SAT can effectively solve DES
 Shown an application of SAT, and new challenges
Automated Discovery in Maths
 “Automatic Concept Formation in Pure Mathematics”
 Simon Colton, Alan Bundy
 University of Edinburgh
 Toby Walsh
 University of Strathclyde (now York)
 Proceedings of IJCAI-99, pages 786-791
 Machine Learning Section
 Introduces the system HR
 named for Hardy & Ramanujan, famous mathematicians
 Discovered novel mathematical concepts
Concept Formation
 HR uses a data table for concepts
 A concept is a rule satisfied by all entries in the table
 Start with some initial concepts
 e.g. axioms of group theory
 use logical representation of rules, i.e. “predicates”
 Now we need to do two things
 produce new concepts
 identify some of the new ones as interesting
 to avoid exponential explosion of dull concepts
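As a small illustration of the data-table idea (my own sketch, not HR's internal representation): a concept's table is a set of rows, and a rule only counts as a concept if every row satisfies it.

# Illustration of the data-table idea (not HR's actual representation).
# The table lists every triple (a, b, c) with a*b = c for the cyclic group Z3.
mult_table = [(a, b, (a + b) % 3) for a in range(3) for b in range(3)]

def satisfied_by_all(rule, table):
    # A concept is a rule satisfied by all entries in its table
    return all(rule(row) for row in table)

# The defining rule of the multiplication concept holds for every row...
assert satisfied_by_all(lambda r: (r[0] + r[1]) % 3 == r[2], mult_table)
# ...whereas "a*b = a" does not, so it is not a concept for this table.
assert not satisfied_by_all(lambda r: r[2] == r[0], mult_table)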
Production rules
 Use 8 production rules to generate new concepts
 new table, and definition of new predicate
 e.g. “match” production rule
 finds rows where columns are equal
 e.g. in group theory, the general group operation A*B = C
 match rule gives new concept “A*A = A” (see sketch below)
 Production rules can combine two old concepts
 Claim that these 8 can produce interesting concepts
 No claim that all interesting concepts covered
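A sketch of how the “match” rule above might work on such a table (my own reconstruction, not HR's code): keep only the rows where the named columns hold the same value, giving the table for the new concept A*A = A.

# Sketch of the "match" production rule (a reconstruction, not HR's code).
def match(table, columns):
    # New concept's table: rows where all the named columns hold the same value
    return [row for row in table if len({row[i] for i in columns}) == 1]

# Group multiplication table for Z3: rows (A, B, C) with A*B = C
mult_table = [(a, b, (a + b) % 3) for a in range(3) for b in range(3)]

# Matching columns A, B and C gives the concept "A*A = A":
# in a group only the identity element satisfies it.
print(match(mult_table, columns=(0, 1, 2)))   # -> [(0, 0, 0)]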
Heuristic Score of Concepts
 Want to identify promising concepts
 Parsimony
 larger data tables are less parsimonious
 Complexity
 fewer production rules needed means less complex
 Novelty
 novel concepts don’t already exist
 Concepts and production rules can be scored
 promising ones used
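A toy version of such a score, combining the three measures above; the weights and formulas are illustrative assumptions, not the measures HR actually uses.

# Toy heuristic score for a concept (weights and formulas are illustrative
# assumptions, not the measures actually used by HR).
def score(table_size, rules_used, is_novel,
          w_parsimony=1.0, w_complexity=1.0, w_novelty=1.0):
    parsimony = 1.0 / table_size        # larger data tables are less parsimonious
    complexity = 1.0 / rules_used       # fewer production rules means less complex
    novelty = 1.0 if is_novel else 0.0  # novel concepts don't already exist
    return (w_parsimony * parsimony
            + w_complexity * complexity
            + w_novelty * novelty)

# A small, novel concept built in one step outranks a big, derivative one.
assert score(table_size=3, rules_used=1, is_novel=True) > \
       score(table_size=9, rules_used=4, is_novel=False)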
Results
 Can use HR to build mathematical theories
 This paper uses group theory
 HR has introduced novel concepts into the handbook of integer sequences
 e.g. Refactorable numbers
 the number of factors of a number is itself a factor
 e.g. 9 is refactorable: its 3 factors are 1, 3, 9, and 3 divides 9
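A quick sketch that checks the definition and lists the first few refactorable numbers:

# Refactorable: the number of factors of n is itself a factor of n.
def is_refactorable(n):
    num_factors = sum(1 for d in range(1, n + 1) if n % d == 0)
    return n % num_factors == 0

assert is_refactorable(9)   # factors 1, 3, 9 -> 3 factors, and 3 divides 9
print([n for n in range(1, 100) if is_refactorable(n)])
# -> [1, 2, 8, 9, 12, 18, 24, 36, 40, 56, 60, 72, 80, 84, 88, 96]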
Expert level bridge play
 “GIB: Steps towards an expert level bridge playing program”
 Matthew Ginsberg, University of Oregon
 Proceedings IJCAI 99, pages 584-589
 Computer Game Playing section
Expert level bridge play
 Aren’t games well attacked by AI?
 Deep Blue, beat Kasparov
 Chinook, World Man-Machine checkers champion
 subject of a later lecture
 Connect 4 solved by computer
 Little progress on two types of game
 Go, Oriental game with a huge branching rate
 little progress even on the 19x19 board
 Card games like bridge
 because of uncertain information, i.e. other players’ cards
What’s the problem?
 If we knew location of all cards, no problem
 << 52! sequences of play, because of suit following
 dramatically less than games like chess
 one estimate is 10^120
 We have imperfect information
 estimates of quality of play have to be probabilistic
 To date, computer bridge playing very weak
 Slightly below average club player
 “They would have to improve to be hopeless”
 Bob Hamman, six-time winner of the Bermuda Bowl
What’s the solution?
 Ginsberg implemented a brilliantly simple idea (sketched below)
 Pretend we do know the location of cards
 by dealing them out at random
 Find best play with this known position of cards
 score initial move by expected score of hand
 Repeat a number of times (e.g. 50, 100)
 Pick out move which has best average score
 This is called the “Monte Carlo” method
 standard name in many areas where random data is generated to simulate real data
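A sketch of this Monte Carlo loop; the helper names deal_consistent_hands and best_score_with_move are placeholders for a random dealer and a perfect-information solver, which are not implemented here.

import random

# Sketch of the Monte Carlo idea described above.  The two helpers passed in
# are placeholders (assumptions), not GIB's actual routines.
def monte_carlo_move(candidate_moves, deal_consistent_hands,
                     best_score_with_move, samples=100):
    totals = {move: 0.0 for move in candidate_moves}
    for _ in range(samples):
        # Pretend we know where the cards are: deal the unseen cards
        # at random, consistent with everything seen so far.
        deal = deal_consistent_hands(random)
        for move in candidate_moves:
            # Score the move by solving the now fully "known" deal.
            totals[move] += best_score_with_move(deal, move)
    # Pick out the move with the best average score over all samples.
    return max(candidate_moves, key=lambda m: totals[m] / samples)

# Tiny dummy demonstration with made-up scores (not bridge).
deal_fn = lambda rng: rng.choice(["layout 1", "layout 2"])
score_fn = lambda deal, move: {"safe": 0.2, "bold": 0.8}[move]
print(monte_carlo_move(["safe", "bold"], deal_fn, score_fn, samples=50))  # -> bold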
GIB
 Ginsberg implemented (and sells) a system called GIB
 Best play in given deal found by standard methods
 general methods subject of forthcoming lectures
 Deals generated at random, consistent with existing knowledge
 cards played to date, bidding history
 Separate method for bidding (less successful)
 GIB has some good results
 won every match in 1998 World Computer Championship
 lost to Zia Mahmood & Michael Rosenberg by 6.4 IMPs
 surprisingly close, though only over a short match