BITTOR
A New Foundation for Logical & Numeric Uncertainty
Technical Overview
Markov Monoids are used as a Mathematical Foundation
for a New Theory of Logical & Numerical Uncertainty
Replacing Current Computer Logic & Computation Infrastructures
January 4, 2006
Joseph E. Johnson, Ph.D.,
Professor of Physics
University of South Carolina
Columbia SC, 29208
[email protected] 803.777.6431
Discussion I:

Background:





It is well known that computers are too exacting and do not perform ‘estimates’ and
‘approximations’ unless explicitly programmed. A computer could not order food for a party
without knowing how many people would come or how much they would eat, yet a person can
easily operate with such extremely limited information.
It is also realized by many scientists that all observations are estimates, and furthermore that
most intelligent decision-making consists of making optimal decisions from very limited
knowledge.
A good example is the field of pharmacy and medicine where drug dosage, risk analysis, and
procedures have large errors. Likewise one thinks of the complex error analysis for
engineering and architectural efforts with unknown strengths of materials, temperatures,
loads and stresses. And there is certainly the business investment and financial domains of
uncertain currencies, interest rates, supply, and demand.
It is also known that, while mathematics has developed a rigorous system of integer, rational,
real, and complex numbers, the current practice of truncating real numbers is not a
rigorous method of managing uncertainty. Thus the entire field of statistics is overlaid on
traditional mathematics to manage these uncertainties.
Finally, one notes that all values in the sciences and engineering (length, time, mass…) are
assumed to be ‘real’ numbers, in spite of the fact that measuring such a number exactly would
take infinite effort, would carry an infinite amount of information, and would certainly violate
quantum theory; it is thus impossible. Real numbers cannot truly represent the results of observations.
Discussion II
 Our line of thought:



A natural (and almost mandatory) line of thought would be to
generalize the fundamental piece of information – the bit, 1 &
0, to continuous probability values – but how?
My work with continuous Markov transformations led me to
realize that the values (the representation space) (x1, x0) upon
which they act could provide the natural generalization
of 1 & 0, with 1 = (1,0) and 0 = (0,1), thus allowing all
intermediate values when x1 & x0 range over 0
to 1 (i.e. x1 + x0 = 1).
This occurred to me because the (linear) Markov
transformations maintain both the sum of the values (i.e. x1
+ x0 = 1) and their non-negativity, thus allowing them to be
interpreted as probabilities.
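As an illustrative check (the specific numbers are chosen only for illustration): the 2x2 Markov matrix with columns (0.9, 0.1) and (0.2, 0.8) has non-negative entries and columns that sum to unity. Applied to the bittor (0.5, 0.5) it gives (0.9*0.5 + 0.2*0.5, 0.1*0.5 + 0.8*0.5) = (0.55, 0.45), whose components are again non-negative and sum to 1, so the result can still be read as a pair of probabilities.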
Discussion III - Postulates
 Postulate 1: <x1, x0> is the fundamental entity

I thus make the postulate that the fundamental entity of
information is the bit vector (x1, x0) (‘bittor’, i.e. the pair of
numbers), which is formally the representation space of the
Markov Lie monoid (part of the general linear group of
continuous real transformations), with x1 + x0 = 1 and both
components nonnegative.
This allows continuous values between 1 and 0 that are to
be interpreted as the probability (x1) that ‘1’ is the correct
value and the probability (x0) that the correct value is zero.
Thus ‘1’ = (1, 0) and ‘0’ = (0, 1), while (1/2, 1/2) represents
equal probability for a 1 or a 0 and thus represents zero
information.
These <x1, x0> bittors now become the fundamental objects
in the new mathematics that we propose below.
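The following minimal sketch, in C++ (one of the implementation languages named in the Proposal below), shows a bittor as a data type; the names Bittor, makeBittor, ONE, ZERO, and UNKNOWN are illustrative choices, not part of the original formulation.

    #include <cassert>

    // A bittor: an ordered pair of probabilities (x1, x0) with
    // x1 + x0 = 1 and both components non-negative.
    struct Bittor {
        double x1;  // probability that the correct value is '1'
        double x0;  // probability that the correct value is '0'
    };

    // Build a bittor from the probability of '1', checking the postulated constraints.
    Bittor makeBittor(double p1) {
        assert(p1 >= 0.0 && p1 <= 1.0);
        return Bittor{p1, 1.0 - p1};
    }

    const Bittor ONE     = {1.0, 0.0};  // the classical bit '1'
    const Bittor ZERO    = {0.0, 1.0};  // the classical bit '0'
    const Bittor UNKNOWN = {0.5, 0.5};  // equal probabilities: zero information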
Discussion III - Postulates
 Postulate 2: Definition of Logic




The next most foundational concept is that the fundamental entities in a
system must have rules of combination, and these rules must provide a
generalization of Boolean logic (combinational product rules for AND,
OR, NOR, NAND, etc.).
Realizing that x = <x1, x0> and y = <y1, y0> represent probabilities, and
that independent probabilities multiply, the rule must be a product z = x y
where, if x and y are bittors, z is also a bittor (components are
nonnegative and sum to unity).
We postulate zi = c(a)ijk xj yk, generalizing the normal logic (computer)
operations of AND, OR, NOR, NAND, etc., where a = 1, 2, …, 16 labels
the 16 different ways of partitioning the four products xj yk into
two components (e.g. for ‘AND’ we have z1 = x1y1 and z0 = x1y0 + x0y1
+ x0y0).
The unary NOT operation reverses (x1, x0) into (x0, x1) component-wise, with an
obvious symmetric off-diagonal matrix form.
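As a sketch (assuming the hypothetical Bittor type from the Postulate 1 example), the ‘AND’, ‘OR’, and ‘NOT’ rules can be written directly from this partitioning of the four products; each of the remaining binary operations differs only in how the four terms are assigned to z1 and z0.

    // Bittor AND: z1 collects only the x1*y1 product; the other three go to z0.
    Bittor bittorAND(const Bittor& x, const Bittor& y) {
        double z1 = x.x1 * y.x1;
        double z0 = x.x1 * y.x0 + x.x0 * y.x1 + x.x0 * y.x0;
        return Bittor{z1, z0};
    }

    // Bittor OR: z0 collects only the x0*y0 product; everything else goes to z1.
    Bittor bittorOR(const Bittor& x, const Bittor& y) {
        double z0 = x.x0 * y.x0;
        return Bittor{1.0 - z0, z0};
    }

    // Bittor NOT: the unary operation simply swaps the two components.
    Bittor bittorNOT(const Bittor& x) {
        return Bittor{x.x0, x.x1};
    }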
Discussion III - Postulates
 Postulate 2a – Linear combination
 A second operation, similar to a sum, is defined as the
weighted linear combination x = a1x1 + a2x2 + … + anxn,
where the xi are different bittors and where the weights ai form an
n-dimensional Markov Lie monoid representation (i.e. they sum
to unity and are non-negative – thus a larger
dimensional bittor).
 This operation gives a weighted linear combination of
bittors, by bittors. This new mathematics thus has 16
independent products and one method of linear
‘addition’ (combination).
 Note that the bittor objects close under these two
operations.
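A corresponding sketch of the weighted linear combination, again assuming the hypothetical Bittor type above; the function name combine is an illustrative choice.

    #include <vector>

    // Weighted linear combination of bittors.  The weights a are required to be
    // non-negative and to sum to unity (an n-dimensional Markov monoid
    // representation), so the result is again a bittor.
    Bittor combine(const std::vector<double>& a, const std::vector<Bittor>& xs) {
        Bittor z{0.0, 0.0};
        for (std::size_t i = 0; i < xs.size(); ++i) {
            z.x1 += a[i] * xs[i].x1;
            z.x0 += a[i] * xs[i].x0;
        }
        return z;
    }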
Discussion III - Postulates
 Postulate 3: Bittor Numbers



Just as a binary real is written as a sequence of binary
digits with a binary point (e.g. 1101.101), we now define a bittor
number to be the outer product of several Markov monoid
two-dimensional representations, one bittor per digit:
(x1, x0)(x1, x0)(x1, x0) . (x1, x0)(x1, x0)…
Thus a number is an outer product of Markov monoid
representations and is therefore itself such a representation in the
product space of Markov transformations.
One need only write the upper of the two values of each bittor. Also, as
one needs only limited accuracy for the ‘error’, one can use
a short binary value such as 01101 for each probability. This makes a bittor number
take the abbreviated form <(1)(1)(0).(1)(011101)> or, more
succinctly, 110.1(011101), where (1) means an x1 value that
rounds to 100000.
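One possible in-memory layout for such a bittor number, assuming the Bittor type sketched earlier (the field names and the digit ordering are illustrative choices only):

    #include <vector>

    // A bittor number: one bittor per binary digit, most significant digit first,
    // together with the position of the binary point.
    struct BittorNumber {
        std::vector<Bittor> digits;   // e.g. 110.1 would hold four digit bittors
        int fractionDigits;           // number of digits after the binary point
    };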
Discussion III - Postulates
 Postulate 4: Bittor Arithmetic
 Arithmetic is defined with bittors in exactly the same
schema as with binary numbers. For example, in
adding two bits: 1+0 = 0+1 = 1 and 1+1 = 0+0 = 0
(with a carry of 1 if 1+1).
 The sum digit is defined as the XOR (exclusive OR) of the bittors: z1
= x1y0 + x0y1, showing that the probability of getting
a one is the sum of the products of the probabilities for one value to be
1 and the other to be 0. Likewise, z0 = x1y1 + x0y0 gives
the probability of getting a 0 upon addition of the two
bittors.
 The other part of addition is the carry, which is
computed as z1 = x1y1, i.e. the AND operation, as both
values must be 1 in order to produce a carry digit.
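A sketch of a digit adder built from these rules (assuming the bittorAND and bittorOR sketches above); the carry-propagation formula used here is simply the standard binary full-adder expression evaluated with bittor operations, and the function names are illustrative.

    // Bittor XOR: z1 is the probability that exactly one input is 1;
    // z0 is the probability that both inputs are equal.
    Bittor bittorXOR(const Bittor& x, const Bittor& y) {
        double z1 = x.x1 * y.x0 + x.x0 * y.x1;
        double z0 = x.x1 * y.x1 + x.x0 * y.x0;
        return Bittor{z1, z0};
    }

    // Add two bittor digits with an incoming carry, producing the sum digit and
    // the outgoing carry: sum = x XOR y XOR carryIn,
    // carryOut = (x AND y) OR ((x XOR y) AND carryIn).
    void bittorFullAdd(const Bittor& x, const Bittor& y, const Bittor& carryIn,
                       Bittor& sum, Bittor& carryOut) {
        Bittor partial = bittorXOR(x, y);
        sum      = bittorXOR(partial, carryIn);
        carryOut = bittorOR(bittorAND(x, y), bittorAND(partial, carryIn));
    }

Adding two bittor numbers would ripple this digit adder across their component bittors, just as in ordinary binary addition.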
Discussion III - Postulates
 Postulate 5: Information Defined



Shannon’s measure assigns an information value of 1 bit to a
definite binary value, i.e. a well-defined bit of information
from a binary choice.
It can be shown that the smooth generalization of Shannon
entropy, via Renyi’s form I = log2(a(x1^b + x0^b)), determines
the constants a and b uniquely to be a = 2 and b = 2, which is
already present in the extended bittor logic as the operation ‘EQV’ of
the bittor with itself.
Thus I = log2(2(x1^2 + x0^2)) (the log base 2 of self-
equivalence) is defined to be the information content of a
bittor. The information value of an entire bittor number is
thus the sum of the information in each component bittor.
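A minimal sketch of this information measure, assuming the Bittor type above:

    #include <cmath>

    // Information content of a bittor: log base 2 of twice the 'self-equivalence'
    // x1^2 + x0^2.  Gives 1 bit for the exact bittors (1,0) and (0,1),
    // and 0 bits for the maximally uncertain bittor (0.5, 0.5).
    double bittorInformation(const Bittor& x) {
        return std::log2(2.0 * (x.x1 * x.x1 + x.x0 * x.x0));
    }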
Discussion IV
 Observation: Bittor operations, numbers, and
information smoothly reduce to the standard
numbers, logic, and binary system in traditional use
when the bittors are exact (1,0) or (0,1).


Consequently the proposed system is a smooth
generalization of the current logic and arithmetic.
Furthermore, the bittor structures include the full
Boolean logic, binary values, and the existing number
systems (integer, rational, real, and complex numbers)
with their existing mathematical operations – all as
special cases.
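As a quick check of this reduction: applying the ‘AND’ partition to the exact bittors x = (1,0) (‘1’) and y = (0,1) (‘0’) gives z1 = x1y1 = 0 and z0 = x1y0 + x0y1 + x0y0 = 1, i.e. z = (0,1) = ‘0’, matching 1 AND 0 = 0; likewise the XOR rule gives z1 = x1y0 + x0y1 = 1, i.e. z = (1,0) = ‘1’, matching 1 XOR 0 = 1.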
Technical Summary – I: Generalized Logic
 1. Basic Entity: I suggest that probability is not a scalar but a
component of an ordered ‘vector’ (n-tuple – representation space) that
behaves as a ‘bit vector’ or ‘bittor’ under the Markov Lie monoid of
transformations (e.g. {P, (1-P)} = {x1, x0} in two dimensions).

These objects generalize the fundamental concepts of ‘1’ and ‘0’ of
binary logic and arithmetic, allowing for continuous intermediate states
of logic.
 2. A New Mathematics: Boolean logic is generalized to a set of 16
product operations zi = c(a)ijk xj yk, generalizing the normal logic
(computer) operations of AND, OR, NOR, NAND, etc., where a =
1, 2, …, 16 labels the 16 possible partitions of the four products xj yk into
two parts. A second operation is defined as the weighted linear
combination x = a1x1 + a2x2 + … + anxn, where the xi are different bittors and
where the weights ai form an n-dimensional Markov Lie monoid representation
(i.e. they sum to unity and are non-negative – thus a larger dimensional
bittor).
Technical Summary – II: Generalized Arithmetic
 3. New Bittor Generalized Numbers: Binary numbers
are now generalized to outer products of these bittor
objects, where ‘1’ = {1,0} and ‘0’ = {0,1}. Total
uncertainty is thus {0.5, 0.5}. It is only necessary to
show the upper component explicitly to express a
value, e.g. (1101.101<.5>).
 4. Arithmetic: Arithmetic operations (+, -, *, /, ^) are now
defined by the natural Boolean generalizations
between the bittors that comprise a ‘number’.
Technical Summary – III: Generalized Information
 5. Shannon information is normally defined as
‘1 bit’ for a definite binary value of 1 or 0. We
extend this to a bittor as I = log2(2(P1^2 + P0^2)) via
the second-order Renyi entropy.
 6. The probability values in the bittor need not
have the accuracy of a real number but only of a
binary number of desired length (perhaps 5
or 6 digits), e.g. x = (00110, 01010).
Discussion
 Summary:



The fundamental objects of observational information are
proposed to be Markov Lie monoid two-dimensional
representations (called bittors, short for ‘bit vectors’), with
the components of the bittor <x1, x0> interpreted as the
probabilities of being 1 and 0 respectively.
The generalized logic defined by zi = c(a)ijk xj yk provides an
entirely new kind of mathematics among the bittor objects,
consisting of 16 different products and bittor-weighted
linear combinations.
Bittors and bittor-type numbers are capable of automated
management of uncertainty and constitute a new kind of
mathematical structure that generalizes the existing number
systems and contains them.
Proposal
 We propose that this new type of mathematics be used to
partially manage logical and numerical uncertainty by building
this mathematics into computers in a fundamental way.
 We propose that this be done in three stages: (1) simulation in
C/C++ or Java code for demonstration and testing, (2) building it in an
integral way into C/C++ and Java programming for automatic use,
and (3) incorporating it in hardware as a chip accelerator to
speed the processing.
 We propose that experts from different areas consider the
impact of such an extension to our current mathematics,
identify problems, and find new ways of utilizing these structures.
End Technical Overview