#### Transcript Slide 1

Probabilistic Approach to Design under Uncertainty
Dr. Wei Chen, Associate Professor
Integrated Design Automation Laboratory (IDEAL)
Department of Mechanical Engineering, Northwestern University

Outline
• Uncertainty in model-based design
• What is probability theory?
• How does one represent uncertainty?
• What is the inference mechanism?
• Connection between probability theory and utility theory
• Dealing with various sources of uncertainty in model-based design
• Summary

Types of Uncertainty in Model-Based Design
• Model (lack of knowledge)
• Parametric (lack of knowledge, variability)
• Numerical
• Testing data

Problem Faced in Design under Uncertainty
• To choose from among a set of possible design options X, where each involves a range of uncertain outcomes Y
• To avoid making an "illogical choice"

Basic Concepts of Probability Theory
• Probability theory is the mathematical study of probability.
• Probability derives from fundamental concepts of set theory and measure theory.
Sample Space and Events
Example: flip two coins.
• Sample space – the set of all possible outcomes of a random experiment under uncertainty; here the outcomes are {e1 = HH, e2 = HT, e3 = TH, e4 = TT}
• Event – a subset of the sample space, e.g., A = {e2, e3}: the experiment results in two different faces
• Probability: P(e1) = P(e2) = P(e3) = P(e4) = 0.25; P(∅) = 0; P(S) = 1; P(A) = P(e2) + P(e3) = 0.5

Mathematics in Probability Theory
• Three axioms of a probability measure:
  – 0 ≤ P(A) ≤ 1; P(S) = 1; P(∪ Ai) = Σ P(Ai) when the Ai are disjoint events
• Arithmetic of probabilities – union, intersection, and conditional probabilities
• A random variable is a function that assigns a real number to each outcome in the sample space
  – Example: define X = total number of heads among the two tosses; possible values {X=0} = {TT}, {X=1} = {HT, TH}, {X=2} = {HH}; P{X=1} = 0.5
• Probability density function and the arithmetic of moments of a random variable, e.g., E[XY] = E[X]E[Y] if X and Y are independent
• Convergence (law of large numbers) and the central limit theorem

Probabilistic Design Metrics in Quality Engineering
[Figure: pdf of performance y with mean μy, standard deviation σy, and bias from target M (robustness); pdf of performance g with reliability R = area = Prob{g(x) ≤ c} (reliability)]
• Robustness – minimizing the effect of variations without eliminating the causes
• Reliability – assuring proper levels of "safety" for the system designed

Philosophies of Estimating Probability
• Frequentist
  – Assigns probabilities only to events that are random, based on outcomes of actual or theoretical experiments
  – Suitable for problems with well-defined random experiments
• Bayesian
  – Assigns probabilities to propositions that are uncertain, according to subjective or logically justifiable degrees of belief in their truth (example of a proposition: "there was life on Mars a billion years ago")
  – More suitable for design problems: the events lie in the future, not in the past, and all design models are predictive
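The coin-flip example above can be sketched in Python: `itertools.product` enumerates the sample space and `Fraction` keeps probabilities exact. The code is a minimal illustration of the definitions, not taken from the slides.

```python
from itertools import product
from fractions import Fraction

# Sample space for two fair coin flips: {HH, HT, TH, TT}
sample_space = list(product("HT", repeat=2))
p = Fraction(1, len(sample_space))  # each outcome equally likely: 1/4

# Event A: the two flips show different faces
A = [e for e in sample_space if e[0] != e[1]]
P_A = sum(p for _ in A)
print(P_A)  # P(A) = 1/2

# Random variable X = total number of heads among the two tosses
def X(outcome):
    return outcome.count("H")

# Probability mass function of X, built from outcome probabilities
pmf = {}
for e in sample_space:
    pmf[X(e)] = pmf.get(X(e), 0) + p
print(pmf[1])  # P{X=1} = 1/2

# First moment E[X] from the pmf
E_X = sum(x * px for x, px in pmf.items())
print(E_X)  # 1
```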
  – More popular among decision theorists

Bayesian Inference
• In the absence of data (experiments), we have to guess
  – A probability guess relies on our experience with "related" events
• Once data are collected, inference relies on Bayes' theorem
  – Probabilities are always personal degrees of belief
  – Probabilities are always conditional on the information currently available
  – Probabilities are always subjective
• "Uncertainty of probability" is not meaningful.
Bernardo, J. M. and Smith, A. F. M., Bayesian Theory, John Wiley, New York, 2000.

Bayes' Theorem
P(H|D) = P(D|H) P(H) / P(D)
• H – hypothesis; D – data
• P(H) – prior: belief about H before obtaining data
• P(D|H) = L(H) – likelihood of the data given H; its maximum gives the maximum likelihood estimate
• P(H|D) – posterior: belief about H after obtaining data, updated by the data
Bayes' theorem provides
• A solution to the problem of how to learn from data
• A form of uncertainty accounting
• A subjective view of probability

Formalism of Bayesian Statistics
• Offers a rationalist theory of personalistic beliefs in contexts of uncertainty, with clearly stated axioms
• Establishes that expected utility maximization provides the basis for rational decision making
• Not descriptive, i.e., it does not aim to model actual behavior.
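Bayes' theorem as an update rule can be shown on a toy two-hypothesis coin problem. The hypotheses, prior, and likelihoods (a fair coin vs. a coin biased 0.8 toward heads) are illustrative assumptions, not from the slides.

```python
# Prior P(H): belief before data about which coin we hold
prior = {"H_fair": 0.5, "H_biased": 0.5}

# Likelihood P(D|H) of observing "heads" under each hypothesis
likelihood = {"H_fair": 0.5, "H_biased": 0.8}

# Observe data D = one "heads"; P(D) is the total probability of the data
evidence = sum(likelihood[h] * prior[h] for h in prior)

# Posterior P(H|D) = P(D|H) P(H) / P(D): belief after data
posterior = {h: likelihood[h] * prior[h] / evidence for h in prior}
print(posterior)  # belief shifts toward the biased-coin hypothesis
```

Repeating the update with each new observation (using the posterior as the next prior) is exactly the "learning from data" that the slide describes.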
• Prescriptive, i.e., it tells how one should act to avoid undesirable behavioral inconsistency

Connection of Probability Theory and Utility Theory
• Three basic elements of a decision:
  – the alternatives (options) X
  – the predicted outcomes (performance) Y
  – the decision maker's preference over the outcomes, expressed as an objective function f in optimization
• Utility theory
  – Utility is a preference function built on the axiomatic basis originally developed by von Neumann and Morgenstern (1947)
  – Six axioms (Luce and Raiffa, 1957; Thurston, 2006): completeness of complete order; transitivity; monotonicity; probabilities exist and can be quantified; monotonicity of probability; substitution-independence
  – In agreement with employing probability to model uncertainty

Decision Making – Ranking Design Alternatives
• Without uncertainty: objective function f = V(Y) = V(Y(X)), where V is a value function, e.g., profit
• With uncertainty: objective function f = E(U) = ∫ U(V) pdf(V) dV, the expected utility. The preferred choice is the alternative (lottery) with the higher expected utility.
[Figure: pdf(V) for two alternatives A and B over V (e.g., profit); utility curves U(V) scaled from 0 at the worst outcome to 1 at the best, with concave = risk averse, linear = risk neutral, convex = risk prone]

Issues in Model-Based Design
• How should we provide probabilistic quantification of the uncertainty associated with a model?
• How should we deal with model uncertainty (reducible) and parameter uncertainty (irreducible) simultaneously?
• How should we make a design decision with good confidence?
Chen, W., Xiong, Y., Tsui, K.-L., and Wang, S., "Some Metrics and a Bayesian Procedure for Validating Predictive Models in Engineering Design", DETC2006-99599, ASME Design Technical Conference.
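The expected-utility ranking above can be sketched as follows. The two lotteries and the exponential (concave, hence risk-averse) utility function are illustrative assumptions, not slide content.

```python
import math

# Assumed risk-averse utility: concave in profit V, scaled so U(0) = 0
def U(v):
    return 1.0 - math.exp(-v / 50.0)

# Each alternative is a discrete lottery: list of (probability, profit V)
A = [(0.5, 0.0), (0.5, 100.0)]  # risky lottery, E[V] = 50
B = [(1.0, 50.0)]               # sure profit,  E[V] = 50

# Expected utility E[U] = sum of p * U(V) over outcomes
def expected_utility(lottery):
    return sum(p * U(v) for p, v in lottery)

EU_A = expected_utility(A)
EU_B = expected_utility(B)
print(EU_A < EU_B)  # True: a risk-averse decision maker prefers the sure B
```

Both lotteries have the same expected profit, so a risk-neutral (linear) U would rank them equal; the concave U breaks the tie toward the certain outcome, which is exactly the risk-averse curve on the slide.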
Bayesian Approach for Quantifying the Uncertainty of a Predictive Model
Model structure:
  Y^e(x) = Y^r(x) + ε(x)
  Y^r(x) = Y^m(x) + δ(x)
where
• Y^e(x) – physical observation
• Y^r(x) – true but unknown real performance
• Y^m(x) – computer model output
• δ(x) – bias function (between Y^r(x) and Y^m(x))
• ε(x) – random error in the physical experiment
Bias-correction procedure:
• Build a metamodel Ŷ^m(x) of Y^m(x) from computer experiments
• From physical experiments, form observations (data) of the bias δ(x) = [Y^e(x) − Y^m(x)] − ε(x)
• Obtain the Bayesian posterior δ̂(x) of δ(x), with uncertainty quantification
• Bias-corrected prediction: Ŷ^r(x) = Ŷ^m(x) + δ̂(x), with uncertainty quantification; model uncertainty is accounted for through δ(x)

More about the Bayesian Approach
• Model assumption: δ(x) is a Gaussian process with mean m(x) = f(x)^T β and covariance σ² R, where R(x_i, x_j) = exp(−Σ_{k=1}^{p} θ_k (x_ik − x_jk)²)
• ε(x) is Gaussian (i.i.d.) with zero mean and variance σ_ε²
• Priors on the nondeterministic parameters: β | σ² ~ N(b, σ² V); σ² ~ IG(·, ·); σ_ε² ~ IG(·, ·)
• The correlation parameters θ_k are treated as known (deterministic), estimated from data by MLE or cross-validation
• Data: physical experiments y^e = (y^e(x_1), …, y^e(x_ne))^T; computer experiments y^m = (y^m(x_1), …, y^m(x_ne))^T, or metamodel predictions y^m = (Ŷ^m(x_1), …, Ŷ^m(x_ne))^T; bias observations δ_ne = y^e − y^m
• Posterior distributions of the parameters β and σ² follow (omitted here); the posterior of δ(x) given (y^e, y^m) is a non-central t process with mean m_δ(x) and variance σ_δ²(x)

Integrated Framework for Handling Model and Parameter Uncertainties
• Given a computer model, use sequential experiment design to allocate physical and computer experiments
• Build the predictive model Ŷ^r(x) with uncertainty quantification, including parameter uncertainty
• Form the design objective function f̂(x) with uncertainty quantification
• Compute the design validation metric M_D and compare it with a specified confidence level P_th; if the design validity requirement is not satisfied, return to sequential experiment design, otherwise proceed to the design decision (expected utility optimization)

Uncertainty Quantification of the Design Objective Function with Parameter Uncertainty
A robust design objective f(x) (smaller-is-better) is used to determine the optimal solutions.
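The bias-correction construction above can be sketched numerically. As a loud caveat: the "true" process `Y_true`, the computer model `Y_model`, and the simple least-squares fit standing in for the Gaussian-process posterior of δ(x) are all illustrative assumptions, not the method's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def Y_true(x):   # stand-in for the unknown real performance Y^r(x)
    return np.sin(2 * x) + 0.5 * x

def Y_model(x):  # stand-in for the (biased) computer model Y^m(x)
    return np.sin(2 * x)

# Physical experiments: y^e(x_i) = Y^r(x_i) + eps(x_i)
x_e = np.linspace(0.0, 1.0, 8)
y_e = Y_true(x_e) + rng.normal(0.0, 0.02, size=x_e.shape)

# Observed bias data: delta_i = y^e(x_i) - Y^m(x_i); fit a simple
# linear model in place of the Bayesian GP posterior of delta(x)
delta_obs = y_e - Y_model(x_e)
coef = np.polyfit(x_e, delta_obs, deg=1)

def Y_hat_r(x):  # bias-corrected prediction Y^m(x) + delta_hat(x)
    return Y_model(x) + np.polyval(coef, x)

# Compare raw vs. bias-corrected prediction error at a new point
x_new = 0.7
err_raw = abs(Y_model(x_new) - Y_true(x_new))
err_corr = abs(Y_hat_r(x_new) - Y_true(x_new))
print(err_corr < err_raw)  # the bias correction reduces the error
```

The real approach additionally carries the full posterior (t-process) uncertainty of δ(x) forward into Ŷ^r(x); this sketch only reproduces the mean correction.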
f(x) = w1 μ_{Y^r}(x) + w2 σ_{Y^r}(x)
where x is a design variable (Y^r also depends on a noise variable) and w1, w2 are weighting factors. The uncertainty of Y^r(x) induces uncertainty of f(x). Apley et al. (2005) developed analytical formulations to approximately quantify the mean and variance of f(x); in this example, Monte Carlo simulation is employed.
[Figure: left – prediction of Y^r(x) with realizations of Y^r and a 95% prediction interval over x ∈ [0, 1]; right – prediction of f(x) with realizations, the mean of f(x), and a 95% prediction interval]

Validation Metrics
A probabilistic measure of whether a candidate optimal design is better than other design choices with respect to a particular design objective.
[Figure: overlap of the intervals μ_f ± 2σ_f at the candidate x* and at competing designs x1, x2 – large overlap gives smaller confidence, small overlap larger confidence]
Three types of design validation metrics M_D (f is smaller-the-better):
• Type 1 – Multiplicative metric: M_D(x*) = Π_i P( f(x*) ≤ f(x_i) ), x_i ∈ X_0 within design region d
• Type 2 – Additive metric: M_D(x*) = (1/N) Σ_i P( f(x*) ≤ f(x_i) ), x_i ∈ X_0 (averaging)
• Type 3 – Worst-case metric: M_D(x*) = min_i P( f(x*) ≤ f(x_i) ), x_i ∈ X_0
M_D is intended to quantify the confidence of choosing x* as the optimal design among all design candidates, or within the design region d.

Summary
• Prediction is the basis for all decision making, including engineering design.
• Probability is a (subjective) belief, while observed frequencies are used as evidence to update the belief.
• Probability theory and Bayes' theorem provide a rigorous and philosophically sound framework for decision making.
• Predictive models in design should be described as stochastic models.
• The impact of model uncertainty and parameter uncertainty can be treated separately in the process of improving predictive capability.
• The probabilistic approach offers computational advantages and mathematical flexibility.
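As a closing illustration, the three validation metrics can be estimated by Monte Carlo from the uncertain objective f at each candidate design. The means and standard deviations assigned to f at x*, x1, and x2 below are illustrative assumptions, not data from the slides.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed uncertain objective f at each candidate design (smaller is better)
mu = {"x*": 54.0, "x1": 54.5, "x2": 55.0}   # mean of f
sd = {"x*": 0.3,  "x1": 0.3,  "x2": 0.3}    # uncertainty (std dev) of f

# Monte Carlo samples of f at each design
n = 200_000
f = {k: rng.normal(mu[k], sd[k], size=n) for k in mu}

# Pairwise confidence P(f(x*) <= f(x_i)) against each competitor
p_pair = {k: float(np.mean(f["x*"] <= f[k])) for k in ("x1", "x2")}

M_mult = float(np.prod(list(p_pair.values())))  # Type 1: multiplicative
M_add = float(np.mean(list(p_pair.values())))   # Type 2: additive (average)
M_worst = float(min(p_pair.values()))           # Type 3: worst case
print(M_mult, M_add, M_worst)
```

Comparing each metric against the specified confidence level P_th then implements the validity check in the integrated framework: a large M_D supports committing to x*, a small one calls for more experiments.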