Thesis Defense: Jiaxin Zhang, “Uncertainty Quantification from Small Data: A Multimodel Approach”
THE DEPARTMENT OF CIVIL ENGINEERING AND
ADVISOR MICHAEL SHIELDS, ASSISTANT PROFESSOR,
ANNOUNCE THE THESIS DEFENSE OF JIAXIN ZHANG
Tuesday, August 7, 2018
As a central area of computational science and engineering (CSE), uncertainty quantification (UQ) plays an increasingly important role in computationally evaluating the performance of complex mathematical, physical, and engineering systems. UQ encompasses the quantification, integration, and propagation of uncertainties that result from stochastic variation in the natural world, from a lack of statistical data or knowledge, and from the form of the mathematical models themselves. A common situation in engineering practice is a limited cost or time budget for data collection, which yields sparse datasets. Such datasets carry epistemic uncertainty (lack of knowledge) along with aleatory uncertainty (inherent randomness), and the mix of these two sources of uncertainty, which requires imprecise probabilities, poses a particularly challenging problem.
A novel methodology is proposed for quantifying and propagating uncertainties created by a lack of data. The methodology uses the concepts of multimodel inference, from both information-theoretic and Bayesian perspectives, to identify a set of candidate probability models and associated model probabilities that are representative of the given small dataset. Both model-form uncertainty and model-parameter uncertainty are identified and estimated within the proposed methodology. Unlike conventional methods that reduce the full probabilistic description to a single probability model, the proposed methodology fully retains and propagates the total uncertainty quantified from all candidate models and their parameters. This is achieved by identifying an optimal importance sampling density that best represents the full set of models, propagating samples from this density, and reweighting those samples for each candidate probability model using Monte Carlo sampling. As a result, a complete probabilistic description of both aleatory and epistemic uncertainty is achieved with a reduction of several orders of magnitude in Monte Carlo-based computational cost.
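The multimodel workflow described above can be sketched in Python. This is an illustrative outline, not the thesis implementation: the small synthetic dataset, the three candidate distributions, the AICc-based Akaike weights standing in for the model probabilities, the quadratic response model, and the use of a simple mixture of the fitted candidates as the importance sampling density (in place of the optimal density identified in the thesis) are all assumptions made for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.lognormal(mean=0.5, sigma=0.4, size=20)  # hypothetical small dataset

# An assumed set of candidate probability models
candidates = {"lognorm": stats.lognorm, "gamma": stats.gamma,
              "weibull": stats.weibull_min}

# Fit each candidate by maximum likelihood and score it with AICc
# (the small-sample-corrected AIC)
fits, aic = {}, {}
n = len(data)
for name, dist in candidates.items():
    params = dist.fit(data)
    k = len(params)
    loglik = np.sum(dist.logpdf(data, *params))
    aic[name] = 2 * k - 2 * loglik + 2 * k * (k + 1) / (n - k - 1)
    fits[name] = params

# Akaike weights serve as model probabilities over the candidate set
delta = {m: a - min(aic.values()) for m, a in aic.items()}
raw = {m: np.exp(-d / 2.0) for m, d in delta.items()}
model_probs = {m: r / sum(raw.values()) for m, r in raw.items()}

# Importance sampling density: a mixture of the fitted candidates weighted
# by their model probabilities (a simple stand-in for the optimal density)
N = 10_000
names = list(candidates)
pick = rng.choice(len(names), size=N, p=[model_probs[m] for m in names])
samples = np.concatenate([
    candidates[names[i]].rvs(*fits[names[i]], size=int((pick == i).sum()),
                             random_state=rng)
    for i in range(len(names))
])
q = sum(model_probs[m] * candidates[m].pdf(samples, *fits[m]) for m in names)

# Propagate the shared samples once through a hypothetical response model,
# then reweight the same samples separately for each candidate model
g = samples ** 2
means = {}
for m in names:
    w = candidates[m].pdf(samples, *fits[m]) / q
    means[m] = np.sum(w * g) / np.sum(w)  # self-normalized estimate
```

Because every candidate model reuses the one set of propagated samples, the spread of the per-model estimates reflects epistemic uncertainty without rerunning the response model for each candidate.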
Along with the proposed UQ methodology, an investigation is presented of the effect of prior probabilities on the quantification and propagation of imprecise probabilities resulting from small datasets. It is shown that prior probabilities have a significant influence on Bayesian multimodel UQ for small datasets, and that inappropriate priors may introduce biased probabilities and inaccurate estimators even for large datasets. For multi-dimensional UQ problems, a further study generalizes the methodology to overcome the limitations of the independence assumption by modeling the dependence structure with copula theory. The generalized approach produces estimates of imprecise probabilities with copula dependence modeling for a composite material problem. Finally, as an application of the proposed method, an imprecise global sensitivity analysis is performed to illustrate the efficiency and effectiveness of the multimodel UQ methodology for small datasets.
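The copula-based dependence modeling mentioned above can be illustrated with a minimal sketch: a copula separates the dependence structure from the marginal distributions, so any candidate marginals can be coupled through it. The choice of a Gaussian copula, the correlation value, and the lognormal/Weibull marginals here are assumptions made for illustration, not values from the thesis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Assumed Gaussian copula correlation between two material properties
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])

# Step 1: draw correlated standard-normal samples
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=5000)

# Step 2: map to uniform marginals via the normal CDF (the copula)
u = stats.norm.cdf(z)

# Step 3: apply inverse CDFs of the chosen marginals; any candidate
# marginal model could be substituted here without touching the copula
x1 = stats.lognorm.ppf(u[:, 0], s=0.3, scale=np.exp(2.0))   # e.g., stiffness
x2 = stats.weibull_min.ppf(u[:, 1], c=4.0, scale=1.5)       # e.g., strength

# The rank correlation of (x1, x2) reflects the imposed dependence;
# for a Gaussian copula, Kendall's tau ≈ (2/pi) * arcsin(rho)
tau, _ = stats.kendalltau(x1, x2)
```

Because the dependence lives entirely in the copula, the multimodel machinery over candidate marginals carries over: each candidate marginal pair yields a joint model once coupled through the same copula.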