About the Event

This workshop will focus on recent advances in Dynamic Data-Driven Application Systems (DDDAS). DDDAS seeks to enable new capabilities through the synergistic integration of models and data in a dynamic feedback control loop, whereby instrumentation data of a system are dynamically integrated with simulation to enhance modeling and prediction, while models, in turn, can be used to enhance the effective use of system instrumentation, including networks of heterogeneous sensors and controllers. Application models can encompass large numeric and non-numeric methods as well as analytical methods. Topics include:

Why DDDAS (Darema) This talk will introduce the basic concept of Dynamic Data Driven Application Systems (DDDAS), a paradigm in which an executing application model is dynamically integrated, in a feedback loop, with the real-time data-acquisition and control components, as well as other data sources of the application system. Advanced capabilities can be created through such new computational approaches in modeling and simulation, and in instrumentation methods, including: enhancing the accuracy of the application model, and speeding up the computation to either allow more comprehensive models of a system or create decision-support systems with the accuracy of full-scale simulations. In addition, controlling instrumentation processes from the executing application results in more efficient management of application data and addresses the challenge of how to architect and dynamically manage large sets of heterogeneous sensors and controllers, an advance over today's static and ad hoc approaches: with DDDAS these resources can be managed adaptively and in optimized ways. Supporting the integrated computational and instrumentation aspects of InfoSymbiotics/DDDAS environments entails a unified computational-instrumentation platform. The talk will address opportunities for such new capabilities, together with corresponding research challenges in applications, mathematical and statistical algorithms, instrumentation, and systems software, with illustrative examples from several application areas where InfoSymbiotics/DDDAS engenders transformative impact, including analysis and decision support for structural engineered systems, environmental systems, critical infrastructure such as electrical power grids, and manufacturing systems and processes.
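
The feedback loop described above can be sketched in a few lines. Everything below is a placeholder assumption for illustration, not a system from the talk: the point is only that sensor data dynamically corrects an executing model while the model, in turn, steers which instrument is read.

```python
import random

def run_dddas(steps=100, seed=0):
    """One-variable sketch of a DDDAS feedback loop (illustrative only)."""
    rng = random.Random(seed)
    x_true, x_est = 0.0, 5.0   # true system state and a (biased) model estimate
    gain = 0.4                 # fixed correction gain (an assumption)
    for _ in range(steps):
        # Forecast: truth and model follow the same simple dynamics.
        x_true = 0.95 * x_true + 1.0
        x_est = 0.95 * x_est + 1.0
        # Model-steered instrumentation: read the sensor that is accurate
        # in the regime the model predicts (fine sensor above 10, coarse below).
        noise = 0.05 if x_est >= 10.0 else 0.5
        z = x_true + rng.gauss(0.0, noise)
        # Instrumentation data dynamically corrects the model, closing the loop.
        x_est += gain * (z - x_est)
    return x_true, x_est
```

Even in this toy setting, the initial model bias is driven out by the data, while the model's forecast decides which of the two hypothetical sensors is worth reading.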

Data-Based Techniques for Model Calibration, Correction, and Refinement (Bernstein) It is often said that "all models are wrong, but some are useful." With this utilitarian, practical view in mind, we recognize that most models are based on a combination of physics and data, where the physics provide the backbone of the model and data are used to fill in details that are difficult to specify from physics alone. This portion of the workshop will focus on the following question: How can data be used to improve the accuracy of a given model? This is a problem in model calibration, correction, and refinement, where the goal is to use data to modify the original model so that it more accurately represents the underlying physical system. This problem is relevant to a wide range of applications that require accurate models, such as large-scale data assimilation for DDDAS. Model refinement can be viewed as a specialized problem in system identification where, unlike the usual setting in which a complete model is identified, the goal is to identify a subsystem of an overall model. The subsystem may be static or dynamic, and it may be inaccessible, that is, its inputs and outputs may be unmeasured. In this session, model refinement will be addressed through retrospective cost techniques. The theory and algorithm will be reviewed, and applications will include illustrative examples from circuits and structures, as well as large-scale applications such as the identification of missing physics in a worldwide model of the ionosphere-thermosphere. Finally, the same technique will be used to reconstruct unknown drivers for use in large-scale data assimilation.
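
As a toy illustration of the refinement idea (not the retrospective cost algorithm itself, which is the subject of the session), the sketch below uses least squares on output residuals to recover an unknown gain inside an otherwise known model. The model structure, the true gain, and the noise level are all assumptions made for the example.

```python
import random

def refine_subsystem_gain(n=200, seed=1):
    """Estimate an unknown subsystem gain from data (simplified illustration)."""
    rng = random.Random(seed)
    b_true = 2.5        # "missing physics" gain to be recovered (assumed value)
    sxx = sxy = 0.0
    for _ in range(n):
        u = rng.uniform(-1.0, 1.0)                          # known excitation
        y = 1.3 * u + b_true * u**2 + rng.gauss(0.0, 0.01)  # measured output
        y_model = 1.3 * u                                   # physics backbone
        r = y - y_model          # residual attributed to the unknown subsystem
        phi = u**2               # subsystem regressor
        sxx += phi * phi
        sxy += phi * r
    return sxy / sxx             # least-squares estimate of the gain
```

The data "fill in" the part of the model the physics backbone misses, which is the spirit, though not the machinery, of retrospective cost model refinement.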

Learning, Information and Perception in Environmental Systems Science (Ravela) Dynamic data-driven systems science is valuable for environmental investigations. Non-linear processes represented by high-dimensional numerical models with uncertain parameters, states, forcing and structure require data in order to be effective tools for analysis, prediction and discovery. The coupling of models and measurements takes a variety of forms that require inferences from data to understand models, inferences from data and models to understand the physical system state, and inferences from models to reason about data. Several inference problems can be expressed as fixed-interval, fixed-lag or fixed-point estimation problems, where the need is efficacy in the presence of high-dimensional, non-Gaussian state spaces with sparse, noisy measurements. In this talk, using state estimation as a prototypical inference problem, we begin with a review of two-point boundary value problems and sequential Bayesian state estimation, culminating in the particle filter and smoother. We then discuss three directions of interest.

Learning: Error distributions of model predictions are poorly understood. For example, even as one must deal with emerging non-Gaussian uncertainties in nonlinear system states, overall variance remains an important objective in the presence of model error and cannot be ignored. We approach inference using multiple distributions to model the underlying uncertainty, and propose an Ensemble Learning framework for non-Gaussian estimation. We show how this framework produces estimators with tighter posterior uncertainties and robustness to model bias.

Information: Sampling remains a critical problem in non-Gaussian inference. One way to ameliorate this problem is to approach non-Gaussian estimation in part as an optimization problem. Using mutual information as a "distribution-free" summary measure is relatively well recognized but does not by itself reduce sampling burdens. Less known and more interesting is to employ quadratic forms of mutual information, which we argue lead to tractable non-Gaussian estimation via gradient-based optimization. After a review of this framework, we show that quadratic mutual information can be effectively optimized for high-dimensional non-Gaussian distributions. In particular, we examine techniques ranging from the fast Gauss transform to hashing, and show that a slight relaxation of exactness can lead to highly efficient and effective non-parametric estimators. We additionally propose a scale-space information graph for added efficiency and show that model identification of this graph is cost-effective. With these techniques in hand, we build filters and fixed-lag and fixed-interval smoothers that are both adjoint-free and require minimal sampling.

Perception: The analysis, synthesis and representation of coherent fields is fundamental to many spatial inference problems. We first show the failure of classical methods in the context of coherent fields, and then synthesize an effective approach utilizing the above results in the context of a pattern theory for fluids. In this theory, we posit that position (deformation, geometry) and amplitude (appearance, intensity) both play a significant role, which leads to several new approaches, including super-resolution, reduced models, targeting (adaptive sampling), data assimilation (state estimation), and uncertainty quantification.
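
The sequential Bayesian state estimation reviewed in this talk can be illustrated with a minimal bootstrap particle filter (sequential importance resampling). The scalar dynamics, measurement model, and noise levels below are placeholders chosen for the sketch, not models from the talk.

```python
import math
import random

def bootstrap_filter(zs, n_particles=1000, seed=3):
    """Minimal bootstrap particle filter for a placeholder scalar model:
    x_{k+1} = 0.9 x_k + w,  z_k = x_k + v,  w ~ N(0, 0.5^2), v ~ N(0, 0.5^2)."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 2.0) for _ in range(n_particles)]  # prior samples
    estimates = []
    for z in zs:
        # Propagate each particle through the (possibly nonlinear) dynamics.
        xs = [0.9 * x + rng.gauss(0.0, 0.5) for x in xs]
        # Weight particles by the measurement likelihood.
        ws = [math.exp(-0.5 * ((z - x) / 0.5) ** 2) for x in xs]
        total = sum(ws) or 1.0
        ws = [w / total for w in ws]
        # Posterior-mean estimate, then multinomial resampling.
        estimates.append(sum(w * x for w, x in zip(ws, xs)))
        xs = rng.choices(xs, weights=ws, k=n_particles)
    return estimates
```

Because the posterior is represented by weighted samples rather than a Gaussian, the same loop handles the non-Gaussian, nonlinear settings the talk is concerned with; the sampling burden it incurs is exactly what the quadratic-mutual-information direction above aims to reduce.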

Dynamic Data Monitoring for Tracking Resident Space Objects and Atmospheric Release (Singla) The main focus of this talk is to introduce a dynamic data monitoring framework for the estimation of spatio-temporal dynamical variables by optimally managing mobile sensors, a framework conducive to scaling with increasing numbers of sensors. The crux of the work lies in accounting for uncertainties in the model parameters of systems driven by stochastic inputs, characterizing the evolution of the uncertainty of the forecast variables, and integrating disparate sources of asynchronous data with the model output using a Bayesian framework. Intimately tied to the data assimilation problem is the optimal deployment of mobile sensors (e.g., UAVs) with the objective of minimizing the uncertainty of the forecast variable, which corresponds to maximizing the degree of Situational Awareness (SA) so as to facilitate decision making. The most critical challenge here is to provide a quantitative assessment of how closely our estimates reflect reality in the presence of model uncertainty, discretization errors, and measurement errors and uncertainty. This quantitative understanding of uncertainty is essential when predictions are used to inform policy making or mitigation solutions where significant resources are at stake. The talk will focus on recent developments in the mathematical and algorithmic fundamentals of uncertainty propagation, forecasting, model-data fusion and optimal control for nonlinear systems. The central idea is to replace the evolution of initial conditions for a dynamical system with the evolution of probability density functions (pdfs) for the state variables. Various academic and engineering problems, such as tracking resident space objects and forecasting a toxic material plume advected through the atmosphere, where traditional methods either fail or perform very poorly, are considered to assess the reliability and limitations of the newly established methods. Some results from these studies will be discussed.
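
The central idea of evolving probability density functions rather than single initial conditions can be approximated crudely with a Monte Carlo ensemble. The nonlinear map, initial uncertainty, and forcing below are assumptions for illustration; the talk's methods evolve the pdf itself rather than samples of it.

```python
import random
import statistics

def propagate_ensemble(n=2000, steps=20, seed=4):
    """Monte Carlo stand-in for pdf evolution: push an ensemble of uncertain
    initial conditions through nonlinear stochastic dynamics and summarize
    the forecast distribution (illustrative placeholder dynamics)."""
    rng = random.Random(seed)
    ensemble = [rng.gauss(1.0, 0.1) for _ in range(n)]  # uncertain initial state
    for _ in range(steps):
        # Nonlinear map with small stochastic forcing (assumed for the sketch).
        ensemble = [x - 0.1 * x**3 + rng.gauss(0.0, 0.01) for x in ensemble]
    # Forecast mean and spread: the quantities a sensor deployment
    # would be steered to reduce.
    return statistics.mean(ensemble), statistics.stdev(ensemble)
```

The forecast spread returned here is the kind of uncertainty measure that the sensor-management problem described above would seek to minimize when choosing where to send the next mobile sensor.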

Important Dates
  • June 30, 2015: Conference date
  • June 30, 2015: Registration deadline

Organizer
American Automatic Control Council