
### Browsing by Author "Nearing, Grey S."

Now showing 1 - 8 of 8

#### An approach to quantifying the efficiency of a Bayesian filter
American Geophysical Union, 2013-04-26. Nearing, Grey S.; Gupta, Hoshin V.; Crow, Wade T.; Gong, Wei (University of Arizona; United States Department of Agriculture (USDA); Beijing Normal University; University of Alabama Tuscaloosa)

Data assimilation is the Bayesian conditioning of uncertain model simulations on observations to reduce uncertainty about model states. In practice, it is common to make simplifying assumptions about the prior and posterior state distributions, and to employ approximations of the likelihood function, which can reduce the efficiency of the filter. We propose metrics that quantify how much of the uncertainty in a Bayesian posterior state distribution is due to (i) the observation operator, (ii) observation error, and (iii) approximations of Bayes' Law. Our approach uses discrete Shannon entropy to quantify uncertainty, and we define the utility of an observation (for reducing uncertainty about a model state) as the ratio of the mutual information between the state and observation to the entropy of the state prior. These metrics make it possible to analyze the efficiency of a proposed observation system and data assimilation strategy, and provide a way to examine the propagation of information through the dynamic system model. We demonstrate the procedure on the problem of estimating profile soil moisture from observations at the surface (top 5 cm). The results show that when synthetic observations of 5 cm soil moisture are assimilated into a three-layer model of soil hydrology, the ensemble Kalman filter does not use all of the information available in the observations.

#### Assessing the Robustness of Deep Learning Streamflow Models Under Climate Change
University of Alabama Libraries, 2022. Qualls, Logan Michelle; Tick, Geoffrey R. (University of Alabama Tuscaloosa)

Long Short-Term Memory (LSTM) networks provide the most accurate rainfall-runoff predictions to date, but their reliability under climate change is not well understood. We explore the robustness of these models under climate nonstationarity by creating train and test data splits that are designed to simulate climate bias. By training on forcing data from hydrological years of high (low) aridity and testing on data from hydrological years of low (high) aridity, we can begin to quantify the performance, and the relative robustness of that performance, under climate nonstationarity. We benchmarked against a calibrated conceptual model (the Sacramento Soil Moisture Accounting model) and a calibrated process-based model (the NOAA National Water Model), and found that LSTMs were generally more accurate than both, even when trained on climatologically biased data splits. The process-based model did not show as large a performance gap as the conceptual and deep learning models; however, (i) this model was not calibrated on a climate-biased data split, and (ii) LSTMs always outperformed the process-based benchmark, even when the LSTM training data had climatological bias. We find that although all hydrologic models reported here degrade under nonstationarity, DL models demonstrate greater robustness. We also tested the hypothesis that dynamic climate attributes as inputs into the LSTM would improve performance under climate nonstationarity.
We found no predictive value in the addition of dynamic, as opposed to static, climate attribute inputs.

#### Debates—The future of hydrological sciences: A (common) path forward? Using models and data to learn: A systems theoretic perspective on the future of hydrological science
2014-06-02. Gupta, Hoshin V.; Nearing, Grey S. (University of Alabama Tuscaloosa)

Key points:
- Discovery can be advanced by taking a perspective based in information theory.
- Much can be gained by focusing on the a priori role of process modeling.
- System parameterization can result in information loss.

#### Deep Learning for Operational Streamflow Forecasts, Or More Specifically: Long Short-Term Memory Networks As a Rainfall-Runoff Module for the U.S. National Water Model
University of Alabama Libraries, 2022. Frame, Jonathan Martin; Zhang, Yong (University of Alabama Tuscaloosa)

This dissertation investigates deep learning (DL) and the combination of hydrologic process-based (PB) models with DL in a hybrid (HB) modeling approach (often referred to as "physics-informed machine learning" or "theory-guided learning") for improving the predictive performance of streamflow in the U.S. National Water Model. An in-depth analysis is made of the benefits of DL and the potential drawbacks of the HB models. No evidence is found supporting the use of HB models over "pure" DL models in the use cases analyzed. The performance of the HB models is found to degrade in ungauged basins, whereas that of the DL models does not. The DL models are the best-performing models for predicting extremely high runoff events, even when such events are not included in the training set. Adding physics-inspired constraints to data-driven models causes a loss of system information relative to the DL models. As such, a "pure" DL model, specifically the Long Short-Term Memory (LSTM) network, is chosen as one of the core modules for the Next Generation (Nextgen) U.S. National Water Model.
The LSTM (via Nextgen) is applied to simulate streamflow for a three-year period across the 191,020 km² New England region.

#### Ensembles vs. information theory: supporting science under uncertainty
Springer, 2018. Nearing, Grey S.; Gupta, Hoshin V. (University of Alabama Tuscaloosa; University of Arizona)

Multi-model ensembles are one of the most common ways to deal with epistemic uncertainty in hydrology. This is a problem because there is no known way to sample models such that the resulting ensemble admits a measure that has any systematic (i.e., asymptotic, bounded, or consistent) relationship with uncertainty. Multi-model ensembles are effectively sensitivity analyses and cannot, even partially, quantify uncertainty. One consequence of this is that multi-model approaches cannot support a consistent scientific method; in particular, multi-model approaches yield unbounded errors in inference. In contrast, information theory supports a coherent hypothesis test that is robust to (i.e., bounded under) arbitrary epistemic uncertainty. This paper may be understood as advocating a procedure for hypothesis testing that does not require quantifying uncertainty, but is coherent and reliable (i.e., bounded) in the presence of arbitrary (unknown and unknowable) uncertainty.
We conclude with some suggestions about how this proposed philosophy of science points to new ways to conceptualize and construct simulation models of complex, dynamical systems.

#### The impact of vertical measurement depth on the information content of soil moisture time series data
2014-07-25. Qiu, Jianxiu; Crow, Wade T.; Nearing, Grey S.; Mo, Xingguo; Liu, Suxia (University of Alabama Tuscaloosa)

#### Nonparametric triple collocation
American Geophysical Union, 2017-07-07. Nearing, Grey S.; Yatheendradas, Soni; Crow, Wade T.; Bosch, David D.; Cosh, Michael H.; Goodrich, David C.; Seyfried, Mark S.; Starks, Patrick J. (National Aeronautics & Space Administration (NASA); NASA Goddard Space Flight Center; National Center for Atmospheric Research (NCAR); University System of Maryland; University of Maryland College Park; United States Department of Agriculture (USDA); University of Alabama Tuscaloosa)

Triple collocation has found widespread application in the hydrological sciences because it provides information about the errors in our measurements without requiring direct access to the true value of the variable being measured. Triple collocation derives variance-covariance relationships between three or more independent measurement sources and an indirectly observed truth variable in the case where the measurement operators are additive. We generalize that theory to arbitrary observation operators by deriving nonparametric analogues of the total error and total correlation statistics as integrations of divergences from conditional to marginal probability ratios. The nonparametric solution to the full measurement problem is underdetermined, and we therefore retrieve conservative bounds on the theoretical total nonparametric error and correlation statistics.
We examine the application of both linear and nonlinear triple collocation to synthetic examples and to a real-data test case related to evaluating spaceborne soil moisture retrievals using sparse monitoring networks and dynamical process models.

#### The quantity and quality of information in hydrologic models
American Geophysical Union, 2015-01-26. Nearing, Grey S.; Gupta, Hoshin V. (National Aeronautics & Space Administration (NASA); NASA Goddard Space Flight Center; University of Arizona; University of Alabama Tuscaloosa)

The role of models in science is to facilitate predictions from hypotheses. Although the idea that models provide information is widely reported and has been used as the basis for model evaluation, benchmarking, and updating strategies, this intuition has not been formally developed, and current benchmarking strategies remain ad hoc at a fundamental level. Here we interpret what it means to say that a model provides information in the context of the formal inductive philosophy of science. We show how information theory can be used to measure the amount of information supplied by a model, and we derive standard model benchmarking and evaluation activities in this context. We further demonstrate that, via a process of induction, dynamical models store information from hypotheses and observations about the systems that they represent, and that this stored information can be directly measured.
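The observation-utility metric described in the abstract of "An approach to quantifying the efficiency of a Bayesian filter" (the ratio of state-observation mutual information to the entropy of the state prior) can be sketched for discretized (binned) samples as below. This is a minimal illustration of the general idea only; the function names and toy data are not from the paper:

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Discrete Shannon entropy (in bits) of a sample of binned values."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired binned samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def observation_utility(states, observations):
    """Ratio of state-observation mutual information to prior state entropy.

    1 means the observation would resolve all prior uncertainty about the
    state; 0 means it carries no information about the state.
    """
    h_prior = entropy(states)
    return mutual_information(states, observations) / h_prior if h_prior else 0.0

# Toy check: an observation identical to the state is perfectly
# informative; a constant observation is useless.
states = [0, 0, 1, 1, 2, 2, 3, 3]
print(observation_utility(states, states))   # 1.0
print(observation_utility(states, [0] * 8))  # 0.0
```

In the paper's setting, the same ratio would be evaluated with model states and (synthetic) surface soil moisture observations after discretization.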
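The climate-biased train/test design described in "Assessing the Robustness of Deep Learning Streamflow Models Under Climate Change" (train on high-aridity hydrological years, test on low-aridity years, or vice versa) can be sketched as follows; the function, aridity values, and 50/50 split fraction are illustrative assumptions, not details from the thesis:

```python
def climate_biased_split(aridity_by_year, train_fraction=0.5, train_on_arid=True):
    """Split hydrological years into train/test sets by aridity.

    aridity_by_year: dict mapping year -> aridity index (e.g., PET/P).
    Training years are drawn from one end of the aridity distribution and
    test years from the other, simulating a climate shift between the
    training and evaluation periods.
    """
    years = sorted(aridity_by_year, key=aridity_by_year.get, reverse=train_on_arid)
    n_train = int(len(years) * train_fraction)
    return sorted(years[:n_train]), sorted(years[n_train:])

# Toy example: train on the two most arid years, test on the two wettest.
aridity = {2001: 1.8, 2002: 0.6, 2003: 1.2, 2004: 0.9}
train, test = climate_biased_split(aridity)
print(train, test)  # [2001, 2003] [2002, 2004]
```

The resulting year lists would then index the forcing data used to fit and evaluate each model.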
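For context on "Nonparametric triple collocation": the classical additive-error (linear) triple collocation estimator that the paper generalizes can be sketched as below. This is the textbook covariance-based form under the standard independence assumptions, not the paper's nonparametric method, and the synthetic data are illustrative:

```python
import random

def covariance(a, b):
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    return sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b)) / n

def triple_collocation_error_variances(x, y, z):
    """Classical (additive-error) triple collocation.

    For three collocated measurement series whose errors are mutually
    independent and independent of the truth, each error variance can be
    estimated without access to the truth:
        var_err(x) = Var(x) - Cov(x, y) * Cov(x, z) / Cov(y, z)
    and cyclic permutations for y and z.
    """
    cxy, cxz, cyz = covariance(x, y), covariance(x, z), covariance(y, z)
    return (
        covariance(x, x) - cxy * cxz / cyz,
        covariance(y, y) - cxy * cyz / cxz,
        covariance(z, z) - cxz * cyz / cxy,
    )

# Synthetic demonstration: three noisy "retrievals" of the same unobserved
# soil moisture signal, with error standard deviations 0.02, 0.03, 0.04.
random.seed(42)
truth = [random.gauss(0.3, 0.1) for _ in range(100_000)]
x = [t + random.gauss(0, 0.02) for t in truth]
y = [t + random.gauss(0, 0.03) for t in truth]
z = [t + random.gauss(0, 0.04) for t in truth]
ex, ey, ez = triple_collocation_error_variances(x, y, z)
print(ex, ey, ez)  # close to 0.0004, 0.0009, 0.0016
```

The paper replaces these variance-covariance relationships with nonparametric information-theoretic analogues so that the additive-operator assumption can be dropped.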