Learning About and From Ecological Forecasting Models: A Decision Science Approach
Abstract
Models are ubiquitous in science and in science-based decision support. Despite how frequently models are used, little progress has been made in articulating how modelers and model users learn from model results; in particular, there is a paucity of frameworks that are theoretically grounded and independent of scientific discipline. For ecological forecasting and other model-dependent disciplines, this is a problem for two reasons. First, models are often used to integrate knowledge across disciplines or subject areas (e.g., ecology and economics). If modeling conventions and epistemic norms are not transparent, differing practices can easily be misinterpreted, leading to inaccurate model insights. Second, model building and interpretation together constitute a learning process: a chain of decisions ranging from choosing the system boundary to deciding how to validate the model to testing model robustness. This framing matters because experts and non-experts alike exhibit biases and heuristics while learning, especially from model-based evidence. In aggregate, these biases and heuristics can produce systematic errors in how prior knowledge is elicited and incorporated, how information or evidence is sought out, and how model-based evidence is used to learn and to generate insight. Using decision science as a foundation, we develop a framework that guides model builders and model users through the process of learning about and from a model in a way that minimizes the error introduced by biases and heuristics and aids in choosing the model that maximizes learning. We accomplish this through an application of Bayesian inference that treats models as constructed evidence-generating processes, used in conjunction with modeler and user judgments of how adequate the model is for learning. Following the framework generates a comprehensive, theoretically grounded, discipline-independent set of questions to ask during the learning process. Importantly, each question is linked to a mathematical term with foundations in Bayesian inference, which allows well-established elicitation techniques to be used. All of the steps that generate model-based insight are thus clearly delineated, and their links to the final insight are transparent.
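The abstract names the mechanism (Bayesian inference over model-generated evidence, tempered by a judgment of model adequacy) but not the mathematics. Below is a minimal sketch of one plausible formalization, assuming a power-tempered likelihood in which a subjective adequacy weight controls how strongly model output updates beliefs; the function name `tempered_posterior` and the parameter `adequacy` are illustrative, not the authors' notation.

```python
import numpy as np

def tempered_posterior(prior, likelihood, adequacy):
    """Bayesian update with the likelihood raised to an adequacy power.

    A hedged sketch, not the authors' exact formulation. The model M is
    treated as an evidence-generating process; the subjective weight
    adequacy (0 = model judged useless, 1 = fully adequate) tempers how
    strongly model output y updates belief in the hypotheses H_i.

    prior      : array of P(H_i), summing to 1
    likelihood : array of P(y | H_i, M) for the observed model output y
    adequacy   : scalar in [0, 1]
    """
    weighted = prior * likelihood ** adequacy   # power-tempered likelihood
    return weighted / weighted.sum()            # renormalize to a distribution

# Example: two hypotheses; the model output favors H2, but the model is
# judged only half-adequate, so the update is correspondingly weaker.
prior = np.array([0.5, 0.5])
likelihood = np.array([0.2, 0.8])               # P(y | H_i, M)
print(tempered_posterior(prior, likelihood, adequacy=0.5))  # ~[0.33, 0.67]
```

At adequacy = 1 this reduces to a standard Bayesian update; at adequacy = 0 the model output is ignored and the prior is returned unchanged, which matches the framework's emphasis on making the adequacy judgment an explicit, elicitable term.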
Description
In AGU Fall Meeting Abstracts
Rights
This publication is a work of the U.S. Government as defined in Title 17, United States Code, Section 101. Copyright protection is not available for this work in the United States.
Related items
Showing items related by title, author, creator and subject.
- Comparative efficacies of decision strategies and the effects of learning in dynamic environments: a computer simulation approach
  Rutledge, Spencer, III (Monterey, California. Naval Postgraduate School, 1993-09)
  Models of aggregation in management science and economics are not consistent with micro-empirical knowledge of individual decision making. This has occurred as a result of using heuristics that are derived from behavioral ...
- Human factors influencing decision making
  Jacobs, Patricia A. (Monterey, California. Naval Postgraduate School, 1998-07-01); NPS-OR-98-003
  This report supplies references and comments on literature that identifies human factors influencing decision making, particularly military decision making. The literature has been classified as follows (the ...
- Behavioral Biases within Defense Acquisition
  Mortlock, Robert; Dew, Nick (Monterey, California. Naval Postgraduate School, 2021-05-10); SYM-AM-21-049
  This paper contributes to the process of building knowledge about what we term as behavioral acquisition, which explores defense acquisition from a behavioral standpoint, including the impact of psychology, organizational ...