Decisions, decisions, decisions in an uncertain environment
@article{Cressie2022DecisionsDD,
  title   = {Decisions, decisions, decisions in an uncertain environment},
  author  = {Noel Cressie},
  journal = {Environmetrics},
  year    = {2022},
  volume  = {34}
}
Decision‐makers abhor uncertainty, and it is certainly true that the less there is of it the better. However, recognizing that uncertainty is part of the equation, particularly for deciding on environmental policy, is a prerequisite for making wise decisions. Even making no decision is a decision that has consequences, and using the presence of uncertainty as the reason for failing to act is a poor excuse. Statistical science is the science of uncertainty, and it should play a critical role in…
One Citation
Environmental data science: Part 1
- Computer Science · Environmetrics
- 2023
This editorial identifies and discusses common strands of research that appear in the contributions to Part 1, which largely focus on statistical methodology; these include temporal, spatial and spatio‐temporal modeling; statistical computing; machine learning and artificial intelligence; and the critical question of decision‐making in the presence of uncertainty.
References
Showing 1–10 of 35 references
Bayesian Decision Analysis: Principles and Practice
- Computer Science
- 2010
Having evolved from a third-year undergraduate course taught by the author over many years, the material in this book is accessible to any student who has completed introductory courses in probability and mathematical statistics.
Making management decisions in the face of uncertainty: a case study using the Burdekin catchment in the Great Barrier Reef
- Environmental Science
- 2018
Modelling and monitoring pollutants entering into the Great Barrier Reef (GBR) lagoon remain important priorities for the Australian and Queensland governments. Uncertainty analysis of pollutant load…
Assessment and Propagation of Model Uncertainty
- Computer Science
- 1995
A Bayesian approach to solving this problem that has long been available in principle but is only now becoming routinely feasible, by virtue of recent computational advances, is discussed and its implementation is examined in examples that involve forecasting the price of oil and estimating the chance of catastrophic failure of the U.S. Space Shuttle.
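The model-averaging idea described in this entry can be sketched numerically: the averaged predictive mean and variance mix per-model answers by posterior model probabilities, with a between-model term that carries the model uncertainty itself. All numbers below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Sketch of propagating model uncertainty via Bayesian model averaging.
# The posterior model probabilities and per-model predictive moments are
# assumed to have been computed elsewhere; values here are illustrative.
model_probs = np.array([0.6, 0.3, 0.1])      # P(M_k | data)
model_means = np.array([20.0, 25.0, 35.0])   # per-model predictive means
model_vars  = np.array([4.0, 4.0, 9.0])      # per-model predictive variances

mix_mean = float(model_probs @ model_means)

# Total variance = average within-model variance + between-model spread;
# the second term is the contribution of model uncertainty itself.
within  = float(model_probs @ model_vars)
between = float(model_probs @ (model_means - mix_mean) ** 2)
mix_var = within + between
```

Note that ignoring the `between` term, as a single-model analysis implicitly does, understates the predictive variance whenever the models disagree.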
A theorem for Bayesian group decisions
- Economics
- 2011
This paper presents a natural extension of Bayesian decision theory from the domain of individual decisions to the domain of group decisions. We assume that each group member accepts the assumptions…
A general framework for updating belief distributions
- Computer Science · Journal of the Royal Statistical Society, Series B (Statistical Methodology)
- 2016
It is argued that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case.
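The loss-based update this entry describes is often written as a "generalized Bayes" posterior proportional to exp{−loss(θ, data)} times the prior. A minimal grid-approximation sketch, with a squared-error loss and Gaussian prior chosen purely for illustration (not the paper's own example):

```python
import numpy as np

# Loss-based ("generalized Bayes") belief update on a parameter grid:
#     posterior(theta) ∝ exp(-loss(theta, data)) * prior(theta).
# Data, prior, and loss below are illustrative assumptions.
rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=1.0, size=20)   # hypothetical observations

theta = np.linspace(-3.0, 5.0, 801)              # parameter grid
prior = np.exp(-0.5 * theta**2)                  # N(0, 1) prior, unnormalized

# Squared-error loss plays the role of a negative log-likelihood
loss = ((data[:, None] - theta[None, :]) ** 2).sum(axis=0)

post = prior * np.exp(-(loss - loss.min()))      # shift loss for stability
post /= post.sum()                               # normalize on the grid

post_mean = float((theta * post).sum())
```

With this particular loss the update coincides with a conjugate Gaussian posterior, which gives a closed-form check on the grid answer; an arbitrary loss function would have no such closed form, which is the point of the framework.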
Design considerations for Neyman-Pearson and Wald hypothesis testing
- Mathematics
- 1989
The Neyman-Pearson Lemma describes a test for two simple hypotheses that, for a given sample size, is most powerful for its level. It is usually implemented by choosing the smallest sample…
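The most-powerful test in the Lemma is the likelihood-ratio test. A minimal sketch for two simple Gaussian hypotheses with a single observation (the hypotheses, level, and numbers are illustrative, not taken from the paper):

```python
import math

# Neyman-Pearson likelihood-ratio test for H0: X ~ N(0,1) vs H1: X ~ N(1,1)
# from one observation. The ratio is monotone increasing in x, so rejecting
# when it exceeds a threshold is equivalent to rejecting when x > crit.

def normal_pdf(x, mu):
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

def likelihood_ratio(x):
    return normal_pdf(x, 1.0) / normal_pdf(x, 0.0)   # equals exp(x - 0.5)

# Level alpha = 0.05: critical value is the upper 5% point of N(0,1).
crit = 1.6448536269514722

def reject(x):
    return x > crit

# Power = P(X > crit | H1) = 1 - Phi(crit - 1), via the complementary
# error function: 1 - Phi(z) = erfc(z / sqrt(2)) / 2.
power = 0.5 * math.erfc((crit - 1.0) / math.sqrt(2.0))
```

The sample-size question raised in the abstract enters through the power: at n = 1 the test above has power of only about 0.26 at level 0.05, and increasing n is what drives the power toward one.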
A few statistical principles for data science
- Computer Science · Australian & New Zealand Journal of Statistics
- 2021
This article presents a few statistical principles for data scientists that have helped me, and continue to help me, when I work on complex interdisciplinary projects.
Great expectations and even greater exceedances from spatially referenced data
- Environmental Science, Economics · Spatial Statistics
- 2020
Physical‐statistical modelling
- Environmental Science
- 2014
The assimilation of measurements with deterministic dynamical systems evolved from the work of R. E. Kalman in 1960 (Grewal and Andrews, 2010). Kalman’s work focussed on improving aerospace…
Focused Bayesian prediction
- Computer Science · Journal of Applied Econometrics
- 2019
A new method for conducting Bayesian prediction is proposed that delivers accurate predictions without correctly specifying the unknown true data-generating process; notable gains in predictive accuracy are found relative to conventional likelihood-based prediction.