Account for uncertainties and optimize decision-making with this thorough exposition of probabilistic forecasting and decision theory
Decision theory is a body of thought and research that applies a mathematical-logical framework to assessing probability and optimizing decision-making. It has produced robust tools for many of the major challenges of decision-making. Yet the number of variables and uncertainties affecting each decision outcome, many of them beyond the decision-maker's control, means that decision-making is far from a 'solved problem'. The tools created by decision theory remain to be refined and applied to decisions in which uncertainties are prominent.
Probabilistic Forecasts and Optimal Decisions introduces a theoretically grounded methodology for optimizing decision-making under conditions of uncertainty. Beginning with an overview of the basic elements of probability theory and methods for modeling continuous variates, it proceeds to survey the mathematics of both continuous and discrete models, supporting each with key examples. The result is a crucial window into the complex but enormously rewarding world of decision theory.
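To give a flavor of the kind of problem the book addresses, the sketch below (not drawn from the text; the actions, probabilities, and utilities are hypothetical) picks the action with the highest expected utility under a simple probabilistic forecast:

```python
# Illustrative sketch only: decision-making under uncertainty by maximizing
# expected utility. All values here are hypothetical, not from the book.

forecast = {"rain": 0.3, "no_rain": 0.7}  # probabilistic forecast of the uncertain event

# Utility of each candidate action under each possible outcome
utilities = {
    "carry_umbrella": {"rain": 8, "no_rain": 6},
    "leave_umbrella": {"rain": 0, "no_rain": 10},
}

def expected_utility(action: str) -> float:
    """Expected utility of an action, weighted by the forecast probabilities."""
    return sum(p * utilities[action][outcome] for outcome, p in forecast.items())

# Report each action's expected utility and the optimal decision
for action in utilities:
    print(f"{action}: expected utility = {expected_utility(action):.2f}")
best = max(utilities, key=expected_utility)
print(f"Optimal decision: {best}")
```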
Readers of Probabilistic Forecasts and Optimal Decisions will also find:
Probabilistic Forecasts and Optimal Decisions is ideal for advanced undergraduate and graduate students in the sciences and engineering, as well as predictive analytics and decision analytics professionals.