Estimating Uncertainties Using Judgmental Forecasts with Expert Heterogeneity
Published in: Operations Research, 2020-03, Vol. 68 (2), pp. 363-380, Article opre.2019.1938
Main Authors: Saurabh Bansal, Genaro J. Gutierrez
Format: Article
Language: English
Summary: In this paper, Saurabh Bansal and Genaro J. Gutierrez develop a data-driven approach to aggregate point forecasts provided by multiple experts into probability distributions while accounting for expert heterogeneity. They consider the commonly encountered business problem of estimating probability distributions when historical data for the underlying uncertainty are not available or relevant. In such situations, firms typically ask a panel of experts to provide their forecasts. The authors develop a new approach that uses the experts’ prior judgments to characterize their biases and the consistency of their judgments, and then uses this information to aggregate their forward-looking judgments into probability distributions. This approach quantifies the heterogeneity in expert judgments and incorporates this information into the aggregation protocol. It also establishes that bias and consistency in expert judgments are complements and not substitutes; therefore, choosing an expert with less bias is not necessarily better than choosing an expert with a stronger bias.
In this paper, we develop a new characterization of multiple point forecasts provided by experts and use it in an optimization framework to deduce actionable signals, including the mean, the standard deviation, or a combination of the two, for underlying probability distributions. This framework consists of three steps: (1) calibrate experts’ point forecasts against historical data to determine which quantile each expert provides, on average, when asked for forecasts; (2) quantify the precision of each expert’s forecasts around that average quantile; and (3) use this calibration information in an optimization framework to deduce the signals of interest. We show that precision and accuracy in expert judgments are complementary in terms of their informativeness. Finally, we discuss the implementation of this approach, and the benefits realized, at a large government project in the agribusiness domain.
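The three steps above can be sketched in code. The sketch below is illustrative only and is not the authors' actual model: it assumes each uncertainty is normally distributed, estimates each expert's average quantile and precision from past events whose distributions are now known, and recovers the mean and standard deviation of a new uncertainty by a precision-weighted least-squares fit of the forecasts to their implied quantile points. The function names (`calibrate`, `aggregate`) and the weighting scheme are hypothetical choices, not taken from the paper.

```python
from statistics import NormalDist

STD_NORMAL = NormalDist()  # standard normal, used for quantile conversions

def calibrate(historical):
    """Step 1 and 2: given (forecast, true_mean, true_std) triples for past
    uncertainties whose distributions are now known, return the expert's
    average quantile and a precision weight (inverse variance of quantiles)."""
    quantiles = [STD_NORMAL.cdf((f - m) / s) for f, m, s in historical]
    q_bar = sum(quantiles) / len(quantiles)
    var = sum((q - q_bar) ** 2 for q in quantiles) / (len(quantiles) - 1)
    return q_bar, 1.0 / var  # larger weight = more consistent expert

def aggregate(forecasts, calibrations):
    """Step 3: each expert's forecast is modeled as x_e = mu + sigma * z_e,
    where z_e is the standard-normal quantile of the expert's average
    quantile.  Solve for (mu, sigma) by precision-weighted least squares."""
    zs = [STD_NORMAL.inv_cdf(q) for q, _ in calibrations]
    ws = [w for _, w in calibrations]
    total_w = sum(ws)
    z_bar = sum(w * z for w, z in zip(ws, zs)) / total_w
    x_bar = sum(w * x for w, x in zip(ws, forecasts)) / total_w
    s_zz = sum(w * (z - z_bar) ** 2 for w, z in zip(ws, zs))
    s_zx = sum(w * (z - z_bar) * (x - x_bar)
               for w, z, x in zip(ws, zs, forecasts))
    sigma = s_zx / s_zz          # slope of the weighted regression
    mu = x_bar - sigma * z_bar   # intercept
    return mu, sigma
```

For example, an expert who historically reports values near the 0.30 quantile calibrates to `q_bar ≈ 0.30`; combining that expert's forward-looking forecast with one from an expert near the 0.70 quantile pins down both the mean and the spread of the unknown distribution.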
ISSN: 0030-364X, 1526-5463
DOI: 10.1287/opre.2019.1938