Bar-Hillel, Maya.
“The Base-Rate Fallacy in Probability Judgments”.
Discussion Papers 2019. Web.
Abstract: The base-rate fallacy is people's tendency to ignore base rates in favor of, e.g., individuating information (when such information is available), rather than integrate the two. This tendency has important implications for understanding judgment phenomena in many clinical, legal, and social-psychological settings. An explanation of this phenomenon is offered, according to which people order information by its perceived degree of relevance, and let high-relevance information dominate low-relevance information. Information is deemed more relevant when it relates more specifically to a judged target case. Specificity is achieved either when information concerns a smaller set, of which the target case is a member, than the overall population, or when information can be coded, via causality, as information about the specific members of a given population. The base-rate fallacy is thus the result of pitting what seem to be merely coincidental, and therefore low-relevance, base rates against more specific, or causal, information. A series of probabilistic inference problems is presented in which relevance was manipulated with the means described above, and the empirical results confirm the above account. In particular, base rates will be combined with other information when the two kinds of information are perceived as being equally relevant to the judged case.
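The normative benchmark against which the fallacy is defined is Bayes' rule, which integrates the base rate with the individuating evidence rather than letting either dominate. A minimal sketch, using hypothetical numbers that are not taken from the paper:

```python
# Hypothetical diagnostic setting (illustrative numbers, not from the paper):
# a rare condition with a 1% base rate, a test with an 80% hit rate
# and a 9.6% false-positive rate.
base_rate = 0.01
hit_rate = 0.80
false_positive_rate = 0.096

# Bayes' rule: the posterior weighs the individuating evidence
# (a positive test) by the base rate.
p_positive = hit_rate * base_rate + false_positive_rate * (1 - base_rate)
posterior = hit_rate * base_rate / p_positive

# The posterior is roughly 0.078 -- far below the 0.80 hit rate that
# base-rate neglect would suggest as the answer.
```

Ignoring the base rate amounts to reporting the hit rate itself as the answer, which is the dominance of "specific" information the abstract describes.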
Hart, Sergiu, and Dean P. Foster.
“Forecast-Hedging and Calibration”.
Discussion Papers 2019. Web.
Abstract: Calibration means that for each forecast x the average of the realized actions in the periods in which the forecast was x is, in the long run, close to x. Calibration can always be guaranteed (Foster and Vohra 1998), but it requires the forecasting procedure to be stochastic. By contrast, smooth calibration, which combines nearby forecasts in a continuous manner, can be guaranteed by a deterministic procedure (Foster and Hart 2018). In the present paper we develop the concept of forecast-hedging, which consists of choosing the forecasts in such a way that, no matter what the realized action will be, the expected forecasting track record can only improve. This approach integrates the existing calibration results by obtaining them all from the same simple basic argument, and at the same time differentiates between them according to the forecast-hedging tools that are used: deterministic and fixed-point-based vs. stochastic and minimax-based. Additional benefits are new calibration procedures in the one-dimensional case that are simpler than all known such procedures, and a short proof of deterministic smooth calibration, in contrast to the complicated existing proof.
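The calibration criterion in the abstract can be checked directly from a track record: group periods by the forecast issued, and compare each forecast x to the empirical average of the actions realized in those periods. A minimal sketch; the function name and data are illustrative, not from the paper:

```python
from collections import defaultdict

def calibration_gaps(forecasts, actions):
    """For each distinct forecast value x, return |avg(actions in
    periods with forecast x) - x|: the per-forecast calibration gap."""
    buckets = defaultdict(list)
    for x, a in zip(forecasts, actions):
        buckets[x].append(a)
    return {x: abs(sum(acts) / len(acts) - x) for x, acts in buckets.items()}

# Hypothetical binary-outcome data: the forecast 0.5 is issued twice with
# outcomes 1 and 0 (perfectly calibrated), while the forecast 0.8 is issued
# once with outcome 1 (gap 0.2 on this short record).
gaps = calibration_gaps([0.5, 0.5, 0.8], [1, 0, 1])
```

Calibration in the paper's sense asks that these gaps vanish in the long run; the forecast-hedging construction chooses forecasts so that the expected track record, measured this way, can only improve.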