Posterior Probabilities: Nonmonotonicity, Log-Concavity, and Turán's Inequality


In the standard Bayesian framework the data are assumed to be generated by a distribution parametrized by θ in a parameter space Θ, over which a prior distribution is defined. A Bayesian statistician quantifies the belief that the true parameter is θ_0 in Θ by its posterior probability given the observed data. We investigate the behavior of the posterior belief in θ_0 when the data are generated under some parameter θ_1, which may or may not be the same as θ_0. Starting from stochastic orders, specifically, likelihood ratio dominance, that obtain for the resulting distributions of posteriors, we consider monotonicity properties of the posterior probabilities as a function of the sample size when data arrive sequentially. While the θ_0-posterior is monotonically increasing (i.e., it is a submartingale) when the data are generated under that same θ_0, it need not be monotonically decreasing in general, not even in terms of its overall expectation, when the data are generated under a different θ_1; in fact, it may keep going up and down many times. In the framework of simple iid coin tosses, we show that under certain conditions the overall expected posterior of θ_0 eventually becomes monotonically decreasing when the data are generated under θ_1 ≠ θ_0. Moreover, we prove that when the prior is uniform this expected posterior is a log-concave function of the sample size, by developing an inequality that is related to Turán's inequality for Legendre polynomials.
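The expected-posterior curve discussed above can be computed exactly in the simplest setting. The following is a minimal numerical sketch, not part of the paper: it assumes a two-point parameter space {θ_0, θ_1} of coin biases with a uniform prior, generates data iid under θ_1, and sums over the binomial distribution of head counts to obtain the overall expectation of the θ_0-posterior at each sample size n. The parameter values θ_0 = 0.3 and θ_1 = 0.7 are illustrative choices. Note that in this two-hypothesis toy case the θ_0-posterior equals one minus the θ_1-posterior, so by the submartingale property stated above its expectation is nonincreasing; the nonmonotone up-and-down behavior described in the abstract requires richer parameter spaces.

```python
from math import comb

def expected_posterior(theta0, theta1, n, prior0=0.5):
    """Exact expectation, over data ~ Bernoulli(theta1)^n, of the
    posterior probability of theta0 in the two-point parameter
    space {theta0, theta1} with prior mass prior0 on theta0."""
    prior1 = 1.0 - prior0
    total = 0.0
    for k in range(n + 1):
        # Probability of observing k heads under the generating parameter theta1.
        p_data = comb(n, k) * theta1**k * (1.0 - theta1)**(n - k)
        # Likelihoods of the same data under each hypothesis.
        lik0 = theta0**k * (1.0 - theta0)**(n - k)
        lik1 = theta1**k * (1.0 - theta1)**(n - k)
        # Bayes' rule for the posterior probability of theta0.
        post0 = prior0 * lik0 / (prior0 * lik0 + prior1 * lik1)
        total += p_data * post0
    return total

# Expected theta0-posterior as a function of sample size n (illustrative parameters).
curve = [expected_posterior(0.3, 0.7, n) for n in range(51)]
```

At n = 0 the expected posterior is simply the prior mass 0.5, and as n grows it decays toward 0, since the posterior concentrates on the generating parameter θ_1.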