In the standard Bayesian framework the data are assumed to be generated by a distribution parametrized by θ in a parameter space Θ, over which a prior distribution π is defined. A Bayesian statistician quantifies the belief that the true parameter is θ_0 in Θ by its posterior probability given the observed data. We investigate the behavior of the posterior belief in θ_0 when the data are generated under some parameter θ_1, which may or may not be the same as θ_0. Starting from stochastic orders, specifically likelihood-ratio dominance, that obtain for the resulting distributions of posteriors, we consider monotonicity properties of the posterior probabilities as a function of the sample size when the data arrive sequentially. While the θ_0-posterior is monotonically increasing (i.e., it is a submartingale) when the data are generated under that same θ_0, it need not be monotonically decreasing in general, not even in terms of its overall expectation, when the data are generated under a different θ_1; in fact, it may go up and down many times. In the framework of simple iid coin tosses, we show that under certain conditions the overall expected posterior of θ_0 eventually becomes monotonically decreasing when the data are generated under θ_1 ≠ θ_0. Moreover, we prove that when the prior is uniform this expected posterior is a log-concave function of the sample size, by developing an inequality related to Turán's inequality for Legendre polynomials.
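To make the coin-toss setting concrete, the following is a minimal numerical sketch (not taken from the paper) of the quantity under study: the θ_1-expected posterior probability of θ_0 as a function of the sample size n. It assumes, purely for illustration, a two-point parameter space {θ_0, θ_1} with a uniform prior and hypothetical values θ_0 = 0.5 and θ_1 = 0.6; the expectation is computed exactly by summing over the binomial count of heads rather than by Monte Carlo simulation.

```python
import numpy as np
from scipy.stats import binom

# Hypothetical illustrative values (not from the paper):
theta0, theta1 = 0.5, 0.6   # belief parameter theta0; data generated under theta1
prior0 = 0.5                # uniform prior on the two-point parameter space {theta0, theta1}

def expected_posterior(n):
    """Exact value of E_{theta1}[ pi(theta0 | X_1, ..., X_n) ] for n iid coin tosses,
    obtained by summing over the number of heads k = 0, ..., n."""
    k = np.arange(n + 1)
    lik0 = theta0**k * (1 - theta0)**(n - k)              # likelihood of k heads under theta0
    lik1 = theta1**k * (1 - theta1)**(n - k)              # likelihood of k heads under theta1
    post0 = prior0 * lik0 / (prior0 * lik0 + (1 - prior0) * lik1)  # posterior of theta0
    return np.sum(binom.pmf(k, n, theta1) * post0)        # average over data drawn under theta1

# Print the expected posterior of theta0 for increasing sample sizes,
# to inspect its monotonicity behavior as n grows.
for n in range(0, 51, 5):
    print(n, expected_posterior(n))
```

Varying n, the parameter values, and the prior in this sketch is one way to examine how the expected posterior of θ_0 evolves under data generated by θ_1.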