Publications

2015
Bezalel Peleg and Peter Sudhölter. 2015. “On Bargaining Sets of Convex NTU Games”.
We show that the Aumann-Davis-Maschler bargaining set and the Mas-Colell bargaining set of a non-leveled NTU game that is either ordinal convex or coalition merge convex coincide with the core of the game. Moreover, we show by means of an example that the foregoing statement may not be valid if the NTU game is marginal convex.
Miller and Sanjurjo (2015) suggest that many analyses of the hot hand and the gambler's fallacies are subject to a bias. The purpose of this note is to describe our understanding of their main point in terms we hope are simpler and more accessible to non-mathematicians than is the original.
Sagi Jaffe-Dax, Ofri Raviv, Nori Jacoby, Yonatan Loewenstein, and Merav Ahissar. 2015. “A Computational Model of Implicit Memory Captures Dyslexics' Perceptual Deficits”.
Dyslexics are diagnosed for their poor reading skills. Yet they characteristically also suffer from poor verbal memory, and often from poor auditory skills. To date, this combined profile has been accounted for in broad cognitive terms. Here, we hypothesize that the perceptual deficits associated with dyslexia can be understood computationally as a deficit in integrating prior information with noisy observations. To test this hypothesis we analyzed the performance of human participants in an auditory discrimination task using a two-parameter computational model. One parameter captures the internal noise in representing the current event, and the other captures the impact of recently acquired prior information. Our findings show that dyslexics' perceptual deficit can be accounted for by inadequate adjustment of these components; namely, low weighting of their implicit memory of past trials relative to their internal noise. Underweighting the stimulus statistics decreased dyslexics' ability to compensate for noisy observations. ERP measurements (P2 component), taken while participants watched a silent movie, indicated that dyslexics' perceptual deficiency may stem from poor automatic integration of stimulus statistics. Taken together, this study provides the first description of a specific computational deficit associated with dyslexia.
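To make the "two-parameter" idea concrete, here is a minimal sketch of a listener of this general kind: a current tone is encoded with internal noise and shrunk toward a running prior built from recent trials. The update rule, the parameter names (eta for the prior's weight, sigma for internal noise), and the task details are illustrative assumptions, not the authors' exact model.

```python
import numpy as np

def simulate_discrimination(stimuli, eta, sigma, rng=None):
    """Simulate a two-tone discrimination task with a two-parameter listener.

    stimuli : array-like of shape (n_trials, 2), the two tone values per trial.
    eta     : weight given to the implicit memory (prior) of recent trials.
    sigma   : standard deviation of the internal noise.
    Returns the fraction of correct responses.
    """
    rng = np.random.default_rng(rng)
    stimuli = np.asarray(stimuli, dtype=float)
    prior = float(stimuli[0].mean())       # initialize the running prior
    correct = 0
    for f1, f2 in stimuli:
        # the first tone is encoded noisily and shrunk toward the prior
        encoded_f1 = eta * prior + (1.0 - eta) * (f1 + rng.normal(0.0, sigma))
        # the second tone is compared against the (biased) memory of the first
        noisy_f2 = f2 + rng.normal(0.0, sigma)
        correct += (encoded_f1 > noisy_f2) == (f1 > f2)
        # the prior drifts toward the statistics of recent stimuli
        prior = eta * prior + (1.0 - eta) * 0.5 * (f1 + f2)
    return correct / len(stimuli)
```

In a sketch like this, a low eta (underweighting the implicit memory) leaves the listener more exposed to the internal noise, which is the qualitative pattern the abstract attributes to dyslexics.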
Classically, risk aversion is equated with concavity of the utility function. In this work we explore the conceptual foundations of this definition. In accordance with neo-classical economics, we seek an ordinal definition, based on the decision maker's preference order, independent of numerical values. We present two such definitions, based on simple, conceptually appealing interpretations of the notion of risk aversion. We then show that when cast in quantitative form these ordinal definitions coincide with the classical Arrow-Pratt definition (once the latter is defined with respect to the appropriate units), thus providing a conceptual foundation for the classical definition. The implications of the theory are discussed, including, in particular, implications for the understanding of insurance. The entire study is within the expected utility framework.
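For reference, the classical Arrow-Pratt definition mentioned above is, in its standard textbook form, the following coefficient of a twice-differentiable utility function u (the abstract's point about "appropriate units" concerns how this quantity is normalized, which is not reproduced here):

```latex
% Arrow-Pratt coefficients of absolute and relative risk aversion
\[
  A(x) \;=\; -\,\frac{u''(x)}{u'(x)},
  \qquad
  R(x) \;=\; -\,\frac{x\,u''(x)}{u'(x)}.
\]
```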
Building on the work of Nash, Harsanyi, and Shapley, we define a cooperative solution for strategic games that takes account of both the competitive and the cooperative aspects of such games. We prove existence in the general (NTU) case and uniqueness in the TU case. Our main result is an extension of the definition and the existence and uniqueness theorems to stochastic games - discounted or undiscounted.
Probability estimation is an essential cognitive function in perception, motor control, and decision making. Many studies have shown that when making decisions in a stochastic operant conditioning task, people and animals behave as if they underestimate the probability of rare events. It is commonly assumed that this behavior is a natural consequence of estimating a probability from a small sample, also known as sampling bias. The objective of this paper is to challenge this common lore. We show that, in fact, probabilities estimated from a small sample can lead to behaviors that will be interpreted as underestimating or as overestimating the probability of rare events, depending on the cognitive strategy used. Moreover, this sampling bias hypothesis makes the implausible prediction that minute differences in the values of the sample size or the underlying probability will determine whether rare events will be underweighted or overweighted. We discuss the implications of this sensitivity for the design and interpretation of experiments. Finally, we propose an alternative sequential learning model with a resetting of initial conditions for probability estimation, and show that this model predicts the experimentally observed robust underweighting of rare events.
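A minimal sketch of the kind of sequential estimator alluded to in the last sentence: a leaky running estimate of the event probability that is occasionally reset to an initial condition. The learning rate, the reset rule, and the parameter names are illustrative assumptions, not the authors' exact specification.

```python
import numpy as np

def estimate_probability(events, alpha=0.1, p0=0.5, reset_prob=0.02, rng=None):
    """Sequentially estimate the probability of a binary event.

    events     : iterable of 0/1 outcomes.
    alpha      : learning rate of the leaky integrator (delta rule).
    p0         : initial condition to which the estimate is reset.
    reset_prob : per-trial probability of resetting the estimate to p0.
    Returns the trajectory of estimates (the value used before each trial).
    """
    rng = np.random.default_rng(rng)
    p_hat, trajectory = p0, []
    for x in events:
        trajectory.append(p_hat)
        if rng.random() < reset_prob:
            p_hat = p0                      # resetting of initial conditions
        p_hat += alpha * (x - p_hat)        # delta-rule update toward the outcome
    return np.array(trajectory)
```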
Sergiu Hart, Ilan Kremer, and Motty Perry. 2015. “Evidence Games: Truth and Commitment”.
An evidence game is a strategic disclosure game in which an agent who has different pieces of verifiable evidence decides which ones to disclose and which ones to conceal, and a principal chooses an action (a "reward"). The agent's preference is the same regardless of his information (his "type"): he always prefers the reward to be as high as possible, whereas the principal prefers the reward to fit the agent's type. We compare the setup where the principal chooses the action only after seeing the disclosed evidence to the setup where the principal can commit ahead of time to a reward policy (the latter is the standard mechanism-design setup). The main result is that under natural conditions on the truth structure of the evidence, the two setups yield the same equilibrium outcome.
Feasible elimination procedures (Peleg, 1978) play a central role in constructing social choice functions which have the following property: in the associated game form, for any preference profile there exists a strong Nash equilibrium resulting in the sincere outcome. In this paper we provide an axiomatic characterization of the social choice correspondence resulting from applying feasible elimination procedures. The axioms are anonymity, Maskin monotonicity, and independent blocking.
We consider the problem of implementation in models of independent private values in which the valuation an agent attributes to a particular alternative is a function from a multidimensional Euclidean space to the real line. We first consider implementation by standard mechanisms, that include a decision rule and a profile of personal transfers. We present impossibility results on the implementation of decision rules that assign different outcomes to profiles of signals that result in the same profile of valuations. We then consider implementation by extended mechanisms that include, in addition to a decision rule and a profile of personal transfers, a profile of functions that affect the arguments of the valuation functions. We show that decision rules that assign different outcomes to profiles of signals that result in the same profile of valuations can be implemented by such mechanisms.
From drop-down computer menus to department-store aisles, people in everyday life often choose from simultaneous displays of products or options. Studies of position effects in such choices show seemingly inconsistent results. For example, in restaurant choice, items enjoy an advantage when placed at the beginning or end of the menu listings, but in multiple-choice tests, answers are more popular when placed in the middle of the offered list. When reaching for a bottle on a supermarket shelf, bottles in the middle of the display are more popular. But on voting ballots, first is the most advantageous position. Some of the effects are quite sensible, while others are harder to justify and can aptly be regarded as biases. This paper attempts to put position effects into a unified and coherent framework, and to account for them simply, using a small number of familiar psychological principles.
Naor (1969) was the first to observe that in a single-server memoryless queue, customers who inspect the queue length upon arrival and decide accordingly whether or not to join may join even if, from the social point of view, they are worse off. The question then is how to design the mechanism so that customers join only at queue lengths that society advises, while still minding their own selfish utility. After reviewing some existing mechanisms (some involving money transfers and some not), we suggest novel ones that do not involve money transfers. They possess some advantages over the existing ones, which we itemize.
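For readers unfamiliar with Naor's observation, the sketch below compares the individually optimal and the socially optimal joining thresholds in an observable M/M/1 queue with service reward R and waiting cost C per unit time. The welfare expression and the parameter names are the standard textbook ones, used here only as an illustrative assumption about the setup, not as a reproduction of the paper's mechanisms.

```python
import math

def mm1n_social_welfare(lam, mu, R, C, n):
    """Long-run welfare rate when customers join only if fewer than n are present
    (the system then behaves as an M/M/1/n queue)."""
    rho = lam / mu
    if abs(rho - 1.0) < 1e-12:
        probs = [1.0 / (n + 1)] * (n + 1)
    else:
        norm = (1 - rho ** (n + 1)) / (1 - rho)
        probs = [rho ** k / norm for k in range(n + 1)]
    mean_in_system = sum(k * p for k, p in enumerate(probs))
    throughput = lam * (1 - probs[n])          # customers actually served per unit time
    return throughput * R - C * mean_in_system

def thresholds(lam, mu, R, C):
    n_individual = math.floor(R * mu / C)      # a customer joins iff it pays off for him
    n_social = max(range(n_individual + 1),
                   key=lambda n: mm1n_social_welfare(lam, mu, R, C, n))
    return n_individual, n_social

# Example: the selfish threshold exceeds the socially optimal one, i.e. some
# customers join although society would prefer that they balk.
print(thresholds(lam=0.9, mu=1.0, R=10.0, C=1.0))
```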
In this work we present five axioms for a risk-order relation defined over (monetary) gambles. We then characterize an index that satisfies all these axioms, namely the probability of losing money in a gamble multiplied by the expected value of such an outcome, and prove its uniqueness. We propose to use this function as the risk of a gamble. This index is continuous, homogeneous, monotonic with respect to first- and second-order stochastic dominance, and simple to calculate. We also compare our index with some other risk indices mentioned in the literature.
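One way to read the index described above, for a finite gamble given by its outcomes and probabilities: the probability of a loss times the expected loss conditional on losing. The sign convention (reporting the magnitude of the conditional loss, so that the index is nonnegative) is an assumption made for this illustration.

```python
def risk_index(outcomes, probs):
    """Probability of losing money multiplied by the expected (magnitude of the)
    loss given that a loss occurs; returns 0.0 for gambles that never lose."""
    p_loss = sum(p for x, p in zip(outcomes, probs) if x < 0)
    if p_loss == 0:
        return 0.0
    expected_loss = sum(-x * p for x, p in zip(outcomes, probs) if x < 0) / p_loss
    return p_loss * expected_loss

# Example: win 100 with probability 0.9, lose 50 with probability 0.1
print(risk_index([100, -50], [0.9, 0.1]))   # 0.1 * 50 = 5.0
```

Note that, read this way, the index is homogeneous (scaling all outcomes by a factor scales the index by the same factor) and easy to compute, consistent with the properties listed in the abstract.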
It is a commonly held intuition that increasing punishment leads to less crime. Let's move our glance from the punishment for the crime itself to the punishment for the attempt to commit a crime, or for the threat to carry it out. We'll argue that the greater the punishment for the attempt to rob, i.e. for the threat "give me your money or else…", the greater the number of robberies and attempts there will be. The punishment for the threat makes withdrawing from it more expensive for the criminal, making the relative cost of committing the crime lower. In other words, punishing the attempt turns the attempt into a commitment by the robber, while at the same time turning an incredible threat into a credible one. Therefore, the robber has a strong interest in a legal system that increases the punishment for the attempt.
Omer Edhan, Ziv Hellman, and Dana Sherill-Rofe. 2015. “Sex and Portfolio Investment”.
We attempt to explain why sex is nearly ubiquitous when asexual reproduction is ostensibly more efficient than sexual reproduction. From the perspective of a genetic allele, each individual bearing that allele is akin to a stock share yielding dividends equal to that individual's number of offspring, and the totality of individuals bearing the allele is its portfolio investment. Alleles compete over portfolio growth, and evolutionary reproduction strategies are essentially on-line learning algorithms seeking improved portfolio growth, with sexual reproduction a goal-directed algorithmic exploration of genotype space by sampling in each generation. The model assumes a stochastically changing environment but not weak selection. We show that in finite population models the algorithm of sexual reproduction yields, with high probability, higher expected growth than the algorithm of asexual reproduction does, proposing this as an explanation for why a majority of species reproduce sexually.
We propose to smooth out the calibration score, which measures how good a forecaster is, by combining nearby forecasts. While regular calibration can be guaranteed only by randomized forecasting procedures, we show that smooth calibration can be guaranteed by deterministic procedures. As a consequence, it does not matter if the forecasts are leaked, i.e., made known in advance: smooth calibration can nevertheless be guaranteed (while regular calibration cannot). Moreover, our procedure has finite recall, is stationary, and all forecasts lie on a finite grid. We also consider related problems: online linear regression, weak calibration, and uncoupled Nash dynamics in n-person games.
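A rough sketch of what "smoothing out" the calibration score can mean: instead of grouping outcomes only by identical forecast values, each forecast's error is averaged over nearby forecasts using a kernel weight. The triangular kernel, the bandwidth, and the aggregation by a simple average of absolute values are illustrative assumptions, not the paper's exact definitions.

```python
import numpy as np

def smooth_calibration_score(forecasts, outcomes, bandwidth=0.1):
    """Kernel-smoothed calibration error of a sequence of probability forecasts.

    forecasts : array of probabilities in [0, 1].
    outcomes  : array of realized 0/1 outcomes.
    Each forecast's error (outcome - forecast) is averaged over nearby forecasts,
    weighted by a triangular kernel; the score aggregates the absolute averages.
    """
    p = np.asarray(forecasts, dtype=float)
    a = np.asarray(outcomes, dtype=float)
    # kernel weight between every pair of forecast values (1 on the diagonal)
    w = np.maximum(0.0, 1.0 - np.abs(p[:, None] - p[None, :]) / bandwidth)
    smoothed_error = (w @ (a - p)) / w.sum(axis=1)
    return float(np.mean(np.abs(smoothed_error)))
```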
We consider the class of two-person zero-sum allocation games known as Captain Lotto games (Hart 2014). These are Colonel Blotto type games in which the players have capacity constraints. We show that the players' optimal strategies are unique in most cases.
2014
Todd R. Kaplan and Shmuel Zamir. 2014. “Advances in Auctions”.
As a selling mechanism, auctions have acquired a central position in the free market economy all over the globe. This development has deepened, broadened, and expanded the theory of auctions in new directions. This chapter is intended as a selective update of some of the developments and applications of auction theory in the two decades since Wilson (1992) wrote the previous Handbook chapter on this topic.
A Lotto game is a two-person zero-sum game in which each player chooses a distribution on nonnegative real numbers with a given expectation, so as to maximize the probability that his realized choice is higher than his opponent's. These games arise in various competitive allocation setups (e.g., contests, research and development races, political campaigns, Colonel Blotto games). A Captain Lotto game is a Lotto game with caps, which are upper bounds on the numbers that may be chosen. First, we solve the Captain Lotto games. Second, we show how to reduce all-pay auctions to simpler games, called expenditure games, using the solution of the corresponding Lotto games. As a particular application we solve all-pay auctions with unequal caps, which yield a significant increase in the seller's revenue (or, equivalently, the players' efforts).
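To make the Lotto game definition concrete, the sketch below estimates the payoff P(X > Y) - P(Y > X) for two finite-support strategies with a common expectation. The particular strategies and the Monte-Carlo evaluation are purely illustrative; they are not taken from the paper.

```python
import numpy as np

def lotto_payoff(support_x, probs_x, support_y, probs_y, n=200_000, rng=0):
    """Estimate P(X > Y) - P(Y > X) when X and Y are drawn independently
    from the two given finite-support distributions."""
    rng = np.random.default_rng(rng)
    x = rng.choice(support_x, size=n, p=probs_x)
    y = rng.choice(support_y, size=n, p=probs_y)
    return float(np.mean(x > y) - np.mean(y > x))

# Two strategies with the same expectation 1: a sure payoff of 1 versus a
# 50/50 lottery over 0 and 2 (each side wins half the time, so the payoff is ~0).
print(lotto_payoff([1.0], [1.0], [0.0, 2.0], [0.5, 0.5]))
```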
Drawing intuition from a (physical) hydraulic system, we present a novel framework, constructively showing the existence of a strong Nash equilibrium in resource selection games with nonatomic players, the coincidence of strong equilibria and Nash equilibria in such games, and the invariance of the cost of each given resource across all Nash equilibria. Our proofs allow for explicit calculation of Nash equilibrium and for explicit and direct calculation of the resulting (invariant) costs of resources, and do not hinge on any fixed-point theorem, on the Minimax theorem or any equivalent result, on the existence of a potential, or on linear programming. A generalization of resource selection games, called resource selection games with I.D.-dependent weighting, is defined, and the results are extended to this family, showing that while resource costs are no longer invariant across Nash equilibria in games of this family, they are nonetheless invariant across all strong Nash equilibria, drawing a novel fundamental connection between group deviation and I.D.-congestion. A natural application of the resulting machinery to a large class of constraint-satisfaction problems is also described.
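The hydraulic intuition can be illustrated with a direct computation: with a unit mass of nonatomic players and increasing affine resource costs, the equilibrium "water level" is the common cost of all used resources, and it can be found by a simple search. The affine cost form and the bisection routine below are illustrative assumptions; the paper's framework and proofs are more general.

```python
def hydraulic_equilibrium(costs, total_mass=1.0, tol=1e-9):
    """Nonatomic resource selection with affine costs c_j(x) = a_j * x + b_j.

    In equilibrium every used resource has the same cost L (the 'water level'),
    and resource j carries load x_j = max(0, (L - b_j) / a_j).
    L is found by bisection on the total load it induces.
    """
    def load_at_level(L):
        return sum(max(0.0, (L - b) / a) for a, b in costs)

    lo = min(b for _, b in costs)
    hi = max(b for _, b in costs) + total_mass * max(a for a, _ in costs)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if load_at_level(mid) < total_mass:
            lo = mid
        else:
            hi = mid
    L = 0.5 * (lo + hi)
    return L, [max(0.0, (L - b) / a) for a, b in costs]

# Three resources; the equilibrium cost is the same on every resource that is used.
level, loads = hydraulic_equilibrium([(1.0, 0.0), (2.0, 0.5), (1.0, 2.0)])
print(level, loads)
```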
We show that feasible elimination procedures (Peleg, 1978) can be used to select k from m alternatives. An important advantage of this method is the core property: no coalition can guarantee an outcome that is preferred by all its members. We also provide an axiomatic characterization for the case k = 1, using the conditions of anonymity, Maskin monotonicity, and independent blocking. Finally, we show for any k that outcomes of feasible elimination procedures can be computed in polynomial time, by showing that the problem is computationally equivalent to finding a maximal matching in a bipartite graph.
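The polynomial-time claim at the end rests on matching in a bipartite graph being efficiently computable; below is a compact augmenting-path routine for maximum bipartite matching, included only as a reminder of that standard primitive. The reduction from feasible elimination procedures to matching is the paper's contribution and is not reproduced here.

```python
def max_bipartite_matching(adj):
    """Maximum matching via augmenting paths (Kuhn's algorithm).

    adj[u] lists the right-side vertices adjacent to left-side vertex u.
    Returns a dict mapping each matched right vertex to its left partner.
    """
    match_right = {}                      # right vertex -> left vertex

    def try_augment(u, visited):
        for v in adj[u]:
            if v in visited:
                continue
            visited.add(v)
            # v is free, or its current partner can be re-matched elsewhere
            if v not in match_right or try_augment(match_right[v], visited):
                match_right[v] = u
                return True
        return False

    for u in adj:
        try_augment(u, set())
    return match_right

# Example: 3 left vertices, right vertices 'a'..'c'; a perfect matching exists.
print(max_bipartite_matching({0: ['a', 'b'], 1: ['a'], 2: ['b', 'c']}))
```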