
Publications

2010
Aumann, Robert J. A Response Regarding The Matter Of The Man With Three Wives. Discussion Papers 2010. Web. Publisher's Version. Abstract:
A response to criticism of the paper "On the Matter of the Man with Three Wives," Moriah 22 (1999), 98-107 (see also Rationality Center DP 102, June 1996). The Moriah paper is a non-mathematical account, written in Hebrew for the Rabbinic public, of "Game-Theoretic Analysis of a Bankruptcy Problem from the Talmud," by R. Aumann and M. Maschler, J. Econ. Th. 36 (1985), 195-213. The current response appeared in Hama'yan 50 (2010), 1-11.
Maya Bar-Hillel, Alon Maharshak, Avital Moshinsky, and Ruth Nofech. A Rose By Any Other Name: A Social-Cognitive Perspective On Poets And Poetry. Discussion Papers 2010. Web. Publisher's Version. Abstract:
Evidence, anecdotal and scientific, suggests that people treat (or are affected by) products of prestigious sources differently than those of less prestigious, or of anonymous, sources. The products which are the focus of the present study are poems, and the sources are the poets. We explore the manner in which the poet's name affects the experience of reading a poem. Study 1 establishes the effect we wish to address: a poet's reputation enhances the evaluation of a poem. Study 2 asks whether it is only the reported evaluation of the poem that is enhanced by the poet's name (as was the case for The Emperor's New Clothes), or whether the enhancement is genuine and unaware. Finding for the latter, Study 3 explores whether the poet's name changes the reader's experience of the poem, so that in a sense one is reading a different poem. We conclude that it is not so much that the attributed poem really differs from the unattributed poem, as that it is just ineffably better. The name of a highly regarded poet seems to prime quality, and the poem becomes somehow better. This is a more subtle bias than the deliberate one rejected in Study 2, but it is a bias nonetheless. Ethical implications of this kind of effect are discussed.
Lev, Omer. A Two-Dimensional Problem Of Revenue Maximization. Discussion Papers 2010. Web. Publisher's Version. Abstract:
We consider the problem of finding the mechanism that maximizes the revenue of a seller of multiple objects. This problem turns out to be significantly more complex than the case where there is only a single object (which was solved by Myerson [5]). The analysis is difficult even in the simplest case studied here, where there are two exclusive objects and a single buyer, with valuations uniformly distributed on triangular domains. We show that the optimal mechanisms are piecewise linear with either 2 or 3 pieces, and obtain explicit formulas for most cases of interest.
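For contrast with the two-object problem studied here, recall the single-object baseline the abstract alludes to (a textbook computation, not taken from this paper): with one object and one buyer whose valuation is uniform on [0, 1], the optimal mechanism is simply a posted price.

```latex
% Single object, single buyer, v ~ U[0,1] (textbook baseline, not from the paper).
% A posted price p sells with probability 1 - p, so expected revenue is
% R(p) = p(1 - p); the first-order condition gives the optimal price:
\[
  R'(p) = 1 - 2p = 0
  \;\Longrightarrow\;
  p^{*} = \tfrac{1}{2}, \qquad R(p^{*}) = \tfrac{1}{4}.
\]
```

The paper's point is that nothing this simple survives with two objects: the optimal mechanisms become piecewise linear with 2 or 3 pieces.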
Hellman, Ziv. Almost Common Priors. Discussion Papers 2010. Web. Publisher's Version. Abstract:
What happens when priors are not common? We show that for each type profile τ over a knowledge space (Ω, Π), where the state space Ω is connected with respect to the partition profile Π, we can associate a value 0 ≤ ε ≤ 1 that we term the prior distance of τ.
Arieli, Itai. Backward Induction And Common Strong Belief Of Rationality. Discussion Papers 2010. Web. Publisher's Version. Abstract:
In 1995, Aumann showed that in games of perfect information, common knowledge of rationality is consistent and entails the backward induction (BI) outcome. That work has been criticized because it uses "counterfactual" reasoning: what a player "would" do if he reached a node that he knows he will not reach, indeed that he himself has excluded by one of his own previous moves. This paper derives an epistemological characterization of BI that is outwardly reminiscent of Aumann's, but avoids counterfactual reasoning. Specifically, we say that a player strongly believes a proposition at a node of the game tree if he believes the proposition unless it is logically inconsistent with that node having been reached. We then show that common strong belief of rationality is consistent and entails the BI outcome, where, as with knowledge, the word "common" signifies strong belief, strong belief of strong belief, and so on ad infinitum. Our result is related to, though not easily derivable from, one obtained by Battigalli and Siniscalchi [7]. Their proof is, however, much deeper; it uses a full-blown semantic model of probabilities, and belief is defined as attribution of probability 1. We, in contrast, work with a syntactic model, defining belief directly by a sound and complete set of axioms, and the proof is relatively direct.
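As a concrete illustration of the backward induction outcome discussed above, here is a minimal Python sketch (the tree, payoffs, and names are illustrative assumptions, not taken from the paper):

```python
# Minimal backward induction on a finite perfect-information game tree
# (illustrative sketch; not the paper's epistemic model).

def backward_induction(node):
    """Return the payoff profile reached under backward induction.

    A leaf is {'payoffs': (u0, u1, ...)}; an internal node is
    {'player': i, 'children': [...]} where player i (0-indexed) moves.
    """
    if 'payoffs' in node:
        return node['payoffs']
    i = node['player']
    # The mover picks the child whose BI outcome maximizes his own payoff.
    return max((backward_induction(child) for child in node['children']),
               key=lambda payoffs: payoffs[i])

# Two-stage example: player 0 either takes (2, 0) or lets player 1 move.
tree = {'player': 0, 'children': [
    {'payoffs': (2, 0)},                 # player 0 takes now
    {'player': 1, 'children': [
        {'payoffs': (1, 3)},             # player 1 takes
        {'payoffs': (4, 2)},             # player 1 passes
    ]},
]}
print(backward_induction(tree))          # -> (2, 0): player 0 takes immediately
```

Player 1 would choose (1, 3) over (4, 2), so player 0, anticipating this, takes (2, 0) at the root; this is exactly the kind of "what he would do" reasoning whose epistemic underpinning the paper examines.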
Noga Alon, Yuval Emek, Michal Feldman, and Moshe Tennenholtz. Bayesian Ignorance. Discussion Papers 2010. Web. Publisher's Version. Abstract:
We quantify the effect of Bayesian ignorance by comparing the social cost obtained in a Bayesian game by agents with local views to the expected social cost of agents having global views. Both benevolent agents, whose goal is to minimize the social cost, and selfish agents, aiming at minimizing their own individual costs, are considered. When dealing with selfish agents, we consider both best and worst equilibrium outcomes. While our model is general, most of our results concern the setting of network cost sharing (NCS) games. We provide tight asymptotic results on the effect of Bayesian ignorance in directed and undirected NCS games with benevolent and selfish agents. Among our findings we expose the counter-intuitive phenomenon that "ignorance is bliss": Bayesian ignorance may substantially improve the social cost of selfish agents. We also prove that public random bits can replace the knowledge of the common prior in an attempt to bound the effect of Bayesian ignorance in settings with benevolent agents. Together, our work initiates the study of the effects of local vs. global views on the social cost of agents in Bayesian contexts.
Yaakov Malinovsky and Yosef Rinott. Best Invariant And Minimax Estimation Of Quantiles In Finite Populations. Discussion Papers 2010. Web. Publisher's Version. Abstract:
We study estimation of finite population quantiles, with emphasis on estimators that are invariant under monotone transformations of the data, and suitable invariant loss functions. We discuss non-randomized and randomized estimators, best invariant and minimax estimators, and sampling strategies relative to different classes. The combination of natural invariance of the kind discussed here with finite population sampling appears to be novel, and leads to interesting statistical and combinatorial aspects.
Hart, Sergiu. Comparing Risks By Acceptance And Rejection. Discussion Papers 2010. Web. Publisher's Version. Abstract:
Stochastic dominance is a partial order on risky assets ("gambles") that is based on the uniform preference, of all decision-makers (in an appropriate class), for one gamble over another. We modify this, first, by taking into account the status quo (given by the current wealth) and the possibility of rejecting gambles, and second, by comparing rejections that are substantive (that is, uniform over wealth levels or over utilities). This yields two new stochastic orders: wealth-uniform dominance and utility-uniform dominance. Unlike stochastic dominance, these two orders are complete: any two gambles can be compared. Moreover, they are equivalent to the orders induced by, respectively, the Aumann-Serrano (2008) index of riskiness and the Foster-Hart (2009a) measure of riskiness.
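For reference, the two indices mentioned in the last sentence are defined implicitly as follows (the standard definitions from the cited papers, restated here for convenience; g is a gamble with positive expectation and some chance of loss):

```latex
% Aumann-Serrano (2008) index R^{AS}(g) and Foster-Hart (2009) measure
% R^{FH}(g): each is the unique positive solution of its equation.
\[
  \mathbb{E}\!\left[ e^{-g / R^{AS}(g)} \right] = 1,
  \qquad
  \mathbb{E}\!\left[ \log\!\left( 1 + \frac{g}{R^{FH}(g)} \right) \right] = 0.
\]
```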
Babichenko, Yakov. Completely Uncoupled Dynamics And Nash Equilibria. Discussion Papers 2010. Web. Publisher's Version. Abstract:
A completely uncoupled dynamic is a repeated play of a game where in each period every player knows only his action set and the history of his own past actions and payoffs. One main result is that there exist no completely uncoupled dynamics with finite memory that lead to pure Nash equilibria (PNE) in almost all games possessing pure Nash equilibria. By "leading to PNE" we mean that the frequency of time periods at which some PNE is played converges to 1 almost surely. Another main result is that this is not the case when PNE is replaced by "Nash epsilon-equilibria": we exhibit a completely uncoupled dynamic with finite memory such that from some time on a Nash epsilon-equilibrium is played almost surely.
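To make "completely uncoupled" concrete, here is a hedged Python sketch of one such update rule, a generic win-stay/lose-shift heuristic with an aspiration level (illustrative only; it is not the finite-memory dynamic constructed in the paper):

```python
import random

def uncoupled_step(actions, last_action, last_payoff, aspiration):
    """One completely uncoupled update: the rule sees only the player's own
    action set, own last action, and own last payoff -- never anything about
    the other players. (Generic sketch, not the paper's construction.)"""
    if last_payoff is not None and last_payoff >= aspiration:
        return last_action             # satisfied: repeat the same action
    return random.choice(actions)      # dissatisfied: experiment at random

# Example: a player with three actions who aspires to a payoff of at least 1.
next_action = uncoupled_step(['L', 'M', 'R'], 'L', 0.4, aspiration=1.0)
```

A rule of this kind carries finite memory (one action and one payoff), which is precisely the regime covered by the impossibility result above.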
Judith Avrahami and Yaakov Kareev. Detecting Change In Partner's Preferences. Discussion Papers 2010. Web. Publisher's Version. Abstract:
Studies of the detection of change have commonly been concerned with individuals inspecting a system or a process whose characteristics were fully determined by the researcher. We, instead, study the detection of change in the preferences, and hence the behavior, of others with whom an individual interacts. More specifically, we study situations in which one's benefits are the result of the joint actions of oneself and one's partner, where at times the preferred combination is the same for both and at times it is not. In other words, what we change is the payoffs associated with the different combinations of interactive choices, and we then look at choice behavior following such a change. We find that players are extremely quick to respond to a change in the preferences of their counterparts. This responsiveness can be explained by the players' impulsive reaction to regret, if one was due, at their most recent decision.
Amir Ban and Nati Linial. The Dynamics Of Reputation Systems. Discussion Papers 2010. Web. Publisher's Version. Abstract:
Online reputation systems collect, maintain and disseminate reputations as a summary numerical score of past interactions of an establishment with its users. As reputation systems, including web search engines, gain in popularity and become a common method for people to select sought services, a dynamical system unfolds: Experts' reputation attracts the potential customers. The experts' expertise affects the probability of satisfying the customers. This rate of success in turn influences the experts' reputation. We consider here several models where each expert has an innate, constant, but unknown level of expertise and a publicly known, dynamically varying, reputation.
Edith Cohen, Michal Feldman, Amos Fiat, Haim Kaplan, and Svetlana Olonetsky. Envy-Free Makespan Approximation. Discussion Papers 2010. Web. Publisher's Version. Abstract:
We study envy-free mechanisms for scheduling tasks on unrelated machines (agents) that approximately minimize the makespan. For indivisible tasks, we put forward an envy-free poly-time mechanism that approximates the minimal makespan to within a factor of O(log m), where m is the number of machines. We also show a lower bound of Ω(log m / log log m). This improves the recent result of Mu'alem [22], who gives an upper bound of (m + 1)/2 and a lower bound of 2 - 1/m. For divisible tasks, we show that there always exists an envy-free poly-time mechanism with optimal makespan. Finally, we demonstrate how our mechanism for envy-free makespan minimization can be interpreted as a market clearing problem.
Jay Bartroff and Ester Samuel-Cahn. The Fighter Problem: Optimal Allocation Of A Discrete Commodity. Discussion Papers 2010. Web. Publisher's Version. Abstract:
The Fighter problem with discrete ammunition is studied. An aircraft (fighter) equipped with n anti-aircraft missiles is intercepted by enemy airplanes, the appearance of which follows a homogeneous Poisson process with known intensity. If j of the n missiles are spent at an encounter, they destroy an enemy plane with probability a(j), where a(0) = 0 and a(j) is a known, strictly increasing concave sequence, e.g., a(j) = 1 - q^j, 0 < q < 1. If the enemy is not destroyed, the enemy shoots the fighter down with known probability 1 - u, where 0 ≤ u ≤ 1. The goal of the fighter is to shoot down as many enemy airplanes as possible during a given time period [0, T]. Let K(n, t) be an optimal number of missiles to be used at a present encounter, when the fighter has flying time t remaining and n missiles remaining. Three seemingly obvious properties of K(n, t) have been conjectured: [A] The closer to the destination, the more of the n missiles one should use; [B] the more missiles one has, the more one should use; and [C] the more missiles one has, the more one should save for possible future encounters. We show that [C] holds for all 0 ≤ u ≤ 1, that [A] and [B] hold for the "Invincible Fighter" (u = 1), and that [A] holds but [B] fails for the "Frail Fighter" (u = 0).
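A discretized value-iteration sketch for the Invincible Fighter case (u = 1) may help fix ideas; the rate, horizon, grid, and a(j) = 1 - q^j below are illustrative assumptions, not the paper's analysis:

```python
# Value iteration for the Invincible Fighter (u = 1) on a small time grid.
# Illustrative sketch only; parameter values are arbitrary assumptions.
LAM, Q, T, DT = 1.0, 0.5, 10.0, 0.01     # encounter rate, a(j) parameter, horizon, step
N = 8                                    # missiles on board
a = [1 - Q ** j for j in range(N + 1)]   # kill probability when firing j missiles

V = [0.0] * (N + 1)                      # V[n] ~ expected kills with n missiles, t = 0
for _ in range(int(T / DT)):             # walk time-to-go from 0 up to T
    newV = V[:]
    for n in range(1, N + 1):
        # At an encounter (prob. LAM*DT per step), fire the best j of n missiles.
        best = max(a[j] + V[n - j] for j in range(1, n + 1))
        newV[n] = LAM * DT * best + (1 - LAM * DT) * V[n]
    V = newV

# K(n, t): optimal number to fire at an encounter with n missiles and time T left.
K = max(range(1, N + 1), key=lambda j: a[j] + V[N - j])
print([round(v, 3) for v in V], K)
```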
Ziv Hellman and Dov Samet. How Common Are Common Priors? Discussion Papers 2010. Web. Publisher's Version. Abstract:
To answer the question in the title we vary agents' beliefs against the background of a fixed knowledge space, that is, a state space with a partition for each agent. Beliefs are the posterior probabilities of agents, which we call type profiles. We then ask what is the topological size of the set of consistent type profiles, those that are derived from a common prior (or a common improper prior in the case of an infinite state space). The answer depends on what we term the tightness of the partition profile. A partition profile is tight if in some state it is common knowledge that any increase of any single agent's knowledge results in an increase in common knowledge. We show that for partition profiles which are tight the set of consistent type profiles is topologically large, while for partition profiles which are not tight this set is topologically small.
Babichenko, Yakov. How Long To Pareto Efficiency? Discussion Papers 2010. Web. Publisher's Version. Abstract:
We consider uncoupled dynamics (i.e., dynamics where each player knows only his own payoff function) that reach Pareto efficient and individually rational outcomes. We prove that the number of periods it takes is in the worst case exponential in the number of players.
Peretz, Ron. Learning Cycle Length Through Finite Automata. Discussion Papers 2010. Web. Publisher's Version. Abstract:
We study the space-and-time automaton-complexity of the CYCLE-LENGTH problem. The input is a periodic stream of bits whose cycle length is bounded by a known number n. The output, a number between 1 and n, is the exact cycle length. We also study a related problem, CYCLE-DIVISOR, in which the output is a large number that divides the cycle length, that is, a number k >> 1 that divides the cycle length, or (in case the cycle length is small) the cycle length itself. The complexity is measured in terms of the SPACE, the logarithm of the number of states in an automaton that solves the problem, and the TIME required to reach a terminal state. We analyze the worst input against a deterministic (pure) automaton, and against a probabilistic (mixed) automaton. In the probabilistic case we require that the probability of computing a correct output is arbitrarily close to one. We establish the following results:
- CYCLE-DIVISOR can be solved in deterministic SPACE o(n) and TIME O(n).
- CYCLE-LENGTH cannot be solved in deterministic SPACE × TIME smaller than Ω(n^2).
- CYCLE-LENGTH can be solved in probabilistic SPACE o(n) and TIME O(n).
- CYCLE-LENGTH can be solved in deterministic SPACE O(nL) and TIME O(n/L), for any positive L < 1.
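For orientation, here is a brute-force Python baseline that solves CYCLE-LENGTH using O(n) bits of memory, vastly more state than the small-space automata the paper studies (the function and variable names are ours):

```python
import itertools

def cycle_length(stream, n):
    """Exact cycle length of a periodic bit stream whose period is at most n.
    Brute force: store a window of 2n bits, return its least period."""
    buf = [next(stream) for _ in range(2 * n)]
    for k in range(1, n + 1):
        if all(buf[i] == buf[i % k] for i in range(2 * n)):
            return k
    return n

stream = itertools.cycle([1, 0, 1])      # period 3
print(cycle_length(stream, n=5))         # -> 3
```

The paper's question is how much of this memory can be saved: o(n) SPACE is achievable probabilistically at TIME O(n), while deterministically the SPACE × TIME product cannot drop below order n^2.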
Halbersberg, Yoed. Liability Standards For Multiple-Victim Torts: A Call For A New Paradigm. Discussion Papers 2010. Web. Publisher's Version. Abstract:
Under the conventional approach in torts, liability for an accident is decided by comparing the injurer's costs of precautions with those of the victim, and, under the negligence rule, also with the expected magnitude of harm. In multiple-victim cases, the current paradigm holds that courts should determine liability by comparing the injurer's costs of precautions with the victims' aggregate costs and with their aggregate harm. This aggregative risk-utility test supposedly results in the imposition of liability on the least-cost avoiders of the accident, and, therefore, is assumed efficient. However, this paradigm neglects the importance of the normal differences between tort victims. When victims are heterogeneous with regard to their expected harm or costs of precaution, basing the liability decision on the aggregate amounts may be incorrect, causing in some cases over-deterrence, and in others under-deterrence and dilution of liability. A new paradigm is therefore needed. This Article demonstrates how aggregate liability may violate aggregate efficiency, and concludes that decisions based upon aggregate amounts are inappropriate when the victims are heterogeneous, as they typically are in real life. The Article then turns to an exploration of an alternative to the aggregative risk-utility test, and argues for a legal rule that would combine restitution for precaution costs, plus an added small "bonus," with the sampling of victims' claims.
Sheshinski, Eytan. Limits On Individual Choice. Discussion Papers 2010. Web. Publisher's Version. Abstract:
Individuals behave with choice probabilities defined by a multinomial logit (MNL) probability distribution over a finite number of alternatives which includes utilities as parameters. The salient feature of the model is that probabilities depend on the choice-set, or domain. Expanding the choice-set decreases the probabilities of alternatives included in the original set, providing positive probabilities to the added alternatives. The wider probability 'spread' causes some individuals to further deviate from their higher valued alternatives, while others find the added alternatives highly valuable. For a population with diverse preferences, there exists a subset of alternatives, called the optimum choice-set, which balances these considerations to maximize social welfare. The paper analyses the dependence of the optimum choice-set on a parameter which specifies the precision of individuals' choice ('degree of rationality'). It is proved that for high values of this parameter the optimum choice-set includes all alternatives, while for low values it is a singleton. Numerical examples demonstrate that for intermediate values, the size and possible nesting of the optimum choice-sets is complex. Governments have various means (defaults, tax/subsidy) to directly affect choice probabilities. This is modelled by 'probability weight' parameters. The paper analyses the structure of the optimum weights, focusing on the possible exclusion of alternatives. A binary example explores the levels of 'type one' and 'type two' errors which justify the imposition of early eligibility for retirement benefits, common to social security systems. Finally, the effects of heterogeneous degrees of rationality among individuals are briefly discussed.
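For reference, the MNL choice rule that underlies the abstract, with the precision parameter made explicit (standard notation, restated here; λ plays the role of the 'degree of rationality' and S is the choice-set):

```latex
% Multinomial logit choice probabilities with precision lambda over
% choice-set S; u_i is the utility of alternative i.
\[
  P(i \mid S) \;=\; \frac{e^{\lambda u_i}}{\sum_{j \in S} e^{\lambda u_j}},
  \qquad i \in S.
\]
% As lambda -> infinity, choice concentrates on the highest-utility
% alternative; as lambda -> 0, choice becomes uniform over S.
```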
Bar-Hillel, Maya. Discussion Papers 2010. Web. Publisher's Version. Abstract:
Scientists try to find out the truth about our world. Judges in a court of law try to find out the truth about the target events in the indictment. What are the similarities, and what are the differences, in the procedures that govern the search for truth in these two systems? In particular, why are quantitative tools the hallmark of science, whereas in courts they are rarely used, and when used, are prone to error? (In Hebrew)
Moses Shayo and Alon Harel. Non-Consequentialist Voting. Discussion Papers 2010. Web. Publisher's Version. Abstract:
Standard theory assumes that voters' preferences over actions (voting) are induced by their preferences over electoral outcomes (policies, candidates). But voters may also have non-consequentialist (NC) motivations: they may care about how they vote even if it does not affect the outcome. When the likelihood of being pivotal is small, NC motivations can dominate voting behavior. To examine the prevalence of NC motivations, we design an experiment that exogenously varies the probability of being pivotal yet holds constant other features of the decision environment. We find a significant effect, consistent with at least 12.5% of subjects being motivated by NC concerns.