1995
Milchtaich, I. (1995).
Vector Measure Games Based on Measures with Values in an Infinite Dimensional Vector Space.
Discussion Papers, December 1995. Published in Games and Economic Behavior 24 (1998), 25-46. Retrieved from /files/dp89.pdf
Abstract: The following generalization of a theorem of Aumann and Shapley is proved: a vector measure game of the form $f \circ \mu$, where $\mu$ is a nonatomic Banach-space-valued measure of bounded variation and $f$ is a weakly continuously differentiable real-valued function defined on the closed convex hull of the range of $\mu$ such that $f(0)=0$, is in pNA. If the game is monotonic, then the conclusion holds even if $f$ is only continuous, and not differentiable, at 0. The value of the game is given by the diagonal formula. These results are used to give a new, relatively short proof of the result that, under certain conditions, a market game is in pNA.
Bar-Hillel, M., & Neter, E. (1995).
Why Are People Reluctant to Exchange Lottery Tickets?
Discussion Papers, March 1995. Published in Journal of Personality and Social Psychology 70 (1996), 17-27. Retrieved from /files/dp71.pdf
Abstract: In a series of experiments, we demonstrate that people are reluctant to exchange lottery tickets. In other words, when given a small incentive to exchange a lottery ticket with which they had just been endowed for a different one, with the same probability of winning the same prize, only about 50% choose to do so. In contrast, when given the same incentive to exchange a pen with which they had just been endowed for another pen just like it, over 90% choose to do so. We discuss – and rule out – a series of possible explanations for this effect, including: distorted subjective probabilities; fear of finding out that you gave up a winning ticket; lack of sufficient incentive (i.e., transaction cost); general confusion or "paranoia"; etc. We conclude that people will not exchange ex ante identical tokens of the same type unless the two tokens will be identical ex post as well. A lottery ticket with which one has been endowed becomes at once the status quo, or reference point, with respect to which changes are evaluated for possible gains and losses. Since losses loom larger than gains, two lottery tickets which are symmetrical before they pass into one's possession are no longer symmetrical once one of them becomes one's own.
1994
Balkenborg, D., & Winter, E. (1994).
A Necessary and Sufficient Epistemic Condition for Playing Backward Induction.
Discussion Papers, June 1994. Published in Journal of Mathematical Economics 27 (1997), 325-345. Retrieved from /files/dp48.pdf
Abstract: In an epistemic framework due to Aumann, we characterize the minimal condition on the rationality of the players that implies backward induction in perfect-information games in agent form. This condition requires each player to know that the players are rational at later, but not at earlier, decision nodes.
Krishna, V., & Morgan, J. (1994).
An Analysis of the War of Attrition and the All-Pay Auction.
Discussion Papers, August 1994. Retrieved from /files/dp56.pdf
Abstract: We study the war of attrition and the all-pay auction when players' signals are affiliated and symmetrically distributed. We (a) find sufficient conditions for the existence of symmetric monotonic equilibrium bidding strategies; and (b) examine the performance of these auction forms in terms of the expected revenue accruing to the seller. Under our conditions the war of attrition raises greater expected revenue than all other known sealed bid auction forms.
Dubey, P., & Neyman, A. (1994).
An Equivalence Principle for Perfectly Competitive Economies.
Discussion Papers, May 1994. Published in Journal of Economic Theory 75 (1997), 314-344. Retrieved from /files/dp47.pdf
Abstract: It is a striking fact that different solutions become equivalent in the setting of perfectly competitive economies. We provide an axiomatic approach to this equivalence phenomenon. A solution is viewed as a correspondence which maps each economy to a subset of its individually rational and Pareto-optimal allocations. Four axioms are placed on the correspondence: anonymity, equity, consistency and restricted continuity. It is shown that the axioms categorically determine the Walrasian correspondence. The equivalence of other solutions, such as the core or value allocations, now follows by checking that they too satisfy the axioms.
Agastya, M. (1994).
An Evolutionary Bargaining Model (revision of Discussion Paper #38).
Discussion Papers, December 1994. Retrieved from /files/dp61.pdf
Abstract: A non-negative function f defined on the class of subsets of a finite set of factors of production describes the production possibilities at each date. The problem of allocating the surplus among the factors is studied in a dynamic learning model. Representatives for the factors (called players) make wage demands naively, based on precedent and ignorant of each other's utilities for this good. A global convergence result shows that players learn to reach a core allocation (and only a core allocation) in the long run. If players make mistakes, however, only a strict subset of the core allocations are likely, i.e., stochastically stable. The main result shows that in the limit these stable allocations, for a particular set of players, converge to the allocation that maximizes the product of all the players' utilities over core allocations.
Aumann, R. J. (1994).
Backward Induction and Common Knowledge of Rationality.
Discussion Papers, December 1994. Published in Games and Economic Behavior 8 (1995), 6-19.
Abstract: We formulate precisely and prove the proposition that if common knowledge of rationality obtains in a game of perfect information, then the backward induction outcome is reached.
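The backward induction outcome referred to in this abstract can be illustrated with a minimal solver for finite perfect-information games. This is only a hedged sketch, not code from the paper; the tree encoding and the centipede-like example game are invented for illustration.

```python
# Illustrative sketch (not from the paper): computing the backward
# induction outcome of a finite perfect-information game. A node is
# either a leaf holding a payoff vector, or an internal node holding
# the player to move and a list of child nodes.

def backward_induction(node):
    """Return the payoff vector reached under backward induction."""
    if "payoffs" in node:                              # leaf node
        return node["payoffs"]
    player = node["player"]
    # Each child's value is what rational play yields from there on;
    # the mover picks the child maximizing her own payoff coordinate.
    values = [backward_induction(child) for child in node["children"]]
    return max(values, key=lambda v: v[player])

# A two-stage centipede-like example: player 0 moves, then player 1.
game = {
    "player": 0,
    "children": [
        {"payoffs": (1, 1)},                           # player 0 stops
        {"player": 1, "children": [
            {"payoffs": (0, 3)},                       # player 1 stops
            {"payoffs": (2, 2)},                       # player 1 continues
        ]},
    ],
}

print(backward_induction(game))  # → (1, 1): player 0 stops immediately
```

Anticipating that player 1 would stop at the second node, player 0 stops at once, even though continuing twice would make both better off; this is the kind of outcome the proposition above pins down under common knowledge of rationality.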
Dagan, N., & Volij, O. (1994).
Bilateral Comparisons and Consistent Fair Division Rules in the Context of Bankruptcy Problems.
Discussion Papers, June 1994. Published in International Journal of Game Theory 26 (1997), 11-26.
Abstract: We analyze the problem of extending a given bilateral principle of justice to a consistent n-creditor bankruptcy rule. Based on the bilateral principle, we build a family of binary relations on the set of creditors in order to make bilateral comparisons between them. We find that the possibility of extending a specific bilateral principle of justice in a consistent way is closely related to the quasi-transitivity of the binary relations mentioned above.
Dagan, N. (1994).
Consistency, Decentralization and the Walrasian Allocations Correspondence.
Discussion Papers, January 1994.
Abstract: In this paper we study finite-agent exchange economies. We extend the classical model by adding an imports-exports vector, which defines the market-clearing conditions of the economy. Equipped with this new definition, self-consistency properties are naturally defined. We show that the Core correspondence and the Walrasian allocations correspondence are self-consistent. In addition, we present an axiomatic characterization of the Walrasian allocations correspondence for a class of convex and smooth economies. All the axioms presented in the characterization are satisfied by the Core, except for a converse-consistency property, which can be interpreted as a requirement of decentralization.
Einy, E., Holzman, R., & Shitovitz, B. (1994).
Core and Stable Sets of Large Games Arising in Economics.
Discussion Papers, November 1994. Published in Journal of Economic Theory 68 (1996), 200-211. Retrieved from /files/dp58.pdf
Abstract: It is shown that the core of a non-atomic glove-market game which is defined as the minimum of finitely many non-atomic probability measures is a von Neumann-Morgenstern stable set. This result is used to characterize some stable sets of large games which have a decreasing-returns-to-scale property. We also study exact non-atomic glove-market games. In particular, we show that in a glove-market game which consists of the minimum of finitely many mutually singular non-atomic measures, the core is a von Neumann-Morgenstern stable set if the game is exact. We also discuss the intuitive appeal of the equivalence of the core and the stable set. We do this by employing the theory of social situations [5] and highlighting the negotiation processes that underlie these two notions.
Bornstein, G., Erev, I., & Goren, H. (1994).
The Effect of Repeated Play in the IPG and IPD Team Games.
Discussion Papers, March 1994. Published in Journal of Conflict Resolution 38 (1994), 690-707. Retrieved from /files/dp46.pdf
Abstract: Repeated interaction in intergroup conflict was studied in the context of two team games: the Intergroup Public Goods (IPG) game and the Intergroup Prisoner's Dilemma (IPD) game. The results reveal (a) a main effect for game type; subjects were twice as likely to contribute toward their group effort in the IPG game as in the IPD game, and (b) a game-type × time interaction; subjects contributed less over time in the IPD game while continuing to contribute at about the same rate in the IPG game. The second finding supports the hypothesis that subjects learn the structure of the game and adapt their behavior accordingly, and is compatible with a simple learning model (Roth & Erev, 1993) which assumes that choices that have led to good outcomes in the past are more likely to be repeated in the future. A reciprocal-cooperation hypothesis, which assumes that players make their choices contingent on the earlier choices of the other players, received little support.
Aumann, R. J., & Brandenburger, A. (1994).
Epistemic Conditions for Nash Equilibrium.
Discussion Papers, October 1994. Published in Econometrica 63 (1995), 1161-1180. Retrieved from /files/dp57.pdf
Abstract: Sufficient conditions for Nash equilibrium in an n-person game are given in terms of what the players know and believe - about the game, and about each other's rationality, actions, knowledge, and beliefs. Mixed strategies are treated not as conscious randomizations, but as conjectures, on the part of other players, as to what a player will do. Common knowledge plays a smaller role in characterizing Nash equilibrium than had been supposed. When n = 2, mutual knowledge of the payoff functions, of rationality, and of the conjectures implies that the conjectures form a Nash equilibrium. When n ≥ 3 and there is a common prior, mutual knowledge of the payoff functions and of rationality, and common knowledge of the conjectures, imply that the conjectures form a Nash equilibrium. Examples show the results to be tight.
Margalit, A. (1994).
The Ethics of Second-Order Beliefs.
Discussion Papers, March 1994.
Abstract: The questions I address in my paper are: Are people morally responsible for their beliefs? Are people's beliefs voluntary - can they be chosen and decided upon? What is the nature of the analogy between the obligation of belief and the obligation to have certain emotions and not others? Can the skeptic suspend his belief? And finally, what is the sin of the heretic? I argue that it is right and proper for beliefs to be evaluated morally even if they are not voluntary and therefore not under our control. The key to my view is the fact that human beings possess second-order as well as first-order beliefs.
Feinberg, Y. (1994).
Evolutionary Selection of an Equilibrium.
Discussion Papers, January 1994.
Abstract: We analyze the long-run behavior of a population engaged in a 2x2 evolutionary game undergoing mutation effects. We assume that the rates of mutation are exogenously and randomly determined. It is shown that if high mutation rates are possible but highly improbable, then the population evolves towards the risk dominant equilibrium (Harsanyi and Selten, 1988).
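For readers unfamiliar with the Harsanyi-Selten criterion invoked in this abstract, the risk dominant equilibrium of a symmetric 2x2 coordination game can be found by comparing deviation losses. This is an illustrative sketch only, not code from the paper; the stag-hunt payoffs are invented for the example.

```python
# Illustrative sketch (not from the paper): selecting the risk dominant
# equilibrium of a symmetric 2x2 game (Harsanyi and Selten, 1988).
# u[i][j] is a player's payoff for playing i against an opponent playing
# j; we assume (0,0) and (1,1) are both strict Nash equilibria.

def risk_dominant(u):
    # At each equilibrium, compare the product of the two players'
    # deviation losses; by symmetry each product is the square of one loss.
    loss_00 = u[0][0] - u[1][0]   # loss from deviating away from (0, 0)
    loss_11 = u[1][1] - u[0][1]   # loss from deviating away from (1, 1)
    return (0, 0) if loss_00 ** 2 >= loss_11 ** 2 else (1, 1)

# Stag hunt: coordinating on 0 (stag) pays more, but 1 (hare) is safer.
stag_hunt = [[4, 0], [3, 3]]
print(risk_dominant(stag_hunt))  # → (1, 1): the safe equilibrium
```

In this stag hunt the payoff dominant equilibrium is (0, 0), yet (1, 1) is risk dominant; the abstract's result says rare-but-possible high mutation rates drive the population to the latter.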
Ma, J. (1994).
Infinitely Repeated Rental Model with Incomplete Information.
Discussion Papers, June 1994. Published in Economics Letters 49 (1995), 261-266. Retrieved from /files/dp54.pdf
Abstract: In an infinitely repeated rental model with two types of buyer and no discounting, the set of all Nash equilibrium payoffs for the seller and the buyer is characterized.
Granot, D., Maschler, M., Owen, G., & Zhu, W. R. (1994).
The Kernel/Nucleolus of a Standard Tree Game.
Discussion Papers, March 1994. Published in International Journal of Game Theory 25 (1996), 219-244.
Abstract: In this paper we characterize the nucleolus (which coincides with the kernel) of a tree enterprise. We also provide a new algorithm to compute it, which sheds light on its structure. We show that in particular cases, including a chain enterprise, one can compute the nucleolus in O(n) operations, where n is the number of vertices in the tree.
Dagan, N. (1994).
New Characterizations of Old Bankruptcy Rules.
Discussion Papers, January 1994. Published in Social Choice and Welfare 13 (1996), 51-59.
Abstract: This paper presents axiomatic characterizations of two bankruptcy rules discussed in Jewish legal literature: the Constrained Equal Awards rule and the Contested Garment principle (the latter is defined only for two-creditor problems). A major property in these characterizations is independence of irrelevant claims, which requires that if an individual claim exceeds the total to be allocated it should be considered irrelevant.
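Both rules named in this abstract are simple enough to state computationally. The following is a hedged sketch, not code from the paper: the Constrained Equal Awards rule caps every award at a common level chosen to exhaust the estate, and the Contested Garment principle gives each of two creditors what the other concedes plus half of the contested remainder; the numeric examples are invented.

```python
# Illustrative sketch (not from the paper) of the two bankruptcy rules.

def constrained_equal_awards(estate, claims):
    """Each creditor receives min(claim, lam), where the common cap lam
    is chosen (here by bisection) so the awards sum to the estate."""
    assert 0 <= estate <= sum(claims)
    lo, hi = 0.0, float(max(claims))
    for _ in range(100):
        lam = (lo + hi) / 2
        if sum(min(c, lam) for c in claims) < estate:
            lo = lam
        else:
            hi = lam
    return [min(c, lam) for c in claims]

def contested_garment(estate, c1, c2):
    """Two-creditor rule: each creditor first gets the part of the
    estate exceeding the other's claim (the conceded part); the
    remaining contested amount is then split equally."""
    conceded_to_1 = max(estate - c2, 0)
    conceded_to_2 = max(estate - c1, 0)
    contested = estate - conceded_to_1 - conceded_to_2
    return (conceded_to_1 + contested / 2, conceded_to_2 + contested / 2)

print(constrained_equal_awards(200, [100, 200, 300]))  # ~[66.67, 66.67, 66.67]
print(contested_garment(200, 100, 300))                # → (50.0, 150.0)
```

Note how the two rules differ on the same data: with estate 200 and claims 100 and 300, Constrained Equal Awards (restricted to two creditors) would split equally, while the Contested Garment principle awards (50, 150).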
Orshan, G. (1994).
Non-Symmetric Prekernels.
Discussion Papers, December 1994.
Abstract: A "symmetry" property, either in the version of "equal treatment" or in the version of "anonymity", is one of the standard intuitively acceptable properties satisfied by most well-known solution concepts in game theory. However, there are many instances where symmetry is counterintuitive. This paper analyzes non-symmetric prekernels: solution concepts that satisfy Peleg's axioms for the prekernel [1986, 1987], with equal treatment replaced by the requirement that the solution of each 2-person game consists of a unique point. It is shown that non-symmetric prekernels do exist, and a full characterization is then provided.
Pitowsky, I. (1994).
On the Concept of the Proof in Modern Mathematics.
Discussion Papers, March 1994.
Abstract: This paper deals with the attempts to characterize the set of all proofs in a given mathematical domain such as geometry or number theory. The characterization usually takes the form of a finite list of axiom schemata and inference rules, which is thought to be complete. A related effort, which originated with Descartes, is to replace proofs - that is, reasoning about concepts and relations - by the solution of algebraic equations which are shown to be equivalent to the proofs. These formalist tendencies have always been opposed by intuitionists. I trace the dispute from Descartes and Leibniz through Kant all the way to its climax in the fifty years between the demonstration of the relative consistency of hyperbolic geometry and the discovery of Gödel's theorems. My purpose is both historical and philosophical. On the historical level, I argue that Hilbert's program was not only a foundationalist effort to secure the consistency of mathematics. It was, in addition, an internal mathematical program in the aforementioned Cartesian tradition of replacing proofs by computations. The demise of Hilbert's philosophical pretensions brought considerable and unexpected success to the mathematical program: Gödel's theorem, which shows how to replace proofs by computations in very extensive domains of mathematics, and, ultimately, the Davis-Robinson-Putnam-Matiyasevich theorem, which demonstrates, roughly, that every proof in those domains is equivalent to a solution of an algebraic (i.e. polynomial) equation. The fact that the notion of proof in number theory is indefinitely extensible (by Gödel's theorem) depends on a complete characterization of the concept of 'computation' (the Church-Turing thesis). On the philosophical level, I argue that this dependence undermines some contemporary intuitionist claims (by Weyl and Dummett) which are based on Gödel's results.
Dagan, N. (1994).
On the Least Sacrifice Principle in Taxation.
Discussion Papers, June 1994.
Abstract: Utilitarian philosophers and economists recommended that, when applying taxation programs, government should minimize the sum total of sacrifice made by individuals. This paper presents a model and an axiom system of taxation policies, in which the Least Sacrifice Principle is derived. A key axiom in our characterization is self-consistency. Other relations between self-consistency and welfare maximization, in our model and in other models, are also discussed.