2010
Babichenko, Yakov.
“How Long To Pareto Efficiency?”.
Discussion Papers 2010. Web.
Abstract: We consider uncoupled dynamics (i.e., dynamics where each player knows only his own payoff function) that reach Pareto efficient and individually rational outcomes. We prove that the number of periods it takes is in the worst case exponential in the number of players.
Peretz, Ron.
“Learning Cycle Length Through Finite Automata”.
Discussion Papers 2010. Web.
Abstract: We study the space-and-time automaton-complexity of the CYCLE-LENGTH problem. The input is a periodic stream of bits whose cycle length is bounded by a known number n. The output, a number between 1 and n, is the exact cycle length. We also study a related problem, CYCLE-DIVISOR, in which the output is a large divisor of the cycle length: a number k >> 1 that divides the cycle length, or (in case the cycle length is small) the cycle length itself. The complexity is measured in terms of the SPACE, the logarithm of the number of states in an automaton that solves the problem, and the TIME required to reach a terminal state. We analyze the worst input against a deterministic (pure) automaton, and against a probabilistic (mixed) automaton. In the probabilistic case we require that the probability of computing a correct output is arbitrarily close to one. We establish the following results:
- CYCLE-DIVISOR can be solved in deterministic SPACE o(n) and TIME O(n).
- CYCLE-LENGTH cannot be solved in deterministic SPACE × TIME smaller than Ω(n^2).
- CYCLE-LENGTH can be solved in probabilistic SPACE o(n) and TIME O(n).
- CYCLE-LENGTH can be solved in deterministic SPACE O(nL) and TIME O(n/L), for any positive L < 1.
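The deterministic side of the problem has a simple brute-force baseline (not the paper's automaton construction): buffer the first 2n bits and return the smallest shift under which the window matches itself; by the Fine and Wilf periodicity lemma that shift is exactly the cycle length. A minimal Python sketch, assuming the stream is given as an iterator of bits:

```python
from itertools import cycle, islice

def cycle_length(stream, n):
    """Smallest period of a bit stream known to be periodic with period <= n.
    Reads 2n bits; the smallest self-matching shift of that window is the
    exact cycle length (Fine-Wilf).  O(n) space, O(n^2) comparisons."""
    buf = list(islice(stream, 2 * n))
    for d in range(1, n + 1):
        if all(buf[i] == buf[i + d] for i in range(len(buf) - d)):
            return d
    raise ValueError("stream is not periodic with period <= n")

print(cycle_length(cycle([0, 1, 1]), n=5))  # -> 3
```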
Halbersberg, Yoed.
“Liability Standards For Multiple-Victim Torts: A Call For A New Paradigm”.
Discussion Papers 2010. Web.
Abstract: Under the conventional approach in torts, liability for an accident is decided by comparing the injurer's costs of precautions with those of the victim, and, under the negligence rule, also with the expected magnitude of harm. In multiple-victim cases, the current paradigm holds that courts should determine liability by comparing the injurer's costs of precautions with the victims' aggregate costs and with their aggregate harm. This aggregative risk-utility test supposedly results in the imposition of liability on the least-cost avoiders of the accident, and is therefore assumed efficient. However, this paradigm neglects the importance of the normal differences between tort victims. When victims are heterogeneous with regard to their expected harm or costs of precaution, basing the liability decision on the aggregate amounts may be incorrect, causing over-deterrence in some cases and under-deterrence and dilution of liability in others. A new paradigm is therefore needed. This Article demonstrates how aggregate liability may violate aggregate efficiency, and concludes that decisions based upon aggregate amounts are inappropriate when the victims are heterogeneous, as they typically are in real life. The Article then turns to an exploration of an alternative to the aggregative risk-utility test, and argues for a legal rule that would combine restitution for precaution costs, plus an added small "bonus," with the sampling of victims' claims.
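A toy numerical illustration of the Article's point, with entirely hypothetical figures: the aggregative test can name the injurer the least-cost avoider even though a victim-by-victim comparison shows the victims can avoid or bear the harm more cheaply.

```python
# Hypothetical numbers, chosen only to illustrate the argument above.
injurer_cost = 100                  # injurer's cost of preventing all harm
victims = [                         # (precaution cost, expected harm)
    (10, 80),                       # victim A: cheap self-protection
    (95, 60),                       # victim B: precaution dearer than harm
]

# Aggregative risk-utility test: injurer vs. aggregate cost and harm.
agg_cost = sum(c for c, _ in victims)          # 105
agg_harm = sum(h for _, h in victims)          # 140
injurer_liable = injurer_cost < min(agg_cost, agg_harm)        # True

# Victim-by-victim benchmark: each victim takes precaution or bears
# the harm, whichever is cheaper.
victims_total = sum(min(c, h) for c, h in victims)             # 70
efficient_avoider = "injurer" if injurer_cost < victims_total else "victims"

print(injurer_liable, efficient_avoider)  # True victims -> over-deterrence
```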
Sheshinski, Eytan.
“Limits On Individual Choice”.
Discussion Papers 2010. Web.
Abstract: Individuals behave with choice probabilities defined by a multinomial logit (MNL) probability distribution over a finite number of alternatives which includes utilities as parameters. The salient feature of the model is that probabilities depend on the choice-set, or domain. Expanding the choice-set decreases the probabilities of alternatives included in the original set, providing positive probabilities to the added alternatives. The wider probability 'spread' causes some individuals to further deviate from their higher valued alternatives, while others find the added alternatives highly valuable. For a population with diverse preferences, there exists a subset of alternatives, called the optimum choice-set, which balances these considerations to maximize social welfare. The paper analyses the dependence of the optimum choice-set on a parameter which specifies the precision of individuals' choice ('degree of rationality'). It is proved that for high values of this parameter the optimum choice-set includes all alternatives, while for low values it is a singleton. Numerical examples demonstrate that for intermediate values, the size and possible nesting of the optimum choice-sets is complex. Governments have various means (defaults, tax/subsidy) to directly affect choice probabilities. This is modelled by 'probability weight' parameters. The paper analyses the structure of the optimum weights, focusing on the possible exclusion of alternatives. A binary example explores the level of 'type one' and 'type two' errors which justify the imposition of early eligibility for retirement benefits, common to social security systems. Finally, the effects of heterogeneous degrees of rationality among individuals are briefly discussed.
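A minimal sketch of the choice model itself, with beta standing in for the precision ('degree of rationality') parameter; it also shows the mechanical effect of shrinking the choice-set on the remaining alternatives' probabilities:

```python
import numpy as np

def mnl_probs(utilities, beta):
    """Multinomial-logit choice probabilities with precision parameter beta."""
    z = beta * np.asarray(utilities, dtype=float)
    z -= z.max()                     # stabilize the exponentials
    p = np.exp(z)
    return p / p.sum()

u = [1.0, 0.8, 0.3]
print(mnl_probs(u, beta=2.0))        # probabilities over all 3 alternatives
print(mnl_probs(u[:2], beta=2.0))    # dropping one raises the others' shares
```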
Bar-Hillel, Maya.
“Maya Bar-Hillel”.
Discussion Papers 2010. Web.
Abstract: Scientists try to find out the truth about our world. Judges in a court of law try to find out the truth about the target events in the indictment. What are the similarities, and what are the differences, in the procedures that govern the search for truth in these two systems? In particular, why are quantitative tools the hallmark of science, whereas in courts they are rarely used, and when used, are prone to error? (In Hebrew)
Moses Shayo and Alon Harel.
“Non-Consequentialist Voting”.
Discussion Papers 2010. Web.
Abstract: Standard theory assumes that voters' preferences over actions (voting) are induced by their preferences over electoral outcomes (policies, candidates). But voters may also have non-consequentialist (NC) motivations: they may care about how they vote even if it does not affect the outcome. When the likelihood of being pivotal is small, NC motivations can dominate voting behavior. To examine the prevalence of NC motivations, we design an experiment that exogenously varies the probability of being pivotal yet holds constant other features of the decision environment. We find a significant effect, consistent with at least 12.5% of subjects being motivated by NC concerns.
Jay Bartroff, Larry Goldstein, Yosef Rinott, and Ester Samuel-Cahn.
“On Optimal Allocation Of A Continuous Resource Using An Iterative Approach And Total Positivity”.
Discussion Papers 2010. Web.
Abstract: We study a class of optimal allocation problems, including the well-known Bomber Problem, with the following common probabilistic structure. An aircraft equipped with an amount x of ammunition is intercepted by enemy airplanes arriving according to a homogeneous Poisson process over a fixed time duration t. Upon encountering an enemy, the aircraft has the choice of spending any amount 0 ≤ y ≤ x of its ammunition.
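A minimal dynamic-programming sketch of a discretized toy variant, not the paper's continuous model: time runs in periods, an enemy arrives each period with an assumed fixed probability, and the chance of surviving an encounter when firing k units is an assumed toy function. The recursion maximizes the probability of surviving to the deadline.

```python
from functools import lru_cache

P_ENEMY = 0.3                    # per-period chance of an encounter (assumed)

def survive(k):                  # survival probability when firing k units
    return 1 - 0.5 ** (k + 1)    # assumed toy form, increasing in k

@lru_cache(maxsize=None)
def value(x, t):
    """Maximal probability of surviving t more periods with x units left."""
    if t == 0:
        return 1.0
    fight = max(survive(k) * value(x - k, t - 1) for k in range(x + 1))
    return (1 - P_ENEMY) * value(x, t - 1) + P_ENEMY * fight

print(round(value(5, 10), 4))    # survival probability from the initial state
```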
Halbersberg, Yoed.
“On The Deduction Of National Insurance Payments From Tort Victims' Claims”.
Discussion Papers 2010. Web.
Abstract: In CA 1093/07 Bachar v. Fokmann [2009] (request for additional hearing denied, 2010), the Israeli Supreme Court established a formula for calculating the deduction of NII payments from a tort victim's claim when only some of the victim's impairment is causally linked to the tortious act in question. Overall, six Supreme Court Justices have reviewed and affirmed this simple formula. However, this formula is incorrect, as it contradicts some of the most basic tort premises, ignores the way impairment is calculated, and necessarily leads to the under-compensation of the victim and to an unjust enrichment of either the tortfeasor, the National Insurance Institute, or both. This Article, therefore, calls for the adoption of a different formula that is both legally and arithmetically correct.
Bezalel Peleg, Peter Sudhölter, and José M. Zarzuelo.
“On The Impact Of Independence Of Irrelevant Alternatives”.
Discussion Papers 2010. Web.
Abstract: On several classes of n-person NTU games that have at least one Shapley NTU value, Aumann characterized this solution by six axioms: Non-emptiness, efficiency, unanimity, scale covariance, conditional additivity, and independence of irrelevant alternatives (IIA). Each of the first five axioms is logically independent of the remaining axioms, and the logical independence of IIA is an open problem. We show that for n = 2 the first five axioms already characterize the Shapley NTU value, provided that the class of games is not further restricted. Moreover, we present an example of a solution that satisfies the first five axioms and violates IIA for 2-person NTU games (N, V) with uniformly p-smooth V(N).
Marco Francesconi, Christian Ghiglino, and Motty Perry.
“On The Origin Of The Family”.
Discussion Papers 2010. Web.
Abstract: This paper presents an overlapping generations model to explain why humans live in families rather than in other pair groupings. Since most non-human species are not familial, something special must be behind the family. It is shown that the two necessary features that explain the origin of the family are uncertain paternity and overlapping cohorts of dependent children. With these two features built into our model, and under the assumption that individuals care only for the propagation of their own genes, our analysis indicates that fidelity families dominate promiscuous pair bonding, in the sense that they can achieve greater survivorship and enhanced genetic fitness. The explanation lies in the free-riding behavior that characterizes the interactions between competing fathers in the same promiscuous pair grouping. Kin ties could also be related to the emergence of the family. When we consider a kinship system in which an adult male transfers resources not just to his offspring but also to his younger siblings, we find that kin ties never emerge as an equilibrium outcome in a promiscuous environment. In a fidelity family environment, instead, kinship can occur in equilibrium and, when it does, it is efficiency enhancing in terms of greater survivorship and fitness. The model can also be used to shed light on why virtually all major world religions are centered around the importance of the family.
Alex Gershkov and Benny Moldovanu.
“Optimal Search, Learning And Implementation”.
Discussion Papers 2010. Web.
Abstract: We characterize the incentive compatible, constrained efficient policy ("second-best") in a dynamic matching environment, where impatient, privately informed agents arrive over time, and where the designer gradually learns about the distribution of agents' values. We also derive conditions on the learning process ensuring that the complete-information, dynamically efficient allocation of resources ("first-best") is incentive compatible. Our analysis reveals and exploits close, formal relations between the problem of ensuring implementable allocation rules in our dynamic allocation problems with incomplete information and learning, and the classical problem, posed by Rothschild [19], of finding optimal stopping policies for search that are characterized by a reservation price property.
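The reservation price property is easiest to see in Rothschild's static benchmark rather than in the paper's dynamic-matching model. A worked special case (an assumption of this sketch, not the paper's setting): with search cost c per draw and offers V ~ Uniform[0,1], the reservation price r solves c = E[(V - r)+] = (1 - r)^2 / 2, so r = 1 - sqrt(2c).

```python
import math

def reservation_price(search_cost):
    """Reservation price for i.i.d. offers ~ Uniform[0,1] (textbook case)."""
    return 1 - math.sqrt(2 * search_cost)

r = reservation_price(0.02)
print(round(r, 3))   # 0.8: optimal search accepts the first offer above 0.8
```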
Deniz Dizdar, Alex Gershkov, and Benny Moldovanu.
“Revenue Maximization In The Dynamic Knapsack Problem”.
Discussion Papers 2010. Web.
Abstract: We analyze maximization of revenue in the dynamic and stochastic knapsack problem where a given capacity needs to be allocated by a given deadline to sequentially arriving agents. Each agent is described by a two-dimensional type that reflects his capacity requirement and his willingness to pay per unit of capacity. Types are private information. We first characterize implementable policies. Then we solve the revenue maximization problem for the special case where there is private information about per-unit values, but capacity needs are observable. After that we derive two sets of additional conditions on the joint distribution of values and weights under which the revenue maximizing policy for the case with observable weights is implementable, and thus optimal also for the case with two-dimensional private information. In particular, we investigate the role of concave continuation revenues for implementation. We also construct a simple policy for which per-unit prices vary with requested weight but not with time, and prove that it is asymptotically revenue maximizing when available capacity and time to the deadline both go to infinity. This highlights the importance of nonlinear as opposed to dynamic pricing.
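A minimal sketch of the complete-information benchmark behind the analysis, under an assumed two-type distribution: each period one agent arrives with a (weight, per-unit value) pair, and a seller who observes the type accepts or rejects to maximize expected revenue. Whether this benchmark survives two-dimensional private information is the paper's question and is not modeled here.

```python
from functools import lru_cache

# Assumed toy type distribution: ((weight, per-unit value), probability).
TYPES = [((1, 3.0), 0.5), ((2, 2.0), 0.5)]

@lru_cache(maxsize=None)
def V(capacity, periods):
    """Expected revenue of the optimal accept/reject policy."""
    if periods == 0 or capacity == 0:
        return 0.0
    total = 0.0
    for (w, v), p in TYPES:
        best = V(capacity, periods - 1)                           # reject
        if w <= capacity:
            best = max(best, w * v + V(capacity - w, periods - 1))  # accept
        total += p * best
    return total

print(round(V(5, 4), 3))   # value of capacity 5 with 4 arrivals to go
```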
Judith Avrahami and Yaakov Kareev.
“The Role Of Impulses In Shaping Decisions”.
Discussion Papers 2010. Web.
Abstract: This article explores the extent to which decision behavior is shaped by short-lived reactions to the outcome of the most recent decision. We inspected repeated decision-making behavior in two versions of each of two decision-making tasks, an individual task and a strategic one. By regressing behavior onto the outcomes of recent decisions, we found that the upcoming decision was well predicted by the most recent outcome alone, with the tendency to repeat a previous action being affected both by its actual outcome and by the outcomes of actions not taken. Because the goodness of predictions based on the most recent outcome did not diminish as participants gained experience with the task, we conclude that repeated decisions are continuously affected by impulsive reactions.
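A sketch of the kind of conditioning behind those regressions, on simulated data: a hypothetical agent stays with its previous action with probability 0.8 after a win and 0.3 after a loss, and both parameters are recovered from the most recent outcome alone.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 2000
wins = rng.integers(0, 2, size=T)          # outcome of each round (toy task)
choices = np.empty(T, dtype=int)
choices[0] = rng.integers(2)
for t in range(1, T):                      # impulse-driven repetition rule
    stay = rng.random() < (0.8 if wins[t - 1] else 0.3)
    choices[t] = choices[t - 1] if stay else 1 - choices[t - 1]

repeat = choices[1:] == choices[:-1]       # was the action repeated?
prev_win = wins[:-1] == 1
print("P(repeat | win)  =", round(repeat[prev_win].mean(), 2))   # ~0.8
print("P(repeat | loss) =", round(repeat[~prev_win].mean(), 2))  # ~0.3
```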
Aumann, Robert J.
“The Role Of Incentives In The World Financial Crisis”.
Discussion Papers 2010. Web.
Abstract: A lecture explaining the causes of the 2008–9 world financial crisis in terms of ordinary economic processes. The lecture was delivered at the 39th St. Gallen Symposium, University of St. Gallen, Switzerland, 8 May 2009.
Larry Goldstein, Yosef Rinott, and Marco Scarsini.
“Stochastic Comparisons Of Stratified Sampling Techniques For Some Monte Carlo Estimators”.
Discussion Papers 2010. Web.
Abstract: We compare estimators of the (essential) supremum and the integral of a function f defined on a measurable space when f may be observed at a sample of points in its domain, possibly with error. The estimators compared vary in their levels of stratification of the domain, with the result that more refined stratification is better with respect to different criteria. The emphasis is on criteria related to stochastic orders. For example, rather than compare estimators of the integral of f by their variances (for unbiased estimators), or mean square error, we attempt the stronger comparison of convex order when possible. For the supremum the criterion is based on the stochastic order of estimators. For some of the results no regularity assumptions for f are needed, while for others we assume that f is monotone on an appropriate domain.
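A minimal sketch of the variance effect for the integral estimator, comparing an unstratified Monte Carlo sample with one point per equal-width stratum (a toy integrand and a variance comparison only, weaker than the paper's convex-order results):

```python
import numpy as np

rng = np.random.default_rng(1)
f = lambda x: x ** 2                 # toy integrand on [0, 1]; integral = 1/3

def crude(n):                        # unstratified estimator
    return f(rng.random(n)).mean()

def stratified(n):                   # one uniform point per stratum [i/n, (i+1)/n)
    return f((np.arange(n) + rng.random(n)) / n).mean()

reps, n = 2000, 100
print(np.var([crude(n) for _ in range(reps)]))       # larger variance
print(np.var([stratified(n) for _ in range(reps)]))  # much smaller variance
```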
Noga Alon, Michal Feldman, Ariel D. Procaccia, and Moshe Tennenholtz.
“Strategyproof Approximation Mechanisms For Location On Networks”.
Discussion Papers 2010. Web.
Abstract: We consider the problem of locating a facility on a network, represented by a graph. A set of strategic agents have different ideal locations for the facility; the cost of an agent is the distance between its ideal location and the facility. A mechanism maps the locations reported by the agents to the location of the facility. Specifically, we are interested in social choice mechanisms that do not utilize payments. We wish to design mechanisms that are strategyproof, in the sense that agents can never benefit by lying, or, even better, group strategyproof, in the sense that a coalition of agents cannot all benefit by lying. At the same time, our mechanisms must provide a small approximation ratio with respect to one of two optimization targets: the social cost or the maximum cost. We give an almost complete characterization of the feasible truthful approximation ratio under both target functions, for deterministic and randomized mechanisms, and with respect to different network topologies. Our main results are: we show that a simple randomized mechanism is group strategyproof and gives a tight approximation ratio of 3/2 for the maximum cost when the network is a circle; and we show that no randomized SP mechanism can provide an approximation ratio better than 2-o(1) to the maximum cost even when the network is a tree, thereby matching a trivial upper bound of two.
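For intuition, the textbook line-network case (not the paper's circle or tree results): placing the facility at the median of the reported ideal points is strategyproof, because a misreport can only move the median away from the liar's ideal point, and it minimizes the social cost. A minimal sketch:

```python
import statistics

def median_mechanism(reported):
    """Facility location on a line: strategyproof and social-cost optimal."""
    return statistics.median_low(reported)

print(median_mechanism([0.1, 0.4, 0.9]))   # 0.4; no agent gains by lying
```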
David Azriel, Micha Mandel, and Yosef Rinott.
“The Treatment Versus Experimentation Dilemma In Dose-Finding Studies”.
Discussion Papers 2010. Web.
Abstract: Phase I clinical trials are conducted in order to find the maximum tolerated dose (MTD) of a given drug from a finite set of doses. For ethical reasons, these studies are usually sequential, treating patients or groups of patients with the best available dose according to the current knowledge. However, it is proved here that such designs, and, more generally, designs that concentrate on one dose from some time on, cannot provide consistent estimators for the MTD unless very strong parametric assumptions hold. We describe a family of sequential designs that treat individuals with one of the two closest doses to the estimated MTD, and prove that such designs, under general conditions, concentrate eventually on the two closest doses to the MTD and estimate the MTD consistently. It is shown that this family contains randomized designs that assign the MTD with probability that approaches 1 as the size of the experiment goes to infinity. We compare several designs by simulations, studying their performances in terms of correct estimation of the MTD and the proportion of individuals treated with the MTD.
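A simulation sketch in the spirit of that family of designs, not the paper's exact specification, with assumed toxicity probabilities: each patient is randomized between the dose whose empirical toxicity is closest to the target level and its second-closest neighbor.

```python
import numpy as np

rng = np.random.default_rng(2)
TRUE_TOX = np.array([0.05, 0.15, 0.35, 0.60])  # assumed toxicity per dose
TARGET = 0.30                                  # highest tolerated toxicity

treated = np.ones(4)          # one toxicity-free pseudo-observation per dose
toxic = np.zeros(4)
for patient in range(2000):
    est = toxic / treated                            # empirical estimates
    mtd_hat = int(np.argmin(np.abs(est - TARGET)))   # closest dose to target
    step = 1 if est[mtd_hat] <= TARGET else -1
    neighbor = min(3, max(0, mtd_hat + step))        # second-closest dose
    dose = mtd_hat if rng.random() < 0.5 else neighbor
    toxic[dose] += rng.random() < TRUE_TOX[dose]
    treated[dose] += 1

print(np.round(toxic / treated, 2))                      # settles near doses 1-2
print(int(np.argmin(np.abs(toxic / treated - TARGET))))  # estimated MTD
```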
Edith Cohen, Michal Feldman, Amos Fiat, Haim Kaplan, and Svetlana Olonetsky.
“Truth And Envy In Capacitated Allocation Games”.
Discussion Papers 2010. Web.
Abstract: We study auctions with additive valuations where agents have a limit on the number of items they may receive. We refer to this setting as capacitated allocation games. We seek truthful and envy free mechanisms that maximize the social welfare, i.e., mechanisms where agents have no incentive to lie and no agent seeks to exchange outcomes with another. In 1983, Leonard showed that VCG with Clarke Pivot payments (which is known to be truthful, individually rational, and to have no positive transfers) is also an envy free mechanism for the special case of n items and n unit capacity agents. We elaborate upon this problem and show that VCG with Clarke Pivot payments is envy free if agent capacities are all equal. When agent capacities are not identical, we show that there is no truthful and envy free mechanism that maximizes social welfare if one disallows positive transfers. For the case of two agents (and arbitrary capacities) we show a VCG mechanism that is truthful, envy free, and individually rational, but has positive transfers. We conclude with a host of open problems that arise from our work.
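Leonard's special case is easy to make concrete. A sketch, assuming SciPy's assignment solver: compute the welfare-maximizing matching of n items to n unit-capacity agents and charge each agent its Clarke pivot payment, the externality it imposes on the others.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def vcg_unit_capacity(values):
    """values[i][j] = agent i's value for item j (unit capacities)."""
    n = len(values)
    rows, cols = linear_sum_assignment(values, maximize=True)
    welfare = values[rows, cols].sum()
    payments = np.empty(n)
    for i in range(n):
        others = np.delete(values, i, axis=0)        # economy without agent i
        r, c = linear_sum_assignment(others, maximize=True)
        # Clarke pivot: others' welfare without i minus their welfare with i.
        payments[i] = others[r, c].sum() - (welfare - values[i, cols[i]])
    return dict(zip(rows.tolist(), cols.tolist())), payments

values = np.array([[3.0, 1.0], [3.0, 2.0]])
assignment, pay = vcg_unit_capacity(values)
print(assignment, pay)   # {0: 0, 1: 1}, payments [1. 0.]; no strict envy here
```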
Noam Bar-Shai, Tamar Keasar, and Avi Shmida.
“The Use Of Numerical Information By Bees In Foraging Tasks”.
Discussion Papers 2010. Web.
Abstract: The ability of invertebrates to perform complex cognitive tasks is widely debated. Bees utilize the number of landmarks en-route to their destination as cues for navigation, but their use of numerical information in other contexts has not been studied. Numerical regularity in the spatial distribution of food occurs naturally in some flowers, which contain a fixed number of nectaries. Bees that collect nectar from such flowers are expected to increase their foraging efficiency by avoiding return visits to empty nectaries. This can be achieved if bees base their flower-departure decisions on the number of nectaries they had already visited, or on other sources of information that co-vary with this number. We tested, through field observations and laboratory experiments, whether bees adapt their departure behavior to the number of available food resources. Video-recorded observations of bumblebees that visited Alcea setosa flowers with five nectaries revealed that the conditional probability of flower departure after five probings was 93%. Visit duration, the spatial attributes of the flowers and scent marks could be excluded as flower-leaving cues, while the volume of nectar collected may have guided part of the departure decisions. In the laboratory the bees foraged on two patches, each with three computer-controlled feeders, but could receive only up to two sucrose-solution rewards in each patch visit. The foragers gradually increased their tendency to leave the patches after the second reward, while the frequency of patch departure after the first reward remained constant. Patch-visit duration, nectar volume, scent marks and recurring visit sequences in a patch were ruled out as possible sources of patch-leaving information. We conclude that bumblebees distinguish among otherwise identical stimuli by their serial position in a sequence, and use this capability to increase foraging efficiency. Our findings support an adaptive role for a complicated cognitive skill in a seemingly small and simple invertebrate.
Ro'i Zultan, Maya Bar-Hillel, and Nitsan Guy.
“When Being Wasteful Is Better Than Feeling Wasteful”.
Discussion Papers 2010. Web.
Publisher's VersionAbstract"Waste not want not" expresses our culture's aversion to waste. "I could have gotten the same thing for less" is a sentiment that can diminish pleasure in a transaction. We study people's willingness to "pay" to avoid this spoiler. In one scenario, participants imagined they were looking for a rental apartment, and had bought a subscription to an apartment listing. If a cheaper subscription had been declined, respondents preferred not to discover post hoc that it would have sufficed. Specifically, they preferred ending their quest for the ideal apartment after seeing more, rather than fewer, apartments. Other scenarios produced similar results. We conclude that people may sometimes prefer to be wasteful in order to avoid feeling wasteful.