Publications

2011
This paper investigates the effect of compensation of corporate personnel on their investment in new technologies. We focus on a specific corporate activity, namely corporate venture capital (CVC), describing minority equity investment by established firms in entrepreneurial ventures. The setting offers an opportunity to compare corporate investors to investment experts, the independent venture capitalists (IVCs). On average, we observe a performance gap between corporate investors and their independent counterparts. Interestingly, the performance gap is sensitive to CVCs' compensation scheme: it is the largest when CVC personnel are awarded performance pay. Not only do we study the association between incentives and performance but we also document a direct relationship between incentives and the actions managers undertake. For example, we observe disparity between the number of participants in venture capital syndicates that involve a corporate investor, and those that consist solely of IVCs. The disparity shrinks substantially, however, for a subset of CVCs that compensate their personnel using performance pay. We find a parallel pattern when analyzing the relationship between compensation and another investment practice, staging of investment. To conclude, the paper investigates the three elements of the principal-agent framework, thus providing direct evidence that compensation schemes (incentives) shape investment practices (managerial action), and ultimately investors' outcomes (performance).
Recent debates have centred on the normative influence epistemic peerage should have on the regulation of beliefs in cases of disagreement. A dominant position in this debate is that acknowledging an epistemic peer's possession of a belief contrary to one's own ought, in itself, to lead to the revision of one's doxastic commitments. In what follows I aim to challenge and rethink the notion of peerage underlying the disagreement debate, to reveal that the traditional view of peerage rests upon an idealized conception of similarity between disagreeing parties, and thus to show that the normative constraints derived from it are equally idealized. Constructively, I will suggest a commonsensical solution to the disagreement problem based on what I propose as a soft, more moderate conception of peerage.
We develop a measure for quantifying rank order of visitation in complex sequences of male-phase versus female-phase flowers. The measure shows whether female flowers are visited before male flowers, which enhances plant fitness. We apply the new method to bumble bee visitation in Digitalis purpurea and Echium vulgare and discuss our results in relation to the evolution of protandry in insect-pollinated plant species.
We find a herding tendency among both amateur and professional investors and conclude that the propensity to herd is lower in the professionals. These results are obtained both when we consider herding into individual stocks and herding into stocks in general. Herding depends on the firm's systematic risk and size, and the professionals are less sensitive to these variables. The differences between the amateurs and the professionals may be attributable to the latter's superior financial training. Most of the results are consistent with the theory that herding is information-based. We also find that the herding behavior of the two groups is a persistent phenomenon, and that it is positively and significantly correlated with stock market returns' volatility. Finally, herding, mainly by amateurs, causes market volatility in the Granger causality sense.
We provide a new characterization of implementability of reduced form mechanisms in terms of straightforward second-order stochastic dominance. In addition, we present a simple proof of Matthews' (1984) conjecture, proved by Border (1991), on implementability.
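For a concrete (and purely illustrative) rendering of the feasibility question at issue: Border's theorem for a symmetric single-object auction with n i.i.d. bidders and finitely many types says that an interim winning-probability vector q is implementable iff n·Σ_{t∈A} q(t)f(t) ≤ 1 − (1 − F(A))^n for every set A of types. The brute-force checker below is our own sketch of that condition, not the paper's construction:

```python
# Sketch: check Border's feasibility condition for a symmetric single-object
# auction with n i.i.d. bidders and finitely many types (illustrative setup).
# q[t] is the interim winning probability of type t, f[t] its probability.
from itertools import combinations

def border_feasible(q, f, n):
    """True iff n * sum_{t in A} q[t]*f[t] <= 1 - (1 - F(A))**n for all A."""
    types = range(len(q))
    for r in range(1, len(q) + 1):
        for A in combinations(types, r):
            mass = sum(f[t] for t in A)               # F(A)
            lhs = n * sum(q[t] * f[t] for t in A)     # expected wins by A-types
            if lhs > 1 - (1 - mass) ** n + 1e-12:     # small float tolerance
                return False
    return True
```

As a sanity check, the interim probabilities of a second-price auction with two bidders and two equally likely types (low wins a tie-break half the time, so q = (0.25, 0.75)) satisfy the condition with equality on the binding sets, while an overdemanding q = (0.5, 1.0) violates it.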
Since its inception, psychology has studied position effects. But the position was a temporal one in sequential presentation, and the dependent variables related to memory and learning. This paper attempts to survey position effects when position is spatial (namely …
Amir Ban and Nati Linial. 2011. “Market Share Indicates Quality”. Publisher's Version Abstract
Market share and quality, or customer satisfaction, go hand in hand. Yet the inference that higher market share indicates higher quality is seldom made. The skepticism is in part fueled by elitism, the association of mass popularity with lower quality, and by cynicism, ascribing market leadership to an entrenched position. We find that though such skepticism is often justified, it is correct to make a Bayesian inference that the product with the higher market share has the better quality under rather tame assumptions.
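The kind of Bayesian inference involved can be illustrated with a deliberately simple toy model of our own (not the paper's): two products of unknown, independent quality, and a representative consumer who picks the better product with probability above one half, so that market leadership is a noisy signal of quality.

```python
# Toy sketch (our assumptions, not the paper's model): each product is
# independently high quality with probability P_HIGH; a representative
# consumer buys the better product with probability P_BUY_BETTER > 1/2,
# so P(A is the market leader) equals A's expected share.
from itertools import product

P_HIGH = 0.5          # prior that a product is high quality (assumption)
P_BUY_BETTER = 0.7    # chance the consumer picks the better product (assumption)

def posterior_better_given_leader():
    """P(leader has strictly higher quality | it has the larger share)."""
    num = den = 0.0
    for qa, qb in product([0, 1], repeat=2):  # 0 = low, 1 = high quality
        prior = (P_HIGH if qa else 1 - P_HIGH) * (P_HIGH if qb else 1 - P_HIGH)
        if qa > qb:
            p_a_leads = P_BUY_BETTER
        elif qa < qb:
            p_a_leads = 1 - P_BUY_BETTER
        else:
            p_a_leads = 0.5                   # equal quality: coin flip
        den += prior * p_a_leads
        if qa > qb:
            num += prior * p_a_leads
    return num / den
```

In this toy world the prior probability that product A is strictly better is 0.25, while the posterior after observing that A leads rises to 0.35, so leadership is indeed evidence of quality.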
In matching markets the number of blocking pairs is often used as a criterion to compare matchings. We argue that this criterion lacks an economic interpretation: In many circumstances it will neither reflect the expected extent of partner changes, nor will it capture the satisfaction of the players with the matching. As an alternative, we set up two principles which single out a particularly "disruptive" subcollection of blocking pairs. We propose to take the cardinality of that subset as a measure to compare matchings. This cardinality has an economic interpretation: The subset is a justified objection against the given matching according to a bargaining set characterization of the set of stable matchings. We prove multiple properties relevant for a workable measure of comparison.
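The criterion the paper argues against can be made concrete with a short sketch (our own illustration) that enumerates the blocking pairs of a matching in a standard marriage market:

```python
# Sketch: list the blocking pairs of a complete matching in a marriage
# market. prefs_m[m] / prefs_w[w] list potential partners best-to-worst;
# matching maps each man to his assigned woman.
def blocking_pairs(matching, prefs_m, prefs_w):
    rank_m = {m: {w: i for i, w in enumerate(p)} for m, p in prefs_m.items()}
    rank_w = {w: {m: i for i, m in enumerate(p)} for w, p in prefs_w.items()}
    inverse = {w: m for m, w in matching.items()}   # woman -> her partner
    pairs = []
    for m in prefs_m:
        for w in prefs_w:
            if matching[m] == w:
                continue
            m_prefers = rank_m[m][w] < rank_m[m][matching[m]]
            w_prefers = rank_w[w][m] < rank_w[w][inverse[w]]
            if m_prefers and w_prefers:             # both gain by deviating
                pairs.append((m, w))
    return pairs
```

For example, with both men ranking w1 first and both women ranking m2 first, the matching {m1–w1, m2–w2} has the single blocking pair (m2, w1), while {m1–w2, m2–w1} is stable (no blocking pairs).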
Maskin and Riley (2003) and Lebrun (2006) prove that the Bayes-Nash equilibrium of first-price auctions is unique. This uniqueness requires the assumption that a buyer never bids above his value. We demonstrate that, in asymmetric first-price auctions (with or without a minimum bid), the relaxation of this assumption results in additional equilibria that are "substantial." Although in each of these additional equilibria no buyer wins with a bid above his value, the allocation of the object and the selling price may vary among the equilibria. Furthermore, we show that such phenomena can only occur under asymmetry in the distributions of values.
Maya Bar-Hillel. 2011. “The New Unconscious”. Publisher's Version Abstract
Recent research in psychology, especially that called "The New Unconscious", is discovering strange and unintuitive phenomena, some of which raise interesting challenges for the law. This paper discusses some of these challenges. For example, if much of our mental life occurs out of our awareness and control, and yet is subject to easy external manipulation, what implications does this have for holding defendants responsible for their deeds? For that matter, what implications does this have for trusting judges to judge and act as they should, and would, if their own mental processes were fully conscious and controlled? Some provocative ideas are suggested, such as how to make prison terms shorter and more deterring at the same time; assisting judges in overcoming inconsistency and biases; etc.
"Very small but cumulated decreases in food intake may be sufficient to have significant effects, even erasing obesity over a period of years" (Rozin et al., 2011). In two studies, one a lab study and the other a real-world study, we examine the effect of manipulating the position of different foods on a restaurant menu. Items placed at the beginning or the end of the list of their category options were up to twice as popular as when they were placed in the center of the list. Given this effect, placing healthier menu items at the top or bottom of item lists and less healthy ones in their center (e.g., sugared drinks vs. calorie-free drinks) should result in some increase in favor of healthier food choices.
We consider the classical problem of selecting the better of two treatments in clinical trials with binary response. The target is to find the design that maximizes the power of the relevant test. Many papers use a normal approximation to the power function and claim that the Neyman allocation, which assigns subjects to treatment groups according to the ratio of the responses' standard deviations, should be used. As the standard deviations are unknown, an adaptive design is often recommended. The asymptotic justification of this approach is arguable, since it uses the normal approximation in tails where the error in the approximation is larger than the estimated quantity. We consider two different approaches to optimality of designs, related to the Pitman and Bahadur definitions of relative efficiency of tests. We prove that the optimal allocation according to the Pitman criterion is the balanced allocation, and that the optimal allocation according to the Bahadur approach depends on the unknown parameters. Exact calculations reveal that the optimal allocation according to Bahadur is often close to the balanced design, and the powers of both are comparable to the Neyman allocation for small sample sizes and are generally better for large experiments. Our findings have important implications for the design of experiments, as the balanced design is proved to be optimal or close to optimal, and the need for the complications involved in following an adaptive design for the purpose of increasing the power of tests is therefore questionable.
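The allocations being compared can be made concrete with a small exact calculation of our own (illustrative only; the response rates and sample size are made up): enumerate all binomial outcomes of a tiny trial and compute the exact power of the one-sided pooled z-test under a balanced split versus the Neyman split n1/n2 = sd1/sd2.

```python
# Sketch: exact power of the one-sided two-sample z-test for proportions,
# computed by enumerating all binomial outcomes of a small trial, under a
# balanced allocation versus the Neyman allocation (illustrative parameters).
from math import comb, sqrt

def exact_power(p1, p2, n1, n2, z_crit=1.645):
    power = 0.0
    for x1 in range(n1 + 1):
        for x2 in range(n2 + 1):
            ph1, ph2 = x1 / n1, x2 / n2
            pooled = (x1 + x2) / (n1 + n2)
            se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
            if se > 0 and (ph1 - ph2) / se > z_crit:      # test rejects here
                power += (comb(n1, x1) * p1**x1 * (1 - p1)**(n1 - x1)
                          * comb(n2, x2) * p2**x2 * (1 - p2)**(n2 - x2))
    return power

p1, p2, total = 0.8, 0.4, 20                              # made-up trial
sd1, sd2 = sqrt(p1 * (1 - p1)), sqrt(p2 * (1 - p2))
n1_neyman = round(total * sd1 / (sd1 + sd2))              # n1/n2 = sd1/sd2
balanced = exact_power(p1, p2, total // 2, total - total // 2)
neyman = exact_power(p1, p2, n1_neyman, total - n1_neyman)
```

For small trials like this one the two exact powers come out close to each other, in line with the abstract's message that the balanced design is competitive.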
Do participants bring their own priors to an experiment? If so, do they share the same priors as the researchers who design the experiment? In this article, we examine the extent to which self-generated priors conform to experimenters' expectations by explicitly asking participants to indicate their own priors in estimating the probability of a variety of events. We find in Study 1 that despite being instructed to follow a uniform distribution, participants appear to have used their own priors, which deviated from the given instructions. Using subjects' own priors allows us to account better for their responses rather than merely to test the accuracy of their estimates. Implications for the study of judgment and decision making are discussed.
Yosef Rinott, Marco Scarsini, and Yaming Yu. 2011. “Probability Inequalities for a Gladiator Game”. Publisher's Version Abstract
Based on a model introduced by Kaminsky, Luks, and Nelson (1984), we consider a zero-sum allocation game called the Gladiator Game, where two teams of gladiators engage in a sequence of one-to-one fights in which the probability of winning is a function of the gladiators' strengths. Each team's strategy consists of the allocation of its total strength among its gladiators. We find the Nash equilibria of the game and compute its value. To do this, we study interesting majorization-type probability inequalities concerning linear combinations of Gamma random variables.
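A Monte Carlo sketch of one simple variant of the game (the bout rule and the winner-keeps-his-strength convention below are our simplifying assumptions, not necessarily the paper's exact model): a bout between gladiators of strengths a and b is won by the first with probability a / (a + b), and teams fight until one side is eliminated.

```python
# Monte Carlo sketch of a simple gladiator-game variant (our assumptions):
# teams send gladiators one at a time; the winner of each bout keeps his
# strength and faces the opposing team's next gladiator.
import random

def team_a_wins(team_a, team_b, rng):
    a, b = list(team_a), list(team_b)
    i = j = 0
    while i < len(a) and j < len(b):
        if rng.random() < a[i] / (a[i] + b[j]):
            j += 1            # team B's current gladiator is eliminated
        else:
            i += 1            # team A's current gladiator is eliminated
    return j == len(b)        # A wins iff all of B was eliminated

def win_prob(team_a, team_b, trials=20000, seed=0):
    rng = random.Random(seed)
    return sum(team_a_wins(team_a, team_b, rng) for _ in range(trials)) / trials
```

By symmetry, two identical teams win with probability one half; a single gladiator of strength 3 beats a single opponent of strength 1 three quarters of the time, and the simulation recovers both.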
Reinforcement learning in complex natural environments is a challenging task because the agent should generalize from the outcomes of actions taken in one state of the world to future actions in different states of the world. The extent to which human experts find the proper level of generalization is unclear. Here we show, using the sequences of field goal attempts made by professional basketball players, that the outcome of even a single field goal attempt has a considerable effect on the rate of subsequent 3-point shot attempts, in line with standard models of reinforcement learning. However, this change in behaviour is associated with negative correlations between the outcomes of successive field goal attempts. These results indicate that despite years of experience and high motivation, professional players overgeneralize from the outcomes of their most recent actions, which leads to decreased performance.
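The "standard models of reinforcement learning" referred to can be illustrated by a minimal delta-rule update (our illustration, not the paper's fitted model): the tendency to attempt a 3-point shot is nudged up after a made shot and down after a miss.

```python
# Minimal delta-rule sketch (illustrative, not the paper's model): v is the
# current 3-point-attempt propensity in [0, 1]; outcome is 1 for a made
# shot, 0 for a miss; alpha is the learning rate.
def update_propensity(v, outcome, alpha=0.2):
    return v + alpha * (outcome - v)   # move v toward the observed outcome

v = 0.5
v = update_propensity(v, 1)            # a made shot raises the propensity
```

Because each single outcome shifts the propensity, this rule reproduces the qualitative pattern in the abstract: behaviour reacts to the most recent attempt, even when successive outcomes are not positively correlated.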
Although possessing many beautiful features, the Hart and Mas-Colell bargaining model is not flawless: the concept of threat in this model may behave quite counter-intuitively, and its SP equilibrium expected payoff vector may not coincide with the min-max solution payoff vector in zero-sum games. If we postpone the realization of all threats to the end of the game, the two problems can be solved simultaneously. This is exactly the 2(a) model suggested by Hart and Mas-Colell in the last section of their paper. I show that the new model, unfortunately, can only guarantee the existence of an SP equilibrium in the two-player case. For the original model, I reduce the computation of an SP equilibrium to a system of linear inequalities. Quantitative efficiency and symmetric SP equilibria are also discussed.
The modal view in the cognitive sciences holds that consciousness is necessary for abstract, symbolic and rule-following computations. Hence, mathematical thinking in general, and doing arithmetic more specifically, are widely believed to require consciousness. In the current paper we use continuous flash suppression to expose participants to extremely long-duration (up to 2000 milliseconds) subliminal arithmetic equations. The results of three experiments show that the equations were solved without ever reaching consciousness. In other words, they show that arithmetic can be done unconsciously. These findings imply that the modal view of the unconscious needs to be significantly updated, to include symbolic processes that were heretofore considered to be uniquely conscious.
Gambling frequencies on single numbers in real casino roulette were displayed in a contour map. This not only confirmed that gamblers are subject to middle bias, but also revealed accessibility effects. The figure allowed us to infer the location of the roulette wheel and croupier from the gambling data, as well as to infer bounds on the dimensions of the roulette table.
We provide an axiomatic characterization of the measure of riskiness of gambles (risky assets) introduced by Foster and Hart (2009). The axioms are based on the concept of wealth requirement.
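Foster and Hart (2009) define the riskiness R(g) of a gamble g with positive expectation and possible losses as the unique R greater than the maximal loss solving E[log(1 + g/R)] = 0. A bisection sketch of that definition (our own numeric illustration):

```python
# Sketch: compute the Foster-Hart riskiness R(g), the unique root of
# E[log(1 + g/R)] = 0 above the maximal loss, by bisection. Requires a
# gamble with positive expectation and at least one negative outcome.
from math import log

def foster_hart_riskiness(outcomes, probs, tol=1e-7):
    max_loss = -min(outcomes)
    f = lambda R: sum(p * log(1 + x / R) for x, p in zip(outcomes, probs))
    lo = max_loss * (1 + 1e-9)   # f(lo) < 0: the worst-case log term blows up
    hi = 2 * max_loss
    while f(hi) < 0:             # grow until the sign flips; f -> 0+ as R -> inf
        hi *= 2
    while hi - lo > tol * hi:
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if f(mid) < 0 else (lo, mid)
    return (lo + hi) / 2
```

For the gamble that gains 120 or loses 100 with equal probability, R = 600 exactly, since (1 + 120/600)(1 − 100/600) = 1.2 · 5/6 = 1, and the bisection recovers it.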
2010
What happens when priors are not common? We show that for each type profile over a knowledge space, where the state space is connected with respect to the partition profile, we can associate a value that we term the prior distance of the type profile …