Atlanta Marriott Marquis, International 1
Hosted By: Econometric Society
Learning and Information Aggregation with Misspecified Models
Paper Session
Friday, Jan. 4, 2019 8:00 AM - 10:00 AM
- Chair: Muhamet Yildiz, Massachusetts Institute of Technology
Divisible Updating
Abstract
A characterisation is provided of the belief updating processes that are independent of how an individual chooses to partition the statistical information they use in their updating. These “divisible” updating processes are in general not Bayesian, but can be interpreted as a reparameterisation of Bayesian updating. The class incorporates over- and under-reaction to new information, as well as other updating biases. We also show that, within this class, a martingale property is sufficient for the updating process to be Bayesian.
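As a rough formalization of the two conditions the abstract invokes (the notation here is ours, not the paper's: f(π, s) denotes the belief reached from prior π after observing signal s):

% Divisibility: processing a sample in one batch or in two must agree.
\[
  f\bigl(\pi, (s_1, s_2)\bigr) = f\bigl(f(\pi, s_1),\, s_2\bigr)
\]
% The martingale property that, within the divisible class, singles out
% Bayesian updating: on average, new information should not move beliefs.
\[
  \mathbb{E}\bigl[\, f(\pi, s) \mid \pi \,\bigr] = \pi
\]

Bayes' rule satisfies the first condition automatically, since conditioning on a sample sequentially and conditioning on it all at once yield the same posterior.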
The Cost of Information
Abstract
We develop an axiomatic theory of costly information acquisition. Our axioms capture the idea of constant marginal costs in information production: the cost of generating two independent signals is the sum of their costs, and the cost of generating a signal with probability half equals half the cost of generating it deterministically. Together with monotonicity and continuity conditions, these axioms completely determine the cost of a signal up to a vector of parameters, one for each pair of states of nature. These parameters have a clear economic interpretation and determine the difficulty of distinguishing between different states. The resulting cost function, which we call the log-likelihood ratio cost, is a linear combination of the Kullback-Leibler divergences (i.e., the expected log-likelihood ratios) between the conditional signal distributions. We argue that this cost function is a versatile modeling tool, and that in various examples of information acquisition it leads to more realistic predictions than the approach based on Shannon entropy.
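The abstract pins down the functional form completely, so it can be stated compactly (the notation is ours: μ_i is the signal distribution conditional on state i, and β_ij ≥ 0 are the pairwise parameters):

\[
  C(\mu) = \sum_{i \neq j} \beta_{ij}\, D_{\mathrm{KL}}\!\left(\mu_i \,\middle\|\, \mu_j\right),
  \qquad
  D_{\mathrm{KL}}\!\left(\mu_i \,\middle\|\, \mu_j\right)
  = \mathbb{E}_{\mu_i}\!\left[\log \frac{\mathrm{d}\mu_i}{\mathrm{d}\mu_j}\right].
\]

Additivity over independent signals then comes for free: the Kullback-Leibler divergence between product measures is the sum of the componentwise divergences, which is exactly the constant-marginal-cost axiom.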
The Wisdom of the Confused Crowd
Abstract
“Crowds” are often regarded as “wiser” than individuals, and prediction markets are often regarded as effective methods for harnessing this wisdom. If the agents in prediction markets are Bayesians who share a common model and prior belief, then the no-trade theorem implies that we should see no trade in the market. But if the agents are not Bayesians with a common model and prior, then it is no longer obvious that the market outcome aggregates or conveys information. In this paper, we examine a stylized prediction market composed of Bayesian agents whose inferences are based on different models of the underlying environment. We explore a basic tension: the differences in models that give rise to the possibility of trade generally preclude perfect information aggregation.
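To see the tension in miniature (a hypothetical sketch, not the paper's model; the parameters and the helper posterior below are our own invention), consider two Bayesian agents who observe the same public signal but interpret it through different likelihood models:

# Two Bayesians with different models of the same signal: disagreement
# creates room for trade, but the resulting price need not aggregate
# their information correctly.

def posterior(prior, lik_high, lik_low, signal):
    """Bayes' rule for a binary state and one binary signal.
    lik_high / lik_low: P(signal = 1 | state) in the high / low state."""
    p_sig = lik_high if signal == 1 else 1 - lik_high
    q_sig = lik_low if signal == 1 else 1 - lik_low
    num = prior * p_sig
    return num / (num + (1 - prior) * q_sig)

prior = 0.5    # common prior on the high state
signal = 1     # the same public signal, seen by both agents

# Agent A thinks the signal is very informative; agent B thinks it is noisy.
belief_a = posterior(prior, lik_high=0.9, lik_low=0.2, signal=signal)
belief_b = posterior(prior, lik_high=0.6, lik_low=0.4, signal=signal)

print(f"Agent A's posterior: {belief_a:.3f}")   # ~0.818
print(f"Agent B's posterior: {belief_b:.3f}")   # 0.600

# An asset paying 1 in the high state trades at any price strictly between
# 0.600 and 0.818: model disagreement generates trade, exactly the behavior
# the common-prior no-trade theorem rules out.

Both agents are individually Bayesian; it is the disagreement about the signal's meaning, not irrationality, that opens the door to trade.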
Discussant(s)
Philipp Strack, University of California-Berkeley
Alvaro Sandroni, Northwestern University
Muhamet Yildiz, Massachusetts Institute of Technology
JEL Classifications
- C7 - Game Theory and Bargaining Theory
- D8 - Information, Knowledge, and Uncertainty