Fact-Free Learning

Enriqueta Aragonès, Itzhak Gilboa, Andrew Postlewaite, and David Schmeidler (Behavioral & Experimental Economics)
People may be surprised by noticing certain regularities that hold in existing knowledge they have had for some time. That is, they may learn without getting new factual information. We argue that this can be partly explained by computational complexity. We show that, given a database, finding a small set of variables that obtain a certain value of R^2 is computationally hard, in the sense in which this term is used in computer science. We discuss some of the implications of this result and of fact… 
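The hardness claim can be made concrete with a toy sketch (my own illustration, not code from the paper): the naive way to find the best k-variable subset is exhaustive search over all C(m, k) subsets, which grows exponentially in k. The data below and the helper names are hypothetical.

```python
# Toy illustration: exhaustive search for the k-variable subset with the
# highest R^2. The number of subsets examined, C(m, k), explodes as m and
# k grow -- which is why the problem is computationally hard in general.
from itertools import combinations

import numpy as np


def r_squared(X, y):
    """R^2 of an ordinary-least-squares fit of y on the columns of X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ beta
    ss_res = residuals @ residuals
    ss_tot = ((y - y.mean()) ** 2).sum()
    return 1.0 - ss_res / ss_tot


def best_subset(X, y, k):
    """Exhaustively score every k-variable subset; return (best R^2, columns)."""
    m = X.shape[1]
    best = (-np.inf, None)
    for cols in combinations(range(m), k):
        score = r_squared(X[:, list(cols)], y)
        if score > best[0]:
            best = (score, cols)
    return best


# Synthetic data: y depends only on columns 2 and 7 of ten candidates.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
y = X[:, 2] + 0.5 * X[:, 7] + 0.1 * rng.normal(size=50)
score, cols = best_subset(X, y, 2)
```

With m = 10 and k = 2 only 45 subsets are scored, but at m = 100, k = 10 the count already exceeds 10^13, so brute force becomes infeasible almost immediately.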

Second-Order Induction and Agreement

Conditions under which rational agents who have access to the same observations are likely to converge on the same predictions, and conditions under which they may entertain different probabilistic beliefs are offered.

Second-Order Induction: Uniqueness and Complexity

It is shown that with many observations and few relevant variables, uniqueness holds, and when there are many variables relative to observations, non-uniqueness is the rule, and finding the best similarity function is computationally hard.
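A hedged toy sketch of what optimizing a similarity function can look like (my own one-parameter construction, not the papers' model): predict each outcome as a similarity-weighted average of the other observations, and pick the similarity parameter that minimizes in-sample squared prediction error. All names and data below are hypothetical.

```python
# Toy sketch of similarity-based prediction: similarity between cases x
# and x' is exp(-w * |x - x'|), a prediction is the similarity-weighted
# average of observed outcomes, and the "best" similarity is the weight w
# (from a small candidate set) with the lowest leave-one-out error.
import numpy as np


def predict(x_new, X, y, w):
    """Similarity-weighted average of past outcomes y at points X."""
    sim = np.exp(-w * np.abs(X - x_new))
    return (sim * y).sum() / sim.sum()


def leave_one_out_error(X, y, w):
    """In-sample squared error when each case is predicted from the others."""
    err = 0.0
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        err += (y[i] - predict(X[i], X[mask], y[mask], w)) ** 2
    return err


rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=40)
y = np.sin(X) + 0.1 * rng.normal(size=40)

candidates = [0.1, 1.0, 10.0]
best_w = min(candidates, key=lambda w: leave_one_out_error(X, y, w))
```

With one parameter the optimization is a trivial line search; the non-uniqueness and hardness results above concern the realistic case where the similarity function has many free parameters relative to the number of observations.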

Second-Order Induction and the Importance of Precedents

Second-order induction can explain why reputation is harder to reestablish, after having been lost, than to establish a priori, and why rational agents who have access to the same observations may still entertain different probabilistic beliefs.

Executing Complex Cognitive Tasks: Prizes vs. Markets

It is found that Bayesian theory cannot make sense of the data: both systems perform equally well, trading is abundant in the market setup, and prices are informative but noisy.


What do we notice and how does this affect what we learn and come to believe? I present a model of an agent who learns to make forecasts on the basis of readily available information, but is

Second-order induction in prediction problems

It is shown that with many observations and few relevant variables, uniqueness holds, and when there are many variables relative to observations, non-uniqueness is the rule, and finding the empirically optimal similarity function (EOSF) is computationally hard.

Non-Bayesian Learning

A series of experiments suggests that, compared to the Bayesian benchmark, people may either underreact or overreact to new information, and reveals a basic distinction between the long-run beliefs of agents who underreact to information and agents who overreact to it.

Information aggregation, learning, and non-strategic behavior in voting environments

A presumed benefit of group decision-making is to select the best alternative by aggregating privately dispersed information. In reality, people often learn what to make of their private information

Unawareness of theorems

This paper provides a set-theoretic model of knowledge and unawareness. A new property called Awareness Leads to Knowledge shows that unawareness of theorems not only constrains an agent’s knowledge,

Learning and Discovery

A dynamic framework for an individual decision-maker within which discovery of previously unconsidered propositions is possible is formulated, and a semantics rich enough to describe the individual's awareness that currently undiscovered propositions may be discovered in the future is developed.



Probabilistic Representation of Complexity

The implied behavior is rational in the traditional sense, yet consistent with an agent who believes his environment is too complex to warrant precise planning, forgoes finely detailed contingent rules in favor of vaguer plans, and expresses a preference for flexibility.

Computers and Intractability: A Guide to the Theory of NP-Completeness

The standard reference on NP-completeness: it develops the theory of NP-complete problems, catalogues hundreds of known NP-complete problems, and presents techniques for proving new intractability results.

Philosophical Applications of Kolmogorov's Complexity Measure

Kolmogorov has defined the complexity of a sequence of bits to be the minimal size of (the description of) a Turing machine which can regenerate the given sequence. This paper contains two notes on
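Kolmogorov complexity is uncomputable, but a general-purpose compressor gives a crude upper-bound proxy that conveys the idea: a highly regular sequence admits a much shorter description than a random one of the same length. This is a toy illustration of the intuition only, not Kolmogorov's Turing-machine definition.

```python
# Toy proxy for descriptive complexity: compressed length under zlib.
# A patterned sequence compresses dramatically; a pseudorandom one barely
# compresses at all, mirroring the gap in Kolmogorov complexity.
import random
import zlib

regular = b"01" * 500  # 1000 bytes with an obvious repeating pattern
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(1000))  # 1000 "random" bytes

len_regular = len(zlib.compress(regular))
len_noisy = len(zlib.compress(noisy))
# The regular sequence has a far shorter description than the noisy one.
assert len_regular < len_noisy
```

The asymmetry is the point: compression can certify that a sequence is simple (by exhibiting a short description) but can never certify that one is complex, which is one reflection of the uncomputability of the measure itself.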

Fact, Fiction, and Forecast

Nelson Goodman's second book, which represents (excepting the first chapter, a reprint of the well-known paper "The Problem of Counterfactual Conditionals") the "Special Lectures in Philosophy" he

On the Approximability of Minimizing Nonzero Variables or Unsatisfied Relations in Linear Systems

A theory of case-based decisions

The authors describe the general theory and its relationship to planning, repeated choice problems, inductive inference, and learning, and compare it to expected utility theory as well as to rule-based systems.

A sub-constant error-probability low-degree test, and a sub-constant error-probability PCP characterization of NP

A new low-degree-test is introduced, one that uses the restriction of low- degree polynomials to planes rather than the restriction to lines, and enables us to prove a low-error characterization of NP in terms of PCP.

Costly Information Acquisition: Experimental Analysis of a Boundedly Rational Model

The directed cognition model assumes that agents use partially myopic option-value calculations to select their next cognitive operation. The current paper tests this model by studying information

A Unique Subjective State Space for Unforeseen Contingencies

We axiomatically characterize a representation of preferences over opportunity sets which exhibit a preference for flexibility, interpreted as a model of unforeseen contingencies. In this

Incomplete Written Contracts: Undescribable States of Nature (Now published in Quarterly Journal of Economics (1994), vol.109, pp.1085-1124.)
