Probabilistic Models of Cognition
by Noah D. Goodman, Joshua B. Tenenbaum & The ProbMods Contributors
This book explores the probabilistic approach to cognitive science, which models learning and reasoning as inference in complex probabilistic models.
We examine how a broad range of empirical phenomena, including intuitive physics, concept learning, causal reasoning, social cognition, and language understanding, can be modeled as probabilistic programs written in the WebPPL language.
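To give a flavor of the approach, here is a minimal sketch of a probabilistic program: a generative model that is conditioned on an observation, with inference by rejection sampling. It is written in plain JavaScript for illustration, with hand-rolled stand-ins for the `flip` and inference primitives that WebPPL provides natively; it is not actual WebPPL code.

```javascript
// Stand-in for WebPPL's flip: a biased coin returning true with probability p.
function flip(p = 0.5) { return Math.random() < p; }

// Generative model: two fair coins. We condition on observing that at least
// one came up heads, and query whether the first coin is heads.
function model() {
  const a = flip();
  const b = flip();
  if (!(a || b)) return null; // condition: reject worlds where both are tails
  return a;                   // query: is the first coin heads?
}

// Rejection sampling: run the model many times, keep only the samples that
// satisfy the condition, and estimate the posterior probability of the query.
function infer(model, n = 100000) {
  let kept = 0, hits = 0;
  for (let i = 0; i < n; i++) {
    const s = model();
    if (s === null) continue;
    kept++;
    if (s) hits++;
  }
  return hits / kept;
}

console.log(infer(model)); // ≈ 2/3
```

The exact posterior is P(a | a or b) = P(a) / P(a or b) = 0.5 / 0.75 = 2/3; the chapters on conditioning and algorithms for inference develop this pattern, and far more efficient inference methods, in detail.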
Contributors
This book is an open source project. We welcome content contributions (via GitHub)!
The ProbMods Contributors are:
Noah D. Goodman (editor)
Joshua B. Tenenbaum
Daphna Buchsbaum
Joshua Hartshorne
Robert Hawkins
Timothy J. O’Donnell
Michael Henry Tessler
Citation
N. D. Goodman, J. B. Tenenbaum, and The ProbMods Contributors (2016). Probabilistic Models of Cognition (2nd ed.). Retrieved YYYY-MM-DD from https://probmods.org/
@misc{probmods2,
title = {{Probabilistic Models of Cognition}},
edition = {Second},
author = {Goodman, Noah D. and Tenenbaum, Joshua B. and The ProbMods Contributors},
year = {2016},
howpublished = {\url{https://probmods.org/v2}},
note = {Accessed: }
}
Acknowledgments
We are grateful for crucial technical assistance from: Andreas Stuhlmüller, Tomer Ullman, John McCoy, Long Ouyang, Julius Cheng.
The construction and ongoing support of this tutorial are made possible by grants from the Office of Naval Research, the James S. McDonnell Foundation, the Stanford VPOL, and the Center for Brains, Minds, and Machines (funded by NSF STC award CCF-1231216).
Previous edition
The first edition of this book used the probabilistic programming language Church and can be found here.
Chapters
- Introduction: A brief introduction to the philosophy.
- Generative models: Representing working models with probabilistic programs.
- Conditioning: Asking questions of models by conditional inference.
- Causal and statistical dependence: Causal and statistical dependence.
- Conditional dependence: Patterns of inference as evidence changes.
- Social cognition: Inference about inference.
- Interlude - Bayesian data analysis: Making scientific inferences from data.
- Interlude - Algorithms for inference: Approximate inference. Efficiency tradeoffs of different algorithms.
- Rational process models: The psychological reality of inference algorithms.
- Learning as conditional inference: How inferences change as data accumulate.
- Learning with a language of thought: Compositional hypothesis spaces.
- Hierarchical models: The power of statistical abstraction.
- Occam's Razor: How inference penalizes extra model flexibility.
- Mixture models: Models for inferring the kinds of things.
- Learning (deep) continuous functions: Functional hypothesis spaces and deep probabilistic models.
- Appendix - JavaScript basics: A very brief primer on JavaScript.
- Appendix - Useful distributions: A very brief summary of some important distributions.