Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/217098 
Authors: 
Year of Publication: 
2020
Citation: 
[Journal:] Theoretical Economics [ISSN:] 1555-7561 [Volume:] 15 [Issue:] 1 [Publisher:] The Econometric Society [Place:] New Haven, CT [Year:] 2020 [Pages:] 239-278
Publisher: 
The Econometric Society, New Haven, CT
Abstract: 
Agents in a network want to learn the true state of the world from their own signals and their neighbors' reports. Agents know only their local networks, consisting of their neighbors and the links among them. Every agent is Bayesian with the (possibly misspecified) prior belief that her local network is the entire network. We present a tractable learning rule to implement such locally Bayesian learning: each agent extracts new information using the full history of observed reports in her local network. Despite their limited network knowledge, agents learn correctly when the network is a social quilt, a tree-like union of cliques. But they fail to learn when a network contains interlinked circles (echo chambers), despite an arbitrarily large number of correct signals.
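A small illustration of the abstract's "social quilt" condition (a tree-like union of cliques): one common formalization of such tree-like unions is that every biconnected component of the graph is a clique (a block graph). The sketch below is an illustration of that reading, not the paper's own definition or learning rule; the helper name is_social_quilt_like and the example graphs are assumptions made here for demonstration.

```python
# Hedged sketch: test whether a graph is a tree-like union of cliques
# (every biconnected component is a clique), one plausible reading of
# the paper's "social quilt". Illustrative only; the paper's formal
# definition may differ.
import itertools
import networkx as nx

def is_social_quilt_like(G: nx.Graph) -> bool:
    """Return True if every biconnected component of G induces a clique."""
    for component in nx.biconnected_components(G):
        # A component is a clique iff every pair of its nodes is adjacent.
        for u, v in itertools.combinations(component, 2):
            if not G.has_edge(u, v):
                return False
    return True

if __name__ == "__main__":
    # Two triangles sharing one node: a tree-like union of cliques.
    quilt = nx.Graph([(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)])
    # A 4-cycle: a circle not contained in any clique.
    circle = nx.cycle_graph(4)
    print(is_social_quilt_like(quilt))   # True
    print(is_social_quilt_like(circle))  # False
```

Under this reading, the 4-cycle example corresponds to the "interlinked circles" case in which, per the abstract, learning can fail despite many correct signals.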
Subjects: 
Locally Bayesian learning
rational learning with misspecified priors
efficient learning in finite networks
JEL: 
D03
D83
D85
Persistent Identifier of the first edition: 
Creative Commons License: 
cc-by-nc
Document Type: 
Article
