Causal understanding is not necessary for the improvement of culturally evolving technology

Maxime Derex, Jean-François Bonnefon, Robert Boyd, and Alex Mesoudi

Abstract

Bows and arrows, houses and kayaks are just a few examples of the highly optimized tools that humans have produced and used to colonize new environments [1,2]. Because there is much evidence that humans’ cognitive abilities are unparalleled [3,4], many believe that such technologies resulted from our superior causal reasoning abilities [5–7]. However, others have stressed that the high dimensionality of human technologies makes them very difficult to understand causally [8]. Instead, they argue that optimized technologies emerge through the retention of small improvements across generations without requiring understanding of how these technologies work [1,9]. Here we show that a physical artefact becomes progressively optimized across generations of social learners in the absence of explicit causal understanding. Moreover, we find that the transmission of causal models across generations has no noticeable effect on the pace of cultural evolution. The reason is that participants do not spontaneously create multidimensional causal theories but, instead, mainly produce simplistic models related to a salient dimension. Finally, we show that the transmission of these inaccurate theories constrains learners’ exploration and has downstream effects on their understanding. These results indicate that complex technologies need not result from enhanced causal reasoning but, instead, can emerge from the accumulation of improvements made across generations.

Reference

Maxime Derex, Jean-François Bonnefon, Robert Boyd, and Alex Mesoudi, Causal understanding is not necessary for the improvement of culturally evolving technology, Nature Human Behaviour, vol. 3, May 2019, pp. 446–452.
