Abstract
I develop a growth model in which AI-generated content contaminates the knowledge commons, creating two nested irreversibilities. A derivative trap arises when recombinative output crosses a threshold share of the corpus, degrading frontier productivity faster than talent reallocation or R&D subsidies can offset. A governance trap arises because the institutional capacity to distinguish frontier from derivative knowledge (epistemic capital) is itself a depletable stock. In the baseline simulation, the governance trap preempts the derivative trap by roughly nine years, closing the window for effective policy while measured innovation remains positive. The competitive equilibrium features a double wedge: frontier knowledge is undervalued and derivative output overvalued, which drives a strict instrument hierarchy in which epistemic investment is a precondition for effective governance, which in turn is a precondition for effective R&D subsidies. The welfare cost of inaction is 6.8% in consumption-equivalent terms.
Keywords
Derivative trap; data quality; epistemic capital; governance trap; innovation policy; forward invariance
JEL codes
- O31: Innovation and Invention: Processes and Incentives
- O33: Technological Change: Choices and Consequences • Diffusion Processes
- O38: Government Policy
- D83: Search • Learning • Information and Knowledge • Communication • Belief Formation
Reference
Manh-Hung Nguyen, “Epistemic Capital and Two-Trap Growth in the AI Era”, TSE Working Paper, n. 26-1722, February 2026.
Published in
TSE Working Paper, n. 26-1722, February 2026
