Jaynes machine: The universal microstructure of deep neural networks

Venkat Venkatasubramanian, N. Sanjeevrajan, Manasi Khandekar, Abhishek Sivaram, Collin Szczepanski

Research output: Contribution to journal › Journal article › Research › peer-review


Abstract

Despite the recent stunning progress in large-scale deep neural network applications, our understanding of their microstructure, 'energy' functions, and optimal design remains incomplete. Here, we present a new game-theoretic framework, called statistical teleodynamics, that reveals important insights into these key properties. The optimally robust design of such networks inherently involves computational benefit–cost trade-offs that physics-inspired models do not adequately capture. These trade-offs arise as neurons and connections compete to increase their effective utilities under resource constraints during training. In a fully trained network, this competition results in a state of arbitrage equilibrium, where all neurons in a given layer have the same effective utility, and all connections to a given layer have the same effective utility. The equilibrium is characterized by the emergence of two lognormal distributions, one of connection weights and one of neuronal outputs, as the universal microstructure of large deep neural networks. We call such a network the Jaynes Machine. We show that our theoretical predictions are supported by empirical data from seven large-scale deep neural networks. We also show that the Hopfield network and the Boltzmann Machine correspond to the same special case of the Jaynes Machine.
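
The abstract's central empirical claim is that the weights of a fully trained network follow a lognormal distribution layer by layer. As a minimal illustrative check, not the authors' methodology, the sketch below fits a lognormal to the weight magnitudes of one layer of a pretrained ResNet-18; the choice of model and layer is an assumption for illustration only.

```python
# Hypothetical sketch: test whether connection-weight magnitudes in one trained
# layer are well described by a lognormal distribution, as the paper predicts.
# Assumptions: torchvision's pretrained ResNet-18 and an arbitrary conv layer.
import numpy as np
from scipy import stats
import torchvision.models as models

model = models.resnet18(weights="IMAGENET1K_V1")  # any large trained network
layer = model.layer4[1].conv2                     # illustrative layer choice

w = layer.weight.detach().numpy().ravel()
w = np.abs(w[w != 0])                             # lognormal is defined on positive values

# Fit a lognormal (location fixed at 0) and run a Kolmogorov-Smirnov goodness-of-fit test.
shape, loc, scale = stats.lognorm.fit(w, floc=0)
ks_stat, p_value = stats.kstest(w, "lognorm", args=(shape, loc, scale))

print(f"fitted sigma = {shape:.3f}, median = exp(mu) = {scale:.3f}")
print(f"KS statistic = {ks_stat:.4f}, p-value = {p_value:.3g}")
```

Repeating the fit across layers and across different trained models would mirror, in spirit, the kind of empirical comparison the abstract reports for seven large-scale networks.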
Original language: English
Article number: 108908
Journal: Computers and Chemical Engineering
Volume: 192
Number of pages: 10
ISSN: 0098-1354
DOIs:
Publication status: Published - 2025

Keywords

  • Arbitrage equilibrium
  • Boltzmann machine
  • Deep learning
  • Game theory
  • Hopfield networks
  • LLMs
