Sampling the space of solutions of an artificial neural network

Zambon, Alessandro; Malatesta, Enrico M.; Tiana, Guido; Zecchina, Riccardo
2025

Abstract

The weight space of an artificial neural network can be systematically explored using tools from statistical mechanics. We employ a combination of a hybrid Monte Carlo algorithm that performs long exploration steps, a ratchet-based algorithm that investigates connectivity paths, and simulations of coupled replica models that probe subdominant flat regions. Our analysis focuses on one-hidden-layer networks and spans a range of energy levels and constraint density regimes. Near the interpolation threshold, the low-energy manifold shows a spiky topology. In the overparameterized regime, however, the low-energy manifold becomes entirely flat, forming an extended complex structure that is easy to sample. These numerical results are supported by an analytical study of the training error landscape, and we show numerically that the qualitative features of the loss landscape are robust across different data structures. Our study aims to provide new methodological insights for developing scalable methods for large networks.
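The sampling setup described in the abstract can be made concrete with a toy experiment. Below is a minimal sketch, in Python, of plain Metropolis sampling of the training error of a one-hidden-layer network with sign activations and fixed output weights, on synthetic random data. Everything here (network sizes, the spherical weight normalization, the inverse temperature beta) is an illustrative assumption; the paper's actual machinery, hybrid Monte Carlo with long exploration steps, the ratchet-based path finder, and coupled replicas, is substantially more involved.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumed): N inputs, K hidden units, P random patterns.
N, K, P = 50, 5, 200
X = rng.choice([-1.0, 1.0], size=(P, N))   # binary input patterns
y = rng.choice([-1.0, 1.0], size=P)        # random binary labels

def train_error(W):
    # Number of misclassified patterns for a one-hidden-layer network
    # with sign activations and fixed (all +1) output weights.
    hidden = np.sign(X @ W.T)              # (P, K) hidden activations
    out = np.sign(hidden.sum(axis=1))      # majority vote of hidden units
    return int(np.sum(out != y))

def metropolis(W, beta=2.0, steps=20000, sigma=0.1):
    # Plain Metropolis sampling of the training error at inverse temperature beta:
    # perturb one hidden unit's weight vector, project it back onto the sphere
    # of radius sqrt(N), and accept with probability min(1, exp(-beta * dE)).
    E = train_error(W)
    for _ in range(steps):
        k = rng.integers(K)
        W_new = W.copy()
        W_new[k] += sigma * rng.standard_normal(N)
        W_new[k] *= np.sqrt(N) / np.linalg.norm(W_new[k])
        E_new = train_error(W_new)
        if E_new <= E or rng.random() < np.exp(-beta * (E_new - E)):
            W, E = W_new, E_new
    return W, E

# Start from random weights on the sphere and sample.
W0 = rng.standard_normal((K, N))
W0 *= np.sqrt(N) / np.linalg.norm(W0, axis=1, keepdims=True)
W_sampled, E_sampled = metropolis(W0)
print("training error after sampling:", E_sampled, "of", P, "patterns")

In the spirit of the abstract, one can raise beta to concentrate the sampler on low-energy configurations and compare how easily they are reached as the ratio of patterns to parameters is varied; this toy script is only meant to fix notation, not to reproduce the paper's results.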
Files in this record:

File: 2503.08266v2.pdf
Access: open access
Description: article
Type: Pre-print document
License: public domain
Size: 1.78 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11565/4075317