Asymptotic properties of one-layer artificial neural networks with sparse connectivity

Abstract

A law of large numbers for the empirical distribution of the parameters of a one-layer artificial neural network with sparse connectivity is derived as the number of neurons and the number of stochastic gradient descent training iterations increase simultaneously.
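For orientation, the standard (dense) mean-field formulation of this type of result is sketched below; the network output, the empirical measure of the parameters, and the limiting statement are written in a common generic form. The sparse-connectivity scaling studied in the paper may differ, so this is only an illustrative sketch, not the paper's own setup.

```latex
% Illustrative dense mean-field setting (assumed notation, not taken from the paper):
% c^i are output weights, w^i are input weights, sigma is the activation,
% k indexes SGD iterations, and N is the number of neurons.
\[
  f^N(x) \;=\; \frac{1}{N}\sum_{i=1}^{N} c^i\,\sigma\!\left(w^i \cdot x\right),
  \qquad
  \mu^N_k \;=\; \frac{1}{N}\sum_{i=1}^{N} \delta_{(c^i_k,\,w^i_k)} .
\]
% A law of large numbers then identifies a deterministic limit of the
% (suitably time-rescaled) empirical measure as N grows:
\[
  \mu^N_{\lfloor N t \rfloor} \;\xrightarrow[\;N\to\infty\;]{}\; \bar\mu_t ,
\]
% where \bar\mu_t solves a limiting (measure-valued) evolution equation.
```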

Publication
Stat. Probab. Lett.