Neural networks have been used to model the nonlinear characteristics of memoryless nonlinear channels using backpropagation (BP) learning with experimental training data. The mean transient and convergence behavior of a simplified two-layer neural network was studied in [2], where the network was trained with zero-mean Gaussian data. This paper extends those results to include the effects of the weight fluctuations upon the mean-square error (MSE). A new methodology is presented which can be extended to other nonlinear learning problems. The new mathematical model is able to predict the MSE learning behavior as a function of the algorithm step size μ. Linear recursions are derived for the variance and covariance of the weights, which depend nonlinearly upon the mean weights. As in linear gradient search problems (LMS, etc.), there exists an optimum μ (minimizing the MSE) reflecting the trade-off between fast learning and small weight fluctuations. Monte Carlo simulations show excellent agreement with the theoretical predictions for various values of μ.
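The trade-off described above can be illustrated numerically. The sketch below is not the paper's model: it is a minimal Monte Carlo experiment, under assumed settings, in which a small two-layer network is trained by stochastic backpropagation on zero-mean Gaussian inputs to fit a memoryless nonlinearity (a `tanh` stands in for the experimental channel). All names, network sizes, and μ values are illustrative assumptions; the point is that the steady-state MSE varies with the step size μ, small μ learning slowly and large μ inflating the weight fluctuations.

```python
import numpy as np

rng = np.random.default_rng(0)

def channel(x):
    # Hypothetical memoryless nonlinearity standing in for the real channel
    return np.tanh(2.0 * x)

def train_mse(mu, n_iters=5000, n_hidden=4, n_runs=10):
    """Train a two-layer network (tanh hidden layer, linear output) with
    stochastic backpropagation; return the MSE averaged over the last 10%
    of iterations and over n_runs independent Monte Carlo runs."""
    errs = np.zeros(n_iters)
    for _ in range(n_runs):
        w1 = 0.1 * rng.standard_normal(n_hidden)   # input -> hidden weights
        w2 = 0.1 * rng.standard_normal(n_hidden)   # hidden -> output weights
        for k in range(n_iters):
            x = rng.standard_normal()              # zero-mean Gaussian input
            d = channel(x)                         # desired channel output
            h = np.tanh(w1 * x)                    # hidden-layer outputs
            y = w2 @ h                             # network output
            e = d - y
            # Backpropagation: gradients evaluated at the current weights
            g2 = e * h
            g1 = e * w2 * (1.0 - h**2) * x
            w2 += mu * g2
            w1 += mu * g1
            errs[k] += e**2
    errs /= n_runs
    return errs[int(0.9 * n_iters):].mean()

# Steady-state MSE versus step size: an intermediate mu balances
# convergence speed against weight fluctuations.
for mu in (0.002, 0.02, 0.2):
    print(f"mu={mu}: steady-state MSE ~ {train_mse(mu):.4f}")
```

Averaging over independent runs mirrors the Monte Carlo comparison in the abstract; sweeping μ over a finer grid would locate the MSE-minimizing step size for this toy setup.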

Additional Metadata
Conference Proceedings of the 1998 32nd Asilomar Conference on Signals, Systems & Computers. Part 1 (of 2)
Bershad, N. J., Ibnkahla, M., Blauwens, G., Cools, J., Soubrane, A., & Ponson, N. (1998). Fluctuation analysis of a two-layer backpropagation algorithm used for modelling nonlinear memoryless channels. In Conference Record of the Asilomar Conference on Signals, Systems and Computers (pp. 678–682).