Neural networks have been used for modelling the nonlinear characteristics of memoryless nonlinear channels, using backpropagation (BP) learning with experimental training data. To better understand this application, the mean transient and convergence behavior of a simplified two-layer neural network trained with zero-mean Gaussian data has been studied previously. This paper extends those results to include the effects of weight fluctuations on the mean square error (MSE). A new methodology is presented that can be extended to other nonlinear learning problems. The new mathematical model predicts the MSE learning behavior as a function of the algorithm step size μ. The performance analysis is based on the derivation of linear recursions for the variance and covariance of the weights; these recursions depend nonlinearly on the mean weights and can be used to predict the local mean-square stability of the weights. As in linear gradient search problems (LMS, etc.), it is shown that there exists an optimum μ (minimizing the MSE) that results from the tradeoff between fast learning and small weight fluctuations. Monte Carlo simulations display excellent agreement between the actual behavior and the predictions of the theoretical model over a wide range of values of μ.
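To make the setup concrete, the following is a minimal sketch of the kind of experiment the abstract describes: a simplified two-layer network trained by stochastic-gradient backpropagation on zero-mean Gaussian inputs to fit a memoryless nonlinearity, with the step size μ swept to expose the fast-learning versus weight-fluctuation tradeoff. The channel model `tanh(2x)`, the network width, and all constants are illustrative assumptions, not the paper's experimental setup.

```python
# A minimal sketch, assuming a scalar-input, single-hidden-layer network;
# the channel g(x), network sizes, and constants are illustrative
# assumptions, not values taken from the paper.
import numpy as np

rng = np.random.default_rng(seed=1)

def channel(x):
    """Hypothetical memoryless nonlinear (saturating) channel."""
    return np.tanh(2.0 * x)

def mse_after_training(mu, n_iters=5000, n_hidden=4, n_runs=20, tail=1000):
    """Monte Carlo estimate of the MSE over the last `tail` BP updates."""
    run_mse = []
    for _ in range(n_runs):
        w1 = 0.1 * rng.standard_normal(n_hidden)  # input-layer weights
        w2 = 0.1 * rng.standard_normal(n_hidden)  # output-layer weights
        sq_err = []
        for k in range(n_iters):
            x = rng.standard_normal()             # zero-mean Gaussian input
            d = channel(x)                        # desired (channel) output
            h = np.tanh(w1 * x)                   # hidden-layer activations
            e = d - w2 @ h                        # instantaneous error
            # Stochastic-gradient (backpropagation) weight updates:
            g2 = e * h                            # gradient w.r.t. w2
            g1 = e * w2 * (1.0 - h**2) * x        # gradient w.r.t. w1
            w2 += mu * g2
            w1 += mu * g1
            if k >= n_iters - tail:               # collect end-of-run errors
                sq_err.append(e * e)
        run_mse.append(np.mean(sq_err))
    return np.mean(run_mse)

# Sweeping mu exposes the tradeoff: too small and the MSE is still
# dominated by the slow transient; too large and the weight fluctuations
# dominate (or the algorithm goes unstable).
for mu in (0.001, 0.01, 0.05, 0.1, 0.3):
    print(f"mu = {mu:5.3f}   MSE ~ {mse_after_training(mu):.5f}")
```

Under these assumptions, the printed MSE should be smallest at an intermediate μ, mirroring the optimum-μ behavior the paper predicts analytically.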

Additional Metadata
Journal: IEEE Transactions on Signal Processing
Citation:
Bershad, N. J., Ibnkahla, M., Blauwens, G., Cools, J., Soubrane, A., & Ponson, N. (1998). Fluctuation analysis of a two-layer backpropagation algorithm used for modelling nonlinear memoryless channels. IEEE Transactions on Signal Processing, 46(6).