Lin Li, Qinwei Fan, Li Zhou, Recurrent neural network with $L_{1/2}$ regularization for regression and multiclass classification problems, J. Nonlinear Funct. Anal., Vol. 2022 (2022), Article ID 10, pp. 1-13

DOI: 10.23952/jnfa.2022.10

Received December 11, 2021; Accepted March 15, 2022; Published March 29, 2022


Abstract
The recurrent neural network (RNN) was introduced to solve dynamic system problems. In this paper, we present a new gradient-based learning method with $L_{1/2}$ regularization for RNNs that learn sequential behavior. $L_{1/2}$ regularization can efficiently drive the redundant weight vectors of nodes to zero. However, the usual $L_{1/2}$ regularization term involves a non-smooth absolute value function, which causes oscillations of the gradient norm and of the error function in numerical computation. These drawbacks can be well addressed by a smoothing technique. Simulation results on regression and multiclass classification problems demonstrate that our algorithm performs better than three other algorithms.
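
To make the smoothing idea in the abstract concrete, the Python sketch below implements a smoothed $L_{1/2}$ penalty and its gradient. The quartic smoothing polynomial, the smoothing radius a, the coefficient lam, the toy least-squares loss, and all function names are illustrative assumptions for this sketch; they are not taken from the paper, which trains a recurrent network rather than the linear model used here.

import numpy as np

def smoothed_abs(x, a=0.1):
    # Piecewise-polynomial smoothing of |x|: equals |x| for |x| >= a, and on
    # (-a, a) uses a quartic that matches |x| and its derivative at x = +/- a,
    # removing the kink at the origin (an assumed, common choice of smoother).
    inner = -x**4 / (8 * a**3) + 3 * x**2 / (4 * a) + 3 * a / 8
    return np.where(np.abs(x) >= a, np.abs(x), inner)

def smoothed_abs_grad(x, a=0.1):
    # Derivative of the smoothing above: sign(x) outside (-a, a), cubic inside.
    inner = -x**3 / (2 * a**3) + 3 * x / (2 * a)
    return np.where(np.abs(x) >= a, np.sign(x), inner)

def l_half_penalty(w, lam=1e-3, a=0.1):
    # Smoothed L_{1/2} regularizer: lam * sum_i f(w_i)^{1/2}.
    return lam * np.sum(smoothed_abs(w, a) ** 0.5)

def l_half_penalty_grad(w, lam=1e-3, a=0.1):
    # Since f(x) >= 3a/8 > 0 everywhere, f(w)^{-1/2} is bounded and the
    # gradient stays finite even as weights approach zero.
    f = smoothed_abs(w, a)
    return 0.5 * lam * f ** (-0.5) * smoothed_abs_grad(w, a)

# Toy usage: one gradient step on a least-squares loss plus the smoothed penalty.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(50, 5)), rng.normal(size=50)
w = rng.normal(size=5)
eta = 0.01
grad = X.T @ (X @ w - y) / len(y) + l_half_penalty_grad(w)
w -= eta * grad

Because the smoothed absolute value is bounded away from zero on (-a, a), the penalty gradient does not blow up near zero weights; this is the kind of behavior the smoothing technique described in the abstract is meant to secure.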


How to Cite this Article:
L. Li, Q. Fan, L. Zhou, Recurrent neural network with $L_{1/2}$ regularization for regression and multiclass classification problems, J. Nonlinear Funct. Anal. 2022 (2022) 10.