
Multistability of Recurrent Neural Networks With Piecewise-Linear Radial Basis Functions and State-Dependent Switching Parameters



Abstract:

This paper presents new theoretical results on the multistability of switched recurrent neural networks with radial basis functions and state-dependent switching. By partitioning the state space, applying the Brouwer fixed-point theorem, and constructing a Lyapunov function, the number of equilibria and their locations are estimated, and their stability or instability is analyzed under reasonable assumptions on the decomposition of the index set and on the switching threshold. It is shown that the switching threshold plays an important role in increasing the number of stable equilibria, and different multistability results are obtained for different ranges of the switching threshold. The results suggest that switched recurrent neural networks would be superior to conventional ones in terms of storage capacity when used as associative memories. Two examples are discussed in detail to substantiate the effectiveness of the theoretical analysis.
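The model equations are not reproduced on this page. Purely as an illustrative sketch of the class of systems described in the abstract (all symbols below are assumptions, not the authors' notation), a recurrent neural network with state-dependent switching parameters can be written as

\dot{x}_i(t) = -d_i\big(x_i(t)\big)\, x_i(t) + \sum_{j=1}^{n} a_{ij}\big(x_i(t)\big)\, f\big(x_j(t)\big) + I_i\big(x_i(t)\big), \qquad i = 1, \ldots, n,

where f is a piecewise-linear radial basis function and each parameter switches between two values according to whether the corresponding state crosses a threshold \theta, e.g., d_i(s) = d_i^{(1)} for s \le \theta and d_i(s) = d_i^{(2)} for s > \theta. The multistability results summarized above concern how the choice of \theta affects the number, location, and stability of the equilibria of such a system.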
Published in: IEEE Transactions on Systems, Man, and Cybernetics: Systems (Volume: 50, Issue: 11, November 2020)
Page(s): 4458 - 4471
Date of Publication: 31 July 2018


I. Introduction

Neural networks are of fundamental importance, with numerous applications in fields such as engineering, science, economics, and sociology. The dynamic analysis of recurrent neural networks plays an important role in neural network design and applications. When a recurrent neural network is applied to associative memory or pattern recognition, multiple stable equilibria are often needed. The stability of neural networks with multiple stable equilibria, called multistability, is quite different from the stability of neural networks with a unique equilibrium, known as mono-stability. Numerous results on the mono-stability analysis of recurrent neural networks are available; e.g., [1]–[7].

In many applications, multistability issues arise more often than mono-stability ones. In associative memories, the number of stable equilibria of a recurrent neural network represents the storage capacity of the memory. Therefore, the multistability analysis of equilibria plays an important role in the design of associative memories based on recurrent neural networks, and many valuable results on this topic have been obtained. Recent works address the coexistence of multiple stable attractors, including stable equilibria, periodic orbits, and almost periodic orbits; e.g., [8]–[35]. Results on the multistability of recurrent neural networks with time delays are also available; e.g., [8]–[13]. In particular, the multistability of various neural network models has been characterized, including cooperative, competitive, bidirectional associative memory, Cohen–Grossberg, complex-valued, fractional-order, and memristor-based models; e.g., [14]–[24].

It is worth noting that the multistability analysis of neural networks depends mainly on the type of activation function. In most existing results, the activation functions employed in multistability analysis are monotone nondecreasing functions such as sigmoid functions; e.g., [25]–[31]. Radial basis function networks are popular neural network models, but radial basis functions are not monotonic. Consequently, the multistability analysis of neural networks with radial basis functions is fundamentally different from that of networks with monotone activations [32]–[37].
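As a concrete illustration of this non-monotonicity (an illustrative example only, not necessarily the activation function used in this paper), a piecewise-linear radial basis function centered at c with radius r > 0 can be written as

f(s) =
\begin{cases}
1 - \dfrac{|s - c|}{r}, & |s - c| < r, \\[4pt]
0, & |s - c| \ge r,
\end{cases}

which increases on (c - r, c) and decreases on (c, c + r), in contrast to a sigmoid, which is monotone nondecreasing on the whole real line.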
