I. Introduction
Neural networks are of fundamental importance, with numerous applications in fields such as engineering, science, economics, and sociology. The dynamical analysis of recurrent neural networks plays an important role in their design and application. When a recurrent neural network is used for associative memory or pattern recognition, multiple stable equilibria are often needed. The stability of neural networks with multiple stable equilibria, called multistability, is quite different from the stability of neural networks with a unique equilibrium, known as monostability. Numerous results on the monostability analysis of recurrent neural networks are available; e.g., [1]–[7].

In many applications, multistability issues arise more often than monostability ones. In associative memories, the number of stable equilibria of a recurrent neural network determines the storage capacity of the memory. Therefore, the multistability analysis of equilibria plays an important role in the design of associative memories based on recurrent neural networks, and many valuable results on this topic have been obtained. Recent works address the coexistence of multiple stable attractors, including stable equilibria, periodic orbits, and almost periodic orbits; e.g., [8]–[35]. Results on the multistability of recurrent neural networks with time delays are also available; e.g., [8]–[13]. In particular, the multistability of various neural network models has been characterized, including cooperative, competitive, bidirectional associative memory, Cohen–Grossberg, complex-valued, fractional-order, and memristor-based neural networks; e.g., [14]–[24].

Notably, the multistability analysis of neural networks depends mainly on the type of activation function. In most existing results, the activation functions employed in multistability analysis are monotone nondecreasing functions such as sigmoid functions; e.g., [25]–[31]. Radial basis function networks are popular neural network models; however, radial basis functions are not monotonic, so the multistability analysis of neural networks with radial basis activation functions differs substantially from that of networks with monotone activations [32]–[37].
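As a simple illustration of this distinction (not drawn from the cited works), compare a typical sigmoid activation with a Gaussian radial basis activation, where $c$ and $\sigma > 0$ denote a generic center and width:
\[
f_{\mathrm{sig}}(x) = \frac{1}{1 + e^{-x}}, \qquad
f_{\mathrm{rbf}}(x) = \exp\!\left(-\frac{(x - c)^2}{\sigma^2}\right).
\]
The sigmoid is monotone nondecreasing on the whole real line, whereas the Gaussian radial basis function increases for $x < c$ and decreases for $x > c$, attaining its maximum at the center $c$. Consequently, fixed-point and invariant-set arguments that rely on monotone activations do not carry over directly to networks with radial basis activations.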