
An analysis of global asymptotic stability of delayed Cohen-Grossberg neural networks via nonsmooth analysis


Abstract:

In this paper, using a method based on nonsmooth analysis and the Lyapunov method, several new sufficient conditions are derived to ensure the existence and global asymptotic stability of the equilibrium point for delayed Cohen-Grossberg neural networks. The obtained criteria can be checked easily in practice and are distinguished from previous studies in that they do not require smoothness of the behaved functions, boundedness of the activation functions, or symmetry of the connection matrices. Moreover, two examples are exploited to illustrate the effectiveness of the proposed criteria in comparison with some existing results.
Published in: IEEE Transactions on Circuits and Systems I: Regular Papers ( Volume: 52, Issue: 9, September 2005)
Page(s): 1854 - 1861
Date of Publication: 30 September 2005


I. Introduction

In the past few decades, neural networks have received increasing interest due to their wide range of applications, for example, pattern recognition, associative memory, and combinatorial optimization. Among them, the Cohen-Grossberg neural network [1] is an important model, described by the following ordinary differential equation:

\dot{x}(t) = -a(x(t))\left[b(x(t)) - A f(x(t))\right]   (1)

where

x(t) = (x_1(t), \cdots, x_n(t))^T \in \mathbb{R}^n
a(x(t)) = \mathrm{diag}(a_1(x_1(t)), \cdots, a_n(x_n(t))) \in \mathbb{R}^{n\times n}
b(x(t)) = (b_1(x_1(t)), \cdots, b_n(x_n(t)))^T \in \mathbb{R}^n
f(x(t)) = (f_1(x_1(t)), \cdots, f_n(x_n(t)))^T \in \mathbb{R}^n.

Here A \in \mathbb{R}^{n\times n} indicates the strength of the neuron interconnection within the network; a_i(\cdot) represents an amplification function; b_i(\cdot) is a behaved function; and f_i(x_i(t)) denotes the activation function of the i-th neuron at time t.
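To make the structure of model (1) concrete, the following sketch simulates the system with forward Euler integration. The particular choices of a_i, b_i, f_i, and the matrix A below are illustrative assumptions only (not taken from the paper): a_i is a bounded positive amplification, b_i is linear, and f_i is the bounded tanh activation, a combination under which the origin is an equilibrium.

```python
import numpy as np

# Illustrative simulation of the Cohen-Grossberg model (1):
#   dx/dt = -a(x) [ b(x) - A f(x) ]
# All concrete function choices below are assumptions made for this sketch.

def a(x):
    # amplification functions a_i: positive and bounded (assumed form)
    return 1.0 + 0.5 / (1.0 + x**2)

def b(x):
    # behaved functions b_i: here simply linear, b_i(x_i) = 2 x_i (assumed)
    return 2.0 * x

def f(x):
    # activation functions f_i: tanh, bounded and 1-Lipschitz (assumed)
    return np.tanh(x)

# interconnection strengths A (assumed values; note A need not be symmetric)
A = np.array([[0.5, -0.3],
              [0.2,  0.4]])

def simulate(x0, dt=1e-3, steps=20000):
    """Integrate (1) with forward Euler from initial state x0."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-a(x) * (b(x) - A @ f(x)))
    return x

# Since b(0) = 0 and f(0) = 0, x* = 0 is an equilibrium of this instance,
# and the linear term dominates the bounded activation, so trajectories decay.
x_final = simulate([1.0, -0.8])
```

Running the sketch, the state converges toward the zero equilibrium; this mirrors the kind of global-convergence behavior the paper's criteria are designed to certify, here for one hand-picked stable instance.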

