I. Introduction
Over the past few decades, neural networks have attracted increasing interest owing to their wide range of applications, for example, pattern recognition, associative memory, and combinatorial optimization. Among them, the Cohen–Grossberg neural network [1] is an important one, which can be described by the following ordinary differential equation:
$$\dot{x}(t)=-a\left(x(t)\right)\left[b\left(x(t)\right)-Af\left(x(t)\right)\right]\eqno{(1)}$$
where
$$\begin{aligned}
x(t)&=(x_{1}(t),\ldots,x_{n}(t))^{T}\in\mathbb{R}^{n},\\
a(x(t))&=\operatorname{diag}(a_{1}(x_{1}(t)),\ldots,a_{n}(x_{n}(t)))\in\mathbb{R}^{n\times n},\\
b(x(t))&=(b_{1}(x_{1}(t)),\ldots,b_{n}(x_{n}(t)))^{T}\in\mathbb{R}^{n},\\
f(x(t))&=(f_{1}(x_{1}(t)),\ldots,f_{n}(x_{n}(t)))^{T}\in\mathbb{R}^{n}.
\end{aligned}$$
Here $A\in\mathbb{R}^{n\times n}$ indicates the strength of the neuron interconnections within the network; $a_{i}(\cdot)$ represents an amplification function; $b_{i}(\cdot)$ is a well-behaved function; and $f_{i}(\cdot)$ denotes the activation function of the $i$th neuron at time $t$.
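To make the model concrete, the dynamics in (1) can be integrated numerically. The sketch below uses forward Euler; the particular choices $a_{i}\equiv 1$, $b_{i}(x_{i})=x_{i}$, $f_{i}=\tanh$, and the $2\times 2$ matrix $A$ are illustrative assumptions, not prescribed by the text.

```python
import numpy as np

def simulate(x0, A, a, b, f, dt=1e-3, steps=10000):
    """Forward-Euler integration of x'(t) = -a(x) * (b(x) - A @ f(x)),
    i.e., the Cohen-Grossberg model (1) in vector form."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - dt * a(x) * (b(x) - A @ f(x))
    return x

# Illustrative 2-neuron example (all concrete values are assumptions).
A = np.array([[0.2, -0.1],
              [0.1,  0.3]])
x = simulate(x0=[1.0, -0.5],
             A=A,
             a=lambda x: np.ones_like(x),  # amplification a_i(x_i) = 1
             b=lambda x: x,                # behaved function b_i(x_i) = x_i
             f=np.tanh)                    # activation f_i = tanh
print(x)
```

For this weakly coupled choice of $A$, the trajectory contracts toward the equilibrium satisfying $b(x)=Af(x)$ (here, the origin), consistent with the stability results typically studied for (1).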