
A Degree-Dependent Polynomial-Based Reciprocally Convex Matrix Inequality and Its Application to Stability Analysis of Delayed Neural Networks


Abstract:

In this article, several improved stability criteria for neural networks with time-varying delays (DNNs) are proposed. A degree-dependent polynomial-based reciprocally convex matrix inequality (RCMI) is proposed to obtain less conservative stability criteria. Unlike previous RCMIs, the matrix inequality in this article produces a polynomial of arbitrary degree in the time-varying delay, which helps to reduce conservatism. In addition, an improved lemma is presented to reduce the computational complexity of handling the negative definiteness of the resulting high-degree polynomial terms. Applying these matrix inequalities together with the improved negative-definiteness condition yields a more relaxed stability criterion for time-varying DNNs. Two examples are provided to illustrate this.
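For background (this is the classical reciprocally convex combination lemma of Park et al., 2011, not the degree-dependent version proposed in this article): for a scalar $\alpha \in (0,1)$ and matrices $R_1, R_2 \succ 0$, if there exists a matrix $S$ such that $\begin{bmatrix} R_1 & S \\ S^{\mathsf T} & R_2 \end{bmatrix} \succeq 0$, then

```latex
\begin{bmatrix} \tfrac{1}{\alpha} R_1 & 0 \\ 0 & \tfrac{1}{1-\alpha} R_2 \end{bmatrix}
\succeq
\begin{bmatrix} R_1 & S \\ S^{\mathsf T} & R_2 \end{bmatrix},
```

which bounds the reciprocally convex combination $\tfrac{1}{\alpha}R_1$, $\tfrac{1}{1-\alpha}R_2$ by a single delay-independent matrix; the degree-dependent RCMI instead produces a polynomial bound in the time-varying delay.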
Published in: IEEE Transactions on Cybernetics ( Volume: 54, Issue: 7, July 2024)
Page(s): 4164 - 4176
Date of Publication: 28 February 2024

PubMed ID: 38416629


I. Introduction

The versatility and effectiveness of neural networks have led to applications in many fields, including but not limited to optimization, linear and nonlinear programming, associative memory, pattern recognition, and computer vision [1]. It is widely recognized that time delays, an inherent aspect of neural networks, can cause instability [2], [3], [4], [5], [6], [7]. The stability of time-varying delayed neural networks (DNNs) is therefore fundamental, and the stability analysis problem of DNNs has been an active research topic in recent decades. The Lyapunov–Krasovskii functional (LKF) method combined with linear matrix inequality (LMI) techniques is one of the current mainstream approaches to analyzing the stability of DNNs [8], [9], [10], [11], [12], [13], [14]. However, this method is conservative, as it provides only sufficient conditions. To reduce the conservatism of the stability criteria for DNNs, the construction of a suitable positive-definite LKF and a tight bounding of its time derivative play crucial roles.
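As a minimal, delay-free illustration of the Lyapunov idea underlying the LKF/LMI framework (this is textbook material, not the delay-dependent criterion of this article): a linear system $\dot{x} = Ax$ is asymptotically stable if and only if, for any $Q \succ 0$, the Lyapunov equation $A^{\top}P + PA = -Q$ admits a solution $P \succ 0$. The sketch below solves this equation by Kronecker-product vectorization in plain NumPy; the matrix `A` is chosen purely for illustration.

```python
import numpy as np

def lyapunov_solve(A, Q):
    """Solve A^T P + P A = -Q by vectorization.

    With row-major flattening, vec(A^T P) = (A^T kron I) vec(P) and
    vec(P A) = (I kron A^T) vec(P), so the equation becomes a linear
    system in the n^2 entries of P.
    """
    n = A.shape[0]
    I = np.eye(n)
    K = np.kron(A.T, I) + np.kron(I, A.T)
    return np.linalg.solve(K, -Q.flatten()).reshape(n, n)

# Illustrative Hurwitz matrix (eigenvalues -2 and -1).
A = np.array([[-2.0, 1.0],
              [0.0, -1.0]])
Q = np.eye(2)

P = lyapunov_solve(A, Q)
# P is symmetric positive definite, certifying stability of dx/dt = A x.
```

For delayed systems, the same sufficiency logic applies to an LKF instead of the quadratic function $x^{\top}Px$, and the resulting conditions are LMIs in the functional's decision matrices; bounding the LKF derivative tightly (e.g., via reciprocally convex inequalities) is what reduces conservatism.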

