Source: IEEE Xplore (IEEE Journals & Magazine)

Exponential Contingency Explosion: Implications for Artificial General Intelligence



Abstract:

The failure of complex artificial intelligence (AI) systems seems ubiquitous. To provide a model describing these shortcomings, we define complexity in terms of a system's sensors and the number of environments or situations in which it performs. Complexity is measured not by the difficulty of design, but by the final performance of the system as a function of the sensor and environment counts. As the complexity of AI, or any system, increases linearly, the contingencies increase exponentially and the number of possible design performances increases as a compound exponential. In this worst-case scenario, the exponential increase in contingencies makes the assessment of all contingencies difficult and eventually impossible. As the contingencies grow large, unexpected and undesirable contingencies are all expected to increase in number. This worst-case scenario applies to systems that are highly connected, or conjunctive. For systems that are loosely connected, or disjunctive, contingencies grow only linearly with complexity. Mitigation of unexpected outcomes in either case can be accomplished using tools such as design expertise and iterative redesign informed by intelligent testing.
Page(s): 2800 - 2808
Date of Publication: 04 March 2021


I. Introduction

The more complex a system, the greater the number of performance contingencies. This growth is a concern requiring serious consideration in the pursuit of artificial general intelligence (AGI) [2], [29], [39]. The analysis herein does not address the goals of AGI research, but rather the difficulty of achieving them. Any effort at achieving AGI will, by necessity, be complex. Our analysis is applicable to all complex systems, including complex artificial intelligence (AI).
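The contrast the abstract draws between conjunctive and disjunctive growth can be sketched numerically. The following is a minimal illustration under my own assumptions, not the paper's exact notation: each of n sensors takes b discrete values; in the conjunctive (worst) case every combination of sensor readings is a distinct contingency, giving b^n contingencies, while in the disjunctive case contingencies scale only as b*n; and if the designer may choose one of r responses per contingency, the number of possible design performances is r^(b^n), a compound exponential.

```python
def conjunctive_contingencies(n_sensors: int, values_per_sensor: int) -> int:
    """Worst case (highly connected): every combination of sensor readings
    is a distinct contingency, so the count is exponential in sensor count."""
    return values_per_sensor ** n_sensors


def disjunctive_contingencies(n_sensors: int, values_per_sensor: int) -> int:
    """Loosely connected case: each sensor contributes its readings
    independently, so the count grows linearly in sensor count."""
    return values_per_sensor * n_sensors


def design_performances(n_sensors: int, values_per_sensor: int,
                        responses: int) -> int:
    """Compound exponential: one of `responses` behaviors may be assigned
    to each conjunctive contingency."""
    return responses ** conjunctive_contingencies(n_sensors, values_per_sensor)


if __name__ == "__main__":
    # Binary sensors (b = 2): the conjunctive count doubles with each added
    # sensor while the disjunctive count grows by a constant step.
    for n in (2, 4, 8, 16):
        print(f"n={n:2d}  conjunctive={conjunctive_contingencies(n, 2):6d}  "
              f"disjunctive={disjunctive_contingencies(n, 2):3d}")
```

Even at modest scale the gap is stark: with 16 binary sensors the conjunctive model already yields 65,536 contingencies against 32 for the disjunctive model, which is why the paper argues that exhaustive assessment of contingencies eventually becomes impossible for highly connected systems.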

