Deep Metacyclic Parameter Search: Non-Convex Optimization Based on Evolutionary Computing with a Few Twists


Abstract:

This paper proposes a new framework for non-convex optimization, referred to as Metacyclic Parameter Search (MEPS). The framework combines several approaches that are well known from the field of artificial intelligence, namely the iterative update of generations of candidate solutions prevalent in evolutionary computing and particle swarm optimization, as well as metacognitive approaches implementing reward-based improvement such as (deep) reinforcement learning. The result is a gradient-free approach to non-convex optimization that combines the benefits (and alleviates the mutual shortcomings) of each of these individual approaches. Following an overview of the framework, its workings are demonstrated on three rudimentary examples.
Date of Conference: 23-25 October 2019
Date Added to IEEE Xplore: 11 May 2020
Print on Demand (PoD) ISSN: 2380-7350
Conference Location: Naples, Italy

I. Introduction

Non-convex optimization is a broad field of mathematics that finds many applications in engineering tasks where the goal is to find sufficiently good solutions on high-dimensional parametric manifolds. This paper proposes a novel framework for solving such problems that is loosely based on a variety of existing approaches, but at the same time aims to correct some of their respective shortcomings.
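To make the problem setting concrete, the following is a minimal sketch of the kind of gradient-free, population-based search the paper builds on. It is not the MEPS algorithm itself; it is a generic evolutionary loop (selection of elites plus Gaussian mutation, both chosen here for illustration) applied to the Rastrigin function, a standard non-convex benchmark. All names, parameters, and design choices below are assumptions for demonstration, not details taken from the paper.

```python
import math
import random


def rastrigin(x):
    """Standard non-convex benchmark; global minimum 0 at the origin."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)


def evolutionary_search(objective, dim, pop_size=50, generations=200,
                        sigma=0.3, seed=0):
    """Generic gradient-free search: each generation, keep the best
    candidates (elites) and refill the population by mutating them."""
    rng = random.Random(seed)
    population = [[rng.uniform(-5.12, 5.12) for _ in range(dim)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=objective)
        elites = population[: pop_size // 5]  # survivors of this generation
        # Offspring: Gaussian perturbations of randomly chosen elites,
        # with the elites themselves carried over (elitism).
        population = [
            [xi + rng.gauss(0, sigma) for xi in rng.choice(elites)]
            for _ in range(pop_size - len(elites))
        ] + elites
    return min(population, key=objective)


best = evolutionary_search(rastrigin, dim=2)
print(rastrigin(best))  # far below the value of a typical random start
```

The loop never touches a gradient: progress comes purely from evaluating candidates and biasing the next generation toward the better ones, which is what makes this family of methods applicable to the non-smooth, high-dimensional objectives described above.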
