I. Introduction
A recent development in digital computing, approximate computing trades calculation accuracy for improvements in performance, energy efficiency, or other system resources. Rather than precise outcomes, approximate computing techniques aim to produce results that are "approximately correct" within predetermined tolerances. Many error-tolerant applications can benefit from approximate computing; examples include data mining, machine learning, and multimedia processing.
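To make the accuracy-for-efficiency trade-off concrete, the sketch below illustrates loop perforation, one common approximation technique: skipping a fraction of loop iterations to cut work at the cost of a small, bounded error. The example and its parameters (the stride, the data distribution) are illustrative assumptions, not taken from this paper.

```python
import random

def exact_mean(xs):
    # Baseline: process every element.
    return sum(xs) / len(xs)

def perforated_mean(xs, stride=4):
    # Loop perforation: process only every `stride`-th element,
    # trading accuracy for roughly a stride-fold reduction in work.
    sampled = xs[::stride]
    return sum(sampled) / len(sampled)

random.seed(0)
data = [random.gauss(100.0, 10.0) for _ in range(100_000)]

exact = exact_mean(data)
approx = perforated_mean(data, stride=4)
rel_error = abs(approx - exact) / exact
```

With a stride of 4, only a quarter of the elements are summed, yet for data like this the relative error of the mean typically stays well below one percent, i.e. within a tolerance many error-tolerant applications would accept.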