I. Introduction
Genetic algorithms (GAs) and neural networks (NNs) represent two evolving technologies inspired by biological information science. NNs are derived from brain theory and simulate the learning behavior of an individual, while GAs are developed from Darwin's theory that populations evolve to achieve better fitness. Although the two technologies seem quite different in the time scale of action, the number of individuals involved, and the process scheme, their similar dynamic behaviors stimulated researchers to consider a synergistic combination of the two, offering more problem-solving power than either alone [3], [4], [6]–[8]. Various granular soft-computing techniques are useful for data mining applications [1]–[7], [9], [12]. Granular NNs are used in data fusion and data mining [10]. The genetic fuzzy NN uses a GA to initialize parameters and then applies the gradient-descent learning algorithm to complete the training and discover fuzzy rules [11]. This earlier genetic fuzzy NN [11] has only two sequential steps: 1) GA-based training and 2) gradient-descent learning.

To improve on it, we propose a new hybrid iterative evolutionary learning algorithm that truly merges GAs and gradient-descent learning in an iterative manner. Importantly, to avoid the local-minima problem, the GA generates optimized parameters for the fuzzy NN with knowledge discovery (FNNKD) [11]; several FNNKDs are then trained by gradient descent to generate new parameter values, which are returned to the GA for further optimization, and the cycle repeats until the termination criteria are satisfied. Simulation results show that this hybrid training algorithm, which alternates GA and gradient-descent phases, is more powerful than the earlier sequential genetic fuzzy neural algorithm of [11].
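The alternating loop described above can be sketched in a minimal, self-contained form. This is an illustrative toy, not the paper's FNNKD: the quadratic `loss` stands in for the fuzzy NN's training error, and the GA here uses only selection and Gaussian mutation; all function names and parameter values are assumptions made for the sketch.

```python
import random

def loss(params):
    """Toy stand-in for the FNNKD training error (a quadratic bowl with
    optimum at 3.0 in every coordinate; an assumption for illustration)."""
    return sum((p - 3.0) ** 2 for p in params)

def ga_generation(population, n_keep=4, sigma=0.5):
    """One GA generation (global search): keep the fittest individuals and
    refill the population with mutated copies of the survivors."""
    ranked = sorted(population, key=loss)
    survivors = ranked[:n_keep]
    children = [[p + random.gauss(0.0, sigma) for p in random.choice(survivors)]
                for _ in range(len(population) - n_keep)]
    return survivors + children

def gradient_refine(params, steps=5, lr=0.1):
    """A few gradient-descent steps (local refinement); the gradient of
    (p - 3)^2 is 2 * (p - 3)."""
    for _ in range(steps):
        params = [p - lr * 2.0 * (p - 3.0) for p in params]
    return params

def hybrid_train(pop_size=10, dim=3, rounds=15):
    """Alternate GA search and gradient refinement, mirroring the iterative
    GA <-> gradient-descent cycle described in the text."""
    population = [[random.uniform(-10.0, 10.0) for _ in range(dim)]
                  for _ in range(pop_size)]
    for _ in range(rounds):
        population = ga_generation(population)                 # GA phase
        population = [gradient_refine(p) for p in population]  # GD phase
    return min(population, key=loss)

random.seed(0)
best = hybrid_train()
```

In this sketch the GA supplies diverse starting points that help the gradient phase escape poor basins, while gradient descent sharpens each candidate between generations, which is the division of labor the hybrid scheme relies on.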