I. Introduction
Adaptive control [1] is a widely used approach for controlling uncertain dynamic systems. The basic idea of adaptive control is to estimate the uncertainties in the plant (or, equivalently, in the corresponding controller) on-line from the measured signals. In principle, the system under control can be uncertain in terms of its dynamic structure (nonparametric uncertainty) or its parameters (parametric uncertainty). The basic objective of adaptive control is to maintain consistent performance of the control system in the presence of these uncertainties. However, conventional adaptive control theory can only handle systems whose dynamic structure is known and whose unknown parameters are constant or slowly varying [17], [20]. Furthermore, conventional adaptive controllers cannot exploit the experience of human operators, which is usually expressed in linguistic terms.
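For concreteness, a standard textbook instance of such on-line parameter estimation (a sketch, not the formulation of [1]; the regressor $\phi$, gains $\gamma$, $\lambda$, and reference trajectory $x_{m}$ are introduced here only for illustration) is the certainty-equivalence controller with a gradient update law for a scalar plant with purely parametric uncertainty:
\[
\dot{x} = \theta^{*\top}\phi(x) + u, \qquad
u = -\hat{\theta}^{\top}\phi(x) + \dot{x}_{m} - \lambda e, \qquad e = x - x_{m},
\]
\[
\dot{\hat{\theta}} = \gamma\, e\, \phi(x), \qquad \gamma > 0,\ \lambda > 0,
\]
where $\hat{\theta}$ is the on-line estimate of the unknown parameter vector $\theta^{*}$. With the Lyapunov function $V = \tfrac{1}{2}e^{2} + \tfrac{1}{2\gamma}\tilde{\theta}^{\top}\tilde{\theta}$, $\tilde{\theta} = \hat{\theta} - \theta^{*}$, one obtains $\dot{V} = -\lambda e^{2} \le 0$, so the tracking error converges to zero. Note that the update law presupposes a known regressor $\phi(x)$, i.e., a known dynamic structure, which is precisely the limitation of conventional adaptive control noted above.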