Abstract:
Standard neural networks and statistical methods are usually believed to be inadequate for dealing with complex structures because of their feature-based approach. Indeed, feature-based approaches usually fail to yield satisfactory solutions because they are sensitive to the a priori selection of the features and cannot represent any specific information on the relationships among the components of the structures. However, we show that neural networks can, in fact, represent and classify structured patterns. The key idea underpinning our approach is the use of the so-called "generalized recursive neuron," which is essentially a generalization to structures of a recurrent neuron. By using generalized recursive neurons, all the supervised networks developed for the classification of sequences, such as backpropagation through time networks, real-time recurrent networks, simple recurrent networks, recurrent cascade correlation networks, and neural trees, can be generalized to structures. We present the results obtained by some of the above networks (with generalized recursive neurons) on the classification of logic terms.
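The core idea can be illustrated with a minimal sketch: where a recurrent neuron feeds its previous state back as an input, a generalized recursive neuron receives the states computed for a node's children, so activations propagate bottom-up over a labeled tree (e.g., a logic term) instead of left-to-right over a sequence. This is an illustrative implementation under assumed conventions, not the paper's exact formulation; the class, parameter names (`W`, `W_c`, `b`), and the fixed maximum number of children are hypothetical.

```python
import numpy as np

class RecursiveNeuronLayer:
    """Sketch of a layer of generalized recursive neurons (assumed form)."""

    def __init__(self, n_label, n_hidden, max_children, seed=0):
        rng = np.random.default_rng(seed)
        # W maps a node's label vector to the hidden state; W_c[k] maps the
        # hidden state of the k-th child (hypothetical parameter names).
        self.W = rng.normal(0.0, 0.1, (n_hidden, n_label))
        self.W_c = rng.normal(0.0, 0.1, (max_children, n_hidden, n_hidden))
        self.b = np.zeros(n_hidden)

    def encode(self, node):
        """Encode a tree bottom-up. A node is (label_vector, [child_nodes])."""
        label, children = node
        s = self.W @ np.asarray(label, dtype=float) + self.b
        for k, child in enumerate(children):
            # Children are encoded first; their states replace the single
            # "previous state" input of an ordinary recurrent neuron.
            s += self.W_c[k] @ self.encode(child)
        return np.tanh(s)  # hidden representation of the whole subtree
```

For instance, the logic term f(a, g(b)) would be encoded by first encoding the leaves a and b, then g(b), and finally the root f, whose hidden state can be passed to an output layer for classification.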
Published in: IEEE Transactions on Neural Networks (Volume 8, Issue 3, May 1997)
DOI: 10.1109/72.572108