Please use this identifier to cite or link to this item: http://hdl.handle.net/1783.1/22

Auto-associative learning of on-line handwriting using recurrent neural networks

Authors: Yeung, Dit-Yan; Yeung, Kei-Wai
Issue Date: 1994-11
Summary: Traditionally, the parametric grammar-based approach to modeling and recognizing temporal sequences with hidden Markov models (HMMs) involves a crucial step in which human experts must determine the appropriate model architecture a priori. This includes, among other things, fixing the number of states in the (probabilistic) grammar and the (probabilistic) transitions between them. As a long-term effort, we attempt to develop a more domain-independent, principled approach to modeling the grammatical structure of temporal sequences without knowing the topology of the underlying grammars in advance. This is achieved through an unsupervised learning process. In particular, a discrete-time recurrent neural network model called ASCOC, which we proposed previously, is trained to learn separately the dynamics of each embedded subgrammar (or subpattern) class. These subgrammar network models are trained in an auto-associative (or self-supervised) manner, similar in spirit to the principal component analysis (PCA) learning paradigm for feedforward neural networks, except that our focus here is on recurrent neural networks that model dynamical behavior. In the pilot study presented in this paper, some issues of this new approach to temporal sequence processing are investigated in the domain of on-line handwriting modeling and recognition. Possible future research directions are also discussed.
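The report itself is not reproduced on this page, but the scheme the summary describes, training one auto-associative recurrent network per subpattern class and (presumably) recognizing an unseen trajectory by which class model reconstructs it best, can be illustrated with a minimal sketch. The sketch below is a modern PyTorch approximation, not the ASCOC model from the report: the 2-dimensional pen-coordinate input, the small hidden state acting as a bottleneck, and the use of reconstruction error for recognition are all illustrative assumptions.

# Minimal sketch, NOT the ASCOC architecture: one recurrent autoencoder
# per subpattern class, trained self-supervised to reproduce its own
# input trajectory (target = input), in the spirit of PCA-like
# auto-associative learning but with a recurrent state.
import torch
import torch.nn as nn

class RecurrentAutoencoder(nn.Module):
    def __init__(self, input_dim=2, hidden_dim=8):  # assumed (x, y) pen input
        super().__init__()
        # The small hidden state serves as a bottleneck, analogous to
        # the compressed code in feedforward auto-associators.
        self.rnn = nn.RNN(input_dim, hidden_dim, batch_first=True)
        self.readout = nn.Linear(hidden_dim, input_dim)

    def forward(self, x):              # x: (batch, time, input_dim)
        h, _ = self.rnn(x)
        return self.readout(h)         # reconstruction of x, step by step

def train_class_model(model, sequences, epochs=100, lr=1e-2):
    """Auto-associative (self-supervised) training: target equals input."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for x in sequences:            # each x: (1, time, input_dim)
            opt.zero_grad()
            loss_fn(model(x), x).backward()
            opt.step()

def recognize(models, x):
    """Assign x to the class whose model reconstructs it with least error.
    (An assumed decision rule, consistent with the auto-associative setup.)"""
    with torch.no_grad():
        errors = {label: nn.functional.mse_loss(m(x), x).item()
                  for label, m in models.items()}
    return min(errors, key=errors.get)

In use, one such model would be trained per subgrammar class on that class's pen trajectories only; because training never needs class labels on individual time steps, the grammar topology does not have to be specified in advance, which is the point the summary emphasizes over the HMM approach.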
Language: English
Format: Technical report
Files in this item:
File: tr94-38.pdf
Size: 376673 B
Format: Adobe PDF