Publication
Symbolic Association Learning inspired by the Symbol Grounding Problem
Federico Raue; Marcus Liwicki; Andreas Dengel
In: Thomas Villmann; Frank-Michael Schleif (eds.). Machine Learning Reports 04/2016. Workshop New Challenges in Neural Computation (NC2-2016), located at the German Conference on Pattern Recognition (GCPR), September 12, Hannover, Germany, Pages 40-47, online, 2016.
Abstract
In this work, we present a novel model for a cognitive association
task where two visual sequences represent different instances of
the same semantic sequence. Also, the model learns the binding between
abstract concepts and vectorial representations (e.g., a 1-of-K scheme). In
this case, the output vectors of the network are used as symbolic features,
and the network learns to ground the abstract concepts to them. This
task is inspired by the Symbol Grounding Problem. Our model uses one
Long Short-Term Memory (LSTM) network with an EM-based training rule. A key
feature of the training is that the output of one sequence serves as the
target for the other sequence when updating the LSTM network, and vice
versa. Our architecture is based on a recent model that uses two LSTM
networks for this association task. We evaluate our model on a dataset
generated from MNIST. The presented model reaches results similar
to those of the two-LSTM model. We also compare our model
to an LSTM trained on a single sequence with a predefined binding of
the abstract concepts, and the performance is again comparable.
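The swapped-target idea in the abstract can be sketched in a heavily simplified form. The toy below is an illustration under stated assumptions, not the authors' implementation: it replaces the LSTM with a linear softmax classifier, replaces MNIST with synthetic prototype vectors (`PROTO_A`, `PROTO_B` are invented names), and keeps only the core mechanic — the 1-of-K prediction on one view of a concept is used as the hard target for the other view, and vice versa.

```python
import numpy as np

rng = np.random.default_rng(0)

K, D = 3, 8                        # number of abstract concepts, input dimension
PROTO_A = rng.normal(size=(K, D))  # prototypes for the first "visual" view
PROTO_B = rng.normal(size=(K, D))  # prototypes for the second view

def sample(view, c):
    """Noisy instance of concept c in one view (stand-in for an MNIST digit)."""
    return view[c] + 0.1 * rng.normal(size=D)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def onehot(i):
    t = np.zeros(K)
    t[i] = 1.0
    return t

# One shared weight matrix plays the role of the single LSTM network.
W = 0.01 * rng.normal(size=(K, D))

def agreement(W, n=300):
    """Fraction of samples where both views activate the same output unit."""
    hits = 0
    for _ in range(n):
        c = rng.integers(K)
        ya = softmax(W @ sample(PROTO_A, c))
        yb = softmax(W @ sample(PROTO_B, c))
        hits += int(ya.argmax() == yb.argmax())
    return hits / n

before = agreement(W)
lr = 0.5
for _ in range(2000):
    c = rng.integers(K)                    # same semantic concept in both sequences
    xa, xb = sample(PROTO_A, c), sample(PROTO_B, c)
    ya, yb = softmax(W @ xa), softmax(W @ xb)
    # swapped 1-of-K targets: each view's prediction labels the other view
    ta, tb = onehot(yb.argmax()), onehot(ya.argmax())
    # softmax cross-entropy gradient (y - t) x^T, applied in both directions
    W -= lr * (np.outer(ya - ta, xa) + np.outer(yb - tb, xb))
after = agreement(W)
print(f"agreement before: {before:.2f}, after: {after:.2f}")
```

Note that this toy only shows the swapped-target mechanics; the paper's EM training additionally resolves which output unit grounds which abstract concept, which is not modeled here.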