Objective
In this project, we propose to build a biologically plausible model of sentence comprehension based on recurrent neural networks called reservoirs. Human sentence comprehension is mainly handled by prefrontal cortex areas, which have highly recurrent connectivity. In both biological and artificial neural networks, this recurrence is thought to enable the management of different aspects of time, such as working memory and contextual information processing. Here, we propose to develop the Reservoir Computing (RC) paradigm, in particular Echo State Networks (ESN) with incremental learning, to model language comprehension at the sentence level given sequential inputs of words or phonemes. In our initial research, a model processing syntactic sentence structures demonstrated generalisation and online prediction capabilities while processing sequential input. For less frequent inputs, the model provided a potential explanation for human electrophysiological data. Building on this research, a new model is proposed with the following objectives: (1) processing full semantic information, enabling contextual processing and a richer representation of meaning; (2) implementing incremental learning with noisy supervision, enabling realistic developmental language acquisition from simple to complex sentences; (3) demonstrating that the model can learn from naïve users' utterances in several languages; and (4) demonstrating that the model, when embodied in a robot, can acquire extended language capabilities through human-robot interaction, accounting for developmental schemes.
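As a rough illustration of the Reservoir Computing setup mentioned above, the sketch below shows a minimal Echo State Network in plain NumPy that reads a sentence word by word (one-hot inputs) and learns a ridge-regression readout coding which noun is the agent. The vocabulary, role coding, network size and training data are hypothetical toy choices for illustration only; they are not the project's actual model, corpora or training regime.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabulary; each word is fed as a one-hot vector, one per time step.
vocab = ["the", "boy", "ball", "hit", "was", "by"]
word_index = {w: i for i, w in enumerate(vocab)}
n_in, n_res = len(vocab), 100

# Fixed random input and recurrent weights: only the linear readout is trained.
W_in = rng.uniform(-1.0, 1.0, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # keep the spectral radius below 1 (echo state property)

def run_reservoir(sentence, leak=0.3):
    """Return the sequence of leaky-integrator reservoir states for a list of words."""
    x = np.zeros(n_res)
    states = []
    for w in sentence:
        u = np.zeros(n_in)
        u[word_index[w]] = 1.0
        x = (1.0 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy supervision with two readout units: the sentence-level target is given at every time step.
sentences = [(["the", "boy", "hit", "the", "ball"], [1.0, 0.0]),               # first noun is the agent
             (["the", "ball", "was", "hit", "by", "the", "boy"], [0.0, 1.0])]  # first noun is the object
X, Y = [], []
for words, target in sentences:
    states = run_reservoir(words)
    X.append(states)
    Y.append(np.tile(target, (len(words), 1)))
X, Y = np.vstack(X), np.vstack(Y)

# Ridge-regression readout: W_out = Y^T X (X^T X + beta I)^-1
beta = 1e-6
W_out = Y.T @ X @ np.linalg.inv(X.T @ X + beta * np.eye(n_res))

# Online prediction: readout activations after each successive word of a test sentence.
test_states = run_reservoir(["the", "boy", "hit", "the", "ball"])
print((test_states @ W_out.T).round(2))

Reading the readout after every word, rather than only at the end of the sentence, is what gives such a model its online prediction behaviour: the role estimates can be tracked and revised as each new word arrives.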
Call for proposal
FP7-PEOPLE-2013-IEF
Funding Scheme
MC-IEF - Intra-European Fellowships (IEF)
Coordinator
20148 HAMBURG
Germany