Abstract
The primary purpose of this work is to construct a recurrent neural network (RNN) architecture that learns context-free grammars (CFGs) with recursive rules, with the aim of gaining insight into human language acquisition. Specifically, we are interested in how RNNs can learn recursive rules. The models proposed here are constructed from two promising connectionist techniques: recursive auto-associative memory (RAAM) and the simple recurrent network (SRN). RAAM learns to represent parse trees as real-valued vectors, and the SRN learns to parse sentences. We investigated whether the RAAM/SRN model can learn to parse the language {aⁿbⁿ | n ≥ 1} and two other languages generated by simple CFGs with recursively embedded phrases.
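For readers unfamiliar with RAAM, the following is a minimal sketch of its core auto-associative step: an encoder compresses two child vectors into a single parent vector of the same dimensionality, and a decoder is trained to reconstruct the children from that parent, so that whole parse trees can be packed into fixed-size real-valued vectors by repeated composition. The dimensionality, learning rate, and training loop below are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

# Minimal RAAM sketch (illustrative; sizes and hyperparameters are assumptions).
rng = np.random.default_rng(0)
D = 8                                      # size of each node vector (assumed)
W_e = rng.normal(0, 0.1, (D, 2 * D)); b_e = np.zeros(D)       # encoder weights
W_d = rng.normal(0, 0.1, (2 * D, D)); b_d = np.zeros(2 * D)   # decoder weights

def encode(left, right):
    """Compress two child vectors into one parent vector of the same size."""
    return np.tanh(W_e @ np.concatenate([left, right]) + b_e)

def decode(parent):
    """Reconstruct the two children from a parent vector."""
    out = np.tanh(W_d @ parent + b_d)
    return out[:D], out[D:]

def train_step(left, right, lr=0.1):
    """One backprop step on the auto-associative reconstruction loss."""
    global W_e, b_e, W_d, b_d
    x = np.concatenate([left, right])
    h = np.tanh(W_e @ x + b_e)             # parent (compressed) representation
    y = np.tanh(W_d @ h + b_d)             # reconstructed children
    err = y - x
    dy = err * (1 - y ** 2)                # backprop through decoder tanh
    dh = (W_d.T @ dy) * (1 - h ** 2)       # backprop through encoder tanh
    W_d -= lr * np.outer(dy, h); b_d -= lr * dy
    W_e -= lr * np.outer(dh, x); b_e -= lr * dh
    return 0.5 * float(err @ err)

# Toy usage: learn to compress and reconstruct two fixed leaf vectors.
a, b = rng.uniform(-0.9, 0.9, D), rng.uniform(-0.9, 0.9, D)
for _ in range(2000):
    loss = train_step(a, b)
print("reconstruction loss:", loss)        # should be small after training
parent = encode(a, b)                      # fixed-size code for the tree (a b)
```

Deeper trees would be composed by feeding parent vectors back in as children, which is how RAAM can represent recursively embedded structure; the SRN half of the model (not sketched here) would then be trained to produce such vectors while reading a sentence symbol by symbol.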
Original language | English |
---|---|
Pages | 2602-2607 |
Number of pages | 6 |
Publication status | Published - 2001 |
Event | International Joint Conference on Neural Networks (IJCNN'01) - Washington, DC, United States; Duration: 15 Jul 2001 → 19 Jul 2001 |
Conference
Conference | International Joint Conference on Neural Networks (IJCNN'01) |
---|---|
Country/Territory | United States |
City | Washington, DC |
Period | 15/07/01 → 19/07/01 |