
On the Implementation of Frontier-to-Root Tree Automata in Recursive Neural Networks

SPERDUTI, ALESSANDRO
1999

Abstract

In this paper we explore the node complexity of recursive neural network implementations of frontier-to-root tree automata (FRA). Specifically, we show that an FRAO (Mealy version) with m states, l input-output labels, and maximum rank N can be implemented by a recursive neural network with O(√((log l + log m) l m^N / log l) + N log m) units and four computational layers, i.e., without counting the input layer. A lower bound is derived which is tight when no restrictions are placed on the number of layers. Moreover, we present a construction with three computational layers having node complexity O((log l + log m) √(l m^N)) and O((log l + log m) l m^N) connections. A construction with two computational layers is given that implements any given FRAO with node complexity O(l m^N) and O((log l + N log m) l m^N) connections. As a corollary, we also obtain a new upper bound for the implementation of finite-state automata (FSA) in recurrent neural networks with three computational layers.
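To make the objects being implemented concrete, the following is a minimal sketch (not the paper's neural construction) of a Mealy-style frontier-to-root tree automaton evaluated bottom-up. The particular states, labels, transition function, and outputs are illustrative choices; only the parameters m (states), l (labels), and N (maximum rank) correspond to the abstract's notation.

```python
# Hedged sketch, not the paper's construction: a Mealy-style
# frontier-to-root tree automaton (FRAO) evaluated bottom-up.
# Example parameters: m = 2 states, l = 2 labels, maximum rank N = 2.

from dataclasses import dataclass, field

@dataclass
class Tree:
    label: str
    children: list = field(default_factory=list)

def delta(label, child_states):
    """Toy Mealy transition: maps (label, tuple of child states)
    to (next state, output symbol)."""
    if label == "b":
        return "q0", "out_b"
    # Label 'a': state depends on the parity of q1-labelled children.
    parity = sum(s == "q1" for s in child_states) % 2
    return ("q1" if parity == 0 else "q0"), "out_a"

def run(tree):
    """Return (root state, outputs emitted in frontier-to-root order)."""
    outputs = []
    def visit(t):
        child_states = [visit(c) for c in t.children]
        state, out = delta(t.label, child_states)
        outputs.append(out)
        return state
    root_state = visit(tree)
    return root_state, outputs

t = Tree("a", [Tree("b"), Tree("a", [Tree("b")])])
state, outs = run(t)
```

A neural implementation replaces the table lookup in `delta` with a feedforward network applied recursively at each node; the paper's bounds count how many units and connections such a network needs as a function of m, l, and N.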


Use this identifier to cite or link to this document: https://hdl.handle.net/11577/119107
Citations
  • Scopus: 12
  • Web of Science: 9