Thursday, 17 May 2018

SoPa: Bridging CNNs, RNNs, and Weighted Finite-State Machines. (arXiv:1805.06061v1 [cs.CL])

Recurrent and convolutional neural networks comprise two distinct families of models that have proven useful for encoding natural language utterances. In this paper we present SoPa, a new model that aims to bridge these two approaches. SoPa combines neural representation learning with weighted finite-state automata (WFSAs) to learn a soft version of traditional surface patterns. We show that SoPa is an extension of a one-layer CNN, and that such CNNs are equivalent to a restricted version of SoPa, and accordingly, to a restricted form of WFSA. Empirically, on three text classification tasks, SoPa performs comparably to or better than both a BiLSTM (RNN) baseline and a CNN baseline, and is particularly useful in small data settings.
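For intuition, here is a minimal sketch (not the authors' code) of how a single soft pattern could be scored as a linear-chain WFSA over word embeddings. The names, shapes, and the sigmoid transition scorer are assumptions for illustration only. With no self-loops or epsilon transitions, the score reduces to a product of per-position soft matches over a fixed-width span, max-pooled over positions, which mirrors the restricted, CNN-like case the abstract describes.

```python
# Illustrative sketch of one "soft pattern" scored by a linear-chain WFSA.
# All names, parameter shapes, and the scoring function are assumptions,
# not the paper's actual implementation.

import numpy as np

rng = np.random.default_rng(0)

emb_dim = 50        # word-embedding dimension (assumed)
pattern_len = 4     # number of pattern states to traverse (assumed)

# One transition scorer per pattern state: a vector u_i and bias b_i, so the
# score of consuming word x while moving from state i to i+1 is
# sigmoid(u_i . x + b_i) -- a soft match instead of a hard token match.
U = rng.normal(scale=0.1, size=(pattern_len, emb_dim))
b = np.zeros(pattern_len)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def pattern_score(word_vectors):
    """Max-product scoring: best score of matching the full pattern on any
    contiguous span of the sentence, analogous to a CNN filter response
    followed by max-pooling over positions."""
    n = len(word_vectors)
    best = 0.0
    for start in range(n - pattern_len + 1):
        score = 1.0
        for i in range(pattern_len):
            x = word_vectors[start + i]
            score *= sigmoid(U[i] @ x + b[i])   # soft transition i -> i+1
        best = max(best, score)
    return best

# Toy usage: a "sentence" of 10 random word vectors.
sentence = rng.normal(size=(10, emb_dim))
print(pattern_score(sentence))
```

In the full model, a collection of such pattern scores would serve as the feature vector fed to a classifier; adding self-loop and epsilon transitions is what lets the WFSA view generalize beyond the fixed-width CNN case.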



from cs updates on arXiv.org https://ift.tt/2L86gCv