Thursday, 1 November 2018

Improving Distant Supervision with Maxpooled Attention and Sentence-Level Supervision. (arXiv:1810.12956v1 [cs.CL])

We propose an effective multitask learning setup for reducing distant supervision noise by leveraging sentence-level supervision. We show how sentence-level supervision can be used to improve the encoding of individual sentences, and to learn which input sentences are more likely to express the relationship between a pair of entities. We also introduce a novel neural architecture for collecting signals from multiple input sentences, which combines the benefits of attention and maxpooling. The proposed method increases AUC by 10% (from 0.261 to 0.284), and outperforms recently published results on the FB-NYT dataset.
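The abstract does not spell out the architecture, but one plausible reading of "combining the benefits of attention and maxpooling" is to attention-weight each sentence encoding in a bag and then take an elementwise max across sentences. The sketch below illustrates that idea in plain Python; the function name `maxpooled_attention`, the `query` vector, and the exact composition are assumptions for illustration, not the paper's definitive formulation.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of scalar scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def maxpooled_attention(sentence_vecs, query):
    # Score each sentence encoding by its dot product with a query vector
    # (a stand-in for an entity-pair representation).
    scores = [sum(q * x for q, x in zip(query, v)) for v in sentence_vecs]
    weights = softmax(scores)
    # Scale each sentence encoding by its attention weight, then take an
    # elementwise max across sentences to form the bag representation.
    weighted = [[w * x for x in v] for w, v in zip(weights, sentence_vecs)]
    return [max(col) for col in zip(*weighted)]

# Example: a bag of three 2-dimensional sentence encodings.
bag = maxpooled_attention([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]], [1.0, 1.0])
```

With equal attention scores, each sentence gets weight 1/3 and the maxpool keeps the strongest weighted value per dimension, so `bag` is roughly `[0.333, 0.333]` here; in general the attention lets informative sentences dominate while the maxpool stays robust to noisy ones.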



from cs updates on arXiv.org https://ift.tt/2AEzBBf
