Decay-RNN-ACL-SRW2020
This is the official PyTorch implementation of the experiments described in the paper "How much complexity does an RNN architecture need to learn syntax-sensitive dependencies?", Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop.
To download the generalization-set templates, see here.
For the tests described in Sections 6.1, 6.2, and 6.3, please look in the individual model folders. For language modeling (LM), use the language modeling folder.
Dependencies:
- numpy
- pytorch >= 1.1
- inflect
- pandas
- statsmodels
Recommended: install Anaconda (a Python package manager), and then install inflect, PyTorch, and any other libraries as needed. A quick way to sanity-check the environment is sketched below.
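
The following snippet is a minimal, illustrative environment check (not part of the original repository); it simply verifies that the listed dependencies import and that the PyTorch version meets the stated requirement.

```python
# Illustrative environment check (an assumption, not part of the original repo).
# Verifies that the listed dependencies are importable and that PyTorch >= 1.1.
import numpy
import pandas
import inflect
import statsmodels
import torch

major, minor = (int(x) for x in torch.__version__.split("+")[0].split(".")[:2])
assert (major, minor) >= (1, 1), "PyTorch >= 1.1 is required"
print("All dependencies found; torch version:", torch.__version__)
```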
If you find our work useful, please consider citing us with:
@inproceedings{bhatt-etal-2020-much,
title = \"How much complexity does an {RNN} architecture need to learn syntax-sensitive dependencies?\",
author = \"Bhatt, Gantavya and
Bansal, Hritik and
Singh, Rishubh and
Agarwal, Sumeet\",
booktitle = \"Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop\",
month = jul,
year = \"2020\",
address = \"Online\",
publisher = \"Association for Computational Linguistics\",
url = \"https://www.*aclw*e*b.org/anthology/2020.acl-srw.33\",
pages = \"244--254\",
abstract = \"Long short-term memory (LSTM) networks and their variants are capable of encapsulating long-range dependencies, which is evident from their performance on a variety of linguistic tasks. On the other hand, simple recurrent networks (SRNs), which appear more biologically grounded in terms of synaptic connections, have generally been less successful at capturing long-range dependencies as well as the loci of grammatical errors in an unsupervised setting. In this paper, we seek to develop models that bridge the gap between biological plausibility and linguistic competence. We propose a new architecture, the Decay RNN, which incorporates the decaying nature of neuronal activations and models the excitatory and inhibitory connections in a population of neurons. Besides its biological inspiration, our model also shows competitive performance relative to LSTMs on subject-verb agreement, sentence grammaticality, and language modeling tasks. These results provide some pointers towards probing the nature of the inductive biases required for RNN architectures to model linguistic phenomena successfully.\",
}
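
For readers who want a feel for the idea before opening the code, below is a minimal, hypothetical sketch of a decay-style recurrent update in PyTorch. It is not the paper's exact Decay RNN formulation (see the model folders for the actual implementation); it only illustrates the general notion of a hidden state that decays toward a new candidate activation.

```python
import torch
import torch.nn as nn

class DecayStyleCell(nn.Module):
    """Illustrative decay-style RNN cell (hypothetical; not the paper's exact model).

    The hidden state is updated as a convex combination of its previous value
    and a new candidate activation, so activations decay over time rather than
    being fully overwritten at every step.
    """

    def __init__(self, input_size, hidden_size, decay=0.5):
        super().__init__()
        self.w_ih = nn.Linear(input_size, hidden_size)
        self.w_hh = nn.Linear(hidden_size, hidden_size, bias=False)
        self.decay = decay  # assumed fixed here; a real model might learn it

    def forward(self, x_t, h_prev):
        # candidate activation from the current input and the previous state
        candidate = torch.relu(self.w_ih(x_t) + self.w_hh(h_prev))
        # decaying update: the old state leaks away at rate `decay`
        return (1.0 - self.decay) * h_prev + self.decay * candidate
```

A full model would wrap such a cell in a loop over time steps and add an output layer; consult the repository code for the actual architecture and training scripts.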
