ACL

Adapting Unsupervised Syntactic Parsing Methodology for Discourse Dependency Parsing

Abstract: One of the main bottlenecks in developing discourse dependency parsers is the lack of annotated training data. A potential solution is to utilize abundant unlabeled data with unsupervised techniques, but there has so far been little research in unsupervised discourse dependency parsing. Fortunately, unsupervised syntactic dependency parsing has been studied for decades, which …


Robust Transfer Learning with Pretrained Language Models through Adapters

Abstract: Transfer learning with large pretrained transformer-based language models like BERT has become a dominant approach for most NLP tasks. Simply fine-tuning those large language models on downstream tasks, or combining that with task-specific pretraining, is often not robust. In particular, performance varies considerably as the random seed changes or the number of …

