Multilingual Extraction and Categorization of Lexical Collocations with Graph-aware Transformers

Luis Espinosa-Anke, A. Shvets, 2 Authors, Leo Wanner

2022 · DOI: 10.48550/arXiv.2205.11456
Abstract

Recognizing and categorizing lexical collocations in context is useful for language learning, dictionary compilation, and downstream NLP. However, it is a challenging task due to the varying degrees of frozenness that lexical collocations exhibit. In this paper, we put forward a sequence tagging BERT-based model enhanced with a graph-aware transformer architecture, which we evaluate on the task of collocation recognition in context. Our results suggest that explicitly encoding syntactic dependencies in the model architecture is helpful, and they provide insights into differences in collocation typification in English, Spanish and French.
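One way to "explicitly encode syntactic dependencies in the model architecture", as the abstract puts it, is to restrict self-attention to edges of the dependency parse. The following is a minimal illustrative sketch of that idea, not the paper's actual model: the sentence, the toy parse, and the BIO tags (marking a light-verb collocation) are invented for the example, and the graph mask is applied to a single plain scaled dot-product attention layer.

```python
import numpy as np

# Illustrative sentence with a light-verb collocation ("made ... decision").
# Parse heads and BIO labels below are assumptions for this sketch,
# not data or labels from the paper.
tokens = ["She", "made", "a", "quick", "decision"]
heads = [1, -1, 4, 4, 1]          # dependency head per token; -1 = root
bio = ["O", "B-LVC", "O", "O", "I-LVC"]

n = len(tokens)
# Symmetric adjacency mask: each token may attend to itself,
# its head, and its dependents.
adj = np.eye(n, dtype=bool)
for i, h in enumerate(heads):
    if h >= 0:
        adj[i, h] = adj[h, i] = True

def graph_masked_attention(q, k, v, mask):
    """Scaled dot-product attention restricted to graph neighbors."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    scores = np.where(mask, scores, -1e9)   # block non-neighbor positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

rng = np.random.default_rng(0)
x = rng.normal(size=(n, 8))                 # stand-in token embeddings
out, w = graph_masked_attention(x, x, x, adj)

# Attention mass outside the dependency graph is (numerically) zero.
print(np.allclose(w[~adj], 0.0, atol=1e-6))
```

In a full model, the contextualized vectors `out` would feed a token-classification head predicting the BIO collocation tags; the graph mask is what makes the layer "graph-aware" in this simplified sense.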
