Combining embedding methods for a word intrusion task

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review



We report a new baseline for a Danish word intrusion task by combining pre-trained off-the-shelf word, subword and knowledge graph embedding models. We test fastText, Byte-Pair Encoding, BERT and the knowledge graph embedding in Wembedder, finding that fastText is the best-performing individual model, while a simple combination of fastText with the other models can slightly improve the accuracy of identifying the odd-one-out words in the word intrusion task.
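The odd-one-out scoring described in the abstract can be illustrated with a minimal sketch: score each word in a set by its mean cosine similarity to the others, and pick the lowest-scoring word as the intruder. The toy Danish word vectors below are invented for illustration only (they are not fastText, BERT or Wembedder outputs), and the simple averaging of per-model scores is just one plausible way to read "a simple combination" of models.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def odd_one_out(words, embed):
    """Return the word with the lowest mean cosine similarity to the others.

    embed: dict mapping word -> vector (a single embedding model).
    """
    vecs = {w: np.asarray(embed[w], dtype=float) for w in words}
    scores = {
        w: np.mean([cosine(vecs[w], vecs[o]) for o in words if o != w])
        for w in words
    }
    return min(scores, key=scores.get)

def combined_odd_one_out(words, models):
    """Combine several embedding models by averaging per-word scores.

    models: list of dicts, each mapping word -> vector.
    This simple score-averaging is an assumption, not the paper's exact method.
    """
    totals = {w: 0.0 for w in words}
    for embed in models:
        vecs = {w: np.asarray(embed[w], dtype=float) for w in words}
        for w in words:
            totals[w] += np.mean([cosine(vecs[w], vecs[o]) for o in words if o != w])
    return min(totals, key=totals.get)

# Toy example: three animals and one intruder ("bil" = car).
toy_embed = {
    "hund": [1.0, 0.9, 0.0],
    "kat":  [0.9, 1.0, 0.0],
    "hest": [1.0, 1.0, 0.1],
    "bil":  [0.0, 0.1, 1.0],
}
print(odd_one_out(["hund", "kat", "hest", "bil"], toy_embed))  # → bil
```

With real models, each `embed` dict would be replaced by lookups into a pre-trained embedding (e.g. fastText vectors for Danish), and `combined_odd_one_out` would pool the evidence across word, subword and knowledge graph spaces.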
Original language: English
Title of host publication: Proceedings of the 15th Conference on Natural Language Processing
Publisher: Association for Computational Linguistics
Publication date: 2019
Publication status: Published - 2019
Event: 15th Conference on Natural Language Processing - Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany
Duration: 9 Oct 2019 - 11 Oct 2019


Conference: 15th Conference on Natural Language Processing
Location: Friedrich-Alexander-Universität Erlangen-Nürnberg

