Can AMR Assist Legal and Logical Reasoning?

Nikolaus Schrack, Ruixiang Cui, Hugo-Andrés López-Acosta, Daniel Hershcovich

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review



Abstract Meaning Representation (AMR) has been shown to be useful for many downstream tasks. In this work, we explore the use of AMR for legal and logical reasoning. Specifically, we investigate whether AMR can help capture logical relationships in multiple-choice question answering (MCQA) tasks. We propose neural architectures that use linearised AMR graphs in combination with pre-trained language models. While these models do not outperform text-only baselines, they correctly solve different instances than the text models do, suggesting complementary abilities. Error analysis further reveals that AMR parsing quality is the most prominent challenge, especially for inputs with multiple sentences. We conduct a theoretical analysis of how logical relations are represented in AMR and conclude that it may be helpful for some logical statements but not for others.
Original language: English
Title of host publication: Findings of the Association for Computational Linguistics
Publisher: Association for Computational Linguistics
Publication date: 2022
Pages: 1555 - 1568
Publication status: Published - 2022
Event: 2022 Conference on Empirical Methods in Natural Language Processing, Abu Dhabi National Exhibition Centre, Abu Dhabi, United Arab Emirates
Duration: 7 Dec 2022 - 11 Dec 2022


Conference: 2022 Conference on Empirical Methods in Natural Language Processing
Location: Abu Dhabi National Exhibition Centre
Country/Territory: United Arab Emirates
City: Abu Dhabi

