A Semi-Supervised BERT Approach for Arabic Named Entity Recognition

Conference: The 5th Arabic Natural Language Processing Workshop (to appear)

Authors: Chadi Helwe, Ghassan Dib, Mohsen Shamas, Shady Elbassuoni

Abstract: Named entity recognition (NER) plays a significant role in many applications such as information extraction, information retrieval, question answering, and even machine translation. Most of the work on NER using deep learning has been done for non-Arabic languages like English and French, and only a few studies have focused on Arabic. This paper proposes a semi-supervised learning approach to train a BERT-based NER model using labeled and semi-labeled datasets. We compared our approach against various baselines and state-of-the-art Arabic NER tools on three datasets: AQMAR, NEWS, and TWEETS. We report a significant improvement in F-measure for the AQMAR and NEWS datasets, which are written in Modern Standard Arabic (MSA), and competitive results for the TWEETS dataset, which consists of tweets that are mostly in the Egyptian dialect and contain many mistakes and misspellings.
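The abstract does not spell out the authors' exact training procedure, but semi-supervised learning with labeled and semi-labeled data is commonly realized as self-training: train on the labeled set, pseudo-label unlabeled text with high-confidence predictions, and retrain on the union. The sketch below illustrates that generic loop only; the toy frequency-based tagger stands in for BERT, and all function names, thresholds, and data are illustrative assumptions, not the paper's implementation.

```python
# Generic self-training sketch for sequence labeling (illustrative only).
# A toy per-token frequency tagger stands in for a BERT-based model;
# the paper's actual method may differ in every detail.
from collections import Counter, defaultdict

def train_tagger(sentences):
    """Count tag frequencies per token (stand-in for fine-tuning a model).

    `sentences` is a list of (tokens, tags) pairs.
    """
    counts = defaultdict(Counter)
    for tokens, tags in sentences:
        for tok, tag in zip(tokens, tags):
            counts[tok][tag] += 1
    return counts

def predict(counts, tokens):
    """Return a (tag, confidence) pair per token; unseen tokens get ('O', 0.0)."""
    out = []
    for tok in tokens:
        if counts[tok]:
            tag, n = counts[tok].most_common(1)[0]
            out.append((tag, n / sum(counts[tok].values())))
        else:
            out.append(("O", 0.0))
    return out

def self_train(labeled, unlabeled, threshold=0.9, rounds=2):
    """Iteratively grow the training set with confidently pseudo-labeled sentences."""
    data = list(labeled)
    for _ in range(rounds):
        model = train_tagger(data)
        pseudo = []
        for tokens in unlabeled:
            preds = predict(model, tokens)
            # Keep a sentence only if every token is tagged confidently.
            if all(conf >= threshold for _, conf in preds):
                pseudo.append((tokens, [tag for tag, _ in preds]))
        data = list(labeled) + pseudo
    return train_tagger(data)
```

With a BERT model in place of the frequency tagger, the same loop applies: the model's softmax probabilities provide the per-token confidence used to filter pseudo-labels.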

Paper
