Pruning for Performance: Efficient Idiom and Metaphor Classification in Low-Resource Konkani Using mBERT
Abstract
A hybrid model combining pre-trained mBERT, bidirectional LSTM, and linear classifier, enhanced with gradient-based attention head pruning, improves metaphor and idiom classification in low-resource languages.
In this paper, we address the persistent challenges that figurative language expressions pose for natural language processing (NLP) systems, particularly in low-resource languages such as Konkani. We present a hybrid model that integrates a pre-trained Multilingual BERT (mBERT) with a bidirectional LSTM and a linear classifier. This architecture is fine-tuned on a newly introduced annotated dataset for metaphor classification, developed as part of this work. To improve the model's efficiency, we implement a gradient-based attention head pruning strategy. For metaphor classification, the pruned model achieves an accuracy of 78%. We also apply our pruning approach to an existing idiom classification task, achieving 83% accuracy. These results demonstrate the effectiveness of attention head pruning for building efficient NLP tools in underrepresented languages.
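The abstract describes gradient-based attention head pruning only at a high level, and the paper's exact procedure is not reproduced here. A minimal sketch of the standard idea behind such methods, scoring each head by the magnitude of the loss gradient with respect to a per-head gate and keeping only the top-scoring heads, is shown below on a single toy multi-head attention layer. All shapes, weights, and the squared-error loss are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def softmax(z, axis=-1):
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def head_importance_and_prune(X, target, Wq, Wk, Wv, Wo, keep=2):
    """Score attention heads by |dL/d(gate_h)| and keep the top `keep` heads.

    Toy setting: one attention layer, output = sum_h gate_h * contribution_h,
    loss L = 0.5 * ||output - target||^2 (an assumption for illustration).
    """
    H, d, dh = Wq.shape
    contribs = []
    for h in range(H):
        Q, K, V = X @ Wq[h], X @ Wk[h], X @ Wv[h]
        A = softmax(Q @ K.T / np.sqrt(dh))       # (n, n) attention weights
        contribs.append((A @ V) @ Wo[h])         # head h's output contribution, (n, d)
    contribs = np.stack(contribs)                # (H, n, d)

    gates = np.ones(H)                           # all heads active
    out = np.tensordot(gates, contribs, axes=1)  # (n, d)
    # dL/d(gate_h) = <out - target, contribution_h>, computed analytically
    grad = np.tensordot(contribs, out - target, axes=([1, 2], [0, 1]))
    importance = np.abs(grad)                    # gradient-magnitude importance score

    mask = np.zeros(H)
    mask[np.argsort(importance)[-keep:]] = 1.0   # prune all but the top `keep` heads
    return importance, mask

# Illustrative usage with random weights (4 heads, 5 tokens, model dim 8).
rng = np.random.default_rng(0)
H, n, d, dh = 4, 5, 8, 2
Wq, Wk, Wv = (rng.normal(size=(H, d, dh)) for _ in range(3))
Wo = rng.normal(size=(H, dh, d))
X = rng.normal(size=(n, d))
target = rng.normal(size=(n, d))
imp, mask = head_importance_and_prune(X, target, Wq, Wk, Wv, Wo, keep=2)
```

In practice such scores are accumulated over a dataset and the lowest-importance heads are removed before fine-tuning continues; this sketch shows only the per-batch scoring step.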