We propose a framework that uses contrastive learning as a pre-training task for image classification in the presence of noisy labels. Recent strategies, such as pseudo-labelling, sample selection with Gaussian mixture models, and weighted supervised contrastive learning, are combined into a fine-tuning phase that follows the pre-training.
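The abstract does not spell out the selection mechanism, but GMM-based sample selection in the noisy-label literature typically fits a two-component mixture to per-sample training losses and treats the low-loss component as clean. Below is a minimal sketch under that assumption; the function name, threshold, and synthetic losses are all illustrative, not the paper's exact procedure.

```python
# Illustrative sketch (not the paper's exact procedure): fit a two-component
# Gaussian mixture to per-sample training losses and keep the samples whose
# posterior under the low-mean ("clean") component is high.
import numpy as np
from sklearn.mixture import GaussianMixture

def select_clean_samples(losses: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Return a boolean mask marking samples whose labels look clean."""
    losses = losses.reshape(-1, 1)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(losses)
    # Assumption: the component with the smaller mean models the clean samples.
    clean = int(np.argmin(gmm.means_.ravel()))
    return gmm.predict_proba(losses)[:, clean] > threshold

# Toy usage: clean samples tend to have small losses, noisy ones large losses.
rng = np.random.default_rng(0)
losses = np.concatenate([rng.gamma(2.0, 0.1, 900), rng.gamma(6.0, 0.5, 100)])
mask = select_clean_samples(losses)
print(f"kept {mask.sum()} of {mask.size} samples")
```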
AMU-EURANOVA at CASE 2021 Task 1: Assessing the stability of multilingual BERT
This paper describes our participation in Task 1 of the CASE 2021 shared task, which concerns multilingual event extraction from news. We focused on sub-task 4, event information extraction. Because this sub-task has only a small training dataset, we fine-tuned a multilingual BERT model to solve it.
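As a rough sketch of what fine-tuning a multilingual BERT for a token-level extraction task can look like with the Hugging Face Transformers library; the checkpoint, the three-label scheme, and the toy batch are illustrative assumptions, not the paper's exact setup.

```python
# Hypothetical sketch of fine-tuning multilingual BERT for token-level event
# extraction with Hugging Face Transformers. The checkpoint, label set, toy
# batch, and hyperparameters are assumptions, not the paper's exact setup.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

checkpoint = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint, num_labels=3)

# Toy batch: one sentence, with every token labelled 0 ("outside an event").
enc = tokenizer("Protesters marched in Paris on Monday.",
                return_tensors="pt", truncation=True)
labels = torch.zeros_like(enc["input_ids"])

# One optimisation step; a small learning rate is typical for BERT fine-tuning.
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
model.train()
out = model(**enc, labels=labels)  # cross-entropy over per-token labels
out.loss.backward()
optimizer.step()
print(float(out.loss))
```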