
Coherence Regularization for Neural Topic Models

Neural topic models aim to predict the words of a document given the document itself. In such models, perplexity is used as the training criterion, whereas the final quality measure is topic coherence. In this work, we introduce a coherence regularization loss that penalizes incoherent topics during training. We analyze our approach using coherence together with an additional metric, exclusivity, which captures the uniqueness of the terms across topics, and we argue that this combination of metrics is an adequate indicator of model quality. Our results indicate the effectiveness of our loss and its potential for use in future neural topic models.
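The exact formulation of the regularizer is given in the paper; the sketch below only illustrates the general idea of adding a coherence-based penalty to a neural topic model's training objective. It is an assumption-laden illustration, not the paper's implementation: `beta` (the topic-word distribution), `npmi` (a precomputed word-association matrix), `recon_loss`, and the weight `lam` are all hypothetical names introduced here.

```python
import torch


def coherence_penalty(beta: torch.Tensor, npmi: torch.Tensor) -> torch.Tensor:
    """Penalize topics whose highly weighted words rarely co-occur.

    beta: (K, V) topic-word probabilities for K topics over a vocabulary of V words.
    npmi: (V, V) pairwise word-association scores (e.g. NPMI) precomputed from a corpus.
    Returns a scalar; less coherent topics yield a larger penalty.
    """
    # Expected pairwise association of words under each topic's distribution.
    topic_coherence = torch.einsum("ki,ij,kj->k", beta, npmi, beta)
    # Negate so that maximizing coherence minimizes the loss term.
    return -topic_coherence.mean()


def total_loss(recon_loss: torch.Tensor,
               beta: torch.Tensor,
               npmi: torch.Tensor,
               lam: float = 0.1) -> torch.Tensor:
    # Usual reconstruction/perplexity-style objective plus the (hypothetical)
    # coherence regularizer, weighted by lam.
    return recon_loss + lam * coherence_penalty(beta, npmi)
```

In such a setup, `lam` would control the trade-off between fitting the documents and favoring coherent topics; the paper evaluates this kind of trade-off using coherence and exclusivity.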

The paper will be presented at the 16th International Symposium on Neural Networks (ISNN 2019), taking place in Moscow. In the meantime, do not hesitate to contact our R&D department at [email protected] to discuss how you can leverage neural topic models in your projects.

Katsiaryna Krasnashchok and Aymen Cherif. Coherence Regularization for Neural Topic Models. In: 16th International Symposium on Neural Networks (ISNN 2019), 2019.

Click here to access the paper.

Related Posts

Insights from GTC Paris 2025

Among the NVIDIA GTC Paris crowd was our CTO Sabri Skhiri, and from quantum computing breakthroughs to the full-stack AI advancements powering industrial digital twins and robotics, there is a lot to share! Explore with Sabri the GTC 2025 trends and keynotes, and what they mean for businesses looking to innovate.
Read More

Development & Evaluation of Automated Tumour Monitoring by Image Registration Based on 3D (PET/CT) Images

Tumor tracking in PET/CT is essential for monitoring cancer progression and guiding treatment strategies. Traditionally, nuclear physicians track tumors manually, focusing on the five largest ones (PERCIST criteria), which is both time-consuming and imprecise. Automated tumor tracking can match the numerous metastatic lesions across scans, enhancing the monitoring of tumor changes.
Read More