Transformer based Model for Coherence Evaluation of Scientific Abstracts: Second Fine-tuned BERT

Anyelo Carlos Gutierrez-Choque, Vivian Medina-Mamani, Eveling Castro-Gutierrez, Rosa Núñez-Pacheco, Ignacio Aguaded

Research output: Contribution to journal › Article › peer-review


Coherence evaluation is a natural language processing problem whose complexity lies mainly in analyzing the semantics and context of the words in a text. Fortunately, the Bidirectional Encoder Representations from Transformers (BERT) architecture can capture these variables and represent them as embeddings suitable for fine-tuning. The present study proposes a Second Fine-Tuned model based on BERT to detect inconsistent sentences (coherence evaluation) in scientific abstracts written in English and Spanish. For this purpose, two formal methods for generating inconsistent abstracts are proposed: Random Manipulation (RM) and K-means Random Manipulation (KRM). Six experiments were performed, showing that applying a Second Fine-Tuning improves the detection of inconsistent sentences, reaching an accuracy of 71%, even when the retraining data come from a different language or domain. It was also shown that mixing several methods for generating inconsistent abstracts during the Second Fine-Tuning does not yield better results than using a single technique.
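The paper does not publish its implementation here, but the Random Manipulation (RM) idea described in the abstract — corrupting a coherent abstract by injecting a sentence from an unrelated one — can be sketched as follows. This is a minimal, hypothetical illustration; the function name, sentence-splitting heuristic, and return values are assumptions, not the authors' code.

```python
import random

def random_manipulation(abstract: str, donor_abstract: str, rng=None):
    """Hypothetical sketch of Random Manipulation (RM): replace one
    randomly chosen sentence of a coherent abstract with a sentence
    drawn from an unrelated (donor) abstract, producing an
    'inconsistent' abstract as a negative training example.

    Returns the manipulated abstract and the index of the replaced
    sentence (usable as a label for inconsistent-sentence detection).
    Uses a naive ". "-based sentence split for brevity.
    """
    rng = rng or random.Random()
    sentences = abstract.split(". ")
    donor_sentences = donor_abstract.split(". ")
    idx = rng.randrange(len(sentences))          # position to corrupt
    sentences[idx] = rng.choice(donor_sentences)  # inject foreign sentence
    return ". ".join(sentences), idx
```

Pairs of (original, manipulated) abstracts generated this way could then serve as consistent/inconsistent examples when fine-tuning a BERT classifier; the KRM variant described in the paper additionally uses K-means clustering to guide which sentences are swapped.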

Original language: English
Pages (from-to): 929-937
Number of pages: 9
Journal: International Journal of Advanced Computer Science and Applications
Issue number: 5
State: Published - 2022

Bibliographical note

Publisher Copyright:
© 2022. International Journal of Advanced Computer Science and Applications. All Rights Reserved.


Keywords

  • BERT
  • Coherence evaluation
  • Inconsistent sentences detection
  • Second fine-tuned


