Transformer based Model for Coherence Evaluation of Scientific Abstracts: Second Fine-tuned BERT

Anyelo Carlos Gutierrez-Choque, Vivian Medina-Mamani, Eveling Castro-Gutierrez, Rosa Núñez-Pacheco, Ignacio Aguaded

Research output: Contribution to journal › Article › peer review

Abstract

Coherence evaluation is a natural language processing problem whose complexity lies mainly in analyzing the semantics and context of the words in a text. Fortunately, the Bidirectional Encoder Representations from Transformers (BERT) architecture can capture these variables and represent them as embeddings for fine-tuning. The present study proposes a Second Fine-Tuned model based on BERT to detect inconsistent sentences (coherence evaluation) in scientific abstracts written in English and Spanish. For this purpose, two formal methods for generating inconsistent abstracts are proposed: Random Manipulation (RM) and K-means Random Manipulation (KRM). Six experiments were performed, showing that a second fine-tuning improves the detection of inconsistent sentences, with an accuracy of 71%. This holds even when the retraining data come from a different language or a different domain. It was also shown that combining several methods for generating inconsistent abstracts during the second fine-tuning does not yield better results than using a single technique.
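The abstract does not detail the Random Manipulation procedure, but a common way to synthesize incoherent training examples is to swap one sentence of an abstract for a sentence drawn from an unrelated abstract. The sketch below is a minimal, hypothetical illustration of that idea (function and variable names are assumptions, not the authors' code):

```python
import random

def random_manipulation(abstract_sents, donor_sents, rng=None):
    """Hypothetical sketch of Random Manipulation (RM): replace one
    randomly chosen sentence with a sentence from an unrelated abstract,
    producing an inconsistent abstract plus the index of the injected
    sentence (usable as a label for fine-tuning)."""
    rng = rng or random.Random()
    corrupted = list(abstract_sents)
    idx = rng.randrange(len(corrupted))
    corrupted[idx] = rng.choice(donor_sents)
    return corrupted, idx

# Toy usage: the pair (corrupted, pos) would serve as a labeled
# negative example for a sentence-level coherence classifier.
original = [
    "BERT encodes each sentence in context.",
    "We fine-tune the model on abstract data.",
    "The classifier flags incoherent sentences.",
]
donor = ["Volcanic ash alters soil chemistry."]
corrupted, pos = random_manipulation(original, donor, random.Random(0))
```

The KRM variant described in the paper would presumably constrain which donor sentence is injected (e.g., via K-means clusters of sentence embeddings) rather than choosing uniformly at random.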

Original language: English
Pages (from-to): 929-937
Number of pages: 9
Journal: International Journal of Advanced Computer Science and Applications
Volume: 13
Issue number: 5
DOI
State: Published - 2022

Bibliographical note

Publisher Copyright:
© 2022. International Journal of Advanced Computer Science and Applications. All Rights Reserved.
