Boosting Transformers: Recognizing Textual Entailment for Classification of Vaccine News Coverage

Luiz Neves, Chico Q. Camargo, Luisa Massarani

Research output: Contribution to journal › Article › peer-review

Abstract

The introduction of the Transformer neural network architecture revolutionized Natural Language Processing by effectively handling long-range dependencies and context. Models like BERT and GPT are at the forefront of Large Language Models and have been widely used for text classification. Despite their benchmark performance, real-world applications pose challenges, including the need for substantial labeled data and balanced classes. Few-shot learning approaches, such as the Recognizing Textual Entailment (RTE) framework, have emerged to address these issues. RTE identifies the relationship between a text T and a hypothesis H: T entails H if the meaning of H, as interpreted in the context of T, can be inferred from the meaning of T. This study explores an RTE framework for classifying vaccine-related headlines using 1,000 labeled data points distributed unevenly across 10 classes. We evaluate eight models and procedures, spanning open-source and closed-source as well as paid and free options, and test them from four perspectives. The results show that deep transfer learning that combines language knowledge and task knowledge, as Transformers and RTE do, enables text classification models with superior performance while addressing data scarcity and class imbalance. This approach provides a valuable protocol for creating classification models and delivers an automated model for classifying vaccine-related content.
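
As a concrete illustration of the entailment formulation described above, the sketch below uses the Hugging Face transformers zero-shot-classification pipeline, which rests on the same NLI/RTE idea: each candidate label is rewritten as a hypothesis H and scored against the headline T. The model name, headline, and class labels are illustrative assumptions, not the configuration used in the study.

```python
# Minimal sketch of entailment-based (RTE / zero-shot) headline classification,
# assuming the Hugging Face `transformers` library is installed.
# The model and labels below are illustrative, not the paper's actual setup.
from transformers import pipeline

# An NLI model fine-tuned on MNLI; the pipeline turns each candidate label
# into a hypothesis ("This example is about {label}.") and scores how
# strongly the text T entails that hypothesis H.
classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

headline = "New study finds measles vaccine safe for infants"        # text T
labels = ["vaccine safety", "vaccine development", "public policy"]  # hypothetical classes

result = classifier(headline, candidate_labels=labels)
for label, score in zip(result["labels"], result["scores"]):
    print(f"{label}: {score:.3f}")
```

Because the task description lives in the hypothesis rather than in trained weights, a setup like this needs few or no labeled examples per class, which is how the RTE framing mitigates the data scarcity and class imbalance the abstract highlights.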

Original language: English
Journal: Computational Communication Research
Volume: 7
Issue number: 1
DOIs
State: Published - 2025

Bibliographical note

Publisher Copyright:
© The authors.

Keywords

  • BERT
  • GPT
  • Natural Language Processing
  • Recognizing Textual Entailment
  • Transformers
