Abstract
The introduction of the Transformer neural network architecture revolutionized Natural Language Processing by effectively handling long-range dependencies and context. Models such as BERT and GPT are at the forefront of Large Language Models and have been widely used for text classification. Despite their benchmark performance, real-world applications pose challenges, including the need for substantial labeled data and balanced classes. Few-shot learning approaches, such as the Recognizing Textual Entailment (RTE) framework, have emerged to address these issues. RTE identifies the relationship between a text T and a hypothesis H: T entails H if the meaning of H, as interpreted in the context of T, can be inferred from the meaning of T. This study explores an RTE framework for classifying vaccine-related headlines using 1,000 labeled data points distributed unevenly across 10 classes. We evaluate eight models and procedures, spanning open-source and closed-source as well as paid and free options, from four perspectives. The results highlight that deep transfer learning that combines language knowledge and task knowledge, as in Transformers with RTE, enables text classification models with superior performance while addressing data scarcity and class imbalance. This approach provides a valuable protocol for building classification models and delivers an automated model for classifying vaccine-related content.
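The RTE-to-classification mapping described above can be sketched as follows. This is not the paper's implementation: each class label is turned into a hypothesis via a template (the template wording and the `toy_entail_prob` keyword-overlap scorer are illustrative assumptions; in practice a Transformer-based NLI model would supply the entailment probability), and the headline is assigned the label whose hypothesis is most entailed.

```python
# Sketch of RTE-based (zero-shot) text classification.
# NOTE: toy_entail_prob is a stand-in for a real NLI/entailment model,
# and the hypothesis template is an assumption for illustration only.

def classify_by_entailment(text, labels, entail_prob):
    """Assign the label whose hypothesis is most entailed by `text`.

    entail_prob(premise, hypothesis) -> float in [0, 1]
    """
    # One hypothesis per candidate class (template is an assumption).
    hypotheses = {label: f"This headline is about {label}." for label in labels}
    scores = {label: entail_prob(text, h) for label, h in hypotheses.items()}
    return max(scores, key=scores.get), scores

def toy_entail_prob(premise, hypothesis):
    # Crude keyword-overlap stub, NOT a real entailment model:
    # fraction of hypothesis tokens that also appear in the premise.
    p = set(premise.lower().split())
    h = set(hypothesis.lower().replace(".", "").split())
    return len(p & h) / max(len(h), 1)

label, scores = classify_by_entailment(
    "New vaccine side effects reported in trial",
    ["side effects", "policy", "efficacy"],
    toy_entail_prob,
)
```

Swapping `toy_entail_prob` for a fine-tuned NLI model is what lets such a classifier work few-shot: the entailment task transfers across label sets without retraining per class.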
| Original language | English |
|---|---|
| Journal | Computational Communication Research |
| Volume | 7 |
| Issue number | 1 |
| DOIs | |
| State | Published - 2025 |
Bibliographical note
Publisher Copyright: © The authors.
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs)
- SDG 3: Good Health and Well-being
Keywords
- BERT
- GPT
- Natural Language Processing
- Recognizing Textual Entailment
- Transformers
Fingerprint
Dive into the research topics of 'Boosting Transformers: Recognizing Textual Entailment for Classification of Vaccine News Coverage'. Together they form a unique fingerprint.