PRISM-Med: Parameter-Efficient Robust Interdomain Specialty Model for Medical Language Tasks

Jieui Kang, Hyungon Ryu, Jaehyeong Sim

Research output: Contribution to journal › Article › peer-review

Abstract

Language Models (LMs) have shown remarkable potential in healthcare applications, yet their widespread adoption faces challenges in achieving consistent performance across diverse medical specialties while maintaining parameter efficiency. Current approaches to fine-tuning language models for medical tasks often require extensive computational resources and struggle with managing specialized medical knowledge across different domains. To address these challenges, we present PRISM-Med (Parameter-efficient Robust Interdomain Specialty Model), a novel framework that enhances domain-specific performance through supervised domain classification and specialized adaptation. Our framework introduces three key innovations: (1) a domain detection model that accurately classifies medical text into specific medical domains using supervised learning, (2) a domain-specific Low-Rank Adaptation (LoRA) strategy that enables efficient parameter utilization while preserving specialized knowledge, and (3) a neural domain detector that dynamically selects the most relevant domain-specific models during inference. Through comprehensive empirical evaluation across multiple medical benchmarks (MedProb, MedNER, MedQuAD), we demonstrate that PRISM-Med achieves consistent performance improvements, with gains of up to 10.1% in medical QA tasks and 2.7% in medical knowledge evaluation compared to traditional fine-tuning baselines. Notably, our framework achieves these improvements while using only 0.1% to 0.4% of the parameters required for traditional fine-tuning approaches. PRISM-Med represents a significant advancement in developing efficient and robust medical language models, providing a practical solution for specialized medical applications where both performance and computational efficiency are crucial.
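The abstract's core mechanism — a domain classifier routing each input to a domain-specific low-rank adapter applied on top of frozen base weights — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the domain names, sizes, and the `detect_domain`/`forward` helpers are hypothetical, and the LoRA update uses the standard formulation W + (α/r)·BA with B initialized to zero.

```python
import numpy as np

rng = np.random.default_rng(0)

d, r = 8, 2  # hidden size and LoRA rank (hypothetical values)
W = rng.normal(size=(d, d))  # frozen base weight matrix

# One low-rank adapter (A, B) per medical domain; domain names are illustrative.
# B starts at zero, so each adapter initially leaves the base model unchanged.
domains = ["cardiology", "oncology", "radiology"]
adapters = {
    name: {"A": rng.normal(size=(r, d)) * 0.01, "B": np.zeros((d, r))}
    for name in domains
}

def detect_domain(scores):
    """Stand-in for the supervised domain detector: pick the top-scoring domain."""
    return domains[int(np.argmax(scores))]

def forward(x, domain, alpha=16.0):
    """Apply the base projection plus the selected domain's rank-r LoRA update."""
    a = adapters[domain]
    delta = (alpha / r) * (a["B"] @ a["A"])  # low-rank update B @ A
    return (W + delta) @ x

# At inference: classify the input's domain, then route through that adapter.
scores = np.array([0.1, 0.7, 0.2])  # hypothetical domain-classifier output
y = forward(rng.normal(size=d), detect_domain(scores))
```

Only the small A/B matrices per domain are trained, which is what keeps the trainable-parameter count to a fraction of a percent of full fine-tuning.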

Original language: English
Pages (from-to): 4957-4965
Number of pages: 9
Journal: IEEE Access
Volume: 13
DOIs
State: Published - 2025

Bibliographical note

Publisher Copyright:
© 2013 IEEE.

Keywords

  • Deep learning
  • domain adaptive adapter
  • low rank adapter
  • medical AI
  • small language model
