Comprehensive Study on German Language Models for Clinical and Biomedical Text Understanding

Abstract

Recent advances in natural language processing (NLP) can be largely attributed to the advent of pre-trained language models such as BERT and RoBERTa. While these models demonstrate remarkable performance on general datasets, they can struggle in specialized domains such as medicine, where domain-specific terminology, abbreviations, and varying document structures are common. This paper explores strategies for adapting these models to domain-specific requirements, primarily through continuous pre-training on domain-specific data. We pre-trained several German medical language models on 2.4B tokens derived from translated public English medical data and 3B tokens of German clinical data. The resulting models were evaluated on various German downstream tasks, including named entity recognition (NER), multi-label classification, and extractive question answering. Our results suggest that models augmented by clinical and translation-based pre-training typically outperform general-domain models in medical contexts. We conclude that continuous pre-training can match or even exceed the performance of clinical models trained from scratch, and that both pre-training on clinical data and leveraging translated texts are reliable methods for domain adaptation in medical NLP tasks.
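The continuous pre-training described above typically uses the masked-language-modelling (MLM) objective of BERT-style models on the new domain corpus. As a minimal sketch of that objective, the snippet below masks a fraction of tokens so the model would have to reconstruct them; all names are illustrative, and a real pipeline would use a subword tokenizer and a framework-specific data collator rather than whitespace tokens.

```python
import random

MASK_TOKEN = "[MASK]"  # placeholder symbol, as in BERT's vocabulary

def mask_tokens(tokens, mask_prob=0.15, rng=None):
    """Return (masked_tokens, labels): labels keep the original token at
    masked positions and None elsewhere (ignored by the training loss)."""
    rng = rng or random.Random(0)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(MASK_TOKEN)
            labels.append(tok)   # the model must predict this original token
        else:
            masked.append(tok)
            labels.append(None)  # position excluded from the loss
    return masked, labels

# Hypothetical German clinical sentence, whitespace-tokenized for illustration.
tokens = "der Patient zeigt Symptome einer Pneumonie".split()
masked, labels = mask_tokens(tokens, rng=random.Random(42))
```

During continuous pre-training, the same objective is simply resumed from a general-domain checkpoint on the medical corpus, so the model's weights adapt to clinical vocabulary without training from scratch.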

Publication
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Ahmad Idrissi-Yaghir
Researcher in the first cohort

My research interests include Deep Learning, Natural Language Processing, and Information Retrieval.

Henning Schäfer
Researcher in the first cohort

My research interests include Deep Learning, Computer Vision, Radiomics, and Explainable AI.

Kamyar Arzideh
Associated Researcher

My research interests include NLP.

Giulia Baldini
Associated Researcher

My research interests include NLP.

Peter Horn
Principal Investigator

My research interests include Transfusion Medicine, Immunology, and Bioinformatics.

Felix Nensa
Speaker

My research interests include Medical Digitalization, Computer Vision, and Radiology.

Jens Kleesiek
Principal Investigator

Christoph M. Friedrich
Co-Speaker

My research interests include Deep Learning, Computer Vision, Radiomics, and Explainable AI.
