An exploratory study of self-supervised pre-training on partially supervised multi-label classification on chest X-ray images
Permanent link
https://hdl.handle.net/10037/34908
Date
2024-06-14
Type
Journal article
Peer reviewed
Abstract
This paper presents the first empirical study of self-supervised pre-training for partially supervised learning,
an emerging yet largely unexplored learning paradigm in which annotations are missing. This is particularly
important in the medical imaging domain, where label scarcity is the main challenge in practical applications.
To raise awareness of partially supervised learning, we use partially supervised multi-label classification on
chest X-ray images as an instance task to illustrate the challenges of the problem of interest. Through a series
of simulated experiments, the empirical findings validate that solving multiple pretext tasks jointly in the
pre-training stage can significantly improve downstream task performance under the partially supervised setup.
Further, we propose a new pretext task, reverse vicinal risk minimization, and demonstrate that it provides a
more robust and efficient alternative to existing pretext tasks for the instance task of interest.
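To make the partially supervised setup concrete: in multi-label classification with missing annotations, a standard approach is to compute the loss only over the labels that are actually observed. The sketch below is a generic illustration of this idea (a masked binary cross-entropy, with missing labels encoded as `None`), not the paper's reverse vicinal risk minimization method, whose details the abstract does not specify.

```python
import math

def masked_bce(probs, labels):
    """Binary cross-entropy over a multi-label target vector,
    skipping entries whose annotation is missing (encoded as None).

    Generic illustration of partially supervised multi-label training;
    the encoding of missing labels and the averaging scheme are assumptions.
    """
    total, count = 0.0, 0
    for p, y in zip(probs, labels):
        if y is None:  # missing annotation contributes nothing to the loss
            continue
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
        count += 1
    # Average over observed labels only, so examples with more missing
    # annotations are not implicitly down- or up-weighted.
    return total / count if count else 0.0

# Only the two annotated labels (positions 0 and 2) contribute.
loss = masked_bce([0.9, 0.2, 0.5], [1, None, 0])
```

Here the middle label is treated as unobserved, so the prediction `0.2` is neither rewarded nor penalized; only the annotated positive and negative labels enter the loss.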
Publisher
Elsevier
Citation
Dong, Kampffmeyer, Su, Xing. An exploratory study of self-supervised pre-training on partially supervised multi-label classification on chest X-ray images. Applied Soft Computing. 2024;163.
Copyright 2024 The Author(s)