
Mixing up contrastive learning: Self-supervised representation learning for time series

Permanent link
https://hdl.handle.net/10037/26414
DOI
https://doi.org/10.1016/j.patrec.2022.02.007
View/Open
article.pdf (1.116 MB)
Published version (PDF)
Date
2022-02-14
Type
Journal article
Peer reviewed

Author
Wickstrøm, Kristoffer; Kampffmeyer, Michael; Mikalsen, Karl Øyvind; Jenssen, Robert
Abstract
The lack of labeled data is a key challenge for learning useful representations from time series. An unsupervised representation framework capable of producing high-quality representations would therefore be of great value: it is key to enabling transfer learning, which is especially beneficial for medical applications, where data are abundant but labeling is costly and time-consuming. We propose an unsupervised contrastive learning framework motivated from the perspective of label smoothing. The proposed approach uses a novel contrastive loss that naturally exploits a data augmentation scheme in which new samples are generated by mixing two data samples with a mixing component. The task in the proposed framework is to predict the mixing component, which is utilized as soft targets in the loss function. Experiments demonstrate the framework's superior performance compared to other representation learning approaches on both univariate and multivariate time series, and illustrate its benefits for transfer learning on clinical time series.
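The idea in the abstract can be sketched as follows: two time series are mixed with a component λ, and the loss treats [λ, 1 − λ] as soft targets over the mixed sample's similarity to its two sources. This is a minimal illustrative sketch, not the authors' implementation; the placeholder `encode` projection, the function names, and the temperature value are assumptions, and the paper's actual loss contrasts against a full batch rather than just the two sources.

```python
import numpy as np

def encode(x):
    # Placeholder encoder (an assumption): a fixed random projection
    # followed by L2 normalization. The paper uses a learned
    # convolutional encoder instead.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((x.shape[-1], 8))
    h = x @ W
    return h / np.linalg.norm(h, axis=-1, keepdims=True)

def mixup_contrastive_loss(x1, x2, lam, tau=0.5):
    """Soft-target contrastive loss sketch: the representation of the
    mixed sample should be lam-similar to x1 and (1-lam)-similar to x2."""
    x_mix = lam * x1 + (1 - lam) * x2          # mixup augmentation
    h1, h2, hm = encode(x1), encode(x2), encode(x_mix)
    # Temperature-scaled cosine similarities to the two source samples.
    logits = np.array([hm @ h1 / tau, hm @ h2 / tau])
    log_p = logits - np.log(np.exp(logits).sum())   # log-softmax
    targets = np.array([lam, 1 - lam])              # soft targets
    return -(targets * log_p).sum()                  # cross-entropy
```

Because the targets are the mixing components themselves, minimizing this cross-entropy amounts to predicting λ from the representations, which is the self-supervised task described above.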
Publisher
Elsevier
Citation
Wickstrøm K, Kampffmeyer M, Mikalsen KØ, Jenssen R. Mixing up contrastive learning: Self-supervised representation learning for time series. Pattern Recognition Letters. 2022;155:54-61.
Collections
  • Artikler, rapporter og annet (fysikk og teknologi) [1057]
Copyright 2022 The Author(s)

Munin is powered by DSpace

UiT The Arctic University of Norway
The University Library
uit.no/ub - munin@ub.uit.no
