Urban land cover classification with missing data modalities using deep convolutional neural networks

Permanent link
https://hdl.handle.net/10037/16432
DOI
https://doi.org/10.1109/JSTARS.2018.2834961
Open
article.pdf (21.58Mb)
Accepted manuscript version (PDF)
Date
2018-06-14
Type
Journal article
Peer reviewed

Author
Kampffmeyer, Michael C.; Salberg, Arnt Børre; Jenssen, Robert
Abstract
Automatic urban land cover classification is a fundamental problem in remote sensing, e.g., for environmental monitoring. The problem is highly challenging, as classes generally have high interclass and low intraclass variances. Techniques to improve urban land cover classification performance in remote sensing include fusion of data from different sensors with different data modalities. However, such techniques require all modalities to be available to the classifier in the decision-making process, i.e., at test time, as well as in training. If a data modality is missing at test time, current state-of-the-art approaches generally have no procedure for exploiting information from these modalities. This represents a waste of potentially useful information. As a remedy, we propose a convolutional neural network (CNN) architecture for urban land cover classification that is able to embed all available training modalities in the so-called hallucination network. The network will in effect replace missing data modalities in the test phase, enabling fusion capabilities even when data modalities are missing in testing. We demonstrate the method using two datasets consisting of optical and digital surface model (DSM) images. We simulate missing modalities by assuming that DSM images are missing during testing. Our method outperforms both standard CNNs trained only on optical images and an ensemble of two standard CNNs. We further evaluate the potential of our method to handle situations where only some DSM images are missing during testing. Overall, we show that we can clearly exploit training-time information of the missing modality during testing.
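
The hallucination mechanism can be made concrete with a short sketch. The PyTorch-style code below is an illustrative toy, not the authors' implementation: the branch architectures, module names (small_encoder, FusionWithHallucination) and loss weighting are assumptions. It only shows the core idea from the abstract: during training, an optical-only hallucination branch is fitted to mimic the DSM branch's features, and at test time those hallucinated features substitute for the missing DSM input in the fusion.

import torch
import torch.nn as nn
import torch.nn.functional as F

def small_encoder(in_channels):
    # Tiny convolutional encoder; stands in for one modality branch of the CNN.
    return nn.Sequential(
        nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
    )

class FusionWithHallucination(nn.Module):
    # Optical branch + DSM branch, fused for per-pixel classification.
    # A third "hallucination" branch sees only the optical image and is
    # trained to reproduce the DSM branch's features.
    def __init__(self, num_classes):
        super().__init__()
        self.optical = small_encoder(3)   # RGB optical image
        self.dsm = small_encoder(1)       # digital surface model (height map)
        self.halluc = small_encoder(3)    # optical in, DSM-like features out
        self.head = nn.Conv2d(64 + 64, num_classes, 1)

    def forward(self, optical, dsm=None):
        f_opt = self.optical(optical)
        f_hal = self.halluc(optical)
        f_dsm = self.dsm(dsm) if dsm is not None else None
        # If the DSM is missing (test time), the hallucinated features replace it.
        fused = torch.cat([f_opt, f_dsm if f_dsm is not None else f_hal], dim=1)
        return self.head(fused), f_hal, f_dsm

# Training step with both modalities present: classification loss plus a
# hallucination loss pulling the hallucinated features towards the DSM features.
model = FusionWithHallucination(num_classes=6)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
optical = torch.randn(2, 3, 64, 64)          # dummy optical batch
dsm = torch.randn(2, 1, 64, 64)              # dummy DSM batch
labels = torch.randint(0, 6, (2, 64, 64))    # dummy per-pixel labels

logits, f_hal, f_dsm = model(optical, dsm)
loss = F.cross_entropy(logits, labels) + F.mse_loss(f_hal, f_dsm.detach())
optimizer.zero_grad()
loss.backward()
optimizer.step()

# Test step with the DSM modality missing: only the optical image is passed in.
with torch.no_grad():
    predictions = model(optical)[0].argmax(dim=1)

Detaching the DSM features in the hallucination loss is one common way to keep that loss from distorting the DSM encoder itself; the paper's actual training schedule and loss balancing may differ.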
Description
Source at https://doi.org/10.1109/JSTARS.2018.2834961.
Publisher
IEEE
Citation
Kampffmeyer, M., Salberg, A.B. & Jenssen, R. (2018). Urban land cover classification with missing data modalities using deep convolutional neural networks. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 11(6), 1758-1768. https://doi.org/10.1109/JSTARS.2018.2834961
Collections
  • Artikler, rapporter og annet (fysikk og teknologi) [1057]
