Show simple item record

dc.contributor.author	Choi, Changkyu
dc.contributor.author	Kampffmeyer, Michael
dc.contributor.author	Jenssen, Robert
dc.contributor.author	Handegard, Nils Olav
dc.contributor.author	Salberg, Arnt-Børre
dc.description.abstract	Multi-frequency echosounder data can provide a broad understanding of the underwater environment in a non-invasive manner. The analysis of echosounder data is hence of great importance for monitoring the marine ecosystem. Semantic segmentation, a deep-learning-based analysis method that predicts the class attribute of each acoustic intensity, has recently been in the spotlight of the fisheries and aquaculture industry, since its results can be used to estimate the abundance of marine organisms. However, a fundamental problem with current methods is their heavy reliance on the availability of large amounts of annotated training data, which can only be acquired through expensive handcrafted annotation processes, making such approaches unrealistic in practice. As a solution to this challenge, we propose a novel approach that leverages a small amount of annotated data (supervised deep learning) and a large amount of readily available unannotated data (unsupervised learning), yielding a new data-efficient and accurate semi-supervised semantic segmentation method, all embodied in a single end-to-end trainable convolutional neural network architecture. Our method is evaluated on representative data from a sandeel survey in the North Sea conducted by the Norwegian Institute of Marine Research. Rigorous experiments validate that, by leveraging unannotated data, our method achieves comparable results while utilizing only 40 percent of the annotated data on which the supervised method is trained. The code is available at
dc.identifier.citation	Choi, Kampffmeyer, Jenssen, Handegard, Salberg. Deep Semi-Supervised Semantic Segmentation in Multi-Frequency Echosounder Data. IEEE Journal of Oceanic Engineering. 2023	en_US
dc.identifier.cristinID	FRIDAID 2121874
dc.relation.ispartof	Choi, C. (2023). Advancing Deep Learning for Marine Environment Monitoring. (Doctoral thesis). <a href=></a>.
dc.relation.journal	IEEE Journal of Oceanic Engineering
dc.relation.projectID	Norges forskningsråd: 270966	en_US
dc.relation.projectID	Norges forskningsråd: 309439	en_US
dc.relation.projectID	Norges forskningsråd: 309512	en_US
dc.rights.holder	Copyright 2023 The Author(s)	en_US
dc.rights	Attribution 4.0 International (CC BY 4.0)	en_US
dc.subject	VDP::Matematikk og naturvitenskap: 400::Informasjons- og kommunikasjonsvitenskap: 420::Algoritmer og beregnbarhetsteori: 422	en_US
dc.subject	VDP::Mathematics and natural sciences: 400::Information and communication science: 420::Algorithms and computability theory: 422	en_US
dc.subject	VDP::Landbruks- og fiskerifag: 900::Fiskerifag: 920	en_US
dc.subject	VDP::Agriculture and fisheries science: 900::Fisheries science: 920	en_US
dc.subject	VDP::Teknologi: 500::Informasjons- og kommunikasjonsteknologi: 550	en_US
dc.subject	VDP::Technology: 500::Information and communication technology: 550	en_US
dc.subject	VDP::Teknologi: 500::Marin teknologi: 580	en_US
dc.subject	VDP::Technology: 500::Marine technology: 580	en_US
dc.subject	VDP::Matematikk og naturvitenskap: 400::Matematikk: 410::Statistikk: 412	en_US
dc.subject	VDP::Mathematics and natural sciences: 400::Mathematics: 410::Statistics: 412	en_US
dc.subject	Artificial Neural Networks / Artificial Neural Networks	en_US
dc.subject	Datasyn / Computer Vision	en_US
dc.subject	Deep learning / Deep learning	en_US
dc.subject	Marine acoustic data analysis / Marine acoustic data analysis	en_US
dc.subject	Marinteknologi / Marine Technology	en_US
dc.subject	Nevrale nettverk / Neural networks	en_US
dc.subject	Semi-supervised deep learning / Semi-supervised deep learning	en_US
dc.title	Deep Semi-Supervised Semantic Segmentation in Multi-Frequency Echosounder Data	en_US
dc.type	Journal article	en_US
dc.type	Peer reviewed	en_US

File(s) in this item


This item appears in the following collection(s)


Except where otherwise noted, this item's license is described as Attribution 4.0 International (CC BY 4.0)