Self-transfer learning for weakly supervised lesion localization

Sangheum Hwang, Hyo Eun Kim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

80 Scopus citations

Abstract

Recent advances in deep learning have achieved remarkable performance in various computer vision tasks, including weakly supervised object localization. Weakly supervised object localization is practically useful since it does not require fine-grained annotations. Current approaches overcome the difficulties of weak supervision via transfer learning from models pre-trained on large-scale general image datasets such as ImageNet. However, they cannot be applied to the medical image domain, where no such priors exist. In this work, we present a novel weakly supervised learning framework for lesion localization, named self-transfer learning (STL). STL jointly optimizes both classification and localization networks to help the localization network focus on correct lesions without any type of prior. We evaluate the STL framework on chest X-rays and mammograms, achieving significantly better localization performance compared to previous weakly supervised localization approaches.
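The joint optimization described above can be sketched as a weighted sum of the two branch losses. This is a minimal, hypothetical illustration only: the function names and the fixed weight `alpha` are assumptions for exposition, not the paper's actual formulation or weighting schedule.

```python
def joint_loss(cls_loss, loc_loss, alpha=0.5):
    """Combine classification and localization losses into one objective,
    so both branches are trained jointly from image-level labels.

    NOTE: the fixed weight `alpha` is an illustrative assumption; the
    paper's actual loss weighting may differ.
    """
    return (1.0 - alpha) * cls_loss + alpha * loc_loss

# Example: a small localization weight early in training.
total = joint_loss(cls_loss=0.8, loc_loss=0.4, alpha=0.25)
print(total)  # 0.7
```

In such a scheme, a small `alpha` lets the classification branch dominate initially, while a larger `alpha` shifts emphasis to the localization branch.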

Original language: English
Title of host publication: Medical Image Computing and Computer-Assisted Intervention - MICCAI 2016 - 19th International Conference, Proceedings
Editors: Gozde Unal, Sebastian Ourselin, Leo Joskowicz, Mert R. Sabuncu, William Wells
Publisher: Springer Verlag
Pages: 239-246
Number of pages: 8
ISBN (Print): 9783319467221
DOIs
State: Published - 2016

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 9901 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Keywords

  • Convolutional neural networks
  • Lesion localization
  • Weakly supervised learning
