Leveraging Topological Guidance for Improved Knowledge Distillation

Eun Som Jeon, Rahul Khurana, Aishani Pathak, Pavan Turaga

Research output: Contribution to journal › Conference article › peer-review

Abstract

Deep learning has shown its efficacy in extracting useful features to solve various computer vision tasks. However, when the structure of the data is complex and noisy, capturing effective information to improve performance is difficult. To this end, topological data analysis (TDA) has been utilized to derive representations that improve performance and robustness against perturbations. Despite its effectiveness, extracting topological features through TDA demands substantial computational resources and time, which is a critical obstacle to deployment on small devices. To address this issue, we propose a framework called Topological Guidance-based Knowledge Distillation (TGD), which uses topological features in knowledge distillation (KD) for image classification tasks. We utilize KD to train a superior lightweight model, providing topological features through multiple teachers simultaneously. We introduce a mechanism for integrating features from different teachers and reducing the knowledge gap between the teachers and the student, which aids in improving performance. We demonstrate the effectiveness of our approach through diverse empirical evaluations.
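The multi-teacher distillation idea described in the abstract can be sketched as a soft-label matching loss. The sketch below is an illustrative assumption, not the paper's actual TGD mechanism: the function names, the uniform weighting of teachers, and the simple averaging of softened teacher outputs are all hypothetical simplifications of the integration mechanism the authors propose.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_div(p, q):
    """KL(p || q) for two discrete probability distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def multi_teacher_kd_loss(student_logits, teacher_logits_list, T=4.0, weights=None):
    """Hypothetical multi-teacher KD loss: average the temperature-softened
    teacher distributions, then penalize the student's divergence from it."""
    n = len(teacher_logits_list)
    weights = weights or [1.0 / n] * n  # uniform teacher weighting (assumption)
    # Weighted average of softened teacher predictions
    avg = [0.0] * len(student_logits)
    for w, t_logits in zip(weights, teacher_logits_list):
        p = softmax(t_logits, T)
        avg = [a + w * pi for a, pi in zip(avg, p)]
    q = softmax(student_logits, T)
    # T^2 scaling keeps gradient magnitudes comparable across temperatures
    return (T ** 2) * kl_div(avg, q)
```

In a full training setup this distillation term would typically be combined with a standard cross-entropy loss on the ground-truth labels; the balance between the two is a tunable hyperparameter.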

Original language: English
Pages (from-to): 158-172
Number of pages: 15
Journal: Proceedings of Machine Learning Research
Volume: 251
State: Published - 2024
Event: 1st Geometry-Grounded Representation Learning and Generative Modeling Workshop, GRaM 2024, at the 41st International Conference on Machine Learning, ICML 2024 - Vienna, Austria
Duration: 29 Jul 2024 → …
