DORAS | DCU Research Repository

An experimental comparison of knowledge transfer algorithms in deep neural networks

Quinn, Sean (ORCID: 0000-0003-0807-1076), McGuinness, Kevin (ORCID: 0000-0003-1336-6477) and Mileo, Alessandra (ORCID: 0000-0002-6614-6462) (2021) An experimental comparison of knowledge transfer algorithms in deep neural networks. In: Irish Machine Vision and Image Processing Conference (IMVIP), 1-3 Sept 2021, Dublin, Ireland. ISBN 978-0-9934207-6-4

Abstract
Neural knowledge transfer methods aim to constrain the hidden representation of one neural network to be similar, or to have similar properties, to that of another by applying specially designed loss functions between the two networks' hidden layers. In this way, the intangible knowledge encoded in a network's weights is transferred without having to replicate exact weight structures or alter the knowledge representation from its natural, highly distributed form. Motivated by the need for greater transparency in evaluating such methods by bridging the gap between the different experimental setups in the existing literature, the need to compare each method against a greater number of its peers, and a desire to explore novel combinations of existing methods, we conduct an experimental comparison of eight contemporary neural knowledge transfer algorithms and further explore the performance of some combinations. We conduct our experiments on an image classification task and measure relative performance gains over non-knowledge-enhanced baseline neural networks in terms of classification accuracy. We observe (i) some interesting contradictions between our results and those reported in the original papers, (ii) a general lack of correlation between any given method's standalone performance and its performance when used in combination with knowledge distillation, (iii) a general trend of older, simpler methods outperforming newer ones, and (iv) Contrastive Representation Distillation (CRD) achieving the best performance.
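
The methods compared in the paper share one basic mechanism: an auxiliary loss term that pulls a student network's outputs or hidden representations towards a teacher's. As a minimal illustrative sketch of that family (not the paper's actual experimental code), the PyTorch snippet below combines the classic soft-target knowledge distillation loss of Hinton et al. with a FitNets-style hint loss on an intermediate layer; the temperature T, the weight alpha, and the regressor module bridging feature dimensions are illustrative assumptions.

    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
        # Soft-target term: KL divergence between temperature-softened teacher
        # and student output distributions, scaled by T^2 so gradient
        # magnitudes stay comparable across temperatures (Hinton et al., 2015).
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)
        # Hard-target term: ordinary cross-entropy on the ground-truth labels.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

    def hint_loss(student_feat, teacher_feat, regressor):
        # FitNets-style hint: an L2 penalty between a teacher hidden layer and
        # a student hidden layer, with a learned regressor bridging any
        # mismatch in feature dimensionality.
        return F.mse_loss(regressor(student_feat), teacher_feat)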
Metadata
Item Type: Conference or Workshop Item (Paper)
Event Type: Conference
Refereed: Yes
Uncontrolled Keywords: Knowledge Transfer; Knowledge Distillation; Deep Learning; Representation Learning
Subjects: Computer Science > Image processing; Computer Science > Machine learning
DCU Faculties and Centres: DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing; DCU Faculties and Schools > Faculty of Engineering and Computing > School of Electronic Engineering; Research Institutes and Centres > INSIGHT Centre for Data Analytics
Published in: Irish Pattern Recognition & Classification Society Conference Proceedings 2021. Irish Pattern Recognition & Classification Society. ISBN 978-0-9934207-6-4
Publisher: Irish Pattern Recognition & Classification Society
Official URL: https://drive.google.com/file/d/1quqaYxnhBBruPhYOY...
Copyright Information: © 2021 The Authors.
Use License: This item is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License.
Funders: Irish Research Council GOIPG/2018/2501; Science Foundation Ireland SFI/12/RC/2289_P2; Nvidia Corporation research hardware grant
ID Code: 26159
Deposited On: 06 Sep 2021 14:40 by Kevin McGuinness. Last Modified: 13 Oct 2022 12:13
Documents

Full text available as:

IMVIP_21.pdf (PDF, 76kB)