
DORAS | DCU Research Repository

Explore open access research and scholarly works from DCU


Inducing sparsity in deep neural networks through unstructured pruning for lower computational footprint

Ballas, Camille (2022) Inducing sparsity in deep neural networks through unstructured pruning for lower computational footprint. PhD thesis, Dublin City University.

Abstract
Deep learning has revolutionised the way we deal with media analytics, opening up and improving many fields such as machine translation, driver-assistance systems, smart cities and medical imaging, to cite only a few. But to handle complex decision making, neural networks are growing ever larger, resulting in heavy compute loads. This has significant implications for universal accessibility of the technology: high costs, the potential environmental impact of increasing energy consumption, and the inability to run the models on low-power devices.

A simple way to cut down the size of a neural network is to remove parameters that are not useful to the model's predictions. In unstructured pruning, the goal is to remove parameters (i.e. set them to 0) based on some importance heuristic while maintaining good prediction accuracy, resulting in a high-performing network with a smaller computational footprint. Many pruning methods seek the optimal capacity at which the network is most compute-efficient while reaching better generalisation. Inducing sparsity – setting weights to zero – in a neural network greatly reduces over-parametrisation, lowering the cost of running inference as well as the complexity at training time. Moreover, it can help us better understand which parts of the network contribute most to learning, in order to design more efficient architectures and training procedures.

This thesis assesses the integrity of unstructured pruning criteria. After presenting a use-case application for the deployment of an AI system in a real-world setting, it demonstrates that unstructured pruning criteria are ill-defined and not adapted to large-scale networks, due to the over-parametrisation regime during training, resulting in sparse networks that lack regularisation. Furthermore, beyond performance accuracy alone, the fairness of networks produced by different unstructured pruning criteria is evaluated, highlighting the need to rethink how we design unstructured pruning.
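The abstract describes unstructured pruning as removing parameters (setting them to 0) according to an importance heuristic. As a minimal illustration of the most common such heuristic – weight magnitude – the following sketch zeroes out the smallest-magnitude fraction of a weight tensor. It uses NumPy arrays as stand-ins for layer weights; the function name and setup are illustrative, not taken from the thesis.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Return a copy of `weights` with the `sparsity` fraction of
    smallest-magnitude entries set to exactly 0 (unstructured pruning)."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude acts as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold  # keep only weights above threshold
    return weights * mask

# Toy "layer": half the entries are pruned by magnitude.
w = np.array([[0.5, -0.1, 0.02],
              [-0.7, 0.3, 0.05]])
pruned = magnitude_prune(w, 0.5)
# The three smallest-magnitude entries (0.02, 0.05, -0.1) become 0;
# large weights such as -0.7 survive unchanged.
```

In practice this mask would be applied per layer (or globally across all layers) and the network fine-tuned afterwards to recover accuracy; the thesis examines why such criteria behave poorly in the over-parametrised training regime.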
Metadata
Item Type: Thesis (PhD)
Date of Award: November 2022
Refereed: No
Supervisor(s): Little, Suzanne and O'Connor, Noel E.
Subjects: Computer Science > Algorithms
Computer Science > Artificial intelligence
Computer Science > Machine learning
DCU Faculties and Centres: DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing
Research Institutes and Centres > INSIGHT Centre for Data Analytics
Use License: This item is licensed under a Creative Commons Attribution-NonCommercial-No Derivative Works 4.0 License.
Funders: Science Foundation Ireland
ID Code: 27730
Deposited On: 10 Nov 2022 12:50 by Suzanne Little. Last Modified: 08 Dec 2023 15:13
Documents

Full text available as:

PDF (Camille_PhD_thesis_FINALv4.pdf), 9MB – Creative Commons: Attribution-Noncommercial-No Derivative Works 4.0
