Open-source neural architecture search with ensemble and pre-trained networks
Lankford, Séamus (2021). Open-source neural architecture search with ensemble and pre-trained networks. International Journal of Modeling and Optimization, 11(2), pp. 33-41. ISSN 2010-3697
The training and optimization of neural networks using pre-trained, super learner, and ensemble approaches is explored. Neural networks, and in particular Convolutional Neural Networks (CNNs), are often optimized using default parameters. Neural Architecture Search (NAS) enables multiple architectures to be evaluated prior to selection of the optimal architecture. Our contribution is to develop, and make available to the community, a system that integrates open-source tools for the neural architecture search (OpenNAS) of image classification models. OpenNAS takes any dataset of grayscale or RGB images and generates the optimal CNN architecture. Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), and pre-trained models serve as base learners for ensembles. Meta learner algorithms are subsequently applied to these base learners, and the ensemble performance on image classification problems is evaluated. Our results show that a stacked generalization ensemble of heterogeneous models is the most effective approach to image classification within OpenNAS.
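As an illustrative sketch of the stacked generalization idea the abstract describes, the snippet below combines heterogeneous base learners under a meta learner using scikit-learn. Note this is an assumption-laden stand-in: OpenNAS uses PSO/ACO-searched CNNs and pre-trained networks as base learners, whereas here simple classical classifiers and a synthetic dataset are used purely to show the stacking mechanism.

```python
# Sketch of stacked generalization with heterogeneous base learners.
# The models and dataset here are placeholders, not the CNN base
# learners or image data used in OpenNAS.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Heterogeneous base learners (stand-ins for the searched/pre-trained CNNs)
base_learners = [
    ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
    ("svc", SVC(probability=True, random_state=0)),
    ("knn", KNeighborsClassifier()),
]

# The meta learner is trained on the base learners' out-of-fold predictions
stack = StackingClassifier(
    estimators=base_learners,
    final_estimator=LogisticRegression(max_iter=1000),
)
stack.fit(X_tr, y_tr)
print(round(stack.score(X_te, y_te), 3))
```

The key design point mirrored here is that the meta learner sees only the base learners' predictions, so diverse (heterogeneous) base models tend to yield the largest ensemble gains.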
Metadata
Item Type: Article (Published)
Refereed: Yes
Uncontrolled Keywords: AutoML; transfer learning; pre-trained models; ensemble; stacking; super learner; PSO; ACO; CNN