Abstract:
Artificial neural network (ANN) architecture design is a nontrivial and time-consuming task that often requires a high level of human expertise. Neural architecture search (NAS) automates the design of ANN architectures, and has proven successful in finding architectures that outperform those manually designed by human experts. In real-world implementations of machine learning and ANNs, a marginal reduction in model accuracy is often accepted as a reasonable trade-off in favour of lower computational resource demands. This study investigates the use of multi-objective evolutionary algorithms as an exploration strategy for NAS to evolve recurrent neural network (RNN) architectures. This approach allows the underlying computational resource requirements of the RNN models to be considered while maintaining an acceptable performance-related objective. Additionally, methods such as weight inheritance, early stopping, and pruning of architectural unit connections during offspring generation are investigated in the context of RNN architecture search, to allow for more efficient exploration of the RNN architecture search space.