Supplementary-architecture weight-optimization neural networks

dc.contributor.author O’Reilly, Jared
dc.contributor.author Pillay, Nelishia
dc.date.accessioned 2022-12-15T07:45:24Z
dc.date.issued 2022-07
dc.description.abstract Research efforts to improve artificial neural networks have yielded significant gains in learning ability, whether through manual refinement by researchers or through automated design by other artificial intelligence techniques, and have largely focused on the architecture of the neural networks or on the weight update equations used to optimize those architectures. However, a promising unexplored area involves extending the traditional definition of a neural network to allow a single model to consist of multiple architectures: one primary architecture and one or more supplementary architectures. To use the information from all of these architectures to improve learning, weight update equations are customized per set-of-weights; each equation may use the error of either the primary architecture or a supplementary architecture to update the values of its set-of-weights, subject to constraints that ensure valid updates. This concept was implemented and investigated. Grammatical evolution was used to make the complex architecture choices for each weight update equation, and it succeeded in finding optimal choice combinations for classification and regression benchmark datasets, the KDD Cup 1999 intrusion detection dataset, and the UCLA graduate admission dataset. These optimal combinations were compared with traditional single-architecture neural networks, which they reliably outperformed at high confidence levels across all datasets. The optimal combinations were analysed with data mining tools, which identified clear patterns, and a theoretical explanation is provided for how these patterns may be linked to optimality. The optimal combinations were also shown to be competitive with state-of-the-art techniques on the same datasets. en_US
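
To make the core idea of the abstract concrete, the following is a minimal illustrative sketch, not the authors' implementation: two output heads (one primary, one supplementary) share a hidden layer, and each set-of-weights gets its own update rule. Here the shared input weights are updated using the supplementary head's error rather than the primary's, which is one of the choice combinations the paper's grammatical evolution would search over. All names, shapes, and the toy data are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy regression data.
    X = rng.normal(size=(64, 4))
    y = (X @ rng.normal(size=4) + 0.1 * rng.normal(size=64)).reshape(-1, 1)

    # Shared input weights plus two output heads: the "primary" head
    # defines the model's prediction; the "supplementary" head exists
    # only to supply an extra error signal.
    W_shared = rng.normal(scale=0.1, size=(4, 8))
    W_primary = rng.normal(scale=0.1, size=(8, 1))
    W_supp = rng.normal(scale=0.1, size=(8, 1))

    lr = 0.01
    for _ in range(200):
        H = np.tanh(X @ W_shared)      # shared hidden layer
        err_p = H @ W_primary - y      # primary-architecture error
        err_s = H @ W_supp - y         # supplementary-architecture error

        # Per-set-of-weights update rules: each head is updated with
        # its own error, while the shared weights are driven by the
        # supplementary error (an assumed choice for illustration).
        W_primary -= lr * H.T @ err_p / len(X)
        W_supp -= lr * H.T @ err_s / len(X)
        dH = (err_s @ W_supp.T) * (1 - H**2)
        W_shared -= lr * X.T @ dH / len(X)

    print("primary MSE:", float(np.mean(err_p**2)))

In the paper itself, grammatical evolution selects which architecture's error each set-of-weights uses, rather than fixing the assignment by hand as this sketch does.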
dc.description.department Computer Science en_US
dc.description.embargo 2023-03-02
dc.description.librarian hj2022 en_US
dc.description.sponsorship The National Research Foundation of South Africa. en_US
dc.description.uri http://link.springer.com/journal/521 en_US
dc.identifier.citation O’Reilly, J., Pillay, N. Supplementary-architecture weight-optimization neural networks. Neural Computing and Applications 34, 11177–11197 (2022). https://doi.org/10.1007/s00521-022-07035-5. en_US
dc.identifier.issn 0941-0643 (print)
dc.identifier.issn 1433-3058 (online)
dc.identifier.other 10.1007/s00521-022-07035-5
dc.identifier.uri https://repository.up.ac.za/handle/2263/88820
dc.language.iso en en_US
dc.publisher Springer en_US
dc.rights © The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2022. The original publication is available at: http://link.springer.com/journal/521. en_US
dc.subject Artificial neural networks (ANN) en_US
dc.subject Weight update equations en_US
dc.subject Supplementary architectures en_US
dc.subject Neuro-evolution en_US
dc.subject Grammatical evolution en_US
dc.title Supplementary-architecture weight-optimization neural networks en_US
dc.type Postprint Article en_US

