Analysis of Catastrophic Interference with Application to Spline Neural Architectures



dc.contributor.advisor Bosman, Anna Sergeevna
dc.contributor.postgraduate Van Deventer, Heinrich Pieter
dc.date.accessioned 2024-03-01T10:56:26Z
dc.date.available 2024-03-01T10:56:26Z
dc.date.created 2024-05-13
dc.date.issued 2024-02-14
dc.description Dissertation (MSc (Computer Science))--University of Pretoria, 2024 en_US
dc.description.abstract Continual learning is the sequential learning of different tasks by a machine learning model. Continual learning is known to be hindered by catastrophic interference or forgetting, i.e., the rapid unlearning of previously learned tasks when new tasks are learned. Despite their practical success, artificial neural networks (ANNs) are prone to catastrophic interference. This study analyses how gradient descent and overlapping representations between distant input points lead to distal interference and catastrophic interference. Distal interference refers to the phenomenon where training a model on one subset of the domain causes non-local changes on other subsets of the domain. This study shows that uniformly trainable models without distal interference must be exponentially large. A novel antisymmetric bounded exponential layer B-spline ANN architecture, named ABEL-Spline, is proposed that can approximate any continuous function, is uniformly trainable, has polynomial computational complexity, and provides some guarantees against distal interference. Experiments are presented to demonstrate the theoretical properties of ABEL-Splines. ABEL-Splines are also evaluated on benchmark regression problems. It is concluded that the weaker distal interference guarantees of ABEL-Splines are insufficient for model-only continual learning. It is conjectured that continual learning with polynomial-complexity models requires augmentation of the training data or the training algorithm. en_US
dc.description.availability Unrestricted en_US
dc.description.degree MSc (Computer Science) en_US
dc.description.department Computer Science en_US
dc.description.faculty Faculty of Engineering, Built Environment and Information Technology en_US
dc.description.sdg SDG-09: Industry, innovation and infrastructure en_US
dc.description.sponsorship Computing resources provided by the South African Centre for High-Performance Computing (CHPC). en_US
dc.description.sponsorship Supported by the National Research Foundation (NRF) of South Africa Thuthuka Grant Number 13819413/TTK210316590115. en_US
dc.identifier.citation * en_US
dc.identifier.doi https://doi.org/10.25403/UPresearchdata.25260349 en_US
dc.identifier.other A2024 en_US
dc.identifier.uri http://hdl.handle.net/2263/95024
dc.language.iso en en_US
dc.publisher University of Pretoria
dc.rights © 2023 University of Pretoria. All rights reserved. The copyright in this work vests in the University of Pretoria. No part of this work may be reproduced or transmitted in any form or by any means, without the prior written permission of the University of Pretoria.
dc.subject UCTD en_US
dc.subject machine learning en_US
dc.subject continual learning en_US
dc.subject catastrophic forgetting en_US
dc.subject catastrophic interference en_US
dc.subject overlapping representation en_US
dc.subject sparse distributed representation en_US
dc.subject regression en_US
dc.subject spline en_US
dc.subject artificial neural network en_US
dc.subject universal function approximation en_US
dc.title Analysis of Catastrophic Interference with Application to Spline Neural Architectures en_US
dc.type Dissertation en_US

