Retention Length and Memory Capacity of Recurrent Neural Networks


dc.contributor.advisor Cleghorn, Christopher W.
dc.contributor.postgraduate Pretorius, Abraham Daniel
dc.date.accessioned 2021-01-20T07:32:24Z
dc.date.available 2021-01-20T07:32:24Z
dc.date.created 2021-04-01
dc.date.issued 2020
dc.description Dissertation (MSc (Computer Science))--University of Pretoria, 2020. en_ZA
dc.description.abstract Recurrent Neural Networks (RNNs) are variants of neural networks that can learn temporal relationships between sequences presented to the network. RNNs are often employed to learn underlying relationships in time series and sequential data. This dissertation examines the extent of RNNs' memory retention and how it is influenced by different activation functions, network structures and recurrent network types. To investigate memory retention, three approaches (and variants thereof) are used. First, the number of patterns each network is able to retain is measured. Thereafter, the length of retention is investigated. Lastly, the previous experiments are combined to measure the retention of patterns over time. In each investigation, the effect of different activation functions and network structures is considered to determine each configuration's effect on memory retention. The dissertation concludes that the memory retention of a network is not necessarily improved by adding more parameters. Activation functions have a large effect on the performance of RNNs when retaining patterns, especially temporal patterns. Deeper network structures trade memory retention per parameter for the ability to model more complex relationships. (An illustrative sketch of a retention-length probe is given after this record.) en_ZA
dc.description.availability Unrestricted en_ZA
dc.description.degree MSc (Computer Science) en_ZA
dc.description.department Computer Science en_ZA
dc.identifier.citation * en_ZA
dc.identifier.other S2021 en_ZA
dc.identifier.uri http://hdl.handle.net/2263/78061
dc.language.iso en en_ZA
dc.publisher University of Pretoria
dc.rights © 2019 University of Pretoria. All rights reserved. The copyright in this work vests in the University of Pretoria. No part of this work may be reproduced or transmitted in any form or by any means, without the prior written permission of the University of Pretoria.
dc.subject Recurrent Neural Networks en_ZA
dc.subject Time Series en_ZA
dc.subject Memory Capacity en_ZA
dc.subject Memory Retention en_ZA
dc.subject Temporal Series en_ZA
dc.subject UCTD
dc.title Retention Length and Memory Capacity of Recurrent Neural Networks en_ZA
dc.type Dissertation en_ZA
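
The following is a minimal, illustrative sketch (not taken from the dissertation) of one way to probe retention length along the lines described in the abstract: a single-layer RNN is shown a one-hot pattern at the first time step, receives only zeros for a fixed delay, and must reproduce the pattern at the final step; the longest delay at which recall remains accurate serves as a proxy for retention length. PyTorch, the hyperparameters, the 90% accuracy threshold and the function names (make_batch, recall_accuracy) are assumptions for illustration, not the dissertation's method.

# Illustrative retention-length probe (assumed setup, not the dissertation's code).
import torch
import torch.nn as nn

def make_batch(n_patterns, delay, batch_size):
    """Sequences: a one-hot pattern at t=0, followed by `delay` all-zero steps."""
    labels = torch.randint(0, n_patterns, (batch_size,))
    seq = torch.zeros(batch_size, delay + 1, n_patterns)
    seq[torch.arange(batch_size), 0, labels] = 1.0
    return seq, labels

def recall_accuracy(delay, n_patterns=8, hidden=32, nonlinearity="tanh",
                    steps=300, batch_size=64, lr=1e-2):
    """Train a single-layer RNN to recall the initial pattern after `delay` steps."""
    rnn = nn.RNN(n_patterns, hidden, batch_first=True, nonlinearity=nonlinearity)
    readout = nn.Linear(hidden, n_patterns)
    opt = torch.optim.Adam(list(rnn.parameters()) + list(readout.parameters()), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        seq, labels = make_batch(n_patterns, delay, batch_size)
        out, _ = rnn(seq)                    # out: (batch, delay+1, hidden)
        logits = readout(out[:, -1, :])      # predict from the final hidden state
        loss = loss_fn(logits, labels)
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():
        seq, labels = make_batch(n_patterns, delay, 512)
        out, _ = rnn(seq)
        preds = readout(out[:, -1, :]).argmax(dim=1)
        return (preds == labels).float().mean().item()

if __name__ == "__main__":
    for nonlinearity in ("tanh", "relu"):
        retention = 0
        for delay in (1, 5, 10, 20, 40):
            acc = recall_accuracy(delay, nonlinearity=nonlinearity)
            if acc >= 0.9:                   # arbitrary "retained" threshold
                retention = delay
            print(f"{nonlinearity:5s} delay={delay:3d} accuracy={acc:.2f}")
        print(f"{nonlinearity}: longest delay with >=90% recall: {retention}")

Comparing the reported retention across activation functions, hidden sizes or recurrent cell types (nn.RNN, nn.LSTM, nn.GRU) mirrors, in spirit, the kind of configuration comparison the abstract describes.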

