Retention Length and Memory Capacity of Recurrent Neural Networks

dc.contributor.advisor: Cleghorn, Christopher W.
dc.contributor.email: u12022404@tuks.co.za
dc.contributor.postgraduate: Pretorius, Abraham Daniel
dc.date.accessioned: 2021-01-20T07:32:24Z
dc.date.available: 2021-01-20T07:32:24Z
dc.date.created: 2021-04-01
dc.date.issued: 2020
dc.description: Dissertation (MSc (Computer Science))--University of Pretoria, 2020.
dc.description.abstract: Recurrent Neural Networks (RNNs) are variants of Neural Networks that are able to learn temporal relationships between sequences presented to the network. RNNs are often employed to learn underlying relationships in time series and sequential data. This dissertation examines the extent of RNNs' memory retention and how it is influenced by different activation functions, network structures and recurrent network types. To investigate memory retention, three approaches (and variants thereof) are used. First, the number of patterns each network is able to retain is measured. Thereafter, the length of retention is investigated. Lastly, the previous experiments are combined to measure the retention of patterns over time. During each investigation, the effect of using different activation functions and network structures is considered to determine each configuration's effect on memory retention. The dissertation concludes that the memory retention of a network is not necessarily improved by adding more parameters to the network. Activation functions have a large effect on the performance of RNNs when retaining patterns, especially temporal patterns. Deeper network structures trade less memory retention per parameter for the ability to model more complex relationships.
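The abstract's second experiment, measuring how long a recurrent network retains a pattern, can be illustrated with a toy probe: drive a small recurrent network with a random signal and test, for each delay k, how well a linear readout of the hidden state can reconstruct the input from k steps earlier. This is a minimal sketch in the spirit of a Jaeger-style short-term-memory probe; the network type, sizes, scaling and delays below are illustrative assumptions, not the dissertation's actual experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n_hidden, n_steps, max_delay = 50, 2000, 20

# Random fixed recurrent weights, scaled so the dynamics fade rather than
# explode (spectral radius below 1 gives a decaying memory of past inputs).
W = rng.normal(0, 1, (n_hidden, n_hidden))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
w_in = rng.normal(0, 1, n_hidden)

u = rng.uniform(-1, 1, n_steps)          # scalar input sequence
h = np.zeros(n_hidden)
states = np.empty((n_steps, n_hidden))
for t in range(n_steps):
    h = np.tanh(W @ h + w_in * u[t])     # tanh activation; the dissertation
    states[t] = h                        # compares several activations here

# For each delay k, fit a linear readout that tries to recover u[t-k] from
# the state at time t, and score it with the squared correlation.
washout = 100                            # discard the initial transient
capacity = []
for k in range(1, max_delay + 1):
    X = states[washout:]
    y = u[washout - k:n_steps - k]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = np.corrcoef(X @ coef, y)[0, 1]
    capacity.append(r ** 2)

print([round(c, 2) for c in capacity])   # scores decay as the delay k grows
```

The curve of squared correlations against delay gives a concrete notion of retention length: the delay at which the score drops toward zero is how far back the network's state still carries usable information about its input.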
dc.description.availability: Unrestricted
dc.description.degree: MSc (Computer Science)
dc.description.department: Computer Science
dc.identifier.citation: *
dc.identifier.other: S2021
dc.identifier.uri: http://hdl.handle.net/2263/78061
dc.language.iso: en
dc.publisher: University of Pretoria
dc.rights: © 2019 University of Pretoria. All rights reserved. The copyright in this work vests in the University of Pretoria. No part of this work may be reproduced or transmitted in any form or by any means, without the prior written permission of the University of Pretoria.
dc.subject: Recurrent Neural Networks
dc.subject: Time Series
dc.subject: Memory Capacity
dc.subject: Memory Retention
dc.subject: Temporal Series
dc.subject: UCTD
dc.title: Retention Length and Memory Capacity of Recurrent Neural Networks
dc.type: Dissertation

Files

Original bundle

Name: Pretorius_Retention_2020.pdf
Size: 4.1 MB
Format: Adobe Portable Document Format
Description: Dissertation

License bundle

Name: license.txt
Size: 1.75 KB
Format: Item-specific license agreed upon to submission
Description: