Approximation approaches for training neural network problems with dynamic mini-batch sub-sampled losses

dc.contributor.advisor Wilke, Daniel Nicolas
dc.contributor.postgraduate Chae, Younghwan
dc.date.accessioned 2022-02-25T12:13:06Z
dc.date.available 2022-02-25T12:13:06Z
dc.date.created 2022-05-13
dc.date.issued 2021
dc.description Thesis (PhD (Mechanical Engineering))--University of Pretoria, 2021. en_ZA
dc.description.abstract The learning rate schedule is a sensitive and challenging hyperparameter to resolve in machine learning. It needs to be resolved whenever the model, the data, the data preprocessing or the data batching changes. The implications of poorly resolved learning rates include poor models, high computing cost, excessive training time, and an excessive carbon footprint. In addition, deep neural network (DNN) architectures routinely require billions of parameters, with GPT-3 utilizing 175 billion parameters and an estimated 12 million USD to train. Mini-batch sub-sampling introduces bias and variance that can manifest in several ways. Considering a line search along a descent direction, the implications are either smooth loss functions with large bias (static sub-sampling) or point-wise discontinuous loss functions with low bias but high variance in the function response (dynamic sub-sampling). Two previous studies demonstrated that line searches have the potential to automate learning rate selection. In both cases, learning rates are resolved for point-wise discontinuous functions, using approaches that include Bayesian regression and direct optimization with a gradient-only line search (GOLS). This is an explorative study that investigates the potential of surrogates to resolve learning rates, instead of direct optimization of the loss function. We aim to identify domains that warrant further investigation, for which purpose we introduce a new robustness measure to compare algorithms more sensibly. Consequently, we start our surrogate investigation at the fundamental level, considering the most basic form of each approach. This isolates the essence and removes unnecessary complexity. We do, however, retain selected complexity deemed crucial, such as dynamic sub-sampling. Hence, this is an explorative study, and not yet another study that proposes a state-of-the-art (SOTA) algorithm on a carefully curated dataset with carefully curated baseline algorithms against which to compare. The three fundamentally different approaches to resolving learning rates with surrogates are (1) the construction of one-dimensional quadratic surrogates for point-wise discontinuous functions, resolving learning rates by minimization; (2) the construction of one-dimensional classifiers, resolving learning rates from a gradient-only perspective using classification; and (3) sub-dimensional surrogates (higher than one-dimensional) on smooth loss functions, isolating the identification of appropriate bases on simple test problems. This study concludes that approaches 1 and 2 warrant further investigation, with the longer-term goal of extension to sub-dimensional surrogates to enhance efficiency. en_ZA
dc.description.availability Unrestricted en_ZA
dc.description.degree PhD (Mechanical Engineering) en_ZA
dc.description.department Mechanical and Aeronautical Engineering en_ZA
dc.description.sponsorship Department of Mechanical and Aeronautical Engineering en_ZA
dc.identifier.citation * en_ZA
dc.identifier.other A2022 en_ZA
dc.identifier.uri http://hdl.handle.net/2263/84238
dc.language.iso en en_ZA
dc.publisher University of Pretoria
dc.rights © 2022 University of Pretoria. All rights reserved. The copyright in this work vests in the University of Pretoria. No part of this work may be reproduced or transmitted in any form or by any means, without the prior written permission of the University of Pretoria.
dc.subject UCTD en_ZA
dc.subject Line search
dc.subject SNN-GPP
dc.subject Gradient only
dc.subject Neural network
dc.subject Approximation
dc.title Approximation approaches for training neural network problems with dynamic mini-batch sub-sampled losses en_ZA
dc.type Thesis en_ZA
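
The first two approaches named in the abstract can be illustrated with a compact numerical sketch. The code below is not taken from the thesis: the quadratic test function, the additive noise model standing in for dynamic mini-batch sub-sampling, the sample counts, and the helper names (noisy_loss, quadratic_surrogate_step, gradient_only_step) are all assumptions made for illustration. The sign-change bisection only mimics the gradient-only (SNN-GPP-style) idea and is not the thesis's classifier.

```python
# Minimal sketch: resolving a learning rate along a fixed descent direction,
# (a) by minimizing a 1-D quadratic surrogate fitted to noisy loss samples,
# (b) by bracketing a sign change of the noisy directional derivative.
import numpy as np

rng = np.random.default_rng(0)


def noisy_loss(alpha):
    """Stand-in for a dynamically sub-sampled loss along the search direction.
    The underlying 1-D restriction is (alpha - 0.7)**2; each call re-samples a
    mini-batch, modelled here as additive noise (point-wise discontinuous)."""
    return (alpha - 0.7) ** 2 + 0.05 * rng.normal()


def noisy_dir_derivative(alpha):
    """Stand-in for the directional derivative, perturbed by re-sampling noise."""
    return 2.0 * (alpha - 0.7) + 0.05 * rng.normal()


def quadratic_surrogate_step(alphas):
    """Approach 1: fit f(alpha) ~ a*alpha**2 + b*alpha + c by least squares to
    noisy loss samples and return the surrogate minimiser (if convex)."""
    f = np.array([noisy_loss(a) for a in alphas])
    A = np.vander(alphas, 3)              # columns: alpha**2, alpha, 1
    a, b, _ = np.linalg.lstsq(A, f, rcond=None)[0]
    if a <= 0:                            # surrogate not convex: fall back
        return alphas[np.argmin(f)]
    return -b / (2.0 * a)


def gradient_only_step(lo=0.0, hi=2.0, iters=20):
    """Approach 2 (gradient-only flavour): bisect on the *sign* of the
    directional derivative to bracket a sign change, ignoring the noisy
    magnitude of the loss itself."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if noisy_dir_derivative(mid) < 0.0:   # still descending: step further
            lo = mid
        else:                                 # ascending: step back
            hi = mid
    return 0.5 * (lo + hi)


if __name__ == "__main__":
    alphas = np.linspace(0.0, 2.0, 9)
    print("quadratic-surrogate learning rate:", quadratic_surrogate_step(alphas))
    print("gradient-only learning rate:      ", gradient_only_step())
```

Both routines should recover a step length near 0.7 despite the noise; the first relies on the loss values, the second only on the sign of the directional derivative, which is the distinction the abstract draws between approaches 1 and 2.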

