dc.contributor.advisor |
Jacobs, J. Pieter |
|
dc.contributor.postgraduate |
Kok, Pieter Cornelius |
|
dc.date.accessioned |
2024-02-19T14:15:58Z |
|
dc.date.available |
2024-02-19T14:15:58Z |
|
dc.date.created |
2024-04 |
|
dc.date.issued |
2023-12 |
|
dc.description |
Dissertation (MEng (Computer Engineering))--University of Pretoria, 2023. |
en_US |
dc.description.abstract |
Instrument playing technique classification is a problem in music information retrieval (MIR) that has been explored only selectively, in the context of specific instrumentations or datasets. Classifying playing techniques together with pitch is a further challenge that takes a step closer to automatic music transcription (AMT) with playing technique annotation. Traditional deep learning methods have been used for instrument classification, playing technique classification and multiple-instrument transcription; however, annotated data for the combined problem are scarce, which makes it hard to train a sufficiently complex deep neural network that generalizes to many different instruments, playing styles and recording conditions. This study presents a few-shot learning model for joint instrument, playing technique and pitch classification of single tones using prototypical networks. The few-shot nature of the model allows it to be trained on the data that are available and to adapt to new instruments, playing techniques or recording conditions at inference time from only a few examples. This model could form part of a tutorial system in which a music student records scales of a given playing technique under the supervision of a music teacher; these recordings are later used to match a performance against the technique and evaluate it.
Different deep neural network (DNN) architectures and both log-mel spectrogram and constant-Q transform (CQT) input features are compared. The few-shot models are compared to standard neural network classifier models with transfer learning to show that the few-shot models generalize better to previously unseen playing techniques. Model training is optimized with Bayesian optimization. Prototypical models outperform standard classifier models with transfer learning in all experiments.
The 3-shot CQT convolutional neural network (CNN) model performs best on the joint classification task and achieves a macro F-score of 0.64 on previously unseen playing technique classes from the Studio On Line (OrchideaSOL) string instrument playing technique dataset, which shows that the prototypical model can generalize to a new dataset without much loss of performance compared to evaluation on the training classes. The model also achieves a macro F-score of up to 0.855 on individual instruments, which shows promise for its use in a tutorial setup for any of the string instruments. The models perform just as well when evaluated on extracts from YouTube tutorials and on examples of clarinet playing techniques from the Real World Computing (RWC) dataset. The few-shot model also functions as a multitask model, capable of classifying pitch, playing technique or instrument from a recorded sample. The best joint instrument, playing technique and pitch classification prototypical model classifies both playing technique and pitch accurately, and does so as well as or better than models trained more specifically on these problems when compared on the same data. Furthermore, instrument, playing technique and pitch classification in the presence of piano accompaniment is investigated; this results in some loss of generalization, but still shows promise for the task of main melody extraction, as pitch classification accuracy remains high. |
en_US |
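The abstract above centres on prototypical networks, which classify a query by comparing its embedding to per-class prototypes, i.e. the mean embeddings of a handful of labelled support examples. The sketch below illustrates only that classification step under assumed inputs: the random feature vectors, embedding size and class count are illustrative stand-ins for the dissertation's CNN embeddings of CQT features, not the author's actual architecture or data.

```python
# Minimal sketch of prototypical-network classification (Snell et al., 2017).
# All shapes and tensors here are hypothetical placeholders.
import torch


def prototype_classify(support, support_labels, query, n_classes):
    """Classify query embeddings by distance to class prototypes.

    support:        (n_support, d) embeddings of the few labelled examples
    support_labels: (n_support,) integer class labels in [0, n_classes)
    query:          (n_query, d) embeddings to classify
    Returns (n_query,) predicted class indices.
    """
    # Prototype = mean embedding of each class's support examples.
    prototypes = torch.stack([
        support[support_labels == c].mean(dim=0) for c in range(n_classes)
    ])                                            # (n_classes, d)

    # Squared Euclidean distance from every query to every prototype.
    dists = torch.cdist(query, prototypes) ** 2   # (n_query, n_classes)

    # Nearest prototype wins; -dists could also feed a softmax during training.
    return dists.argmin(dim=1)


if __name__ == "__main__":
    torch.manual_seed(0)
    d, n_classes = 64, 3                    # embedding size, e.g. 3 techniques
    # Stand-in for embeddings of CQT patches: 3 support "shots" per class.
    support = torch.randn(3 * n_classes, d)
    support_labels = torch.arange(n_classes).repeat_interleave(3)
    query = support[::3] + 0.01 * torch.randn(n_classes, d)
    print(prototype_classify(support, support_labels, query, n_classes))
```

In this framing, adapting to a new instrument, technique or recording condition only requires embedding a few new support examples and recomputing the prototypes; no retraining of the embedding network is needed, which is what makes the few-shot tutorial scenario described in the abstract feasible.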
dc.description.availability |
Unrestricted |
en_US |
dc.description.degree |
Master of Engineering (Computer Engineering) |
en_US |
dc.description.department |
Electrical, Electronic and Computer Engineering |
en_US |
dc.description.faculty |
Faculty of Engineering, Built Environment and Information Technology |
en_US |
dc.description.sdg |
None |
en_US |
dc.identifier.citation |
* |
en_US |
dc.identifier.doi |
10.25403/UPresearchdata.25242838 |
en_US |
dc.identifier.other |
April 2024 (A2024) |
en_US |
dc.identifier.uri |
http://hdl.handle.net/2263/94734 |
|
dc.language.iso |
en |
en_US |
dc.publisher |
University of Pretoria |
|
dc.rights |
© 2023 University of Pretoria. All rights reserved. The copyright in this work vests in the University of Pretoria. No part of this work may be reproduced or transmitted in any form or by any means, without the prior written permission of the University of Pretoria. |
|
dc.subject |
UCTD |
en_US |
dc.subject |
Instrument playing technique classification |
en_US |
dc.subject |
Pitch classification |
en_US |
dc.subject |
Instrument recognition |
en_US |
dc.subject |
Few-shot learning |
en_US |
dc.subject |
Prototypical network |
en_US |
dc.subject |
Transfer learning |
en_US |
dc.title |
Few-shot Learning for Joint Classification of Instrument, Pitch, and Playing Technique of Tones Produced by Bowed String Instruments |
en_US |
dc.type |
Dissertation |
en_US |