A hybrid convolutional neural network-transformer method for received signal strength indicator fingerprinting localization in long range wide area network

dc.contributor.authorLutakamale, Albert Selebea
dc.contributor.authorMyburgh, Hermanus Carel
dc.contributor.authorDe Freitas, Allan
dc.contributor.emailalbert.lutakamale@tuks.co.zaen_US
dc.date.accessioned2024-05-17T07:14:57Z
dc.date.available2024-05-17T07:14:57Z
dc.date.issued2024-07
dc.descriptionDATA AVAILABILITY : The dataset used in this work is publicly available.en_US
dc.description.abstractIn recent years, low-power wide area networks (LPWANs), particularly Long-Range Wide Area Network (LoRaWAN) technology, have increasingly been adopted in large-scale Internet of Things (IoT) applications thanks to their ability to offer cost-effective, long-range wireless communication at low power. The need to provide location-stamped communications to IoT applications, so that physical measurements from IoT devices can be meaningfully interpreted, has increased demand for incorporating location estimation capabilities into LoRaWAN networks. Fingerprint-based localization methods are becoming increasingly popular in LoRaWAN networks because of their relatively high accuracy compared to range-based localization methods. This work proposes a hybrid convolutional neural network (CNN)-transformer fingerprinting method to localize a node in a LoRaWAN network. CNNs are adopted to complement the strengths of the transformer by adding the ability to capture local features from the input data, consequently allowing the transformer, through its attention mechanism, to effectively learn global dependencies from the input data. Specifically, the proposed method first learns local location features from the input data using the CNNs and passes the resulting information to the transformer encoder, which learns global features. The output of the transformer encoder is then concatenated with the information learned at the local level and passed through a regressor for the final location estimation. The proposed method achieved a mean localization error of 290.71 m, outperforming similar state-of-the-art works in the literature evaluated on the same publicly available LoRaWAN dataset.en_US
dc.description.departmentElectrical, Electronic and Computer Engineeringen_US
dc.description.librarianhj2024en_US
dc.description.sdgSDG-09: Industry, innovation and infrastructureen_US
dc.description.urihttp://www.elsevier.com/locate/engappaien_US
dc.identifier.citationLutakamale, A.S., Myburgh, H.C. & De Freitas, A. 2024, 'A hybrid convolutional neural network-transformer method for received signal strength indicator fingerprinting localization in long range wide area network', Engineering Applications of Artificial Intelligence, vol. 133, art. 108349, pp. 1-11, doi: 10.1016/j.engappai.2024.108349.en_US
dc.identifier.issn0952-1976
dc.identifier.other10.1016/j.engappai.2024.108349
dc.identifier.urihttp://hdl.handle.net/2263/96033
dc.language.isoenen_US
dc.publisherElsevieren_US
dc.rights© 2024 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY-NC license.en_US
dc.subjectLow-power wide area network (LPWAN)en_US
dc.subjectLong-range wide area network (LoRaWAN)en_US
dc.subjectInternet of Things (IoT)en_US
dc.subjectConvolutional neural network (CNN)en_US
dc.subjectFingerprint localizationen_US
dc.subjectDeep learningen_US
dc.subjectWireless communicationsen_US
dc.subjectSDG-09: Industry, innovation and infrastructureen_US
dc.titleA hybrid convolutional neural network-transformer method for received signal strength indicator fingerprinting localization in long range wide area networken_US
dc.typeArticleen_US
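The abstract describes a pipeline in which a CNN stage extracts local features from an RSSI fingerprint, a transformer encoder learns global dependencies over those features, and the two are concatenated before a final regressor. The following is a minimal numpy sketch of that data flow only; the gateway count, kernel sizes, channel/embedding dimensions, and random weights are hypothetical illustrations and do not reflect the trained model in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(x, kernels):
    """Valid-mode 1-D convolution with ReLU. x: (G,), kernels: (C, k) -> (C, L)."""
    C, k = kernels.shape
    L = len(x) - k + 1
    out = np.empty((C, L))
    for c in range(C):
        for i in range(L):
            out[c, i] = kernels[c] @ x[i:i + k]
    return np.maximum(out, 0.0)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product attention. X: (L, d_in) -> (L, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=-1, keepdims=True)
    return A @ V

G, C, k, d = 16, 8, 3, 8                  # hypothetical dimensions
rssi = rng.normal(-100.0, 10.0, G)        # hypothetical RSSI fingerprint (dBm)

# 1. CNN stage: local features, treated as L tokens of dimension C.
local = conv1d_relu(rssi, rng.normal(size=(C, k)) * 0.1)   # (C, L)
tokens = local.T                                           # (L, C)

# 2. Transformer-encoder stage (attention core only; FFN/norms omitted).
Wq, Wk, Wv = (rng.normal(size=(C, d)) * 0.1 for _ in range(3))
global_feats = self_attention(tokens, Wq, Wk, Wv)          # (L, d)

# 3. Concatenate pooled local and global features, regress (x, y).
feat = np.concatenate([tokens.mean(axis=0), global_feats.mean(axis=0)])
W_out = rng.normal(size=(feat.size, 2)) * 0.1
xy = feat @ W_out                                          # estimated coordinates
```

Mean-pooling over tokens and the single attention head are simplifications; the point of the sketch is the ordering: local CNN features feed the attention stage, and the regressor sees both local and global representations.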

Files

Original bundle

Name: Lutakamale_Hybrid_2024.pdf
Size: 1.38 MB
Format: Adobe Portable Document Format
Description: Article

License bundle

Name: license.txt
Size: 1.71 KB
Description: Item-specific license agreed upon to submission