Soft computing for the posterior of a matrix t graphical network
dc.contributor.author | Pillay, Jason | |
dc.contributor.author | Bekker, Andriette, 1958- | |
dc.contributor.author | Ferreira, Johannes Theodorus | |
dc.contributor.author | Arashi, Mohammad | |
dc.contributor.email | andriette.bekker@up.ac.za | |
dc.date.accessioned | 2025-10-16T12:08:24Z | |
dc.date.available | 2025-10-16T12:08:24Z | |
dc.date.issued | 2025-05 | |
dc.description | DATA AVAILABILITY: The authors do not have permission to share data. | |
dc.description.abstract | Modeling noisy data in a network context remains an unavoidable obstacle; fortunately, random matrix theory may comprehensively describe network environments. Noisy data necessitates the probabilistic characterization of these networks using matrix variate models. Denoising network data using a Bayesian approach is not common in the surveyed literature. This paper therefore adopts the Bayesian viewpoint and introduces a new version of the matrix variate t graphical network. The model's prior beliefs rely on the matrix variate gamma distribution to handle the noise process flexibly; from a statistical learning viewpoint, this choice aids the understanding of the structures and processes that generate network-based noise in data and offers real-world interpretability. A Gibbs sampling algorithm is proposed for computing and approximating the resulting posterior distribution of interest and for assessing the model's network centrality measures. Experiments with synthetic and real-world stock price data validate the proposed algorithm's capabilities and show that this model is more flexible than the model proposed by [13]. HIGHLIGHTS • Expansion of the framework for denoising financial data within graphical network theory, where the normality assumption is inadequate to account for the observed variation. • Introduction of the matrix variate gamma and inverse matrix variate gamma distributions as priors for the covariance matrices; the univariate scale parameter β may be fixed or assigned a prior. • Bayesian inference with these more flexible priors yields improvements on relevant accuracy measures. • Experimental results indicate that the proposed framework outperforms that of [13]. | |
dc.description.department | Statistics | |
dc.description.department | Geography, Geoinformatics and Meteorology | |
dc.description.librarian | hj2025 | |
dc.description.sdg | SDG-08: Decent work and economic growth | |
dc.description.sponsorship | The National Research Foundation and the Iran National Science Foundation. | |
dc.description.uri | http://www.elsevier.com/locate/ijar | |
dc.identifier.citation | Pillay, J., Bekker, A., Ferreira, J. & Arashi, M. 2025, 'Soft computing for the posterior of a matrix t graphical network', International Journal of Approximate Reasoning, vol. 180, art. 109397, pp. 1-18, doi: 10.1016/j.ijar.2025.109397. | |
dc.identifier.issn | 0888-613X (print) | |
dc.identifier.issn | 1873-4731 (online) | |
dc.identifier.other | 10.1016/j.ijar.2025.109397 | |
dc.identifier.uri | http://hdl.handle.net/2263/104746 | |
dc.language.iso | en | |
dc.publisher | Elsevier | |
dc.rights | © 2025 The Author(s). Published by Elsevier Inc. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/). | |
dc.subject | Adjacency matrix | |
dc.subject | Stock price data | |
dc.subject | Precision matrix | |
dc.subject | Matrix variate t | |
dc.subject | Matrix variate gamma distribution | |
dc.subject | Gaussian graphical model | |
dc.subject | Bayesian network | |
dc.title | Soft computing for the posterior of a matrix t graphical network | |
dc.type | Article |