Abstract:
The advancement of digitisation in various scientific disciplines has generated data with numerous variables. Gaussian graphical models (GGMs) offer a convenient framework for analysing and interpreting the conditional relationships among these variables, with network inference relying on estimation of the precision matrix within a multivariate Gaussian framework. Two novel Bayesian shrinkage methods are proposed for estimating the precision matrix. The first develops a Bayesian treatment of the frequentist ridge precision estimator with the common ℓ2 penalty, allowing for networks that are not necessarily highly sparse. The second accommodates diverse sparsity patterns by enabling both ℓ1- and ℓ2-based shrinkage within a naïve elastic net setting. Full block Gibbs samplers are provided for implementing the new estimators. The Bayesian graphical ridge and naïve elastic net priors are extended to allow for flexible shrinkage of the off-diagonal elements of the precision matrix. Simulations and practical case studies show that the proposed estimators compare favourably with competing methods and broaden the methodological toolkit available for data analysis. In addition, a Bayesian approach for estimating differential networks (DNs), based on the Bayesian adaptive graphical lasso, is introduced. Comparisons with state-of-the-art frequentist techniques highlight the utility of the proposed approach. The novel samplers are available in the 'baygel' R package to facilitate their use and exploration by practitioners.
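For illustration only, a minimal R sketch of how such a sampler might be invoked is given below; the function name blockBGR() and its arguments, as well as the shape of the returned posterior draws, are assumptions made for this example and should be checked against the baygel documentation.

    # Minimal usage sketch; blockBGR() and its arguments are assumed
    # for illustration and may differ from the actual baygel API.
    # install.packages("baygel")
    library(baygel)
    library(MASS)

    set.seed(1)
    p <- 10
    n <- 200
    # Simulated multivariate Gaussian data with identity covariance
    X <- MASS::mvrnorm(n, mu = rep(0, p), Sigma = diag(p))

    # Hypothetical block Gibbs sampler call for the Bayesian graphical ridge
    samples <- blockBGR(X, iterations = 5000, burnIn = 1000)

    # Posterior mean of the precision matrix (assuming a p x p x draws array)
    Omega_hat <- apply(samples, c(1, 2), mean)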