There is a direct correspondence between the inverse covariance matrix and multiple regression (Stephens 1998; Kwan 2014). This allows the off-diagonal elements to be converted into regression coefficients, resulting in nonconvex penalization for multiple regression modeling.
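Concretely, if Theta denotes the inverse covariance (precision) matrix, the coefficient for predicting node i from node j is -Theta[i, j] / Theta[i, i]. Below is a minimal base-R sketch of that identity, using the ptsd data from the examples; it illustrates the mapping and is not the package's internal implementation.

```r
# Minimal sketch (not the package implementation): recover regression
# coefficients from an unregularized precision matrix.
Y     <- scale(GGMncv::ptsd[,1:5])
Theta <- solve(cov(Y))            # precision (inverse covariance) matrix

betas <- -Theta / diag(Theta)     # beta[i, j] = -Theta[i, j] / Theta[i, i]
diag(betas) <- 0                  # no self-prediction

# check against ordinary least squares for the first node
round(betas[1, -1], 4)
round(coef(lm(Y[,1] ~ 0 + Y[,-1])), 4)
```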
```r
# S3 method for ggmncv
coef(object, ...)
```
| Argument | Description |
|---|---|
| `object` | An object of class `ggmncv`. |
| `...` | Currently ignored. |
A matrix of regression coefficients.
The coefficients can be accessed via `coefs[1,]`, which provides the estimates for predicting the first node.
Further, because the model is fitted to a correlation matrix, the estimates correspond to a regression in which both the outcome and the predictors are scaled to have mean 0 and standard deviation 1 (i.e., standardized regression coefficients).
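If coefficients on the original scale of the data are needed, they can be recovered by rescaling with the observed standard deviations. A small sketch, under the assumption that the columns of `coefs[1,]` follow the order of the remaining nodes (node 2 through node 5 here):

```r
# Sketch (assumption, not part of the package API): return the standardized
# coefficients for node 1 to the raw scale of the data.
Y     <- GGMncv::ptsd[,1:5]
fit   <- ggmncv(R = cor(Y), n = nrow(Y), progress = FALSE)
coefs <- coef(fit)
sds   <- apply(Y, 2, sd)

# raw-scale coefficient = standardized coefficient * sd(outcome) / sd(predictor)
beta_raw <- coefs[1,] * sds[1] / sds[-1]
```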
Kwan CC (2014).
“A regression-based interpretation of the inverse of the sample covariance matrix.”
Spreadsheets in Education, 7(1), 4613.
Stephens G (1998).
“On the Inverse of the Covariance Matrix in Portfolio Analysis.”
The Journal of Finance, 53(5), 1821--1827.
```r
# \donttest{
# data
Y <- GGMncv::ptsd[,1:5]

# correlations
S <- cor(Y)

# fit model
fit <- ggmncv(R = S, n = nrow(Y), progress = FALSE)

# regression
coefs <- coef(fit)

coefs
#> Estimates:
#>
#> node.1
#> node.2 node.3 node.4 node.5
#>  0.275      0  0.374  0.144
#> ---
#>
#> node.2
#> node.1 node.3 node.4 node.5
#>   0.24  0.561      0      0
#> ---
#>
#> node.3
#> node.1 node.2 node.4 node.5
#>      0  0.502  0.193  0.205
#> ---
#>
#> node.4
#> node.1 node.2 node.3 node.5
#>  0.324      0  0.215  0.329
#> ---
#>
#> node.5
#> node.1 node.2 node.3 node.4
#>  0.142      0   0.26  0.376
#> ---

# no regularization, resulting in OLS
# data
# note: scaled for lm()
Y <- scale(GGMncv::ptsd[,1:5])

# correlations
S <- cor(Y)

# fit model
# note: no regularization
fit <- ggmncv(R = S, n = nrow(Y),
              progress = FALSE,
              lambda = 0)

# regression
coefs <- coef(fit)

# fit lm
fit_lm <- lm(Y[,1] ~ 0 + Y[,-1])

# ggmncv
coefs[1,]
#> [1] 0.23883387 0.06552783 0.35127553 0.15686903

# lm
as.numeric(coef(fit_lm))
#> [1] 0.23881439 0.06549431 0.35128376 0.15687468
# }
```