Compute the Kullback-Leibler (KL) divergence between the true and estimated multivariate normal distributions, given their precision matrices.

kl_mvn(true, estimate, stein = FALSE)

Arguments

true

Matrix. The true precision matrix (inverse of the covariance matrix).

estimate

Matrix. The estimated precision matrix (inverse of the covariance matrix).

stein

Logical. Should Stein's loss be computed instead of the KL divergence (defaults to FALSE)? Note that the KL divergence is half of Stein's loss.

Value

Numeric. The KL divergence (or Stein's loss when stein = TRUE).

Note

A lower value is better, with a score of zero indicating that the estimated precision matrix is identical to the true precision matrix.
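
The divergence can be written directly in terms of the two precision matrices. A minimal sketch of the computation, assuming zero-mean multivariate normals (illustrative only; not the package's internal code):

# illustrative sketch: KL divergence / Stein's loss from two precision matrices
kl_mvn_sketch <- function(true, estimate, stein = FALSE) {
  p <- nrow(true)
  # estimated precision matrix times the true covariance matrix
  M <- estimate %*% solve(true)
  # Stein's loss: tr(M) - log det(M) - p
  loss <- sum(diag(M)) - as.numeric(determinant(M, logarithm = TRUE)$modulus) - p
  # the KL divergence is half of Stein's loss
  if (stein) loss else loss / 2
}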

Examples

# \donttest{

library(GGMncv)

# number of nodes
p <- 20

# generate a true network and its implied correlation matrix
main <- gen_net(p = p, edge_prob = 0.15)

# simulate n = 250 observations from the true covariance matrix
y <- MASS::mvrnorm(250, rep(0, p), main$cors)

fit_l1 <- ggmncv(R = cor(y),
                 n = nrow(y),
                 penalty = "lasso",
                 progress = FALSE)

# lasso
kl_mvn(fit_l1$Theta, solve(main$cors))
#> [1] 0.3371542

fit_atan <- ggmncv(R = cor(y),
                   n = nrow(y),
                   penalty = "atan",
                   progress = FALSE)

# atan
kl_mvn(fit_atan$Theta, solve(main$cors))
#> [1] 0.1466836
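
# Stein's loss (twice the KL divergence), using the stein argument documented above
kl_mvn(fit_atan$Theta, solve(main$cors), stein = TRUE)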

# }