Here we describe the variance-covariance matrix adjustment of coefficients.
Introduction
There are many ways to estimate the covariance matrix of the coefficients. In the mmrm
package, we implemented the asymptotic,
empirical, Jackknife, bias-reduced, and Kenward-Roger methods. For simplicity, the
following derivations are all for the unweighted mmrm. For the weighted mmrm, we
can follow the details of the
weighted least squares estimator.
Asymptotic Covariance
The asymptotic covariance is derived based on the estimate of $\theta$.
Following the definitions in the model fitting details, we have

$$\operatorname{cov}(\hat\beta) = (X^\top W X)^{-1}$$

where $W$ is the block diagonal matrix of the inverse of the covariance matrix of $\epsilon$.
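Because $W$ is block diagonal, $X^\top W X$ can be assembled subject by subject. A minimal numerical sketch of this (made-up design matrices and covariance blocks, not the mmrm implementation):

```python
import numpy as np

# Illustrative sketch only (made-up data, not the mmrm implementation):
# two subjects, three visits each, two coefficients.
rng = np.random.default_rng(0)
X_i = [rng.normal(size=(3, 2)) for _ in range(2)]        # per-subject design matrices
Sigma_i = [np.eye(3) + 0.5 * np.ones((3, 3))] * 2        # assumed residual covariance blocks

# W is block diagonal with blocks Sigma_i^{-1}, so X^T W X is a sum of
# per-subject contributions; the asymptotic covariance is its inverse.
XtWX = sum(Xi.T @ np.linalg.inv(Si) @ Xi for Xi, Si in zip(X_i, Sigma_i))
cov_beta = np.linalg.inv(XtWX)
```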
Empirical Covariance
The empirical covariance, also known as the robust sandwich estimator, or "CR0", is derived by replacing the covariance matrix of $\epsilon$ with the observed covariance matrix of the residuals:

$$\operatorname{cov}(\hat\beta) = (X^\top W X)^{-1}\left(\sum_{i}{X_i^\top W_i \hat\epsilon_i\hat\epsilon_i^\top W_i X_i}\right)(X^\top W X)^{-1} = (X^\top W X)^{-1}\left(\sum_{i}{X_i^\top L_i L_i^\top \hat\epsilon_i\hat\epsilon_i^\top L_i L_i^\top X_i}\right)(X^\top W X)^{-1}$$

where $W_i$ is the block diagonal part of $W$ for subject $i$, $\hat\epsilon_i$ is the vector of observed residuals for subject $i$, and $L_i$ is the Cholesky factor of $\Sigma_i^{-1}$ ($W_i = L_i L_i^\top$).
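The sandwich structure above can be sketched numerically as a "bread" and "meat" computation. The data below are made up for illustration and the names are not mmrm internals:

```python
import numpy as np

# Hedged sketch of the CR0 sandwich (made-up data; names are illustrative,
# not mmrm internals).
rng = np.random.default_rng(1)
X_i = [rng.normal(size=(3, 2)) for _ in range(4)]        # per-subject design matrices
W_i = [np.eye(3)] * 4                                    # W_i = Sigma_i^{-1}, identity here
resid_i = [rng.normal(size=3) for _ in range(4)]         # observed residuals epsilon_i

# Bread: (X^T W X)^{-1}; meat: sum_i X_i^T W_i e_i e_i^T W_i X_i.
bread = np.linalg.inv(sum(Xi.T @ Wi @ Xi for Xi, Wi in zip(X_i, W_i)))
meat = sum(Xi.T @ Wi @ np.outer(ei, ei) @ Wi @ Xi
           for Xi, Wi, ei in zip(X_i, W_i, resid_i))
cov_cr0 = bread @ meat @ bread
```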
See the detailed explanation of these formulas in the Weighted Least Square Empirical Covariance vignette.
Jackknife Covariance
The Jackknife method in mmrm
is the "leave-one-cluster-out"
method. It is also known as "CR3". Following McCaffrey and Bell (2003), we have

$$\operatorname{cov}(\hat\beta) = (X^\top W X)^{-1}\left(\sum_{i}{X_i^\top L_i (I_i - H_{ii})^{-1} L_i^\top \hat\epsilon_i\hat\epsilon_i^\top L_i (I_i - H_{ii})^{-\top} L_i^\top X_i}\right)(X^\top W X)^{-1}$$

where

$$H_{ii} = L_i^\top X_i (X^\top W X)^{-1} X_i^\top L_i$$

and $I_i$ is the identity matrix with the dimension of subject $i$'s observations.
Please note that in the paper there is an additional scale parameter $\frac{n-1}{n}$, where $n$ is the number of subjects; here we do not include this parameter.
Bias-Reduced Covariance
The bias-reduced method, also known as "CR2", provides an unbiased covariance estimate under the correct working model. Following McCaffrey and Bell (2003), we have

$$\operatorname{cov}(\hat\beta) = (X^\top W X)^{-1}\left(\sum_{i}{X_i^\top L_i (I_i - H_{ii})^{-1/2} L_i^\top \hat\epsilon_i\hat\epsilon_i^\top L_i (I_i - H_{ii})^{-\top/2} L_i^\top X_i}\right)(X^\top W X)^{-1}$$

where

$$H_{ii} = L_i^\top X_i (X^\top W X)^{-1} X_i^\top L_i$$
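The Jackknife (CR3) and bias-reduced (CR2) estimators differ from CR0 only in the per-subject leverage correction $(I_i - H_{ii})^{-p}$, with $p = 1$ for CR3 and $p = 1/2$ for CR2. A minimal numpy sketch under these definitions (made-up data, not the mmrm code; since $I_i - H_{ii}$ is symmetric, its inverse powers are computed via an eigendecomposition):

```python
import numpy as np

# Sketch of the CR3 / CR2 leverage adjustments (made-up data, not mmrm code).
rng = np.random.default_rng(2)
X_i = [rng.normal(size=(3, 2)) for _ in range(5)]        # per-subject design matrices
Sigma = np.eye(3) + 0.3 * np.ones((3, 3))                # assumed residual covariance
L_i = [np.linalg.cholesky(np.linalg.inv(Sigma))] * 5     # W_i = L_i L_i^T
resid_i = [rng.normal(size=3) for _ in range(5)]         # observed residuals epsilon_i

bread = np.linalg.inv(sum(Xi.T @ Li @ Li.T @ Xi for Xi, Li in zip(X_i, L_i)))

def inv_power(M, p):
    """M^{-p} for symmetric positive definite M, via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(w ** -p) @ V.T

def adjusted_cov(p):
    """p = 1 gives the Jackknife (CR3) form, p = 0.5 the bias-reduced (CR2) form."""
    meat = np.zeros((2, 2))
    for Xi, Li, ei in zip(X_i, L_i, resid_i):
        H_ii = Li.T @ Xi @ bread @ Xi.T @ Li             # per-subject leverage
        A = inv_power(np.eye(3) - H_ii, p)               # (I - H_ii)^{-p}, symmetric
        v = Xi.T @ Li @ A @ Li.T @ ei
        meat += np.outer(v, v)
    return bread @ meat @ bread

cov_cr3 = adjusted_cov(1.0)
cov_cr2 = adjusted_cov(0.5)
```

Note that the paper's $\frac{n-1}{n}$ scale factor for CR3 is omitted here as well, matching the convention stated above.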
Kenward-Roger Covariance
The Kenward-Roger covariance is an adjusted covariance matrix for small sample sizes. Details can be found in the Kenward-Roger vignette.