Outer product of gradients (OPG) estimation for the covariance matrix of the coefficient estimates in regression models.

Usage

vcovOPG(x, adjust = FALSE, ...)



Arguments

x: a fitted model object.


adjust: logical. Should a finite-sample adjustment be made? This amounts to multiplication with \(n/(n-k)\) where \(n\) is the number of observations and \(k\) the number of estimated parameters.


...: arguments passed to the estfun function.
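As a quick illustration of the adjustment, one can check that adjust = TRUE rescales the unadjusted estimate by \(n/(n-k)\). This is a sketch assuming the sandwich package is attached; the simulated Poisson data and the names fm, n, k are illustrative, not part of this page's example:

```r
library(sandwich)

## illustrative fit on simulated Poisson data
set.seed(1)
x <- sin(1:100)
y <- rpois(100, exp(1 + x))
fm <- glm(y ~ x, family = poisson)

n <- nobs(fm)          # number of observations
k <- length(coef(fm))  # number of estimated parameters

## adjust = TRUE should rescale the unadjusted estimate by n/(n - k)
all.equal(vcovOPG(fm, adjust = TRUE), vcovOPG(fm) * n / (n - k))
```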


Details

In correctly specified models, the “meat” matrix (cross product of the estimating functions, see meat) and the inverse of the “bread” matrix (inverse of the derivative of the estimating functions, see bread) are equal and correspond to the Fisher information matrix. Typically, an empirical version of the bread is used to estimate the information, but it is also possible to use the meat instead. This approach is known as the outer product of gradients (OPG) estimator (Cameron & Trivedi 2005).

Using the sandwich infrastructure, the OPG estimator could easily be computed via solve(meat(obj)) (modulo scaling). To employ a numerically more stable implementation of the inversion, this simple convenience function can be used: vcovOPG(obj).
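Since meat(obj) is the cross product of the estimating functions divided by the number of observations \(n\), the scaling works out to solve(meat(obj))/n. A sketch verifying this relation, assuming the sandwich package is attached and using an illustrative simulated Poisson fit:

```r
library(sandwich)

## illustrative fit on simulated Poisson data
set.seed(1)
x <- sin(1:100)
y <- rpois(100, exp(1 + x))
fm <- glm(y ~ x, family = poisson)
n <- nobs(fm)

## meat(fm) equals crossprod(estfun(fm))/n, so inverting it and
## dividing by n recovers the OPG covariance estimate
all.equal(vcovOPG(fm), solve(meat(fm)) / n, check.attributes = FALSE)
```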

Note that this only works if the estfun() method computes the maximum likelihood scores (and not a scaled version such as least-squares scores for "lm" objects).
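To illustrate the caveat: estfun() for "lm" objects returns least-squares scores (residuals times regressors, not divided by the error variance), so vcovOPG() lands on a different scale than the usual OLS covariance matrix. A sketch with illustrative simulated data (the name fm_lm is hypothetical):

```r
library(sandwich)

## illustrative linear model with error standard deviation 2
set.seed(1)
x <- sin(1:100)
y <- 1 + x + rnorm(100, sd = 2)
fm_lm <- lm(y ~ x)

## the scores are unscaled, so the OPG estimate disagrees with vcov()
isTRUE(all.equal(vcovOPG(fm_lm), vcov(fm_lm)))
```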


Value

A matrix containing the covariance matrix estimate.


References

Cameron AC and Trivedi PK (2005). Microeconometrics: Methods and Applications. Cambridge University Press, Cambridge.

Zeileis A (2006). “Object-Oriented Computation of Sandwich Estimators.” Journal of Statistical Software, 16(9), 1--16. doi:10.18637/jss.v016.i09

See also

meat, bread


Examples

## generate poisson regression relationship
x <- sin(1:100)
y <- rpois(100, exp(1 + x))
## compute usual covariance matrix of coefficient estimates
fm <- glm(y ~ x, family = poisson)
vcov(fm)
#>              (Intercept)            x
#> (Intercept)  0.005386019 -0.004587807
#> x           -0.004587807  0.009517869
vcovOPG(fm)
#>              (Intercept)            x
#> (Intercept)  0.006650112 -0.005947346
#> x           -0.005947346  0.010395918