General linear model
The general linear model or general multivariate regression model is a compact way of writing several multiple linear regression models simultaneously. In that sense it is not a separate statistical linear model. The various multiple linear regression models may be compactly written as

Y = XB + U,

where Y is a matrix with series of multivariate measurements, X is a matrix of observations on independent variables that might be a design matrix, B is a matrix containing parameters that are usually to be estimated, and U is a matrix containing errors.
The errors are usually assumed to be uncorrelated across measurements and to follow a multivariate normal distribution. If the errors do not follow a multivariate normal distribution, generalized linear models may be used to relax the assumptions about Y and U.
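The matrix form above can be sketched numerically. The following is a minimal illustration with synthetic data (the dimensions, variable names, and values are chosen here for demonstration and are not from the article): B is estimated by ordinary least squares for all dependent variables at once.

```python
import numpy as np

# Minimal sketch of Y = XB + U with synthetic data (all names and
# values here are illustrative, not from the article).
rng = np.random.default_rng(0)
n, p, m = 100, 3, 2                     # observations, predictors, responses
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])  # design matrix with intercept
B_true = rng.normal(size=(p, m))        # parameter matrix to recover
Y = X @ B_true + 0.1 * rng.normal(size=(n, m))  # measurements plus Gaussian errors U

# Ordinary least squares estimate of B: minimizes ||Y - XB|| in the Frobenius norm
B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(B_hat.shape)                      # one coefficient column per response: (3, 2)
```

With low noise and n much larger than p, B_hat lands close to B_true, one column of coefficients per dependent variable.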
The general linear model incorporates a number of different statistical models: ANOVA, ANCOVA, MANOVA, MANCOVA, ordinary linear regression, t-test and F-test. The general linear model is a generalization of multiple linear regression to the case of more than one dependent variable. If Y, B, and U were column vectors, the matrix equation above would represent multiple linear regression.
Hypothesis tests with the general linear model can be made in two ways: multivariate or as several independent univariate tests. In multivariate tests the columns of Y are tested together, whereas in univariate tests the columns of Y are tested independently, i.e., as multiple univariate tests with the same design matrix.
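The univariate route above can be sketched as follows: with one shared design matrix, ordinary least squares yields a separate t statistic for each coefficient in each column of Y. This is a hedged toy illustration on synthetic null data; all names are ours, not from the article.

```python
import numpy as np

# Several independent univariate tests with the same design matrix X:
# one t statistic per (coefficient, column-of-Y) pair. Synthetic null data.
rng = np.random.default_rng(3)
n, p, m = 80, 2, 3
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # shared design matrix
Y = rng.normal(size=(n, m))                            # null data: no real effects

XtX_inv = np.linalg.inv(X.T @ X)
B_hat = XtX_inv @ X.T @ Y                # OLS for all columns of Y at once
resid = Y - X @ B_hat
dof = n - p
sigma2 = (resid ** 2).sum(axis=0) / dof  # per-column error variance estimate
se = np.sqrt(np.outer(np.diag(XtX_inv), sigma2))  # std error of each coefficient
t_stats = B_hat / se                     # shape (p, m): a t test per column
print(t_stats.shape)                     # (2, 3)
```

Each column of t_stats would be referred to a t distribution with n − p degrees of freedom, exactly as in m separate regressions; a multivariate test would instead combine the columns into a single statistic.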
Comparison to multiple linear regression
Multiple linear regression is a generalization of simple linear regression to the case of more than one independent variable, and a special case of general linear models, restricted to one dependent variable. The basic model for multiple linear regression is

Yi = β0 + β1Xi1 + β2Xi2 + ... + βpXip + εi

for each observation i = 1,..., n.
In the formula above we consider n observations of one dependent variable and p independent variables. Thus, Yi is the ith observation of the dependent variable, and Xij is the ith observation of the jth independent variable, j = 1, 2,..., p. The values βj represent parameters to be estimated, and εi is the ith independent, identically distributed normal error.
In the more general multivariate linear regression, there is one equation of the above form for each of m > 1 dependent variables that share the same set of explanatory variables and hence are estimated simultaneously with each other:
Yij = β0j + β1jXi1 + β2jXi2 + ... + βpjXip + εij

for all observations indexed as i = 1,..., n and for all dependent variables indexed as j = 1,..., m.
Note that since each dependent variable has its own set of regression parameters to be fitted, from a computational point of view the general multivariate regression is simply a sequence of standard multiple linear regressions using the same explanatory variables.
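This equivalence is easy to verify numerically. A minimal sketch with synthetic data (all names are ours): fitting all columns of Y in one call gives the same coefficients as fitting each column separately against the same X.

```python
import numpy as np

# Verify: one multivariate least-squares fit equals m separate
# univariate fits on the same explanatory variables (synthetic data).
rng = np.random.default_rng(1)
n, p, m = 50, 4, 3
X = rng.normal(size=(n, p))
Y = rng.normal(size=(n, m))

B_joint, *_ = np.linalg.lstsq(X, Y, rcond=None)   # one multivariate fit
B_cols = np.column_stack(
    [np.linalg.lstsq(X, Y[:, j], rcond=None)[0]   # m univariate fits
     for j in range(m)]
)
print(np.allclose(B_joint, B_cols))               # True
```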
Comparison to generalized linear model
The general linear model and the generalized linear model (GLiM) are two commonly used families of statistical methods to relate some number of continuous and/or categorical predictors to a single outcome variable. The main difference between the two approaches is that the GLM strictly assumes that the residuals follow a conditionally normal distribution, while the GLiM loosens this assumption and allows for a variety of other distributions from the exponential family for the residuals. Of note, the GLM is a special case of the GLiM in which the distribution of the residuals follows a conditionally normal distribution.
The distribution of the residuals largely depends on the type and distribution of the outcome variable; different types of outcome variables lead to the variety of models within the GLiM family. Commonly used models in the GLiM family include binary logistic regression for binary or dichotomous outcomes, Poisson regression for count outcomes, and linear regression for continuous, normally distributed outcomes. This means that GLiM may be spoken of as a general family of statistical models or as specific models for specific outcome types.
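As an illustration of the GLiM side, the following sketches Poisson regression with a log link fitted by iteratively reweighted least squares, the standard maximum-likelihood algorithm for this family. This is a toy implementation on synthetic data; poisson_irls and all other names here are our own, not a library API.

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Toy Poisson regression (log link) via iteratively reweighted
    least squares -- a minimal GLiM sketch, not a production routine."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta                    # linear predictor
        mu = np.exp(eta)                  # inverse log link
        z = eta + (y - mu) / mu           # working response
        W = mu                            # working weights (Var(Y) = mu for Poisson)
        XtWX = X.T @ (X * W[:, None])
        XtWz = X.T @ (W * z)
        beta = np.linalg.solve(XtWX, XtWz)  # weighted least-squares update
    return beta

# Synthetic count data with known coefficients
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
beta_true = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ beta_true))
beta_hat = poisson_irls(X, y)
print(beta_hat)  # should lie close to beta_true
```

The same loop with identity link and constant weights reduces to ordinary least squares, which is one way to see the GLM as the normal-distribution special case of the GLiM.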
                                        | General linear model                           | Generalized linear model
Typical estimation method               | Least squares, best linear unbiased prediction | Maximum likelihood or Bayesian
Examples                                | ANOVA, ANCOVA, linear regression               | linear regression, logistic regression, Poisson regression, gamma regression, general linear model
Extensions and related methods          | MANOVA, MANCOVA, linear mixed model            | generalized linear mixed model, generalized estimating equations
R package and function                  | lm() in stats package                          | glm() in stats package
Matlab function                         | mvregress                                      | glmfit
SAS procedures                          | ,                                              | ,
Stata command                           | regress                                        | glm
SPSS command                            | ,                                              | genlin, logistic
Wolfram Language & Mathematica function | LinearModelFit                                 | GeneralizedLinearModelFit
EViews command                          | ls                                             | glm