When fitting simple models (as in many examples of univariate analysis) one needs to specify only the model equation (the bit that looks like y ~ mu ...) but nothing about the covariances that complete the model specification. This is because ASReml assumes that, in the absence of any additional information, the covariance structure is the product of a scalar (a variance component) by a design matrix. For example, the residual covariance matrix in simple examples is $R = I \sigma^2_e$, and the additive genetic variance matrix is $G = A \sigma^2_a$ (where $A$ is the numerator relationship matrix).

However, there are several situations where the analysis requires a more complex covariance structure, usually a direct sum or a direct product of two or more matrices. For example, an analysis of data from several sites might allow a different error variance for each site, that is $R = \bigoplus_i R_i$, where $\bigoplus$ denotes the direct sum (see any matrix algebra book for an explanation) and $R_i$ is the residual matrix for site $i$. Another example of a more complex covariance structure is a multivariate analysis at a single site, where both the residual and the additive genetic covariance matrices are constructed as the product of two matrices: for example, $R = I \otimes R_0$, where $I$ is an identity matrix whose dimension equals the number of observations, $\otimes$ is the direct (Kronecker) product operation (not to be confused with plain matrix multiplication) and $R_0$ is the error covariance matrix for the traits involved in the analysis.
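As an illustration of the matrix algebra behind these structures (not ASReml syntax), the short R sketch below builds the three forms of $R$ described above; the dimensions and variance values are hypothetical and chosen only for the example.

```r
# Minimal sketch of the covariance structures above; all sizes and
# variance values are hypothetical, for illustration only.
library(Matrix)   # for bdiag(), a block-diagonal (direct sum) constructor

n    <- 4                         # number of observations
s2_e <- 1.5                       # residual variance

# Simple case: R = I * sigma_e^2 (scalar times a design matrix)
R_simple <- diag(n) * s2_e

# Direct sum: R = R_1 (+) R_2, a different residual variance per site
R1 <- diag(3) * 1.2               # site 1: 3 observations
R2 <- diag(5) * 2.7               # site 2: 5 observations
R_sites <- as.matrix(bdiag(R1, R2))   # block-diagonal matrix = direct sum

# Direct (Kronecker) product: R = I (x) R_0 for a bivariate analysis
R0 <- matrix(c(1.0, 0.3,
               0.3, 0.8), nrow = 2)   # 2 x 2 residual covariance of the traits
R_multi <- diag(n) %x% R0             # %x% is R's Kronecker product operator

dim(R_multi)                      # (2 * n) x (2 * n)
```

The point of the sketch is only that the direct sum stacks site-specific blocks down the diagonal, while the direct product repeats the trait covariance matrix $R_0$ once per observation; ASReml builds these matrices internally once the appropriate variance structures are declared.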