Contrast (statistics)
In statistics, particularly analysis of variance, a contrast is a linear combination of two or more factor level means (averages) whose coefficients add up to zero. A simple contrast is the difference between two means. A contrast may be any of: the set of coefficients used to specify a comparison; the specific value of the linear combination obtained for a given study or experiment; or the random quantity defined by applying the linear combination to treatment effects when these are themselves considered as random variables. In this last context, the term contrast variable is sometimes used.
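As a minimal sketch with hypothetical group means, the defining property (coefficients summing to zero) and the value of a simple contrast can be checked directly:

```python
# Hypothetical group means for two factor levels
means = [12.0, 9.5]

# A simple contrast: the difference between two means,
# specified by the coefficient vector (1, -1)
coefficients = [1, -1]

# A valid contrast requires the coefficients to sum to zero
assert sum(coefficients) == 0

# The value of the contrast is the linear combination of the means
contrast_value = sum(c * m for c, m in zip(coefficients, means))
print(contrast_value)  # 2.5
```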

Background

Contrasts are sometimes used to compare mixed effects. A common example is the difference between two test scores, one taken at the beginning of the semester and one at its end. Note that we are not interested in either score by itself, but only in the contrast (in this case, the difference). Since this is a linear combination of independent variables, its variance is the correspondingly weighted sum of the individual variances; in this case both weights are one. This "blending" of two variables into one can be useful in many settings, such as ANOVA or regression, or even as a descriptive statistic in its own right.

Another example would be comparing 5 standard treatments to a new treatment, hence giving each old treatment a weight of 1/5 and the new sixth treatment a weight of −1. If this linear combination has mean zero, then on average the old treatments do not differ from the new one.
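The five-versus-new comparison above can be sketched with hypothetical treatment means; the contrast value is the average of the old treatment means minus the new treatment mean:

```python
# Hypothetical means: five standard treatments, then one new treatment
means = [10.0, 10.4, 9.8, 10.2, 9.6, 11.0]

# Contrast comparing the average of the old treatments to the new one
coefficients = [1 / 5] * 5 + [-1]
assert abs(sum(coefficients)) < 1e-12  # coefficients sum to zero

contrast_value = sum(c * m for c, m in zip(coefficients, means))
# Equivalent to: mean of the five old treatments minus the new mean
print(contrast_value)  # -1.0
```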

The usual results for linear combinations of independent random variables mean that the variance of a contrast is equal to the sum of the variances weighted by the squared coefficients. If two contrasts are orthogonal, estimates created by using such contrasts will be uncorrelated. This has implications for ANOVA and regression, where in certain designed experiments the design ensures that estimates for a number of different parameters, each related to a different contrast, will be uncorrelated. Thus, if orthogonal contrasts are available, the results of a statistical analysis can be summarised in a simple analysis of variance table, containing the results for different test statistics relating to different contrasts, each of which is statistically independent.
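Both facts can be sketched with hypothetical numbers: the variance formula for a contrast of independent estimates, and the zero cross-product condition that makes two contrasts orthogonal:

```python
# Hypothetical variances of four independent group-mean estimates
variances = [0.5, 0.5, 0.5, 0.5]

# Two contrasts over the four groups
c1 = [1, -1, 0, 0]
c2 = [0, 0, 1, -1]

# Variance of a contrast of independent estimates:
# the sum of the individual variances weighted by squared coefficients
var_c1 = sum(c**2 * v for c, v in zip(c1, variances))
print(var_c1)  # 1.0

# Orthogonality: the sum of cross-products of the coefficients is zero,
# so estimates built from c1 and c2 are uncorrelated
cross_product = sum(a * b for a, b in zip(c1, c2))
print(cross_product)  # 0
```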

A recent development in statistical analysis is the standardized mean of a contrast variable (SMCV), defined as the mean of a contrast variable divided by its standard deviation. It compares the size of the difference between groups, as measured by a contrast, with the accuracy with which that contrast can be measured in a given study or experiment.
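Under this definition, a sample estimate of the SMCV can be sketched as the sample mean of a contrast variable divided by its sample standard deviation. Here the contrast variable is the per-subject difference of two scores (hypothetical data):

```python
import statistics

# Hypothetical paired scores from two occasions
before = [70, 65, 80, 75, 60]
after = [78, 70, 85, 82, 65]

# The contrast variable: per-subject difference (coefficients 1 and -1)
contrast = [a - b for a, b in zip(after, before)]

# Sample estimate of the standardized mean of the contrast variable:
# its mean divided by its standard deviation
smcv = statistics.mean(contrast) / statistics.stdev(contrast)
print(round(smcv, 3))  # 4.243
```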

Types of contrast

  • Orthogonal contrasts are a set of contrasts in which, for any distinct pair, the sum of the cross-products of the coefficients is zero.
  • Orthonormal contrasts are orthogonal contrasts which satisfy the additional condition that, for each contrast, the sum of the squares of the coefficients adds up to one.
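For illustration, an orthogonal contrast can be rescaled into an orthonormal one by dividing each coefficient by the square root of the sum of squared coefficients (hypothetical coefficients):

```python
import math

# A valid contrast over three groups: coefficients sum to zero
c = [1, 1, -2]
assert sum(c) == 0

# Rescale so the squared coefficients sum to one (orthonormal condition)
norm = math.sqrt(sum(x**2 for x in c))
c_normed = [x / norm for x in c]

print(round(sum(x**2 for x in c_normed), 10))  # 1.0
```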


The source of this article is wikipedia, the free encyclopedia.  The text of this article is licensed under the GFDL.
 