Intra-rater reliability
In statistics, intra-rater reliability is the degree of agreement among multiple repetitions of a diagnostic test performed by a single rater.

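Intra-rater reliability is often quantified with a chance-corrected agreement statistic computed between two scoring passes by the same rater. The article does not prescribe a particular statistic; the sketch below uses Cohen's kappa on a hypothetical example (a single radiologist grading the same scans twice), purely for illustration.

from collections import Counter

def cohens_kappa(first_pass, second_pass):
    # Chance-corrected agreement between two rating passes.
    # Both passes come from the same rater scoring the same items on
    # separate occasions, so kappa here measures intra-rater reliability.
    n = len(first_pass)
    # Observed agreement: fraction of items given the same rating twice.
    p_observed = sum(a == b for a, b in zip(first_pass, second_pass)) / n
    # Expected chance agreement from each pass's marginal rating frequencies.
    freq1 = Counter(first_pass)
    freq2 = Counter(second_pass)
    p_expected = sum(freq1[c] * freq2.get(c, 0) for c in freq1) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical data: one rater grades the same 10 scans on two occasions.
pass_1 = ["normal", "abnormal", "normal", "normal", "abnormal",
          "normal", "abnormal", "normal", "normal", "abnormal"]
pass_2 = ["normal", "abnormal", "normal", "abnormal", "abnormal",
          "normal", "abnormal", "normal", "normal", "normal"]
print(f"intra-rater kappa: {cohens_kappa(pass_1, pass_2):.2f}")

In this example the rater agrees with their earlier ratings on 8 of 10 items, giving a kappa of about 0.58 once chance agreement is subtracted; a value of 1 would indicate perfect consistency across repetitions.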
See also

  • Inter-rater reliability
  • Reliability (statistics)
  • Repeatability
  • Test-retest reliability