Search results


Page title matches

Page text matches

  • Statistical measure of inter-rater reliability between two observers.
    106 bytes (11 words) - 20:01, 7 September 2009
  • '''Inter-rater reliability''' or '''Inter-rater agreement''' is the measurement of agreement between raters. There are a number of statistics which can be used to determine inter-rater reliability.
    4 KB (615 words) - 14:27, 18 February 2008
  • Auto-populated based on [[Special:WhatLinksHere/Inter-rater reliability]]. Needs checking by a human.
    442 bytes (56 words) - 17:29, 11 January 2010
  • {{r|Inter-rater reliability}}
    471 bytes (59 words) - 16:34, 11 January 2010
  • ...bility]] between radiologists in detecting massive pulmonary emboli, the [[inter-rater reliability]] is only moderate for segmental or smaller emboli.<ref name="pmid19931759"
    7 KB (995 words) - 06:16, 12 December 2014
  • '''Fleiss' kappa''' is a variant of [[Cohen's kappa]], a [[statistical]] measure of [[inter-rater reliability]]. Where Cohen's kappa works for only two raters, Fleiss' kappa works for any number of raters (a minimal sketch of the two-rater case follows this list).
    7 KB (939 words) - 12:41, 26 September 2007
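
A minimal sketch of the two-rater case may help make the statistics named in these results concrete. The Python function and the example ratings below are purely illustrative and are not taken from any of the pages listed; the sketch implements the standard definition kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from each rater's marginal label frequencies.

    from collections import Counter

    def cohens_kappa(ratings_a, ratings_b):
        """Cohen's kappa for two raters labelling the same items.

        kappa = (p_o - p_e) / (1 - p_e), with p_o the observed agreement
        and p_e the agreement expected by chance from the raters'
        marginal label frequencies.
        """
        assert len(ratings_a) == len(ratings_b) and ratings_a
        n = len(ratings_a)

        # Observed agreement: fraction of items both raters label identically.
        p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

        # Chance agreement: product of the two raters' marginal proportions,
        # summed over the categories.
        freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
        p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a)

        return (p_o - p_e) / (1 - p_e)

    # Hypothetical example: two raters classifying ten scans as positive/negative.
    rater_1 = ["pos", "pos", "neg", "neg", "pos", "neg", "pos", "neg", "neg", "pos"]
    rater_2 = ["pos", "neg", "neg", "neg", "pos", "neg", "pos", "pos", "neg", "pos"]
    print(f"kappa = {cohens_kappa(rater_1, rater_2):.3f}")  # 0.600 for these ratings

Fleiss' kappa generalizes the same observed-versus-chance comparison to any fixed number of raters, using pairwise agreement per item and overall category proportions in place of the two raters' marginal distributions.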