The κ statistic can take values from –1 to 1 and is conventionally, if somewhat arbitrarily, interpreted as follows: 0 = agreement equivalent to chance; 0.10–0.20 = slight agreement; 0.21–0.40 = fair agreement; 0.41–0.60 = moderate agreement; 0.61–0.80 = substantial agreement; 0.81–0.99 = almost perfect agreement; and 1.00 = perfect agreement. Negative values indicate that the observed agreement is worse than would be expected by chance. An alternative interpretation is that κ values below 0.60 indicate a substantial degree of disagreement. Readers are referred to the following references, which address measures of agreement.

Two methods are available to assess agreement between measurements of a continuous variable made by different observers, with different instruments, on different dates, and so on. One, the intraclass correlation coefficient (ICC), provides a single index of the magnitude of agreement; the other, the Bland-Altman plot, provides a quantitative estimate of how closely the values from the two measurements agree.

Consider two ophthalmologists who measure intraocular pressure with a tonometer. Each patient therefore receives two measurements, one from each observer. The ICC provides an estimate of the overall agreement between these measurements. It is somewhat analogous to analysis of variance in that it expresses the between-pair variance as a proportion of the total variance of the observations (i.e., the total variability in the 2n observations, which is the sum of the within-pair and between-pair variances). The ICC can take values from 0 to 1, with 0 indicating no agreement and 1 indicating perfect agreement.
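To make the κ scale above concrete, the following is a minimal sketch that computes Cohen's kappa for two raters; the ratings, the rater labels, and the cohen_kappa helper are illustrative assumptions, not material from the article.

```python
# A minimal sketch of Cohen's kappa for two raters (hypothetical data).
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Compute Cohen's kappa for two lists of categorical ratings."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)

    # Observed agreement: proportion of items both raters labelled identically.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement, from each rater's marginal proportions.
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)

    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical example: two clinicians rating 10 scans as "normal"/"abnormal".
a = ["normal", "normal", "abnormal", "normal", "abnormal",
     "normal", "normal", "abnormal", "normal", "normal"]
b = ["normal", "abnormal", "abnormal", "normal", "abnormal",
     "normal", "normal", "normal", "normal", "normal"]
print(f"kappa = {cohen_kappa(a, b):.2f}")  # ~0.47: moderate agreement
```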

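The variance-based description of the ICC given above can be sketched as follows, using the one-way random-effects form computed from ANOVA mean squares; the intraocular-pressure readings and the icc_oneway helper are hypothetical examples, not data from the article.

```python
# A minimal sketch of the one-way random-effects ICC for two observers.
import numpy as np

def icc_oneway(measurements):
    """ICC(1,1): between-subject variance as a share of total variance.

    `measurements` is an (n_subjects, k_raters) array-like.
    """
    data = np.asarray(measurements, dtype=float)
    n, k = data.shape
    grand_mean = data.mean()

    # ANOVA mean squares: between subjects and within subjects (residual).
    ms_between = k * ((data.mean(axis=1) - grand_mean) ** 2).sum() / (n - 1)
    ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))

    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical intraocular pressure (mmHg): rows = patients, columns = observers.
iop = [[14, 15], [18, 17], [22, 23], [12, 13], [20, 19], [16, 16]]
print(f"ICC = {icc_oneway(iop):.2f}")  # close to 1: strong agreement
```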
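The quantities usually reported with a Bland-Altman plot, the mean difference (bias) and the 95% limits of agreement, can be computed as in the sketch below; the readings are the same hypothetical tonometry values as above, and the plot itself is omitted.

```python
# A minimal sketch of Bland-Altman summary statistics (hypothetical data).
import numpy as np

def bland_altman_limits(method_a, method_b):
    """Return the mean difference (bias) and the 95% limits of agreement."""
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diffs = a - b
    bias = diffs.mean()
    sd = diffs.std(ddof=1)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

observer_1 = [14, 18, 22, 12, 20, 16]
observer_2 = [15, 17, 23, 13, 19, 16]
bias, (lower, upper) = bland_altman_limits(observer_1, observer_2)
print(f"bias = {bias:.2f} mmHg, limits of agreement = ({lower:.2f}, {upper:.2f})")
```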