# Sklar’s Omega: A Gaussian copula-based framework for assessing agreement

```bibtex
@article{Hughes2018SklarsOA,
  title   = {Sklar's Omega: A Gaussian copula-based framework for assessing agreement},
  author  = {John Hughes},
  journal = {Statistics and Computing},
  year    = {2018},
  volume  = {32}
}
```

The statistical measurement of agreement is important in a number of fields, e.g., content analysis, education, computational linguistics, and sports. Its most commonly used form is inter-coder agreement (also called inter-rater reliability), i.e., consistency of scoring among two or more coders for the same units of analysis. We propose Sklar's Omega, a Gaussian copula-based framework for measuring not only inter-coder agreement but also intra-coder agreement, inter-method agreement, and…
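
The copula construction behind Sklar's Omega can be illustrated by simulation: coders' latent scores for a unit are correlated standard normals, with the agreement parameter ω as their pairwise correlation; each is pushed through the normal CDF and then through the quantile function of the marginal score distribution. The Python sketch below assumes an exchangeable correlation structure and a categorical marginal; it illustrates the data model only, not the paper's estimation procedure.

```python
import math
import random

def sample_unit(omega, probs, n_coders, rng):
    """Draw one unit's scores from a Gaussian copula with exchangeable
    correlation omega and a categorical marginal with the given probs."""
    u = rng.gauss(0, 1)  # shared unit-level effect induces correlation omega
    cum = [sum(probs[:i + 1]) for i in range(len(probs))]
    cum[-1] = 1.0  # guard against floating-point drift in the cumulative sum
    scores = []
    for _ in range(n_coders):
        z = math.sqrt(omega) * u + math.sqrt(1 - omega) * rng.gauss(0, 1)
        p = 0.5 * (1 + math.erf(z / math.sqrt(2)))  # Phi(z): a Uniform(0,1) draw
        # categorical quantile function: first category whose cumulative prob covers p
        scores.append(next(c for c, q in enumerate(cum) if p <= q))
    return scores

rng = random.Random(0)
units = [sample_unit(0.9, [0.3, 0.4, 0.3], 3, rng) for _ in range(5)]
```

At ω = 1 every coder reproduces the same score for a unit; at ω = 0 the scores are independent draws from the marginal.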

## 4 Citations

### krippendorffsalpha: An R Package for Measuring Agreement Using Krippendorff's Alpha Coefficient

- Computer Science, R J.
- 2021

The package permits users to apply the α methodology using built-in distance functions for the nominal, ordinal, interval, or ratio levels of measurement. Bootstrap inference is supported, and the bootstrap computation can be carried out in parallel.
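
The bootstrap inference described here can be sketched in a few lines (this is an illustrative Python version, not the package's R implementation, which is also parallelized): resample units with replacement, recompute the agreement statistic on each resample, and take percentile bounds. The `percent_agreement` statistic below is a deliberately simple stand-in for α.

```python
import random

def percent_agreement(units):
    """Share of units on which the two coders agree."""
    return sum(a == b for a, b in units) / len(units)

def bootstrap_ci(units, stat, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI obtained by resampling units with replacement."""
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(units) for _ in units]) for _ in range(n_boot))
    lo = reps[int(alpha / 2 * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# toy data: each tuple is (coder 1's score, coder 2's score) for one unit
data = [(1, 1), (2, 2), (1, 2), (3, 3), (2, 2), (1, 1), (2, 3), (3, 3)]
lo, hi = bootstrap_ci(data, percent_agreement)
```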

### Toward improved inference for Krippendorff's Alpha agreement coefficient

- Mathematics
- 2022

In this article I recommend a better point estimator for Krippendorff's Alpha agreement coefficient, and develop a jackknife variance estimator that leads to much better interval estimation than does…

### A case study comparison of objective and subjective evaluation methods of physical qualities in youth soccer players

- Psychology, Journal of Sports Sciences
- 2020

It is suggested that while ratings derived from objective and subjective assessment methods may be related when attempting to differentiate between distinct populations, concerns exist when evaluating homogeneous samples using these methods.

### Validity and reliability of the South African Triage Scale in prehospital providers

- Medicine, Psychology, BMC Emergency Medicine
- 2021

The study finds that SATS generally under-performed as a triage tool, mainly due to the clinical discriminators, and it is the first assessment of SATS as used by EMS providers for prehospital triage.

## References

Showing 1–10 of 91 references

### krippendorffsalpha: An R Package for Measuring Agreement Using Krippendorff's Alpha Coefficient

- Computer Science, R J.
- 2021

The package permits users to apply the α methodology using built-in distance functions for the nominal, ordinal, interval, or ratio levels of measurement. Bootstrap inference is supported, and the bootstrap computation can be carried out in parallel.

### Testing the Difference of Correlated Agreement Coefficients for Statistical Significance

- Mathematics, Educational and Psychological Measurement
- 2016

A technique similar to the classical pairwise t test for means is proposed, based on a large-sample linear approximation of the agreement coefficient; it requires neither advanced statistical modeling skills nor considerable computer programming experience.

### Computing inter-rater reliability and its variance in the presence of high agreement.

- Psychology, The British Journal of Mathematical and Statistical Psychology
- 2008

This paper explores the origin of these limitations, introduces an alternative and more stable agreement coefficient referred to as AC1, and proposes new variance estimators for the multiple-rater generalized pi and AC1 statistics whose validity does not depend on the hypothesis of independence between raters.
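
For two raters, Gwet's AC1 replaces kappa's chance-agreement term with one built from each category's average marginal probability. A minimal Python sketch, assuming complete two-rater data and the standard two-rater form of the formula:

```python
from collections import Counter

def gwet_ac1(r1, r2):
    """Gwet's AC1 for two raters scoring the same units (no missing data)."""
    n = len(r1)
    cats = set(r1) | set(r2)
    q = len(cats)
    pa = sum(a == b for a, b in zip(r1, r2)) / n  # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    # chance agreement uses the average marginal pi_c for each category
    pe = sum((c1[c] / n + c2[c] / n) / 2 * (1 - (c1[c] / n + c2[c] / n) / 2)
             for c in cats) / (q - 1)
    return (pa - pe) / (1 - pe)
```

Unlike kappa, the chance term shrinks as the marginals grow extreme, which is why AC1 stays stable in the high-agreement settings the paper discusses.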

### Beyond kappa: A review of interrater agreement measures

- Psychology
- 1999

In 1960, Cohen introduced the kappa coefficient to measure chance‐corrected nominal scale agreement between two raters. Since then, numerous extensions and generalizations of this interrater…

### High agreement but low kappa: II. Resolving the paradoxes.

- Business, Journal of Clinical Epidemiology
- 1990

### Computing Krippendorff's Alpha-Reliability

- Computer Science
- 2011

Krippendorff's alpha (α) is a reliability coefficient developed to measure the agreement among observers, coders, judges, raters, annotators, or measuring instruments drawing distinctions among…
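
For nominal data, α = 1 − Do/De, where Do is the observed disagreement and De the disagreement expected by chance, both computed from a coincidence matrix of value pairs within units. A minimal Python sketch (nominal metric only; the full coefficient also supports ordinal, interval, and ratio metrics and other missing-data handling):

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """units: one list of observed ratings per unit; units with fewer than
    two ratings carry no pairable information and are skipped."""
    o = Counter()  # coincidence counts over ordered value pairs
    for vals in units:
        m = len(vals)
        if m < 2:
            continue
        for c, k in permutations(vals, 2):
            o[(c, k)] += 1.0 / (m - 1)
    n_c = Counter()
    for (c, _), w in o.items():
        n_c[c] += w
    n = sum(n_c.values())
    d_obs = sum(w for (c, k), w in o.items() if c != k)
    d_exp = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k) / (n - 1)
    return 1.0 - d_obs / d_exp
```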

### A Coefficient of Agreement for Nominal Scales

- Psychology
- 1960

Consider Table 1. It represents in its formal characteristics a situation which arises in the clinical-social-personality areas of psychology, where it frequently occurs that the only useful level of…
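
The coefficient introduced in this paper is κ = (po − pe)/(1 − pe), where po is the observed proportion of agreement and pe the proportion expected by chance from the two raters' marginals. A minimal two-rater sketch in Python:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters scoring the same units."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n  # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    # chance agreement from the product of the raters' marginal proportions
    pe = sum(c1[c] * c2[c] for c in c1) / n ** 2
    return (po - pe) / (1 - pe)
```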

### Bayesian measures of model complexity and fit

- Mathematics
- 2002

The posterior mean deviance is suggested as a Bayesian measure of fit or adequacy, and the contributions of individual observations to the fit and complexity can give rise to a diagnostic plot of deviance residuals against leverages.
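
The DIC proposed in this paper combines the posterior mean deviance D̄ with an effective number of parameters pD = D̄ − D(θ̄), giving DIC = D̄ + pD. A toy Python sketch, assuming a normal likelihood with known variance and externally supplied posterior draws of the mean (not the authors' code):

```python
import math
import random

def deviance(mu, y, sigma=1.0):
    """-2 * log-likelihood of y under N(mu, sigma^2)."""
    return sum((yi - mu) ** 2 / sigma ** 2
               + math.log(2 * math.pi * sigma ** 2) for yi in y)

def dic(draws, y):
    """DIC = Dbar + pD, with pD = Dbar - D(posterior mean of the parameter)."""
    dbar = sum(deviance(m, y) for m in draws) / len(draws)
    pd = dbar - deviance(sum(draws) / len(draws), y)
    return dbar + pd, pd

rng = random.Random(0)
y = [rng.gauss(0.0, 1.0) for _ in range(50)]
ybar = sum(y) / len(y)
# approximate posterior for the mean under a flat prior: N(ybar, 1/n)
draws = [rng.gauss(ybar, 1.0 / math.sqrt(len(y))) for _ in range(20000)]
DIC, pD = dic(draws, y)
```

In this one-parameter model pD lands near 1, matching its interpretation as the effective number of parameters.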

### Survey Article: Inter-Coder Agreement for Computational Linguistics

- Linguistics, CL
- 2008

It is argued that weighted, alpha-like coefficients, traditionally less used than kappa-like measures in computational linguistics, may be more appropriate for many corpus annotation tasks—but that their use makes the interpretation of the value of the coefficient even harder.

### Measuring Agreement for Multinomial Data

- Mathematics
- 1982

A kappa-like statistic is proposed as a measure of agreement for a set of multinomial random variables arrayed in a two-way layout. This statistic is shown to arise either via the chance-correction of…