📚 node [[inter-rater agreement]]
📓 garden/KGBicheno/Artificial Intelligence/Introduction to AI/Week 3 - Introduction/Definitions/Inter-Rater_Agreement.md by @KGBicheno
inter-rater agreement
Go back to the [[AI Glossary]]
A measurement of how often human raters agree when doing a task. If raters disagree, the task instructions may need to be improved. Also sometimes called inter-annotator agreement or inter-rater reliability. See also Cohen's kappa, which is one of the most popular inter-rater agreement measurements.
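As a rough illustration (not part of the original glossary entry), the sketch below shows how Cohen's kappa corrects the raters' observed agreement for the agreement expected by chance; the two raters and their labels are hypothetical.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: expected overlap given each rater's label distribution.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n) for c in counts_a)
    # Degenerate case p_e == 1 (both raters always use one label) is not handled here.
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical raters labelling ten items as "spam" / "not".
a = ["spam", "spam", "not", "spam", "not", "not", "spam", "not", "spam", "not"]
b = ["spam", "not", "not", "spam", "not", "not", "spam", "spam", "spam", "not"]
print(cohens_kappa(a, b))  # -> 0.6: agreement well above chance
```

A kappa of 1.0 means perfect agreement, 0.0 means agreement no better than chance, and negative values mean the raters agree less often than chance would predict.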
📖 stoas
- public document at doc.anagora.org/inter-rater_agreement
- video call at meet.jit.si/inter-rater_agreement