Commit 18a2770: pre-commit pass
lwaekfjlk committed Apr 6, 2024 (1 parent: bb1397b)
Showing 2 changed files with 3 additions and 4 deletions.
3 changes: 1 addition & 2 deletions in human_eval/README.md

````diff
@@ -113,7 +113,7 @@
 data3 annotated by A,D
 data4 annotated by B,E
 ```
 
-We support three agreement metrics for such unpaired multi-annotator annotation: 
+We support three agreement metrics for such unpaired multi-annotator annotation:
 
 1. Pairwise agreement
@@ -130,4 +130,3 @@
 4. Fleiss's kappa
 
 Fleiss' Kappa is an extension of Cohen's Kappa for situations involving three or more raters. It assesses the reliability of agreement among a fixed number of raters when assigning categorical ratings to a number of items. Like Cohen's Kappa, it also accounts for agreement occurring by chance.
-
````

4 changes: 2 additions & 2 deletions in human_eval/agreement.py

```diff
@@ -193,13 +193,13 @@ def computeAlpha(

 scores = computeAlpha(longDf, "ratingBinary", groupCol="id")
 rkappa = computeFleissKappa(longDf, "ratingBinary", "id", 2, "randolf")
-#fkappa = computeFleissKappa(longDf, "ratingBinary", "id", 2, "fleiss")
+# fkappa = computeFleissKappa(longDf, "ratingBinary", "id", 2, "fleiss")
 ppa = scores["ppa"]
 alpha = scores["alpha"]
 results = {
     "Pairwise Agreement": round(ppa, 4),
     "Krippendorf's Alpha": round(alpha, 4),
     "Randolph's Kappa": round(rkappa, 4),
-    #"Fleiss' Kappa": round(fkappa, 4),
+    # "Fleiss' Kappa": round(fkappa, 4),
 }
 print(results)
```
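The two variants in the code diff above, Randolph's kappa (`"randolf"`) and the commented-out Fleiss' kappa (`"fleiss"`), share the same observed-agreement term and differ only in the chance-agreement term: Randolph's free-marginal kappa assumes uniform chance 1/k over categories, while Fleiss' kappa estimates chance from the observed category marginals. A sketch of that distinction using a hypothetical helper (the repository's `computeFleissKappa` signature is not reproduced here):

```python
# Sketch of the "fleiss" vs "randolf" distinction: both share the mean
# observed agreement P_bar and differ only in the chance term P_e.
# Hypothetical helper, not the repository's computeFleissKappa.

def multi_rater_kappa(counts, method="fleiss"):
    N, k = len(counts), len(counts[0])
    n = sum(counts[0])  # raters per item (assumed constant)

    p_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts
    ) / N

    if method == "randolf":
        p_e = 1 / k  # free-marginal: uniform chance over categories
    else:
        p_j = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
        p_e = sum(p * p for p in p_j)  # fixed-marginal (Fleiss)

    return (p_bar - p_e) / (1 - p_e)

# Skewed marginals make the two diverge sharply:
counts = [[3, 1], [4, 0], [4, 0]]
print(round(multi_rater_kappa(counts, "fleiss"), 4))   # -0.0909
print(round(multi_rater_kappa(counts, "randolf"), 4))  # 0.6667
```

With nearly every rating landing in one category, Fleiss' marginal-based chance term is very high, pushing its kappa near zero even though raters mostly agree; Randolph's free-marginal variant does not penalize skewed prevalence, which is often why it is preferred for annotation tasks with rare labels.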
