Toxic Comment Classification On Civil Comments
Evaluation metric

AUROC
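AUROC (area under the ROC curve) can be read as the probability that a randomly chosen toxic comment receives a higher score than a randomly chosen non-toxic one. As a minimal illustration (not the evaluation code used by the papers in the table below), the pairwise definition can be computed directly in pure Python:

```python
def auroc(labels, scores):
    """AUROC via the pairwise (Mann-Whitney) definition:
    the fraction of positive/negative pairs where the positive
    is scored higher; ties count as half a win."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: 2 toxic (label 1) and 2 non-toxic (label 0) comments.
print(auroc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # → 0.75
```

This quadratic-time version is only for clarity; production metric libraries compute the same quantity from sorted ranks in O(n log n).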
Evaluation results

Performance of the models on this benchmark

Comparison table
| Model (paper) | AUROC |
|---|---|
| pytorch-frame-a-modular-framework-for-multi | 0.865 |
| a-benchmark-for-toxic-comment-classification | - |
| a-benchmark-for-toxic-comment-classification | 0.966 |
| a-benchmark-for-toxic-comment-classification | 0.9526 |
| a-benchmark-for-toxic-comment-classification | - |
| a-benchmark-for-toxic-comment-classification | - |
| a-benchmark-for-toxic-comment-classification | 0.979 |
| a-benchmark-for-toxic-comment-classification | - |
| pytorch-frame-a-modular-framework-for-multi | 0.882 |
| pytorch-frame-a-modular-framework-for-multi | 0.947 |
| a-benchmark-for-toxic-comment-classification | - |
| a-benchmark-for-toxic-comment-classification | 0.9804 |
| palm-2-technical-report-1 | 0.7596 |
| pytorch-frame-a-modular-framework-for-multi | 0.97 |
| a-benchmark-for-toxic-comment-classification | 0.9818 |
| a-benchmark-for-toxic-comment-classification | 0.9813 |
| palm-2-technical-report-1 | 0.8535 |
| a-benchmark-for-toxic-comment-classification | 0.9639 |
| a-benchmark-for-toxic-comment-classification | 0.9791 |
| pytorch-frame-a-modular-framework-for-multi | 0.945 |
| pytorch-frame-a-modular-framework-for-multi | 0.885 |
| a-benchmark-for-toxic-comment-classification | 0.979 |