Results

Ranks were computed by summing each system's correlation values across all four levels of comparison. The sum of the Pearson correlations determines the official rank for Task 3; we also provide a second ranking based on the sum of the Spearman correlations, which teams may use for comparison. The "Difference" column gives the official rank minus the Spearman rank. The tables directly below report the Pearson correlations; the per-level Spearman correlations are listed under "Spearman Correlation Results" further down.
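
To make the ranking procedure concrete, here is a minimal sketch in Python that reproduces it for the first two rows of the tables below; the per-level scores are copied from those rows, and a level a team did not submit simply contributes nothing to its sum.

```python
# Minimal sketch of the ranking procedure: sum each system's per-level
# correlations, rank by the Pearson sum (official) and by the Spearman sum
# (secondary), and report the difference between the two ranks.
pearson = {
    "SimCompass run1": [0.811, 0.742, 0.415, 0.356],
    "ECNU run1":       [0.834, 0.771, 0.315, 0.269],
}
spearman = {
    "SimCompass run1": [0.801, 0.728, 0.424, 0.344],
    "ECNU run1":       [0.821, 0.757, 0.306, 0.263],
}

def rank_by_sum(per_level):
    """Rank systems by the sum of their per-level correlations (1 = best)."""
    order = sorted(per_level, key=lambda s: sum(per_level[s]), reverse=True)
    return {system: rank for rank, system in enumerate(order, start=1)}

official = rank_by_sum(pearson)    # official ranking (Pearson sums)
secondary = rank_by_sum(spearman)  # secondary ranking (Spearman sums)

for system in official:
    # "Difference" column: official rank minus Spearman rank.
    print(system,
          round(sum(pearson[system]), 3),   # summed Pearson correlations
          official[system],                 # official rank
          secondary[system],                # Spearman rank
          official[system] - secondary[system])
```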

Four system runs were submitted after the deadline (one of them a corrected resubmission); they are included in a separate table below for comparison, together with the LCS baseline, but are unranked.

| Team | System | Para-2-Sent | Sent-2-Phr | Phr-2-Word | Word-2-Sense | Official Rank | Spearman Rank | Difference |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| SimCompass | run1 | 0.811 | 0.742 | 0.415 | 0.356 | 1 | 1 | 0 |
| ECNU | run1 | 0.834 | 0.771 | 0.315 | 0.269 | 2 | 2 | 0 |
| UNAL-NLP | run2 | 0.837 | 0.738 | 0.274 | 0.256 | 3 | 6 | -3 |
| SemantiKLUE | run1 | 0.817 | 0.754 | 0.215 | 0.314 | 4 | 4 | 0 |
| UNAL-NLP | run1 | 0.817 | 0.739 | 0.252 | 0.249 | 5 | 7 | -2 |
| UNIBA | run2 | 0.784 | 0.734 | 0.255 | 0.180 | 6 | 8 | -2 |
| UNIBA | run1 | 0.769 | 0.729 | 0.229 | 0.165 | 7 | 10 | -3 |
| UNIBA | run3 | 0.769 | 0.729 | 0.229 | 0.165 | 8 | 11 | -3 |
| BUAP | run1 | 0.805 | 0.714 | 0.162 | 0.201 | 9 | 13 | -4 |
| BUAP | run2 | 0.805 | 0.714 | 0.142 | 0.194 | 10 | 9 | 1 |
| Meerkat_Mafia | pairingWords | 0.794 | 0.704 | -0.044 | 0.389 | 11 | 12 | -1 |
| HULTECH | run1 | 0.693 | 0.665 | 0.254 | 0.150 | 12 | 16 | -4 |
| HULTECH | run3 | 0.669 | 0.671 | 0.232 | 0.137 | 13 | 15 | -2 |
| RTM-DCU | run3 | 0.780 | 0.677 | 0.208 | - | 14 | 17 | -3 |
| HULTECH | run2 | 0.667 | 0.633 | 0.180 | 0.169 | 15 | 14 | 1 |
| RTM-DCU | run1 | 0.786 | 0.666 | 0.171 | - | 16 | 18 | -2 |
| Meerkat_Mafia | SuperSaiyan | 0.834 | 0.777 | - | - | 17 | 19 | -2 |
| Meerkat_Mafia | Hulk2 | 0.826 | 0.705 | - | - | 18 | 20 | -2 |
| RTM-DCU | run2 | 0.747 | 0.588 | 0.164 | - | 19 | 22 | -3 |
| FBK-TR | run3 | 0.759 | 0.702 | - | - | 20 | 23 | -3 |
| FBK-TR | run1 | 0.751 | 0.685 | - | - | 21 | 24 | -3 |
| FBK-TR | run2 | 0.770 | 0.648 | - | - | 22 | 25 | -3 |
| Duluth | Duluth2 | 0.501 | 0.450 | 0.241 | 0.219 | 23 | 21 | 2 |
| AI-KU | run1 | 0.732 | 0.680 | - | - | 24 | 26 | -2 |
| UNAL-NLP | run3 | 0.708 | 0.620 | - | - | 25 | 27 | -2 |
| AI-KU | run2 | 0.698 | 0.617 | - | - | 26 | 28 | -2 |
| TCDSCSS | run2 | 0.607 | 0.552 | - | - | 27 | 29 | -2 |
| JU-Evora | run1 | 0.536 | 0.442 | 0.090 | 0.091 | 28 | 31 | -3 |
| TCDSCSS | run1 | 0.575 | 0.541 | - | - | 29 | 30 | -1 |
| Duluth | Duluth1 | 0.458 | 0.440 | 0.075 | 0.076 | 30 | 5 | 25 |
| Duluth | Duluth3 | 0.455 | 0.426 | 0.075 | 0.079 | 31 | 3 | 28 |
| OPI | run1 | - | 0.433 | 0.213 | 0.152 | 32 | 36 | -4 |
| SSMT | run1 | 0.789 | - | - | - | 33 | 34 | -1 |
| DIT | run1 | 0.785 | - | - | - | 34 | 32 | 2 |
| DIT | run2 | 0.784 | - | - | - | 35 | 33 | 2 |
| UMCC_DLSI_SemSim | run1 | - | 0.760 | - | - | 36 | 35 | 1 |
| UMCC_DLSI_SemSim | run2 | - | 0.698 | - | - | 37 | 37 | 0 |
| UMCC_DLSI_Prob | run1 | - | - | - | 0.023 | 38 | 38 | 0 |


| Team | System | Para-2-Sent | Sent-2-Phr | Phr-2-Word | Word-2-Sense |
| --- | --- | --- | --- | --- | --- |
| Baseline (LCS) | - | 0.527 | 0.562 | 0.165 | 0.109 |
| RTM-DCU | run1-late | 0.845 | 0.750 | 0.305 | - |
| RTM-DCU | run2-late | 0.785 | 0.698 | 0.221 | - |
| RTM-DCU | run3-late | 0.786 | 0.663 | 0.171 | - |
| Meerkat_Mafia | pairingWords (corrected) | 0.794 | 0.704 | 0.457 | 0.389 |


Spearman Correlation Results

| Team | System | Para-2-Sent | Sent-2-Phr | Phr-2-Word | Word-2-Sense | Sum |
| --- | --- | --- | --- | --- | --- | --- |
| SimCompass | run1 | 0.801 | 0.728 | 0.424 | 0.344 | 2.297 |
| ECNU | run1 | 0.821 | 0.757 | 0.306 | 0.263 | 2.147 |
| Duluth | Duluth3 | 0.725 | 0.660 | 0.399 | 0.322 | 2.106 |
| SemantiKLUE | run1 | 0.802 | 0.739 | 0.218 | 0.327 | 2.086 |
| Duluth | Duluth1 | 0.726 | 0.658 | 0.385 | 0.311 | 2.080 |
| UNAL-NLP | run2 | 0.820 | 0.710 | 0.249 | 0.236 | 2.015 |
| UNAL-NLP | run1 | 0.803 | 0.717 | 0.258 | 0.231 | 2.009 |
| UNIBA | run2 | 0.775 | 0.718 | 0.247 | 0.174 | 1.914 |
| BUAP | run2 | 0.804 | 0.709 | 0.169 | 0.196 | 1.878 |
| UNIBA | run1 | 0.762 | 0.713 | 0.221 | 0.162 | 1.858 |
| UNIBA | run3 | 0.762 | 0.713 | 0.221 | 0.162 | 1.858 |
| Meerkat_Mafia | pairingWords | 0.776 | 0.709 | 0.001 | 0.380 | 1.866 |
| BUAP | run1 | 0.804 | 0.709 | 0.175 | 0.164 | 1.852 |
| HULTECH | run2 | 0.688 | 0.633 | 0.260 | 0.124 | 1.704 |
| HULTECH | run3 | 0.688 | 0.633 | 0.259 | 0.124 | 1.704 |
| HULTECH | run1 | 0.666 | 0.633 | 0.260 | 0.126 | 1.685 |
| RTM-DCU | run3 | 0.769 | 0.683 | 0.201 | - | 1.653 |
| RTM-DCU | run1 | 0.778 | 0.669 | 0.166 | - | 1.613 |
| Meerkat_Mafia | SuperSaiyan | 0.817 | 0.760 | - | - | 1.577 |
| Meerkat_Mafia | Hulk2 | 0.799 | 0.726 | - | - | 1.525 |
| Duluth | Duluth2 | 0.553 | 0.473 | 0.235 | 0.225 | 1.486 |
| RTM-DCU | run2 | 0.734 | 0.581 | 0.152 | - | 1.467 |
| FBK-TR | run3 | 0.770 | 0.695 | - | - | 1.465 |
| FBK-TR | run1 | 0.759 | 0.681 | - | - | 1.440 |
| FBK-TR | run2 | 0.775 | 0.642 | - | - | 1.417 |
| AI-KU | run1 | 0.727 | 0.646 | - | - | 1.373 |
| UNAL-NLP | run3 | 0.708 | 0.607 | - | - | 1.315 |
| AI-KU | run2 | 0.700 | 0.612 | - | - | 1.312 |
| TCDSCSS | run2 | 0.642 | 0.599 | - | - | 1.241 |
| TCDSCSS | run1 | 0.623 | 0.563 | - | - | 1.186 |
| JU-Evora | run1 | 0.533 | 0.440 | 0.096 | 0.075 | 1.144 |
| DIT | run1 | 0.777 | - | - | - | 0.777 |
| DIT | run2 | 0.777 | - | - | - | 0.777 |
| SSMT | run1 | 0.777 | - | - | - | 0.777 |
| UMCC_DLSI_SemSim | run1 | - | 0.747 | - | - | 0.747 |
| OPI | run1 | - | 0.424 | 0.188 | 0.131 | 0.743 |
| UMCC_DLSI_SemSim | run2 | - | 0.674 | - | - | 0.674 |
| UMCC_DLSI_Prob | run1 | - | - | - | 0.054 | 0.054 |


| Team | System | Para-2-Sent | Sent-2-Phr | Phr-2-Word | Word-2-Sense | Sum |
| --- | --- | --- | --- | --- | --- | --- |
| Baseline | - | 0.613 | 0.626 | 0.162 | 0.130 | 1.528 |
| RTM-DCU | run1-late | 0.829 | 0.734 | 0.295 | - | - |
| RTM-DCU | run2-late | 0.778 | 0.687 | 0.219 | - | - |
| RTM-DCU | run3-late | 0.778 | 0.667 | 0.166 | - | - |
| Meerkat_Mafia | pairingWords (corrected) | 0.776 | 0.709 | 0.448 | 0.380 | - |


Contact Info

Organizers


email: semeval-2014-clss@googlegroups.com
group: groups.google.com/group/semeval-2014-clss
