Submitting Results
Note: The task has been closed for submissions since January 30, 2017.
As in all SemEval-2017 tasks, system results are submitted using CodaLab.
We have set up competitions on CodaLab for each of the subtasks:
Subtask A, Subtask B, Subtask C, Subtask D, and Subtask E
Here you can submit your results on the development sets (from the 2016 task) and, when they become available, on the 2017 test sets. Submissions on the development sets are for testing purposes only. Before the submission deadline, we encourage you to check that you know how to format and submit the output for every subtask you participate in, and that the scores computed on CodaLab exactly match those you obtain locally with the official task scorer. Note that in order to submit results you will have to sign up on CodaLab and request permission. Please make sure to complete these steps well before the submission deadline.
Each competition has a leaderboard, so you can see how your system's results compare to those of the other teams. During the development phase, you can make as many submissions as you like; only one of them will appear on the leaderboard, and you choose which one (if any). The leaderboard for submissions on the development set is purely informative and has no official value.
On the test set, only three submissions are allowed: one primary and two contrastive. The contrastive runs are optional. The official evaluation and ranking for the task will use the primary submission only; again, it is up to you to choose which one that is. To do so, enter either "primary" or "contrastive 1"/"contrastive 2" in the text box available when uploading your output files. You can update this description text as many times as you want after submitting the file. Please make sure that exactly one run is marked "primary" and the others "contrastive 1" and "contrastive 2". The CodaLab leaderboard on the test set will be used to announce the winners of the competitions on February 6, 2017. Before that date, the leaderboard will be hidden from the participants, and the scores of your test submissions will not be available either. All the information will become available on February 6, 2017, including the results of the contrastive submissions, which will also be reported in the task description paper for completeness.
Note: only valid submissions are counted by CodaLab; submissions that produce errors during scoring do not count against your maximum of three submissions.
If you run into trouble with CodaLab, don't hesitate to get in touch.
The test data for Subtasks A, B, C, and D was released on January 12. The test data for Subtask E will be released later than for the other subtasks, on January 21, due to the specific nature of its data and task. The final day to upload results to CodaLab for any subtask is January 30, 2017. Note that this is a hard deadline.
Final notes:
- Please check the README of the test sets, as it contains relevant information.
- Participants may download the test data at any moment during the evaluation period from the data webpage. Regardless of the time of download, the deadline for submitting results is the same for all subtasks: January 30, 2017.
- Participants are free to take part in a single subtask or in any combination of subtasks.
- After the official results are published on the leaderboard, we will contact the participating teams to collect relevant information about the team, the system approach, the features used, etc. We will use this information in the task description paper.