International Workshop on Semantic Evaluation 2019

SemEval has evolved from the SensEval word sense disambiguation evaluation series. The SemEval Wikipedia entry and the ACL SemEval Wiki provide a more detailed historical overview. SemEval-2019 will be the 13th workshop on semantic evaluation. Location details TBA.


Important Dates


Task Proposals:

  • 26 Mar 2018: Task proposals due
  • 04 May 2018: Task proposal notifications

Setup for the Competition:

20 Aug 2018: CodaLab competition website ready and made public, including a basic task description and mailing-group information. Trial data ready. Evaluation script ready for participants to download and run on the trial data.
17 Sep 2018: Training data ready. Development data ready. CodaLab competition website updated to include an evaluation script uploaded as part of the competition, so that when participants upload submissions on the development set, the script immediately checks the submission format and computes results on the development set. This is also the date by which a benchmark system should be made available to participants, and by which the organizers should run the benchmark submission on CodaLab so that participants can see its results on the leaderboard.

Competition and Beyond:


10 Jan 2019: Evaluation start*
31 Jan 2019: Evaluation end*
05 Feb 2019: Results posted
28 Feb 2019: System and Task description paper submissions due by 23:59 GMT -12:00
14 Mar 2019: Paper reviews due (for both systems and tasks)
06 Apr 2019: Author notifications
20 Apr 2019: Camera ready submissions due
Summer 2019: SemEval 2019

* 10 Jan to 31 Jan 2019 is the window within which task organizers must schedule the evaluation periods for their individual tasks. Evaluation periods for individual tasks are usually 7 to 14 days, but there is no hard and fast rule. Contact the organizers of the tasks you are interested in for the exact time frame in which they will conduct their evaluations; they should tell you the date on which they will release the test data and the date by which participant submissions must be uploaded. Note that some tasks may involve more than one sub-task, each with a separate evaluation time frame.

Discussion Group


Please join our discussion group at semeval3@googlegroups.com to receive announcements and participate in discussions. For details, questions, and discussion on a particular task, visit the task website, where you will also find a link to the task mailing list.


Anti-Harassment Policy


SemEval highly values the open exchange of ideas, the freedom of thought and expression, and respectful scientific debate. We support and uphold the NAACL Anti-Harassment Policy. Participants are encouraged to send any concerns or questions to the NAACL Board members, to Priscilla Rasmussen, and/or to the workshop organizers.

Contact Info

Organizers

  • Jonathan May, ISI, University of Southern California
  • Ekaterina Shutova, University of Cambridge
  • Aurelie Herbelot, University of Trento
  • Xiaodan Zhu, Queen's University
  • Marianna Apidianaki, LIMSI, CNRS, Université Paris-Saclay & University of Pennsylvania
  • Saif M. Mohammad, National Research Council Canada

Email

semeval-organizers@googlegroups.com. Note that this is the mailing list for the SemEval organizers. For questions about a particular task, post to the *task* mailing list or contact the task organizers directly. You can find a link to the task mailing list on the task webpage.

Other Info

Announcements

  • 2018/9/4: Microsoft is co-sponsoring SemEval!