Participate!

We encourage a variety of systems in our task; submissions may be either General Purpose or Level-Specific.

  • General Purpose systems should report similarity scores for comparisons across all lexical levels.
  • Level-Specific systems should report similarity scores for only a single lexical level.

If your team would like to submit to some levels (but not all), please submit a separate system for each level.
To participate in our SemEval task, you simply need to:

  1. Download the trial data and start building your cross-level semantic similarity system.
  2. Subscribe to our Google group to facilitate discussion and information exchange about this task. We encourage teams to email questions directly to the group rather than to the organizers.
  3. Starting in mid-December 2013, download the training data and begin training (or tuning) your system on it.
  4. Starting in March 2014, register your team for the task and download the test data. You will then need to submit your similarity scores to the task's FTP site. (FTP details will be provided in the email sent after your team registers.)
  5. Once the test period has ended, we will report scores for all teams and systems. Teams will then need to submit a paper describing their system and results.

As the training and test periods are finalized, we will release more details on how to submit your system and paper.

Contact Info

Organizers

Email: semeval-2014-clss@googlegroups.com
Group: groups.google.com/group/semeval-2014-clss

Other Info

Announcements