Call For Participation

SemEval 2015 Task 12 - Aspect Based Sentiment Analysis



This task (ABSA15 for short) is a continuation of SemEval 2014 Task 4 (ABSA14). ABSA15 will focus primarily on the same domains as ABSA14 (restaurants and laptops). However, unlike ABSA14, the input datasets of ABSA15 will contain entire reviews, not isolated (and potentially out-of-context) sentences. In addition, ABSA15 consolidates the four subtasks of ABSA14 within a unified framework. Furthermore, ABSA15 will include an out-of-domain subtask, involving test data from a domain unknown to the participants, other than the domains considered during training. ABSA15 consists of the following subtasks.


Subtask 1: In-domain ABSA

Given a review text about a laptop or a restaurant, identify the following information:


Slot 1: Aspect Category. Identify every entity (E) and attribute (A) pair (E#A) towards which an opinion is expressed in the given text. E and A should be chosen from predefined domain-specific inventories of entity types (e.g. laptop, keyboard, operating system, restaurant, food, drinks) and attribute labels (e.g. performance, design, price, quality). Each E#A pair is considered an aspect category of the given text. The inventories of entity types and attribute labels are described in the annotation guidelines. Some examples highlighting the required information follow:

a. It is extremely portable and easily connects to WIFI at the library and elsewhere. → {LAPTOP#PORTABILITY}, {LAPTOP#CONNECTIVITY}
b. The exotic food is beautifully presented and is a delight in delicious combinations. → {FOOD#STYLE_OPTIONS}, {FOOD#QUALITY}
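Since each aspect category is a single E#A label, its two parts are easy to recover programmatically. The following sketch (an illustration, not part of the official evaluation tools) splits a label from the examples above into its entity and attribute:

```python
def split_category(label):
    """Split an E#A aspect category label such as 'LAPTOP#PORTABILITY'
    into its entity (E) and attribute (A) parts."""
    entity, attribute = label.split("#")
    return entity, attribute

print(split_category("FOOD#STYLE_OPTIONS"))  # → ('FOOD', 'STYLE_OPTIONS')
```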


Slot 2 (Only for the restaurants domain): Opinion Target Expression (OTE). An opinion target expression (OTE) is an expression used in the given text to refer to the reviewed entity E of a pair E#A. The OTE is defined by its starting and ending offsets in the given text. The OTE slot takes the value “NULL”, when there is no (explicit) mention of the entity E. Below are some examples:

a. Great for a romantic evening, but over-priced. → {AMBIENCE#GENERAL, “NULL”}, {RESTAURANT#PRICES, “NULL”}
b. The fajitas were delicious, but expensive. → {FOOD#QUALITY, “fajitas”}, {FOOD#PRICES, “fajitas”}
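The starting and ending offsets that define an OTE are simply character positions in the review text. As a hedged sketch (using example (b) above; the variable names are illustrative, not the official submission format), the offsets of an explicit OTE could be derived as:

```python
# Example (b) from above: the OTE "fajitas" appears explicitly in the text.
text = "The fajitas were delicious, but expensive."
ote = "fajitas"

start = text.find(ote)   # character offset where the OTE begins
end = start + len(ote)   # offset just past the OTE's last character

print(start, end)        # → 4 11
assert text[start:end] == ote
```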


Slot 3: Sentiment Polarity. Each identified E#A pair of the given text has to be assigned a polarity (positive, negative, or neutral). The neutral label applies to mildly positive or mildly negative sentiment, as in the second example below. 

a. The applications are also very easy to find and maneuver.  → {SOFTWARE#USABILITY, positive}
b. The fajitas are nothing out of the ordinary. → {FOOD#GENERAL, “fajitas”, neutral}
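Taken together, Slots 1-3 yield one record per expressed opinion. A minimal sketch of such a record, assuming illustrative field names (not the official data format, which is described on the task website):

```python
from dataclasses import dataclass

@dataclass
class Opinion:
    category: str   # Slot 1: E#A pair, e.g. "FOOD#QUALITY"
    target: str     # Slot 2: OTE, or "NULL" when the entity is implicit
    polarity: str   # Slot 3: "positive", "negative", or "neutral"

# Example (b) from above as a complete annotation:
op = Opinion("FOOD#GENERAL", "fajitas", "neutral")
print(op.category, op.target, op.polarity)  # → FOOD#GENERAL fajitas neutral
```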


Subtask 2: Out-of-domain ABSA

The participating teams will be asked to test their systems in a previously unseen domain for which no training data will be made available. The gold annotations for Slot 1 will be provided, and the teams will be required to return annotations for Slot 3 (sentiment polarity) only.

Two datasets of ~550 reviews of laptops and restaurants annotated as above are already available for training. Additional datasets will be provided to evaluate the participating systems in Subtask 1 (in-domain ABSA). Information about the domain adaptation dataset of Subtask 2 (out-of-domain ABSA) will be provided later.

Evaluation period: December 15-22, 2014 [the exact start and end dates will be announced shortly]
Paper submission due: January 30, 2015
Paper reviews due: February 28, 2015
Camera ready due: March 30, 2015
SemEval workshop: June 4-5, 2015 (co-located with NAACL-2015 in Denver, Colorado)


The Semeval-2015 Task 12 website includes further details on the training data, evaluation, and examples of expected system outputs:

Registration at:
Join our mailing list:


Ion Androutsopoulos (Athens University of Economics and Business, Greece)
Dimitris Galanis (“Athena” Research Center, Greece)
Suresh Manandhar (University of York, UK) [Primary Contact]
Harris Papageorgiou ("Athena" Research Center, Greece)
John Pavlopoulos (Athens University of Economics and Business, Greece)
Maria Pontiki (“Athena” Research Center, Greece)

Other Info


  • November 3, 2014: Train and Trial data updated!
  • October 31, 2014: Annotation Guidelines Released!
  • October 31, 2014: Task description updated!
  • August 1, 2014: Task description updated!
  • August 1, 2014: Trial data updated!
  • August 1, 2014: Restaurants training data released!
  • Trial data released!
