Tasks
A task is an activity proposed to solve a specific NLP problem, usually within the framework of a competition. Below is information about NLP tasks in Spanish from 2013 to the present; each entry lists the NLP topic, the dataset, the forum, the competition, the domain, and the language(s) covered.
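As a minimal sketch of how each catalogue entry is structured, the following Python record mirrors those fields. The TaskEntry class and its field names are illustrative assumptions for this document only, not part of any ODESIA tooling.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch only: this class is an assumption made for this page;
# it simply mirrors the fields shown in each task entry below.
@dataclass
class TaskEntry:
    name: str            # task name, forum and year
    nlp_topic: str       # e.g. "processing humor"
    dataset: str         # e.g. "HUHU 2023"
    forum: str           # e.g. "IberLEF"
    competition: str     # full competition title
    domain: str = ""     # left empty when the catalogue does not specify it
    languages: List[str] = field(default_factory=list)

# Example instance reproducing the first entry in the list below.
huhu_prejudice = TaskEntry(
    name="HUHU - Degree of prejudice prediction IberLEF 2023",
    nlp_topic="processing humor",
    dataset="HUHU 2023",
    forum="IberLEF",
    competition="HUHU: HUrtful HUmour - Detection of humour spreading prejudice on Twitter",
    domain="Social",
    languages=["Spanish"],
)
```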
HUHU - Degree of prejudice prediction IberLEF 2023
- NLP topic: processing humor
- Dataset: HUHU 2023
- Forum: IberLEF
- Competition: HUHU: HUrtful HUmour - Detection of humour spreading prejudice on Twitter
- Domain: Social
- Language(s): Spanish
JOKER: Pun location in Spanish CLEF 2023
- NLP topic: processing humor
- Dataset: JOKER 2023 ES
- Forum: CLEF
- Competition: JOKER: Automatic Wordplay Analysis
- Domain: Social
- Language(s): Spanish
HUHU - Hurtful humour detection IberLEF 2023
- NLP topic: processing humor
- Dataset: HUHU 2023
- Forum: IberLEF
- Competition: HUHU: HUrtful HUmour - Detection of humour spreading prejudice on Twitter
- Domain: Social
- Language(s): Spanish
HUHU - Prejudice Target Detection IberLEF 2023
- NLP topic: processing humor
- Dataset: HUHU 2023
- Forum: IberLEF
- Competition: HUHU: HUrtful HUmour - Detection of humour spreading prejudice on Twitter
- Domain: Social
- Language(s): Spanish
JOKER: Pun detection in Spanish CLEF 2023
- NLP topic: processing humor
- Dataset: JOKER 2023 ES
- Forum: CLEF
- Competition: JOKER: Automatic Wordplay Analysis
- Domain: Social
- Language(s): Spanish
Humor rating IberLEF 2021
- NLP topic: processing humor
- Dataset: HAHA
- Forum: IberLEF
- Competition: Detecting, Rating and Analyzing Humor in Spanish
- Domain:
- Language(s): Spanish
Humor logic mechanism classification IberLEF 2021
- NLP topic: processing humor
- Dataset: HAHA
- Forum: IberLEF
- Competition: Detecting, Rating and Analyzing Humor in Spanish
- Domain:
- Language(s): Spanish
Humor detection IberLEF 2021
- NLP topic: processing humor
- Dataset: HAHA
- Forum: IberLEF
- Competition: Detecting, Rating and Analyzing Humor in Spanish
- Domain:
- Language(s): Spanish
Factuality classification IberLEF 2020
- NLP topic: processing factuality
- Dataset: FACT
- Forum: IberLEF
- Competition: FACT 2020: Factuality Analysis and Classification Task
- Domain:
- Language(s): Spanish
Irony detection IberLEF 2019
- NLP topic: processing humor
- Dataset: IDAT-SP-EU, IDAT-SP-MEX, IDAT-SP-CUBA
- Forum: IberLEF
- Competition: Irony Detection in Spanish Variants
- Domain:
- Language(s): Spanish (Cuba), Spanish (Mexico), Spanish (Spain)
Factuality classification IberLEF 2019
- NLP topic: processing factuality
- Dataset: FACT
- Forum: IberLEF
- Competition: FACT: Factuality Analysis and Classification Task
- Domain:
- Language(s): Spanish
Humor detection IberLEF 2019
- NLP topic: processing humor
- Dataset: HAHA
- Forum: IberLEF
- Competition: HAHA 2019: Humor Analysis based on Human Annotation
- Domain:
- Language(s): Spanish
Humor rating IberLEF 2019
- NLP topic: processing humor
- Dataset: HAHA
- Forum: IberLEF
- Competition: HAHA 2019: Humor Analysis based on Human Annotation
- Domain:
- Language(s): Spanish
Humor detection IberEVAL 2018
- NLP topic: processing humor
- Dataset: HAHA
- Forum: IberEVAL
- Competition: Humor Analysis based on Human Annotation (HAHA)
- Domain:
- Language(s): Spanish
Humor rating IberEVAL 2018
- NLP topic: processing humor
- Dataset: HAHA
- Forum: IberEVAL
- Competition: Humor Analysis based on Human Annotation (HAHA)
- Domain:
- Language(s): Spanish
Person name disambiguation IberEVAL 2017
- NLP topic: information retrieval
- Dataset: M-WeP-NaD-2017
- Forum: IberEVAL
- Competition: Multilingual Web Person Name Disambiguation (M-WePNaD)
- Domain:
- Language(s): Spanish, English
If you have published a result that improves on those listed here, please send a message to odesia-comunicacion@lsi.uned.es indicating the result and the DOI of the article, together with a copy of the article if it is not openly published.