Please use this identifier to cite or link to this item:
https://hdl.handle.net/20.500.11851/10903
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Galassi, A. | - |
dc.contributor.author | Ruggeri, F. | - |
dc.contributor.author | Barrón-Cedeño, A. | - |
dc.contributor.author | Alam, F. | - |
dc.contributor.author | Caselli, T. | - |
dc.contributor.author | Kutlu, M. | - |
dc.contributor.author | Antici, F. | - |
dc.date.accessioned | 2023-12-23T06:07:21Z | - |
dc.date.available | 2023-12-23T06:07:21Z | - |
dc.date.issued | 2023 | - |
dc.identifier.issn | 1613-0073 | - |
dc.identifier.uri | https://hdl.handle.net/20.500.11851/10903 | - |
dc.description | 24th Working Notes of the Conference and Labs of the Evaluation Forum, CLEF-WN 2023 -- 18 September 2023 through 21 September 2023 -- 193170 | en_US |
dc.description.abstract | We describe the outcome of the 2023 edition of the CheckThat! Lab at CLEF. We focus on subjectivity (Task 2), which has been proposed for the first time. It aims to foster technology for identifying subjective text fragments in news articles. For that, we produced corpora consisting of 9,530 manually annotated sentences, covering six languages: Arabic, Dutch, English, German, Italian, and Turkish. Task 2 attracted 12 teams, which submitted a total of 40 final runs covering all languages. The most successful approaches addressed the task using state-of-the-art multilingual transformer models, fine-tuned on language-specific data. Teams also experimented with a rich set of other neural architectures, including foundation models, zero-shot classifiers, and standard transformers, mainly coupled with data augmentation and multilingual training strategies to address class imbalance. We publicly release all the datasets and evaluation scripts to promote further research on this topic. © 2023 Copyright for this paper by its authors. | en_US |
dc.description.sponsorship | PE00000013, PNRR-M4C2-Investimento 1.3; Qatar Foundation, QF; Qatar National Research Fund, QNRF; Bundesministerium für Bildung und Forschung, BMBF: 01FP20031J; Türkiye Bilimsel ve Teknolojik Araştırma Kurumu, TÜBİTAK: 120E514; Università di Bologna, UNIBO: 2021-15854, DOT1303118, NPRP 14C-0916-210015, NPRP13S-0206-200281 | en_US |
dc.language.iso | en | en_US |
dc.publisher | CEUR-WS | en_US |
dc.relation.ispartof | CEUR Workshop Proceedings | en_US |
dc.rights | info:eu-repo/semantics/closedAccess | en_US |
dc.subject | Data augmentation | en_US |
dc.subject | Foundation models | en_US |
dc.subject | Multilingual training | en_US |
dc.subject | Neural architectures | en_US |
dc.subject | News articles | en_US |
dc.subject | State of the art | en_US |
dc.subject | Text fragments | en_US |
dc.subject | Training strategy | en_US |
dc.subject | Transformer modeling | en_US |
dc.subject | Turkish | en_US |
dc.title | Overview of the CLEF-2023 CheckThat! Lab: Task 2 on Subjectivity in News Articles. Notebook for the CheckThat! Lab at CLEF 2023 | en_US |
dc.type | Conference Object | en_US |
dc.department | TOBB ETÜ | en_US |
dc.identifier.volume | 3497 | en_US |
dc.identifier.startpage | 236 | en_US |
dc.identifier.endpage | 249 | en_US |
dc.identifier.scopus | 2-s2.0-85175626781 | en_US |
dc.institutionauthor | … | - |
dc.authorscopusid | 57196712506 | - |
dc.authorscopusid | 57215861451 | - |
dc.authorscopusid | 26321398000 | - |
dc.authorscopusid | 56024506200 | - |
dc.authorscopusid | 35932126700 | - |
dc.authorscopusid | 35299304300 | - |
dc.authorscopusid | 57566127900 | - |
dc.relation.publicationcategory | Konferans Öğesi - Uluslararası - Kurum Öğretim Elemanı (Conference Item - International - Institutional Faculty Member) | en_US |
item.openairetype | Conference Object | - |
item.languageiso639-1 | en | - |
item.grantfulltext | none | - |
item.fulltext | No Fulltext | - |
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | - |
item.cerifentitytype | Publications | - |
Appears in Collections: | Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection |
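The abstract above notes that the strongest submissions fine-tuned multilingual transformer models on language-specific data for sentence-level subjectivity classification. The following is a minimal illustrative sketch of that general approach, not the lab's official baseline: the backbone model, file paths, column names, and label scheme are assumptions made for the example.

```python
# Minimal sketch: fine-tuning a multilingual transformer for sentence-level
# subjectivity classification (OBJ vs. SUBJ). Model name, file paths, column
# names, and hyperparameters are illustrative assumptions.
import pandas as pd
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)

MODEL_NAME = "xlm-roberta-base"   # assumed multilingual backbone
LABELS = {"OBJ": 0, "SUBJ": 1}    # assumed label scheme

def load_split(path):
    # Assumes a tab-separated file with 'sentence' and 'label' columns.
    df = pd.read_csv(path, sep="\t")
    df["label"] = df["label"].map(LABELS)
    return Dataset.from_pandas(df[["sentence", "label"]])

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)

def tokenize(batch):
    return tokenizer(batch["sentence"], truncation=True, max_length=128)

train_ds = load_split("train_en.tsv").map(tokenize, batched=True)  # hypothetical paths
dev_ds = load_split("dev_en.tsv").map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

args = TrainingArguments(
    output_dir="subjectivity-model",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    num_train_epochs=3,
)

# Passing the tokenizer lets the Trainer pad variable-length batches dynamically.
trainer = Trainer(model=model, args=args, train_dataset=train_ds,
                  eval_dataset=dev_ds, tokenizer=tokenizer)
trainer.train()
```

A cross-lingual variant of this sketch would simply concatenate the training splits of several languages before fine-tuning, which is one way teams reportedly combined multilingual training with data augmentation to mitigate class imbalance.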