The Role of Data Curation in Image Captioning
Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review
Documents
- Fulltext
Final published version, 7.71 MB, PDF document
Image captioning models are typically trained by treating all samples equally, neglecting to account for mismatched or otherwise difficult data points. In contrast, recent work has shown the effectiveness of training models by scheduling the data using curriculum learning strategies. This paper contributes to this direction by actively curating difficult samples in datasets without increasing the total number of samples. We explore the effect of using three data curation methods within the training process: complete removal of a sample, caption replacement, or image replacement via a text-to-image generation model. Experiments on the Flickr30K and COCO datasets with the BLIP and BEiT-3 models demonstrate that these curation methods do indeed yield improved image captioning models, underscoring their efficacy.
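The three curation actions named in the abstract can be sketched as a simple filtering pass over (image, caption) pairs. This is an illustrative sketch only, assuming a user-supplied difficulty score, threshold, captioning model, and text-to-image generator; none of these names reflect the paper's actual implementation.

```python
def curate(dataset, difficulty, threshold, action,
           generate_image=None, caption_model=None):
    """Return a curated copy of `dataset` (a list of (image, caption) pairs).

    Easy samples (difficulty <= threshold) are kept unchanged; difficult
    samples are handled by one of the three curation actions from the
    abstract. The scoring function and threshold are assumptions.
    """
    curated = []
    for image, caption in dataset:
        if difficulty(image, caption) <= threshold:
            curated.append((image, caption))        # easy sample: keep as-is
        elif action == "remove":
            continue                                # drop the difficult pair entirely
        elif action == "replace_caption":
            # re-caption the image with a (hypothetical) captioning model
            curated.append((image, caption_model(image)))
        elif action == "replace_image":
            # regenerate the image from the caption via text-to-image
            curated.append((generate_image(caption), caption))
    return curated
```

Note that all three actions keep the dataset size at or below the original, matching the abstract's constraint of not increasing the total number of samples.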
Original language | English |
---|---|
Title of host publication | EACL 2024 - 18th Conference of the European Chapter of the Association for Computational Linguistics, Proceedings of the Conference |
Editors | Yvette Graham, Matthew Purver |
Number of pages | 15 |
Publisher | Association for Computational Linguistics (ACL) |
Publication date | 2024 |
Pages | 1074-1088 |
ISBN (Electronic) | 9798891760882 |
Publication status | Published - 2024 |
Event | 18th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2024 - St. Julian's, Malta Duration: 17 Mar 2024 → 22 Mar 2024 |
Conference
Conference | 18th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2024 |
---|---|
Country | Malta |
City | St. Julian's |
Period | 17/03/2024 → 22/03/2024 |
Sponsor | Adobe, Babelscape, Bloomberg Engineering, Megagon Labs, Snowflake |
Bibliographical note
Publisher Copyright:
© 2024 Association for Computational Linguistics.
Links
- https://aclanthology.org/2024.eacl-long.65/
Final published version
ID: 392216501