Conference Paper (published)
Details
Citation
Walber T, Scherp A & Staab S (2014) Exploitation of gaze data for photo region labeling in an immersive environment. In: Gurrin C, Hopfgartner F, Hurst W, Johansen H, Lee H & O’Connor N (eds.) MultiMedia Modeling: 20th Anniversary International Conference, MMM 2014, Dublin, Ireland, January 6-10, 2014, Proceedings, Part I. Lecture Notes in Computer Science, 8325. Dublin: Springer, pp. 424-435. https://doi.org/10.1007/978-3-319-04114-8_36
Abstract
Metadata describing the content of photos are of high importance for applications like image search or as part of training sets for object detection algorithms. In this work, we apply tags to image regions for a more detailed description of the photo semantics. This region labeling is performed without additional effort from the user, solely by analyzing eye tracking data recorded while users play a gaze-controlled game. In the game EyeGrab, users classify and rate photos falling down the screen, deciding under time pressure whether each photo belongs to a given category. The game has been evaluated in a study with 54 subjects. The results show that it is possible to assign the given categories to image regions with a precision of up to 61%. This demonstrates that region labeling in an immersive environment like EyeGrab performs almost as well as a previous, much more controlled classification experiment.
Journal
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
| Status | Published |
|---|---|
| Title of series | Lecture Notes in Computer Science |
| Number in series | 8325 |
| Publication date | 31/12/2014 |
| Publisher | Springer |
| Place of publication | Dublin |
| ISSN of series | 1611-3349 |
| ISBN | 9783319041131 |
| Conference | MultiMedia Modeling: 20th Anniversary International Conference (MMM 2014) |
| Dates | 06/01/2014-10/01/2014 |