University of Limerick Institutional Repository

Variational autoencoder for image-based augmentation of eye-tracking data

dc.contributor.author Elbattah, Mahmoud
dc.contributor.author Loughnane, Colm
dc.contributor.author Guérin, Jean-Luc
dc.contributor.author Carette, Romuald
dc.contributor.author Cilia, Federica
dc.contributor.author Dequen, Gilles
dc.date.accessioned 2021-05-11T08:14:01Z
dc.date.available 2021-05-11T08:14:01Z
dc.date.issued 2021
dc.description peer-reviewed en_US
dc.description.abstract Over the past decade, deep learning has achieved unprecedented success across a diversity of application domains, given large-scale datasets. However, particular domains, such as healthcare, inherently suffer from data paucity and imbalance. Moreover, datasets can be largely inaccessible due to privacy concerns or a lack of data-sharing incentives. Such challenges have lent significance to the application of generative modeling and data augmentation in that domain. In this context, this study explores a machine learning-based approach for generating synthetic eye-tracking data. We explore a novel application of variational autoencoders (VAEs) in this regard. More specifically, a VAE model is trained to generate an image-based representation of the eye-tracking output, so-called scanpaths. Overall, our results validate that the VAE model could generate plausible output from a limited dataset. Finally, it is empirically demonstrated that such an approach can be employed as a mechanism for data augmentation to improve performance in classification tasks. en_US
dc.language.iso eng en_US
dc.publisher MDPI en_US
dc.relation.ispartofseries Journal of Imaging;7, 83
dc.subject deep learning en_US
dc.subject variational autoencoder en_US
dc.title Variational autoencoder for image-based augmentation of eye-tracking data en_US
dc.type info:eu-repo/semantics/article en_US
dc.type.supercollection all_ul_research en_US
dc.type.supercollection ul_published_reviewed en_US
dc.identifier.doi 10.3390/jimaging7050083
dc.contributor.sponsor Université de Picardie Jules Verne, France en_US
dc.rights.accessrights info:eu-repo/semantics/openAccess en_US
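The abstract above describes training a VAE on image-based scanpath representations and then sampling from it to augment a small dataset. As a minimal illustration of that mechanism (not the paper's actual architecture, which is not given in this record), the sketch below uses hypothetical linear encoder/decoder weights and toy 8x8 "images" in NumPy to show the two ingredients every VAE shares: the reparameterization trick and the reconstruction-plus-KL loss. New synthetic samples for augmentation are obtained by decoding draws from the standard normal prior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions: flattened 8x8 scanpath images, 2-D latent space.
input_dim, latent_dim = 64, 2

def encode(x, W_mu, W_logvar):
    """Linear encoder producing the mean and log-variance of q(z|x)."""
    return x @ W_mu, x @ W_logvar

def reparameterize(mu, logvar, rng):
    """Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I)."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def decode(z, W_dec):
    """Linear decoder with a sigmoid, mapping z to pixel values in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-(z @ W_dec)))

def vae_loss(x, x_hat, mu, logvar):
    """Binary cross-entropy reconstruction term plus KL(q(z|x) || N(0, I))."""
    bce = -np.sum(x * np.log(x_hat + 1e-9) + (1 - x) * np.log(1 - x_hat + 1e-9))
    kl = -0.5 * np.sum(1.0 + logvar - mu**2 - np.exp(logvar))
    return bce + kl

# One forward pass on a random batch of four "images".
x = rng.random((4, input_dim))
W_mu = rng.standard_normal((input_dim, latent_dim)) * 0.01
W_logvar = rng.standard_normal((input_dim, latent_dim)) * 0.01
W_dec = rng.standard_normal((latent_dim, input_dim)) * 0.01

mu, logvar = encode(x, W_mu, W_logvar)
z = reparameterize(mu, logvar, rng)
x_hat = decode(z, W_dec)
loss = vae_loss(x, x_hat, mu, logvar)

# Data augmentation step: decode fresh draws from the prior N(0, I)
# to obtain ten synthetic images that can be added to the training set.
synthetic = decode(rng.standard_normal((10, latent_dim)), W_dec)
```

In a real pipeline the linear maps would be replaced by trained (typically convolutional) networks and the loss minimized by gradient descent; only the sampling-from-the-prior step used for augmentation carries over unchanged.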
