
DORAS | DCU Research Repository



Eye Movement Classification Using Neuromorphic Vision Sensors

Iddrisu, Khadija (ORCID: 0009-0004-0008-5697), Shariff, Waseem (ORCID: 0000-0001-7298-9389), Stec, Maciej, O'Connor, Noel E. (ORCID: 0000-0002-4033-9135) and Little, Suzanne (ORCID: 0000-0003-3281-3471) (2026) Eye Movement Classification Using Neuromorphic Vision Sensors. Journal of Eye Movement Research, 19 (17). ISSN 1995-8692

Abstract
Eye movement classification, particularly the identification of fixations and saccades, plays a vital role in advancing our understanding of neurological functions and cognitive processing. Conventional data modalities, such as RGB webcams, often suffer from motion blur, latency and susceptibility to noise. Neuromorphic Vision Sensors, also known as event cameras (ECs), capture pixel-level changes asynchronously and at high temporal resolution, making them well suited to detecting the swift transitions inherent to eye movements. However, the resulting data are sparse, which makes them less well suited to conventional algorithms. Spiking Neural Networks (SNNs) are gaining attention because their discrete spatio-temporal spike mechanism is ideally suited to sparse data; these networks offer a biologically inspired computational paradigm capable of modeling the temporal dynamics captured by event cameras. This study validates the use of SNNs with event cameras for efficient eye movement classification. We manually annotated the EV-Eye dataset, the largest publicly available event-based eye-tracking benchmark, into sequences of saccades and fixations, and we propose a convolutional SNN architecture operating directly on spike streams. Our model achieves an accuracy of 94% and a precision of 0.92 across annotated data from 10 users. As the first work to apply SNNs to eye movement classification using event data, we benchmark our approach against spiking baselines such as SpikingVGG and SpikingDenseNet, and additionally provide a detailed computational complexity comparison between SNN and ANN counterparts. Our results highlight the efficiency and robustness of SNNs for event-based vision tasks, with over one order of magnitude improvement in computational efficiency, with implications for fast and low-power neurocognitive diagnostic systems.
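To make the spiking mechanism mentioned in the abstract concrete, here is a minimal NumPy sketch of a leaky integrate-and-fire (LIF) neuron layer processing a binary spike train. This is an illustration of the general SNN principle only, not the paper's convolutional architecture; the decay factor `beta` and the threshold are hypothetical values chosen for the example.

```python
import numpy as np

def lif_layer(spike_train, beta=0.9, threshold=1.0):
    """Leaky integrate-and-fire neurons over a spike train.

    spike_train: array of shape (T, N) -- T time steps, N channels.
    beta (membrane decay) and threshold are illustrative values.
    Returns an output spike array of the same shape.
    """
    T, N = spike_train.shape
    v = np.zeros(N)                    # membrane potentials
    out = np.zeros((T, N))
    for t in range(T):
        v = beta * v + spike_train[t]  # leaky integration of inputs
        fired = v >= threshold
        out[t, fired] = 1.0            # emit a spike where threshold crossed
        v[fired] = 0.0                 # hard reset after firing
    return out

# A constant sub-threshold input of 0.6 per step accumulates and
# crosses the threshold every second step, producing 5 output spikes.
spikes_in = np.full((10, 1), 0.6)
print(lif_layer(spikes_in).sum())  # 5.0
```

Because the neuron only computes when spikes arrive and communicates in binary events, this kind of unit maps naturally onto the sparse, asynchronous output of an event camera, which is the efficiency argument the abstract makes.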
Metadata
Item Type: Article (Published)
Refereed: Yes
Uncontrolled Keywords: Eye movements; event cameras; spiking neural networks
Subjects: Biological Sciences > Neuroscience
Humanities > Biological Sciences > Neuroscience
Computer Science > Artificial intelligence
Computer Science > Machine learning
DCU Faculties and Centres: DCU Faculties and Schools > Faculty of Engineering and Computing
DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing
Publisher: Bern Open Publishing
Official URL: https://www.mdpi.com/1995-8692/19/1/17
Copyright Information: Authors
Funders: Insight Research Ireland Centre for Data Analytics
ID Code: 32270
Deposited On: 23 Feb 2026 11:05 by Khadija Iddrisu. Last Modified: 23 Feb 2026 11:05
Documents

Full text available as:

PDF - Requires a PDF viewer such as GSview, Xpdf or Adobe Acrobat Reader
Creative Commons: Attribution 4.0
4MB
