Neurological modeling of what experts vs. non-experts find interesting

Smeaton, Alan F. and Wilkins, Peter and Healy, Graham and Ampatzis, Christos and Rusinski, M. and Izzo, Dario (2009) Neurological modeling of what experts vs. non-experts find interesting. In: Neuroscience 2009, 17-21 October 2009, Chicago, USA.

Full text available as:

PDF (abstract)
PDF (poster)


The P3 and related ERPs have a long history of use in identifying stimulus events in subjects as part of oddball-style experiments. In this work we describe the ongoing development of oddball-style experiments that attempt to capture what a subject finds interesting or curious when presented with a set of visual stimuli, i.e., images. This joint work between Dublin City University (DCU) and the European Space Agency's Advanced Concepts Team (ESA ACT) is motivated by the challenges of autonomous space exploration, where the time lag in sending data back to Earth for analysis and then communicating an action or decision back to the spacecraft makes decision-making slow. Furthermore, when extraterrestrial sensors capture data, the determination of which data to send back to Earth is driven by an expertly devised rule set; that is, scientists must determine a priori what will be of interest. Such a rule set cannot adapt to novel or unexpected data that a scientist might find curious. Our work attempts to determine whether it is possible to capture what a scientist (subject) finds of interest (curious) in a stream of image data through EEG measurement. One of our challenges is to determine the difference between an expert's and a lay subject's response to a stimulus. To investigate the theorized difference, we use a set of lifelog images as our dataset. Lifelog images are first-person images taken by a small wearable camera that continuously records images while it is worn. We have devised two key experiments for use with this data and two classes of subjects. Our first class of subject is the person who wore the personal camera from which our collection of lifelog images is taken, and who becomes our expert; the remaining subjects are people who have no association with the captured images. Our first experiment is a traditional oddball experiment in which the oddballs are images of people having coffee, and it can be thought of as a directed information-seeking task.
The second experiment presents a stream of lifelog images to the subjects and records which images cause a stimulus response. Once the data from these experiments has been captured, our task is to compare the responses of the expert and lay subject groups, to determine whether there are any commonalities between the groups or any distinct differences. If the latter is the case, the objective is then to investigate methods for capturing the properties of images that cause an expert to be interested in a presented image. A further novel aspect of our work is the use of entry-level, off-the-shelf EEG devices, consisting of 4 electrodes with a sampling rate of 255 Hz.
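The core of the analysis described above is the standard ERP pipeline: epoch the EEG around each stimulus onset, baseline-correct, average the epochs, and compare mean amplitude in the P3 window (roughly 300–500 ms post-stimulus) between target (oddball) and standard images. The sketch below illustrates this for a single channel at the stated 255 Hz sampling rate; it is a minimal illustration, not the authors' implementation, and the function names, window bounds, and baseline interval are assumptions for the example.

```python
import numpy as np

def erp_average(eeg, onsets, fs=255, pre=0.2, post=0.8):
    """Epoch a single-channel EEG signal around stimulus onsets (sample
    indices), baseline-correct each epoch against its pre-stimulus
    interval, and average to estimate the event-related potential."""
    pre_s, post_s = int(pre * fs), int(post * fs)
    epochs = []
    for t in onsets:
        start, stop = t - pre_s, t + post_s
        if start < 0 or stop > len(eeg):
            continue  # skip epochs that run off the ends of the recording
        ep = eeg[start:stop]
        ep = ep - ep[:pre_s].mean()  # subtract pre-stimulus baseline
        epochs.append(ep)
    return np.mean(epochs, axis=0)

def p3_amplitude(erp, fs=255, pre=0.2, win=(0.3, 0.5)):
    """Mean amplitude in a nominal P3 window, 300-500 ms post-stimulus
    (offsets are relative to epoch start, hence the `pre` shift)."""
    a = int((pre + win[0]) * fs)
    b = int((pre + win[1]) * fs)
    return float(erp[a:b].mean())
```

In an oddball analysis, `erp_average` would be run once over the oddball onsets and once over the standard onsets, and a clearly larger `p3_amplitude` for the oddballs is the signature response the experiments look for; comparing that difference between the expert and lay groups is then a matter of computing it per subject group.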

Item Type: Conference or Workshop Item (Poster)
Event Type: Conference
Subjects: Biological Sciences > Bioinformatics
Biological Sciences > Neuroscience
Engineering > Signal processing
Computer Science > Artificial intelligence
Computer Science > Image processing
DCU Faculties and Centres: Research Initiatives and Centres > CLARITY: The Centre for Sensor Web Technologies
DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing
Official URL:
Copyright Information: Copyright © 2009 the authors
Use License: This item is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License.
Funders: European Space Agency, Science Foundation Ireland
ID Code: 4709
Deposited On: 27 Oct 2009 16:02 by Alan Smeaton. Last Modified 16 Jan 2017 12:35
