Recent proof-of-concept research has highlighted the applicability of Brain-Computer Interface (BCI) technology to harness a subject's visual system to classify images. This technique involves classifying a user's EEG (electroencephalography) signals as they view images presented on a screen. The premise is that images (targets) that capture a subject's attention generate distinct brain responses, and that these responses can then be used to label the images. Research in this domain has so far focused on the tasks and paradigms that can elicit these neurologically informative signals from images, and on the correlates of human perception that modulate them. While success has been shown in detecting these responses in high-speed presentation paradigms, it remains an open question which search tasks can ultimately benefit from an EEG-based BCI system.
In this thesis we explore: (1) the neural signals present during visual search tasks that require eye movements, and how they inform the possibilities for BCI applications combining eye tracking and EEG; (2) how the temporal characteristics of eye movements indicate a search task's suitability for augmentation by an EEG-based BCI system; and (3) the characteristics of a number of paradigms that can be used to elicit informative neural responses to drive image-search BCI applications.
In this thesis we demonstrate that EEG signals can be used in a discriminative manner to label images. In addition, we find that in certain instances, signals derived from sources such as eye movements can yield significantly more discriminative information.