Lifelogging – particularly image capture – can generate vast amounts of image data of complex human activities and events, which can be difficult to automatically sort and navigate. In this work we demonstrate how neural signals from EEG (electroencephalography) can be used to help sort and navigate these datasets at high speed. By using EEG we can detect a variety of attention-related neural responses to viewing lifelog images, which in turn allows us to sort them from the subjective perspective of which images caught the person's attention most significantly.