Smartphones are becoming increasingly powerful and act as computers in our pockets, with always-on network access and a suite of sensors that help them interact with the environment and the user. Because users tend to carry their mobile phone with them everywhere, the modern smartphone appears to be a key stepping stone towards the ‘Total Recall’ vision of Bell & Gemmell.
In this work, we present real-time lifelogging software that can work independently or in conjunction with a SenseCam. The software, called Life-lens, runs on Android smartphones and employs energy-conservation techniques to support day-long sensor capture and build a semantically rich life narrative. All available sensors on the phone (including the camera, accelerometer, GPS, and Bluetooth) are used to capture the current user context. The main contribution of Life-lens over previously existing solutions is its real-time nature. Data gathered by the Life-lens software is analysed on the phone and uploaded dynamically as a life-stream to a central server, where additional semantic analysis refines the event segmentation and performs more processor-intensive operations such as face detection. The server implements a real-time web-based interface that automatically annotates the life-stream to generate a diary-style narrative of life activity. In this work, we present Life-lens both on the mobile device and on the web interface.
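The client-side pattern described above, buffering sensor readings on the phone and flushing them to the server in batches, can be sketched as follows. This is a minimal illustration, not the Life-lens implementation: the class and method names (`LifeStreamBuffer`, `record`, `flush`) are hypothetical, and the actual Android sensor APIs and HTTP upload are replaced by stand-ins so the sketch is self-contained.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of batched life-stream upload: readings accumulate
// in memory and are flushed as one chunk once a threshold is reached,
// trading upload frequency against battery cost.
public class LifeStreamBuffer {
    private final List<String> buffer = new ArrayList<>();
    private final List<List<String>> uploadedChunks = new ArrayList<>();
    private final int flushThreshold;

    public LifeStreamBuffer(int flushThreshold) {
        this.flushThreshold = flushThreshold;
    }

    // Record one sensor reading; flush when the buffer fills.
    public void record(String reading) {
        buffer.add(reading);
        if (buffer.size() >= flushThreshold) {
            flush();
        }
    }

    // Stand-in for the HTTP upload of one life-stream chunk to the server.
    private void flush() {
        uploadedChunks.add(new ArrayList<>(buffer));
        buffer.clear();
    }

    public int uploadedChunkCount() { return uploadedChunks.size(); }

    public int pendingCount() { return buffer.size(); }

    public static void main(String[] args) {
        LifeStreamBuffer stream = new LifeStreamBuffer(3);
        for (int i = 0; i < 7; i++) {
            stream.record("accel:" + i);
        }
        // 7 readings at threshold 3 -> two uploaded chunks, one reading pending
        System.out.println(stream.uploadedChunkCount());
        System.out.println(stream.pendingCount());
    }
}
```

In a real deployment the flush condition would also consider battery level and network availability rather than buffer size alone, which is where the energy-conservation behaviour comes in.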