
DORAS | DCU Research Repository


Characterizing everyday activities from visual lifelogs based on enhancing concept representation

Wang, Peng, Sun, Lifeng, Yang, Shiqiang, Smeaton, Alan F. (ORCID: 0000-0003-1028-8389) and Gurrin, Cathal (ORCID: 0000-0003-2903-3968) (2016) Characterizing everyday activities from visual lifelogs based on enhancing concept representation. Computer Vision and Image Understanding, 148, pp. 181-192. ISSN 1077-3142

Abstract
The proliferation of wearable visual recording devices such as SenseCam and Google Glass is creating opportunities for automatic analysis and use of digitally-recorded everyday behavior, known as visual lifelogs. Such information can be recorded in order to identify human activities and build applications that support assistive living and enhance the human experience. Although the automatic detection of semantic concepts from images within a single, narrow domain has now reached a usable performance level, in visual lifelogging the imagery captures a wide range of everyday concepts which vary enormously from one subject to another. This challenges the performance of automatic concept detection and the identification of human activities, because visual lifelogs contain such a variety of semantic concepts across individual subjects. In this paper, we characterize the everyday activities and behavior of subjects by applying a hidden conditional random field (HCRF) algorithm to an enhanced representation of the semantic concepts appearing in visual lifelogs. This is carried out by first extracting latent features of concept occurrences using weighted non-negative tensor factorization (WNTF) to exploit temporal patterns of concept occurrence. These results are then input to an HCRF-based model to provide an automatic annotation of activity sequences from a visual lifelog. Experiments demonstrate the efficacy of our algorithm in improving the accuracy of characterizing everyday activities from individual lifelogs. The overall contribution is a demonstration that, using images taken by wearable cameras, we can capture and characterize everyday behavior with a level of accuracy that allows useful applications which measure or change that behavior to be developed.
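As a concrete illustration of the factorization step described in the abstract, the following is a minimal sketch of a weighted non-negative tensor factorization (a CP decomposition with multiplicative updates). It is not the authors' implementation: the tensor layout (events x time slots x concepts), the rank, the uniform weights and all variable names are assumptions made only for illustration. The rows of the event-mode factor play the role of the enhanced latent features that would then be fed to a downstream sequence model such as an HCRF.

```python
import numpy as np

def khatri_rao(P, Q):
    # Column-wise Kronecker product; rows of Q vary fastest.
    R = P.shape[1]
    return np.einsum('ir,jr->ijr', P, Q).reshape(-1, R)

def unfold(T, mode):
    # Mode-`mode` matricization of a 3-way tensor (C-order flattening).
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def wntf(X, W, rank, n_iter=200, eps=1e-9, seed=0):
    """Weighted non-negative CP factorization of a 3-way tensor X
    (e.g. events x time slots x concepts) with element-wise weights W,
    using multiplicative updates so the factors stay non-negative."""
    rng = np.random.default_rng(seed)
    factors = [rng.random((dim, rank)) for dim in X.shape]
    for _ in range(n_iter):
        for mode in range(3):
            M = factors[mode]
            others = [factors[m] for m in range(3) if m != mode]
            KR = khatri_rao(others[0], others[1])       # (prod of other dims, rank)
            Xn = unfold(W * X, mode)                    # weighted data, unfolded
            Wn = unfold(W, mode)
            Xhat = M @ KR.T                             # current reconstruction, unfolded
            M *= (Xn @ KR) / ((Wn * Xhat) @ KR + eps)   # weighted multiplicative update
    return factors

# Hypothetical usage: concept-detector confidences for 50 events, 12 time slots, 85 concepts.
X = np.random.rand(50, 12, 85)
W = np.ones_like(X)                  # down-weight unreliable detections here if desired
A, B, C = wntf(X, W, rank=10)
latent_event_features = A            # one 10-dimensional enhanced representation per event
```

In the paper's pipeline, sequences of such latent features are then labelled with activities by an HCRF; that stage is not sketched here.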
Metadata
Item Type: Article (Published)
Refereed: Yes
Additional Information: alan.smeaton@dcu.ie
Uncontrolled Keywords: Lifelogging; Assistive living; SenseCam; Activity classification; Wearable camera
Subjects: Computer Science > Lifelog
Computer Science > Computer software
DCU Faculties and Centres: Research Institutes and Centres > INSIGHT Centre for Data Analytics
DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing
Publisher: Elsevier
Official URL: http://dx.doi.org/10.1016/j.cviu.2015.09.014
Copyright Information: © 2015 Elsevier Inc. All rights reserved.
Use License: This item is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License.
Funders: National Science Foundation of China; Science Foundation Ireland
ID Code: 21227
Deposited On: 22 Jun 2016 11:56 by Alan Smeaton. Last Modified: 15 Dec 2021 16:13
Documents

Full text available as:

CVIU_Peng.pdf (PDF, 8MB)