DORAS | DCU Research Repository

An interpretable machine vision approach to human activity recognition using photoplethysmograph sensor data

Brophy, Eoin (ORCID: 0000-0002-6486-5746), Wang, Zhengwei (ORCID: 0000-0001-7706-553X), Dominguez Veiga, José Juan (ORCID: 0000-0002-6634-9606), Smeaton, Alan F. (ORCID: 0000-0003-1028-8389) and Ward, Tomás E. (ORCID: 0000-0002-6173-6607) (2018) An interpretable machine vision approach to human activity recognition using photoplethysmograph sensor data. In: Irish Conference on Artificial Intelligence and Cognitive Science (AICS 2018), 6-7 Dec 2018, Dublin, Ireland.

Abstract
The current gold standard for human activity recognition (HAR) is based on the use of cameras. However, the poor scalability of camera systems renders them impractical in pursuit of the goal of wider adoption of HAR in mobile computing contexts. Consequently, researchers instead rely on wearable sensors, and in particular inertial sensors. A particularly prevalent wearable is the smartwatch, which, due to its integrated inertial and optical sensing capabilities, holds great potential for realising better HAR in a non-obtrusive way. This paper seeks to simplify the wearable approach to HAR by determining whether the wrist-mounted optical sensor alone, typically found in a smartwatch or similar device, can be used as a useful source of data for activity recognition. The approach has the potential to eliminate the need for the inertial sensing element, which would in turn reduce the cost and complexity of smartwatches and fitness trackers. This could potentially commoditise the hardware requirements for HAR while retaining the functionality of both heart rate monitoring and activity capture, all from a single optical sensor. Our approach relies on the adoption of machine vision for activity recognition based on suitably scaled plots of the optical signals. We take this approach so as to produce classifications that are easily explainable and interpretable by non-technical users. More specifically, images of photoplethysmography signal time series are used to retrain the penultimate layer of a convolutional neural network which has initially been trained on the ImageNet database. We then use the 2048-dimensional features from the penultimate layer as input to a support vector machine. Results from the experiment yielded an average classification accuracy of 92.3%. This result outperforms that of an optical and inertial sensor combined (78%) and illustrates the capability of HAR systems using standalone optical sensing elements, which also allows for both HAR and heart rate monitoring. Finally, we demonstrate through the use of tools from research in explainable AI how this machine vision approach lends itself to more interpretable machine learning output.
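The pipeline described in the abstract (plots of PPG time series fed through an ImageNet-pretrained CNN, with the 2048-dimensional penultimate-layer features classified by a support vector machine) can be sketched roughly as below. The abstract does not name the framework or network; Inception-v3 via TensorFlow/Keras and scikit-learn's SVC are assumptions here, as are the hypothetical load_ppg_plot_dataset helper, the image file layout, and the SVM kernel choice. This is a minimal illustrative sketch, not the authors' implementation.

# Minimal sketch: CNN feature extraction from PPG plot images + SVM classifier.
# Assumptions: TensorFlow/Keras, scikit-learn, Inception-v3 as the pretrained CNN.
import numpy as np
import tensorflow as tf
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# ImageNet-pretrained Inception-v3; global average pooling yields a
# 2048-dimensional feature vector per image (the "penultimate layer" features).
base = tf.keras.applications.InceptionV3(
    include_top=False, weights="imagenet", pooling="avg")

def extract_features(image_paths):
    """Convert plot images of PPG time series into 2048-d CNN feature vectors."""
    feats = []
    for path in image_paths:
        img = tf.keras.utils.load_img(path, target_size=(299, 299))
        x = tf.keras.utils.img_to_array(img)
        x = tf.keras.applications.inception_v3.preprocess_input(x)
        feats.append(base.predict(x[np.newaxis, ...], verbose=0)[0])
    return np.stack(feats)

# Hypothetical dataset loader returning plot-image paths and activity labels.
# image_paths, labels = load_ppg_plot_dataset()
# X = extract_features(image_paths)
# X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, stratify=labels)
# clf = SVC(kernel="rbf").fit(X_tr, y_tr)          # SVM on the CNN features
# print("held-out accuracy:", clf.score(X_te, y_te))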
Metadata
Item Type: Conference or Workshop Item (Paper)
Event Type: Conference
Refereed: Yes
Uncontrolled Keywords: deep learning; activity recognition; explainable artificial intelligence
Subjects: Computer Science > Artificial intelligence; Computer Science > Machine learning; Computer Science > Visualization
DCU Faculties and Centres: DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing; Research Institutes and Centres > INSIGHT Centre for Data Analytics
Published in: Brennan, Rob, Beel, Joeran, Byrne, Ruth, Debattista, Jeremy and Crotti Junior, Ademar (eds.) Proceedings for the 26th AIAI Irish Conference on Artificial Intelligence and Cognitive Science. 2259. CEUR-WS.
Publisher: CEUR-WS
Official URL: http://ceur-ws.org/Vol-2259/aics_22.pdf
Copyright Information: © 2018 The Authors
Use License: This item is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License.
Funders: Science Foundation Ireland under grant number SFI/12/RC/2289, and SAP SE
ID Code: 22807
Deposited On: 03 Dec 2018 16:26 by Eoin Brophy. Last Modified: 23 Jun 2020 15:35
Documents

Full text available as:

PDF (AICS_2018_EB_paper_60.pdf) - 727kB