Non-contact heart rate monitoring using lab color space
Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems. ORCID iD: 0000-0002-1547-4386
Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems. ORCID iD: 0000-0003-3802-4721
Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems. ORCID iD: 0000-0002-1212-7637
2016 (English). In: Studies in Health Technology and Informatics, 2016, Vol. 224, p. 46-53. Conference paper, Published paper (Refereed)
Resource type
Text
Abstract [en]

Research over the last decade has increasingly focused on non-contact systems for monitoring Heart Rate (HR), which are simple, low-cost and comfortable to use. Most non-contact systems use RGB video, which is suitable for laboratory environments but needs to progress considerably before it can be applied in real-life applications. Because luminance (light) contributes significantly to RGB video, HR monitoring based on RGB video is not efficient enough for real-life applications in outdoor environments. This paper presents an HR monitoring method using Lab color facial video captured by a laptop webcam. The Lab color space is device independent, and HR can be extracted from facial skin color variation caused by blood circulation while accounting for variable environmental light. Here, three different signal processing methods, i.e., Fast Fourier Transform (FFT), Independent Component Analysis (ICA) and Principal Component Analysis (PCA), were applied to the color channels of the video recordings, and the blood volume pulse (BVP) was extracted from the facial regions. HR was subsequently quantified and compared with a reference measurement. The results show that high accuracy was achieved compared to the reference measurements. Thus, this technology has significant potential for advancing personal health care, telemedicine and many real-life applications such as driver monitoring.
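
As an illustration of the pipeline described above, the following Python sketch estimates HR from per-frame mean Lab channel values of a facial region using PCA or ICA followed by an FFT peak search in the heart-rate band. It is a minimal sketch, not the authors' implementation; the function name estimate_hr, the 0.7-4 Hz search band and the synthetic input are assumptions for illustration only.

import numpy as np
from numpy.fft import rfft, rfftfreq
from sklearn.decomposition import PCA, FastICA

def estimate_hr(lab_means, fps, method="ica", hr_band=(0.7, 4.0)):
    """Estimate heart rate (bpm) from an (n_frames, 3) array of mean L, a, b values."""
    x = np.asarray(lab_means, dtype=float)
    x = x - x.mean(axis=0)                               # remove the per-channel DC offset
    if method == "ica":
        sources = FastICA(n_components=3, random_state=0).fit_transform(x)
    elif method == "pca":
        sources = PCA(n_components=3).fit_transform(x)
    else:                                                # "fft": use the color channels directly
        sources = x
    freqs = rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= hr_band[0]) & (freqs <= hr_band[1])
    best_bpm, best_power = None, -np.inf
    for s in sources.T:                                  # keep the component whose spectrum
        power = np.abs(rfft(s)) ** 2                     # peaks most strongly in the HR band
        peak = power[band].argmax()
        if power[band][peak] > best_power:
            best_power, best_bpm = power[band][peak], 60.0 * freqs[band][peak]
    return best_bpm

# Synthetic usage: a 72 bpm (1.2 Hz) pulse buried in the a channel at 30 fps.
fps = 30
t = np.arange(0, 30, 1.0 / fps)
lab = 0.1 * np.random.randn(len(t), 3)
lab[:, 1] += 0.5 * np.sin(2 * np.pi * 1.2 * t)
print(round(estimate_hr(lab, fps), 1))                   # prints approximately 72.0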

Place, publisher, year, edition, pages
2016. Vol. 224, p. 46-53
Keywords [en]
Heart rate, Lab color space, Signal processing, blood volume, circulation, driver, face, Fourier transformation, human, independent component analysis, luminance, monitoring, principal component analysis, quantitative study, skin color, telemedicine, videorecording
National Category
Medical Engineering
Identifiers
URN: urn:nbn:se:mdh:diva-32175
DOI: 10.3233/978-1-61499-653-8-46
ISI: 000385238500008
Scopus ID: 2-s2.0-84973483708
ISBN: 9781614996521 (print)
OAI: oai:DiVA.org:mdh-32175
DiVA, id: diva2:941968
Conference
13th International Conference on Wearable Micro and Nano Technologies for Personalised Health, pHealth 2016; Heraklion, Crete; Greece; 29 May 2016 through 31 May 2016; Code 121852
Available from: 2016-06-23. Created: 2016-06-23. Last updated: 2021-02-25. Bibliographically approved.
In thesis
1. Artificial Intelligence for Non-Contact-Based Driver Health Monitoring
2021 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

In clinical situations, a patient's physical state is often monitored by sensors attached to the patient, and medical staff are alerted if the patient's status changes in an undesirable or life-threatening direction. However, in unsupervised situations, such as when driving a vehicle, connecting sensors to the driver is often troublesome, and wired sensors may not deliver sufficient signal quality due to factors such as movement and electrical disturbance. Using a camera as a non-contact sensor to extract physiological parameters from video images offers a new paradigm for monitoring a driver's health and mental state. Thanks to the advanced technical features in modern vehicles, driving is now faster, safer and more comfortable than before. To further enhance transport safety (i.e. to avoid unexpected traffic accidents), it is necessary to consider the vehicle driver as a part of the traffic environment and thus monitor the driver's health and mental state. Such a monitoring system is commonly developed based on two approaches: driving-behaviour-based and physiological-parameters-based.

This research work demonstrates a non-contact approach that classifies a driver's cognitive load based on physiological parameters obtained through a camera system and on vehicular data collected from controller area networks (CAN), using image processing, computer vision, machine learning (ML) and deep learning (DL). In this research, a camera is used as a non-contact, pervasive sensor for measuring and monitoring the physiological parameters. The contribution of this research study is four-fold: 1) a feature extraction approach to extract physiological parameters (i.e. heart rate [HR], respiration rate [RR], inter-beat interval [IBI], heart rate variability [HRV] and oxygen saturation [SpO2]) using a camera system under several challenging conditions (i.e. illumination, motion, vibration and movement); 2) feature extraction based on eye-movement parameters (i.e. saccade and fixation); 3) identification of key vehicular parameters and extraction of useful features from lateral speed (SP), steering wheel angle (SWA), steering wheel reversal rate (SWRR), steering wheel torque (SWT), yaw rate (YR), lanex (LAN) and lateral position (LP); 4) investigation of ML and DL algorithms for driver cognitive load classification. Here, ML algorithms (i.e. logistic regression [LR], linear discriminant analysis [LDA], support vector machine [SVM], neural networks [NN], k-nearest neighbours [k-NN], decision tree [DT]) and DL algorithms (i.e. convolutional neural networks [CNN], long short-term memory [LSTM] networks and autoencoders [AE]) are used.
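
One of the vehicular features listed above, the steering wheel reversal rate (SWRR), can be illustrated with a short Python sketch. This uses a simplified, assumed definition (counting direction reversals of the steering wheel angle larger than a gap threshold, expressed per minute) and is not necessarily the exact formulation used in the thesis; the function name and threshold are placeholders.

import numpy as np

def steering_wheel_reversal_rate(swa_deg, fs_hz, gap_deg=2.0):
    """Reversals of the steering wheel angle larger than gap_deg, per minute."""
    swa = np.asarray(swa_deg, dtype=float)
    reversals = 0
    direction = 0                       # +1 while turning right, -1 while turning left
    last_extreme = swa[0]               # running extreme since the last reversal
    for angle in swa[1:]:
        delta = angle - last_extreme
        if direction >= 0 and delta <= -gap_deg:       # came back left by more than the gap
            if direction == +1:
                reversals += 1
            direction, last_extreme = -1, angle
        elif direction <= 0 and delta >= gap_deg:      # came back right by more than the gap
            if direction == -1:
                reversals += 1
            direction, last_extreme = +1, angle
        elif (direction >= 0 and angle > last_extreme) or (direction <= 0 and angle < last_extreme):
            last_extreme = angle                       # keep tracking the current extreme
    minutes = len(swa) / fs_hz / 60.0
    return reversals / minutes

# Hypothetical usage: 10 s of steering wheel angle data sampled at 10 Hz.
fs = 10
swa = [0, 1, 3, 5, 4, 2, 0, -2, -1, 1, 3, 2, 0, -3, -5, -4, -2, 0, 2, 4] * 5
print(round(steering_wheel_reversal_rate(swa, fs), 1), "reversals per minute")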

One of the major contributions of this research work is that physiological parameters were extracted using a camera. According to the results, camera-based feature extraction of physiological parameters achieved the highest correlation coefficient of .96 for both HR and SpO2 compared to a reference system. The Bland-Altman plots showed 95% agreement between the camera and the reference wired sensors. For IBI, the achieved quality index was 97.5% considering a 100 ms R-peak error. The correlation coefficients for 13 eye-movement features between the non-contact approach and the reference eye-tracking system ranged from .82 to .95.
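
The agreement measures above can be computed with a short sketch that derives the Pearson correlation coefficient and the Bland-Altman bias and 95% limits of agreement between camera-based and reference readings. The variable names and the example values are placeholders, not the thesis data.

import numpy as np

def agreement(camera, reference):
    """Pearson correlation plus Bland-Altman bias and 95% limits of agreement."""
    camera = np.asarray(camera, dtype=float)
    reference = np.asarray(reference, dtype=float)
    r = np.corrcoef(camera, reference)[0, 1]         # Pearson correlation coefficient
    diff = camera - reference
    bias = diff.mean()                               # mean difference (systematic offset)
    half_width = 1.96 * diff.std(ddof=1)             # 95% limits of agreement
    return r, bias, (bias - half_width, bias + half_width)

# Hypothetical example: camera-based HR vs. a wired reference (beats per minute).
camera_hr    = [68, 72, 75, 80, 66, 90, 84, 77]
reference_hr = [67, 73, 74, 81, 65, 91, 83, 78]
r, bias, loa = agreement(camera_hr, reference_hr)
print(f"r = {r:.2f}, bias = {bias:.2f} bpm, LoA = ({loa[0]:.2f}, {loa[1]:.2f}) bpm")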

For cognitive load classification using both the physiological and vehicular parameters, two separate studies were conducted: Study 1 with the 1-back task and Study 2 with the 2-back task. The highest average cognitive load classification accuracy was 94% for Study 1 and 82% for Study 2, achieved with the LR algorithm on the HRV parameter. The highest average classification accuracy using the saccade and fixation parameters was 92%, achieved with SVM. In both cases, k-fold cross-validation with k = 10 was used for validation. The classification accuracies using CNN, LSTM and autoencoder were 91%, 90% and 90.3%, respectively.
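
The validation setup described above (10-fold cross-validation with LR and SVM classifiers) can be sketched as follows. The feature matrix here is random placeholder data standing in for HRV, saccade and fixation features, so the printed accuracies are not the study's results.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))             # placeholder feature matrix (e.g. HRV, saccade, fixation)
y = rng.integers(0, 2, size=200)           # 0 = low cognitive load, 1 = high cognitive load

cv = KFold(n_splits=10, shuffle=True, random_state=0)   # k-fold cross-validation with k = 10
for name, clf in [("LR", LogisticRegression(max_iter=1000)), ("SVM", SVC(kernel="rbf"))]:
    model = make_pipeline(StandardScaler(), clf)        # scale features, then classify
    scores = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
    print(f"{name}: mean accuracy = {scores.mean():.2f}")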

This research study shows that such a non-contact approach using ML, DL, image processing and computer vision is suitable for monitoring a driver's cognitive state.

Place, publisher, year, edition, pages
Västerås: Mälardalen University, 2021
Series
Mälardalen University Press Dissertations, ISSN 1651-4238 ; 330
Keywords
Driver Monitoring, Artificial Intelligence, Machine Learning, Deep Learning
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:mdh:diva-53529 (URN)
978-91-7485-499-2 (ISBN)
Public defence
2021-04-07, Delta + digitally via Zoom, Mälardalens högskola, Västerås, 13:15 (English)
Available from: 2021-02-26. Created: 2021-02-25. Last updated: 2021-04-13. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Rahman, Hamidur; Ahmed, Mobyen Uddin; Begum, Shahina
