TY - JOUR
T1 - Contextualizing remote fall risk
T2 - Video data capture and implementing ethical AI
AU - Moore, Jason
AU - McMeekin, Peter
AU - Parkes, Thomas
AU - Walker, Richard
AU - Morris, Rosie
AU - Stuart, Samuel
AU - Hetherington, Victoria
AU - Godfrey, Alan
N1 - Funding information: This research is co-funded by a grant from the National Institute for Health Research (NIHR) Applied Research Collaboration (ARC) North East and North Cumbria (NENC). This research is also co-funded by the Faculty of Engineering and Environment at Northumbria University.
PY - 2024/3/6
Y1 - 2024/3/6
N2 - Wearable inertial measurement units (IMUs) are being used to quantify gait characteristics associated with increased fall risk, but a current limitation is the lack of contextual information that would clarify IMU data. Use of wearable video-based cameras would provide a comprehensive understanding of an individual’s habitual fall risk, adding context to clarify abnormal IMU data. Generally, there is a taboo around suggesting the use of wearable cameras to capture real-world video, with clinical and patient apprehension due to ethical and privacy concerns. This perspective proposes that routine use of wearable cameras could be realized within digital medicine through AI-based computer vision models that obfuscate (blur) sensitive information while preserving helpful contextual information for a comprehensive patient assessment. Specifically, no person sees the raw video data to understand context; rather, AI interprets the raw video data first, blurring sensitive objects to uphold privacy. That may be achieved more routinely than one imagines, as contemporary resources already exist. Here, to showcase the potential, an exemplar model is suggested via off-the-shelf methods to detect and blur sensitive objects (e.g., people) with an accuracy of 88%. The benefit of the proposed approach is a more comprehensive understanding of an individual’s free-living fall risk (from free-living IMU-based gait) without compromising privacy. More generally, the video and AI approach could be used beyond fall risk to better inform habitual experiences and challenges across a range of clinical cohorts. Medicine is becoming more receptive to wearables as a helpful toolbox; camera-based devices should be plausible instruments within it.
AB - Wearable inertial measurement units (IMUs) are being used to quantify gait characteristics associated with increased fall risk, but a current limitation is the lack of contextual information that would clarify IMU data. Use of wearable video-based cameras would provide a comprehensive understanding of an individual’s habitual fall risk, adding context to clarify abnormal IMU data. Generally, there is a taboo around suggesting the use of wearable cameras to capture real-world video, with clinical and patient apprehension due to ethical and privacy concerns. This perspective proposes that routine use of wearable cameras could be realized within digital medicine through AI-based computer vision models that obfuscate (blur) sensitive information while preserving helpful contextual information for a comprehensive patient assessment. Specifically, no person sees the raw video data to understand context; rather, AI interprets the raw video data first, blurring sensitive objects to uphold privacy. That may be achieved more routinely than one imagines, as contemporary resources already exist. Here, to showcase the potential, an exemplar model is suggested via off-the-shelf methods to detect and blur sensitive objects (e.g., people) with an accuracy of 88%. The benefit of the proposed approach is a more comprehensive understanding of an individual’s free-living fall risk (from free-living IMU-based gait) without compromising privacy. More generally, the video and AI approach could be used beyond fall risk to better inform habitual experiences and challenges across a range of clinical cohorts. Medicine is becoming more receptive to wearables as a helpful toolbox; camera-based devices should be plausible instruments within it.
KW - Privacy
KW - fall risk
KW - gait analysis
KW - body-worn cameras
KW - deep learning
UR - http://www.scopus.com/inward/record.url?scp=85187128684&partnerID=8YFLogxK
U2 - 10.1038/s41746-024-01050-7
DO - 10.1038/s41746-024-01050-7
M3 - Article
C2 - 38448611
SN - 2398-6352
VL - 7
JO - npj Digital Medicine
JF - npj Digital Medicine
IS - 1
M1 - 61
ER -