TY - GEN
T1 - Deep learning semantic segmentation for indoor terrain extraction
T2 - 2022 IEEE-EMBS International Conference on Wearable and Implantable Body Sensor Networks, BSN 2022
AU - Moore, Jason
AU - Stuart, Samuel
AU - Walker, Richard
AU - McMeekin, Peter
AU - Young, Fraser
AU - Godfrey, Alan
N1 - Funding information: This research is co-funded by a grant from the National Institute for Health Research (NIHR) Applied Research Collaboration (ARC) North East and North Cumbria (NENC). It is also co-funded by the Faculty of Engineering and Environment at Northumbria University.
PY - 2022/9/27
Y1 - 2022/9/27
N2 - Contemporary approaches to gait assessment use wearables within free-living environments to capture habitual information, which is more informative compared to data captured in the lab. Wearables range from inertial to camera-based technologies, but pragmatic challenges exist, such as the analysis of big data from heterogeneous environments. For example, wearable camera data often requires manual, time-consuming and subjective contextualization, such as labelling of terrain type. There is a need for automated approaches such as those offered by artificial intelligence (AI) based methods. This pilot study investigates multiple segmentation models and proposes use of the PSPNet deep learning network to automate a binary indoor floor segmentation mask for use with wearable camera-based data (i.e., video frames). To inform the development of the AI method, a unique approach of mining heterogeneous data from a video sharing platform (YouTube) was adopted to provide independent training data. The dataset contains 1973 image frames and accompanying segmentation masks. When trained on the dataset, the proposed model achieved an Intersection over Union score of 0.73 over 25 epochs in complex environments. The proposed method will inform future work within the field of habitual free-living gait assessment by providing automated contextual information when used in conjunction with wearable inertial-derived gait characteristics. Clinical Relevance—Processes developed here will aid automated video-based free-living gait assessment.
AB - Contemporary approaches to gait assessment use wearables within free-living environments to capture habitual information, which is more informative compared to data captured in the lab. Wearables range from inertial to camera-based technologies, but pragmatic challenges exist, such as the analysis of big data from heterogeneous environments. For example, wearable camera data often requires manual, time-consuming and subjective contextualization, such as labelling of terrain type. There is a need for automated approaches such as those offered by artificial intelligence (AI) based methods. This pilot study investigates multiple segmentation models and proposes use of the PSPNet deep learning network to automate a binary indoor floor segmentation mask for use with wearable camera-based data (i.e., video frames). To inform the development of the AI method, a unique approach of mining heterogeneous data from a video sharing platform (YouTube) was adopted to provide independent training data. The dataset contains 1973 image frames and accompanying segmentation masks. When trained on the dataset, the proposed model achieved an Intersection over Union score of 0.73 over 25 epochs in complex environments. The proposed method will inform future work within the field of habitual free-living gait assessment by providing automated contextual information when used in conjunction with wearable inertial-derived gait characteristics. Clinical Relevance—Processes developed here will aid automated video-based free-living gait assessment.
KW - Deep Learning
KW - Gait Analysis
KW - Floor Segmentation
UR - https://bhi-bsn-2022.org/
UR - http://www.scopus.com/inward/record.url?scp=85142290005&partnerID=8YFLogxK
UR - https://www.mendeley.com/catalogue/28c322e2-1928-3cb1-ab12-2ecc032bc02f/
U2 - 10.1109/BSN56160.2022.9928505
DO - 10.1109/BSN56160.2022.9928505
M3 - Conference contribution
AN - SCOPUS:85142290005
SN - 9781665459266
T3 - IEEE-EMBS International Conference on Wearable and Implantable Body Sensor Networks (BSN)
SP - 1
EP - 4
BT - 2022 IEEE-EMBS International Conference on Wearable and Implantable Body Sensor Networks (BSN)
PB - IEEE
CY - Piscataway, US
Y2 - 27 September 2022 through 30 September 2022
ER -