To help student nurses learn to transfer patients from a bed to a wheelchair, this paper proposes a system for automatically evaluating trainee nurses' skill at this task. Multiple Kinect sensors were employed, together with colored markers attached to the trainee's and patient's clothing and to the wheelchair, to measure both participants' postures as they interacted closely during the transfer and to assess the correctness of the trainee's movements and use of equipment. The measurement method identified body joints and wheelchair features via the colors of the attached markers and calculated their 3D positions by combining color and depth data from two sensors. We first developed an automatic segmentation method that converts a continuous recording of the patient transfer into discrete steps by extracting, from the raw sensor data, the defining features of both participants' movements during each stage of the transfer. Next, a checklist of 20 evaluation items was defined for evaluating the trainees' patient transfer skills. The items were divided into two types, with a corresponding classification method for each. The first method judged performance as correct or incorrect according to whether the participants' relevant body parts lay within a predefined spatial range considered ‘correct’ in terms of safety and efficacy (e.g., feet placed appropriately for balance). The second method applied thresholds, determined by a Bayesian minimum-error criterion, to quantitative indexes describing the participants' postures and movements. A prototype system was constructed and tested experimentally: it evaluated nurses' patient transfer skills fully automatically, and its results agreed with evaluations by human teachers with an accuracy exceeding 80%.
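The measurement step described above computes a marker's 3D position by combining its detected color-image pixel with the depth reading at that pixel. A minimal sketch of the depth back-projection involved, using a standard pinhole camera model with assumed Kinect-v1-like intrinsics (the actual calibration values and two-sensor fusion in the paper are not reproduced here):

```python
import numpy as np

def pixel_to_3d(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with depth (metres) into camera-frame 3D.

    Standard pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    """
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Assumed Kinect-v1-like intrinsics (illustrative, not from the paper)
FX = FY = 525.0
CX, CY = 319.5, 239.5

# Suppose a colored marker was detected at pixel (400, 300) with depth 1.25 m
p = pixel_to_3d(400, 300, 1.25, FX, FY, CX, CY)
```

Each sensor yields such a camera-frame point; merging two sensors would additionally require an extrinsic calibration between them, which is omitted here.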
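The second classification method thresholds a scalar posture/movement index using a Bayesian minimum-error criterion. As a sketch of that idea only (the feature, class statistics, and equal priors below are assumptions, not values from the paper): if the "correct" and "incorrect" classes are each modeled as a 1-D Gaussian over the index, the minimum-error threshold is where the two prior-weighted densities are equal, which reduces to a quadratic in the index value.

```python
import numpy as np

def bayes_threshold(mu0, s0, mu1, s1, p0=0.5, p1=0.5):
    """Minimum-error decision threshold between two 1-D Gaussian classes.

    Solves p0 * N(x; mu0, s0) == p1 * N(x; mu1, s1) for x. Taking logs
    gives a quadratic a*x^2 + b*x + c = 0; the root between the two class
    means is the Bayes decision boundary.
    """
    a = 1.0 / (2 * s1**2) - 1.0 / (2 * s0**2)
    b = mu0 / s0**2 - mu1 / s1**2
    c = (mu1**2 / (2 * s1**2) - mu0**2 / (2 * s0**2)
         + np.log((s1 * p0) / (s0 * p1)))
    if abs(a) < 1e-12:                 # equal variances: linear equation
        return -c / b
    roots = np.roots([a, b, c])
    lo, hi = sorted((mu0, mu1))
    for r in roots:                    # prefer the root between the means
        if abs(r.imag) < 1e-9 and lo <= r.real <= hi:
            return r.real
    return roots[0].real

# Hypothetical class statistics for one evaluation item (assumed values)
t_equal = bayes_threshold(0.0, 1.0, 2.0, 1.0)  # equal variances: midpoint
t_uneq = bayes_threshold(0.0, 1.0, 4.0, 2.0)   # unequal variances
```

An observed index is then classified by comparing it with the threshold; the paper reports choosing such thresholds so as to minimize the probability of misclassification on labeled training data.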
Zhifeng HUANG
The University of Tokyo
Ayanori NAGATA
The University of Tokyo
Masako KANAI-PAK
Tokyo Ariake University of Medical and Health Sciences
Jukai MAEDA
Tokyo Ariake University of Medical and Health Sciences
Yasuko KITAJIMA
Tokyo Ariake University of Medical and Health Sciences
Mitsuhiro NAKAMURA
Tokyo Ariake University of Medical and Health Sciences
Kyoko AIDA
Tokyo Ariake University of Medical and Health Sciences
Noriaki KUWAHARA
Kyoto Institute of Technology
Taiki OGATA
The University of Tokyo
Jun OTA
The University of Tokyo
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Zhifeng HUANG, Ayanori NAGATA, Masako KANAI-PAK, Jukai MAEDA, Yasuko KITAJIMA, Mitsuhiro NAKAMURA, Kyoko AIDA, Noriaki KUWAHARA, Taiki OGATA, Jun OTA, "Automatic Evaluation of Trainee Nurses' Patient Transfer Skills Using Multiple Kinect Sensors" in IEICE TRANSACTIONS on Information,
vol. E97-D, no. 1, pp. 107-118, January 2014, doi: 10.1587/transinf.E97.D.107.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.E97.D.107/_p
@ARTICLE{e97-d_1_107,
author={Zhifeng HUANG and Ayanori NAGATA and Masako KANAI-PAK and Jukai MAEDA and Yasuko KITAJIMA and Mitsuhiro NAKAMURA and Kyoko AIDA and Noriaki KUWAHARA and Taiki OGATA and Jun OTA},
journal={IEICE TRANSACTIONS on Information},
title={Automatic Evaluation of Trainee Nurses' Patient Transfer Skills Using Multiple Kinect Sensors},
year={2014},
volume={E97-D},
number={1},
pages={107-118},
doi={10.1587/transinf.E97.D.107},
ISSN={1745-1361},
month={January},}
TY - JOUR
TI - Automatic Evaluation of Trainee Nurses' Patient Transfer Skills Using Multiple Kinect Sensors
T2 - IEICE TRANSACTIONS on Information
SP - 107
EP - 118
AU - Zhifeng HUANG
AU - Ayanori NAGATA
AU - Masako KANAI-PAK
AU - Jukai MAEDA
AU - Yasuko KITAJIMA
AU - Mitsuhiro NAKAMURA
AU - Kyoko AIDA
AU - Noriaki KUWAHARA
AU - Taiki OGATA
AU - Jun OTA
PY - 2014
DO - 10.1587/transinf.E97.D.107
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E97-D
IS - 1
JA - IEICE TRANSACTIONS on Information
Y1 - January 2014
ER -