Zhifeng HUANG Ayanori NAGATA Masako KANAI-PAK Jukai MAEDA Yasuko KITAJIMA Mitsuhiro NAKAMURA Kyoko AIDA Noriaki KUWAHARA Taiki OGATA Jun OTA
To help student nurses learn to transfer patients from a bed to a wheelchair, this paper proposes a system for automatic skill evaluation in nurses' training for this task. Multiple Kinect sensors were employed, in conjunction with colored markers attached to the trainee's and patient's clothing and to the wheelchair, to measure both participants' postures as they interacted closely during the transfer and to assess the correctness of the trainee's movements and use of equipment. The measurement method identified body joints, and features of the wheelchair, via the colors of the attached markers and calculated their 3D positions by combining color and depth data from two sensors. We first developed an automatic segmentation method to convert a continuous recording of the patient transfer process into discrete steps, by extracting from the raw sensor data the defining features of both participants' movements during each stage of the transfer. Next, a checklist of 20 evaluation items was defined to evaluate the trainee nurses' skills in performing the patient transfer. The items were divided into two types, and two corresponding methods were proposed for classifying trainee performance as correct or incorrect. The first method was based on whether the participants' relevant body parts were positioned within a predefined spatial range considered ‘correct’ in terms of safety and efficacy (e.g., feet placed appropriately for balance). The second method was based on quantitative indexes and thresholds for parameters describing the participants' postures and movements, as determined by a Bayesian minimum-error method. A prototype system was constructed, and experiments were performed to assess the proposed approach. The system evaluated nurses' patient transfer skills automatically, and its results agreed with evaluations by human teachers with an accuracy exceeding 80%.
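As a rough illustration of the two classifier types described in this abstract, the Python sketch below shows a spatial-range check and a one-dimensional Bayesian minimum-error threshold learned from labeled training trials. All function names, ranges, and parameters are hypothetical assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.stats import norm

def spatial_range_check(joint_xyz, lower, upper):
    """Type 1 (assumed form): correct iff a tracked 3D joint position lies
    inside a predefined axis-aligned box judged safe and effective
    (e.g., foot placement for balance)."""
    joint_xyz, lower, upper = map(np.asarray, (joint_xyz, lower, upper))
    return bool(np.all(joint_xyz >= lower) and np.all(joint_xyz <= upper))

def bayes_min_error_threshold(correct_vals, incorrect_vals):
    """Type 2 (assumed form): a 1-D decision threshold on a posture/movement
    index. Assuming Gaussian class-conditional densities fitted to labeled
    trials, the minimum-error threshold lies where the prior-weighted
    densities cross."""
    m1, s1 = np.mean(correct_vals), np.std(correct_vals)
    m2, s2 = np.mean(incorrect_vals), np.std(incorrect_vals)
    p1 = len(correct_vals) / (len(correct_vals) + len(incorrect_vals))
    xs = np.linspace(min(m1, m2), max(m1, m2), 1001)  # scan between the means
    gap = np.abs(p1 * norm.pdf(xs, m1, s1) - (1 - p1) * norm.pdf(xs, m2, s2))
    return xs[np.argmin(gap)]
```

A trial's index value would then be labeled correct or incorrect according to which side of the learned threshold it falls on, relative to the class means.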
Atsuhiro NISHI Masanori YOKOYAMA Ken-ichiro OGAWA Taiki OGATA Takayuki NOZAWA Yoshihiro MIYAKE
The present study investigates the effect of voluntary movements on human temporal perception in multisensory integration. We performed temporal order judgment (TOJ) tasks in audio-tactile integration under three conditions: no movement, involuntary movement, and voluntary movement. It is known that under the no movement condition, i.e., in normal TOJ tasks, the point of subjective simultaneity (PSS) occurs when a tactile stimulus is presented before an auditory stimulus. Our experiment showed that both involuntary and voluntary movements shift the PSS toward a value that reduces the interval between the presentations of the auditory and tactile stimuli, and that the shift of the PSS under the voluntary movement condition was greater than that under the involuntary movement condition. Remarkably, under the voluntary movement condition the PSS occurs when an auditory stimulus slightly precedes a tactile stimulus. In addition, the just noticeable difference (JND) under the voluntary movement condition was smaller than those under the other two conditions. These results reveal that voluntary movements alter the temporal integration of audio-tactile stimuli. In particular, they suggest that voluntary movements reverse the perceived temporal order of auditory and tactile stimuli and improve the resolution of temporal perception. We discuss a functional mechanism by which voluntary movements shift the PSS from its no-movement value in audio-tactile integration.
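For context, PSS and JND are conventionally estimated by fitting a psychometric function to TOJ responses. The sketch below (Python, with made-up data; not the study's analysis code) fits a cumulative Gaussian and reads off both quantities.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(soa, pss, sigma):
    """P("auditory first" response) as a function of SOA, where
    SOA = tactile onset minus auditory onset (positive: auditory leads)."""
    return norm.cdf(soa, loc=pss, scale=sigma)

# Hypothetical TOJ data: SOAs (ms) and proportion of "auditory first" responses.
soas = np.array([-120.0, -80.0, -40.0, 0.0, 40.0, 80.0, 120.0])
p_auditory_first = np.array([0.05, 0.15, 0.35, 0.55, 0.75, 0.90, 0.97])

(pss, sigma), _ = curve_fit(psychometric, soas, p_auditory_first, p0=[0.0, 50.0])
jnd = sigma * norm.ppf(0.75)  # half the 25%-75% interval of the fitted curve
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```

The PSS is the SOA at which both orders are reported equally often, and a smaller fitted sigma (hence smaller JND) corresponds to the improved temporal resolution reported for the voluntary movement condition.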
Taiki OGATA Naoki HIGO Takayuki NOZAWA Eisuke ONO Kazuo YANO Koji ARA Yoshihiro MIYAKE
People's body movements in daily face-to-face communication influence each other. For instance, during a heated debate the participants use more gestures and other body movements, while in a calm discussion they use fewer. This “coevolution” of interpersonal body movements occurs on multiple time scales, such as minutes or hours, but multi-time-scale coevolution in daily communication has not yet been clarified. In this paper, we explore the minute-to-minute coevolution of interpersonal body movements in daily communication and investigate its characteristics. We present quantitative data on upper-body movements gathered over several months, via wearable sensors, from a thousand test subjects in seven organizations. The device measured upper-body movements with an accelerometer and the duration of face-to-face communication with an infrared sensor on a minute-by-minute basis. We defined a coevolution measure between two people as the rate of concurrent per-minute changes in their body movements and compared this index between face-to-face and non-face-to-face situations. We found that, on average, people's amounts of body movement changed in concert during face-to-face communication, and that the average rate of coevolution in face-to-face communication was 3-4% higher than in non-face-to-face situations. These results reveal minute-to-minute coevolution of upper-body movements between people in daily communication. The finding suggests that the coevolution of body movements arises on multiple time scales.
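A minimal sketch of one plausible way to compute such a per-minute coevolution measure is given below in Python; the exact definition is an assumption for illustration, not the paper's formula.

```python
import numpy as np

def coevolution_rate(activity_a, activity_b):
    """activity_a, activity_b: aligned per-minute upper-body movement levels
    from the wearable accelerometers of two people. Returns the fraction of
    minute-to-minute transitions in which both people's movement levels
    change in the same direction (an assumed proxy for coevolution)."""
    da = np.sign(np.diff(np.asarray(activity_a, dtype=float)))
    db = np.sign(np.diff(np.asarray(activity_b, dtype=float)))
    concurrent = (da == db) & (da != 0)  # same nonzero direction of change
    return float(concurrent.mean())

# The face-to-face vs. non-face-to-face comparison would then restrict the
# transitions to minutes flagged (or not flagged) by the infrared sensor.
```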