PLoS One. 2015 Jun 18;10(6):e0130293. doi: 10.1371/journal.pone.0130293. eCollection 2015.

Motion Tracker: Camera-Based Monitoring of Bodily Movements Using Motion Silhouettes


Jacqueline Kory Westlund et al. PLoS One. 2015.


Abstract

Researchers in the cognitive and affective sciences investigate how thoughts and feelings are reflected in bodily response systems, including peripheral physiology, facial features, and body movements. One specific question along this line of research is how cognition and affect are manifested in the dynamics of general body movements. Progress in this area can be accelerated by inexpensive, non-intrusive, portable, scalable, and easy-to-calibrate movement tracking systems. Towards this end, this paper presents and validates Motion Tracker, a simple yet effective software program that uses established computer vision techniques to estimate the amount a person moves from a video of the person engaged in a task (available for download from http://jakory.com/motion-tracker/). The system works with any commercially available camera and with existing videos, thereby affording inexpensive, non-intrusive, and potentially portable and scalable estimation of body movement. Strong between-subject correlations were obtained between Motion Tracker's estimates of movement and body movements recorded from the seat (r = .720) and back (r = .695 for participants with higher back movement) of a chair affixed with pressure sensors while participants completed a 32-minute computerized task (Study 1). Within-subject cross-correlations were also strong for both the seat (r = .606) and back (r = .507). In Study 2, between-subject correlations between Motion Tracker's movement estimates and movements recorded from an accelerometer worn on the wrist were also strong (rs = .801, .679, and .681) while people performed three brief actions (e.g., waving). Finally, in Study 3 the within-subject cross-correlation was high (r = .855) when Motion Tracker's estimates were correlated with the movement of a person's head as tracked with a Kinect while the person was seated at a desk. Best-practice recommendations, limitations, and planned extensions of the system are discussed.


Conflict of interest statement

Competing Interests: The authors have declared that no competing interests exist.

Figures

Fig 1. A flow chart depicting the steps taken when processing each frame of the video.
The left column contains a description of each step, while the center column lists the corresponding OpenCvSharp function, and the right column shows the mathematical formula applied during that step.
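The released tool is built on OpenCvSharp, and the flow chart itself lists the exact functions and formulas used. As a rough illustration of the kind of per-frame silhouette pipeline the caption describes (grayscale conversion, differencing against the previous frame, thresholding, and erosion to suppress noise), here is a hedged sketch using OpenCV's Python bindings; the threshold and kernel settings are illustrative assumptions, not the authors' values:

```python
import cv2
import numpy as np

def motion_index_series(video_path, diff_threshold=20, erode_iterations=1):
    """Estimate per-frame motion as the proportion of pixels that changed
    between consecutive frames (a simple motion-silhouette approach).
    Threshold and erosion settings here are illustrative assumptions."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        raise IOError("Could not read video: " + video_path)
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    kernel = np.ones((3, 3), np.uint8)
    series = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Absolute difference against the previous frame
        diff = cv2.absdiff(gray, prev)
        # Binarize: displaced pixels become white (255)
        _, silhouette = cv2.threshold(diff, diff_threshold, 255,
                                      cv2.THRESH_BINARY)
        # Erode to remove isolated single-pixel camera noise
        silhouette = cv2.erode(silhouette, kernel,
                               iterations=erode_iterations)
        # Motion index: proportion of changed (white) pixels in the frame
        series.append(np.count_nonzero(silhouette) / silhouette.size)
        prev = gray
    cap.release()
    return series
```

Because frame differencing responds to any pixel change, a motion index computed this way reflects lighting shifts and sensor noise as well as body movement, which is why a denoising step such as erosion is typically included.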
Fig 2. The Motion Tracker software interface in action.
The video being processed is displayed in the top left panel, while the corresponding motion silhouette is shown at the top right. The lower left panel displays a moving graph of the motion index over time. Controls for the software, which allow the user to select which video to process, whether to show the visualizations, and where to save output files, are located in the lower right panel.
Fig 3. Sample output of the motion tracking algorithm.
On the left are single frames extracted from a video sequence, while the panels on the right display the corresponding motion silhouettes. Pixels that have been displaced (i.e., places in the video frame where motion has occurred) are shown in white; pixels that have not been displaced are shown in black.
Fig 4. The overall experimental setup for the first validation study.
The participant sat on a chair with the BPMS seat and back pads, facing a computer monitor. The camera recorded the participant’s upper torso and face.
Fig 5. These graphs display a 500-timestep segment of a sample motion time series.
The top graph shows motion in individual frames as the proportion of changed pixels per frame, while the bottom graph shows the absolute difference of the proportion of changed pixels per frame across consecutive frames, i.e., the change in motion across adjacent frames. Periods of stable motion in the top graph are reflected by small spikes in the absolute difference graph, i.e., small changes in motion across adjacent frames. Sharp increases or decreases in motion are reflected by larger spikes, indicative of larger changes in motion across adjacent frames.
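Given a motion-index series like the one produced by the sketch above, the absolute-difference series in the bottom graph is simply the absolute first difference; a minimal sketch (the variable `series` is the hypothetical output of the earlier function):

```python
import numpy as np

motion = np.asarray(series)         # proportion of changed pixels per frame
abs_diff = np.abs(np.diff(motion))  # change in motion across adjacent frames
```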
Fig 6. Sample output from the BPMS pressure pads.
On the left is a pressure map from the seat pressure pad. Each square in the map corresponds to a single sensing element. On the right are graphs showing a 200-timestep segment of a sample mean pressure time series for the back pressure pad (top) and for the seat pressure pad (bottom). Changes in pressure against the seat and back pads are reflected in the spikes and dips in the mean pressure graphs.
Fig 7. Scatter plots of the mean of the absolute difference of each time series (as z-scores), plotted against each of the other time series.
The top panel includes all data while the bottom panel eliminates participants with negligible back movement.
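The between-subject analysis appears to reduce each participant's series to the mean of its absolute differences, standardize those summaries across participants, and correlate them between measures. A minimal sketch under those assumptions (the paper's exact preprocessing may differ; the function name and inputs are hypothetical):

```python
import numpy as np
from scipy import stats

def between_subject_r(motion_series_list, sensor_series_list):
    """Correlate per-participant mean |diff| of the camera-based motion
    estimate with the same summary of a reference sensor. Inputs are
    lists of per-participant time series, one entry per participant."""
    motion_means = np.array([np.mean(np.abs(np.diff(s)))
                             for s in motion_series_list])
    sensor_means = np.array([np.mean(np.abs(np.diff(s)))
                             for s in sensor_series_list])
    # z-score across participants, matching the scatter-plot axes
    mz = stats.zscore(motion_means)
    sz = stats.zscore(sensor_means)
    r, p = stats.pearsonr(mz, sz)
    return r, p
```

Note that Pearson's r is invariant to z-scoring; the standardization here only matters for plotting the summaries on common axes.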
Fig 8. Line graphs of the mean cross-correlation across all participants of the mean of the absolute difference of each time series (z-scores), with each time series divided into 10 windows.
The top graph shows the motion and seat over time, while the bottom graph shows the motion and back over time.
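For the within-subject analysis, each pair of (z-scored, absolute-differenced) series is split into 10 windows and correlated window by window, with the per-window correlations then averaged across participants. A hedged sketch, assuming equal-length, time-aligned series (the paper's handling of alignment and lags may differ):

```python
import numpy as np
from scipy import stats

def windowed_correlations(x, y, n_windows=10):
    """Pearson correlation of two aligned time series within each of
    n_windows equal segments; returns one r per window."""
    x, y = np.asarray(x), np.asarray(y)
    n = min(len(x), len(y))
    edges = np.linspace(0, n, n_windows + 1, dtype=int)
    rs = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        r, _ = stats.pearsonr(x[lo:hi], y[lo:hi])
        rs.append(r)
    return np.array(rs)
```

Averaging the resulting per-participant arrays element-wise would give the per-window means plotted in the figure.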
Fig 9. Scatter plots showing the mean of the absolute difference of the estimated movement vs. the accelerometer time series (as z-scores), for each of the three actions performed by subjects: (1) right arm swipe to the left, (2) right arm swipe to the right, and (3) right hand wave.
Fig 10. Line graph of the cross-correlation of the absolute difference of the estimated movement and the Kinect head position time series (z-scores), with each time series divided into 10 windows.



Grants and funding

The authors have no support or funding to report.
