Project team: Danica Hendry, Prof. Leon Straker, Dr. Amity Campbell, Prof. Peter O’Sullivan, Dr. Luke Hopper, Prof. Tele Tan, Dr. Kevin Chai
There are many activity trackers, smart watches and GPS-enabled sports watches on the market, and these can record how far and how fast you move. Unfortunately, they can't yet tell a jeté (jump) or an arabesque (leg lift) from a plié (bending at the knees), and they record little when dancers train on one spot at the barre. So, to record specific movements, the research team are building an automated human activity recognition system.
Wearable sensors, each incorporating an accelerometer, a gyroscope and a magnetometer, are placed at six locations on the body (left and right shin, left and right thigh, sacrum and thoracic spine) and record the body's movement as a dancer works through specific dance movements. These continuous signals are then segmented and manually cross-referenced against video footage recorded at the same time, so that specific signal segments can be matched to individual dance movements.
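The segmentation step can be sketched as follows. The sampling rate, window length and array layout here are illustrative assumptions, not details reported by the project; the event times stand in for the timestamps read off the annotated video.

```python
import numpy as np

def segment_signal(signal, events, fs=100, window_s=2.0):
    """Cut fixed-length windows around annotated movement events.

    signal : (n_samples, n_channels) array of continuous IMU data
             (e.g. 9 channels per sensor: accel, gyro, magnetometer).
    events : list of (event_time_s, label) pairs taken from the
             video annotation.
    fs     : sampling rate in Hz (assumed value).
    """
    half = int(window_s * fs / 2)
    segments, labels = [], []
    for t, label in events:
        centre = int(t * fs)
        if centre - half < 0 or centre + half > len(signal):
            continue  # skip events too close to the recording edges
        segments.append(signal[centre - half:centre + half])
        labels.append(label)
    return np.stack(segments), labels

# Hypothetical usage: 10 s of 9-channel data, two annotated events.
signal = np.random.default_rng(0).standard_normal((1000, 9))
segments, labels = segment_signal(signal, [(2.0, "jump"), (5.0, "lift")])
```

Each returned window is a fixed-size slice of sensor data paired with the movement label from the video, which is the form a classifier can be trained on.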
Each dancer is different, so the research team recorded 23 dancers as they worked through a sequence of dance movements, both as isolated movements with a clearly defined beginning and end, and 'buried' within choreographed sequences of dance. Forty minutes of annotated video and correlated sensor data from each dancer built up the 106 GB base 'library' of data on specific movements. The movements included in the study were jumps, as the forces exerted on the body on landing are implicated in lower limb injury, and leg lifts, as they are implicated in hip and lower back pain.
CIC specialist Dr Kevin Chai led the team in applying machine learning techniques to this library of manually classified movement data. A convolutional neural network trained on this library was able to identify patterns and diagnostic features in the mass of sensor segments that were associated with different jumps and leg lifts. Tested against sensor data from two dancers it had never seen, it could identify the target movements with 80 per cent accuracy or better.
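In outline, a network like this slides learned filters along the sensor time series, keeps the strongest responses, and maps them to movement classes. A minimal forward-pass sketch in plain NumPy, with made-up layer sizes and random, untrained weights (the project's actual architecture is not described in this detail):

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels):
    """Valid 1-D convolution: x is (channels, time),
    kernels is (n_filters, channels, kernel_width)."""
    n_f, _, k = kernels.shape
    t_out = x.shape[1] - k + 1
    out = np.empty((n_f, t_out))
    for f in range(n_f):
        for t in range(t_out):
            out[f, t] = np.sum(kernels[f] * x[:, t:t + k])
    return out

def classify(window, kernels, weights):
    """Sketch of one CNN forward pass on a single sensor window."""
    h = np.maximum(conv1d(window, kernels), 0.0)  # convolution + ReLU
    pooled = h.max(axis=1)                        # global max pooling
    logits = weights @ pooled                     # linear classifier
    e = np.exp(logits - logits.max())
    return e / e.sum()                            # softmax probabilities

# Illustrative sizes: 9 channels (one sensor), 200 samples per window,
# 8 filters of width 15, 3 movement classes (e.g. jump / lift / other).
window = rng.standard_normal((9, 200))
kernels = rng.standard_normal((8, 9, 15)) * 0.1
weights = rng.standard_normal((3, 8)) * 0.1
probs = classify(window, kernels, weights)
```

Training adjusts the kernels and weights so that the filter responses become the 'diagnostic features' that separate one movement from another; in practice this would be done in a deep learning framework rather than hand-rolled NumPy.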
The next stage of the study is to record sensor-only data from 52 dancers over an entire day of training, four times across a semester, and use the trained neural network to convert that into a quantitative measure of jumping and leg-lifting training volume for each dancer. The dancers will also complete a survey on each data collection day, self-assessing a range of emotional, cognitive and lifestyle factors, any pain experienced, and any limitations that pain places on their training. The data will then be used to explore the various factors that correlate with pain and disability.
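Turning a day of window-by-window predictions into a training-volume count might look like the following sketch. The label names and the simple rule of collapsing consecutive identical detections into one movement event are assumptions for illustration, not the project's published method.

```python
from collections import Counter
from itertools import groupby

def training_volume(window_labels, background="other"):
    """Collapse runs of identical window predictions into single
    movement events and tally them, ignoring the background class."""
    events = [label for label, _ in groupby(window_labels)
              if label != background]
    return Counter(events)

# Hypothetical predictions from overlapping windows sliding over
# a day's recording: a jump spans several consecutive windows.
preds = ["other", "jump", "jump", "other", "lift", "lift", "lift",
         "other", "jump", "other"]
volume = training_volume(preds)  # Counter({'jump': 2, 'lift': 1})
```

Counts like these, gathered per dancer per day, would give the quantitative jumping and leg-lifting volumes to set against the survey responses.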