While the final capture session for the project will focus on using highly accurate optical motion capture, as a preliminary test we wanted to quickly capture some conducting motion data for analysis.
We had access to a Perception Neuron IMU mocap suit, designed for small animation/game studios and educational establishments. This uses up to 32 "Neurons", small Inertial Measurement Units each housing a gyroscope, accelerometer and magnetometer. The Axis Neuron software translates this data into 3D position data for each Neuron, streams it in real-time, and can record it to disk.
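To give a sense of how the real-time stream could be consumed, here is a minimal sketch of a TCP client. It makes several assumptions not stated above: that Axis Neuron is broadcasting frames as newline-delimited ASCII over TCP on `127.0.0.1:7001`, and the function names (`parse_frame`, `stream_frames`) are illustrative rather than taken from any official API.

```python
import socket

def parse_frame(line: str) -> list[float]:
    """Convert one ASCII frame line into a flat list of channel values."""
    return [float(v) for v in line.split()]

def stream_frames(host: str = "127.0.0.1", port: int = 7001):
    """Yield parsed frames from a broadcast, assuming newline-delimited
    ASCII frames (an assumption, not a documented format)."""
    with socket.create_connection((host, port)) as sock:
        buf = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:  # server closed the connection
                break
            buf += chunk
            # Split off any complete lines; keep the remainder buffered.
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                if line.strip():
                    yield parse_frame(line.decode("ascii", "ignore"))
```

With Axis Neuron broadcasting, iterating over `stream_frames()` would then yield one list of channel values per frame, ready for logging or analysis.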
Willing victim Dr Benjamin Oliver was strapped into the Neuron suit during a rehearsal with Workers Union Ensemble. The rehearsal was also recorded on video and audio so that we could compare the data from each source.
We captured approximately 60 minutes of rehearsal time, which was then integrated by Research Assistant Dan Halford; a sample, featuring Helen Papaioannou's Backscatter (2017), can be seen below.