After months of planning and two days of setting up equipment, Capture Day finally took place on 27th September 2017. The team arrived early to make sure the systems were up and running ready for the conductors and ensemble.
Thanks to Vicon’s generosity in lending us additional cameras and devices for the capture system, as well as technical support to optimise the system, we managed to obtain high quality face, body and hand data from the conductors.
In addition to the Vicon motion data, we have video of the conductors and ensemble from seven cameras (including a 360-degree camera) in various locations, plus motion data from a Kinect 2, audio recordings from stereo and close microphones, and muscle data from four wireless EMG devices on the conductors’ biceps and triceps.
The day ran very smoothly, with help from Harry Matthews, Beth Walker and Sergiu Rusu (a student and recent graduates), who assisted with sticking and unsticking markers, among other activities!
We are deeply grateful to Bob Dimmock, Phil Bacon, Matt Oughton and everyone at Vicon for all their help and generosity in making this day a success. We’d also like to thank Geoff and Holly for being such willing guinea pigs, and also our ensemble, for their wonderful patience and playing: Anna Durance (oboe + electronics); Vicky Wright (bass clarinet); Julian Poore (trumpet); Joley Cragg (percussion); Liga Korne (electric piano); Aisha Orazbayeva (violin) and Dan Molloy (double bass).
Finally, of course our thanks to The British Academy and the Leverhulme Trust for their funding of the project.
While the final capture session for the project will focus on using highly accurate optical motion capture, as a preliminary test we wanted to quickly capture some conducting motion data for analysis.
We had access to a Perception Neuron IMU mocap suit, designed for small animation/game studios and educational establishments. It uses up to 32 “Neurons”: small Inertial Measurement Units, each housing a gyroscope, accelerometer and magnetometer. The Axis Neuron software translates this sensor data into 3D position data for each Neuron, streaming it in real time and optionally recording it to disk.
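The conversion the Axis Neuron software performs can be pictured as forward kinematics: each IMU contributes an orientation for its body segment, and joint positions are recovered by walking a fixed skeleton from the root. The sketch below illustrates the idea in a simplified 2D form; the skeleton, bone lengths and angles are invented for illustration and are not taken from the Perception Neuron system itself.

```python
import math

def rot_z(deg):
    """2D rotation matrix for a rotation of `deg` degrees (planar skeleton
    keeps the example short; the real system works in 3D)."""
    r = math.radians(deg)
    return [[math.cos(r), -math.sin(r)],
            [math.sin(r),  math.cos(r)]]

def apply(m, v):
    """Apply a 2x2 matrix to a 2D vector."""
    return (m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1])

def forward_kinematics(bones, angles):
    """bones: list of (name, length in metres); angles: one rotation per bone,
    each measured relative to its parent bone. Returns the positions of every
    joint along the chain, starting from the root at the origin."""
    positions = [(0.0, 0.0)]   # root joint
    total = 0.0                # accumulated rotation down the chain
    for (name, length), angle in zip(bones, angles):
        total += angle
        offset = apply(rot_z(total), (length, 0.0))
        last = positions[-1]
        positions.append((last[0] + offset[0], last[1] + offset[1]))
    return positions

# A toy two-bone "arm": upper arm and forearm, each 0.3 m long,
# shoulder raised 90 degrees and elbow bent a further 90 degrees.
arm = [("upper_arm", 0.3), ("forearm", 0.3)]
joints = forward_kinematics(arm, [90.0, 90.0])
# Elbow ends up at roughly (0.0, 0.3), wrist at roughly (-0.3, 0.3).
```

Per-segment orientations alone fix only directions; it is the known bone lengths in the skeleton model that turn them into the 3D positions streamed to disk.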
Willing victim Dr Benjamin Oliver was strapped into the Neuron suit during a rehearsal with Workers Union Ensemble which was also video and audio recorded so we could compare the data from each source.
We captured approximately 60 minutes of rehearsal time, which was then integrated by Research Assistant Dan Halford; a sample, featuring Helen Papaioannou‘s Backscatter (2017), can be seen below.
Capturing the Contemporary Conductor is a British Academy/Leverhulme Trust funded pilot project investigating the use of high-quality motion capture systems for the study of conducting gesture. It is a collaborative project between Music and Health Sciences at the University of Southampton.