Richard attended the 6th International Conference on Movement and Computing (MOCO 2019) in October in Tempe, Arizona, to present a paper titled Force & Motion: Conducting to the Click. It was a very interesting conference, with a wide variety of applications involving movement and computing. One particularly interesting paper combined a number of Kinect devices to cover a large space – something which could be particularly useful when capturing in large or complex environments.
In our paper we took a section of the Captured piece where the conductors work to a click track (which the performers can't hear), so all three conductors are working to the same timing, giving a temporal ground truth to analyse. We then compared four sources of timing information for musical beats to study differences in the results: the click track itself, beats manually annotated from the video, marker position data from the conductors' hands, and ground reaction force data from the force plate the conductors stood on.
The paper reports a number of findings, but notably we were able to extract the beats from the force plate data with precision comparable to using the hand markers. This offers a potential means of capturing conducting beats in live performance that is non-intrusive (no hand-held controller or attached sensors) and avoids the lighting and line-of-sight issues associated with optical systems.
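To give a flavour of the kind of analysis involved, here is a minimal sketch of beat extraction by peak picking on a vertical ground reaction force signal. This is an illustration only, not the pipeline used in the paper: the sample rate, tempo, and synthetic force trace are all assumptions, and the peak picking uses SciPy's generic `find_peaks` rather than any method from the paper.

```python
import numpy as np
from scipy.signal import find_peaks

# Assumed parameters, for illustration only
fs = 1000                       # force plate sample rate (Hz)
bpm = 120                       # click-track tempo
t = np.arange(0, 10, 1 / fs)    # 10 s of samples

# Synthetic force trace: a short Gaussian bump of force at every
# click-track beat, standing in for real force plate data.
beat_times = np.arange(0.5, 10, 60 / bpm)
force = np.zeros_like(t)
for b in beat_times:
    force += np.exp(-((t - b) ** 2) / (2 * 0.03 ** 2))

# Peak picking: require a minimum spacing just under one beat period
# so at most one peak is detected per beat.
min_gap = int(0.8 * fs * 60 / bpm)
peaks, _ = find_peaks(force, height=0.5, distance=min_gap)
detected = t[peaks]

# Timing error of each detected beat against the click-track ground truth
errors_ms = 1000 * np.abs(detected - beat_times[: len(detected)])
print(f"{len(detected)} beats detected, mean error {errors_ms.mean():.2f} ms")
```

With real force data the signal is far noisier, so in practice one would filter the trace and tune the peak threshold before comparing detected beats against the click track, as the paper does across its four timing sources.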
Our paper can be found here: https://dl.acm.org/doi/abs/10.1145/3347122.3347139