The conducting data will soon be ready for upload to this website and to repovizz (repovizz.upf.edu). Repovizz is an open repository for multi-stream data, including mo-cap, that allows visualisation and simultaneous playback of a wide variety of content. Our data includes multiple camera angles and multiple audio tracks, in addition to fully marker-labelled, gap-filled mo-cap data and EMG sensor outputs. We’re now testing on repovizz, as you can see in the screenshots below.

Initial test: Two video views, plus plain mo-cap data

This has taken many hours of editing and rendering A-V data, many, many hours of data processing in Vicon’s Nexus software, and work in a number of other tools (including Mokka and Matlab) to finally get all the material time-aligned, cropped, correctly formatted and bundled into repovizz datapacks ready for upload.
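To give a flavour of the time-alignment and cropping step, here is a minimal sketch of one common approach: shift each recorded stream so that a shared sync event (e.g. a clap visible in both) sits at time zero, crop to the overlapping window, and resample onto a common timeline. This is purely illustrative (the function name, parameters and rates are our own); the actual pipeline used Vicon Nexus, Mokka and Matlab rather than this code.

```python
import numpy as np

def crop_and_align(t_a, x_a, t_b, x_b, sync_a, sync_b, rate_out=100.0):
    """Illustrative sketch: align two streams on a shared sync event,
    crop to their overlap, and resample to a common rate (Hz)."""
    # Shift each stream so its sync event sits at t = 0
    t_a = np.asarray(t_a, dtype=float) - sync_a
    t_b = np.asarray(t_b, dtype=float) - sync_b
    # Keep only the window where both recordings overlap
    start = max(t_a[0], t_b[0])
    stop = min(t_a[-1], t_b[-1])
    t_out = np.arange(start, stop, 1.0 / rate_out)
    # Linearly interpolate both streams onto the shared timeline
    return t_out, np.interp(t_out, t_a, x_a), np.interp(t_out, t_b, x_b)

# Example: 120 Hz mo-cap channel vs. a 50 Hz audio envelope
# that started recording two seconds earlier.
t_mocap = np.linspace(0.0, 10.0, 1201)
t_audio = np.linspace(0.0, 12.0, 601)
t, mocap, audio = crop_and_align(
    t_mocap, np.sin(t_mocap), t_audio, np.cos(t_audio),
    sync_a=1.0, sync_b=3.0)
```

In practice, sub-frame alignment and gap-filling are considerably fiddlier than this, which is part of why the processing took so long.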

Audio test: Multiple audio tracks successfully loaded.

The process from data capture through to publishing will be fully documented in the release, both to make sure that what the data contains is well understood and to aid any future conducting mo-cap studies.

EMG Test: Upper arm data, left and right bicep/tricep

The work is still ongoing, but we thought we’d share a few screenshots to whet appetites. The data will be available with open access online later this year.

Six camera angles, plus composite. Mo-cap coloured, points scaled and bones added.

Richard will be presenting a paper using some of the data from the project at MOCO 2019 next month… expect a blog post here afterwards.

As ever, our thanks to the project funders, the British Academy/Leverhulme Trust, to Vicon Motion Systems for their technical help and equipment loans, and to the conductors and instrumentalists who took part.

The CTCC team.