Capturing Cafe Oto: the technical challenges behind the musical ambition

Cafe Oto has won Joining the Dots funding and support to help them achieve their ambition to make their progressive music programme more widely available. The Cafe’s digital producer John Chantler explains in detail how he and his colleagues worked out the technical setup needed to store and release every note performed at the venue.

The core part of our online subscription service is the recordings themselves. Thanks to the generous donations made as part of an equipment fundraiser we ran, we were able to purchase and install a JoeCo BlackBox Recorder at Cafe OTO. This meant we could easily make high-quality multi-track recordings of every concert that happens. Of course, this creates a new problem: we quickly accumulate a massive amount of ‘stuff’ that needs to be properly archived and made accessible to the musicians and other people who will be working with us on the project.


The initial process of manually uploading selected large files to Dropbox over a slow internet connection and then emailing the various links to different musicians was both time-consuming and cumbersome.

Here our head technician James Dunn shares the first part of the solution he has put in place to make the recordings accessible with little need for ongoing manual intervention.

THE SOLUTION
Automatically upload the recordings to the cloud and then email the musicians a link. Essentially, this means switching the external USB hard drive that stores the recordings between the JoeCo BlackBox recorder and a computer that automatically detects the presence of the drive and runs a script to upload any new files to the cloud.

GATHERING THE PARTS
The first step was to source a USB switch; we settled on one suggested by Joe Bull of JoeCo. We also needed a computer small yet powerful enough to handle the uploading, a perfect job for the Raspberry Pi.

After testing that the external USB hard drive worked via the switch with both the JoeCo BBR and the Raspberry Pi, the next step was to install the software.

DECIDING ON AN OPERATING SYSTEM
There are various Linux distributions listed on the Raspberry Pi website. The most popular (and the one that most pre-configured Pis ship with) is Raspbian, which is based on Debian and comes with the LXDE desktop environment pre-installed. However, we have no need for a GUI or any kind of desktop, X server and so on, so even though LXDE is a lightweight desktop environment, it is still completely unnecessary for our purposes. I chose instead to use Arch Linux, which ships by default as an extremely minimal distribution that is highly customisable.

Once the OS is installed to the SD card, the Pi can be booted and configured. The excellent Arch wiki has details on how to do this. Because the Pi has no screen attached, I used a projector connected to the composite video output to monitor the boot process initially, but all subsequent work was done over SSH.
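For reference, installing Arch Linux ARM to the SD card boils down to something like the following. This is a condensed sketch of the wiki’s instructions rather than a substitute for them, and the device name /dev/sdX and the image filename are placeholders for whatever your own system shows:

# Partition the card with fdisk: a small FAT32 boot partition
# (type W95 FAT32) and an ext4 root partition, then make the filesystems.
fdisk /dev/sdX
mkfs.vfat /dev/sdX1
mkfs.ext4 /dev/sdX2

# Mount both partitions and unpack the Arch Linux ARM image.
mkdir -p boot root
mount /dev/sdX1 boot
mount /dev/sdX2 root
bsdtar -xpf ArchLinuxARM-rpi-latest.tar.gz -C root
mv root/boot/* boot
sync
umount boot root

Once the Pi boots and picks up a network address, ssh root@<the Pi’s IP> gets you in (the stock Arch Linux ARM login at the time was root/root, worth changing immediately).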

CONFIGURING THE SOFTWARE
Initially I used udev to detect when the external USB hard drive was inserted and then call a script. This worked but was slightly problematic (udev isn’t really designed to launch long-running processes such as uploads from its rules), so a better solution was found in devmon. This package automatically detects removable media when it is inserted or removed and can autostart applications and scripts. Devmon was then configured with the following options in its conf file:

/etc/conf.d/devmon:

ARGS="--mount-options ro --exec-on-drive '/usr/local/bin/upload %d'"

These arguments first pass the read-only parameter to udisks to prevent any data being written to, or more importantly deleted from, the external USB hard drive. The second tells devmon to launch our custom script when a drive is inserted, passing the mount point to the script as an argument.
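For completeness, devmon itself ships as part of the udevil package on Arch, and the aws command used below comes from Amazon’s command line tools, which can be installed via pip. Roughly as follows (package names are as they appeared in the Arch repositories at the time; check before copying):

# devmon is bundled with udevil; pip pulls in the AWS command line tools
pacman -S udevil python-pip
pip install awscli

# store the S3 credentials for whichever user runs the upload script
# (root, if devmon runs as a system service); this prompts for the
# access key, secret key and default region
aws configure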

The script is as follows:

#!/bin/bash
#
# This bash script automatically uploads the contents of an external
# USB hard drive to an Amazon Web Services S3 server.
# The hard drive is mounted by the devmon daemon
# (/usr/lib/systemd/system/devmon.service) according
# to the configuration options in /etc/conf.d/devmon
#
# James Dunn 13/01/14
# james@cafeoto.co.uk

# redirect the output of this script to the log file
exec > /tmp/upload.log 2>&1

# mark the beginning of the log file
echo LOG START

# print the environment to the log for debugging
env

# print the current time to the log for debugging
echo `date +%T`

# assign mountpoint directory from devmon to variable
dir="$1"

# print the mount point to the log
echo "$dir"

# upload to AWS S3 server:
aws s3 sync "$dir"/ s3://cafeoto/recordings --exclude ".*" --exclude "*/.*" --exclude "manage.bbr" --exclude "recent.bbr"

echo LOG END

exit 0

The script is fairly self-explanatory: the mount point directory is passed to the aws command, which syncs the files in that directory to the bucket/object path on the Amazon S3 server. The exclude options prevent the syncing of hidden system files in the root directory and first level of subdirectories, which will usually have been created by a Mac OS X computer previously accessing the drive. This excludes the .Spotlight-V100 and .Trashes directories, amongst others.

Similarly, there are two directories named manage.bbr and recent.bbr, created by the BlackBox recorder, which should also be excluded.
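If you are tweaking the exclude patterns, the sync command’s --dryrun flag is handy: it lists what would be transferred without actually uploading anything. With the drive mounted at a hypothetical /media/bbr it looks like this:

aws s3 sync /media/bbr/ s3://cafeoto/recordings --dryrun \
    --exclude ".*" --exclude "*/.*" \
    --exclude "manage.bbr" --exclude "recent.bbr"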

To enable all of this to automatically start at boot, the devmon service file just needs to be enabled:

sudo systemctl enable devmon.service
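Once enabled (and after a reboot, or a manual systemctl start devmon.service), the usual systemd tools confirm that everything is running, and the script’s own log shows the progress of each upload:

systemctl status devmon.service
journalctl -u devmon.service -f
tail -f /tmp/upload.log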

And that’s about it. At the end of each night the hard drive is switched over to the Raspberry Pi and the uploads begin automatically. A useful tool for monitoring the network activity is iftop, which clearly shows the uploads in progress.
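For example, on a wired Pi the interface will usually be eth0:

sudo iftop -i eth0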
