Turning Gigs into Gigabytes (and back again): Songification and Live Music Data

What is Songification?

You may already know about data sonification. It's a technique that's been successfully used to assist the vision impaired for some time. It's even possible to “listen” to your DNA in the form of a unique sound file.

What you may not have heard of is “songification” (yep, there's a ‘g’ in there – it's not a typo). Here’s an example:

 

What you are listening to is a musical rendition of the itinerary travelled by the Australian band Max Merritt and the Meteors as they played at venues in and around Melbourne during the late 1960s and early 1970s.

This post outlines the genesis of our attempts to ‘songify‘ some of the live music data collected in the TUGG database. TUGG (The Ultimate Gig Guide) is an online database of live music (‘gig’) information. Most of the data in TUGG has been sourced from the Melbourne-based music magazine Go-Set (1965-1972). Additional data has been imported from Sarah Taylor’s PhD research on the music scenes of Melbourne and Sydney in the 1980s and 1990s.

Even though we’re still busily gathering data and weighing up the pros and cons of the first iteration of our database, we’re thrilled that the project has got to the stage where we can begin analysing the information we have to hand. In keeping with our interest in better understanding cultural flows, we’ve been really curious to examine the itineraries of bands as they moved from one venue to the next.

Why?

The TUGG database can already represent a band’s gigs spatially on a Google map (we like to call this process of mapping music performances ‘gigography’). But the scale of travel between venues is difficult to convey in these maps – bands might play a sequence of gigs in the inner city and then move far afield to a country venue. For example, it’s almost impossible to show the intricacies of adjacent movements around Melbourne in the same map as a gig in Colac without losing a great deal of the detail. And because the maps are static, we can’t see the sequential order of a band’s gigs either.

To better represent these fluctuations of spatial scale and temporal sequence – and to ‘repatriate’ our research more meaningfully to the music community itself – we decided to try ‘sonifying‘ our data. We can’t emphasise enough the last point about “thinking through” our research in formats that make the most sense to the communities and industries we are studying.

We don’t want our research results to be delivered as a fully formed afterthought to other academics only, any more than we want to simply “deliver” our research results to the musicians and fans we think will be interested in TUGG. Avoiding this pretty typical division between academics-as-agents (generating Analysis) and non-academic communities as the objects of research (generating Content) means making sure that our research is undertaken in multi-modal ways – acknowledging that although academics might enjoy written texts for developing and communicating their thinking, other communities might prefer to think visually or aurally.

The unprecedented opportunity presented by cultural data projects like TUGG, then, is that they not only lead to innovative methods for studying and understanding the creative industries and creative labour, but also enable us to understand the creative potential of eResearch itself. And they change the ‘ordering of things’ in other ways too.

The specific point of a methodology like ‘songification’ is to enable the live music community to engage with and contribute to our analysis as we develop it. We want them to be involved in helping us along with our analysis, to be much more than the passive recipients of our thinking. Songification is the instantiation of a belief that research itself is always in beta mode: what’s important is not the end result of a purely scholarly exercise, and the research that really, really matters is inevitably iterative, recursive and co-created.

The Process

Step 1: Sonification

At this point we need to give full props to the super creative James Verhoeven for working with us on developing this methodology.

Data was taken from TUGG: all gigs for a particular band (for example, Max Merritt and the Meteors) were retrieved and their associated attributes exported as a CSV file. The data values required for the analysis were: Band Name, Date, and Venue Location (Latitude and Longitude).
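For readers who want to follow along, here is a minimal sketch of what that export step might look like in Python. The input filename, column names and the helper function itself are our own illustrative assumptions – TUGG’s actual schema and export tooling may differ.

```python
import csv

def export_band_gigs(band_name, in_path="tugg_gigs.csv", out_path="band_gigs.csv"):
    """Write one band's gigs (band, date, venue latitude/longitude) to a new CSV."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        # Keep only the attributes needed for the analysis.
        fields = ["band", "date", "venue_lat", "venue_lon"]
        writer = csv.DictWriter(dst, fieldnames=fields, extrasaction="ignore")
        writer.writeheader()
        for row in reader:
            if row["band"] == band_name:
                writer.writerow(row)

export_band_gigs("Max Merritt and the Meteors")
```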

The driving distance from the GPO in the Melbourne CBD to each gig venue was calculated in metres using a Google Maps API. This figure was then translated directly into a frequency (Hz). The distances range from 584 metres to 151,413 metres – and 151,413 Hz is well beyond what the human ear can register. As a result, we translated this range proportionally (pro-rated) onto a narrower one that we can hear and recognise easily. The lower and upper limits of this range were chosen as C3 (130.81 Hz) and B7 (3951.07 Hz). In musical terms, this runs from the C one octave below middle C to the B almost four octaves above middle C.
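To make the mapping concrete, here is a minimal sketch of that rescaling step. The distance endpoints and note frequencies come from the figures above; the straight linear interpolation is our reading of “pro-rated”.

```python
# Proportional (pro-rated) mapping of driving distance onto the C3-B7 band.
MIN_DIST_M, MAX_DIST_M = 584.0, 151_413.0    # nearest and farthest venues from the GPO
MIN_FREQ_HZ, MAX_FREQ_HZ = 130.81, 3951.07   # C3 and B7

def distance_to_frequency(distance_m):
    """Linearly rescale a driving distance (metres) into the audible C3-B7 range."""
    fraction = (distance_m - MIN_DIST_M) / (MAX_DIST_M - MIN_DIST_M)
    return MIN_FREQ_HZ + fraction * (MAX_FREQ_HZ - MIN_FREQ_HZ)

# e.g. a venue 5 km from the GPO lands at roughly 243 Hz, just below middle C:
print(round(distance_to_frequency(5000), 2))
```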

We then translated these frequencies into recognisable notes. The nearest note on the 12-note chromatic scale was chosen for each frequency, and this was then further transposed to the nearest note in the C major scale. The result was an array of C major notes, each representing the distance of a venue from the CBD.
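A sketch of how that two-stage snapping might be implemented follows. Rounding to the nearest semitone via MIDI note numbers, and nudging any sharps up one semitone to reach a C major note, are our own assumptions about what “nearest” means here.

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
C_MAJOR = {"C", "D", "E", "F", "G", "A", "B"}

def frequency_to_c_major_note(freq_hz):
    """Snap a frequency to the nearest semitone, then to the nearest C major note."""
    # Semitone steps relative to A4 = 440 Hz (MIDI note 69), rounded to the nearest note.
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    # If we landed on a sharp, nudge up to the adjacent natural.
    if NOTE_NAMES[midi % 12] not in C_MAJOR:
        midi += 1
    octave = midi // 12 - 1
    return f"{NOTE_NAMES[midi % 12]}{octave}"

print(frequency_to_c_major_note(242.67))  # -> "B3" for the 5 km example above
```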

We applied this methodology to three different bands:

Max Merritt Gigs Sonified:

 

Billy Thorpe Gigs Sonified:

 

Doug Parkinson Gigs Sonified:

 

As you can plainly hear – these sonifications are pretty awful to listen to. And so we decided to take steps to enhance the sonifications, both to enable clearer pattern detection and to honour the musical provenance of the data.

Step 2: Gigs as Melody

Songification aims to show how enhanced auditory data design can provide a medium for aural intuition, as well as being an opportunity to make some unique music!

To turn the sonifications into music, we added a temporal element. Each gig/note was played in the order in which the gigs were originally performed – by date. The length, or duration, of each note was set to the number of days between the current gig and the next. The longer the delay between gigs, the longer the note. When all notes/gigs are played in succession we end up with a melody (one which still needs a backing track).
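In code, that melody-building step might look something like the sketch below, reusing the two helper functions from Step 1. The column names (including a pre-computed ‘distance_m’ from the Google Maps step) and the nominal one-day duration for the final gig, which has no “next gig”, are our own assumptions.

```python
import csv
from datetime import date

def gigs_to_melody(csv_path):
    """Return (note, duration_in_days) pairs in chronological gig order."""
    with open(csv_path, newline="") as f:
        # Assumes ISO-formatted dates (YYYY-MM-DD) and a pre-computed distance column.
        gigs = sorted(csv.DictReader(f), key=lambda row: date.fromisoformat(row["date"]))
    melody = []
    for current, following in zip(gigs, gigs[1:]):
        gap_days = (date.fromisoformat(following["date"])
                    - date.fromisoformat(current["date"])).days
        note = frequency_to_c_major_note(distance_to_frequency(float(current["distance_m"])))
        melody.append((note, gap_days))
    if gigs:  # the final gig has no "next gig", so give it a nominal one-day duration
        last = gigs[-1]
        note = frequency_to_c_major_note(distance_to_frequency(float(last["distance_m"])))
        melody.append((note, 1))
    return melody
```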

Step 3: Backing Tracks

The process of creating the backing tracks involved a series of steps. First, we needed an average tempo for each artist we were studying. This was done by listening to the artist’s tracks on YouTube, tapping the beat against a metronome, and calculating the average beat rate. The method for deriving the chord structure varied depending on the band. For example, the backing chords for the Max Merritt track were obtained by visiting a popular guitar tab website and collecting all the chords that had been tabbed for the artist. The four most common chords were then used for the track, played in whatever order was most musically pleasing. The backing track for Doug Parkinson was more difficult, as there was just one tab available; YouTube was therefore used to determine the band’s playing style and common chord structures. The Billy Thorpe backing track is a basic 12-bar blues riff, in keeping with the average song on his playlist.
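The chord-counting part of that process is simple enough to show as a toy example – the scraped chord list below is entirely hypothetical, not actual tab data for any of these artists.

```python
from collections import Counter

# Hypothetical scraped chord list, standing in for the chords collected from a tab site.
scraped_chords = ["A", "D", "E", "A", "G", "D", "A", "E", "D", "A", "E", "B7"]

four_most_common = [chord for chord, _ in Counter(scraped_chords).most_common(4)]
print(four_most_common)  # -> ['A', 'D', 'E', 'G'], then reordered by ear for the backing track
```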

The Backing Tracks for the three bands sound like this:

Max Merritt Backing Track:

 

Doug Parkinson Backing Track:

 

Billy Thorpe Backing Track:

 

Step 4: Mixing it up

The final stage of the songification process is to add the melody of gig data to the backing track in the form of a lead guitar riff. Here is that Max Merritt track again, which now sounds surprisingly musical – and, even more surprisingly, reminiscent of a track Max Merritt might once have played.

Enjoy!

 

Max Merritt Gigs Songified:

 

Meta Max: Max Merritt and the Meteors Gigs Performed live at the Brisbane Entertainment Centre: