Inspiration – In Your Own Time Part #1

02
Apr
2014

In Your Own Time – one year on

The length of the initial post was starting to get a bit out of hand so I’ve separated it out into two posts. This one contains the backstory of the app: the inspiration behind it and how it came into being as a masters project. The second post is about what happened when the app went out into the world. What worked, what didn’t, the ups and downs of putting something out, and the lessons I learned along the way.

It’s been a little over a year since I released In Your Own Time with the help of Shane Finan and Mick Cody. The app has now been downloaded over 1000 times in over 20 countries and has been a massive step for me both personally and artistically. I want to share the journey to date with you here.

Inspiration

The idea of creating music as an app had its genesis in about 2009. I was finishing an engineering degree – which I hated – and looking for ways to jump ship from engineering to the world of music. Music technology seemed like a good way to bridge the gap. I spent a lot of the time when I should have been studying browsing for music technology projects online. During this search I came across Ambient Addition by Noah Vawter. It was the first Mobile Music project – a work in which the listener and their environment affect the music – I had seen, and I fell in love with the idea. It just seemed so potent to me, so full of possibilities.

I pored over Vawter’s thesis and looked up every work he referenced. This exposed me to works like Sonic City, Electrical Walks and RjDj, and following these up I came across more inspiring works in the area. Mobile Music – as Lalya Gaye of Sonic City fame calls it – seemed like such a beautiful way to interact with music, and I was completely smitten. I spent about a month chasing down all the different projects I could find, then my attention drifted back to finishing my degree.

Masters

I knew I had to get out of engineering; my heart was never going to be in it. I’d applied for several courses in music technology, had interviews with a few colleges, and was ultimately accepted to my preference, Music & Media Technologies in Trinity College Dublin. The idea of creating a Mobile Music project was still in my head – I think I even mentioned it in the interview for the course – but that specific area wasn’t on the curriculum, so it slipped out of my mind. I got stuck into the programme, soaking up everything that the course, and everything else I was discovering at that time, had to offer.

It was at the beginning of the second year, September 2011, that I needed to choose what to do for the final project. This was the point when Mobile Music popped back up in my mind. I chatted to a lot of the lecturers about a few different projects, trying to figure out what I was interested in, how that would tap into their expertise, and how it would relate to what I had been learning.

In Your Own Time nearly wasn’t my final project. I almost decided to do what later became Liquid Pixels, but I was ultimately persuaded out of it as it was more than I could realistically have done in the available time. That distraction aside, I committed to working on a Mobile Music project, with the express idea of focusing on the musicality of working with mobile sound.

Motivations

Having looked at all the existing works I could find, I decided that a focus on musicality was what interested me. The works I had viewed sat across a spectrum encompassing human-computer interaction ideas and sound art. I felt there was little focus on using the medium with the express purpose of musical expression, and I wanted to experiment with its expressive possibilities. I set out to do a project with creating a musical work as its main thrust. For me the music had to be the reason for the work, with the adaptive element created to serve the musical outcome.

It’s worth mentioning that I had heard relatively few Mobile Music works directly. The Mobile Music works created pre-smartphone were made using custom hardware and software. With app-based Mobile Music, like Bluebrain’s Listen to the Light, the technique of geotagging – tagging sounds to locations on a map – precluded experiencing the work directly. This made it difficult for me to experience the works as they were intended. My listening experience was limited to some of RjDj’s apps and pulling apart the code of Bluebrain’s apps to listen to the sound samples and try to get an idea of the musical experience.

This was a bit of a bugbear for me, and I wanted to create a work which wasn’t linked to a specific location – an app which could be experienced by anyone in the world. This had implications for the types of techniques I could use to create the adaptive elements, and meant that geotagging was out. I focused instead on using the phone’s sensors to make the musical experience more personal to the listener.

Technique

I tried to think of musical ideas that would fit with things I could do using the phone’s sensors and microphone, and vice versa. Looking for ways to use the expressive possibilities of a smartphone, I took a lot of inspiration from the existing works and copied a sound sampling technique described in Ambient Addition. Every time a loud sound is detected at the phone’s microphone it is recorded into the app. These sounds are slotted into predefined rhythmic patterns and filtered so they gel harmonically.

The rhythms are always the same, but the timbre of the sounds changes depending upon the environment. Each time a new sound is detected it overwrites an existing sound, so the timbre of the music shifts and adapts based on the sounds happening around the listener. Variations in timbre happen more often in sonically busier environments, less so in quieter surroundings.
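The overwrite behaviour can be sketched roughly like this (a minimal Python sketch, not the actual Pure Data patch – the slot count, loudness threshold and pattern values are invented for illustration):

```python
import random

class SampleBank:
    """Holds a fixed number of sound slots; loud sounds picked up at the
    microphone overwrite existing slots, shifting the music's timbre."""

    def __init__(self, n_slots=8, threshold=0.6):
        self.threshold = threshold     # loudness level that triggers a capture
        self.slots = [None] * n_slots  # recorded snippets (None = silent slot)

    def on_microphone_frame(self, level, snippet):
        # Only frames louder than the threshold are recorded, so busier
        # environments cause more frequent changes in timbre.
        if level >= self.threshold:
            slot = random.randrange(len(self.slots))
            self.slots[slot] = snippet  # overwrite an existing sound

    def play_pattern(self, pattern):
        # The rhythmic pattern is fixed; only the sounds filling it change.
        return [self.slots[i] for i in pattern]

bank = SampleBank()
bank.on_microphone_frame(0.8, "door-slam")  # loud: captured
bank.on_microphone_frame(0.3, "murmur")     # too quiet: ignored
print(bank.play_pattern([0, 1, 2, 1]))
```

The fixed pattern with swappable contents is what keeps the piece recognisable while letting each environment colour it differently.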

As well as this connection to the environment, I wanted the piece to have a connection with the listener. I’ve had some form of personal stereo since I was about 10. I used them in a variety of circumstances, but mainly when in transit and mostly while walking. When I was thinking about how the music would adapt to the listener, I tried to think about how I could make walking the adaptive force.

The first thing which came to mind was simply using the speed the person walks at as the speed of the music. This seemed like it could be interesting, but could quickly become trivial and annoying. I then came upon the idea of using the pace at which the listener is walking at discrete points during the music. Sampling the walking pace at set points to determine the tempo of individual melodies created a much subtler effect. The piece is constructed of five melodies, and as each melody enters its speed is set by the speed the listener is walking at. This allows the texture, and the relationship between the melodies, to vary each time the piece is listened to.
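As a rough illustration of that sampling idea, here is a hypothetical Python sketch (the cadence-to-tempo mapping and its clamping range are made up for illustration; the real piece was built in Pure Data):

```python
def pace_to_tempo(steps_per_minute):
    # Hypothetical mapping: clamp the walking cadence into a
    # musically usable tempo range (values invented).
    return max(60, min(160, steps_per_minute))

def start_piece(paces_at_entry_points):
    """Each of the five melodies enters at a discrete point in the piece;
    its tempo is fixed from the listener's pace at that moment and then
    held, so the melodies' relationships differ between listens."""
    return [pace_to_tempo(p) for p in paces_at_entry_points[:5]]

# A listener who gradually slows down over the course of the walk:
print(start_piece([130, 125, 110, 95, 50]))  # → [130, 125, 110, 95, 60]
```

Because each tempo is locked in at its melody’s entry rather than tracking the walker continuously, the result varies between listens without the music speeding up and slowing down underfoot.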

I created the piece within the ScenePlayer app for Android. This allowed me to create the music using the musical programming language Pure Data and run it in an existing app. It meant that I didn’t have to go near any Android programming and could focus purely on making the best musical experience possible. I submitted a Pure Data sketch created within ScenePlayer, along with a thesis*, for my masters, but I’d always had ambitions of releasing In Your Own Time. Pretty much as the dust was settling after the masters I started working on developing the app for release on Google Play.

Final word

This post sums up the genesis of In Your Own Time and how I thought about creating a musical experience for a very new medium. Making music in this way was an incredibly rewarding and frustrating experience. Frustrating because you have to do lots of things from scratch; it takes forever and the ideas don’t always live up to your expectations. Rewarding because you get to make a work which is very different and to explore untapped avenues for musical expression. In the next post I’ll be talking about the process of bringing together a team to make In Your Own Time available for release, previewing the app at New Music Dublin, and the app’s eventual release on Google Play.

*For those especially interested the thesis documenting the project is available here: In Your Own Time: A Mobile Music Composition.
