Sonic City – Mobile Music #2


Sonic City was one of the earliest mobile music works and is regularly cited in discussions of the field. It was one of the first works that allowed the user to “play” their environment and become an active participant in their urban musical experience. The user was fitted with sensors, headphones and a laptop. Sensor readings were taken as they moved about their environment, triggering musical events. This created an interactive music experience which allowed the user to improvise with their environment. The project was created by Lalya Gaye, Lars Erik Holmquist, Ramia Mazé, Margot Jacobs and Daniel Skoglund.

The technology

Sonic City was built around custom hardware and software and consisted of a collection of sensors connected to a laptop.
The laptop was worn in a backpack or, in later versions, embedded in a jacket worn by the user. The sensors consisted of light, sound, metal, heat and pollution sensors along with an accelerometer and microphone. These were placed at different points about the body, and information from them was fed into a patch on the laptop.
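To make the data flow concrete, here is a hypothetical sketch (in Python, not part of the original system) of how body-worn sensor readings might be packaged and streamed to a laptop patch. Pure Data's [netreceive] object accepts FUDI-formatted messages (space-separated atoms terminated by a semicolon) over the network; the sensor names and value ranges below are illustrative assumptions, not Sonic City's actual mapping.

```python
def normalise(raw, lo, hi):
    """Clamp a raw ADC reading into the 0..1 range the patch expects."""
    return min(1.0, max(0.0, (raw - lo) / (hi - lo)))

def to_fudi(readings):
    """Encode {sensor: value} as FUDI messages for Pure Data's [netreceive]."""
    return "".join(f"{name} {value:.3f};\n" for name, value in readings.items())

# Hypothetical raw readings from a 10-bit ADC (0..1023)
readings = {
    "light": normalise(612, 0, 1023),
    "noise": normalise(300, 0, 1023),
}
msg = to_fudi(readings)
# msg could then be sent over UDP, e.g. sock.sendto(msg.encode(), (host, port))
```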

The music was created in a Pure Data patch using a series of modular DSP operations. Audio from a microphone was fed into the patch and processed by several modules in parallel. The sensor information controlled the processes and determined which were audible, as well as affecting other sonic parameters. The sensors drove the processing of the environmental sound in real time to create live, interactive music.
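The control structure described above can be sketched as follows. This is a minimal Python illustration of the idea, not the actual Pure Data patch: a block of microphone samples is run through parallel DSP modules, and each (invented, illustrative) sensor value weights one module's contribution to the mix, so a module becomes inaudible when its sensor reads zero.

```python
import math

def delay(block, samples=4):
    """Crude delay module: shift the block, zero-padding the start."""
    return [0.0] * samples + block[:-samples]

def ring_mod(block, freq=0.1):
    """Ring-modulation module: multiply by a low-frequency sine."""
    return [s * math.sin(freq * n) for n, s in enumerate(block)]

def process(block, sensors):
    """Mix parallel modules; each sensor value (0..1) weights one module.
    The sensor-to-module pairing here is an assumption for illustration."""
    modules = {
        "light": delay(block),
        "metal": ring_mod(block),
    }
    out = [0.0] * len(block)
    for name, processed in modules.items():
        gain = sensors.get(name, 0.0)  # module is silent when its sensor reads 0
        for i, s in enumerate(processed):
            out[i] += gain * s
    return out

mic = [math.sin(0.2 * n) for n in range(64)]  # stand-in for a mic input block
quiet = process(mic, {"light": 0.0, "metal": 0.0})   # no sensor activity: silence
active = process(mic, {"light": 0.8, "metal": 0.3})  # sensors drive the mix
```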

Sonic City Wearable Prototype

The experience

Sonic City allows the user to interact with their environment in a tangible musical way. The sensors create an experience in which the music has a direct relationship to the user’s activity and their interaction with their surroundings. Walking in a darkened area, living in a polluted city and ambient noise all have an effect on the musical output. An interesting aspect of the project is that almost all sound stops when the user isn’t moving. It is only through movement that the sound is heard; the project has been designed almost to compel the user to interact. The video below gives a sense of the user experience along with commentary from people about their experiences.

User experience was a big aspect of the project, and a small sample of user studies was undertaken. Each user varied in their engagement with the system. Some actively sought to cause sound while others enjoyed being immersed in the sound created by their surroundings. Only one user didn’t enjoy the experience, due to feeling uncomfortable wearing the bulky equipment in public. The study is too small to give a real idea of how this type of system might be perceived, but it does provide food for thought. The participants’ take on the Sonic City experience gives an idea of what an ordinary user might think of a mobile music experience and provides a starting point for other projects.

To sum up

Sonic City has been an inspiration for many mobile music projects, including Ambient Addition. The work was very successful due to the thoroughness of its research and implementation. The combination of skills and backgrounds, including a sound designer and sociologist, also greatly contributed to the final work. It was a groundbreaking project which moved the personal stereo experience from passive to active and opened the door to a new way of thinking about music making, shifting it from live and recorded mediums to one that is in tune with an individual and their surroundings.

In my next post I’ll be looking at Location33 which takes a different approach to creating mobile music, using GPS geotagging to create locative music.