NOTE: This post is adapted from a paper that I submitted towards the completion of my Master of Arts degree in August 2016.
Sedimentary is a live-coding performance that I debuted at McMaster’s LIVELab alongside the Cybernetic Orchestra during the April 7th, 2016 edition of the LIVELab’s Series 10dB. I also performed it at the International Conference on Live Coding (ICLC) on October 13th, 2016. It combines photography and audience participation (via smartphone) with a poetic approach to live-coding. The base panorama here is a 30,000-pixel-wide sweep of the Niagara Escarpment at Arkeldun Avenue in Hamilton (a site known as the Jolley Cut).
Road widening in 1953 exposed a cross-section of layers, including the Grimsby and Thorold formations (see also the Virtual Field Trip of the site prepared by McMaster’s School of Geography and Geology). Shales, sandstones, and limestones, strikingly segmented into primary colours, here present an aesthetically and geologically significant dataset: the productive remnants of an ancient inland sea.
VIDEO: This video documents my performance at the International Conference on Live Coding, held in Hamilton in October 2016. It is an evolution of my earlier April performance at the LIVELab’s Series 10dB.
Sedimentary fits into what Collins et al. (2003) describe as live-coding: a musical performance that also involves real-time scripting (p. 321). It was created in view of the principles of openness and quality fostered by live-coding communities like TOPLAP, wherein I have found helpful impetus to reveal ‘the performer’s mind’ and ‘the whole human instrument’, and to ‘transcend the program’ by means of language (Ward et al., 2004, p. 247). As we will see, Sedimentary employs a poetic subtext alongside a technical one: pun is valued together with precision.
The web browser is the platform for both the performer (on a laptop) and the audience (on mobile devices). The laptop projects a control and display interface onto a large overhead screen. We see a scrolling escarpment, along with a series of ‘lenses’ that appear to hover over it. These lenses correspond to the image’s colours: red, blue, and yellow. Audience members visit a webpage, where they find a set of similarly coloured channel buttons, used to ‘tune in’ to a given lens. Devices are made to emit sound samples when their corresponding overhead lens encounters familiar pixels (achieved technically via the HTML5 Canvas API, CSS3, and jQuery).
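The pixel-sampling step can be illustrated with a short sketch. This is not the performance’s actual source: the element id, the helper names (sampleUnderLens, dominantChannel), and the crude colour classification are all assumptions for illustration.

```javascript
// Hypothetical sketch: read the pixel beneath a hovering lens from a canvas
// holding the escarpment panorama. (The image must be same-origin, or the
// canvas is "tainted" and getImageData will throw.)
const canvas = document.getElementById('escarpment'); // assumed element id
const ctx = canvas.getContext('2d');

// Return the colour values at a lens's current position over the image.
function sampleUnderLens(lens) {
  const [r, g, b] = ctx.getImageData(lens.x, lens.y, 1, 1).data;
  return { r, g, b };
}

// Crudely classify a sampled pixel into one of the three channel colours.
function dominantChannel({ r, g, b }) {
  if (b > r && b > g) return 'blue';
  if (r > g * 1.5) return 'red'; // red well above green suggests red rock
  return 'yellow'; // red and green both high, blue low
}
```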
If the blue lens finds a blue rock, the blue channel sounds. The audience thus self-organized into sections, each producing distinct output: yellow mapped onto a bird-like whistle, blue onto a guttural stone scrape, and red onto a resonant rocky clink. In addition, the relative brightness or darkness of the encountered image data affected the sample playback rate, effecting a more expressive pitch space. Sounds played in the browser on audience mobile devices via sample-based synthesizers created with the Web Audio API.
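As a sketch of how such a sample-based synthesizer might be driven, the following assumes decoded AudioBuffers keyed by channel name; the brightness-to-rate mapping and the playSample helper are illustrative, not the piece’s actual values.

```javascript
// Hypothetical sketch: play a channel's sample on an audience device,
// mapping pixel brightness to playback rate (and hence pitch).
const audioCtx = new (window.AudioContext || window.webkitAudioContext)();

// 'buffers' is assumed to map channel names ('red', 'blue', 'yellow') to
// AudioBuffers decoded earlier with audioCtx.decodeAudioData().
function playSample(channel, { r, g, b }, buffers) {
  const source = audioCtx.createBufferSource();
  source.buffer = buffers[channel];
  // Rec. 601 luma as a brightness estimate in [0, 1].
  const brightness = (0.299 * r + 0.587 * g + 0.114 * b) / 255;
  source.playbackRate.value = 0.5 + brightness; // brighter pixels play higher
  source.connect(audioCtx.destination);
  source.start();
}
```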
To activate further musical emphasis and direction in Sedimentary, I triggered poetically named JavaScript functions that altered the behaviour of the hovering lenses (see Table 1; a sketch of the evaluation mechanism follows the table). Functions were entered into a text area overlaid on the display for that purpose.
Table 1: Poetically named functions and their effects.

| Function | Effect |
| --- | --- |
| `goldenAge()`, `theBlues()`, `redShift()` | Change to a gold-, blue-, or red-filtered version of the escarpment, thus highlighting colouristic segmentation in the image and activating the corresponding synth. |
| `rockOn()` | Commence panning movement, with an optional speed argument. |
| `rockOff()` | Conclude panning movement. |
| `callMyBluff()` | Trigger the “Primordeal” synth to “sigh” on all devices. |
| `freeze()` | Halt motion of the traversal lenses (inspired by icicles). |
| `thaw()` | Resume motion of the traversal lenses. |
| `breakOut()` | Initiate interesting motion and rhythm in the traversal lenses. |
| `getBackInLine()` | Set the traversal lenses into positions that align their colour with the image. |
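A minimal sketch of the text-area evaluation mechanism follows. It is not the performance’s code: the control vocabulary is abbreviated, and the lenses array and the 'repl' element id are assumed.

```javascript
// Hypothetical sketch: evaluate poetically named control functions typed
// into a text area overlaid on the display.
const lenses = []; // populated elsewhere with objects like { x, y, vx, frozen }

const controls = {
  rockOn: (speed = 1) => lenses.forEach(l => { l.vx = speed; }),
  rockOff: () => lenses.forEach(l => { l.vx = 0; }),
  freeze: () => lenses.forEach(l => { l.frozen = true; }),
  thaw: () => lenses.forEach(l => { l.frozen = false; }),
};

document.getElementById('repl').addEventListener('keydown', (e) => {
  if (e.key !== 'Enter') return;
  e.preventDefault();
  // Build a function whose parameters are the control names, then call it
  // with the control functions so the typed code can refer to them directly.
  new Function(...Object.keys(controls), e.target.value)(...Object.values(controls));
  e.target.value = '';
});
```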
Sedimentary is built on the technical affordances of apert, a server platform that enables artists to distribute JavaScript code and resources to participants’ mobile web browsers and to activate them through a centralized interface (Ogborn, 2016). Apert assumes the context of a free and open source development model, which values transparency. This is beneficial in collaborative projects for many reasons, including increased learning (from the actions of other developers) and heightened project visibility (Dabbish et al., 2012). It also saves time: with apert taking care of the complexities of network communication in the mobile browser, I was able to focus on the artistic aspects of my work. By linking my finished artworks back to the open source ecosystem, I return that benefit.
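Apert’s actual interface is defined in its repository; purely to illustrate the pattern it handles, here is a generic sketch of a participant’s browser acting on centrally pushed messages over a WebSocket. The URL, message shape, and handler are invented and are not apert’s API.

```javascript
// Generic illustration only; not apert's API. A central server pushes JSON
// messages that each participant's mobile browser acts on.
const socket = new WebSocket('wss://example.org/performance'); // hypothetical URL

socket.addEventListener('message', (event) => {
  const msg = JSON.parse(event.data);
  if (msg.type === 'trigger') {
    // e.g. call into a sample player like the one sketched earlier
    console.log('triggering sample on channel', msg.channel);
  }
});
```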
Musically, the work exhibited a choral texture: a tug between feathers and tectonics. Whistles contrasted with more deeply resonant rock scrapes and clinks. Efficient real-time distribution of signals effected a general synchronization of sounds, attenuated by small differences in latency and by the speaker properties specific to each participating device. The sonic layering exhibited local imperfections, which nevertheless contributed to a blended (if not choral) texture. While none of these sounds involved a collective of lungs or tracheae, the unified outcome resonated for me with Daugherty’s description of choral sound as having “a nuanced life of its own apart from the discrete individual sound sources that contribute to it” (2001, p. 70).
References
Collins, N., McLean, A., Rohrhuber, J., & Ward, A. (2003). Live coding in laptop performance. Organised Sound, 8(3), 321–330.
Dabbish, L., Stuart, C., Tsay, J., & Herbsleb, J. (2012). Social coding in GitHub: Transparency and collaboration in an open software repository. In Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work (pp. 1277–1286). ACM.
Ogborn, D. (2016). Apert [GitHub repository]. Retrieved from https://github.com/d0kt0r0/apert
Ward, A., Rohrhuber, J., Olofsson, F., McLean, A., Griffiths, D., Collins, N., & Alexander, A. (2004). Live algorithm programming and a temporary organisation for its promotion. In Proceedings of the README Software Art Conference.