of Electroacoustic Music
Winter Fragments was commissioned by Collectif et Cie of Annecy, France, and first performed on 21 November 2000 by the Ensemble Les Temps Modernes under the direction of Fabrice Pierre at the Festival Sons d'Annecy, Bonlieu.
The title alludes to the composer's experience of the winter landscape at his former home in upstate New York in 1999. [Score, notice] The piece is dedicated to the memory of Gérard Grisey, who had died a year before. This personal background is manifest in the melodic reference to Prologue for viola solo (1978), the opening piece of Grisey's cycle Les Espaces Acoustiques (1974–85).
In Winter Fragments Murail combines instrumental and electronic synthesis by means of computer-aided composition tools. Both the so-called "instrumental synthesis" characteristic of early spectral works such as Gérard Grisey's Périodes (1974) for seven instruments and Partiels (1975) for 18 musicians, and the electroacoustic production of carefully crafted electronic sounds are based on the analysis of pre-existing musical material, be it samples or notated material. The process uses software capable of analysing a sound file and determining parameters such as frequency and amplitude over time. The results are then used to formalise musical structures and to re-synthesise new sounds. These functionalities have been available since the 1990s through programs such as AudioSculpt and PatchWork, developed at IRCAM. Murail first used these tools in 1994 in L'Esprit des dunes for eleven instruments and synthesiser sounds (PatchWork), then in 1995 in Le Partage des eaux for large orchestra, and in Bois flotté for five instruments and electronics (1996) (AudioSculpt). [Hirs 2009a, pp. 8–9]
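The analysis stage described above can be illustrated with a minimal sketch (a toy example, not the actual IRCAM tool chain; the test tone and all parameter values are invented for the illustration). The fragment below picks the strongest spectral peaks of one frame of audio and returns their frequencies and amplitudes:

```python
import numpy as np

def strongest_partials(signal, sample_rate, n_partials=5):
    """Return (frequency, amplitude) pairs for the strongest
    spectral peaks of one frame of audio -- a toy version of the
    analysis stage behind instrumental synthesis."""
    windowed = signal * np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    # Local maxima: bins louder than both neighbours.
    peaks = [i for i in range(1, len(spectrum) - 1)
             if spectrum[i - 1] < spectrum[i] > spectrum[i + 1]]
    peaks.sort(key=lambda i: spectrum[i], reverse=True)
    return [(freqs[i], spectrum[i]) for i in peaks[:n_partials]]

# Synthetic test tone standing in for a sampled sound:
# a fundamental near F#4 (~370 Hz) with two weaker overtones.
sr = 44100
t = np.arange(sr) / sr
tone = (1.0 * np.sin(2 * np.pi * 370 * t)
        + 0.5 * np.sin(2 * np.pi * 740 * t)
        + 0.25 * np.sin(2 * np.pi * 1110 * t))
for f, a in strongest_partials(tone, sr, n_partials=3):
    print(round(f))  # prints 370, then 740, then 1110
```

A real analysis tool additionally tracks such peaks frame by frame, yielding the frequency and amplitude envelopes over time mentioned above.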
Rozalie Hirs' [2009a] analysis of Winter Fragments describes in detail the processes leading to the definition and transformation of "musical objects". By this Murail means "musical material that is short enough to be recognised as one entity and recognisable even after one or more transformations". [Hirs 2009b, p. 47] Their typology determines the form. As Hirs points out, "each musical object is a building block with a clearly defined identity, derived from real or virtual sound sources, acoustic or theoretical models". [Hirs 2009b, p. 47] These objects remain fundamental throughout the composition process and are subject to multiple transformations across the five sections of the piece (perhaps a further reference to the five-note opening cell of Prologue).
According to Hirs, the “musical objects” defined in Winter Fragments include the following [Hirs 2009c, p. 177]:
– "Initial call": a melodic cell obtained from transformations of a Mongolian chant [cf. Tüzün 2008, p. 303], used both for instrumental synthesis and for electroacoustic transformation.
– "Five-note neume": prefiguring the initial melodic cell of Grisey's Prologue.
– Piano sample (F#4 and F#5): its spectral analysis includes frequency, amplitude and duration. A selection of partials is used as carrier input in ring-modulation synthesis.
– Tam-tam sample: this sample was later also used by Murail in Pour adoucir le cours du temps for 18 instruments and electronics (2005). In Winter Fragments the result of its spectral analysis was simplified by filtering out some frequencies, which were later varied using different transpositions. This new material was then re-synthesised and used both as electroacoustic sound and as harmonic material. [Hirs 2009c, p. 180]
– The sound of a glass cup hit by a spoon ("verre7"): its analysis yields a chord containing two minor sevenths and a minor sixth (approx.). This analysis is not treated as an independent "musical object"; however, it governs the harmonic relations between the instruments, producing, for example, the characteristic timbre of the cascading figures in sections III and IV. [Hirs 2009c, pp. 184–85]
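The ring modulation mentioned for the piano sample can be sketched as follows (the carrier and modulator frequencies below are invented for the example, not taken from Murail's analysis data). Ring modulation is the multiplication of two signals; the product contains the sum and difference of the input frequencies while suppressing the originals:

```python
import numpy as np

sr = 44100
t = np.arange(sr) / sr

# Hypothetical carrier: a single partial near F#4 (~370 Hz),
# standing in for one line of a piano analysis; the 100 Hz
# modulator is likewise invented for the example.
carrier = np.sin(2 * np.pi * 370 * t)
modulator = np.sin(2 * np.pi * 100 * t)

# Ring modulation is sample-by-sample multiplication: the result
# contains the sum (470 Hz) and difference (270 Hz) frequencies,
# but neither of the two input frequencies.
ring = carrier * modulator

spectrum = np.abs(np.fft.rfft(ring * np.hanning(len(ring))))
# With 1 Hz bin resolution, the two loudest bins are the sidebands.
peaks = sorted(int(i) for i in np.argsort(spectrum)[-2:])
print(peaks)  # → [270, 470]
```

Applied to a whole set of analysed piano partials, this operation yields the kind of inharmonic, "distorted piano" spectra the score alludes to.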
In the introduction to the score, Murail refers to these objects as “fragments”:
“un fragment mélodique répétitif, un contour de cinq hauteurs
quelques accords distordus d’un piano imaginaire
échos en souffles synthétiques des flûte et clarinette
un son de verre, un son de tamtam, analysés, manipulés, reconstruits”
“a repetitive melodic fragment, a contour of five pitches
some distorted chords from an imaginary piano
echoes in synthetic breaths from the flute and clarinet
a glass sound and a tam-tam sound, analysed, manipulated, reconstructed”
Besides the extensive use of morphing processes through permutation and interpolation, and the use of the sound-envelope model (attack, decay, sustain, release) as a formal device, [Hirs 2009c, p. 186] the overall form of Winter Fragments presents a processual yet non-linear development leading from the "Initial call" at the beginning to the melodic cell of Prologue, which appears as a literal quotation, in the violoncello, only at the very end of the piece:
“Le fragment mélodique prolifère en tourbillons, en descentes cadentielles, se métamorphose doucement, révèle son origine à la toute fin de la pièce (la cellule initiale de ’Prologue’ de Gérard Grisey).”
“The melodic fragment proliferates in whirlwinds, in cadential descents, metamorphoses slowly, reveals its origin at the very end of the piece (the initial cell of ‘Prologue’ by Gérard Grisey).”
The electronic part consists of two layers: a large number of pre-composed samples triggered by the keyboard player, and the amplification of the instruments. Each layer is spatialised in real time through its own quadraphonic loudspeaker setup. The spatialisation of the samples is based on a set of mathematical and hand-drawn functions that define specific movements from one point to the next. The spatialisation of the five instruments follows a set of ten configurations in which the instruments can at times be extremely dislocated, generating a double image superimposed on the acoustic sound.
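As a sketch of the general technique of turning such a trajectory function into speaker gains (an assumption for illustration, not the actual implementation of the patch), the fragment below samples a circular movement and converts each position into four equal-power gains for a quadraphonic square layout:

```python
import math

def quad_gains(angle):
    """Equal-power gains for a square quadraphonic layout with
    speakers at 45, 135, 225 and 315 degrees, given the source
    angle in radians (pairwise cosine-panning sketch)."""
    speakers = [math.pi / 4, 3 * math.pi / 4, 5 * math.pi / 4, 7 * math.pi / 4]
    gains = []
    for s in speakers:
        # Angular distance between source and speaker, folded to [0, pi].
        d = abs((angle - s + math.pi) % (2 * math.pi) - math.pi)
        # A speaker contributes only within 90 degrees of the source;
        # cos(d) makes the squares of the two active gains sum to 1.
        gains.append(math.cos(d) if d < math.pi / 2 else 0.0)
    return gains

# A simple "mathematical" trajectory: one circular revolution
# sampled at eight points in time.
for k in range(8):
    print([round(g, 2) for g in quad_gains(2 * math.pi * k / 8)])
```

A hand-drawn trajectory would simply replace the circle with an arbitrary sampled curve of angles (or x/y positions) over time.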
Murail's decision to create the electronic sounds in the studio and to limit the live processing to spatialisation and the performance of the sampler can be explained by the technical limitations he encountered at the time. [Hirs 2009c, p. 177] At the same time, it illustrates the evolving understanding of live electronic practice at the turn of the millennium, insofar as the opposition between real-time and deferred-time processes is abandoned in favour of an integrative approach.
Winter Fragments can optionally be accompanied by video projection. The video was created by the visual artist Hervé Bailly-Basin (b. 1958), who also realised videos for other works by Murail, including Treize Couleurs du soleil couchant for five instruments (1978), Bois flotté for five instruments and electronics (1996) and Liber fulguraris for ensemble, electronics and video (2008). [Cf. https://www.tristanmurail.com/en/video-WinterFragment.html, last accessed on 4 August 2021]
a) Score and parts
Source: Editions Henry Lemoine
The score contains:
– “Notice” with information about the work’s poetic conception and dedication
– List of instruments
– List of technical specifications and layout of the instruments on stage and the electronic setup
– Explanation of the rhythmic notation and the symbols used in the score
The resulting electronic sound is notated on four staves combining pitch, rhythmic values, space notation and graphic representations for each sample. Actions on the keyboard are notated on a dedicated staff below; the remaining instruments follow underneath.
The music is written without bar lines. Instead, marks above each system, in most cases corresponding to a quarter note, define the length of the metric units. Of the individual instrumental parts, only the MIDI keyboard part is written in measured notation.
Source: Editions Henry Lemoine
Date: November 2000, modified 2004, 2010, 2016 and 2018
Title: Winter Fragments
Author: Ph. Moënne-Loccoz
Format: Max 8; 44.1 kHz, 16 bit
The patch folder contains a readme file with information on the functionality of the Max patch. In the patch itself, a subpatcher (p doc) contains information on the samples and the spatialisation trajectories.
The technical rider lists the equipment needed for electronic sound production:
– Six microphones
– One 12×8 mixing console
– Eight loudspeakers
– One or two stereo reverb units
– Five-octave MIDI keyboard with easily accessible program changes and a sustain pedal
– A fast Macintosh computer with 500 MB of non-fragmented disk space, a MIDI interface, and a sound card with eight inputs and outputs
The spatialisation can be reduced to four diffusion channels (instruments and electronic sounds are then mixed onto the same speakers).
The rider also includes a diagram showing the connections of all channels to the mixing console.
Plan de scène
This document describes the exact positions of the instruments and the conductor on stage. The screen (for the version with video) is also indicated.
The lighting for the conductor and the musicians is also precisely listed:
– Conductor's platform with a lighted conductor's music stand
– 10 lighted music stands
– Lighting for the piano music stand as well as for the conductor (double or lateral)
Eight loudspeakers are required: four for the samples and four for the spatialisation of the instruments, as well as a subwoofer. An optional configuration with only four speakers is also described. (see Fig. I)
A minimum of five microphones is prescribed for the amplification of the instruments. In order to add reverb to the instruments, one or two external reverb units are needed.
The keyboard player needs a MIDI keyboard connected to the Max patch to trigger the samples. Triggering a sample also recalls the spatialisation and level presets of the acoustic instruments; the spatialisation of the samples and of the instruments is thus fully automated. However, the sound engineer can control some parameters in the Max patch.
The Master Electronic level fader controls the level of the samples.
A menu box allows the user to switch between sin1 and lin1; this parameter determines whether the interpolation of the spatial movements is exponential or linear.
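The difference between the two modes can be sketched as follows (the mode names are taken from the patch, but the exact curves are not documented here, so the half-cosine reading of sin1 is an assumption made for the illustration):

```python
import math

def interpolate(start, end, t, mode="lin1"):
    """Interpolate a spatial coordinate between two cue values;
    t runs from 0 to 1. 'lin1' is a straight linear ramp; 'sin1'
    is modelled here as a half-cosine ease-in/ease-out curve
    (an assumed reading of the patch's mode names)."""
    if mode == "lin1":
        w = t
    else:  # "sin1"
        w = (1.0 - math.cos(math.pi * t)) / 2.0
    return start + (end - start) * w

# Halfway through a movement from front (0.0) to rear (1.0):
print(interpolate(0.0, 1.0, 0.5, "lin1"))   # 0.5
print(interpolate(0.0, 1.0, 0.25, "sin1"))  # below 0.25: slower start
```

The practical difference is audible at the start and end of each movement: the non-linear mode eases in and out of a position, while the linear mode moves at constant speed.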
The individual output levels of the instruments can be adjusted during the performance with an additional fader to the left of the spatialisation box. However, these parameters are automated and will be overwritten by the next cue. It is possible to store new values in the ten presets controlling the instruments' levels and their spatialisation.
The patch contains two reverberation units for the samples. Some of the parameters can be edited: short echoes level, long echoes level, lowpass, and room size. Other parameters are automated: reverb level, reverb time and dry/wet mix.
Additionally, the general level settings of the speaker outputs can be stored and recalled in the patcher p test audio. This feature allows the piece to be performed without an additional audio mixer.
The concert took place on 18 January 2019 in ZHdK's main concert hall. The performers were the ensemble Arc-en-Ciel conducted by Simeon Pironkoff, with Germán Toro Pérez (sound projection) and Leandro Gianini (sound engineer). The piece was performed without the optional video projection.
The documentation contains an inconsistency with respect to the placement of the musicians: in the stage plot (Plan de scène, see above) the positions of the violin and the cello are switched compared to the score and the technical rider (see above). The positioning indicated in the latter two documents was adhered to for this performance.
As mentioned above, the technical documentation requires four speakers for each layer. The reason for this is not entirely clear. The two sample speakers on stage are placed behind the musicians, probably to enable monitoring. In general, it can be argued that differences in speaker types and positions yield a specific sound for each layer and therefore a more precise spatialisation of the instruments. Accordingly, we decided to use different speaker types for each layer and to position them at different heights. Four L-Acoustics 108P were used for the samples and four Meyer Sound UPM-1P for the spatialisation of the instruments. Additionally, two d&b Y-SUB (18"/12", cardioid) subwoofers were placed front left and right. The UPM-1P formed a lower layer at approximately 2–3 metres height and the 108P an upper layer at approximately 5–6 metres. The two layers created more depth and helped to achieve a sense of enhanced transparency. With this setup the sample sounds created a surround feeling and the amplification was more present and direct. It is, however, difficult to draw an objective conclusion on how it affected the perception of the spatialisation.
Understanding the relationship between the sound coming from the acoustic instruments on stage and the amplified, panned signal coming from the loudspeakers is a demanding task. The shape, acoustics and dimensions of the venue, as well as the placement of the audience, greatly influence how the spatial movements and relations are perceived. A spatialised signal coming from speakers placed at the back of the venue will be barely audible to listeners sitting close to the stage, where the real source originates. There will always be an at times considerable gap between the intended panning of the instruments as defined in the software and the perceived result.
There is no clear indication of how the reverberation should be applied to the live instruments. There are several options: one is to apply mono, stereo or quad reverbs to the spatialisation output; another is to apply mono reverbs to the spatialisation input. We decided to apply one stereo reverb unit to the front speakers and another to the two rear speakers, with slightly different parameters (longer reverb time and less pre-delay for the rear). This proved to be the most effective solution.
Intonation and Monitoring
It is very important to rehearse the tuning and synchronisation of samples carefully together with the musicians and the conductor. The patch provides a 442 Hz signal for initial tuning.
The musicians must be able to hear the samples clearly in order to stay synchronised with them and, according to the composer, to adjust the tuning of the microtones to them. [cf. Hirs 2009a, p. 14] In large venues this can become a serious issue, and monitoring may have to be considered.
Difficulties presented by the score
The non-measured notation presented difficulties for the musicians during the first rehearsals. To allow for a more accurate and effective execution, it was necessary to define the bars and metric units to be shown by the conductor. The distribution of measures as indicated in the MIDI keyboard part was adopted for each instrument.
Murail, Tristan (2000b): Winter Fragments, Note. http://www.tristanmurail.com/en/oeuvre-fiche.php?cotage=27488 (last accessed on 16 July 2020)
Hirs, Rozalie (2009a): Interview with Tristan Murail, April 10, 2007. In: Hirs, Rozalie & Gilmore, Bob (eds.): Contemporary Compositional Techniques and OpenMusic. Paris: Editions Delatour, 2009, pp. 7–14.
Hirs, Rozalie (2009b): On Tristan Murail's Le Lac. In: Hirs, Rozalie & Gilmore, Bob (eds.): Contemporary Compositional Techniques and OpenMusic. Paris: Editions Delatour, 2009, pp. 45–89.
Hirs, Rozalie (2009c): Frequency-Based Compositional Techniques in the Music of Tristan Murail. In: Hirs, Rozalie & Gilmore, Bob (eds.): Contemporary Compositional Techniques and OpenMusic. Paris: Editions Delatour, 2009, pp. 93–176.
Tüzün, Tolga (2008): An Analysis of Tristan Murail's «Winter Fragments». In: Reigle, Robert & Whitehead, Paul (eds.): Proceedings of the Istanbul Spectral Music Conference. Istanbul: Pan Yayincilik, 2008, pp. 296–317.