Since its early origins, music has been considered a medium for evoking emotions in listeners. Even though researchers still argue whether it elicits emotional responses in the audience or merely represents emotions, they agree that music provides a kind of emotional experience and affects our moods (Sloboda & Juslin, 2001).
In modern society several media are used to deliver emotions and experiences, and one of the youngest is the video game. Since the first games of the late 1970s, they have included sound effects and music to better support the narrative and to convey emotions to players.
In these interactive and immersive applications, however, the development engine has to adjust the musical score automatically according to the player’s interactions in order to deliver those emotions. Very often, pre-composed musical elements or pre-arranged sound effects are triggered by elements of the gameplay (Berndt & Hartmann, 2006).
One of the best examples of interactive music in the indie scene is FEZ, developed by Polytron Corporation in 2013. The game’s creator and designer is Phil Fish, its programmer is Renaud Bédard, and its soundtrack was composed by Rich Vreeland, also known as Disasterpeace.
A New Dimension
FEZ offers the player a highly dynamic world whose levels consist of non-Euclidean spaces known as “Rooms”. Gomez, the game’s protagonist, is a two-dimensional figure living inside a two-dimensional world. Like the protagonists of 8-bit and 16-bit games such as Super Mario (Super Mario Bros., Nintendo 1985), Gomez has impressive jumping abilities, which serve as the main element of gameplay in a world composed of various types of platforms. After a brief introduction, he encounters a mysterious being known as the “Hexahedron”, who gives him a magical fez hat that allows him to perceive the third dimension and to rotate the player’s perspective at will. As Gomez experiments with his new ability, the Hexahedron unexpectedly fractures and explodes, causing the game to glitch, freeze and reboot, complete with a BIOS screen. Gomez awakens in his room with his ability to perceive and manipulate the third dimension intact, and is charged with recovering the scattered fragments of the Hexahedron before the world is torn apart. Depth, or the Z-axis, is visible to the player only during the rotation of perspectives, and is not a factor in the actual obstacles and chasms that Gomez must traverse. The player must manipulate these perspectives to explore the world of FEZ and collect thirty-two cubes in the form of ‘cube bits’, ‘whole cubes’ or ‘anti-cubes’. Video 1 shows how the rotation of perspective allows the character to reach platforms too far away for his two-dimensional sight (Enns, 2013).
Music composition as note proximity instead of order
In a workshop entitled “Philosophy of Music Design in Games”, presented by Pyramind Studios and the Game Audio Network Guild in September 2012, Rich Vreeland talks about the process he used to compose the score for the game. He notes that:
“it was quite an interesting challenge because […] instead of thinking about order, like when things happen in the music, it was more about proximity, like which notes do I want to happen near other notes so that they sound pleasing. Which is kind of a weird thing to think about in music (29:41)”.
With the game’s soundtrack, Vreeland not only complements this exploration aesthetically through the combination of ‘chiptune’ sounds and studio effects such as reverberation and delay, but also incorporates it into the score’s production and composition. This concept is easily seen in the Temple Room (Video 2), where the music was composed to simulate the “rain phenomenon” with MIDI notes and chords: it opens with a chord representing the sound of thunder, while the rain particles are represented by fast, soft arpeggiated lines.
His quote from the 2012 workshop reflects his emphasis on a spatial rather than temporal conception of music, and coincides with his discussion of “Music Gameplay” and of “FEZZER”, the FEZ music system.
FEZ Music System Tools
Sequence Context Menu
Fezzer, the game editor, allows the developer to explore each section of the game with an outside camera. Because the editor operates in three dimensions, it is not necessary to modify the perspective to explore the game map. Figure 4 shows how it works: when a block is right-clicked, a context menu appears that allows the developer to choose the exact sound for each game element. It is used for both sound effects and musical elements. To load a sample or a MIDI file, the developer clicks the “Sequence…” button. In Figure 4, the composer has loaded two sounds, “3x_03” and “3x_04”. These refer to the bright, bit-crushed synth arrays that coincide with the appearance and disappearance of bright red blocks in the Music Rooms (Vreeland, 2013).
This example required close collaboration between the programmer and the composer: the former had to adapt the gameplay to match the rhythm of the music composed by the latter. The blocks do not merely appear together with the synth arrays; they appear on the beat of the level’s score. The same technique is used by Lena Raine, the composer for Celeste (Matt Makes Games, 2018). Vreeland said that this technique required thinking about music in terms of proximity rather than order, or in terms of spatiality rather than temporality. The ability to visualize the implementation of music in Fezzer was crucial to his composition process, as he would think about “which notes have to happen near other notes so that they sound pleasing” (Polytron Conference, 2013). The term “near” in Vreeland’s quote does not mean nearness in time, but spatial proximity between game elements. Thanks also to Bédard, the game’s developer, the editor supports this spatial conception of music by incorporating it into each element placed throughout the levels.
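The underlying timing logic can be illustrated with a small sketch. Assuming a constant tempo, an event requested mid-bar is simply quantized to the next beat of the score; the function name and values here are hypothetical, not taken from FEZ’s actual code:

```python
import math

def next_beat_time(current_time: float, bpm: float) -> float:
    """Return the time in seconds of the next beat at or after current_time."""
    beat_length = 60.0 / bpm               # duration of one beat in seconds
    beats_elapsed = current_time / beat_length
    return math.ceil(beats_elapsed) * beat_length

# A red block requested at t = 1.3 s in a 120 BPM song is delayed
# until the next beat, so that it appears in time with the music.
spawn_time = next_beat_time(1.3, 120)      # beats fall every 0.5 s -> 1.5 s
```

Deferring each spawn to the quantized time is what makes the blocks feel like part of the score rather than mere gameplay events.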
Unlike the context menu, the script browser affects an entire room rather than a single element. Scripts are programs written for a specific run-time environment that can read and execute tasks in an automated fashion; in other words, they are sets of tasks performed by programs that can interpret them (Guttag, 2016). Fezzer thus deals with its own type of script. The point of this definition is to convey the range of possibilities scripts open up: they can perform almost any function as long as they are interpreted by the host program.
The script browser window in Fezzer is structured as a table that lists each script under “Id”, “Name”, “Trigger”, “Condition” and “Action”. The “Id” is an identifying number, while the “Name” column shows the name given to the script. In Figure 5 it can be seen that the composer left the “Id” and “Name” fields at their default values. The “Trigger” is the event that starts the execution of a script. This particular window presents all the scripts used for one of the “Music Rooms”, which consist of altitude-sensitive musical objects. As seen in Video 2, when Gomez, the main character, ascends higher in the room, different musical elements are added to and subtracted from the mix. Each “Volume[x]” value contained in the Triggers indicates an altitude.
For example, script four has the trigger value “Volume, Go Higher”: each time Gomez rises above an altitude of “5”, the script is triggered. The altitude values are arbitrarily assigned to invisible blocks positioned by the programmer inside the Music Room. The “Condition” column allows any other conditions to be entered, such as the time of day or the number of cube bits the character has acquired. Finally, the “Action” column specifies which action will be taken when all the conditions are met.
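A row of this table can be modeled with a simple data structure. The sketch below only illustrates the general trigger/condition/action pattern, with hypothetical field values; it is not Fezzer’s actual code:

```python
from dataclasses import dataclass, field

@dataclass
class Script:
    """One row of the script browser: Id, Name, Trigger, Condition, Action."""
    id: int
    name: str
    trigger: str                      # event that starts the script
    conditions: list = field(default_factory=list)  # extra predicates on game state
    action: str = ""                  # what to do when everything matches

    def fires(self, event: str, state: dict) -> bool:
        """The script runs when its trigger matches and all conditions hold."""
        return event == self.trigger and all(cond(state) for cond in self.conditions)

# Hypothetical version of script four: fire when Gomez rises above altitude 5,
# but only at night (an example of a time-of-day condition).
script4 = Script(4, "default", "Volume[5], Go Higher",
                 conditions=[lambda s: s["time_of_day"] == "night"],
                 action="unmute loop")
```

Because the host program interprets the trigger, condition and action fields, almost any behavior can be wired up from the same three-column table.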
If Gomez ascends above the altitude marked by an invisible block as “5”, a new musical element enters the mix, and it remains there unless Gomez descends below the marked altitude. When this happens (that is, when Gomez descends below the designated altitude), players hear the opposite effect: the loop is muted again. In this sense, Vreeland’s composition for the game is genuinely interactive, what he calls “Music Gameplay”. Progress in the Music Rooms is signified by the soundtrack, which rewards players with more elements of the song as they approach the summit.
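This mute/unmute behavior can be sketched as a pure function of Gomez’s altitude; the thresholds and loop names below are invented for illustration and do not come from the game’s data:

```python
# Hypothetical thresholds: each loop enters the mix above its altitude.
ALTITUDE_LOOPS = {5: "arpeggio", 10: "pad", 15: "lead"}

def active_loops(altitude: int) -> set:
    """Return the loops that should be unmuted at a given altitude.

    A loop enters the mix when Gomez rises above its threshold and is
    muted again as soon as he descends below it.
    """
    return {loop for threshold, loop in ALTITUDE_LOOPS.items()
            if altitude > threshold}
```

Calling such a function on every altitude change and diffing the result against the currently playing set yields exactly the add-and-subtract behavior heard in the Music Rooms.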
Main Composition Sequencer
The main composition sequencer window is used mostly to determine the timing logic of the elements in one of Vreeland’s “songs”. Like the script browser, the main composition sequencer can make changes that affect an entire level; but, like the sequence context menu, it can also be used to tweak single musical elements. The song name can be entered or re-entered at the top of the window. The “Overlay Loops” list box displays all the loops that can appear in a level’s song, which can be added, removed and reordered with the buttons at the bottom.
Although it is not readily evident in Figure 6, it seems possible to select and manipulate more than one loop at a time for a faster workflow. Vreeland’s naming style for his loops can be seen in the example above and takes the form “[Song Name] ^ [Mode]_[Musical Element]_[Amount of bars]bars”. All the information about a given loop is in the file name, so no guesswork is needed to determine which loop is which. Notably, the “Musical Element” field does not adhere to any specific type of musical aspect, but serves solely to help programmers identify the loop. In some cases it identifies a featured instrument, as in “Bass”, while in others it identifies a melodic phrase in relation to others, such as “antecedent” and “consequent”.
The “Selected Loop Properties” area provides most of the functionality of the main composition sequencer window. The “Loop Filename” is visible at the top, with a browse button beside the text field. The “Trigger between after every…” area has two text fields, with scroll-arrow buttons, where a range of bars may be entered. In Figure 6, the song “Cycle” is split into many overlay loops, which play in the Puzzle Rooms according to the settings entered here. The “Trigger” section, for instance, denotes where the selected loop will play, within a given range if desired. This makes the actual song heard during gameplay slightly unpredictable, or aleatoric, as loops may come and go anywhere within these set ranges.
Below, the “Fractional time” checkbox allows irregular time signatures to be used in the deployment of loops. The “…and loop between…” section can be set to a range for the number of times the selected loop will play, another instance of aleatoric composition. The length of the selected loop may be entered in the “The loop is…” field, or it may be supplied automatically by the “Detect” button. Since Vreeland’s naming style already incorporates the length of the loop in bars, he likely never uses the “Detect” button. The “Delay first trigger by…” field sets the number of bars after which the loop is played for the first time. In this way, loops may be staggered so as to adhere more closely to the logic of a traditional song form.
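Taken together, these settings amount to a small stochastic scheduler. The sketch below models that behavior under stated assumptions (bar-based timing, uniformly random choices within each range); it is a reading of the UI, not Fezzer’s real implementation:

```python
import random

def schedule_loop(trigger_range, loop_range, loop_bars, delay_first=0, rng=random):
    """Yield (start_bar, repetitions) pairs for one overlay loop.

    trigger_range: (min, max) bars of silence before each entry.
    loop_range:    (min, max) consecutive repetitions per entry.
    loop_bars:     length of the loop in bars.
    delay_first:   bars to wait before the loop may play the first time.
    """
    bar = delay_first
    while True:
        bar += rng.randint(*trigger_range)   # random gap -> aleatoric entries
        reps = rng.randint(*loop_range)      # random repetition count
        yield bar, reps
        bar += reps * loop_bars              # advance past what was just played

# First entry of a hypothetical 4-bar loop, delayed by 8 bars:
gen = schedule_loop((2, 4), (1, 2), loop_bars=4, delay_first=8)
start, reps = next(gen)                      # start lands somewhere in bars 10-12
```

Running one such generator per overlay loop reproduces the slightly unpredictable song the player actually hears.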
The “Mute”, “Solo” and “Preview” buttons are used to preview the song or selected loop within the main composition sequencer window. Finally, the time of day checkboxes “Day”, “Night”, “Dawn” and “Dusk” may be checked to specify when the selected loop may play according to the game’s time system.
The bottom section of the main composition sequencer actually deals with sound effects: Vreeland wanted the eight cube bits that make up a full cube to have corresponding sounds that together form a full musical scale. The “Assemble Chord” drop-down menu allows the user to choose the chord to be assembled, while each drop-down menu in the “Shard Notes” area allows the user to choose a note for each cube bit to play.
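The effect of those menus can be modeled as a simple lookup from the nth collected cube bit to a note of the assembled scale; the note names below are hypothetical placeholders for whatever the composer selects in the drop-downs:

```python
# Hypothetical "Shard Notes" assignment: eight cube bits spanning a scale.
SHARD_NOTES = ["C4", "D4", "E4", "G4", "A4", "C5", "D5", "E5"]

def shard_note(bits_collected: int) -> str:
    """Return the note played when the nth cube bit (1-8) is collected."""
    return SHARD_NOTES[bits_collected - 1]

# Collecting all eight bits walks up the scale, so completing a cube
# doubles as a small musical cadence.
```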
- Sloboda, J. A. & Juslin, P. N. (2001). Music and Emotion: Theory and Research. Oxford: OUP.
- Vreeland, R. (2013). “Philosophy of Music Design in Games: An Audio for Video Games Workshop, Fez”. Pyramind Studios & Game Audio Network Guild.
- Swirsky, J. & Pajot, L. (2012). Indie Game: The Movie.
- Super Mario Bros. (1985). Nintendo.
- FEZ (2013). Polytron Corporation.
- Celeste (2018). Matt Makes Games.
- Guttag, J. V. (2016). Introduction to Computation and Programming Using Python.
- Bédard, R. (2013). Fez Technical Post-Mortem Slides. Polytron.
- Berndt, A. & Hartmann, K. (2006). Proceedings of the Audio Mostly Conference: A Conference on Sound in Games. Sweden.
- Enns, M. (2013). “Game Scoring: FEZ, Video Game Music and Interactive Composition”. University of Ontario.
- Domsch, S. (2013). Storyplaying: Agency and Narrative in Video Games.
- Egenfeldt-Nielsen, S., Smith, J. H. & Tosca, S. P. (2008). Understanding Video Games: The Essential Introduction.
- Juul, J. (2005). Half-Real: Video Games between Real Rules and Fictional Worlds.