
BWW 2021 - The "Midway" Point

  • Writer: Petra Mickey
  • Feb 14, 2022
  • 20 min read

Updated: Mar 10, 2022

I have started working on the sounds that will be required for the game, as well as composing some sounds for the video that has been given to me. So far I have made great progress, with the video in question sounding like this:



So first off, I had to import the video into Logic, so that I could see what I was doing and compose the sounds in time with the image.


I first made the score for the game. A score is a non-diegetic sound (one that is not heard within the actual setting of the game by any of the characters); it is solely the music that accompanies the game to make it more intriguing, captivating and emotional for the player. The score I have composed consists mostly of traditional Ancient Greek instrumentation. I researched what sorts of instruments were used in Ancient Greece, in order to give the music cultural significance, allowing it to be accurate to the premise of the game (which, according to the developer, is set in Ancient Greece), as well as to instantly evoke an association with Ancient Greece for the player.


I researched which instruments were traditionally used in Greece and how Greek music was composed. I used this page, https://kotsanas.com/gb/cat.php?category=21, to get an idea of what instrumentation was used in Ancient Greece. As expected, the lyre was probably one of the most popular instruments in Greece, with gods such as Hermes often being depicted holding one (The Editors of Encyclopedia Britannica (2014)).


I therefore downloaded a SoundFont (SFZ file) of a lyre. A SoundFont is a file format of samples that allows them to be played back in samplers such as the Sforzando player, as a virtual instrument, using MIDI (musical instrument digital interface) (PCMAG (n.d.)). Musical Instrument Digital Interface, or MIDI for short, is a digital language/protocol created in the 80s as a way of allowing multiple instruments and computers to communicate with each other and trigger existing sounds and samples from a single set of controls.


Originally, MIDI was intended to minimise the separate instruments and gear required on stage up to that point, especially by prog and metal musicians, who often had huge keyboard racks or giant multi-FX processors for their guitars in order to access all the sounds they required for a performance. With MIDI, a plethora of sounds was instantly at the user's fingertips, able to be triggered with just one keyboard, or one guitar equipped with a MIDI pickup (which converts the analogue guitar signal into MIDI data), such as the Hamer Phantom A7 notably used by Glenn Tipton of Judas Priest. Even drum triggers, electronic drum kits and drum machines can be used to trigger different MIDI sounds (often the sounds of real drums, or synthesised ones such as the 808 sub kick). MIDI became an industry standard in 1983 (Encyclopedia Britannica (n.d.)).


The lyre SoundFont, played back in the Sforzando player, was of decent quality, which meant the sound I got out of it was pretty realistic, despite it being a free SoundFont (which are often of poor quality). I originally intended to use my guitar, paired with a virtual MIDI converter (Jam Origin MIDI Guitar 2), to control the lyre sounds. This would give it a more human feel, with the slight nuances of the picked strings hopefully making it sound more like a real string instrument, as opposed to a triggered sample. However, I decided against it, because lyres differ drastically from the contemporary electric guitar: the intonation and playing style are very different, as is the way the notes are laid out on a guitar fretboard compared to a lyre, which has no fretboard at all and is more reminiscent of the contemporary harp, essentially being an ancestor of the harp and belonging to the harp family of instruments.


Instead, I programmed the lyre in manually, sequencing in slow riffs and ostinatos. A riff is a selection of notes (and/or chords, often power chords) that serves as the foundation and main focus of a piece; an example occurs in the intro to "Postmortem" by Slayer, which kicks off the piece with a memorable hook. An ostinato, on the other hand, is a repeated group of notes that often lasts throughout the piece and is less in focus than a riff, acting more as a rhythm/backing part, in a similar vein to a chord progression; an example occurs in Pachelbel's Canon (lowlevelowl911 (2017)).


The riffs that I sequenced were intended to sound mysterious and rather melodic, in order to convey tension as well as a light tinge of the majesty/divinity and ethereality that might better represent the gods. In a sentence, you could describe the music as "the calm before a storm", mainly because while the character is walking around and swinging their weapon, there are no explicit cues to suggest that a battle is happening; this is just the regular "walking/exploring the map" type of scenario.


Most of what I have composed is chromatic, meaning that it uses any of the twelve notes available in modern music (Theory, D. @ H.M. (n.d.)). I deliberately used the chromatic scale for two key reasons. Firstly, because it allows dissonance and has a harsher, darker sound: since any of the twelve tones can be used, some interesting and dissonant combinations can be made, such as minor seconds, which consist of two adjacent pitches a semitone (half step, so one key onward on a regular piano keyboard) apart, for example B and C. These create an ominous sound, especially when played one after another, as in the main riff of the score for Jaws. The tritone is also present everywhere in the chromatic scale. The tritone is the interval of an augmented fourth, two notes that are three whole tones (six semitones, since one tone is equivalent to two semitones) apart, creating a really dissonant, dark sound, which is especially ominous when the notes are played one after the other in a descending fashion, as heard in Black Sabbath's title song, "Black Sabbath".
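To make the interval arithmetic above concrete, here is a small illustrative sketch (not part of the project itself) that measures intervals in semitones using pitch-class numbers:

```python
# Pitch classes of the twelve chromatic notes, C = 0 through B = 11 (sharps only).
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def interval_semitones(note_a, note_b):
    """Ascending distance from note_a to note_b, in semitones (0-11)."""
    return (NOTES.index(note_b) - NOTES.index(note_a)) % 12

print(interval_semitones("B", "C"))   # 1 -> a semitone apart, the ominous "Jaws" pair
print(interval_semitones("C", "F#"))  # 6 -> the tritone (three whole tones)
```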


Another important reason for my use of the chromatic scale specifically is that Ancient Greek music was very often dissonant and, to contemporary Western musicians, would sound "out of tune" and messy. This was because the Ancient Greeks had a different notation system, which differed vastly from the contemporary industry-standard notation used in Western music, and their music as a result consisted of many pitches, some of which were "microtonal" (Hewitt, M. (2014)). This means that not only were there many pitches in use, but even pitches in between those pitches, essentially a quarter-tone or less apart. Whilst contemporary Western musicians measure the fretboard in semitones (half steps), for example C to C#/Db to D and so on, the musicians of Ancient Greece had many more pitches to play with, as they measured their instruments' ranges in steps much smaller than semitones, for example C to C half-sharp/D half-flat (essentially between the C and the C#/Db note), and then up to D (Nettl, B. (2016)). This meant that what I composed needed to be dissonant, unrestricted and "weird" sounding to a point, in order to be believable.


The passage sequenced for the lyre is brought to life, given atmosphere and accentuated by the sound of a pan flute. The pan flute, or pan pipes, is a woodwind instrument that originated in Ancient Greece and is associated with, and named after, the great horned god Pan (also known as Cernunnos in the Celtic pantheon, and Faunus in the Roman pantheon) (Study.com (2019)). The horned god, often worshipped today as a patron god by modern-day Wiccans, is associated with nature, and likewise the pan pipes/pan flute is often associated with nature too (GreekMythology.com (2018)). This meant that I could use the pan flute, thanks to its cultural significance, to further reinforce the feel of the setting shown in the video, which is mostly a meadow/forested area. It reinforced the premise of the game, gave it a more "folk" and natural feel, and brought connotations of Ancient Greece, alongside just sounding amazing for what, once again, is a free SoundFont.


The pan flute and the lyre complement each other, with the pan flute playing drones that are in key with, and harmonise with, the riff of the lyre, creating faux "chords" from the two instruments' tones. The chords created are a progression of power chords, which are dyads (two different tones (www.jazz-guitar-licks.com (n.d.))) a perfect fifth apart, as well as minor dyads (which consist of a root and a minor third: the root being the original note, played by the pan flute in this scenario, for example an A, and the minor third being the third note of the root's scale, so C if the root is A), to create a mournful, dark atmosphere, as well as dominant sevenths to create dissonance and darkness. A dominant seventh is essentially a major seventh chord in which the seventh has been lowered by a semitone: for example, an A major seventh chord consists of the notes A C# E G#, and to get the dominant seventh, the seventh (the G# in this case) is moved down a half step, resulting in A C# E G. In dyad form, the lyre would play the seventh while the pan pipes played the root (Liberty Park Music (2017)).
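The chord spellings above can be double-checked by writing each quality as semitone offsets from the root (an illustrative Python sketch, not project code):

```python
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

# Chord qualities as semitone offsets from the root note.
CHORDS = {
    "power dyad":       [0, 7],        # root + perfect fifth
    "minor dyad":       [0, 3],        # root + minor third
    "major seventh":    [0, 4, 7, 11],
    "dominant seventh": [0, 4, 7, 10], # major seventh with the 7th lowered a semitone
}

def spell(root, quality):
    """Spell a chord as note names, using sharps only."""
    i = NOTES.index(root)
    return [NOTES[(i + off) % 12] for off in CHORDS[quality]]

print(spell("A", "major seventh"))    # ['A', 'C#', 'E', 'G#']
print(spell("A", "dominant seventh")) # ['A', 'C#', 'E', 'G']
```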


Neither instrument played "real" chords: not dyads (two notes), nor triads (three notes), nor any other chord groupings. This is because Ancient Greek music was supposedly monophonic, meaning that no more than one pitch would be played at the same time by a single instrument, so chords were formed by multiple instruments playing together, in a similar vein to how an orchestra or choir plays/sings chords by having multiple players/singers perform different notes in harmony with each other (public.wsu.edu (n.d.)).


To add to the ethereal and mysterious feel of the score, I added some choral lines: long, drawn-out vocal phrases from Soundiron's Requiem Light choir. These choral patterns create a mysterious, almost ghostly ambience, singing Eb5, Ab5 and Db5 chords with their low voices to give an ancient, "chant"-like feel. Despite Greek music being monophonic, I sequenced polyphonic patterns into the choirs: individual voices are monophonic anyway, so for the result to be polyphonic, multiple singers have to sing different notes. This is similar to what I had already done with the harmonies between the different instruments, and would actually have been feasible in Ancient Greece, where, despite no chords being played on single instruments, different monophonic instruments did play and harmonise together. A choir of multiple singers works the same way: thought of as one entity, the choir might look like an exception to the no-polyphony rule, but it is really multiple singers' monophonic voices blending together, so it works in this context. A vocal harmony in a choir context is when two or more singers sing two or more pitches that harmonise together, in the same way a guitar player may strum two or more strings to make a chord, or a rhythm and a lead guitarist may harmonise, with the rhythm playing the foundation of a piece, such as a riff, and the lead playing the same riff an interval above it (such as a fifth, i.e. seven semitones up) (Harmony Helper (2017)).


To finish off the score, I added some percussive embellishments with drums. These serve to accentuate the piece, adding dramatic accents and hits that in turn help make it sound darker, more intense and more dramatic, with the pounding of the drums building suspense and creating a "battle" feel. The percussion section is very important in cinematic music for these reasons, but also because drums and other percussive instruments very often add power and act as the driving force of a piece. This is seen in many game and film scores, my favourite example being the terrifying sound of the large drums in Ghom's Larder in Act 3 of Diablo 3. As in many scores, a percussion section consisting of multiple large, dark, cinematic-sounding drums, such as Japanese taiko (war drums used by samurai), pounded in unison, creates a dark, heavy, angry and intense sound that lifts the rest of the instruments onto another level (Anon (n.d.)).


Because the game's premise is Greek, using Japanese war drums such as taiko would sound out of place. Instead, I used regular large toms, which are used in European music and descend from tupans, tabors and similar war drums, and which would sound much more Greek than the taiko. For the toms, I used the toms from Steven Slate Drums 5.5, which I layered with samples of huge toms obtained from a free sample pack on YouTube, essentially triggering the toms. Triggering is when a drum sound is blended with a sample of another drum sound, which plays simultaneously when the original drum is hit, creating a huge drum tone from the blend of two separate drum tones. This technique is often used in certain styles of metal and rock, where drummers electronically trigger their acoustic bass and snare drums and blend them with samples of different bass and snare drums to create a bigger, heavier or fatter sound, for example by blending a regular bass drum with an 808 sub-kick sample; some drummers in modern metal styles also use samples to cheat, entirely replacing their badly played acoustic drums with pristine-sounding electronic samples (Corporation, R. (n.d.)).


With these huge-sounding toms, I created pounding drum accents by having them come in at unexpected moments, playing "military" beats that emulate marching/war drum patterns, which ended up sounding very much like war drums and added that impending-battle feel to the piece. To get a better sound from the drums, I ran them through an EQ, cutting certain muddy and unpleasant frequencies in the mids and boosting the low end with a low shelf to give the drums more punch and a darker sound. I also applied a high shelf to increase presence, ensuring that the drums have a sharp transient and cut through the mix.


As a final step in completing the score, I humanised the instruments in the piece. Humanisation adds slight nuances to the timing and fluctuations to the velocities. By humanising the virtual instruments, I ensured that they sound organic and more like real instruments played by real people, rather than samples programmed into a computer, which hit at the same velocity and always land exactly on the grid; that robotic quality detracts from the realistic, immersive aspect of the game and creates the feel of an amateurish, robotic and/or lazy score (www.blackghostaudio.com (n.d.)).
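The idea behind humanisation can be sketched in a few lines of Python (purely illustrative; the project used Logic's own humanise functions, and the jitter amounts here are assumptions):

```python
import random

def humanize(notes, timing_jitter=0.01, velocity_jitter=8, seed=None):
    """Nudge each (start_time_sec, midi_velocity) pair by a small random
    amount, so programmed parts stop landing identically on the grid."""
    rng = random.Random(seed)
    out = []
    for start, velocity in notes:
        start = max(0.0, start + rng.uniform(-timing_jitter, timing_jitter))
        velocity = min(127, max(1, velocity + rng.randint(-velocity_jitter, velocity_jitter)))
        out.append((start, velocity))
    return out

grid = [(0.0, 100), (0.5, 100), (1.0, 100), (1.5, 100)]
print(humanize(grid, seed=42))  # same notes, slightly off-grid, with varied velocities
```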


For the game's sound library, I used an array of different synth and sampled sounds that emulate the perceived sounds of what is going on in the video, what the character is doing, and what will be happening in the game. A sound library in a game is a directory that holds all the game's sound files making up the non-musical/non-soundscape sounds of the game. These sounds are diegetic, meaning that they are heard both by the player in the real world and by the characters in the game's world. They could be things like the swinging/whistling of swords and arrows, or the clanging of armour and shields, but they also include much subtler things, such as breaths or footsteps, which differ under different conditions, for example on different surfaces like grass or sand. A sound library is essentially composed of multiple WAV files, each a one-shot of a single sound; this allows them to be played back individually, in any order and any place the game developer chooses, essentially like samples in a sampler (Shylenok, P. (2019)).


For the sounds in the sound library, I was given requirements consisting of footstep sounds, weapon sounds, armour sounds, power sounds, human grunts and water sounds. For most of these, I used an array of synthesiser sounds made mostly with the ESP synth engine in Logic and TableWarp2 in the Sforzando player, as well as sampled sounds (recorded one-shots) played back in the TX16Wx sampler by CWITEC Audio. The battle grunts consisted of a regular audio recording.


To make the sound of footsteps on grassland, I made two individual sounds with the ESP and layered them, essentially playing them both over each other. The first sound was the initial impact of the character's foot hitting the ground. Because grass grows on mud and dirt, the impact would be softer, as dirt is a slightly softer material and the grass cushions the fall. For this, I set the attack to be fast, but with a little softness, so that it would not be an instant impact.


I turned down the gain on all the oscillators but left the white noise at full level, ensuring an unpitched sound, similar to a drum in that it has no definite pitch and instead consists of an impact with multiple overtones and pitches (TheFreeDictionary.com (n.d.)). I then pitched the sound down. Frequency, also referred to as pitch, is essentially a measure of how many times a sound wave peaks per second: the more peaks present in a sound wave, the higher pitched it will sound. Having turned the frequency right down, I had a very dull, bass-oriented sound, very thick and muddy (pun intended), a bit like feet hitting dirt/mud-based ground.
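As a rough illustration of this layer (noise dulled down into a low thud, with a softened attack), here is a sketch in plain Python; the sample rate, filter coefficient and envelope times are all assumptions, not the actual ESP settings:

```python
import math
import random

SR = 44100  # sample rate in Hz (assumed)

def footstep_grass(duration=0.15, attack=0.02, seed=0):
    """Toy grass-footstep layer: white noise run through a crude one-pole
    low-pass (the "pitched-down" dull thud), shaped by a soft attack ramp
    and an exponential decay. All parameter values are illustrative."""
    rng = random.Random(seed)
    n = int(SR * duration)
    samples, prev = [], 0.0
    for i in range(n):
        noise = rng.uniform(-1.0, 1.0)
        prev = prev + 0.05 * (noise - prev)  # one-pole low-pass: dulls the noise
        t = i / SR
        env = min(1.0, t / attack) * math.exp(-t * 25)  # soft attack, fast decay
        samples.append(prev * env)
    return samples

step = footstep_grass()
print(len(step), max(abs(s) for s in step) <= 1.0)
```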


In order for this sound to cut through, I ran it through Logic's Channel EQ, where I boosted the high end of the sound with a high shelf, an EQ move that boosts everything above a corner frequency with a flat, shelf-like curve, rather than cutting it. Boosting the high end enabled the transient to cut through the mix, in a similar vein to how one would EQ a bass drum for a clearer transient. The transient of a wave is the initial impact where the wave starts; in this case it would be the initial sound of the foot hitting the ground, which is quieter here due to the slower attack (J. Sturgis Tones (n.d.)).


To clear up the sound, I made a series of cuts in the mids that removed harshness and muddiness. I also applied a high-pass filter, which cut the low bass/sub frequencies that would otherwise dominate the sound and cause it to clip during mastering.
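For illustration, a first-order high-pass (far simpler than Logic's Channel EQ band, with made-up parameter values) shows what the cut does to static low-frequency content:

```python
import math

def high_pass(signal, cutoff_hz, sample_rate=44100):
    """First-order high-pass filter: a toy stand-in for the EQ's high-pass
    band, removing content below the cutoff while passing fast changes."""
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    alpha = rc / (rc + dt)
    out = [signal[0]]
    for i in range(1, len(signal)):
        out.append(alpha * (out[-1] + signal[i] - signal[i - 1]))
    return out

dc = [1.0] * 4410  # 100 ms of pure DC "rumble" that a high-pass should remove
filtered = high_pass(dc, cutoff_hz=80)
print(abs(filtered[-1]) < 0.01)  # the static low-frequency content dies away
```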


To complete the grass impact sound, I layered some bright white noise, with a slow attack and medium sustain, over the top, emulating the swishing of the grass itself.


To add realism to both sounds, I automated small parameters of the synth to be "random", and humanised the velocity and timing offset of the impacts. This adds realism because real footsteps have subtle nuances in each step that a synthesiser does not replicate; left alone, it robotically and monotonously produces exactly the same sound, at the same velocity and at the same time, every time. That would make the walking and movement of the characters seem very robotic and mechanical, taking away from the organic, ancient and very human feel of the game, as well as generally lacking realism (Production Music Live (n.d.)).


I repeated the process, to a similar extent, for the sound of footsteps on stone. During the video, the character is seen walking on stone, so I felt it was necessary to vary the footstep sounds and nuances depending on where the character steps.

This was inspired by the foley in the popular sandbox game Minecraft, which has footstep sound effects for every single block in the game, so as to add to its realism and immersion. For example, a player walking on sand in Minecraft sounds different from the same player walking on dirt, snow, grass or stone. I drew heavy inspiration from Minecraft when making the footstep sounds, in an attempt to make the game more immersive and realistic (The Ultimate Resource for Video Game Design (2018)).


For the sound of footsteps on stone, I again used the ESP synth engine and followed a similar process to the footsteps on grass, but this time I cranked the attack to be faster, giving a sharper transient and more impact. I turned down the frequency and made the sound punchy while still retaining a sharp, clear transient, again somewhat like a bass drum. This sound was intended to have a sharper transient in order to emulate the sharp, staccato sound made when a foot strikes a hard surface like stone or concrete, which is less resonant and tighter, hence the sharper attack. I boosted the highs and high mids on the EQ to make the transient really cut through, and applied a high-pass to reduce the overpowering bass and sub-bass frequencies. I also applied a gentle low-pass, so that the sound would not be too airy or "clicky".


For both of these footstep sounds, I applied a small amount of compression using Native Instruments' Supercharger. This ensures that the transients do not clip the track while remaining at their optimal volume. Clipping happens when a sound wave's output is too loud for the speakers, which results in the edges of the waves being "clipped", essentially squared off at the amplitudes the speakers cannot handle, producing a bit-crushed/distortion-style effect that can damage the speakers and take away from the clarity of the sounds (McAllister, M. (2018)).
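Clipping itself is easy to picture in code; this toy sketch simply squares off a too-loud sine wave at full scale:

```python
import math

def hard_clip(samples, limit=1.0):
    """What happens when a wave exceeds full scale: every sample beyond
    the limit is flattened to the limit, squaring off the peaks."""
    return [max(-limit, min(limit, s)) for s in samples]

# A sine wave 50% too loud for a -1.0..1.0 range.
loud_sine = [1.5 * math.sin(2 * math.pi * i / 100) for i in range(100)]
clipped = hard_clip(loud_sine)
print(max(clipped), min(clipped))  # peaks flattened to 1.0 and -1.0
```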


I also bussed all the foley effects (footsteps etc.), as well as the soundscape, to a separate bus, where I applied Logic's Space Designer with an amphitheatre impulse response. This routes the outputs of those tracks through that effect, making it seem like they are all coming from the same large outdoor area, which in turn adds to the realism of the game. Impulse responses are essentially captures of a system's output, used to shape and emulate different output sources, from large outdoor acoustics to speakers and cabs. They are used in guitar cabinet emulators to emulate the sounds of different cab speakers, such as a Marshall cab, which sounds sonically different from, say, a Fender cab. They can also be used on outputs to make the sound seem like it is coming through a specific source, such as resonating in a large outdoor space, coming through the speaker of a cassette player playing back a tape, or through a walkie-talkie (Premier Guitar (2012)).
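Under the hood, an IR reverb is a convolution; this tiny sketch (with a made-up four-sample "tail", nothing like a real amphitheatre IR) shows a dry click taking on a space's decay:

```python
def convolve(dry, impulse_response):
    """Direct convolution: each input sample triggers a scaled copy of the
    impulse response. Real convolution reverbs use FFTs for speed, but the
    mathematical result is the same."""
    out = [0.0] * (len(dry) + len(impulse_response) - 1)
    for i, x in enumerate(dry):
        for j, h in enumerate(impulse_response):
            out[i + j] += x * h
    return out

click = [1.0]                      # a single-sample "footstep" impulse
fake_ir = [1.0, 0.5, 0.25, 0.125]  # toy decaying tail (illustrative only)
print(convolve(click, fake_ir))    # the dry click takes on the space's decay
```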


I later made the mace swing sounds, which consisted of white noise played back from the ESP again, with a medium attack and a fast decay, creating a rapid whooshing sound like an object cutting through the air. The white noise is slightly pitched down to create a "thicker", fuller sound, so that it is more reminiscent of a blunt weapon, in this case a mace, as opposed to a sharp slashing weapon like a sword or dagger, which would cut through the air faster and have a higher-pitched sound with more presence.



I ran this sound through an EQ, which cut away the harshness and fizz of the high mids, boosted the sparkle and air of the high end with a high shelf, and also cut some of the muddiness and looseness of the low mids. This helped create the illusion of a fast, heavy, slightly sharp object being swung through the air at high velocity; with the harshness cut, it hopefully sounded more natural and less like a jet engine.





I applied some ChromaVerb to the sound, with a slight decay, to help the sound sustain a touch more while fading out, essentially giving it more release and letting it fade into the distance. Decay is essentially the tail end of a sound wave, the transient being the head, or initial impact. The decay is the "residue" of the impact: for example, if a drum is hit, the transient is the initial impact of the percussionist's implement hitting the membrane, and the decay is the vibration of the skin and shell and the tone of the drum's sound (without it, the sound would literally just be a sharp click) (www.freemusicdictionary.com (n.d.)).


On a synth, there are attack (transient) and decay parameters on the engine itself; however, I didn't want the sound itself to decay or sustain, I wanted it to feel like it was diffusing into an outdoor space, in this case an amphitheatre, and for that one uses reverb. Reverb emulates the acoustics of a space and applies them to the sound. This can have different effects: a massive decay in the case of a large reverb that simulates the acoustics of halls or chambers, or a smaller decay if room IRs are applied. In either case, reverb increases the diffusion, spread and decay of a sound wave, just as making a noise in a small room, e.g. a closet, produces a shorter decay than making the same noise in an open field or a huge concert hall, because of how the sound wave travels and reflects off the surfaces in the room, with larger rooms meaning the sound travels for longer (reverb.com (n.d.)). Certain surfaces also change the way sound behaves: treated rooms such as studios usually have foam blocking the corners, or non-parallel walls, so that the bass frequencies of the sounds present are not lost in the corners.


Finally, I also needed to make a soundscape. A soundscape is an omnipresent series of diegetic sounds that plays whenever the character is in a certain location; these are essentially the ambient sounds of that location (Acoustic Nature (n.d.)). In this case, the video shows the character walking around a meadow/forested area, which comes with its own soundscape consisting of the ambient sounds present in such areas: the rustling of trees, the whooshing of wind, the chirping of birds, and the calls and noises of the animals that inhabit the woodlands.


For the soundscape, I used the Tundra Atmos sounds from Spitfire LABS. The game developers require a continuous ten-minute soundscape, which will be played back on a loop and needs to sound as realistic as possible, and not like a loop. The soundscape will later need to be bounced out of Logic Pro, alongside every individual foley sound for the sound library, and placed in a separate folder.


For this, I composed a VERY long note playing back the Tundra Atmos on repeat. The good thing about Tundra Atmos is that, like all LABS and Spitfire instruments, it is well made: unless one deliberately places a gap between the notes, there is no pause between the start and end of the loop, and likewise no way to identify where it loops. This is very good, as it means the sound is very realistic, essentially just sounding like the ambience of a real forest rather than a sample. It adds to the realism and immersion of the game, and makes the sound easy for the game students to work with: they can literally just place it into their game without any issues of it stopping, cutting out or not working, meaning it will always be present in the background, just like real ambience.
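LABS handles its looping internally, but as an aside, one generic way a seamless loop can be made from a raw sample is to crossfade the tail into the head (a hedged sketch of the general technique; LABS does not necessarily work this way):

```python
def crossfade_loop(samples, fade_len):
    """Blend the last fade_len samples into the first fade_len samples with
    an equal-gain linear crossfade, then drop the tail, so the end of the
    shortened clip flows smoothly back into its own beginning on repeat."""
    body = list(samples[:len(samples) - fade_len])
    tail = samples[len(samples) - fade_len:]
    for i in range(fade_len):
        g = i / fade_len  # fade-in gain for the head, 0 -> 1 across the fade
        body[i] = body[i] * g + tail[i] * (1 - g)
    return body

clip = [float(i) for i in range(10)]
print(crossfade_loop(clip, 4))  # [6.0, 5.5, 5.0, 4.5, 4.0, 5.0]
```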







Acoustic Nature. (n.d.). What is a Soundscape? (Definition and Science of Hearing). [online] Available at: https://acousticnature.com/journal/what-is-a-soundscape.


Anon, (n.d.). Why Percussion is Important & Why Music Needs It (Sometimes) |. [online] Available at: https://coolpercussion.com/why-percussion-is-important/.


Corporation, R. (n.d.). Roland - What are Drum Triggers? [online] Roland. Available at: https://www.roland.com/uk/blog/what-are-drum-triggers/ [Accessed 14 Feb. 2022].


Encyclopedia Britannica. (n.d.). MIDI | music technology. [online] Available at: https://www.britannica.com/art/MIDI-music-technology.


GreekMythology.com (2018). Pan - Greek Mythology. [online] Greekmythology.com. Available at: https://www.greekmythology.com/Other_Gods/Pan/pan.html.


Harmony Helper. (2017). What is a Vocal Harmony? Learning to Harmonize with Harmony Helper. [online] Available at: https://harmonyhelper.com/2017/10/what-is-a-vocal-harmony/.


Hewitt, M. (2014). Microtonality in Ancient Greek Music. [online] The Note Tree; 1st edition (30 Nov. 2014). Available at: https://www.amazon.co.uk/Microtonality-Ancient-Greek-Michael-Hewitt/dp/0957547013 [Accessed 10 Dec. 2021].


J.Sturgis Tones. (n.d.). What Are Transients & Why Do They Matter to Your Mix? [online] Available at: https://joeysturgistones.com/blogs/learn/what-are-transients-why-do-they-matter-to-your-mix.


Liberty Park Music. (2017). What is Dominant and Diminished Seventh Chords? [online] Available at: https://www.libertyparkmusic.com/dominant-diminished-seventh-chords/.


lowlevelowl911 (2017). What is the difference between an ostinato and a riff? [online] Available at: https://www.reddit.com/r/musictheory/comments/7csni9/what_is_the_difference_between_an_ostinato_and_a/ [Accessed 10 Dec. 2021].


McAllister, M. (2018). What Is Audio Clipping and Why Is It Important? - Produce Like A Pro. [online] Produce Like A Pro. Available at: https://producelikeapro.com/blog/audio-clipping/.


Nettl, B. (2016). Microtonal music. [online] Encyclopedia Britannica. Available at: https://www.britannica.com/art/microtonal-music.


PCMAG. (n.d.). Definition of Soundfont. [online] Available at: https://www.pcmag.com/encyclopedia/term/soundfont [Accessed 10 Dec. 2021].


public.wsu.edu. (n.d.). Ancient Greek Music. [online] Available at: https://public.wsu.edu/~delahoyd/mythology/greek.music.html#:~:text=Plato%20defined%20song%20(melos)%20as [Accessed 4 Feb. 2022].


Production Music Live. (n.d.). 6 Ways to Humanize Your Tracks. [online] Available at: https://www.productionmusiclive.com/blogs/news/6-ways-to-humanize-your-tracks#:~:text=Humanization%20can%20also%20be%20defined [Accessed 4 Feb. 2022].


Premier Guitar. (2012). The Working Guitarist: All About Impulse Responses. [online] Available at: https://www.premierguitar.com/diy/recording-tips/what-is-an-impulse-response [Accessed 4 Feb. 2022].


reverb.com. (n.d.). What does a reverb effect do? | The Basics. [online] Available at: https://reverb.com/news/what-does-a-reverb-effect-do-the-basics#:~:text=Reverb%20occurs%20when%20a%20sound.


Shylenok, P. (2019). Designing Sounds for a Game. [online] Game Developer. Available at: https://www.gamedeveloper.com/audio/designing-sounds-for-a-game [Accessed 4 Feb. 2022].


Study.com. (2019). What is a Pan Flute? - History, Origin & Types | Study.com. [online] Available at: https://study.com/academy/lesson/what-is-a-pan-flute-history-origin-types.html.


The Editors of Encyclopedia Britannica (2014). Lyre | musical instrument. In: Encyclopædia Britannica. [online] Available at: https://www.britannica.com/art/lyre.



Theory, D. @ H.M. (n.d.). A Guide To The Chromatic Scale | Hello Music Theory. [online] https://hellomusictheory.com/. Available at: https://hellomusictheory.com/learn/chromatic-scale/.


The Ultimate Resource for Video Game Design. (2018). Video Game Sound Design | Beginner’s Guide. [online] Available at: https://www.gamedesigning.org/learn/video-game-sound/.


www.blackghostaudio.com. (n.d.). 5 Ways to Humanize Your Productions | Black Ghost Audio. [online] Available at: https://www.blackghostaudio.com/blog/5-ways-to-humanize-your-productions [Accessed 14 Feb. 2022].


www.freemusicdictionary.com. (n.d.). “Decay” | Definition on FreeMusicDictionary.com. [online] Available at: https://www.freemusicdictionary.com/definition/decay/.


www.jazz-guitar-licks.com. (n.d.). Dyads & Diatonic intervals - Guitar Shapes and Music Theory. [online] Available at: https://www.jazz-guitar-licks.com/blog/lessons/what-are-dyads.html [Accessed 14 Feb. 2022].


www.nps.gov. (n.d.). Understanding Sound - Natural Sounds (U.S. National Park Service). [online] Available at: https://www.nps.gov/subjects/sound/understandingsound.htm#:~:text=Frequency%2C%20sometimes%20referred%20to%20as.

 
 
 
