Tuesday, March 27, 2007

CC2 - Week 4 - MIDI Information and Control

First and foremost, a big shout out to John "I <3 Ubu" Delany for allowing me to pilfer his comprehensive ubumenu of MIDI instruments.

Time seems to be the most important ingredient in brewing a successful Max patch, followed closely by love and patience. CC2 this week was a deeper foray into MIDI control in Max, mainly pitch bend, modulation and program changes. I will never doubt the power of taking notes in class, as they pulled me out of a rut on more than one occasion, as did the right-click>help option. One specific problem I had was getting both the MIDI input and MIDI output device lists out of the midiinfo bidul... uh, box thingy. Notes to the rescue: the left inlet creates the MIDI output list, the right inlet the MIDI input list. Then again, this class note was only of limited use- "MIDI Panic button - Difficult". Nevertheless, Matt M and I figured out that 'flush' was the secret. I spent some time trying to figure out how to access program change data, but luckily a flashback of the midiparse outputs helped me out- I think Luke showed it to me some time ago, but I didn't know what I could use it for. Interesting quirk- when the patch loads, midiparse throws up 9 errors, then works anyway. If it ain't broke...
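For my own future reference, here is roughly what I think flush is doing behind the scenes, sketched in Python rather than Max (the class name and the send callback are mine, purely for illustration): keep track of every note-on that has gone out, and when the panic button is hit, fire off a matching note-off for each held note.

class NoteFlusher:
    """Rough sketch of what flush seems to do: remember held notes,
    then send a note-off for each of them on panic. 'send' is whatever
    actually pushes bytes out the MIDI port - just a stand-in here."""

    def __init__(self, send):
        self.send = send
        self.held = set()                      # (channel, pitch) pairs

    def note_on(self, channel, pitch, velocity):
        if velocity > 0:
            self.held.add((channel, pitch))
        else:                                  # velocity-0 note-off
            self.held.discard((channel, pitch))
        self.send([0x90 | channel, pitch, velocity])

    def note_off(self, channel, pitch):
        self.held.discard((channel, pitch))
        self.send([0x80 | channel, pitch, 0])

    def panic(self):
        # The 'flush' moment: silence everything we know is sounding.
        for channel, pitch in list(self.held):
            self.send([0x80 | channel, pitch, 0])
        self.held.clear()

In the patch itself, of course, flush does all of this for free, which is the whole point.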

I have also obviously made visual upgrades to the patch, the main change being the context-sensitive colours of the pitch bend, modulation and others. Originally Luke and I discovered a program called 'mood_color' or something, which randomly changes colour values on an object. While this was nice and psychedelic, I instead ended up programming some of the dials on my keyboard to change colour depending on the value, which resulted in a sort of 'heating up' visual effect. The most complicated part of achieving this was the pitch bend, as I wanted the colour to stay black while there is no bend, then fade to red for both bend up and bend down. There is an explanation of this inside the encapsulation "p Pitch Colouring" in the patch, and I have provided a pic below.
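The maths behind the 'heating up' effect is simple enough to jot down. Assuming the 0-127 range that bendin puts out (64 = no bend), the distance from the centre just gets scaled into the red channel. A rough Python equivalent (the function name is mine- the real logic lives inside the p Pitch Colouring encapsulation):

def bend_to_colour(bend, centre=64):
    """Map a bend value (0-127, 64 = at rest) to an RGB colour:
    black at rest, fading to full red for a full bend up OR down.
    Sketch only - not the literal contents of the encapsulation."""
    amount = abs(bend - centre) / float(centre)   # 0.0 at rest, ~1.0 at either extreme
    red = int(round(255 * min(amount, 1.0)))
    return (red, 0, 0)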

Big thanks to Dave D, Johnny D and Luke Dipants for regularly helping me out.



Patch File

1. Christian Haines. "Creative Computing: Semester 1, Week 4; MIDI Information and Control" Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 22nd March 2007.

Sunday, March 25, 2007

AA2 - Week 4 - Drums

The effluent drum kit in EMU was the focus of our attention for this week's exercise (1). Mrs. John Delany and I teamed up once again, but this time we required someone who could hit things with sticks- John and I are just far too timid. Luckily Matt Mazzone jumped in at the last minute, which sped things up a bit. The 'sweet spot' of the Recording Space seems to be in front of the bass trap next to the Studio 1 window, much like we found in the AA class, however our testing showed that facing the drum kit more towards Studio 2, on a bit of an angle, gave a crisper room sound. Here is our eventual microphone selection and placement:

Kick: Shure SM52; dead centre
Snare: Shure SM58; above
Snare: Shure Beta57; below
HiHat: Neumann KM84; above, aimed at body
HiTom: Shure Beta56A; placed on inner edge, aimed into centre of drum body
MidTom: Same as HiTom
Floor Tom: Shure Beta56A; placed on inner edge, aimed towards the side
Overheads: Neumann KM84s; spaced wider than kit width
Room: Neumann U87; in front, equal distance from walls as to kit


For this recording I took the opportunity to test out something different- a wide spacing of the overheads. The result was a definite stereo difference between the ride and crash, and quite nice in my opinion. As expected, the basketball- I mean kick drum- was quite average sounding, but John and I decided that we got 'a great recording of a crap sound'. The snare came up well in the mix, but only after the above mic was turned down slightly to get rid of some of the inherent plasticky ringing.


The toms were quite even sounding, probably thanks to the use of similar mics on all three. Overall there seems to be a slight lack of substance to the toms, but at the time of the recording this was the best sound we could achieve. If I were to do it again I would try different microphones- I don't like the idea of the one-size-fits-all Beta56A. I was quite happy with the sound of the hihat while in the studio, however the soon-to-be-fixed problem of the Tannoy speakers giving an inaccurate bass response resulted in a somewhat thick sound. The room mic worked fine, and gave quite an accurate representation of the room sound. The only insert we used was compression on the room mic, so as to maintain the full 'dynamic' of Matt's drumming.

In the end I was relatively pleased with the sound of the recording, but not the sound of the drum kit. It's kind of like recording a singer who can't sing.

Drums with room mic MP3 408KB

Drums without room mic MP3 409KB

1. David Grice. "Audio Arts: Semester 1, Week 4; Drums & Percussion" Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 20th March 2007.

Friday, March 23, 2007

Forum - Week 4 - Mass Debate 1

I fail to see the point of student presentations (1), considering my opinion is the only one that matters. Regardless, the topic for today was 'Collaboration', which the Encarta Dictionary defines as "a fibrous protein found in skin, bone, cartilage, tendon, and other connective tissue that yields gelatin when boiled in water" (2). Hon. David Dowling presented a... presentation on the "Symphony And Metallica" collaboration between Michael Kamen and some rock band of mild notoriety. I have heard some of these tracks before, and I felt that the combination of metal guitar and orchestral instruments was fine for a while, but the novelty eventually wore off to the point where it was embarrassing to listen to. The conversation afterwards centred on whether the collaboration was a success, which received mixed responses.

The next presentation was from 3rd year Vinny Bhagat, who introduced us to Trilok Gurtu. I found the music very intriguing and would have enjoyed hearing more, however Lady Time (Stephen Whittington) interfered and we only heard short snippets of songs. Next was Will "Ferrell" Revill, who spoke of the collaboration between sound engineers/designers and video game developers. An example of superb collaboration of this sort is that between the developers at Bungie Studios and composer Marty O'Donnell for the Halo trilogy on XBOX/X360. Marty is highly involved throughout the entire development of the games, which allows him to fully understand what sound the game requires in each situation.

The final presentation was from the effervescent 1st year, Sanad, whose topic used a collaboration of Google and Wikipedia to 'bring to light' the irrelevance of the title "World Music". While the talk had all the Sanad zest that we have come to love, supporting content was minimal*, instead relying heavily on unsubstantiated opinion. He argues that the convolution of Eastern society by Western music is a one-way street, then strangely alludes to Britney Spears using an Arabic riff and The Chemical Brothers using an Iranian riff. While the overall topic was interesting, I still fail to see why any cultural cross-pollination is so terrible, or why generalisations such as "World Music" are detrimental. The resultant calamity of class discussion was tantamount to the gossiping at a hairdressing salon (and just as meaningful), and in the end there were no answers- just hot collars. This would be a great area of study should Sanad decide to pursue it, perhaps in an essay for Dr. MC.

*University is the expensive exchange of other people's ideas.

1. Stephen Whittington. "Music Technology Forum" Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 22nd March, 2007.
2. "Collagen" Encarta ® World English Dictionary © & (P) 1998-2004 Microsoft Corporation. All rights reserved.

Tuesday, March 20, 2007

Forum - Week 3 - Dirty Harry

While doing this course we seem to be constantly forced into expanding our views on what constitutes music. Are we so bigoted in our musical taste that we have to have an endless stream of weird crap thrown at us like so much wet fish? For forum this week we all played our part in performing David Harris' creation "Compossible" (1), which roughly translates to "Composition? Impossible!", coincidentally David Harris' life motto. One big problem- this piece would have been very, very easy to create. I could do the same thing in 2 hours, hand-written and photocopied. It doesn't matter who plays what, who does what or what the result is. The notion of every sound being musical died out in the 1980s, when all the 70s hippies realised that you cannot be self-sufficient if your only activities are expanding your mind and sharing needles (normally vice versa).

I listen to such experimental 'music' and think about the era that we live in. In the time of the Renaissance, people rejoiced in the rediscovered musical writings of Ancient Greece- Ptolemy, Plato et al.- and luxuriated in the aural perfection and purity that these studies unleashed. And now, coinciding with the upsurge of computerised technology, some people feel the need to push music in a new direction- a non-musical direction. After untold millennia of studying what makes good music so good and creating vast libraries of techniques and secrets to fully realise our musical potential as a species, are we really going to revert everything to Paleolithic times? Music was created by humans, humans sculpted it to be what it is today, and it is a crowning achievement over the other animals of our planet. Removing almost all of the aspects of harmony, melody and rhythm reduces music to a previous state of our evolution. We earned our 44+2 chromosomes, and I for one will enjoy the fruits of being part of such an advanced species- namely the delicious peach of harmony, the succulent grape of melody and the Dionysian honeydew of rhythm.

I can just imagine it now: the future music industry flooded with an army of David Harri, and music stores with chance operations cranked up on the in-store stereo.

The revolution is coming...

1. David Harris. "Music Technology Forum" Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 15th March, 2007.

CC2 - Week 3 - Program Structuring

It's always disconcerting when you look at the empty Max patch window and think about how little you know.

Even though starting was difficult, once things on my keyboard patch started to work it all fell into place. The only hurdles I encountered were getting the MIDI channel selector to work (I had patched out of the wrong outlet) and a bug with the pitch class, which turned out to be due to the pitch class formula sitting to the right of the octave formula even though it used the octave in its calculation- right-to-left order strikes again! Patch TXT File
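For the record, the two formulas boil down to a modulo and an integer division. A quick Python equivalent of what the patch is doing (the function name is mine, and I'm hedging on the octave convention- I believe Max labels MIDI note 60 as C3, but C4 is the other common choice):

def pitch_info(note, middle_c_octave=3):
    """Split a MIDI note number into pitch class and octave.

    middle_c_octave=3 assumes the convention where MIDI 60 = C3;
    pass 4 if you prefer the C4 convention."""
    pitch_class = note % 12                        # 0 = C, 1 = C#, ... 11 = B
    octave = note // 12 - (5 - middle_c_octave)    # so 60 lands on middle C
    return pitch_class, octave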





1. Christian Haines. "Creative Computing: Semester 1, Week 3: Program Structuring" Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 15th March 2007.

Monday, March 19, 2007

AA2 - Week 3 - Electric Strings

For Audio Arts this week, Johnny D and I formed an unstoppable duo, combining his guitar talent with my... talent, to test out some recording techniques for electric guitar. Even though I hadn't eaten all day. The recordings were made in Studio 2, utilising the deadroom and Studio 2's brand new mixing desk (which is a welcome change from that horrendous 01v Omni thing). We kept in mind the phasing issues with multiple micing, but luckily Corporal Delany knew an industry trick. "3 times the distance" he would say in an authoritative manner, which I would promptly obey, adjusting the second mic so that it was three times as far from the amplifier as the first (actually we just went by ear, but the industry trick was still good to know). You may notice from the pics that the amp was placed on an upside down table- this was to satisfy my belief that amplifiers sound better on a hard surface, and John's belief that amplifiers sound better on the ground. If they're called 'microphones', is there a 'macrophone'?


Our first microphone setup was the time-honoured duo of a Shure Beta57 and Beta58 for close-micing, plus an AKG C414-BULS for a room mic. The result was very true to the actual room sound, and the positioning of the AKG added in some reverberant low frequencies (which was on purpose- we moved it around the room until we found a bass trap). After some by-ear repositioning, the final position had the B58 2cm from the outer edge of the speaker, and the B57 slightly off-axis and slightly in from the outer edge of the speaker. Gold. Despite the "3 times" rule for phasing, we simply moved the second mic around until there was no perceivable phasing.
MP3 206KB


Our second setup used a Beta57 and one of the new SM52 kick drum mics, however the sound was so disappointing that we did not keep the recording. No matter where we placed the 52, the sound was flat and unattractive, especially compared to the deliciousness of the B57. Instead we tested out a combination of the Beta57, a RODE NT3 and the AKG room mic. It took a lot of moving around to get a nice sound from the NT3, and even the best sound wasn't all that great. You can see in the picture that the final position of the NT3 meant it was more of a backup to the B57, rather than anything complementary to the sound. Definitely not my favourite result, but if I wanted trashy then this would be the setup.
MP3 242KB


We inevitably accepted that the Shure Beta57 was king for the day, so we started to experiment with just the B57 (still in position) and the AKG. This next recording was made with the soundproof double doors of the deadroom open and the AKG placed so it could pick up the sound passing in and out of the control room. Even though it was so close to the door, there were minimal early reflections- in fact the sound seemed to get fuller the closer the mic was to the door. I also made sure there were as many exposed flat surfaces as possible, including having all the curtains open. The final sound has a nice live 'Studio' sound to it, especially when compared to...
MP3 204KB



... the same thing but with the door closed, curtains shut and a baffle in front of one of the windows. This was a nice, clean sound, however I would imagine some post-production reverb would be required to bring it to life in a mix. Add another baffle and the deadroom would have been as dead as it gets, but those things are damn heavy, and I hadn't eaten yet.
MP3 254KB


The final sound was a stroke of genius by Sergeant Delany, whereby we had the Beta57 on the amp and a Beta58 in the control room, aimed at the centre of the guitar's strings. The combination of the amplified, overdriven output and the acoustic output makes for a clean, almost refreshing sound. It works especially well with virtuosic soloing, as can be heard at the end of the recording.
MP3 238KB

Wednesday, March 14, 2007

CC2 - Week 2 - Max Quickstart

This week we were asked to create a Max patch that triggers the notes of the chromatic octave in sequence, with a user-selectable tempo and starting pitch. I had created half of the patch before I needed help from Dave, who suggested "Right click and select help." I am enjoying Max far more than Plogue last year, but possibly only because there is now less of a learning curve with graphical user interfaces. One part in particular that I could not figure out was where to add in the MIDI note value, however it (eventually) made sense to add it to the count output from the counter, thus incrementally increasing the starting note value.
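Just to convince myself the logic was right, here is the same idea in plain Python rather than Max (the function name is mine, and play is just a stand-in for noteout):

import time

def chromatic_run(start_note=60, bpm=120, play=print):
    """Step through one chromatic octave, one note per beat.

    Roughly mirrors the patch: a counter value (0-11) is added to the
    user-chosen starting pitch, at a tempo set by the user."""
    beat = 60.0 / bpm                # metro interval in seconds
    for step in range(12):           # counter 0..11
        play(start_note + step)      # offset the starting note
        time.sleep(beat)

Calling chromatic_run(start_note=60, bpm=140) steps up the octave from middle C at 140 BPM, which is exactly what the counter-plus-addition arrangement does in the patch.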



Patch Text File

Saturday, March 10, 2007

AA2 - Week 2 - Vocal Recording

After I 'performed' for John's vocal recording, I figured that while the microphone stands and patchbay were all set up I may as well do my own. The script was written on the spot, and is no longer as funny as it was when I first recorded it. A slight logistical problem was recording in the Space or Deadroom when I had to press the record button myself, so I threw a baffle into Studio 1 and created a mini deadroom- unsuccessfully. When listening back to the recordings in the studio I could not tell how much of the liveness of the room was making it onto the tracks, that is, until I mixed them down and listened to them on headphones. Despite the roomverb, I managed to keep most of the sound pretty clean. And I didn't do any pitch shifting or anything to my voice, it really is that manly.

Script:
Are you having trouble with masculinity? Then join the army! You get a gun. You get bullets. You get sent somewhere... and then you go back home. Maybe alive. What are you, chicken? Buk buk buk.

1st Recording MP3 290KB
Microphone: Neumann U87
Mic Position: 25cm from source
Vocal Style: Plain, spoken word
Comment: The first recording was simple enough. Some compression was needed, but due to the lack of animation in my voice it was not very heavy.

2nd Recording MP3 422KB
Microphone: AKG C414-BULS, Neumann U87
Mic Position: 15cm for both
Vocal Style: Deep, masculine, spoken
Comment: My voice can only go so loud when it is this deep, so a close mic position was needed. The recording was originally bass-heavy, which somewhat suited the vocal style, but I still cut off some of the low end for a more natural sound. Both mic recordings were panned slightly to the left and right respectively, but only by about 15 degrees. Before compression, the end was louder than the beginning, but a moderately heavy threshold evened it out.

2nd Recording (Take 2) MP3 483KB
Microphone: Neumann KM85 x2
Mic Position: 15cm, perpendicular to source (see pic)
Vocal Style: Deep, masculine, spoken
Comment: This is part two of the deep voice recording, in which I experimented with setting up the KM85s in atypical positions until I found a nice sound. The final position was not what I expected to work, but it had a nice presence when panned. Both mics needed identical compression and gain, as customising the settings individually created weird stereo phasing. I did not EQ this recording, as it achieved the sound I was aiming for originally with the AKG and U87.


3rd Recording MP3 374KB
Microphone: AKG C414-BULS
Mic Position: 25cm from source
Vocal Style: Loud, yokel, spoken
Comment: After some serious clipping and gain fixing, I found the levels which allowed almost any vocal volume to be recorded without distortion. The extreme variation in amplitude required heavy compression, and I also used a light Gate to lessen the presence of the roomverb.

4th Recording MP3 459KB
Microphone: Neumann U87
Mic Position: 5cm from source
Vocal Style: Very quiet, spoken
Comment: For this recording I wanted the opposite of the yokel. The gain needed to be increased considerably, thanks to the last recording being so loud and this one being so soft. I had to make a conscious effort to minimise unintentional mouth noise, however some still slipped through, mainly due to the mouth movements needed to form the sounds. While the compression was kept light, quite a bit of low-frequency EQing was needed to minimise the proximity effect of the mic.

5th Recording MP3 365KB
Microphone: AKG C414-BULS
Mic Position: 25cm from source
Vocal Style: Loud singing, some spoken
Comment: While I was 'singing' for John's recording, I did my own recording of sung voice. Unfortunately, by this point in time I couldn't get out a whole line of Ubi Caritas without laughing. Fortunately, this resulted in a nice array of dynamics with both sung and spoken voice, much like an opera. Uhh, a bad opera. Quite severe compression was needed to bring out the quieter spoken parts, as I wasn't even facing the microphone properly.


Friday, March 09, 2007

Forum - Week 2 - The Stimuli Simulacrum

While the forum was primarily about originality (1), we touched on a topic which I wish to pursue further. This is regarding, as I put it, the intellectual integrity of a computer.

Intellectual Property
In our highly developed society it is a criminal offence to steal or replicate another person's intellectual property and masquerade it as your own. While ownership of a physical object has always been a possibility, the prospect of owning the nonphysical has only transpired in recent times. As our species emerged from a faceless fog of anonymity into the populism of the Enlightenment, we fashioned a mentality of self-commercialisation. It was only a matter of time before this egotistical attitude conceived the monetary value that could be attributed to ownership of intellect, and in turn the legal system had to ensure the money went to the rightful owner. Today, a computer is not legally capable of owning anything physical or nonphysical, however it is in our perceived ownership of the nonphysical that a dilemma emerges...

Intellectual Integrity
The physical existence of a manufactured item certifies its creation and implies the prospect of ownership. When there is no physical manifestation of an item, its existence, creation and possibility of ownership are questionable, as they can never be proven. When an idea becomes property, the line between the physical and the intellectual is blurred. If we can own a complex amalgamation of electrical stimuli in the human brain, why is it impossible for a computer to possess its own 'intellectual' creation? It could be argued that the creator of the computer owns anything that computer produces, yet our society does not pass ownership of intellectual property over to our parents, our 'creators' if you will. Where is the line that separates intelligence and artificial intelligence? The obvious difference is as simple as the boundary between the organic and the mechanical. This, however, reignites the initial conundrum of physical and nonphysical possession. If we accept the nonphysical as existing in such a way that it can be possessed, why is there such a restrictive physical limitation of 'organic only' on the concept of ownership? We draw the line of ownership at the organic, yet blur the line of physicality altogether with ownership of the nonphysical.

Intellectual Evolution
If a computer composes a piece of music based on our rules of harmony and melody, why does it not belong to the computer? When does a computer cease to be merely following orders, and actually resemble our creative processes? When does a byte become an idea?

I have asked more questions than I have answered, however it is the existence of the questions that is important.



1. Stephen Whittington. "I Want To Be Original Like Everyone Else." Lecture presented in the Recording Space, Electronic Music Unit, University of Adelaide, South Australia, 8th March 2007.

Nerd: But electrical stimuli in the brain can be considered physical!
Me: So can the electrical current in a computer. Thank you for strengthening my argument.

Monday, March 05, 2007

Forum - Week 1 - Introductions and... Uhh....

Did you know Thursdays really suck for me? I have classes 9am till 4pm without a break, meaning I have to eat in the theory lecture. Anyway, the first forum of the year started out the same as first and second semester last year, with no real plan of what to do or how to mark us. All we know is we present anything on a given topic for an unspecified amount of time, and it will somehow be worth 25% of our music tech grade. I don't know why it is necessary to have 8 people on each topic, I would think 1 or 2 people could cover a topic quite sufficiently. Perhaps we could present a topic based around Music Technology that we feel needs discussion. I am to discuss the lack of women in our discipline, which to me is inconsequential to the future of Music Technology. If women were not allowed to join, then that would be a topic worth discussing. Are we supposed to force people into a study area because their 'type' is a minority? If the heads of Music Tech feel that the lack of females is an issue serious enough to need investigation, I might wonder whether a woman could be accepted into the course for that reason alone. I would assume the lack of women is due to the lack of female interest in any technology-based course. Who knows why? Who cares? Aren't they allowed to do whatever they want? What do they hope to achieve out of this topic? I suppose I could make my presentation on how irrelevant the topic is, but that sort of thing only floats with Mark "MC" Carroll. Maybe I'll invite him along...

Something I always enjoyed was David Harris' mind-expanding electronic music appreciation class in Sem 1 last year. I would prefer to have that class again, and perhaps be graded on an essay about Music Technology in Contemporary Society, or our interpretation of it.

In any case, this Forum ended 50 minutes early as we ran out of things to do/teach. Would I be right to assume that a lecturer teaching a 70 minute class gets paid for 70 minutes? Surely not 2 hours pay. You wouldn't think so.


AA2 - Week 1 - Session Planning and Management

For our first week in Audio Arts we created a basic checklist of the steps required for a recording. Through mostly extracurricular means, I have done about 5 recordings, plus 8 or so of a pianist who has just finished her doctorate in music. My first couple of recordings were very disorganised, one of which took 2 hours to set up, but eventually I figured out that there is no such thing as too much preparation. Our exercise for this week is to create a mock session plan for a recording of an artist of notoriety (1). Despite this, I have decided to 'record' the fictional Scottish hard-rock band Love Fist, of Grand Theft Auto: Vice City fame. For those who haven't been *cough* Love Fisted, here is the official advert taken from the game.

MP3 767KB


Pre-Production

Band Name: Love Fist
Style: Hard Rock
Members/Instrumentation: Jezz Torrent (Vocals, Guitar), Willy (Bass), Dick (Guitar), Percy (Percussion)
Equipment: Vocalist, electric guitar (1&2), electric bass, 8-piece drumkit.
Influences: Any dodgy 80s hard rock.
Song(s) to be recorded: Fist Fury
Duration: 3 minutes
Recording Studio: Studio 1, EMU Space and Dead Room; EMU, Adelaide University

Microphonage
Vocals: U87 Neumann
Guitar 1: Shure 57 (x2)
Guitar 2: Shure Beta 58 (x2)
Bass: AKG C414-BULS
Drums: Drumkit Mic Set


ProTools

Quality: 96kHz, 24bit
Tracks
1: Kick Drum
2: Snare
3: HiHat
4: Hi Tom
5: Mid Tom
6: Floor Tom
7: Overhead Left
8: Overhead Right
9: Bass
10: Guitar 1a
11: Guitar 1b
12: Guitar 2a
13: Guitar 2b
14: Vocals

Microphone Placement:

The lead guitar will have two Shure Beta 58s, one facing square into the centre of the cone and the other at a 45 degree angle, but still aimed at the middle. Each mic will be roughly 20cm away from the speaker face. The backup guitar will use both Shure 57s, one aimed directly into the centre of the cone and the other aimed at the outer edge of the cone. The bass will have the AKG placed in the middle of the 4 speakers of the quad amplifier, around 30cm away. The drums will each have their designated drum mic. The overheads will be placed in an XY position, to capture a tight stereo field. While this sounds good in theory, all mic positions may need to be changed if the desired sound is not achieved.

Tracking

I will first record the drums by themselves, as the overheads are susceptible to bleed from other instruments in the room, and the kit itself can bleed onto the guitar mics. I will have the bassist playing quietly in the room at the same time so that the drummer does not lose his place in the song. Baffles will only be used in certain parts of the space, such as bass traps and sheer surfaces like glass. This will (hopefully) minimise early reflections and booming, but maintain the 'live' sound of the space. I will then record the guitars one at a time, with only the lead guitar having any baffling, as 'fake' verb sounds more in the style of dodgy 80s rock. Once the best takes have been selected, the vocalist will be recorded in the dead room.

Post Production

The band will be heavily involved in the post-production phase, as their tastes in sound may be different to mine. I have found post-production to be very much trial and error, as no techniques are universally applicable. Any dynamic effects such as compression will be added in post rather than during the recording process, so I have a 'blank canvas' of sound on which to weave my magic.

Result

MP3 2.15MB


1. David Grice. "Audio Arts: Semester 1, Week 1. Session Planning & Management." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 27th February 2007.

Sunday, March 04, 2007

CC2 - Week 1 - Pseudocode

This week of CC introduced us to Pseudocode, otherwise known as 'English'. Our task description did not specify whether a note would be played by a person on an instrument or by the program itself, so I have not specified either.



If the program itself was generating the initial note, I would surmise some sort of NoteGenerator would be needed. Although if you have a NoteGenerator, why are you getting it to play in D Major only to transpose it into B Minor? One would think choosing B Minor to begin with would suffice. If a human was needed to enter the note values manually, then some additional 'code' would be needed, probably along the lines of 'IF note is played THEN' just after the REPEAT command.
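Since pseudocode is 'English' anyway, here is one reading of the task in Python- and I stress it is only my reading, since the brief never says how the transposition should work. I've gone with a degree-for-degree mapping from D major onto its relative minor, B natural minor, with the hypothetical NoteGenerator stood in for by a random choice:

import random

# My interpretation, not the official one: pick a note from the
# D major scale, then map it degree-for-degree onto B natural minor.
D_MAJOR = [62, 64, 66, 67, 69, 71, 73]   # D  E  F# G  A  B  C#
B_MINOR = [59, 61, 62, 64, 66, 67, 69]   # B  C# D  E  F# G  A

def note_generator():
    """Stand-in for the hypothetical NoteGenerator object."""
    return random.choice(D_MAJOR)

def transpose_to_b_minor(note):
    """Map a D major scale degree onto the same degree of B minor."""
    return B_MINOR[D_MAJOR.index(note)]

for _ in range(8):                  # REPEAT ...
    played = note_generator()       # (or: IF note is played THEN ...)
    print(played, "->", transpose_to_b_minor(played))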



1. Christian Haines. "Creative Computing: Semester 1, Week 1. Programming & Pseudocode." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 1st March 2007.

Saturday, March 03, 2007

07

New year, new template and a new photoshoppery. No, YOU'VE got too much time on your hands! You're reading this aren't you? Don't you have something better to do?