Wednesday, April 30, 2008

Forum - Week 6 - Vicki Bennett and the Art of Sound Collage



I found Vicki's work quite engaging. Mash-ups are a little old hat nowadays, but her development of the genre shows how no horse should be left unflogged. Despite the length of the piece played, the class remained fixated on the musical progression, listening for a recognisable song, melody or voice. In particular I spotted Marilyn Manson's voice, which was somewhat difficult to discern considering it was just a raw recording of his voice with no effects, unlike the majority of his recordings.

It was also good to have David Harris back in the mix (so to speak), and I hope we get some more subjugation to Harrisian values throughout the year.


Bennett, Vicki. "Vicki Bennett and the Art of Sound Collage." Workshop presented at EMU Space, Level 5, Schultz Building, University of Adelaide, 10th of April 2008.

Monday, April 14, 2008

CC3 - Week 6 - Something or other.



Interesting that we find out the blog is now worth 40% of our overall grade, 6 weeks into the semester. Sure, it's extra incentive - next semester. I've got a funny feeling week 2 of these holidays will be jam-packed with SuperfunhappyColliding.

Sunday, April 13, 2008

AA3 - Week 6 - Production "Outside The Square" (3)


Sidechaining is not something I have had a use for, perhaps because I never really knew what context to use it in. For this week's exercises, I used some samples from the Mac library. Each MP3 has an untouched sample followed by the... touched... sample...

1. I tested out sidechaining a gate, so I used a percussive sound (trumpet honks) to control the gate on a continuous sound (a bass synth). After boosting the Threshold and extending the Hold time, I had added a pseudo-bassline to the trumpets.

2. I extended this idea to an actual percussion sound (a drum loop) and used the same bass as before. I used the low pass filter on the sidechain controls so only the lower sounds of the drum loop would open the gate for the bass. While this worked, it sounded a bit dodgy.

3. I then used a (good sounding) drum loop to control the gate on a didgeridoo sound. Only sounds below 89Hz opened the gate, meaning only the kick drum had any effect. I had to play around with the ratio, release and threshold, but I eventually made it sound like someone was actually playing the didgeridoo rhythmically. Finally, I swapped the gate for a compressor, and made the didgeridoo turn down dramatically on each kick drum hit. The result is very Daft Punk, and I can't wait to find a use for it.
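For my own reference, the logic behind all three exercises boils down to the same thing: an envelope follower tracks the sidechain input, and that envelope decides whether the main signal passes (gate) or gets turned down (compressor). Here's a rough Python sketch of the idea; it's not what the plugin actually does internally, and the threshold, attack/release coefficients and ratio are made-up numbers:

```python
def envelope(side, attack=0.1, release=0.99):
    """One-pole envelope follower: tracks the sidechain level,
    rising quickly (attack) and falling slowly (release)."""
    env, out = 0.0, []
    for x in side:
        level = abs(x)
        coeff = attack if level > env else release
        env = coeff * env + (1 - coeff) * level
        out.append(env)
    return out

def sidechain_gate(main, side, threshold=0.5):
    """Gate: let the main signal through only while the sidechain is loud."""
    return [m if e > threshold else 0.0
            for m, e in zip(main, envelope(side))]

def sidechain_duck(main, side, threshold=0.5, ratio=0.2):
    """Compressor-style ducking: turn the main signal down on sidechain hits."""
    return [m * (ratio if e > threshold else 1.0)
            for m, e in zip(main, envelope(side))]
```

The slow release coefficient is what keeps the gate open for a while after each hit - essentially the Hold/Release behaviour I was fiddling with in the first exercise.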

BenAAwk6.ZIP 712KB

Wednesday, April 09, 2008

CC3 - Week 5 - Sound Generation (0)




This is not a markable entry. This week has thrown me about quite a bit, so I guess I'll have to take another 0. I had spent a while creating ambience patches that "demonstrate (my) understanding of unit generators", then reread the planner only to spot that every patch must use either the mouse or an envelope. This might have been okay to implement retrospectively if MouseX/Y actually worked in PsyCollider. I doubt I'll have the time to use the Macs until Friday afternoon (MusEd and cello work due Thursday, working Thursday night, MusEd due Friday).

I guess planning and prioritising are the two main skills that one must possess to be successful at University, so I don't doubt it's my own fault. I may have to scale back the amount of (paid) work I'm doing, as I have been working all Saturday, Sunday and Monday, then Thursday nights as well. Also I've been falling asleep around 5-6pm while I'm trying to do homework, which tends to wreck my study time. Right now I'm writing this at 2am after waking up on the couch with PsyCollider still running. Maybe it was the ocean sound I was creating...
// Ocean Breeze

(
{
	var a, b, c;

	// Main controller
	a = LFTri.kr(
		freq: 0.2,
		iphase: 3,
		mul: LFNoise1.kr(0.3, 0.3, 0.4),
		add: 1
	);

	// Create noise and stereoise
	b = PinkNoise.ar([a, a]);

	// Sweep the filter's centre frequency according to "a"
	c = Resonz.ar(b, (a * 150) + 250, 0.8);
}.play
)

Saturday, April 05, 2008

AA3 - Week 5 - Production Outside the Square (2)



This week Luke and I teamed up to experiment with the Antares Auto-tune plugin. I have used this quite extensively in the past, so we tried to use it out of its intended context. Note - my MP3s are the same as Luke's.

For this first example I sang a glissando line from low to high, then used auto-tune set to C minor with the retune speed set to fastest. The result is more humorous than it is useful.
OverTune MP3

The next example took the previous concept a step further. I sang a guitar solo, then we added auto-tune with the same settings as before and ran it through Amplitube to add distortion. The result is a fairly believable-sounding guitar solo.
Lead Vocals MP3

Finally we tested out auto-tune on my cello. I purposefully did not tune it beforehand, so the results would be more obvious. Auto-tune was set to D minor with medium retune speed. Pitch-correction can be heard on the second note, which means the G string was probably out of tune.
Celloooo MP3
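At the fastest retune speed, auto-tune essentially snaps the detected pitch to the nearest note of the chosen scale. A rough Python sketch of that mapping - definitely not the Antares algorithm, just the general idea:

```python
import math

C_MINOR = [0, 2, 3, 5, 7, 8, 10]  # semitone offsets from C (natural minor)

def freq_to_midi(freq):
    """Convert frequency in Hz to a (fractional) MIDI note number."""
    return 69 + 12 * math.log2(freq / 440.0)

def midi_to_freq(midi):
    """Convert a MIDI note number back to frequency in Hz."""
    return 440.0 * 2 ** ((midi - 69) / 12)

def snap_to_scale(freq, scale=C_MINOR):
    """Hard pitch correction: snap a frequency to the nearest note
    of the given scale (the 'fastest retune speed' case)."""
    midi = freq_to_midi(freq)
    candidates = [12 * octave + step
                  for octave in range(11) for step in scale]
    nearest = min(candidates, key=lambda note: abs(note - midi))
    return midi_to_freq(nearest)
```

So a slightly flat A at 430 Hz gets dragged to the nearest C minor note (A flat, about 415 Hz), which is why the fastest setting sounds so jumpy on a glissando.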

Grice, David. "Production Outside the Square." Lecture presented at Studio One, Level 5, Schultz Building, University of Adelaide, 1st of April 2008.

Thursday, April 03, 2008

Forum - Week 5 - Pierre Henry



Music Technology Forum has almost always been an engaging and thought-provoking medium for technology-based talk; this week, however, was quite disappointing. Watching a 90-minute video on Pierre Henry is not what I would call a successful use of 2 hours, particularly because all of the second and third years would already know who he is and what he did. I understand that Henry helped develop Music Technology into what it is today, but that is seriously all we need to know. How he did it, using now-outdated technology, is rather irrelevant in the context of Music Technology today.

Whittington, Stephen. "Music Technology Forum - Week 5 - Pierre Henry." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 3rd April 2008.

Wednesday, April 02, 2008

Forum - Week 4 - Student Presentations 2



Due to the complex nature of my presentation, I got mine out of the way first. While segregating myself from the speakers (and thus the audience) seemed like a good idea, I wasn't prepared for how dehumanising it is to talk to a computer screen. Hearing everything I say repeated back half a second later didn't help either. In any case, my patch didn't crash, so I shouldn't be complaining. I am glad that no one asked what mark I got, as it would have been a bit disconcerting to the first years.

Luke was up next, showing off his impressive work for the Fringe and his new band. I love the sound of his band; they are obviously heavily influenced by the likes of Godspeed You! Black Emperor. Freddy then unleashed his major project from last year. I'm not sure what I did wrong for mine, but if this got a distinction I must have missed the point entirely.

Matt used the opportunity of a captive audience to present his latest Fresh FM job applications. I don't listen to the radio, so I guess it wasn't really my 'bag'. Finally Douglas presented a screen video capture of a live rendition of his major project. Much like Freddy's, the use of a guitar was prevalent.

Whittington, Stephen. "Music Technology Forum - Student Presentations - Week 4." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 27th March 2008.

CC3 - Week 4 - Sound Generation

I decided to try my hand at additive synthesis again; this time I created a violin. Considering how long it took to create, the code itself doesn't look too impressive. Luckily the sound itself is quite convincing, although it does sound exactly like a synthesised violin. As with any attempt to mimic real life on a computer, realism stems from imperfection. A computer-generated face just doesn't look real without hairs, blemishes, discolouration etc. The same can be applied to musical synthesis. What my violin sound needs is imperfection; maybe a scratchy start, shaky volume as the string is bowed, less instant vibrato, the sound of a finger pressing a string against a fingerboard, and so on. The inherent problem with computers is that we have built them to be so mechanical, so inorganic, that forcing them to imitate real life requires vast amounts of human input. Is the outcome worth the effort? Couldn't I just learn to play a real violin instead?
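To make the imperfection point concrete: a bare stack of harmonics with 1/n amplitudes (roughly the sawtooth-like spectrum a bowed string produces) sounds completely static, while even a tiny fixed detune per partial plus a slow wobble makes it less sterile. A little Python sketch of the idea - the numbers are invented for illustration, and this isn't my actual SC patch:

```python
import math
import random

def additive_tone(f0=440.0, sr=44100, dur=0.01, partials=8,
                  imperfect=False, seed=1):
    """Additive tone: harmonics at f0*n with 1/n amplitudes.
    With imperfect=True, each partial gets a fixed random detune
    (up to 0.2%) and a slow vibrato-style wobble is applied."""
    rng = random.Random(seed)
    detunes = [1.0 + (rng.uniform(-0.002, 0.002) if imperfect else 0.0)
               for _ in range(partials)]
    out = []
    for i in range(int(sr * dur)):
        t = i / sr
        # slow ~5.5 Hz pitch wobble, only in the "imperfect" version
        vib = 1.0 + (0.01 * math.sin(2 * math.pi * 5.5 * t)
                     if imperfect else 0.0)
        s = sum(math.sin(2 * math.pi * f0 * n * detunes[n - 1] * vib * t) / n
                for n in range(1, partials + 1))
        out.append(s)
    return out
```

The two waveforms start identically and then drift apart - which is exactly the kind of subtle inconsistency the clean version lacks.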



CCwk4.RTF (No promises)
CCwk4.SC

Haines, Christian. "Creative Computing - Week 4 - Sound Generation." Lecture presented at Tutorial Room 408, Level 4, Schultz Building, University of Adelaide, 27th of March 2008.