This month’s newsletter is a little late as I am performing as Alias Zone in three different events this month, and was waiting to confirm time slots before I sent this out. Details are in the Upcoming Events section further down, but the short version is I have two sets* this Saturday May 8 – Mountain Skies and ModularWorld – and then another Friday May 21 as part of the Synth Society Summit.
(*As the Firesign Theatre would ask, “How can you be in two places at once when you’re not anywhere at all?” By streaming on the internet, of course…)
For these sets, I used a “hybrid” setup with my studio modular synth at the core: I also played samples, performed pads on digital polyphonic synths, added some acoustic percussion, and used a few virtual instruments, mixing everything in Ableton Live as I performed.
Although many may think of me as someone who is into modular synths, the truth is I’m into sound. Modular synths just happen to be my favorite way to create and perform sounds! I used modulars almost exclusively from 2015 through mid-2020 to explore everything I could do with them in creating my own music. But the long-term goal was always to bring back some of the other ways I’ve enjoyed creating music in the past, and integrate them with the modular. An initial “progress report” on that initiative is the main article in this month’s newsletter.
- featured article: How I’ve been merging the rest of my studio with my modular synth.
- new videos & posts: A primer on using Mutable Instruments’ Marbles.
- Patreon updates: A mini-series on the different ways I use Marbles, and a new Basic Concepts.
- upcoming events: Details on my three performances this month.
- one more thing: Sisters With Transistors.
Merging the Rest of My Studio with My Modular Synth

As I’ve been trying to create more complex solo pieces, I started hitting a barrier in how many parts I could create with my modular at once. I had been using so many samples that I regularly employed both the 1010music Bitbox and Blackbox for the same piece, and recording pads would require at the very least more oscillators, if not fully-patched voices, to play chords. I could start multi-tracking – recording a few parts, repatching the entire modular, and then recording additional parts – or I could add non-modular instruments that I could play at the same time.
Since I currently want to continue to perform these pieces in real time, I decided to go down the second path – which has had the added benefit of giving me a wider palette of sounds to play with. Here’s a breakdown on how I’m currently dividing the musical tasks between different instruments.
In general, the monophonic voices – with the exception of modeled pluck and string modules like Rings – come from the modular, while anything requiring chords is played on a poly synth or a virtual instrument in Ableton Live. I’ve been going through a variety of controller keyboards – it seemed like each music video of mine used a different controller! – but I’ve settled on the Novation 49SL MkIII. The eventual plan is to use a MIDI patchbay plus channel & patch changes on the SL to remotely play all of the polysynths, but so far I’ve liked the “security blanket” of having my main polysynth right next to me while playing – either within reach as a tabletop unit, or played directly from its own keyboard.
So far, I’ve been playing samples of ambience beds, voices that need to be triggered in time with the music, and rhythm loops from the modular, with sampled percussion libraries (individual hits) played from Live – sequenced by the Five12 Vector sequencer, often in conjunction with Grids. I know Live can also handle the loops and timed samples – that’s how I used to perform in the 1990s and early 2000s – but so far I’ve preferred having the samples and their trigger interface right in front of me instead of remote-controlled (I don’t look at the computer’s screen while playing). This may evolve over time if I get back to using layered percussion loops.
Plus, I’ve been using more and more hand percussion to add accents while playing. If the hand percussion part is repetitive, I’ll record a loop of it ahead of time and play that back from one of my sample player modules.
I continue to be in love with the Five12 Vector: its Chance Operations are a perfect match for the way I like to bring generative techniques to “traditional” composition and performance, and the addition of a Novation Launchpad Mini has made it much easier to recall Scenes during a piece to switch multiple sequence parts at once.
In addition to using a few parts to sequence voices in the modular – typically one or two bass patches, and up to three melodic lines (I have its Jack Expander to go beyond the two CV/Gate/Velocity outputs on the main unit) – I have also been using it to sequence virtual instruments in Ableton Live.
(I have also been using the advanced arpeggiators in instruments like the Korg Z1 and the Vector synthesizer as “live performance sequencers” I can play interactively, while the Five12 Vector sequencer and its Chance Operations + Scenes handle all of the modular’s parts.)
I would prefer for the Vector to be the main clock, as it’s in charge of playing all of the parts in time. However, I am also recording all of the individual parts – both audio and MIDI – into Ableton Live, and I want to be able to edit on beat and bar markers in Live rather than just minutes and seconds. This requires Live to also know where the main clock is, and Live is terrible at receiving an outside clock – it requires a lot of resources, causing frequent CPU overloads (especially when recording at high sample rates).
So now, Live is the main clock, sending it over USB MIDI to the Vector sequencer. Fortunately, Live also allows you to adjust the timing of each individual clock stream, making it easier to make sure the various arpeggiators are time-aligned with the modular, and Live’s own timing grid. I have also started recording MIDI from the poly synths and the arpeggiators – even the result of “chance operation” sequences performed on the Vector – back into Live as a safety backup, in case I want to edit the arrangement, timing, or velocity of individual notes later.
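As a rough illustration of why those per-stream timing adjustments matter (the numbers below are generic arithmetic, not measurements from this rig): MIDI clock runs at 24 pulses per quarter note, so the interval between clock ticks – and how far a few milliseconds of offset moves things relative to the beat – depends directly on the tempo.

```python
# MIDI beat clock runs at 24 pulses per quarter note (PPQN),
# so each tick is (60 / bpm) / 24 seconds apart.

def midi_clock_tick_ms(bpm: float) -> float:
    """Interval between successive MIDI clock ticks, in milliseconds."""
    return (60.0 / bpm) / 24.0 * 1000.0

def offset_in_ticks(offset_ms: float, bpm: float) -> float:
    """How many clock ticks a given millisecond offset spans at this tempo."""
    return offset_ms / midi_clock_tick_ms(bpm)

if __name__ == "__main__":
    bpm = 120.0
    print(f"{midi_clock_tick_ms(bpm):.2f} ms per tick at {bpm:.0f} BPM")  # ~20.83 ms
    # A 5 ms adjustment is about a quarter of a tick at 120 BPM:
    print(f"5 ms ≈ {offset_in_ticks(5.0, bpm):.2f} ticks")
```

The faster the tempo, the closer together the ticks, so the same few milliseconds of interface latency becomes a larger fraction of the grid – which is why per-device clock offsets are worth dialing in.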
Recording & Mixing
Before going hybrid, I would mix the modular’s different parts (plus maybe a poly synth for pads) in real time using a combination of the PreSonus StudioLive AR16C analog mixer + digital interface, combined with mixer modules (primarily the Frap Tools CGM series) to do sub-mixes inside the modular.
Almost all of the effects were done inside the modular, plus maybe an external reverb pedal like the Source Audio Ventris or Strymon Nightsky. I even routed the auxiliary busses from the AR16C back to the modular, with the effect returns going to their own stereo channels on the AR16C. I then recorded these as multitrack stems in Adobe Audition.
As I mentioned above, I then switched to recording the multitrack audio in Ableton Live so that I could edit based on bars and beats. Live can also record mix automation in real time along with the audio, and I have been slowly adding a few plug-in effects in Live. These changes mean it makes a lot more sense to do the real-time mixing as well as the recording in Live, as it can capture the full-level, pre-fader audio plus my mix moves. This allows me to re-do a mix later without any loss in quality.
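The pre-fader idea can be sketched in a few lines (purely illustrative – this is not how Ableton stores anything internally): because the audio on disk was never attenuated by the faders, a later remix simply applies different gains to the same untouched source instead of trying to undo a baked-in mix.

```python
# Illustrative sketch: pre-fader recording keeps the source and the
# mix moves separate, so they can be recombined differently later.

pre_fader = [0.8, -0.5, 0.9, -0.2]            # audio captured at full level
live_automation = [0.5, 0.5, 0.25, 0.25]      # fader moves recorded during the show

# What the audience heard: source scaled by the live fader moves.
live_mix = [s * g for s, g in zip(pre_fader, live_automation)]

# Later remix at full quality: new gains applied to the same
# unattenuated source, since nothing was lost on disk.
new_automation = [1.0, 1.0, 0.5, 0.5]
remix = [s * g for s, g in zip(pre_fader, new_automation)]
```

Had the faded (post-fader) signal been recorded instead, raising a quiet channel later would also raise its recorded noise floor – keeping source and automation separate avoids that.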
As a result, the AR16C has been functioning just as an audio interface and mic preamp. As I added more instruments, I needed more channels, so I’ve started using a Roland Rubix 44 to handle effects sends and returns, combined with the AR16C as an aggregate audio device in macOS.
I’ve found it to be extremely important to use the same interface for both input and output in Live; otherwise, I was getting periodic audio glitches as Live tried to resolve the time bases of the different interfaces – even with a 1024-sample buffer, running on a high-end computer. Now I’m down to a 64-sample buffer while recording at 96 kHz (a subject unto itself), with all the tracks recorded to 32-bit files. A buffer this small makes the latency of going to and from the computer bearable.
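As a back-of-the-envelope check (generic arithmetic, not a measurement of this particular system – real round-trip latency also includes converter delay and more than one buffer), the latency contributed by a single audio buffer is simply its length divided by the sample rate:

```python
def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """One-way latency contributed by a single audio buffer, in milliseconds."""
    return buffer_samples / sample_rate_hz * 1000.0

if __name__ == "__main__":
    # A 64-sample buffer at 96 kHz: well under a millisecond per buffer.
    print(f"{buffer_latency_ms(64, 96_000):.2f} ms")    # ~0.67 ms
    # A 1024-sample buffer at the same rate: over 10 ms each way.
    print(f"{buffer_latency_ms(1024, 96_000):.2f} ms")  # ~10.67 ms
```

This also shows one side benefit of running at 96 kHz: for a given buffer size in samples, doubling the sample rate halves the buffer’s contribution to latency.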
To control the mix, I started by using the knobs on an Ableton Push, and then moved to the knobs and faders on my current controller keyboard, the Novation SL. The SL works very well as a control interface for Live, complete with multiple OLED displays showing you what’s on your tracks and devices in Live. Since it’s not uncommon for me to have more than 8 mono or stereo tracks of instrument stems, effects returns, mic inputs, and more, I quickly got tired of “paging” through the inputs on the SL to fit them into its eight sliders and knobs. So, I added a Novation Launch Control XL to give me an additional eight channels of physical mix controls.
With all of this virtual control, I no longer use the mixer modules in the modular synth to do level changes; now they are primarily used for modular-controlled panning. If a modular voice (or sample playback) does not require panning tricks, I just route it straight through a modular-to-balanced-audio interface to the AR16C or the Rubix.
I expect this system to continue to evolve as I get more experience with it, and as I want to add even more parts. But in the meantime, this has been working well for me, and I’m comfortable with it. You can see and hear it in action in my various performances this month; my SoundQuest Fest performance was probably my first big test of it.
New Videos & Blog Posts
I’ve been writing a series on generative patch techniques for my subscribers on Patreon. Recently, I showed a variety of ways I use Mutable Instruments’ Marbles – but realized that before I demonstrated them, first I should demonstrate Marbles itself. Since this is a module that many have or have considered getting, I decided to make that introductory post freely available for all – click here to read & watch it.
Patreon Updates

I’ve been continuing my Thoughts on Generative Patching series with a four-part installment based around the Mutable Instruments Marbles, and wrote another Basic Concepts piece based on a question from a patron.
- Basic Concepts 03: Precision Adders versus Quantizers is available to all patrons with a current subscription.
Thoughts on Generative Patching 4: Using Marbles (first post free to patrons and non-patrons; remaining posts for those subscribed to the +5v level and above):
- 4.1: A Quick Overview of Mutable Instruments Marbles
- 4.2: Using Marbles to Change Timbre
- 4.3: Using Marbles to Switch Between Sound Sources
- 4.4: Using Marbles to Modify Effects
I have new Notes from the Studio installments planned, as well as breakdowns for each of the three pieces I’m performing this month.
Upcoming Events

I have three performances planned for this month:
Mountain Skies: Saturday May 8, 12:30 PM PDT/3:30 PM EDT/9:30 PM CEST
This will be a half-hour audio-only performance of a new “space music” piece I am creating this week, tentatively called Sputnik’s Ghosts. It will be streamed on Radio Spiral.
ModularWorld First Anniversary: Saturday May 8, ~3:20-3:30 PM PDT/6:20-ish EDT/12:20-ish Sunday morning CEST
This will be a 10 minute video performance of an upbeat little piece called Four O’Clock – Raining, followed by a short interview. This is part of a 30+ hour, 130 artist event, so timing will slide around a little bit; the two acts before me are Elin Piel and Sequenox, and John McKenna is after me. It will be streamed on ModularWorld’s YouTube channel.
Synth Society Summit: Friday May 21, somewhere between 7-10 PM PDT/10 PM-1 AM EDT/4-7 AM Saturday CEST
This is another ~10 minute video set, of a “spiritual” piece called Devotion. I do not have the exact time yet; I’ll announce it on both the Alias Zone and Learning Modular Facebook and Instagram pages when I do. It will be streamed on the Southern California Synth Society YouTube channel.
If you miss the original broadcasts, I’ll place the videos of Four O’Clock and Devotion on the Alias Zone YouTube channel after those events. I have not decided yet what I’m going to do with Sputnik’s Ghosts, as there will be no video; if it goes well, I’ll post it on the Alias Zone Bandcamp page.
One More Thing…
There is an enjoyable, educational new documentary called Sisters With Transistors on “electronic music’s unsung heroines” including Clara Rockmore, Bebe Barron, Suzanne Ciani, Laurie Spiegel, Daphne Oram, Pauline Oliveros, Delia Derbyshire and Eliane Radigue, created by Lisa Rovner. It can be streamed on Metrograph through May 6 (subscription required – just $5 for one month), which includes access to a Lisa Bella Donna concert, Laurie Anderson’s Home of the Brave, and other experimental videos with soundtracks created by composers mentioned in the documentary.
The documentary can also be viewed in the UK through the Modern Films web site, and other parts of the world through Vimeo. This is a limited-time engagement, so watch it soon!
As the pandemic lock-downs were starting last year, I teamed up with my good friends Ben Wilson (DivKid) and Kim Bjørn (co-author of Patch & Tweak) to create the streaming Feeding the Monster series, explaining the reasoning behind every module in my studio system as I installed it, and ending with a few quick demos. One year later, we’re considering doing a follow-up, showing which of my initial plans worked, explaining which modules have been replaced, and possibly ending with a little performance or more demos. We’re tentatively talking about doing this in June; stay tuned for updates.
enjoying life –