Ableton has updated Live 9 to v9.1.5. Improvements and feature changes: Minor improvement to ensure compatibility with Mac OS X's Gatekeeper for the forthcoming OS X updates 10.9.5 and 10.10. Improved [Read More]
Listen to Gerhard Behles (CEO, Ableton) and Matt Black (Ninja Tune, Coldcut) on Music and Democratization
Thursday, September 25th, 2014
Music as a social medium is perhaps as profound a connection as we can have between people. And it’s a unique pleasure to get to reflect on that with someone like Gerhard Behles or Matt Black. Yesterday, we got both at the same time. I’ll even listen to this conversation again; there’s plenty of fuel for further thought.
Before apps, Gerhard Behles and Robert Henke shared their Monolake Max/MSP sequencer (by Henke – still available); back when music production offered little in real-time, they had the vision to offer Ableton Live. When “VJ” still meant a host on MTV, Matt Black was building new tools to remix video alongside music, inspired by hip-hop technique to re-conceive digital expression and sampling.
Now, Ableton serves millions of users; Matt Black and Ninja Tune encourage users to remix their artists on their phones with Ninja Jamm.
And it seems anyone, anywhere can produce. Matt and Gerhard reflected with me yesterday on where they’ve come from, where their endeavors are today, and where we’re headed.
They got deep into the philosophy of why we make music, and where their responsibilities lie as tool makers and as individuals, where artists and labels and communities might go.
We have audio on SoundCloud:
And video (top).
Thanks to Social Media Week Berlin and Platoon for hosting us!
Few pieces of music hardware ever have had the impact that KORG’s electribe series has. And there was a time when playing live almost equated to showing up with this gear. Today, KORG has a genuinely new generation of that hardware, long awaited by fans. The engines under the hood are new, finally taking the tech we’ve seen on various KORG gadgets and building it into the flagship production gizmos. They allow for more live performance scenarios.
And in a first, you can use an electribe to build patterns for Ableton Live, creating patterns on the go or onstage that you can bring back into your studio.
And in a nod to the endless rise of the MPC-style grid, these are electribes with pads on them. There’s still an X/Y pad, but it’s shrunk to dimensions resembling a trackpad. And there are loads of knobs, the effect being oddly reminiscent of Swedish drum machine maker Elektron as much as something from KORG.
There are actually two electribes today: one called simply “electribe,” the other “electribe sampler.” (Yes, that new capitalization is official, too.)
Pad workflow. The 16 pads (2×8) can be a real-time recording keyboard or step sequencer. And you can use “step jump,” inspired by the volca series, or change length. True to KORG, there’s also motion sequencing for knobs and buttons.
These pads are apparently inherited from the taktile keyboards (yes, by popular demand, we’ll have a CDM review of that). They’re velocity-sensitive, though you can switch that off if your finger drumming chops are deficient.
Touch pad. That X/Y pad now uses a touch scale from the kaossilator – jam with your fingers without any wrong notes.
New synth engine. Yes, there’s a serious synth inside. 409 oscillator waveforms cover both analog modeling and PCM. The analog-modeled synth engine includes basic waves as well as dual, unison, sync, ring modulation, and cross modulation combinations. The PCM content, KORG says, focuses mainly on rhythm but also includes multisamples for melodic material. And there’s the filter engine from the KingKORG, too – with the ability to route drums through the same filter.
Now, this isn’t really a full-blown synth as far as control; think macro-style controls of a deeper engine. That makes the electribe synth into more of a preset box than a sound programmer’s dream, but this is an electribe, after all. (KORG promises presets covering the genres you kids like so much like “trap” and “EDM,” which makes us shudder. But as with KORG gear of yore, I’m sure we’ll dial our way to the stuff we actually like.)
Per-part effects, grooves; live performance features. Of course, you really get into electribe territory once you start adding effects and such and actually jamming.
There’s now per-part compression and overdrive, per-part insert effects, and per-part groove templates, so not everything is master-bus stuff.
This being an electribe, when you do start applying master effects, you get KAOSS Pad-style control on the touch pad. Seq Reverse and Odd Stepper apply even to the sequencer. And, so you can alienate your friends, there’s a “Vinyl Break” effect. (Yay! Actually – augh! No! Turn it off!)
The performance additions look really nice. “pattern set” lets you switch patterns with the trigger pads. You can then record that sequence of pads as an “event recording” – so you can jam on arrangements with the pads, then save that jam (either to store a performance live or to experiment with arrangements).
I/O and batteries. MIDI in/out, sync in/out for the volca, monotribe, and MS-20/MS-20 mini, and battery operation on six AA’s. This is simply a killer mobile unit.
The electribe sampler is basically the same as the electribe; the easiest way to tell it apart is its darker gray color.
The difference is the sound engine, which on the sampler (versus the standard electribe) is a hybrid sampler-synth engine.
You still get the analog modeling sound engine. (It seems this is missing all the PCM melodic content, but that’s it.)
In place of the preset PCM engine, you can add your own samples – 999 preset and user samples, with a maximum 270 seconds of sampling time (in mono, or half that for stereo).
Time Slice automatically detects attack transients, so you don’t have to do any work to slice things up. And there’s of course pitch-independent tempo changes. You can take slices and add them to steps or parts, or add per-sample effects.
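KORG hasn’t published how Time Slice actually works, but automatic slicing of this kind is commonly done by watching for sudden jumps in frame energy. Here’s a minimal sketch of that general technique – an illustration, not KORG’s algorithm:

```python
# Rough sketch of energy-based attack-transient detection, the general
# idea behind automatic sample slicing. NOT KORG's actual algorithm --
# just a common, simple approach, for illustration.

def detect_onsets(samples, frame_size=512, ratio=4.0):
    """Return sample offsets where frame energy jumps sharply upward."""
    onsets = []
    prev_energy = None
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[start:start + frame_size]
        energy = sum(s * s for s in frame) / frame_size
        # Flag an onset when energy jumps well above the previous frame
        if prev_energy is not None and prev_energy > 0 and energy > prev_energy * ratio:
            onsets.append(start)
        prev_energy = energy
    return onsets
```

Real slicers refine this with spectral flux and sub-frame peak picking, but the principle – an attack is a sudden rise in energy – is the same.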
There’s also resampling, with knobs controlling pitch or modulation.
All in all, the sampling workflow looks terrific, intuitive, and very electribe-ish. Add that to the enhanced performance features, and to me, electribe sampler looks like a real winner.
You can also see the differentiation here between the volca sample and electribe sampler. I don’t think that’s so much market differentiation between the two – there’s little risk of the volca cannibalizing electribe sales – as it is that fundamentally, the volca is a different animal. It’s philosophically designed to be simpler and cheaper.
Samples are 16-bit/48 kHz. There’s a stereo minijack line input for sampling.
Both units feature USB, MIDI in and out jacks, and an SD card for storage.
When Ableton met electribe
It used to be, if someone said they were playing a live set, they actually meant they were showing up with ElecTribes. These days, of course, it’s Ableton Live. And Live is a wonderful tool, especially when combined with hardware like Push. But … yeah, we miss the old hardware days.
Then again – why choose?
What may turn out to be the killer feature of the new electribe generation is that it now exports to Ableton Live sets. KORG even says it’s a collaboration with the folks at Ableton.
Your patterns and parts are saved to scenes and clips. Open these files on your computer, and you see them inside a Live Set.
There’s even a copy of Live Lite in the box, but — yeah, you probably don’t need that.
No need for explanation here – this is huge. You now have a battery-powered unit you can use away from Ableton Live that can make drum parts, melodic parts, and even live sample, and then you can finish off songs and arrangements back on your machine. If you like starting songs on hardware and getting away from the computer, or if you want to integrate KORG’s hardware with your live set and then later turn jams into songs, it’ll be a beautiful combination.
All in all, I think the electribe is some of the best news on the market in a long time for hardware workflows.
No word yet on price or availability, but I hope we’ll be first in line for a review.
The post New KORG electribe Focuses on Live Performance – and Export to Ableton Live appeared first on Create Digital Music.
Ableton alone can’t take you mobile, apart from bringing your MacBook running Live on the bus. But now KORG is ready to take your Ableton Live work on the road. Apart from adding native Live set export to their electribe and electribe sampler, the new versions of KORG’s iOS apps Gadget and iKaossilator do export, too.
And that’s just one feature in the deceptively-named “1.03” release of KORG’s Gadget.
Gadget is one of those apps that I’ve had to file under “wow, this looks cool but I’ve no time.” As the name implies, you get a selection of synths and drum machines. Here’s where having a newer iPad benefits you, too – the latest processor runs up to 20 gadgets at once. There’s a 303-style bass, PCM and digital synths, virtual analog synths, semi-modulars, percussion synths, “wobble” and chip goodies. Then, you can either perform live with the lot or save patterns.
1.03 finally makes integrating that goodness easier, with MIDI input, Live export, and multitrack export, for starters:
MIDI input. You can connect MIDI devices for easier playability. KORG has wisely made their latest gear class-compliant, so that includes KORG keyboards – make the full-sized Taktile into a synth by adding an iPad, or going the other way, add a tiny nanoSERIES input for tactile control beyond the touchscreen. (And yes, that’s a huge limitation of Native Instruments’ pricey Komplete Kontrol keyboards we saw this week; they need a computer to function, so you essentially pay more for hardware that does less.) See the whole KORG controller lineup.
Export as Ableton Live sets. Each phrase and scene in Gadget now exports natively to an Ableton Live set with clips and scenes, respectively. You can transfer via Dropbox or iTunes File Sharing. And if you don’t use Ableton –
Export as individual audio tracks. For Pro Tools or Maschine, that means another way of moving phrases and songs around. (It appears you’d have to leave one phrase per track in order to separate them, so it’s not as convenient as the Ableton Live export, perhaps, by definition – but still very workable.)
Other enhancements: 64-bit, landscape. 1.03 also fixes an annoyance: you can now use landscape mode and not just portrait. It’s also 64-bit native, bringing big performance gains on new Apple hardware.
There are also two new instruments:
Bilbao is a US$9.99 in-app sample player, with 16 one-shots and import:
Nice, but Abu Dhabi is, I think, more interesting – sample slicing and groove manipulation, also US$9.99:
And that’s on top of 1.02’s additions. That last “minor” update brought Audiobus support, a better sequencer and UI, a beginners’ guide, and more. It also has “Increased Japanese,” which is always a good thing. See the what’s-new guide for all the specifics.
KORG has been releasing featured tracks from the community, too. Let’s have a listen. (Maybe there are some CDM readers in the bunch?)
The app is on sale, alongside KORG’s other apps, through September 8th on the App Store for US$28.99 (instead of the usual US$38.99).
The post KORG Gadget for iPad Gets Serious, with MIDI, New Slicer-Samplers, and Ableton Export appeared first on Create Digital Music.
Jesse Abayomi, Ableton Product Specialist, is one heck of a virtuoso Push player. And you can learn something from him, too.
Performance technology doesn’t always add to performance, it’s true. But when the machine and human are in sync, it’s beautiful. People can develop their musical chops and machine control chops at once – improve on their musical practice and technique. And when that happens, the quality of performances actually gets better.
I’ve seen a funny thing as Push has crept into performances. Just as with the spread of custom controllers in the past, access to more playing technique has livened up live sets. It literally makes it more fun to be an audience member – not only if you’re (cough) one of us creepy, nerdy people always hovering behind the screen of players, but even when out in the crowd, listening to the music being more dynamic.
Ableton, for their part, have begun spotlighting artists using Push. This is marketing stuff, but they’ve also presented some real techniques you can learn from. That is, they might be trying to sell you Push, but if you’ve got one already, you should pay attention.
With Jesse in Zone3 guise (shifting from his techno and house realms into bass music), he does some amazing things on “Chemistry.” It’s also a nice catalog of the sort of functions Push can accomplish. By my count, that includes:
- Clip and scene triggering
- Pad triggering (live, with velocity)
- Step sequencing (percussion, melodic)
- Step sequencing one-shot samples (in place of triggers)
- Melodic playing (bassline)
- Parameter control (via a macro – more on that below)
He also goes into some tips in a second video. The ingredients:
- Audio, instruments, live effects
- Drum Rack with samples
- Record quantize on (1/16), but with swing
- August synthesizer for Max for Live (a Juno-106 emulation)
- Chords and Arpeggiator
- MultiMap – parameters across tracks, which allows a buildup (by mapping everything to a macro)
I’ll go one step further: I love Push, but only after adding some scripting and modifications. Push’s versatility is both its greatest strength and its greatest weakness. While it takes you off the computer screen and mouse, it embodies the functions of the software – that is, it does a whole lot at once. To handle all those multiple functions across multiple tracks, there’s a lot of toggling to switch functions. Done improperly, that can take you out of the set, leaving you all muddled.
MultiMap for Max for Live is one really brilliant solution, as he shows in the video. By assigning multiple controls to one, it makes it possible to command a lot of control at once.
It’s also well worth checking out nativeKONTROL PXT-Live for Push (mouthful, that name, sorry). PXT does a lot – this is the script from the same genius scripter who gave you (as the name implies) awesome templates for KORG’s hardware. But there’s one feature alone that makes it worth some cash: you can lock devices, so that switching tracks doesn’t suddenly have you playing the wrong input. I didn’t even bother requesting a review copy; I just sent the guy some cash.
You should, too. It’ll change your world, if you’ve got a Push.
It does so much, that it’s worth a separate review, but I think while I work on that, Jesse gives you more than enough to play with here.
In the midst of a lot of heated discussion of Native Instruments’ new controller keyboards, now is the perfect time to revisit Push, too. Push centers on a grid, not a keyboard, but also features lots of functionality for automatically displaying and controlling parameters without looking at the computer screen. And while Push, like Komplete Kontrol, is built to work with Ableton’s own software, you do get added openness by virtue of Live being a DAW. You can certainly map controls easily not only to Ableton instruments, but plug-ins.
Now, if Ableton were to build Ableton Play – Push on a keyboard – well, we’d have quite a horse race. (Um… Ableton Ivory? No. Ableton Keys? Hrm.) I’m actually curious whether any keyboardists would want a Play alongside.
There are other ways to skin a cat, too; sometimes a Roland Handsonic alongside a Live set is more fun to watch, minus all this step sequencing business. But it’s fun to watch the “I’m sequencing and finger drumming and clip triggering and playing tunes all at once” workflow evolve.
Anyway, I’m off to play with my new Faderfox UC3, because faders. (Yep, this could be your answer if you’re annoyed at all that knob twisting on Push.)
The post This Virtuoso Ableton Push Performance Comes Full of Tips for Controllerists appeared first on Create Digital Music.
Ableton Live’s iPad-augmented control can take several forms. There are apps that do everything, replicating the mouse so that you can go directly to touch for every single task and avoid your computer completely (Touchable, for instance). There are specialized controllers, which focus on a few tasks or a particular device or Max patch.
And then there’s Conductr, and it’s something a bit different.
First, as of June, it’s free – or freemium, anyway. The free version is limited to four tracks and eleven scenes, but it’s enough to give you a taste. And with user modules, it’s easily a workable complement to other controllers like Push, even without touching the in-app purchases. (That seems to me a use of in-app purchases that actually makes sense.)
But most importantly, Conductr focuses on giving you controls that are well suited to the iPad. This isn’t about the iPad screen pretending to be hardware faders and knobs, or about cramming a mouse-style interface under your fingers. Instead, the Barcelona-based developers of Conductr have made an interface that intentionally does less. It’s cleaner, easier to see, and less crowded. You can leave just a few controls, use gestural controllers, and even set up layouts that don’t require you to look at your iPad. As a keyboardist, and someone who finds the iPad sometimes as clumsy as I do invaluable, it’s great.
And today’s update is really the best development of that concept. The XY-4D is simply the best X/Y controller I’ve seen on the iPad yet. It’s totally user assignable, and makes your iPad into something like a 21st Century, Star Trek-chic KAOSS Pad.
There’s a single user-assignable XY-4D pad now free in the free edition of Conductr, so you don’t have to listen to me drone on about it, but here’s what it does:
- 4 XY units inside
- Each unit, four parameters (horizontal, vertical, pinch, tap to toggle on/off)
- Reset, freeze
- Up to four modules per view (for up to 64 parameters)
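To make that arithmetic concrete – four gestures per unit, four units per module, four modules per view – here’s a toy model of the layout. The class and method names are invented for illustration, not Conductr’s actual internals:

```python
# Hypothetical model of the XY-4D layout described above.
# All names here are illustrative, not Conductr's real API.

GESTURES = ("horizontal", "vertical", "pinch", "tap")

class XYUnit:
    """One of the four XY units inside an XY-4D module."""
    def __init__(self):
        # one assignable target parameter per gesture dimension
        self.assignments = {g: None for g in GESTURES}

    def assign(self, gesture, target):
        self.assignments[gesture] = target

class XY4DModule:
    """An XY-4D pad: four XY units."""
    def __init__(self):
        self.units = [XYUnit() for _ in range(4)]

class View:
    """A view holding up to four modules."""
    def __init__(self):
        self.modules = [XY4DModule() for _ in range(4)]

    def parameter_count(self):
        # 4 modules x 4 units x 4 gestures = 64 assignable parameters
        return sum(len(unit.assignments)
                   for module in self.modules
                   for unit in module.units)
```

A full view works out to 64 assignable parameters, matching the spec above.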
It’s so nice, in fact, that I’d love to assign this to other tools, too. (Maschine, for instance – though I can do that now within the host, which is probably where I’ll start.)
It’s also great to see them using pinch; hopefully more controllers take advantage of the various gestures possible on iPad.
I hope the developers, PatchWorks, continue on this path. It really does seem better suited to what the iPad is. Let us know how you wind up using it.
Here’s their manifesto, from a recent press release, which I appreciate:
We do not want to replicate hardware on a touchscreen; we want to get maximum advantage of multitouch technology to give musicians the kind of resources that they can’t get from a hardware controller. From an ergonomic interface that adapts to any momentary need —in other words: you only see what you need at any given time— to a gestural mode that allows the user to play without watching the iPad and a modular basis that will permit the app to grow through the addition of complementary modules.
More on the controller and how it works:
And here’s another freebie for you – a loop pack from Hermético and Sr. Click on netlabel Inoquo. It’s pre-mapped to Conductr, so a nice way to explore the tool:
More at their site – and also check out some very nice artist profiles and blog entries; they’ve been rather busy!
The post An iPad Controller for Ableton That’s Gesture-Friendly, Free: Conductr, Now with X/Y [Gallery] appeared first on Create Digital Music.
Digital music can go way beyond just playback. But if performers and DJs can remix and remake today’s music, why should music from past centuries be static?
An interactive team collaborating on the newly reopened Museum im Mendelssohn-Haus wanted to bring those same powers to average listeners. Now, of course, there’s no substitute for a real orchestra. But renting orchestral musicians and a hall is an epic expense, and the first thing most of those players will do when an average person gets in front of them and tries to conduct is, well – get angry. (They may do that with some so-called professional conductors.)
Instead, a museum installation takes the powers that allow on-the-fly performance and remixing of electronic music and applies it to the Classical realm. Touch and gestures let you listen actively to what’s happening in the orchestra, wander around the pit, compare different spaces and conductors, and even simulate conducting the music yourself. Rather than listening passively as the work of this giant flows into their ears, they’re encouraged to get directly involved.
We wanted to learn more about what that would mean for exploring the music – and just how the creators behind this installation pulled it off. Martin Backes of aconica, who led the recording and interaction design, gives us some insight into what it takes to turn an average museum visitor into a virtual conductor. He shares everything from the nuts and bolts of Leap Motion and Ableton Live to what happens when listeners get to run wild.
First, here’s a look at the results:
CDM: What was your conception behind this piece? Why go this particular direction?
Martin: We wanted to communicate classical music in new ways, while keeping its essence and original quality. The music should be an object of investigation and an experimental playground for the audience.
The interactive media installation enables selective, partial access to the Mendelssohn compositions. The audience also has the opportunity to get in touch with conducting itself. They can easily conduct this virtual orchestra without any special knowledge — of course, in a more playful way.
It was all about getting in touch with Mendelssohn as a composer, while leaving his music untouched.
The idea for the Effektorium originated in the cooperation between Bertron Schwarz Frey Studio and WHITEvoid. WHITEvoid worked on the implementation.
What was your impression of the audience experience as they interacted with the work? Any surprises?
Oh yes, people really loved it.
The audience during the reopening was pretty mixed — from young to old. Most people were very surprised at what they were able to do with the interactive installation. I mean, everything works in real time, so you have direct feedback on whatever you’re doing. The audience could be an active part — I think that’s what people liked most about it.
Visitors can also just move around within the “orchestra pit” in order to listen to the individual instruments through their respective speakers. This creates the illusion of walking through a real orchestra. Normally you are not allowed to do that in a real concert situation. So this was also a big plus, I would say.
I saw a lot of happy faces as people played around with the interactive system and as they walked around within the installation room.
Can you explain the relationship of tracking/vision and sound? How is the system set up?
There are basically two computers connected to each other and communicating via Ethernet to run the whole system. The first computer runs custom-made software, built in Java and OpenGL, for the touchscreen, Leap Motion control (via its Java SDK), and the whole room’s lighting and LED/loudspeaker sculptures. Participants can navigate through various songs of the composer and control them.
The second computer is equipped with Ableton Live and Max for Live. Ableton Live is the host for all the audio files we recorded at the MDR Studio in Leipzig, with some 70 people and lots of help. We had specific needs for that installation, for both the choir and the orchestra sounds. So everything had to be very isolated and very dry for the installation, which was very unusual for the MDR Studio and their engineers and conductor.
Within Live, we are using some EQs, the Convolution Reverb Pro, and some utility plug-ins. That’s it, more or less. Then there is a huge custom-made Max Patch/Max for Live Patch … or a couple of patches, to be exact.
We decided to just work exclusively with the Arrange view within Live. So this made it easy to arrange all the orchestral compositions within Live’s timeline, and to read out these files for controlling and visualisation.
Both computers need to know the exact position at the same time in order to control everything fluidly via touchscreen and Leap. For the light visualisation, we also needed this data to sync the LEDs properly to the music.
We basically read out the time of the audio files — we’re basically tracking the time and the position within Ableton’s timeline.
How does the control mechanism work – how is it that visitors are able to become conductors, in effect? How do their gestures affect the sound?
The Leap Motion influences the tempo only (via OSC messages). One has to conduct in time to get the playback at the right tempo. There’s also a visualisation of the conducting so visitors can see if they are too slow or too fast. You have two choices when you enter the installation: playback audio with conducting, or playback audio without conducting. If you choose “playback audio with conducting,” you have to conduct all the time; otherwise the system will stop and kindly ask you to continue.
For the audio, we are working heavily with the warp function in Live to keep the right pitch. But we scaled it a bit to stay within a certain value range. The sound of the orchestra was very important, so we had to make sure that everything sounded normal to the ears of an orchestral musician. Extreme tempo changes and of course very slow and very fast tempo was a no-go.
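Conceptually, this kind of tempo tracking reduces to averaging the intervals between conducted beats and clamping the result to a safe range. A minimal sketch of the idea – the BPM bounds here are made up, since the installation’s actual limits aren’t published:

```python
# Sketch of conducting-driven tempo tracking: average the gaps between
# conducted beats, convert to BPM, clamp to a musically safe range.
# MIN_BPM/MAX_BPM are assumed values, not the installation's real limits.

MIN_BPM, MAX_BPM = 60.0, 140.0

def bpm_from_beats(beat_times):
    """beat_times: ascending timestamps in seconds, one per conducted beat."""
    if len(beat_times) < 2:
        return None  # not enough gestures yet -- keep the current tempo
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    avg = sum(intervals) / len(intervals)
    bpm = 60.0 / avg
    # clamp so the warped audio never sounds unnatural
    return max(MIN_BPM, min(MAX_BPM, bpm))
```

Clamping is the detail Martin flags: time-stretching stays transparent only within a modest range, so wild conducting gets gently ignored.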
And you’ve given those visitors quite a lot of options for navigation, too, yes – not only conducting, but other exploration options, as well?
The touch screen serves as an interactive control center for the following parameters:
- position within the score
- volume for single orchestral or choral groups
- selective mute to hear individual instruments
- visualization of score and notes
- the ability to compare a music piece as directed by five different conductors
- room acoustics: dry, music salon, concert hall, church
- tuning: historical instruments (pitch 430 Hz) and modern instruments (pitch 443 Hz)
- visualization of timbre and volume (LEDs)
- general lighting
So all these features could be triggered by the visitor. The two computers communicate via OSC. Every time someone hits one of the features via touchscreen, Max for Live gets an OSC message to jump position within the score or to change the room acoustics (Convolution Reverb Pro) on the fly, for example.
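For those curious what an OSC message actually looks like on the wire, here’s a bare-bones encoder using only the Python standard library. The address shown is invented – the installation’s real OSC namespace isn’t documented:

```python
import struct

# Minimal OSC 1.0 message encoder, to illustrate the kind of messages
# the touchscreen computer might send. The address is made up.

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    data += b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address: str, *floats) -> bytes:
    """Build an OSC message with float32 arguments."""
    typetags = "," + "f" * len(floats)
    msg = osc_pad(address.encode()) + osc_pad(typetags.encode())
    for f in floats:
        msg += struct.pack(">f", f)  # OSC floats are big-endian float32
    return msg

# e.g. send over UDP with the standard socket module:
#   sock.sendto(osc_message("/tempo", 120.0), (host, port))
```

In practice you’d use a library like python-osc, but the format itself is just this: padded address, padded type tags, big-endian arguments.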
Right, so you’re actually letting visitors change the acoustic of the environment? How is that working within Live?
We had to come up with proper signal routing to be able to switch room acoustics fluidly with the IR-based plug-in. The room acoustics especially were a big problem in the beginning. We really wanted to use the built-in Convolution Reverb, but figured out that we couldn’t run ten or more instances of that plug-in at the same time without latency problems. So now we are basically using three of them at a time for one setting (the concert hall room acoustic, for example). All the other reverbs are turned off or on automatically while switching settings. Everything runs very fluidly now, without any noticeable latency. I would say this wouldn’t have been possible five years ago, so we are happy that we made it.
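The switching scheme Martin describes – keep a few reverb instances preloaded per room and toggle which set is active, rather than reloading impulse responses – can be sketched like this. The device names are invented for illustration:

```python
# Sketch of the reverb-switching idea: one preloaded set of convolution
# reverb devices per room acoustic; switching a room means enabling its
# set and disabling everything else. Device names are invented.

ROOMS = {
    "dry": [],
    "music salon": ["salon_rev_1", "salon_rev_2", "salon_rev_3"],
    "concert hall": ["hall_rev_1", "hall_rev_2", "hall_rev_3"],
    "church": ["church_rev_1", "church_rev_2", "church_rev_3"],
}

def switch_room(room):
    """Return (enable, disable): device lists for switching to `room`."""
    on = set(ROOMS[room])
    off = {d for devices in ROOMS.values() for d in devices} - on
    return sorted(on), sorted(off)
```

Because the inactive instances are merely disabled, not unloaded, switching is instant – the IRs never have to be re-read.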
How did you work with Leap Motion? Were there any challenges along the way; how happy were you with that system?
The Leap Motion is very limited when it comes to its radius: it’s only sensitive within about 60 cm, which isn’t much, so we had to deal with that. Some of the testers, conductors and orchestra musicians, were of course complaining about the limited radius. If you look at how conductors work, you can see that they use their whole body. So this kind of limitation was one of our challenges: to satisfy everyone with the given technology.
We could’ve used Kinect, but we went for Leap, because it allowed us to monitor and track the conductor’s baton. The Kinect is not able to monitor and track such a tiny little thing (at least not the [first-generation] Kinect). This was much more important for us than to be able to monitor the whole body, and it was also part of the concept.
I would say that we were kind of happy with the Leap Motion, but the radius could be bigger. Maybe this will change with version 2, but I don’t know what they have planned for the future.
Another challenge was the feature list. We had a lot more features in the beginning, but we found out very quickly that one feature would be enough for the Leap Motion tracking, especially when you think of the audience who will visit this kind of museum. It has to be easy to understand and more or less self-explanatory. So by gesturing up and down, one influences the tempo of Mendelssohn’s music only – that’s basically it. All the other features were given to the touchscreen, function-wise. So we basically have two interactive components in the installation setting.
Your studio was one among others. How did you collaborate with these other studios?
Bertron Schwarz Frey and WHITEvoid were basically the lead agencies. Bertron Schwarz Frey is the agency that was responsible for the whole redesign of the Mendelssohn-Haus museum in Leipzig and we and WHITEvoid just worked on the centerpiece of the newly reopened Mendelssohn Museum – the interactive media installation “Effektorium”. So we worked directly with WHITEvoid from the very beginning and our part was mainly the sound part of the project.
I am very happy that Christopher Bauder from WHITEvoid asked us to work with him on this project. We are actually good friends, but this was the first project we worked on together. So I am glad that it ended up very well. Then there were a lot of other people to deal with: the sound engineers from the MDR studio, technicians, the conductor David Timm, the orchestra and choir, and of course the people from the Mendelssohn-Haus museum itself.
Thanks, Martin! I was a Mendelssohn fan before ever even spotting a computer, so I have to say this tickles my interest in technology and Classical music alike. Time for a trip to Leipzig. Check out more:
The post How Gestures and Ableton Live Can Make Anyone a Conductor of Mendelssohn [Behind the Scenes] appeared first on Create Digital Music.
Forget fancy effects or sophisticated plug-ins – day-in, day-out, it’s those simple MIDI modules you wind up using again and again and again and again. It’s like having a bucket of paperclips on your desk. It doesn’t have to be exciting. It’s the simple stuff that gets used.
So, one of my favorite demos from the jam-packed sessions at MIDI Hack Day in Stockholm in May was unquestionably Midular. The idea was simple: make some basic modules that do stuff to notes and control events, then combine them in useful ways. It deserved an ovation.
And now, you can get those same modules for Max for Live, for free. They’re open source, properly under a GPL license (meaning, if you want to port them to Pure Data, you can, for instance). And they’re good enough that, with at least a couple of them, you’ll wonder why Ableton didn’t include them as default effects.
The starting lineup:
- LiveQuantizer. Well, duh. And as the creator notes, this means you can do to notes what Live does to clips.
- Repeater. Repeat incoming notes.
- Buffer. A round-robin note storage-and-playback sequencer – cool. And that naturally leads to -
- Rotator. 8-note rotating buffer plus an 8-step sequencer, based on the Roland System-100M modular sequencer. This is a no-brainer to add to that Roland SYSTEM-1 I’m dragging into the studio tonight, in fact, both in SYSTEM-1 and SH-101 modes – I’ll report back.
- SuperPitcher. Works the way you wish Pitch did in Ableton – but then also adds a step-based modulator, for other effects.
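As a thought experiment, the Rotator concept – a round-robin note buffer driven by a step clock – might look something like this in plain Python. This is a guess at the idea from the description above, not a port of the actual Max for Live device:

```python
# Toy sketch of a rotating note buffer: incoming notes fill a fixed-size
# buffer round-robin, and a step clock plays the buffer back one slot at
# a time. A guess at the Rotator concept, not the real device's logic.

class Rotator:
    def __init__(self, size=8):
        self.buffer = [None] * size
        self.write_pos = 0
        self.step = 0

    def note_in(self, note):
        """Store an incoming note round-robin, overwriting the oldest slot."""
        self.buffer[self.write_pos] = note
        self.write_pos = (self.write_pos + 1) % len(self.buffer)

    def tick(self):
        """Advance the step clock; return the note at the current step."""
        note = self.buffer[self.step]
        self.step = (self.step + 1) % len(self.buffer)
        return note
```

The charm of the design is that performance input and sequencing share one data structure: whatever you play becomes the pattern.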
It’d be great to see this collection grow over time, particularly with additions from others in the Max for Live community. You can start on that right away by forking it on GitHub – or just download and get to playing.
So, yes, fairly simple. It’s combining these (and, no doubt, communing them with other tools and toys from the Max for Live community) that gets more interesting. Some video examples:
A simple demonstration showing how some of the Midular MIDI effect modules can be used together, focusing on the 8 note step sequencer called Rotator. I’ve tried keeping the sounds and sequences as simple as possible so that it’s easy to get a feeling for what’s going on.
A simple demonstration of how some of the Midular MIDI effect modules can be used to generate various arpeggiated sequences from a single sustained note. The sound is purposefully kept as basic as possible so that it’s easier to hear what’s going on.
The project is the work of Knut Andreas Ruud. Brilliant stuff, Knut!
https://github.com/carrierdown/m4l-midular (look for the “download ZIP” link in the right-hand column if you haven’t used GitHub before!)
The post Midular are the Free MIDI Modules Every Ableton Live Setup Needs appeared first on Create Digital Music.