You might already have seen this on my socials: a nice photo of a new box stacked alongside my MIDI patchbay. Lately studio life has become more complicated. I have two mixing desks, one for working in the studio and one for practicing live gigs, and I found myself constantly plugging instruments in and out of both. Also, the studio desk, a Yamaha 01v, is getting old and some switches have noticeably started making noise. For me this was the sign to start sparing the desk and to consider a patch panel.
You can spend any amount on a good one, but for my modest home studio purposes I chose the Behringer Ultrapatch Pro PX3000. With 48 channels it is well beyond my need to patch 6 channels across 12 inputs. But hey, who knows what will happen in the future. And it doesn’t break the bank at around 80 euros.
Plugging the instruments across the inputs of two desks now won’t wear down the inputs on the more expensive mixing desks any more. There is even an option to use the patchbay in half-normal mode. In this mode I can make a setup that sends the instruments to both inputs at the same time. Then you have to factor in the impedance of both mixing desks against the line outs of the instruments, but by my calculations it might just work.
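Here is the back-of-the-envelope version of that calculation in Python. The impedance figures are typical rule-of-thumb assumptions, not values measured from my own gear.

```python
# Sanity check for feeding two mixer inputs from one instrument line out
# via a half-normalled patch point. All impedance values are assumed
# rule-of-thumb figures, not measurements from my actual equipment.

def parallel(r1, r2):
    """Combined impedance of two inputs wired in parallel, in ohms."""
    return (r1 * r2) / (r1 + r2)

line_out = 600      # assumed instrument line output impedance (ohms)
desk_a = 10_000     # assumed line input impedance of desk A (ohms)
desk_b = 10_000     # assumed line input impedance of desk B (ohms)

load = parallel(desk_a, desk_b)     # 5000.0 ohms
ratio = load / line_out             # roughly 8.3x

# A common rule of thumb wants the load to be about 10x the output
# impedance; at roughly 8x this is marginal, so "it might just work".
print(load, round(ratio, 1))
```

At around 8x instead of the usual 10x headroom, the result is marginal rather than clearly fine, which matches my "might just work" gut feeling.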
This may be something that I had overlooked for too long: Loopcloud. For years it was the talk of the sample library town, but I didn’t look at it until I got a demo of the new Loopmasters Loopcloud 5.0 version at the Amsterdam Dance Event this year. I had also looked at other sample managers like Algonaut Atlas, but that one seems to be drum-oriented only. Intriguing, because Atlas uses machine learning to recognize the types of samples. For me, up to now, a sample manager was simply a folder in Ableton Live to browse through. And I had always dismissed Loopcloud as simply a shop to buy samples with a subscription model.
How to work with the application
The Loopcloud application is a standalone application, but it integrates with your DAW through a Loopcloud plugin. You can only have it on one track in your DAW; all samples that you browse then play through that track. The idea is to start with a sample in the Loopcloud application. You can use random sorting to free your mind. Then edit, slice, dice, sequence, mash up and add effects if you wish, and drag the final result into your DAW as a sound file. Quite different from finding a sample and then editing it in the DAW. All in the tune and tempo of your DAW. It nicely prevents you from using the same preset sounds over and over. Clever!
It means, however, that you have to keep two applications open while working. For those of you with two monitors this is maybe a no-brainer. But then again, it could just be that you already have a nice workflow with your two monitors and now you need to fit in yet another application. Anyway, there is an option to dock the application to the side of a window at about 20% of the width. Combined with scaling and other options, you might manage with one screen. The application does sometimes forget how you docked and scaled it.
Your library manager
Now about the library management. The moment you add your own samples to the Loopcloud application, it starts scanning them all. It will try to find BPM and key information, and it will try to read other information from the name of the sample or the loop. It will probably not correctly discover more complex information like the genre, loop versus one-shot, or the exact instrument. Everything is then stored as tags, and you can start searching on things like key and BPM.
For this you need to click the button marked “Your Library”. If you also want the detailed information of your scanned samples to be correct, you will have to start tagging yourself. It’s quite advanced: you can tag whole folders and batches of files. For a more in-depth look at tagging and searching you should dive into the tutorials.
Once I had discovered Loopcloud as a sample manager, the tutorial also pointed me to Loopcloud Drum: a separate plugin that is actually a full sample-based drum instrument. It uses its own Loopcloud drum kit format and opens up a separate section in the Loopcloud manager. A strange find in a sample library manager. As a separate instrument it has its own format, and it is actually more of a pattern beatmaker with its own sequencer. It activates a preset list of drum kits, assembled from Loopcloud one-shot samples of course.
I didn’t find any option to change the patterns in the beatmaker, other than with a mouse. You would also expect an option to edit drum kits and build your own. You can edit the mix of the kit and save that as a “user” drum kit, but I didn’t see any way to create a drum kit from your own set of one shots. Maybe this is in a future version, or in a Loopcloud subscription model that I didn’t explore. I was kind of on the lookout for tools to start making beats, other than with loops or Nerve, but this is not it yet.
And there is even more: the tutorial also points to the Loopcloud Play plugin. Yet another sample instrument, but this time melodic. As an instrument it’s quite basic, maybe so basic that you fall back into the preset trap again. There are about 7 knobs to turn and that’s it. Like the Drum instrument it has its own place in the library, and again there is no way to choose the samples. You can save tweaks to the knobs as “User” instruments. I think it needs work, as this is no match for Native Instruments’ Kontakt.
Loopcloud has a quite intricate subscription model, and not all features are available in all tiers; this applies specifically to using multiple tracks and to sample editing. However, if you just want to use it as a sample library manager, you can even use the free tier. If you already own Loopmasters stuff, it will automatically appear in your library. Even though it could do with more advanced detection of the samples that you load into the library, for me this was a great find and it surely beats the user folders in Ableton Live.
This is a short story about something that you take for granted in this high-tech age: that you can connect anything to anything and that it just works. This time I tripped over something that did not work, and it reminded me harshly that there are classic electrical laws to take into account: impedance matching. Even more embarrassing is that I am actually an electrical engineer who switched to computer science and music.
So these days I am working on my stage monitoring. Of course it’s every performer’s dream to have wireless in-ear monitoring, but then you will find that you have to invest at least hundreds of euros, and you can easily go up to several thousand. This is why I started experimenting with a simple wired stereo in-ear monitoring system. The Zoom L-12 mixer/recorder that I am using has 4 mix outputs for monitoring, so that is the starting point.
Let’s try to set the impedance story straight without getting too technical; for the full story you can go to the Wikipedia page on the subject. In short it’s about getting the energy from the output (a mixer) optimally into the input (headset, amplifier) of the connected device. Otherwise it’s kind of like fitting a wide garden hose to something that is too small. The electrical equivalent: the output impedance should be lower than the input impedance. As a rule of thumb you can expect for outputs:
100 ohm to 600 ohm output impedance from line outputs
0.1 ohm or less from an amplifier
And for input impedance:
10K ohm input impedance or more for line inputs
An average of 32 ohms for headphones, but it can range from 8-600 ohm
Around 8 ohms for speakers
This only applies to unbalanced outputs and inputs. So that means jack plugs and speaker connections. The transformers used in balanced outputs and inputs will usually match without you having to worry about it.
Enough theory. It is always a good idea to start with the ‘zero’ option: connect a simple Shure SE215 earphone to the L-12 monitoring output. It says ‘Phones’. Easy peasy. The sound comes out, but the lows are kind of missing. I just skipped over this, because I thought that this was simply the quality of the output of the L-12. Looking back, this was not surprising. If you check the SE215 spec sheet you will find that with an average input impedance of 17 ohms, this earphone is quite hard to drive!
A lot of energy is therefore lost, because the output impedance of the L-12 turns out to be 100 ohms. This output qualifies as a line output driver, expecting a high-impedance amplifier to pick up the signal. Actually connecting earphones to this connector is a bad idea! Listening with a directly connected Sennheiser HD 280 Pro, however, is a more pleasant experience. This is easily explained by its friendlier 64 ohm impedance. The energy transfer is still not very efficient (the level is almost halved), but it is much better than with the Shure!
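For the technically curious, the voltage-divider math behind these two observations is simple enough to sketch in a few lines of Python, using the impedance figures mentioned above.

```python
import math

def level_drop_db(source_z, load_z):
    """Level lost across the output impedance when a load is plugged
    straight into the output, as a voltage-divider ratio in dB."""
    return 20 * math.log10(load_z / (source_z + load_z))

l12_out = 100   # L-12 monitoring output impedance (ohms)
se215 = 17      # Shure SE215 average impedance (ohms)
hd280 = 64      # Sennheiser HD 280 Pro impedance (ohms)

print(round(level_drop_db(l12_out, se215), 1))  # -16.8 dB: a lot of signal lost
print(round(level_drop_db(l12_out, hd280), 1))  # -8.2 dB: a lot friendlier
```

On top of the raw level loss, an earphone’s impedance varies with frequency, so driving it from a high 100 ohm source also skews the frequency response, which would explain the missing lows on the SE215.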
So I first looked at the Behringer P2, a small active monitoring amplifier. It runs on two AAA batteries, and you can connect XLR or a stereo jack plug. Since the L-12 has stereo jack monitoring outputs, this seemed to be the way. But when I connected it all up with the SE215, the result was very disappointing: like listening to overly compressed, pumping audio, with completely random frequency dips and a lot of noise. Another impedance mismatch?
I immediately blamed the Behringer P2. But when you scout for reviews, this device invariably comes out as top rated, with a lot of very happy users. How is this possible? I still don’t know. Particularly vexing is that the input impedance of the P2 is not specified anywhere. It must be the culprit, though, because when I connect the balanced input to a balanced output, it all sounds fine. Possibly no one uses the unbalanced jack of the P2.
This is why I have fallen back to using the Thomann mini body pack 2. It allows me to use long cables and gives me volume control on the belt-mounted device. The sound isn’t perfect, because the 100 ohm output still has to drive the SE215. I am still looking for that perfect wired monitoring solution. Any ideas?
For some time now I have been looking for a way to add video to my Ableton Live performance. In this article I am experimenting with VideoRemix Pro from MixVibes. There seem to be many people on a similar quest, and equally many solutions. Most solutions (Resolume, Modul8) revolve around Apple macOS. Since I am not in the Apple ecosystem, these are not available to me. Some quite elaborate solutions use many components that are all glued together, sometimes with MIDI, sometimes with plugins.
As a first attempt I am looking for a simple, single piece of software that can run inside Ableton Live on a PC. Enter VideoRemix Pro. You need the Pro version to run it inside Ableton Live as a plugin. When you look at the instruction video, you can see that it runs in Session mode, which is how I use Ableton Live live. Looking at this it seems simple enough, but there is a learning curve.
This learning curve is not helped by obvious glitches and problems when using the software. I had quite a battle installing it and getting it to run as a plugin inside Live. The first problem was Live crashing when dropping the plugin on a MIDI track. Which is how you are supposed to use it. My first reaction was to ask for a refund, but after a reboot and some experimenting I got it to work. The secret for me was to make sure that VideoRemix does not use the Windows Default audio. Once I switched to the ASIO audio option that Live also uses, the plugin stopped crashing.
VideoRemix Pro runs in desktop mode as well as plugin mode, but not at the same time. The desktop mode seems solid enough, but even there I have run into glitches. This had to do mostly with customizing the Novation LaunchPad Mini that I wanted to use to control the video. The LaunchPad Mini had been just lying around as a backup for the Ableton Push that I mainly use. It is however not supported by default. The makers of the software prefer you using the full Launchpad Mk2, which has more control options of course.
This means that in order to use it, you have to define a custom control mapping for the software. This seems easy enough, since the software has a MIDI learn mode, although it took some learning for me to use it. In short: hover over the element in VideoRemix you want to control, then click or turn the MIDI knob to link it. Press it again to see if the mapping worked. After this you will see a custom mapping in the list of MIDI devices in the preferences, which you can then rename.
Then, moving over to Ableton Live and running it as a plugin (remember: not at the same time), you will find this same list. Confusingly enough there is a VST MIDI device there, but in my case that did not respond to any attempt to control the video. If you switch over to the custom mapping that you created in desktop mode, things start moving. Now you can record your video sequence.
Creating or recording a video sequence is based on the 6×6 grid of buttons in VideoRemix. This means that you are limited to 36 clips that you can launch, and one clip can run for at most 100 seconds. Hit a clip to start it; hit it again to stop it. By default, clip launching is column oriented: only one clip can run per column, and starting a clip in a column stops whichever clip in that column was already playing. You can start an entire row with a single command. You can also start an entire column, but only if you enable all clips in the grid to play simultaneously, of course.
If you want a more complex mix with more than a few clips per song and more than a dozen songs, you’re probably out of luck with 36 slots. It seems you have to simplify your VJ mix if you are using this software standalone. For now it will have to do for me.
The effects (FX) section is quite elaborate. You can control it, as well as all the faders, through MIDI. The moment you hit full screen on the top right, you will see your VJ mix full screen. Hopefully on the right video output, but I still have to look into that. The default set of clips also loops sound, and this sound can be mixed, so you can have sound effects playing as part of your performance.
This is my first attempt at working with video as part of a Live-based performance. After quite a battle to get it working, it now actually seems possible to have a video running as part of a Session mode sequence, as if there is a real VJ at work. I am still quite worried about the overall stability of the setup, and I need to get to grips with the quirks of the software.
If you have experience with this or other software setups, please comment below!
After carrying around big and powerful laptops for years, and tablets that were simply not powerful enough, I wanted to try a laptop in the ultrabook category. At that time the choice for light and powerful, with compromises in graphics performance, was the Lenovo Thinkpad T480s. These days you can buy ultrathin notebooks with additional graphics power, but that was then and this is now. Unfortunately, similarly spec’ed MacBooks are out of my league.
One thing that I specifically checked when selecting this laptop was support for Thunderbolt 3, because this connection supports external graphics cards. On paper the onboard graphics should be good enough for minimal VR support, but I had already guessed that this would not run smoothly. Now, after one year of use, I finally got round to trying out the Akitio Node external housing with Thunderbolt support. Lo and behold: equipped with an ASUS GeForce GTX 1060 OC3, my laptop has now become a graphics powerhouse that runs VR, games and any other task smooth as butter!
If you check the compatibility lists of Akitio, you will not find this graphics card. The list is very limited and only contains higher-end cards. There is a small notice that it should work with any card, but there is no guarantee. My idea was to try the slightly lower end, because maxing out anything like this is bound to cause problems somewhere in the chain. The laptop is also a year old, so I reckoned that mature drivers go a long way. By the way, MacBooks also support Thunderbolt and external graphics cards. They don’t support the specific card I chose, though, because on a Mac I believe you need AMD Radeon graphics cards.
There was a struggle to get this working. When you go the Nvidia way with your graphics card, the driver installation can moan about the hardware not qualifying for installation. I got round this by checking forums, which explained that you should install the driver manually. Unpacking the software, finding the driver files and installing them by hand, right-clicking on the .inf files, did the trick.
The Akitio Node is large and slightly noisy, but you can switch it off for music work, of course. Then, when you want to do graphics-intensive work, switch the Node on and voilà: magic in a box! One other downside is that it does not support any function other than connecting the graphics card. There are other housings with storage and extra connections, like USB or network. There is also no daisy chaining of devices; the Node is a dead end.
I wanted to put this story out because this is a working combination, in case you too are looking for this upgrade. There are a lot of horror stories around of combinations refusing to work. I hope this upgrade trick will work for you too!
I am a big fan of custom made covers for all studio equipment. Dust kills the quality of connectors, sliders and switches. If possible I try to use the dust covers from the instrument or equipment manufacturer. Otherwise I try to look for a Decksaver, because actually these are very clever desk space savers as well. And they fit like a glove. All too often however I find that there is no custom cover that exactly fits.
I tried to find custom covers for the Yamaha 01v and the MicroKorg, but couldn’t find any. My current solution for this is to buy flexible transparent foil and have it cut to a little more than the surface area of the device. It can attract dust, but at least the dust does not get in the equipment. It also looks quite professional and is easy to pull over the surface and slip away again.
For me this beats ill-fitting cloth covers and other half-baked solutions, like putting the equipment in a box. The best alternative could be to buy thick sheets of perspex and glue together a custom cover; there are shops that support you in building your own. It will never be as sophisticated as a Decksaver cover, with extra space for knobs and bends in the device. But if you’re on a budget, at least put a sheet of transparent foil over your equipment and make it last longer.
Four years ago I already started using a 360 camera. At that time I wanted to create those video clips where you are really in the set, and I wanted viewers to experience the video. The video quality was an issue then, and for me it still is, unless you have a solid budget to spend. At the 3,000 euro price point video quality is no longer a big issue; at the lower end, however, things have improved only slightly. I have now invested in an Insta360 ONE X at a fraction of that price, 400 euros. What persuaded me to invest in this camera if the quality is only slightly better?
First off, it comes with software that allows you to take your full 360 degree recording and cut out a flat rectangle that looks like you recorded it with a normal camera. Where is the advantage in that? It is actually intended to let you use it as an action camera and then, in the video editing, cut out, pan and zoom into any action around you. You can see samples of this on the product page. What use is that to me as a musician, you might ask. Well, how about filming a whole gig from several points and cutting, panning and zooming into all the action on stage and in the crowd? The software also has some really captivating special effects, like speeding up, turning the 360 view into a ball, fish eye, etc.
Secondly, it has rock-solid stabilization, because it uses gyroscopes to record all movements. This also ensures that the recording is perfectly horizontal, even when recording at an angle. You will find that if there is too much movement in your recording, most viewers will become seasick really fast; a smooth and stable recording makes all the difference. I can now confidently record while walking. Also freaky: if you use a selfie stick to hold the camera, the software will remove the stick, so it appears as if the camera is hovering above you.
Thirdly, it actually matters that the quality of the recordings is at least slightly better than that of the first generation of 360 cameras. The performance in low light is dramatically better, and the 25% increase in pixels over cameras in the same price range does make a difference. Am I completely happy? No, of course not. But I can really and wholeheartedly recommend the ONE X at the lower price tier. It has made some impossible recordings possible, and I will keep using 360 as part of my video recording to capture the action and experiences.
So this is why you too (as a musician) should start using a 360 camera. Not because you want people to experience VR, but to capture everything and decide how to use the recording later. On stage and everywhere the action happens.
For a while now I have been starting up my live show. After five years of building a repertoire, I feel the next step is playing it live. I have been lucky to have had my “real pop star moment” with my previous band. A CD recording contract and live touring abroad. Now I am back to step one with my own music project.
Starting up, the most important thing for me is to record all practice sessions and all tryouts. For this purpose I have invested in multitrack recording gear. It might just be that there is a gem in these recordings that needs exploring and investing in. This is what I learned in my previous band. Recording, recording, recording… Learning, exploring, improving…
Multitrack live recording is easier than ever. It used to be Tascam-only with analog 4-track tape recording, but now it’s digital 8, 12 or 16 track recording with computers, or with Zoom or more exotic brands. CD quality, or even studio quality. For now I focused on Zoom, because they make really affordable devices. I am not scared of using computers, but for me it now needs to be one single reliable device, not another chain of devices with a computer at the end.
So I tried the Zoom R16 first. This is a true 16 track recorder. It has the shape of a mixer, but it is actually only a multitrack recorder. It can record 8 channels at once, but SD cards are limited to 32GB. My problem with it was the sound quality as a mixer, which makes it difficult to check that the recordings are OK. It also tempts you to use it as a live mixer, but it does not have an adequate send/return/monitoring chain at all.
Enter the Zoom LiveTrak L-12. The sound quality of the mixer is immediately a lot better. It can record 12 channels at once, accepts larger SD cards and records at higher bit rates than CD quality. Unfortunately, the send/return and single effect chain is still a bit meagre. You do have a compressor per channel, but when you use it, the recording is compressed as well, which might not be what you want. The monitoring chain is a different story: it’s amazing. Four monitoring channels, or even a fifth if you really need it and separate it from the master mix.
All in all, this cannot be your live mixer for all purposes. Just because of the limitations of the send/return and single effect and the compression with the penalty of also recording it. However it is probably exactly the mixer that you’ll find in any commercial practice room. So just replace it with this one and you could have a multitrack recording of all your practice sessions. Awesome! Now if you hit a gem, you can mix it down to a demo later.
Can it be your mixer for live venues? Absolutely! Connect some active speakers and you’re live. Unless you need more send/returns and effects live of course, then you need to bring a real live mixer. The challenge will then be to connect separate tracks of that mixer to the multitrack recorder. Hopefully, that live mixer has at least enough monitor channels or busses. Otherwise you’re stuck with a recording that does not give you enough options to remix the live recording.
Now in practice, how does it work when using the LiveTrak as a multitrack recording mixer? First off, as a mixer it will remember all your mix and recording settings as part of something that Zoom calls a Project. It saves them all to the SD card when you switch off; you do need to make sure that you switch the device off and on properly, not just pull the power plug. By switching projects you can keep different mix and recording settings per project.
Like an advanced digital mixer, all fader settings are saved. But because it does not have motorized faders, an LED shows the stored fader position. When you move the fader back to that point, you can change the value and save it again. This applies to all mixer settings in general. On top of this, you can save 10 different scenes per Project.
This is nice, but you cannot simply clone a project from the menu. There is a trick, however: if you switch to USB host mode, you can save and restore projects on a USB stick. The trick is to save an existing Project and restore it under a new name. This way you can start recording to a new Project with the settings from an existing one.
So there you have it. This is how I use this now and I know what it can do for me. I think it is great as a practice room mixer and for small venues. Please check the Zoom site or review sites to read more about all other modes and features of the LiveTrak. I don’t use any of the other modes, so I have no experience with any of the other features. It might work for your specific purposes as well.
Recently I had to revise some cabling and routing under the mixing desk, when I came across a rack device: a MIDI patch bay, the A-880. It was happily blinking and had silently done its useful job there for at least 10 years. After looking it up, it turned out to be a product from Roland that is actually more than 30 years old! You can also see the dust on the cables in my setup.
Then the question is: do you need a MIDI patch bay? The answer is twofold. MIDI itself is an ancient protocol. If you have MIDI devices and a computer hooked up via MIDI, I will say that you cannot do without a MIDI patch bay. However, MIDI is showing its age and probably some of you are using USB instead. Also, new MIDI standards are now seriously being discussed. Possibly resulting in something altogether new that may not be supported by the A-880.
The current standard MIDI protocol is ancient, and when you look at it technically it is also slow and limited. Of course it is fast enough to connect a keyboard to an instrument or a computer. Most devices allow daisy-chaining to connect whatever chain of computer, keyboard and MIDI instruments you have. However, that is where you will find that MIDI has its limitations. If you daisy-chain more than three devices you will likely hit one of them: bandwidth. When too much information passes through a single chain, you will get traffic jams and you might start hearing hiccups.
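To put a number on that bottleneck, here is a quick calculation based on the classic DIN MIDI figures (31250 baud, with a start and a stop bit around every 8 data bits on the wire):

```python
# Back-of-the-envelope DIN MIDI bandwidth. These figures come from the
# original MIDI 1.0 hardware spec: 31250 baud, 8 data bits plus a start
# and a stop bit per byte on the wire.
BAUD = 31250
BITS_PER_BYTE = 10
NOTE_ON_BYTES = 3            # status + note number + velocity

bytes_per_sec = BAUD / BITS_PER_BYTE          # 3125 bytes per second
msgs_per_sec = bytes_per_sec / NOTE_ON_BYTES  # ~1042 note messages per second
ms_per_msg = 1000 / msgs_per_sec              # ~0.96 ms per message

print(round(msgs_per_sec), round(ms_per_msg, 2))  # 1042 0.96
```

Roughly a thousand note messages per second sounds like a lot, until you realize that clock, controller sweeps, pitch bend and aftertouch from every device in a daisy chain all share that same single wire. That is exactly why routing devices in parallel through a patch bay helps.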
This is where a MIDI patch bay kicks in. Instead of daisy-chaining you can now connect devices in parallel. The A-880 connects 8 inputs to 8 outputs. Each of the individual connections to a midi device from the patch bay can now pass the maximum amount of data without traffic jams. Also with some simple button presses you can determine which input gets sent to which output. Allowing you to have more keyboards and route inputs from there to more devices. The forever friendly blinking lights show you which inputs go to which outputs.
Inputs 1 and 2 are special: the A-880 can merge these inputs and send them to multiple outputs. I use the patch bay in its most simple and useful form. The input from my main master keyboard is merged with the input from the computer and then sent out through all remaining outputs at once. This is the blinking pattern that has been the core of my setup for more than a decade. Only occasionally do I push the Signal button; then the blinking lights show which devices are actually sending data.
It may be that the future of MIDI does not include the A-880. This will be the moment when I will switch off this blinking, silently working work horse. And I will remove it from its hidden place under the mixing desk.
After a month of working on singing and performing, everything but working in the studio, I wanted to get up and running again with making music. As always, I started with updating the studio software. When updating the Native Instruments (NI) suite I am using, the A49 was part of the updates. When playing around in Ableton Live after that, it soon became obvious that things did not work quite right. So it was time to reserve some hours for diving into this.
The NI Native Access manager was updated, and the first step is then of course to check all the software installations inside it. It soon turned out that the VST installation path of Komplete Kontrol was no longer correct. NI likes to think that it is the only source of plugins on your computer, so I needed to tell it that the VSTs are located elsewhere. The Komplete Kontrol installation was then fixed by reinstalling. Nice.
After checking that the version of Komplete Kontrol inside Live and the standalone Komplete Kontrol application were matching, things started working again. A plugin rescan was needed to pick up all NI instruments in both versions, as a lot of instrument listings were apparently out of sync. A quick scan of the MIDI integration settings revealed that the integration was still correct.
I use the Komplete Kontrol Rack VST in Ableton Live, but when you update your NI software, this rack is not automatically updated in the Ableton Host Integration. Time to copy the rack files all over again from C:\Program Files\Common Files\Native Instruments\Host Integration\Ableton Live to D:\Documents\Ableton\Library\Presets\Instruments\Instrument Rack, or the equivalent on a Mac.
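That copy step can be scripted so it becomes a single command after every NI update. This is just a sketch of my own situation, not an official NI procedure: the paths are the Windows ones from above, and you should adjust them (and possibly filter on specific file types) for your machine.

```python
# Re-copy the Komplete Kontrol rack presets into the Ableton user
# library after an NI update. Paths are from my Windows setup; adjust
# them for your own machine (or the macOS equivalents).
import shutil
from pathlib import Path

SRC = Path(r"C:\Program Files\Common Files\Native Instruments"
           r"\Host Integration\Ableton Live")
DST = Path(r"D:\Documents\Ableton\Library\Presets\Instruments"
           r"\Instrument Rack")

def copy_racks(src: Path = SRC, dst: Path = DST) -> int:
    """Copy every file from src into dst, returning how many were copied."""
    dst.mkdir(parents=True, exist_ok=True)
    copied = 0
    for f in src.iterdir():
        if f.is_file():
            shutil.copy2(f, dst / f.name)  # copy2 keeps the timestamps
            copied += 1
    return copied

if SRC.exists():
    print(copy_racks(), "files copied")
```

Run it once after each update and the rack in your Ableton library stays in sync with the Host Integration folder.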
This Komplete Kontrol instrument rack can host any plugin instrument and map the A49 knobs, via macros, to controls in the instrument. Please note: only use this for instruments other than NI instruments! You must manually map every macro to a control inside the instrument. Not very pretty, but once you’ve set it up, it works.
And what if you do want to use an NI instrument? I also found out that instead of adding Kontakt to a track to start working with an NI instrument, as I always did, it’s better to use the Komplete Kontrol plugin. This immediately gives you full control with the A49 and allows you to quickly switch instruments on the fly. Oh well. Never too old to learn.