Streaming live with OBS Studio

Okay, like everybody else I started streaming too. I had a live show planned, but live shows will not be possible for at least another half year. Every evening my social timelines start buzzing with live streams, and all the big artists have started streaming live as well. No place for me, with my newly created and sometimes shaky solo live performance, to make a stand? After some discussions with friends I decided to make the jump.

But how to go about it? If you already have experience with live streaming, you can skip this entire article; it is here just for the record, so to say. After some looking around I came to this setup:

OBS Studio with ASIO plugin
Restream.io for casting to multiple streaming platforms
Logitech C920 webcam
Ring light
Ayra ComPar 2 stage light (see this article)

OBS is surprisingly simple to set up, though it has its quirks. Sometimes it does not recognize the camera, but some fiddling with the settings does the trick. You define a scene by adding video and audio sources. Every time you switch from scene to scene it adds a nice crossfade to make the transition smooth. You can switch the crossfade feature off, of course.

OBS Main scene setup

I only use one scene. The video clip is there to promote a YouTube video: it plays in a corner and disappears when it has played out. The logo is just “b2fab” somewhere in a corner. The HD cam is the C920, and the ASIO source is routed from my live mixer to the audio interface on the PC. On the ASIO audio I set up a limiter at -6 dB as a filter, to make sure I don’t get distortion on audio peaks.
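As a quick sanity check on that limiter ceiling: -6 dB corresponds to roughly half of full-scale amplitude, since the amplitude ratio is 10^(dB/20). A few lines of Python illustrate the conversion:

```python
# Convert a level in dB to a linear amplitude ratio: ratio = 10 ** (dB / 20).
def db_to_amplitude(db):
    return 10 ** (db / 20)

# A -6 dB ceiling leaves roughly half the full-scale amplitude as headroom.
print(round(db_to_amplitude(-6.0), 3))  # -> 0.501
```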

I also had to choose my platform. From the start I also wanted to stream live on Facebook and Instagram. Instagram, however, pretty much limits live streaming to phones. There is software to stream from a PC, but then you have to set it up again for every session and you need to switch off two-factor authentication. For me, one bridge too far for now.

I chose Restream.io as a single platform to stream to from OBS. It lets you stream to multiple platforms and bundles the chats from all those platforms into a single session. For Facebook pages, however, you need a paid subscription tier. For now I selected the free options: YouTube, Twitch and Periscope. YouTube because it is easy to access for my older audience, Twitch because it seemed quite fun and I also like gaming, and Periscope because it connects to Twitter.

If the live show takes shape I might step into streaming from my Facebook page. Another plan is to try the iRig Stream solution and start making separate live streams on Instagram, with high-quality audio from the live mixer. I will surely blog about it if I start working with it.

For now it all works. Restream.io allows me to drop a widget on my site. It’s a bit basic and only comes alive when I am live, so I have to add relevant information to it to make it interesting. If you want to drop in and join my live musings, check my YouTube, Twitch and Periscope channels or my site at around 21:00 CEST.

Controlling a light show for a small solo set

I’m back on the track of my own small solo live set. The first experiment was a video stream that would run along with the show. But now there is a new twist: the coronavirus came and there will be no live sets in the coming months. All public shows have been cancelled for about half a year, and my first live show has been pushed from June to November. The only alternative is live streaming.

Just before the lockdown to combat the spread of the coronavirus I had bought a stage light. Just one, to at least have a blue wash on stage to set a kind of moonlight mood. This was the Ayra ComPar 2: a simple LED stage light with an IR remote and plenty of flexibility to be more than just a blue stage wash.

But while staying at home and browsing through some online articles it dawned on me: you can simply control stage lights as part of your Ableton Live set. I use Ableton Live sets to run my stage show and, believe it or not, I use color coding for each song to quickly browse through all the songs without having to look up their names.

The colors match the moods of the songs, so my simple idea was to use this color code to match the color of the wash on stage. A red wash for a deeply felt love song, a green wash for a song about nature, a purple wash for an up-tempo hot song, and so on.

But why put all this effort into a stage light when there will not be a stage to play on for months? Up to then I had been a bit wary of immediately jumping to live streaming instead of playing gigs. All the bigger artists now stream live; every night on my socials there are at least a dozen artists performing live. I’m just starting out, so what can I bring to the table?

After discussing this with a close group of musicians and my music coach it became obvious: why not start streaming live? It’ll be fun, even if nobody watches it. I can invite friends and just have fun together. And because I had nothing else to do, I jumped in to make this stage light idea work. It would change color with the song. Not on stage, but in the attic; the attic with my home studio as my online stage.

One of the intriguing functions of the ComPar 2 is the ability to connect an XLR cable carrying a DMX signal to control it. After diving into it (and in lockdown there was a lot of time to dive into anything) I found out that there are also DMX light controllers that support MIDI. From the same company I got the Ayra OSO 1612 DMX Scanmaster controller. Very friendly priced, I think.

Blacked out by default

The DMX light controller simply accepts MIDI note data and maps it to programmable scenes. The controller can be connected to a chain of lights, and a scene can set each light accordingly. A scene can contain flashing lights, or movement for stage lights that can move. With 240 scenes you can probably make an interesting progression of lights for several songs, but I simply have a red, green, purple and blue scene and use one per song.
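The note-to-scene mapping boils down to very little data. As an illustration (the note numbers below are made up; check the OSO 1612 manual for its real mapping), the message a song’s color code has to trigger looks like this:

```python
# Sketch: map a song's color code to the MIDI note that triggers the
# matching scene on the DMX controller. The note numbers are assumptions,
# not the OSO 1612's actual mapping.
SCENE_NOTES = {"red": 60, "green": 61, "purple": 62, "blue": 63}

def scene_message(color, channel=0, velocity=127):
    """Build a raw MIDI note-on message: status byte, note, velocity."""
    status = 0x90 | channel  # 0x90 = note-on, low nibble = channel
    return bytes([status, SCENE_NOTES[color], velocity])

print(scene_message("blue").hex())  # -> 903f7f
```

In Ableton Live this would simply be a one-note MIDI clip per song, sent on the track that feeds the controller.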

The controller I chose blacks out all lights by default when starting up, and that is not a bad thing at all. The only thing I must remember is to switch off the blackout when playing live. That is the only attention it needs, and from there everything now runs on rails. The live streaming shows allow me to test stuff out, but I’m pretty happy with this setup.

Discovering Loopcloud 5.0 as a sample library manager

This may be something that I had overlooked for too long: Loopcloud. For years the talk of the sample library town, but I didn’t look at it until I got a demo of the new Loopmasters Loopcloud 5.0 version at the Amsterdam Dance Event this year. I had also looked at other sample managers like Algonaut Atlas, but that one seems to be drums-only. Intriguing, because Atlas uses machine learning to recognize the types of samples. For me, up to now, a sample manager was simply a folder in Ableton Live to browse through. And I had always dismissed Loopcloud as simply a shop to buy samples with a subscription model.

How to work with the application

The Loopcloud application is a standalone application, but it integrates with your DAW through a Loopcloud plugin. You can only have it on one track in your DAW; all samples that you browse then play through that track. The idea is to start with a sample in the Loopcloud application. You can use random sorting to free your mind. Then edit, slice, dice, sequence, mash up and add effects if you wish, and drag the final result into your DAW as a sound file. Quite different from finding a sample and then editing it in the DAW. All in the key and tempo of your DAW. It nicely prevents you from using the same preset sounds over and over. Clever!

Loopcloud sample editor

It means, however, that you have to keep two applications open while working. For those of you with two monitors, maybe a no-brainer. But then again, it could just be that you already have a nice workflow on your two monitors and now need to fit in yet another application. Anyway, there is an option to dock the application to the side of a window at about 20% of the width. Combined with scaling and other options, you might manage with one screen. The application does sometimes forget how you docked and scaled it.

Your library manager

Now about the library management. The moment you add your own samples to the Loopcloud application it starts scanning all of them. It will try to find BPM and key information, and it will try to read other information from the name of the sample or loop. It will probably not correctly discover more complex information like the genre, loop versus one-shot, or the exact instrument. Everything it finds is stored as tags, and you can then start searching on things like key and BPM.
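Name-based detection is easy to picture: a scanner can pull BPM and key candidates out of a filename roughly like the sketch below. This is my own illustration of the idea, not Loopcloud’s actual algorithm.

```python
import re

# Matches key names like "Am", "F#", "Gb", "Cmaj" (illustrative, not exhaustive).
KEY_RE = re.compile(r"^[A-G][#b]?(m|min|maj)?$")

def tags_from_name(filename):
    """Guess BPM and key tags from a sample's filename."""
    tags = {}
    stem = filename.rsplit(".", 1)[0]          # drop the extension
    for token in re.split(r"[ _\-]+", stem):   # split on common separators
        if token.lower().endswith("bpm") and token[:-3].isdigit():
            tags["bpm"] = int(token[:-3])
        elif KEY_RE.match(token):
            tags["key"] = token
    return tags

print(tags_from_name("Deep_Bass_Am_120bpm.wav"))  # -> {'key': 'Am', 'bpm': 120}
```

Anything this kind of guessing gets wrong is exactly what the manual tagging described below is for.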

Loopcloud docked

For this you need to click the button marked “Your Library”. If you also want the detailed information of your scanned samples to be correct, you will have to start tagging yourself. It’s quite advanced: you can tag whole folders and batches of files. For a more in-depth look at tagging and searching you should dive into the tutorials.

Additional plugins!

Once I had discovered Loopcloud as a sample manager, the tutorial also pointed me to Loopcloud Drum: a separate plugin that is actually a full sample-based drum instrument. It uses its own Loopcloud drum kit format and opens up a separate section in the Loopcloud manager. A strange find in a sample library manager. As a separate instrument it has its own format, and it’s actually more of a pattern beatmaker with its own sequencer. It comes with a preset list of drum kits, assembled from Loopcloud one-shot samples of course.

Loopcloud Drum plugin

I didn’t find any option to change the patterns in the beatmaker other than with a mouse. You would also expect an option to edit drum kits and build your own. You can edit the mix of a kit and save that as a “user” drum kit, but I didn’t see any way to create a drum kit from your own set of one-shots. Maybe this will come in a future version, or in a Loopcloud subscription tier that I didn’t explore. I was kind of on the lookout for tools to start making beats other than with loops or Nerve, but this is not it yet.

Loopcloud Play plugin

And even more? The tutorial also points to the Loopcloud Play plugin. Yet another sample instrument, but this time melodic. As an instrument it’s quite basic, maybe so basic that you fall back into the preset trap again. There are about seven knobs to turn and that’s it. Like the Drum instrument it has its own place in the library, and again there is no way to choose the samples. You can save knob tweaks as “User” instruments. I think it needs work, as this is no match for Native Instruments’ Kontakt.

Closing out

Loopcloud has a quite intricate subscription model and not all of the features are available in all tiers; this applies specifically to using multiple tracks and the sample editing. However, if you just want to use it as a sample library manager, even the free subscription tier will do. If you already own Loopmasters content, it will automatically appear in your library. Even though it could do with more advanced detection of the samples that you load into the library, for me this was a great find, and it surely beats the user folders in Ableton Live.

A first attempt at an automatic VJ mix on stage

For some time now I have been looking for a way to add video to my Ableton Live performance. In this article I am experimenting with VideoRemix Pro from MixVibes. There are many people with a similar quest, so it seems, and equally many solutions. Most solutions (Resolume, Modul8) revolve around Apple macOS. Since I am not in the Apple ecosystem, these are not available to me. Some quite elaborate solutions glue many components together, sometimes with MIDI, sometimes with plugins.

As a first attempt I am looking for a simple, single piece of software that can run inside Ableton Live on a PC. Enter VideoRemix Pro. You need the Pro version to run it inside Ableton Live as a plugin. When you look at the instruction video, you can see that it runs in Session mode, which is how I use Ableton Live live. Looking at this it seems simple enough, but there is a learning curve.

This learning curve is not helped by obvious glitches and problems in the software. I had quite a battle installing it and getting it to run as a plugin inside Live. The first problem was Live crashing when dropping the plugin on a MIDI track, which is how you are supposed to use it. My first reaction was to ask for a refund, but after a reboot and some experimenting I got it to work. The secret for me was to make sure that VideoRemix does not use the Windows default audio. Once I switched to the same ASIO audio option that Live uses, the plugin stopped crashing.

VideoRemix Pro runs in desktop mode as well as plugin mode, but not both at the same time. The desktop mode seems solid enough, but even there I have run into glitches. These had mostly to do with customizing the Novation Launchpad Mini that I wanted to use to control the video. The Launchpad Mini had just been lying around as a backup for the Ableton Push that I mainly use. It is, however, not supported by default: the makers of the software prefer you to use the full Launchpad Mk2, which has more control options of course.

This means that in order to use it, you have to define a custom control mapping for the software. That seems easy enough, since the software has a MIDI learn mode, though it took me some learning to use it. In short: hover over the element in VideoRemix you want to control, then click or turn the MIDI knob to link it. Press it again to see if the mapping worked. After this you will see a custom mapping in the list of MIDI devices in the preferences, which you can then rename.

A new custom MIDI mapping in the VideoRemix Midi Preferences

Moving over to Ableton Live and running it as a plugin (remember: not at the same time), you will find this same list. Confusingly enough, there is a VST MIDI device there, but in my case it did not respond to any attempt to control the video. If you switch over to the custom mapping that you created in desktop mode, things start moving. Now you can record your video sequence.

Creating or recording a video sequence is based on the 6×6 grid of buttons in VideoRemix, which means you are limited to 36 clips that you can launch. One clip can run for 100 seconds. Hit a clip to start it; hit it again to stop it. By default, clip launching is column oriented: you cannot have more than one clip running in the same column, and starting a clip stops any clip already running in that column. You can start an entire row with a single command. You can also start an entire column, but only if you enable all clips in the grid playing together, of course.
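That launch logic is simple enough to model in a few lines. This sketch is my own illustration of the column-exclusive behavior, not VideoRemix code:

```python
class ClipGrid:
    """Toy model of a 6x6 clip grid where each column plays at most one clip."""

    def __init__(self, rows=6, cols=6):
        self.rows, self.cols = rows, cols
        self.playing = {}  # column -> row of the clip currently running

    def hit(self, row, col):
        """Hit a clip: starts it, or stops it if it is already running.
        Starting a clip replaces whatever else runs in that column."""
        if self.playing.get(col) == row:
            del self.playing[col]
        else:
            self.playing[col] = row

    def launch_row(self, row):
        """Start an entire row with a single command."""
        for col in range(self.cols):
            self.playing[col] = row

grid = ClipGrid()
grid.hit(0, 2)       # start the clip at row 0, column 2
grid.hit(4, 2)       # row 4 takes over column 2
grid.hit(4, 2)       # hitting a running clip stops it
print(grid.playing)  # -> {}
```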

If you want a more complex mix with more than a few clips per song and more than a dozen songs, you’re probably out of luck with 36 slots. It seems you have to simplify your VJ mix if you are using this software standalone. For now it will have to do for me.

The VideoRemix Clip Grid

The effects (FX) section is quite elaborate. You can control it, as well as all the faders, through MIDI. The moment you hit full screen on the top right, you will see your VJ mix full screen. Hopefully on the right video output, but I still have to look into that. The default set of clips also loops sound, and this sound can be mixed in, so you can have sound effects playing as part of your performance.

This is my first attempt at working with video as part of a Live-based performance. After quite a battle to get it working, it now actually seems possible to have video running as part of a Session mode sequence, as if there is a real VJ at work. I am still quite worried about the overall stability of the setup, and I need to get to grips with the quirks of the software.

If you have experience with this or other software setups, please comment below!

Why you should start using a 360 camera

I started using a 360 camera four years ago. At that time I wanted to create those video clips where you are really in the set, and I wanted viewers to experience the video. The video quality was an issue then, and for me it still is, unless you have a solid budget to spend. At the 3,000 euro price point video quality is no longer a big issue; at the lower end, however, things have only improved slightly. I have now invested in an Insta360 ONE X at a fraction of that price: 400 euro. What persuaded me to invest in this camera if the quality is only slightly better?

First off, it comes with software that allows you to take your full 360 degree recording and cut out a flat rectangle that looks like it was recorded with a normal camera. Where is the advantage in that? It is actually intended to let you use it as an action camera and then, in the video editing, cut out, pan and zoom into any action around you. You can see samples of this on the product page. What use is that to me as a musician, you might ask. Well, how about filming a whole gig from several points and cutting, panning and zooming into all the action on stage and in the crowd? The software also has some really captivating special effects like speeding up, turning the 360 view into a ball, fish eye, etc.

Secondly, it has rock-solid stabilization, because it uses gyroscopes to record all movements. This also ensures that the recording is perfectly horizontal, even when recording at an angle. You will find that if there is too much movement in your recording, most viewers become seasick really fast; a smooth and stable recording makes the difference. I can now confidently record while walking. Also freaky: if you use a selfie stick to hold the camera, the software will remove the stick, so it appears as if the camera is hovering above you.

Schemerstad

Thirdly, it actually matters that the quality of the recordings is at least slightly better than that of the first generation of 360 cameras. The performance in low light is dramatically better, and the 25% increase in pixels over cameras in the same price range does make a difference. Am I completely happy? No, of course not. But I can really and wholeheartedly recommend the ONE X at the lower price tier. It has made some impossible recordings possible, and I will keep using 360 as part of my video recording to capture the action and experiences.

So this is why you too (as a musician) should start using a 360 camera. Not because you want people to experience VR, but to capture everything and decide how to use the recording later. On stage and everywhere the action happens.

Komplete Kontrol A49, you’re not using it right

After a month of working on singing and performing, everything but working in the studio, I wanted to get up and running again with making music. As always, I started by updating the studio software. When updating the Native Instruments (NI) suite I am using, the A49 was part of the updates. When playing around in Ableton Live after that, it soon became obvious that things did not work quite right. So it was time to reserve some hours to dive into this.

The NI Native Access manager was updated, and the first step is then of course to check all the software installations inside it. It soon turned out that the VST installation path of Komplete Kontrol was no longer correct. NI likes to think that it is the only source of plugins on your computer, so I needed to tell it that VSTs are located elsewhere. The Komplete Kontrol installation was then fixed by reinstalling. Nice.

After checking that the version of Komplete Kontrol inside Live and the standalone Komplete Kontrol application were matching, things started working again. A plugin rescan was needed to pick up all NI instruments in both versions, as a lot of instrument settings apparently did not match up. A quick check of the MIDI integration settings revealed that the integration was still correct.

I use the Komplete Kontrol Rack VST in Ableton Live, but when you update your NI software this is not automatically updated in the Ableton Host Integration. Time to copy the VST files all over again from C:\Program Files\Common Files\Native Instruments\Host Integration\Ableton Live to D:\Documents\Ableton\Library\Presets\Instruments\Instrument Rack, or some equivalent on a Mac.

This Komplete Kontrol instrument rack can host any plug-in instrument and map the A49 knobs via macros to controls in the instrument. Please note: only use this for instruments other than NI instruments! You must manually map each macro to a control inside the instrument. Not very pretty, but once you’ve set it up it works.

And what if you do want to use an NI instrument? I found out that instead of adding Kontakt to a track to start working with an NI instrument, as I always did, it’s better to use the Komplete Kontrol plugin. This immediately gives you full control with the A49 and allows you to quickly switch instruments on the fly. Oh well. Never too old to learn.

Versioning Ableton Live projects with large files

This is a follow-up to one of the first posts here: keep-track-of-versions-of-your-song-with-Ableton. At first this was a bit tricky, because you could either choose to leave out large files, like .wav recordings, samples and even the .als project files, or defy a warning from Git stating that it doesn’t handle large files well performance-wise. This will hit you when you push and check out your repository remotely. Now you can start using the new Large File Storage (LFS) feature, which stores versions of the large files as markers in the Git repository, improving the speed at which Git can handle these large files when getting the latest version remotely. Please note that these versioning tools might work for your DAW too.

But why Git versioning?

Let’s go back to the beginning. Why should you consider using Git for versioning your Ableton Live projects? Version 10 of Ableton Live keeps backups of your project files, and if something goes wrong you can go back 20 or more versions. The problem is: which version, at which time and date, contains which changes? There is no way to tell. With Git versioning you can attach a message to each set of changes (a commit), and you can decide which part of which commit you want to keep. The thing that holds most people back from using Git is its complexity.

Git is even more powerful in combination with a shared remote repository like GitHub or Bitbucket. This allows you to work together remotely on a shared project with other musicians, while at the same time giving you the liberty to work standalone. Contact me if you want to hear more about this. Please note that some remote repositories are not free if you want to store private content and collaborate; otherwise everything you put on them is public. GitHub now allows free private repositories.

Collaborative repository on GitHub

With its power comes a set of command line instructions that scares the shit out of any musician. For daily use I turn to SourceTree for a more graphical and pleasant Git experience. SourceTree is free and hides the most complex command line instructions behind a more usable interface. There will be a time, however, when you really have to dive into the nitty-gritty, and this post will also dive deep. Fortunately the latest version of SourceTree also understands the new LFS features.

Large File Storage

The first step is to install Git LFS on top of Git. By the way, SourceTree has embedded versions of Git and Git LFS that you can install alongside it; I have no idea how capable these embedded versions are compared to the standalone versions. Here are the steps you need to take to activate the Large File Storage feature. Open a command line in the project folder where you created your Git repository and type the following commands:

b2fab@STUDIO MINGW64 /d/Documents/Ableton/Goodbye Project (master)
$ git lfs install
Updated git hooks.
Git LFS initialized.

b2fab@STUDIO MINGW64 /d/Documents/Ableton/Goodbye Project (master)
$ git lfs track *.wav
Tracking "B2FAB - Goodbye ft Hanny (Mastered).wav"
Tracking "Goodbye (instrumental).wav"
Tracking "Goodbye ft Hanny (unmastered).wav"
Tracking "Goodbye ft Hanny.wav"
Tracking "Goodbye Hanny (FY).wav"
Tracking "Goodbye.concept.wav"

b2fab@STUDIO MINGW64 /d/Documents/Ableton/Goodbye Project (master)
$ git lfs track *.als
Tracking "Goodbye Hanny (Exp).als"
Tracking "Goodbye Hanny (FY).als"
Tracking "Goodbye Hanny (FYCD).als"
Tracking "Goodbye Hanny Beat.als"
Tracking "Goodbye.concept.als"
Tracking "Goodbye.Hanny.als"
Tracking "Goodbye.instrumental.als"
b2fab@STUDIO MINGW64 /d/Documents/Ableton/Goodbye Project (master)
$

As you can see, the install statement just prepares the repository. The track statements mark large files to be treated as LFS files; because the patterns above are unquoted, the shell expands them, so each existing file is tracked individually. From that point you need to commit this change and its .gitattributes file, and you are good to go. If you want I can go live on Instagram or help you out.
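For reference: if you quote the pattern, as in git lfs track "*.wav", the shell leaves it alone and Git LFS tracks the file type itself. The committed .gitattributes then contains one line per pattern, roughly:

```
*.wav filter=lfs diff=lfs merge=lfs -text
*.als filter=lfs diff=lfs merge=lfs -text
```

That way any .wav or .als you add later is picked up automatically, instead of only the files that existed when you ran the track command.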

Commit Git LFS in SourceTree

A live setup for Ableton Live

It just does live gigs

I guess most musicians use Ableton Live live the way I use it. It’s kind of the standard way of building a live set. This article describes the details of the implementation as I use it.

So there is the Session View, with the track channels laid out with different instruments and the scene rows holding the different songs. Within each song there are several scenes with the intro, verses, choruses, break and outro. Ableton will follow the BPM mentioned in the description, and you can set the Launch Follow Action to let Ableton run the flow of each song. This way Ableton backs your song live with the right scenes at the push of a button, with effects automated or manual as you want them, and in the correct tempo. Additionally I use MIDI Program Change commands to instruct the Nord and the Korg to switch to the right instruments for any scene of any song.
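Those Program Change commands are tiny two-byte MIDI messages. The sketch below illustrates what gets sent per scene; the channels and patch numbers are made up, and your Nord and Korg patch lists will differ.

```python
def program_change(program, channel=0):
    """Build a raw MIDI Program Change message: status byte + program number."""
    return bytes([0xC0 | channel, program])  # 0xC0 = Program Change

# One entry per scene of a song: (MIDI channel, patch number) per device.
# Hypothetical values: Nord on channel 0, Korg on channel 1.
SONG_SCENES = {
    "intro":  [(0, 12), (1, 5)],
    "chorus": [(0, 13), (1, 5)],
}

for channel, patch in SONG_SCENES["intro"]:
    print(program_change(patch, channel).hex())  # -> c00c, then c105
```

In practice each scene simply holds a MIDI clip containing these Program Change messages on the tracks routed to the Nord and the Korg.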

Ableton Live live set

In my case I play solo, or with the aid of other musicians. I can choose which tracks to leave out: the backing vocals, the bass, or at least one or more keys. On the whole Ableton Live runs the show in my case, so I should be careful not to bore the audience with too much music out of the box, and I should keep working on performance, video tracks and light effects all the time. I try to use only the Ableton Push, avoiding using the laptop to start and stop.

What’s on the monitor?

Let’s start cheating a little. Because not every track has drums, I rely on a click that gets routed to the monitor. In the picture above you can see the click track on the left. It just plays and plays and gets sent to the Cue Out on Return Track C. Return Track C works pre-fader, so it is in no way linked to the master mix. The Cue Out goes to a separate output on the audio interface and can thus be mixed into all monitors. For now this suffices.

All live instruments, vocals output and everything from Ableton Live gets mixed in by the audio interface. The audience hears the Master Out mixed and on stage you hear the Cue Out mixed with the click.

Prepare for the worst

My live set contains an instrument rack that is set up to be a playable, plug-in-based copy of the most important instruments I use live. Should an instrument break down, I then have the option to use any MIDI keyboard to replace it. The plug-in sounds are not as nice as the Nord and Korg sounds, but I will have something to play instead of nothing.

Live Instrument Rack

To make sure that I always have a way to recover in case of emergency, the entire live set is stored in the cloud. This way I can fine-tune the Ableton Live live set from home and push it to the cloud. The moment I open the laptop for a show and there is Internet, it will sync up. I use OneDrive, but any cloud product is fine. Should the laptop break down, I should be able to recover the Ableton Live installation, a few plugins and packs and any interface on any other laptop and sync the live set again. At the last moment a backup laptop should be ready to swap in on the spot if needed. Let’s pray it never comes to this, but if it can happen, it will.

 

Trying out the Spitfire eDNA Earth instrument

I will try to write down my impressions of the Earth instrument; however, this will not be a complete review. For in-depth reviews please check MusicRadar or TheAudioSpotlight or others. For me, ever since Camel Audio was bought by Apple and its Alchemy synthesizer disappeared as a standalone virtual instrument, I felt lost. Alchemy had a granular synthesis engine and a unique way to parameterize its sounds. The unique sound of this instrument disappeared and there was nothing to replace it. Omnisphere apparently is capable of recreating some of the sounds, but that is mainly because it can synthesize anything, and it’s priced accordingly. The moment I heard a demo of Earth, I heard some of that Alchemy sound again.

Technically it’s a completely different beast compared to Alchemy. The Earth sounds are based on an orchestral sample library, but are then processed by the Kontakt engine to sound cinematic, otherworldly and sometimes electronic. Yes, it’s a Kontakt instrument, so you need at least the Kontakt Player. Inside Kontakt you will find the eDNA interface of this instrument. As an owner of a Komplete Kontrol A-series keyboard, this is very convenient: it means I can use the Komplete Kontrol browser to quickly browse through the sounds and immediately tweak parameters once a sound is loaded.

The Kontakt engine and the eDNA interface of Earth take some getting used to. To make sure you fully understand their workings, it’s a good idea to go through the walkthrough on the Spitfire Audio site. In short, every sound consists of two samples from the library, which are mangled, then mixed, then chopped up and lastly processed by a set of effects. Very important to see is that you have sounds, but also full versions of the same sounds; the full version contains the full range of orchestral samples. This allows you not only to start with a fixed set of samples, but to eventually switch out one of the samples for another.

The result is a sound that is usually cinematic. Sometimes a wash or a drone in the background, sometimes a sharp stab in the foreground. Because of the mangling and the chopping, sounds can really get that grainy Alchemy sound, or a dirty sound. None of the patches is really clean. I can only say: I love it. All sounds immediately inspire you to build a soundscape. Even better, with a Komplete Kontrol keyboard you can also immediately start changing the sound, bringing it even more to life.

If you are looking for cinematic sounds, drones or dirty stabs and you want an affordable synth, then I invite you to take a look at this Kontakt library. In most reviews you will find some comments on the eDNA interface of this instrument, and I have to agree that it can be kind of hard to find your way among elements that do not invite you to click or drag. After some getting used to, it is not that bad. All in all: recommended!

Soundbrenner Pulse wearable metronome, the verdict

After diving into the basics and getting it to work, it was time to really start using it. First off, the concept really works. I have songs without drums, and practicing these can be tough, so I tried working with a click track in the monitoring. That helps if you get into the flow. But after using the Pulse a few times it felt completely natural and my mind “felt the beat” and leaned into it. It was important for me to tone down the default “hard buzz” of the Pulse to a more subtle vibration level; now it really works for me. The battery life of the device is excellent for me: I have been practicing for hours on end and it’s not even half empty. Charging it is a bit fiddly though.

There are, however, several problems with the product. If you look past the appearance (it looks and feels very plastic and rough at the edges), what are the real problems? At this point in time, for me the Ableton Link feature does not work reliably. If the tempo changes for a new song in an Ableton Live session, I do not know when or if the metronome app will pick this up. This should be a simple bug to fix, or I have a unique setup in my WiFi network or Android version (the latest version of Android Pie, 9), or something else is wrong. I am willing to try an iPad in the near future to see if it works better.

Then there is of course the problem that it’s three devices. Your laptop, a phone or tablet with the Metronome app, and the Pulse all have to be fully charged and set up to make this work in a live situation. On several occasions I had to reconnect the phone to the Pulse to make or keep it working. Even if the sync between the laptop and the Metronome app does get fixed, all devices need to be on a perfectly working WiFi network on stage, and how realistic is that? I think you can see that this device is probably at its best while practicing or in the rehearsal room. Only with a dedicated professional crew on stage to keep it working might it just work live.

In short, I cannot do without it anymore when practicing, but I would never try to get this to work in a live situation. Maybe it all gets fixed in the next version, the Core. Let’s wait and see.