So when are we going to get to kick the latency elephant out of the living room?

Regarding latency with e-drums, you have to factor in the whole chain: the hit on the pad, the “brain” turning it into MIDI, and the time it takes for that MIDI data to reach the DAW. You might say “anything above 6ms is too much for me”, but you may actually be hearing more than 6ms when you make that determination.

Having said that, using e-drums to trigger BFD or whatever, you only have the output side of the buffer to worry about, because you aren’t going through the A/D stage to get information into the DAW.

Yeah, MIDI has some latency, likely adding a couple of ms on top of the audio output latency. Drum “brains” being embedded devices, there’s likely something like 1ms there, and probably another ms or so getting it into the DAW.
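To put very rough numbers on that (all of these figures are assumptions, not measurements), a scratchpad of the whole budget might look like this:

```
desc: e-drum latency budget (illustrative numbers only)

@init
// Assumed figures; substitute your own measurements.
brain_ms = 1;                   // pad hit -> MIDI out of the drum brain
midi_ms  = 2;                   // MIDI transmission on top of audio latency
host_ms  = 1;                   // driver/DAW getting the event in
buf_ms   = 128 / srate * 1000;  // output buffer only: no A/D stage needed
total_ms = brain_ms + midi_ms + host_ms + buf_ms; // ~6.7ms at 128/48k
```

Which is how a nominal “6ms is too much” can already be blown before the output buffer is even generous.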

I have a system with a FireWire Focusrite Saffire Pro 40 that can achieve latencies under 4ms. When I get a chance I’ll figure out the lowest latency I can achieve with 16 inputs, EQ and dynamics on each channel, and a few reverb busses.

If I remember correctly I’m running at 3.4ms (32-sample buffer) round-trip latency for the tracking I use it for, but with no effects. I’ll be curious to see what can be achieved, as it supports smaller buffer sizes than that.
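Those numbers hang together, for what it’s worth. At 44.1kHz a 32-sample buffer is about 0.73ms per direction, so most of the 3.4ms round trip is converter and driver overhead rather than the buffers themselves (the breakdown here is my assumption):

```
@init
// Hypothetical breakdown of a 3.4ms round trip at 44.1kHz / 32 samples:
buf_ms  = 32 / 44100 * 1000;  // ~0.73ms per buffer
io_ms   = 2 * buf_ms;         // input + output buffers ~1.45ms
rest_ms = 3.4 - io_ms;        // ~1.95ms left for A/D, D/A and the driver
```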

Keep in mind that the specific plugins you use will matter, maybe a lot. ReaEQ, ReaComp, and ReaVerbate are very light but also not particularly good. But effects on digital mixers probably aren’t particularly good either.

If we look at the innards of JSFX or VST plugins, MIDI events carry a timestamp that is a sample position within the current block. With multiple FX iterating through samples/events in series, each in a tiny fraction of the real time a fixed block represents, the only way that’s possible is for the events to be available in advance. The alternative would be different plugins seeing different MIDI events depending on when in the block they happen to be processed, which doesn’t happen AFAIK.
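You can see this from inside a JSFX: by the time @block runs, every event for the coming block is already queued, each stamped with its sample offset. A minimal pass-through sketch:

```
desc: MIDI pass-through (minimal sketch)

@block
// All MIDI for this block is already queued when @block runs; each
// event comes with 'offset' = its sample position within the block.
while (midirecv(offset, msg1, msg2, msg3)) (
  midisend(offset, msg1, msg2, msg3); // forward unchanged
);
```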

For just plonking MIDI down onto a track it could be done differently, of course.

I learned some JSFX basics at one time, but I’ve forgotten all that by now. You’re saying that MIDI events are delayed by the input latency? I wish I had stuck with learning JSFX, by the way. It has so much potential use.

Yeah, that’s the way it works. If it worked by processing MIDI events as they came in, you’d get all sorts of weirdness. E.g. with a big block size of 1024: VSTi A is processed first and manages to receive a MIDI event at sample position 1 before it blitzes over the block’s MIDI events and samples, but VSTi B misses it because it’s processed later and has to play it in the next block. It would make PDC flaky for the player too.
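An easy way to convince yourself: inside @block you can already read an event stamped near the end of the block before a single sample of that block has been processed. A sketch:

```
desc: block event peek (illustrative)

@block
max_off = -1;
while (midirecv(off, m1, m2, m3)) (
  off > max_off ? max_off = off; // remember the latest-stamped event
  midisend(off, m1, m2, m3);     // pass everything through
);
// max_off can already be close to samplesblock here, i.e. the whole
// block's "future" is known before @sample runs once.
```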

:man_facepalming: worst part is I actually already know that… brain fart.

I’ve read that same thing a hundred times on t’interwebz and used to think that myself, before my fumblings with the undergarments of plugins.

You messing with programming plugins these days? All I ever got to in JSFX was some basic stuff: tinkering a little with altering incoming MIDI to what I wanted, super basic graphics stuff (graphing a waveform and drawing some basic controls using the JSFX graphics functions), and small things on audio. I never got as far as trying to create filters using the FFT functions, which is where I wanted to get at the time, before being distracted into other things. I also wanted to make a personally useful little sampler (as in live sampling, not just playback), but I remember there being a limitation in JSFX of not being able to write files (security reasons), which would have required calling an external function.
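For anyone reading along, that MIDI-altering stuff really is only a few lines. A hypothetical note transpose, roughly the kind of thing being described:

```
desc: naive MIDI transpose (sketch)
slider1:12<-24,24,1>Semitones

@block
while (midirecv(offset, m1, m2, m3)) (
  status = m1 & $xF0;
  // shift note-on/note-off pitches; pass everything else untouched
  (status == $x90 || status == $x80) ? (
    m2 = min(max(m2 + slider1, 0), 127);
  );
  midisend(offset, m1, m2, m3);
);
```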

Not for a few years now; the last plugin I did was a MIDI-playable phaser VST(i) with IPlug. Never did get beyond ankle depth into audio DSP. Everything kind of exists now anyhoo, apart from what ninjas like u-he, Melodyne or JSFX’s Saike are up to, and that’s just “nope!” stuff to me, since the rest of it isn’t exactly intuitive either. :slight_smile:

Your memory is definitely better than mine. It was only a few years ago that I tinkered in JSFX, and I don’t remember much of anything. I guess I could get a refresher at some point by looking back at my little projects and notes.

I think there is still room for some new stuff and for improvements on aspects of existing plugs. On the latter, though, I don’t know what is possible in digital, e.g. richer harmonics in the clipping sections of plugins with acceptably low aliasing and without cramping. I do know that a lot of people still use analog for clipping/saturation sort of stuff because plugins fall short in that area: analog overdrives, fuzzes, transformer-based stuff such as preamps and compressors. But a lot of really smart people have been working on that stuff over the years, so the chances of a DSP peon turning things around there seem pretty slim.
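For context, the waveshaping itself is the easy part; the catch is exactly the aliasing mentioned above. A naive sketch (serious saturation plugins oversample or use antiderivative antialiasing rather than doing it like this):

```
desc: naive tanh clipper (aliases; sketch only)
slider1:6<0,24,0.1>Drive (dB)

@init
// EEL2 has no built-in tanh, so define one from exp()
function tanh(x) local(e) ( e = exp(2*x); (e - 1) / (e + 1) );

@slider
drive = 10 ^ (slider1 / 20);

@sample
// Memoryless waveshaping creates harmonics, including ones above
// Nyquist that fold back down as aliasing; hence the oversampling
// in commercial saturators.
spl0 = tanh(spl0 * drive) / drive;
spl1 = tanh(spl1 * drive) / drive;
```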

Yeah, I think I massively missed the point there; there’s definitely lots of room for improvement and innovation that doesn’t require a PhD in audio sorcery. It’s good that so much has been done too, since you can plug lots of existing things together and tinker with them.

For live mixing, digital hardware mixers have the benefit of overall lower latency than interfaces and plugins. But it also looks like many digital mixers don’t compensate for their internal latencies. Say you put a compressor that introduces 1.5ms of latency on a drum bus and mix that back in with the uncompressed drums in the main bus: the compressed bus is now out of time and phase with the main bus, and the two comb-filter against each other. Seems like internal latency compensation should be a fundamental requirement of all digital mixers.
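For contrast, this is exactly what PDC handles in a DAW: the plugin reports its latency and the host delays the parallel paths to match. With a 1.5ms offset against the dry signal, the first comb notch lands around 1/(2 × 0.0015s) ≈ 333Hz, right in the meat of a drum bus. In JSFX the report is a couple of lines (the 64 samples here is a made-up figure):

```
desc: reporting latency for PDC (sketch)

@init
delay_smp  = 64;         // hypothetical internal lookahead/delay
pdc_delay  = delay_smp;  // tell the host how late our output runs
pdc_bot_ch = 0;          // compensate channels 0..1
pdc_top_ch = 2;
```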