How Live decides where to place audio recordings on the timeline when you are monitoring through the audio track you are recording to isn’t exactly intuitive. Sorry this is so long, but there’s a lot involved.

We will hear the audio only after it has gone through the interface into Live and back out through the interface - after the time dictated by the size of the audio buffer. Live assumes that if we are monitoring via an audio track then we want the audio placed as near as possible to when we heard it, so to compensate for that latency Live shifts the recorded audio back along the timeline by the audio latency setting in Preferences. The idea is to make things consistent between audio monitored through an audio track and other audio monitored direct via an interface, or heard as an acoustic instrument, guitar amp, whatever, at the same time. It’s not an exact process because it doesn’t take account of the time it takes for sound to get from the monitors to our ears, but let’s keep things simple and pretend we’re wearing headphones.

I hope my analysis is correct, but if it isn’t I’m more than happy for someone to correct me. I don’t have either a TR or a Behringer interface, which limits my experimenting, but I think there are at least two different things going on in that project.
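To make the arithmetic concrete, here’s a rough sketch of the compensation described above. This is just my illustration, not Live’s actual code: the function names are made up, and I’m assuming the simplest case where the round trip is one input buffer plus one output buffer (real interfaces add driver and converter latency on top, which is why Live has a driver error compensation setting).

```python
def round_trip_latency_ms(buffer_size_samples, sample_rate_hz):
    # Simplified assumption: one buffer of input latency on the way in,
    # one buffer of output latency on the way out. Real-world figures
    # are higher because of driver and converter overhead.
    one_way_s = buffer_size_samples / sample_rate_hz
    return 2 * one_way_s * 1000.0

def compensated_clip_start_ms(heard_at_ms, reported_latency_ms):
    # Per the analysis above: Live shifts the recorded audio back along
    # the timeline by the reported latency, so the clip lands where the
    # performance happened rather than where we heard it.
    return heard_at_ms - reported_latency_ms

# Example: a 256-sample buffer at 48 kHz.
latency = round_trip_latency_ms(256, 48000)   # roughly 10.7 ms
start = compensated_clip_start_ms(1000.0, latency)
```

So with a 256-sample buffer at 48 kHz, audio we heard at the 1000 ms mark would be written to the timeline about 10.7 ms earlier. Doubling the buffer doubles the shift, which is why these placement effects get easier to notice at large buffer sizes.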