Calibrating for 'bit-perfect' playback

Audio, MIDI and other software, not including effects or instruments
Post Reply
dB Cooper
Posts: 3
Joined: 10 May 2019, 07:38

Calibrating for 'bit-perfect' playback

Post by dB Cooper » 09 Aug 2019, 17:38

Apologies in advance for the verbose post, and if I chose the wrong place for this, feel free to move it as needed.

I use my 2018 Mac Mini as my primary source of audio playback. I stream Tidal and Idagio lossless, and use a variety of players according to my mood to play back local files. Main ones are Swinsian, VLC, Fidelia, Vox, and occasionally even iTunes. Whatever I am listening to goes out to an external DAC, and from there to an analog amplifier.

Not long ago, I installed the latest release of Rogue Amoeba SoundSource, a little utility that lets me control the routing of sound on the computer. It also allows me to use plug-ins with any sound source. I often use one designed for headphones, which gives a more natural stereo image.

Among SoundSource’s features are little level meters. They are useless for determining actual sound levels, but they have one very handy feature: they turn red when the signal is clipping. That's where I discovered I had a problem: all of the applications mentioned above have their own volume controls. The standard advice I see (including from Rogue Amoeba) is to max out the volume control on the source and control the volume using the system volume, to get ‘bit-perfect’ playback. There are a couple of problems with this.

First, when the DAC is selected as the output device, the system volume setting is grayed out. SoundSource does have a volume setting for it, but (AFAIK) it won't help with the second (and bigger) problem: all of the players mentioned above apply gain to the signal at their maximum volume settings. This results in clipping on peaks, verified by the indicators in SoundSource. As I just mentioned, changing the system volume going into the DAC can only be done in SoundSource. It also seems to me that if any source I am listening to is outputting a clipped signal, turning the volume down after the fact is not going to ‘un-clip’ it. And that's not what I want anyway; I want to control the volume in the analog domain. Even I know that a clipped signal is not ‘bit-perfect’.
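
Here's a quick sketch (Python with numpy; the +3 dB figure and the sine are made up for illustration, not measured from any player) of why turning the volume down after the fact can't un-clip anything:

```python
import numpy as np

# Illustrative only: a full-scale sine pushed 3 dB past 0 dBFS,
# hard-clipped at full scale, then attenuated afterwards.
sr = 48_000
t = np.arange(sr) / sr
clean = np.sin(2 * np.pi * 1000 * t)         # 0 dBFS sine

gain = 10 ** (3 / 20)                        # hypothetical +3 dB of player gain
clipped = np.clip(clean * gain, -1.0, 1.0)   # hard clip at full scale

restored = clipped / gain                    # turning the volume back down
error = np.max(np.abs(restored - clean))
print(f"residual distortion after re-attenuation: {error:.3f}")
```

The flattened peaks stay flattened; attenuation just makes the damaged waveform quieter.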

So that's where I get into calibration, which is really my main question. I want to pass the signal to the DAC at the highest level possible without delivering a clipped signal downstream, and then control volume after the DAC, in the analog domain, on my amp. So far, I have done this by finding the recordings in my collection with the loudest peaks. The champ so far is ‘Trilogy’, a Chick Corea live recording. I just back down the player volume until the meter stops turning red. Since Tidal has this recording, I found the same passage and adjusted the volume the same way. I did this with all my players. VLC has a tick mark on its volume slider that seems to correspond to unity gain, but it’s the only one. Fidelia has a ‘Prevent clipping’ checkbox in its prefs but provides no info on how it operates.

This is all pretty confusing, because when I started in audio back in the Mesozoic, you just turned the knob until you got the volume you wanted. Now there are multiple volume controls along the way, and of course, when you run out of bits, you’re straight into hard clipping. To confuse me even more, apparently ‘0 dB’ in digital isn’t really 0 dB, but has some ‘headroom’ calculated in to allow for that.

So, any advice on calibrating for ‘bit-perfect’ playback (or something close to it)? I have the following tools at my disposal:

Logic Pro X
Audio Hijack
TBProAudio ‘MyMeter 2’ (meter plugin with various modes)
Test tones (all 88K, 16-bit): -3 dBFS pink noise; 0 dBFS white noise (Gaussian distribution); -6 dBFS white noise (uniform distribution)
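
(For anyone who wants to roll their own, tones at a stated dBFS peak are a few lines of Python with numpy. This is just a sketch: a 1 kHz sine stands in for the pink noise, since proper pink filtering is more than a quick example, and the sample rate is my reading of "88K" as 88.2k.)

```python
import numpy as np

sr = 88_200  # assumed 88.2 kHz

def dbfs_to_linear(db):
    """Convert a peak level in dBFS to a linear amplitude (1.0 = full scale)."""
    return 10 ** (db / 20)

t = np.arange(sr) / sr
# 1 kHz sine with a -3 dBFS peak (stand-in for the pink noise track)
sine = dbfs_to_linear(-3.0) * np.sin(2 * np.pi * 1000 * t)

# uniform-distribution white noise with a -6 dBFS peak
rng = np.random.default_rng(0)
noise = dbfs_to_linear(-6.0) * rng.uniform(-1.0, 1.0, sr)

peak_db = 20 * np.log10(np.max(np.abs(sine)))
print(f"sine peak: {peak_db:.2f} dBFS")
```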

Thanks in advance for any help or suggestions.

Kevindhoffman
Posts: 1
Joined: 26 Sep 2020, 20:18

Re: Calibrating for 'bit-perfect' playback

Post by Kevindhoffman » 26 Sep 2020, 20:19

I have the *exact* same questions. Wishing someone had/would weigh in.

dB Cooper
Posts: 3
Joined: 10 May 2019, 07:38

Re: Calibrating for 'bit-perfect' playback

Post by dB Cooper » 27 Sep 2020, 05:30

Kevindhoffman wrote:
26 Sep 2020, 20:19
I have the *exact* same questions. Wishing someone had/would weigh in.
Well, it doesn't seem like this is the place to look for help, considering that it has taken over a year for the thread to get one reply, but I'll share what I have done to solve the immediate issue (clipping).

First, some of the streamer apps have test tone tracks available; I know Tidal does. Install an Audio Unit plugin called MyMeter 2 on your Mac (it's free). If you search 'test tones' on Tidal, there is a 1 kHz 0 dB peak tone (turn your headphones, speakers, or amp down low while playing it!). All you have to do is set the scale on MyMeter to Peak (click on the meter itself), play the tone, and adjust the player volume control until the meter reads the same as the stated level of the track. Then never touch the volume control in the app (in this example, Tidal) again; set your listening volume downstream.
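
What that meter check is doing, as a rough sketch (Python with numpy; the +2 dB slider gain is a made-up number, and this is the peak-in-dBFS arithmetic, not MyMeter's actual code):

```python
import numpy as np

def peak_dbfs(samples):
    """Peak of a float sample buffer expressed in dBFS (1.0 = full scale)."""
    peak = np.max(np.abs(samples))
    return 20 * np.log10(peak) if peak > 0 else -np.inf

sr = 48_000
tone = np.sin(2 * np.pi * 1000 * np.arange(sr) / sr)  # the 0 dBFS test tone

# A player slider that applies a hypothetical +2 dB of gain; in a float
# pipeline the meter can read over 0 dBFS before the DAC ever sees it.
hot = tone * 10 ** (2 / 20)
print(f"unity: {peak_dbfs(tone):.2f} dBFS, hot: {peak_dbfs(hot):.2f} dBFS")
```

You back the slider off until the meter reads the track's stated level (0 dBFS here), and you're at unity.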

Or switch services to Qobuz, which is the only streamer I have found whose volume slider actually maxes out at unity gain. Just hit Play and you're done. AFAIK they are alone among lossless streaming services in properly engineering their volume management; every other one I have tried (read: all of them) has had to be calibrated to avoid clipping.

As far as playing local files:

Most players have the same design fault as the streamers mentioned above. VLC, for example, can also apply gain, but it does at least have a helpful tick mark at unity gain on its volume slider. I still suggest using MyMeter or the clipping indicator in SoundSource. But what to calibrate to?

The easy way: download the following file: , play it in your player of choice, and adjust the player volume until MyMeter displays the specified value.

The slightly less easy way: get Audacity (free) and the ACX Check plugin (also free). Pick any audio track, preferably one with loud (but clean) peaks. Load the file into Audacity, run ACX Check, and note the peak level. Then play the same track in your streamer app and adjust the volume, using MyMeter as described earlier, until the peak level matches what ACX Check reported.

Note: if you have Audio Hijack, use that. In SoundSource, the meter disappears when you click outside the window, and you have to open it back up while the track is playing. This is annoying, but once done, it's done (for that app). Audio Hijack allows the meter to remain visible continuously.

Sorry about the verbose reply but hope this helps. Summing up:

Get MyMeter 2 (Audio Unit plugin)
Get Audio Hijack or SoundSource (both from Rogue Amoeba)
Get the test tones (link above)
Go to it

Feel free to PM me with any other questions; I'm no audio engineer, but I'm willing to share what I do know.

G
Posts: 2028
Joined: 05 May 2003, 18:11
Twitter: glenngutierrez
Location: SF Bay Area, California

Re: Calibrating for 'bit-perfect' playback

Post by G » 27 Sep 2020, 11:33

This may not be the "place to look for help" for a few reasons. More than anything else, this is a professional audio and music recording forum, not a consumer DAC listening forum. The questions you're asking have many variables, and in pro recording they are answered precisely in the software and external hardware that we use. With macOS (and Windows) audio playback, however, even the basic question of where the OS volume slider sits relative to unity has never been answered. I remember asking it long ago, and the answer was that "around 75%" should be unity. With third-party apps? Who knows?

If it were me, I'd first make sure there are as few steps as possible between source and DAC: no plugins, no utilities. If I were forced to use a source that didn't just pass bits to an external device (you're absolutely sure there's no direct-out mode that skips any volume control?), I'd ask the software developer where unity gain is in their app, as you found with VLC or Qobuz. I don't know what else you can really do, outside of feeding a digital out to a capture device and comparing that recording with the original file.

You may also need to understand some things about audio mastering in all of this. Recent masters have an ideal peak of just under 0 dB; older masters may have gone right up to 0 (or worse, nowhere near it, wasting bits). Good masters also keep the "true peak" level under 0 dB, but not many mastering engineers, even good ones today, know or care about this. It involves how the digital data is reconstructed into an audio signal: data that represents audio under 0 dB can actually go over 0 between samples when it is turned back into an audio stream by some codecs or DACs. If you're playing back a master that didn't account for true-peak overs, aka intersample overs, you could see clipping even though the original source never did, because nobody cared about modern listening situations. So you'll need to back off your playback level by 1-3 dB to compensate, and that's not bit-perfect.
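
For anyone curious, the intersample-over effect is easy to demonstrate (Python with numpy; this is a textbook worst-case signal and a simple FFT oversampler, not any product's true-peak meter):

```python
import numpy as np

# A sine at fs/4 with a 45-degree phase offset: every sample lands at
# exactly full scale (0 dBFS), but the continuous waveform between the
# samples peaks about 3 dB higher.
n = 64
x = np.sqrt(2) * np.sin(2 * np.pi * np.arange(n) / 4 + np.pi / 4)

def oversample(sig, factor):
    """Ideal band-limited interpolation via FFT zero-padding."""
    spec = np.fft.rfft(sig)
    padded = np.zeros(len(sig) * factor // 2 + 1, dtype=complex)
    padded[: len(spec)] = spec
    return np.fft.irfft(padded, len(sig) * factor) * factor

sample_peak_db = 20 * np.log10(np.max(np.abs(x)))                # 0 dBFS
true_peak_db = 20 * np.log10(np.max(np.abs(oversample(x, 8))))   # ~ +3 dBTP
print(f"sample peak {sample_peak_db:.2f} dBFS, true peak {true_peak_db:+.2f} dBTP")
```

A sample-peak meter says this signal is legal; the reconstructed waveform the DAC has to produce is well over full scale, which is exactly the clipping G describes.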

Test tones take you back to the old days of analog level matching and gain staging. I mean, that can work to get you a safe level, but again, it's not bit-perfect.

Sorry, like I said, there are many, many variables here.
MacBook Pro 15" (2016) . UA Apollo x6 + Quad . macOS Catalina . Logic Pro X . TwistedWave . FCPX . PS CC . Affinity Suite
