
Accuracy: Where’s The Point? – Part 2

Figure 1. Analog linear voltage and non-linear dB scales.

Last month, I discussed the differences between accuracy, precision, and resolution when making some voltage/frequency measurements. In audio, however, we are also often interested in the relative differences or variations in response and performance. For example, we may want to know that a certain piece of equipment has a nominally flat response but is, say, 3dB down at 100Hz and 5kHz. This is much easier to comprehend than being told that the response is 227mV less at 5kHz than at 1kHz. Equally, it is much easier to comprehend that a response is ±1dB from 20Hz to 20kHz than that the output may vary between 690mV and 869mV.
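If you want to check these conversions yourself, a few lines of Python will do it. This is just a sketch, and it assumes, as the figures above imply, a 0dB reference of 775mV (i.e., 0dBu):

```python
import math

V_REF_MV = 775.0  # assumed 0dB reference (0dBu) in millivolts


def db_to_mv(db):
    # Voltage in mV corresponding to a level in dB relative to 775mV.
    return V_REF_MV * 10 ** (db / 20)


def mv_to_db(mv):
    # Level in dB of a voltage (in mV) relative to 775mV.
    return 20 * math.log10(mv / V_REF_MV)


# ~548.6mV, i.e., roughly the 227mV drop below 775mV quoted above
print(f"-3dB  -> {db_to_mv(-3):.1f} mV")
# ~690.7mV to ~869.6mV, matching the ±1dB window quoted above
print(f"±1dB -> {db_to_mv(-1):.1f} to {db_to_mv(+1):.1f} mV")
```

Run as-is, it reproduces the values quoted above to within rounding.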

There are times, of course, when the absolute values are important to know, e.g., the maximum input or output levels for a piece of equipment. I mentioned that I have a very expensive vintage analog audio voltmeter that is scaled in both millivolts and decibels. This is a great advantage when I want to quickly check the response of a piece of gear: I can set up a level at a convenient dB value and then make adjustments or sweep the frequency range of interest. I can then eyeball the needle and see how it deviates from my notional reference dB value.

Despite its cost, the meter has a stated accuracy of only ±0.5dB. In reality, it is better than that, but for the intended purpose, this accuracy is fine. Most of the analog meters I have that possess a dB scale are scaled linearly in volts but also carry a non-linear dB indication. In other words, the voltage scale is broken up into a series of linear divisions, usually graduated from 0 to 10 (though sometimes from 0 to 3), while the width between the divisions on the dB scale varies.

It does this because the dB graduations are logarithmically spaced: equal dB steps represent equal voltage ratios, so on a linear voltage scale the physical distance between successive dB marks keeps changing. Therefore, depending on exactly where on the scale I set my reference point, I can increase or decrease the resolution available. See Figure 1 for an example.
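To see how that spacing works out, here is a small, hypothetical sketch. It assumes a linear voltage scale graduated from 0 to 10, with full scale taken as the 0dB point, and prints where each 1dB graduation physically falls:

```python
# Why dB graduations are unevenly spaced on a linear voltage scale:
# each 1dB step lands a different physical distance from the last.
FULL_SCALE = 10.0  # assumed: full scale (10 on the voltage scale) is 0dB

prev = None
for db in range(0, -11, -1):
    pos = FULL_SCALE * 10 ** (db / 20)  # position on the linear scale
    gap = "" if prev is None else f"  (gap {prev - pos:.3f})"
    print(f"{db:+d}dB at {pos:.3f}{gap}")
    prev = pos
```

The gaps shrink steadily toward the bottom of the scale, which is exactly why the usable dB resolution depends on where the reference point is set.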

This can be advantageous, but you need to set up the measurement carefully, and it is a good reminder of just what the measurement is all about. A not-quite-so-vintage meter of mine uses an electronic log converter and a linear dB scale; hence, the dB resolution is fixed and remains the same no matter what the absolute value is. Digital (dB) meters behave in a similar manner. Most modern sound level meters, for example, have a resolution of 0.1dB, but are they accurate to within 0.1dB? Oh my goodness me, NO! Nowhere near it! Let's take a look, for example, at the class 1 and class 2 sound level meter tolerance curves, shown in Figure 2.

Figure 2. Sound level meter measurement tolerance curves.

Whereas a class 1 sound level meter has to be within ±1dB over the range 100Hz to 1kHz, at 10kHz the permissible deviation increases to +2dB and -3dB. Thinking of this in terms of millivolts (which is what the microphone produces, after all), we have seen that ±1dB could correspond to a variation of 690mV to 869mV for a nominal 775mV reading, while at -3dB the reading could be as low as 548mV. So the ±1dB tolerance of a class 1 sound level meter isn't as good as it may at first seem, permitting an equivalent voltage reading to be somewhere between 548mV and 869mV. For a class 2 meter, these tolerances increase to ±1.5dB over that midband range and ±5dB at 10kHz.

This means that two different class 1 meters, each with 0.1dB measurement resolution, could show a difference of 2dB and still be within specification, while at 10kHz a measurement could show a difference of 5dB (or 10dB if using a class 2 meter). These are huge differences, yet the 0.1dB readout suggests an apparent accuracy up to 50 times better than the meter may actually be capable of.
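To put numbers on this, here is a minimal Python sketch using only the tolerance figures quoted above (the labels are mine, purely for printing):

```python
# Compare worst-case tolerance errors with the 0.1dB display resolution.
RESOLUTION_DB = 0.1

tolerances = {
    "class 1, 100Hz to 1kHz": (+1.0, -1.0),
    "class 1, 10kHz":         (+2.0, -3.0),
    "class 2, 10kHz":         (+5.0, -5.0),
}

for name, (plus, minus) in tolerances.items():
    spread = plus - minus                # two meters at opposite extremes
    worst = max(abs(plus), abs(minus))   # one meter's worst-case error
    print(f"{name}: two-meter spread {spread:.0f}dB, "
          f"single-meter error up to {worst / RESOLUTION_DB:.0f}x "
          f"the 0.1dB resolution")
```

It reproduces the 2dB, 5dB, and 10dB spreads, and shows a single meter's permitted error reaching 50 times the 0.1dB display resolution at 10kHz on a class 2 meter.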

Now, acoustically, a difference or variation of 1dB is virtually insignificant, and even 3dB is claimed by many to be only just audible (although I would suggest it's a very significant difference). So why are we trying to measure sound levels to 0.1dB? Because we can! Clearly, such apparent accuracy is meaningless. The problem is that we live in a 3dB world but seem to insist on an apparent accuracy 30 times better than this. Maybe the ancients got it right, and my vintage analog meter gives the more honest indication, although, in this case, with less resolution than its modern digital counterpart. That, perhaps, should be my new year's resolution: 1dB!

