Digital volume control & DAC resolution

It has often been stated that some DACs lose resolution when using their volume control. I’m assuming this applies at anything less than 100% volume, right?
My DAC at full volume outputs +4 dBu, and my active speakers’ input sensitivity matches that. So shouldn’t I always keep the DAC at full volume and adjust loudness by tweaking the speakers’ gain (class D), even as low as possible, to get the least amount of class D distortion while the DAC retains its full resolution?
Apparently resolution loss from a digital volume control is only a problem in some DACs, depending on the make. I have an Electrocompaniet “PD-1 High Performance Balanced DAC” (that’s what it says on the front plate). Electrocompaniet is, I trust, a company that would make sure its DACs don’t alter the bit structure in any way when the volume control is adjusted. Their representative actually confirmed this when I called. So… I guess I’m fine.
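
To see why resolution loss can happen at all, here is a minimal sketch (not any specific DAC’s implementation, and real DACs often avoid this by computing at a wider internal word length): naive fixed-point attenuation rescales the samples and re-quantizes them, so roughly one bit of resolution is discarded for every 6 dB of attenuation.

```python
# Sketch: naive fixed-point digital attenuation on 16-bit samples.
# Scaling down and rounding back to integers merges adjacent codes,
# so the number of distinct output levels shrinks.

def attenuate_fixed_point(sample: int, volume: float) -> int:
    """Scale a 16-bit integer sample and round back to an integer code."""
    return round(sample * volume)

FULL_SCALE = 32767  # 16-bit peak value

# At -6 dB (volume = 0.5), only about half the original codes survive.
levels = len({attenuate_fixed_point(s, 0.5)
              for s in range(-FULL_SCALE, FULL_SCALE + 1)})
print(levels)  # 32769 distinct output codes instead of 65535
```

A DAC that does its volume math at, say, 32 or 64 bits internally and only quantizes once at the very end sidesteps most of this, which is presumably what the Electrocompaniet representative meant.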
Still, shouldn’t output voltages ideally always match what the input is expecting? A preamp designer from Gryphon Audio actually stated (this was in Stereophile) that you don’t need a preamplifier if the output voltage of a DAC matches what the amplifier expects to receive. A broad statement, but it makes sense…
(Again, this is about necessity. Of course, if you want to spice up the sound with a particular preamp, it won’t hurt; the voltages just need to match. In the mentioned article about the Gryphon preamp, a stated feature is “unity gain”, which sums up this whole voltage-matching concept: it’s “neutral” in the signal chain, meaning simply voltage in = voltage out = what the amplifier expects.)
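
For concreteness, the +4 dBu figure from above can be converted to volts with the standard dBu definition (0 dBu = the voltage producing 1 mW into 600 Ω, i.e. √0.6 ≈ 0.7746 V RMS). A small sketch:

```python
import math

DBU_REF_VOLTS = math.sqrt(0.6)  # 0 dBu ≈ 0.7746 V RMS (1 mW into 600 ohms)

def dbu_to_vrms(dbu: float) -> float:
    """Convert a level in dBu to volts RMS."""
    return DBU_REF_VOLTS * 10 ** (dbu / 20)

def vrms_to_dbu(volts: float) -> float:
    """Convert volts RMS to a level in dBu."""
    return 20 * math.log10(volts / DBU_REF_VOLTS)

# The DAC's +4 dBu full-scale output mentioned in the thread:
print(round(dbu_to_vrms(4.0), 3))  # 1.228 V RMS
```

So “matching” here means the speakers’ input stage reaches its own full scale at about 1.23 V RMS.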


It sure looks like DACs need to have very high output voltages. I can even see the logic behind that. I would never ever do anything in the digital domain; it changes the content by definition. And I would never use a digital amp, so that’s the end of that. But to finish my answer: volume control should be done either with a potentiometer (or a digital R2R device like the DS1666, but not in the signal path) OR by controlling the amount of gain in the feedback circuit. IMHO of course…
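
To illustrate the feedback-gain approach: for an idealized non-inverting amplifier stage, the voltage gain is 1 + Rf/Rg, so stepping the feedback resistor changes the gain without putting a pot directly in the signal path. The resistor values below are purely hypothetical.

```python
import math

def noninverting_gain_db(r_feedback: float, r_ground: float) -> float:
    """Voltage gain of an ideal non-inverting amplifier stage, in dB."""
    return 20 * math.log10(1 + r_feedback / r_ground)

# Hypothetical stepped feedback values against a fixed 1 kOhm leg:
for rf in (1_000, 10_000, 100_000):
    print(f"Rf = {rf:>7} ohm -> {noninverting_gain_db(rf, 1_000):5.1f} dB")
```

Switching Rf between fixed steps (e.g. via relays) is one way a “stepped” volume control can live inside the feedback loop rather than attenuating the signal itself.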

Well, my monitors have their gain controlled by a stepped potentiometer-ish thing. I don’t know if it’s relay-based (probably not) or just mechanically stepped, but in any case, turning it (on the back plate) increases the gain of the speaker. Obviously.

[Disregard that, you mentioned it already:
“OR by controlling the amount of gain in the feedback circuit”]
Isn’t that what this potentiometer-ish thing does?

SHOULD a class D amplifier always be driven at its lowest listenable gain? Or can increasing the gain to a certain degree (I don’t know how much, or for which amp, which is kind of an issue…) actually be beneficial for the sound quality?
I somewhat understand how a class A amp works, and there managing an optimal gain is actually a thing, but class D amps are more complex anyway, so they partly remain a mystery…

What I mean by gain, @Arenith, is an increase or decrease of impedance in the feedback of the amplifier, either locally or globally. I have an Ayre KX-R that does that. The other obvious way to regulate volume is a potentiometer-sorta thingy shunting the audio signal to ground. My Audio Research LS22 does that in a very basic manner.

Now, in a theoretical sense: I think I might start to believe in Class D when it is combined/incorporated into a DSD DAC, since the two conversions are very close. That might be something for @tedsmith. A PowerDAC that in one move creates speaker movement from a digital DSD stream…
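
The kinship the post hints at: DSD is a 1-bit delta-sigma pulse stream, and a class D output stage also switches between two levels. A toy first-order delta-sigma modulator (a sketch of the principle, not how any real PowerDAC is built) shows how a ±1 bitstream’s average tracks the input signal:

```python
def first_order_dsm(signal):
    """Toy first-order delta-sigma modulator: input in [-1, 1] -> ±1 bits."""
    integrator = 0.0
    bits = []
    for x in signal:
        bit = 1.0 if integrator >= 0 else -1.0  # 1-bit quantizer
        integrator += x - bit                   # feed the error back
        bits.append(bit)
    return bits

# A DC input of 0.25: the running average of the bitstream converges to it.
bits = first_order_dsm([0.25] * 10_000)
print(sum(bits) / len(bits))  # ~0.25
```

Conceptually, a class D power stage doing this directly from a DSD stream would replace the small-signal ±1 levels with the supply rails, and the speaker plus a passive filter would do the averaging.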


Interesting idea, totally.

Now, to the subject of DACs being run at their maximum output voltage, preferably fed into something expecting just that voltage… Is there a certain benefit here?

How can I actually with common tools measure what input voltage my amplifier is “expecting” to receive?
Is this standardized closely enough that single-ended and balanced levels are set precisely enough between devices that we don’t suffer from signal voltage mismatches at interconnects? Apparently not, since it remains an issue worth noting today.
I find the concept of a “unity gain” somehow very appealing.
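
Short of measuring, one common way to estimate what an amplifier “expects” is from its datasheet: the input voltage needed to reach rated power is V_in = √(P × R_load) / gain, with the voltage gain converted from dB. The numbers below are hypothetical, just to show the arithmetic.

```python
import math

def input_sensitivity_vrms(rated_power_w: float, load_ohms: float,
                           voltage_gain_db: float) -> float:
    """Input volts RMS needed to reach rated output power (ideal amp)."""
    v_out = math.sqrt(rated_power_w * load_ohms)  # output V RMS at rated power
    gain = 10 ** (voltage_gain_db / 20)           # dB -> linear voltage gain
    return v_out / gain

# Hypothetical amp: 100 W into 8 ohms with 26 dB of voltage gain.
print(round(input_sensitivity_vrms(100, 8, 26), 2))  # ~1.42 V RMS
```

If that result lands near the DAC’s full-scale output voltage, the pairing is roughly “unity gain friendly”; if the amp needs much less, the DAC will clip the amp long before full volume.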


These exist; the NAD M32 is an example. However, its successor, the M33, uses a conventional combination of a DAC feeding a Class D amplifier.


Well… There ya go…

Well, a DAC generally performing best at max output is very understandable.

But what about Class D distortion? Should I keep the gain at minimum?