AES/EBU vs HDMI Digital Outputs

If you are dealing with digital signals on electrically sound cables, the conductive medium should have no effect whatsoever on sound quality.

The signals on those cables are square waves (serial data, to be exact), not analog audio. As long as all the bits reach the other end, the resulting sound quality will depend entirely on the quality of your source and DAC.
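If you want to convince yourself of that, a bit-for-bit comparison is trivial to sketch. Here's a toy example in Python; a hash of the raw sample data is all you need to compare two transfers, and the byte strings below are just placeholders for real PCM, not tied to any particular cable or interface.

```python
import hashlib

def fingerprint(pcm_bytes: bytes) -> str:
    """Hash the raw sample data so two transfers can be compared bit for bit."""
    return hashlib.sha256(pcm_bytes).hexdigest()

# If the transfer is bit-perfect, the fingerprints match exactly,
# no matter which cable carried the data.
sent = bytes(range(256)) * 4   # stand-in for a block of PCM samples
received = bytes(sent)         # what arrived at the DAC end
assert fingerprint(sent) == fingerprint(received)
```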

This is the big advantage of digital signalling… absolute fidelity. It works like this… To send a 1 bit, the sending device sets the voltage on the wire to be greater than 1 volt. On the receiving end, any voltage greater than 1 volt is considered to be a 1 bit. For a 0 the same logic applies, except that the voltage has to be less than 0.6 volts. Any noise or cable variation that leaves the voltage on the right side of the threshold is simply ignored… nothing else matters except 0 or 1.
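In code, the receiver's decision boils down to a simple threshold comparison. This is just a toy sketch of the logic described above, reusing the same 1 volt / 0.6 volt thresholds; real standards define their own levels, but the principle is identical.

```python
def slice_bit(voltage: float):
    """Decide a bit from a single sampled line voltage."""
    if voltage > 1.0:   # above the high threshold: it's a 1
        return 1
    if voltage < 0.6:   # below the low threshold: it's a 0
        return 0
    return None         # in between: indeterminate (a sound cable never leaves you here)

# Noise that stays on the right side of a threshold changes nothing:
assert slice_bit(3.10) == 1 and slice_bit(1.27) == 1   # both read as a clean 1
assert slice_bit(0.02) == 0 and slice_bit(0.55) == 0   # both read as a clean 0
```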

Even jitter (bits that arrive early or late) is compensated for in the receiving device by using a very narrow sampling window centred in each bit's intended time slot. A bit can be early or late by up to 25% of its width and you will still get perfect data transfers.
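Here is a toy simulation of that idea, not any real receiver's clock-recovery circuit. Each bit's edges get pushed early or late by up to 25% of the bit width, the receiver samples once at the nominal centre of every bit slot, and the data still comes back perfect. The function names (transmit, receive, and so on) are made up for illustration.

```python
import random

BIT = 1.0  # nominal bit width, in arbitrary time units

def transmit(bits, max_jitter=0.25):
    """Return jittered edge times; edge i marks the start of bit i."""
    return [i * BIT + random.uniform(-max_jitter, max_jitter) * BIT
            for i in range(len(bits) + 1)]

def level_at(bits, edges, t):
    """The bit value actually on the wire at time t."""
    for i, b in enumerate(bits):
        if edges[i] <= t < edges[i + 1]:
            return b
    return bits[-1]  # past the final edge

def receive(bits, edges):
    """Sample once at the nominal centre of each bit slot."""
    return [level_at(bits, edges, (i + 0.5) * BIT) for i in range(len(bits))]

random.seed(1)
sent = [random.randint(0, 1) for _ in range(500)]
assert receive(sent, transmit(sent)) == sent  # perfect recovery despite the jitter
```

The geometry is the whole trick: even if an edge moves 25% early and the next one 25% late, the centre of the nominal slot still lands inside the correct bit.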

So don’t fret over choosing between two types of digital cable … HDMI is not better than USB or Optical … all are digital methods and all will transfer data with 100% fidelity.

The fact that one of the cables you were considering won’t work with one or more of your digital connections is, as you observed, a much better basis for your decision.