Currently I use a Mac mini as my server, and I also sometimes stream over Ethernet directly from my modem/router. I'm sending data to my Schiit DAC over USB. I upgraded to a Pangea USB cable because I read that it's cleaner, both from a digital and an electrical-noise standpoint. I can't identify any noise at this point, but when did that ever stop me from taking steps just in case there might be noise? I recently saw a YouTube video about how USB cables actually differ from each other in how accurately they transmit digital data. So the question I have is whether converting to optical or coax (my DAC can receive both) would likely carry more "original" digital info into the DAC?
With many DACs you can run a test to verify that the path from the source to the DAC is bit perfect. In almost any working digital system this test will never fail. In other words, even though you may be able to hear a definite difference between various cables, or between TOSLink, AES3, S/PDIF, I2S and/or USB, it's not caused by the data being corrupted. The differences are caused by analog noise carried on the cable, perhaps by ground loops in the system producing analog noise and/or jitter in the digital stream. All of these effects are very system dependent, so using your ears is the only way to know if one setup is better than another.
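To make the "bit perfect" idea concrete: the test amounts to comparing what was sent against what was received, sample by sample, with no tolerance at all. A minimal sketch, assuming both ends are captured as raw 16-bit little-endian PCM at the same rate (the file names and helper are hypothetical, not Ted's actual test):

```python
import struct

def read_pcm16(path):
    """Read a raw 16-bit little-endian PCM file into a list of ints."""
    with open(path, "rb") as f:
        data = f.read()
    return list(struct.unpack("<%dh" % (len(data) // 2), data))

def bit_perfect(sent, received):
    """True only if every sample matches exactly.
    A single flipped bit anywhere fails the test."""
    return len(sent) == len(received) and all(
        a == b for a, b in zip(sent, received)
    )

# Hypothetical usage:
#   print(bit_perfect(read_pcm16("source.pcm"), read_pcm16("capture.pcm")))
```

The point of such a test is that it is binary: either every bit survived the trip or it didn't, which is why it can rule data corruption in or out regardless of what the cables sound like.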
Thanks Ted. I didn't realize I was going to get a real reply from the actual chief digital dude! I had pretty much come to the same conclusion you did: that the noise is electrical, not digital, and there are ways of dealing with that. But in the YouTube video I referenced, he very logically compared the DIGITAL info of the original file to the DIGITAL info coming out of the end of the USB cable, and somewhere, somehow, the 0s and 1s were different, because the DAC was actually PLAYING something when he mathematically cancelled the original against the result (it should have been total silence). Fortunately there was no distortion created, only a tiny bump in amplitude, it seems. So is that bit-perfect? I guess not, but if there are five USB cables between the mic, mixer, master, etc., the end result wouldn't be bit-perfect either, now would it? I guess the best thing again is to minimize what can be heard, so I'm off to look at USB isolators.
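The cancellation (null) test described above can be sketched in a few lines: subtract one capture from the other and look at the residual. The sample values here are made up for illustration, and the sketch assumes the two captures are already time-aligned and level-matched:

```python
def null_residual(original, capture):
    """Subtract the capture from the original, sample by sample.
    A truly bit-perfect capture nulls to all zeros; anything
    else leaves a nonzero residual you can hear or measure."""
    return [a - b for a, b in zip(original, capture)]

orig       = [0, 100, -100, 50]
bitperfect = [0, 100, -100, 50]
off_by_one = [0, 100, -100, 51]

print(max(abs(s) for s in null_residual(orig, bitperfect)))  # 0
print(max(abs(s) for s in null_residual(orig, off_by_one)))  # 1
```

One caveat worth noting: if the two captures are misaligned by even a fraction of a sample, or differ slightly in level, the residual can be large even when the underlying data is identical, which is one common way this kind of experiment produces a misleading "difference."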
I've also done the bit-perfect test thru USB and all the other interfaces multiple times and always get a bit-perfect result for valid configurations (e.g. it sometimes fails at 176.4kHz or 192kHz thru TOSLink). His experiment is at odds with my experience. USB is completely reliable for data transfer; otherwise we'd have serious problems with USB disk drives, etc. The isochronous USB protocol can in theory give errors, since there are no retransmissions when it's sending real-time data, but with valid USB cables, hubs, etc. that's not a problem in practice. I don't have the time or inclination to figure out what he did wrong, but you needn't have different bits to get audible differences with DACs in the circuit.
Hi Ted, thanks for taking the time to type that in. I wonder if that guy ever compared the data differences between USB cables with the effect of a dirty stylus on vinyl? I'm pretty satisfied that it's close enough that there's no way my ears could discern a difference. As for RF noise, etc., that's easy enough to minimize: of course there are $800 isolators out there, but the AudioQuest JitterBug at $70 seems to be well received.
I had one more thought: I don't know what equipment he's using, but one digital process that genuinely changes the data is ASRC (asynchronous sample rate conversion), which is common in DACs and digital interfaces because it removes jitter without requiring the output clock to track the input clock. An ASRC's output bits would be sensitive to the changes in jitter across the various cables and digital interfaces.
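A toy illustration of why an ASRC's output is no longer bit-identical to its input: even "resampling" at the same nominal rate, the converter evaluates the signal at fractional sample positions that drift with the clock relationship, so it emits interpolated values rather than the original samples. Linear interpolation here stands in for the much better polyphase filters real ASRCs use; the sample values and the fixed `phase` offset are illustrative assumptions:

```python
def asrc_linear(samples, phase):
    """Re-evaluate the signal at positions offset by a fractional
    sample `phase` (0.0..1.0) using linear interpolation.
    With phase 0.0 the original samples pass through untouched;
    any other phase yields interpolated (different) bits.
    Output is one sample shorter than the input."""
    out = []
    for i in range(len(samples) - 1):
        out.append(round(samples[i] * (1 - phase) + samples[i + 1] * phase))
    return out

pcm = [0, 1000, 2000, 1000, 0]
print(asrc_linear(pcm, 0.0))   # [0, 1000, 2000, 1000] -- bits preserved
print(asrc_linear(pcm, 0.25))  # [250, 1250, 1750, 750] -- new bits entirely
```

So two captures taken through an ASRC-based interface could easily fail a naive bit comparison and leave a small null-test residual even though nothing was "corrupted" in transit, which would be consistent with what that video showed.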