Ethernet already accounts for noise as ACR, or Attenuation to Crosstalk Ratio. Once noise is sufficiently below the level set by the BER (Bit Error Rate) requirement, the digital error rate is essentially zero. Noise is no longer a problem; even better BER figures are possible, but they are mathematically irrelevant.
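The ACR arithmetic is simple enough to sketch: it is just the near-end crosstalk loss minus the insertion loss at each frequency, in dB. The numbers below are illustrative placeholders, not from any real cable datasheet:

```python
# Sketch: ACR (Attenuation-to-Crosstalk Ratio) in dB.
# A positive ACR means the signal sits above the crosstalk floor.

def acr_db(next_loss_db: float, insertion_loss_db: float) -> float:
    """NEXT loss minus insertion loss, both in dB at the same frequency."""
    return next_loss_db - insertion_loss_db

# Illustrative sweep: (NEXT loss dB, insertion loss dB) at a few MHz points
sweep = {1: (65.3, 2.0), 10: (50.3, 6.5), 100: (35.3, 22.0)}

for mhz, (next_db, il_db) in sweep.items():
    print(f"{mhz:>4} MHz: ACR = {acr_db(next_db, il_db):.1f} dB")
```

As long as the ACR at every swept frequency stays above the margin the spec demands, the receiver recovers the bits and the BER requirement is met.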
When higher data rates are required, the PAM (Pulse Amplitude Modulation) system uses multi-step voltage levels: as many as 16 levels in 10GBASE-T versus 5 over category 5e. The finer the voltage steps, the broader the bandwidth (BW) required, and the lower the noise must be across that BW.
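The trade-off is easy to quantify as a sketch: more PAM levels carry more bits per symbol, but each voltage step becomes a smaller fraction of the full swing, so the noise budget shrinks. A minimal illustration:

```python
import math

def bits_per_symbol(levels: int) -> float:
    """Information per symbol for an ideal PAM constellation."""
    return math.log2(levels)

def step_fraction(levels: int) -> float:
    """Voltage step size as a fraction of the full swing.
    Smaller steps demand lower noise across the band."""
    return 1.0 / (levels - 1)

# PAM-5 (as in 1000BASE-T over Cat5e) vs PAM-16 (as in 10GBASE-T)
for levels in (5, 16):
    print(f"PAM-{levels:<2}: {bits_per_symbol(levels):.2f} bits/symbol, "
          f"step = {step_fraction(levels):.1%} of full swing")
```

Going from 5 levels to 16 roughly doubles the bits per symbol while cutting the step size from 25% to under 7% of the swing, which is exactly why 10GBASE-T needs both more BW and a lower noise floor.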
Music does not need 10GBASE-T, 1000BASE-T, or even 100BASE-T; 10BASE-T works fine. This is good, because NOISE is well managed at any domestic level.
This community wanted demonstrably superior ANALOG cables, and ICONOCLAST provides exactly that, with measurements. The analog BW is wider than legacy cables', and Rs measurements show exactly that. Managing R, L, and C is much harder when you have to use multiple small wires in speaker cables, and hard to handle properly in IC (interconnect) cables. But design DOES change the measurements. All ICONOCLAST cables are measured for Rs to ensure a wider BW to 20 kHz. Above that? This is debatable on a realistic measured linear BW scale, same as improving BER for digital by increasing the S/N (signal-to-noise) ratio.
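Why multiple small wires help Rs linearity can be sketched with the standard skin-depth formula: keep each conductor's radius near or below the skin depth at 20 kHz and the AC resistance stays close to the DC value across the audio band. A rough calculation for copper:

```python
import math

RHO_CU = 1.68e-8          # copper resistivity, ohm*m
MU0 = 4 * math.pi * 1e-7  # permeability of free space, H/m

def skin_depth_m(freq_hz: float) -> float:
    """Depth (meters) at which current density falls to 1/e in copper.
    Current crowds into this surface layer as frequency rises."""
    return math.sqrt(RHO_CU / (math.pi * freq_hz * MU0))

for f in (100.0, 1e3, 1e4, 2e4):
    print(f"{f:>8.0f} Hz: skin depth = {skin_depth_m(f) * 1000:.2f} mm")
```

At 20 kHz the skin depth in copper works out to roughly 0.46 mm, so a bundle of small wires each thinner than about a millimeter keeps Rs far more linear to 20 kHz than one fat conductor can.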
Digital cables, like ICONOCLAST, already have superb NOISE management, and this is hardly an issue in your home. Industrial settings have far more lax FCC noise limits than home equipment. Why? Because industrial users KNOW that noise is manageable, and that exceeding the requirement by a wide margin just makes your beloved stuff cost more, since overkill has to be paid for.
ONLY when the BER can't be achieved do industrial users go beyond two-pair UTP category 5e. Financial firms and the like use massive-BW cables and 10GBASE-T because they move gargantuan quantities of data. It has nothing to do with noise directly.
We can “manage” noise all we want, but the Ethernet data stream is bit perfect before we even start the project. The data says so, the same as the data shows ANALOG cables are indeed improved over legacy cables, or I would not be making them.
Unfortunately, the better-DESIGN analog cables do cost more to make, and volume is part of that; they won't ever be as cheap as zip cord. Digital cables enjoy a far better set of design data to ensure the proper BW, where old analog cable did not. Digital cables have no reason to be improved past the requirement, as the specs clearly define the BER based on the BW requirements for bit-perfect data.
Analog is different, as there are still BW improvements in Rs that can be measured and managed with careful designs. Here we have “better,” and it sounds better exactly as the data suggests it will: more open and expansive definition in the upper registers where the BW is more linear. I make no claims past what I can measure. Zero.
I can measure DIGITAL and DO make claims based on what is measured. NOISE is not an issue for Ethernet in your home. Heaven help our bank statements if noise were the unmanaged gremlin we want it to be. It is a gremlin, but it is well away from hurting the data.
I can send a UL-approved digital test report on Ethernet cables to show how exhaustive the testing is, and how precise the technology that achieves it. Current digital cable is already way high-end. Analog cables on close inspection, not so much, so I changed the DESIGN to improve that where it matters. The exact same tests will show, on ANY analog cable, exactly how well it achieves better Rs linearity and impedance.
The harder variable is the cable's swept impedance stability relative to the speaker load impedance. This problem will remain with passive cables, as mother nature won't allow a flat impedance through the audio band, let alone one that matches a speaker load. Again, I've shown this with actual measurements to be so. Simply making Rs more linear is the right thing to do, but there are still non-linearities that make amp-to-cable-to-speaker “networks” sound different. Two cables with the same Rs won't sound the same unless the swept impedance and reactance are the same…and this can be measured. I have improved the cable's impedance relative to legacy cable, dropping the impedance, but again, we have hard limits to work with in analog, where the electromagnetic behavior changes at every frequency!
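That non-flat swept impedance can be sketched with the standard RLGC transmission-line formula. The per-meter values below are generic placeholders for a speaker cable, not ICONOCLAST measurements:

```python
import cmath
import math

def z0_ohms(f_hz: float, R: float, L: float, G: float, C: float) -> float:
    """Characteristic impedance magnitude of an RLGC line:
    |Z0| = |sqrt((R + jwL) / (G + jwC))|."""
    w = 2 * math.pi * f_hz
    return abs(cmath.sqrt((R + 1j * w * L) / (G + 1j * w * C)))

# Placeholder per-meter constants (assumptions, for illustration only):
R, L, G, C = 0.01, 6e-7, 1e-11, 5e-11  # ohm/m, H/m, S/m, F/m

for f in (20.0, 1e3, 2e4, 1e6):
    print(f"{f:>9.0f} Hz: |Z0| = {z0_ohms(f, R, L, G, C):.0f} ohms")
```

With these numbers the impedance falls from over a thousand ohms at 20 Hz toward the RF asymptote of sqrt(L/C), about 110 ohms, and at no point in the audio band does it sit anywhere near a 4 to 8 ohm speaker load. That is the hard limit mother nature imposes on every passive cable.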
Measurements can show DIGITAL to already meet the requirements, where RF (radio frequency) behavior is more linear and predictable. That's a great thing, too. Working above the needed BER is fine if you have the money, but it won't change the bit-perfect requirement needed by the error correction system. More is not always better. It won't be worse, but not better.
My take? Make sure you like the DA (digital-to-analog) filter firmware. My DS DAC sounds great with the Snowmass firmware, not so much with the earlier versions. The Ethernet cable? Doesn't matter what I've used. My PC? For music files I use a spinning-rust HDD and W10 on an SSD with no issues.
ICONOCLAST works because the data shows that it will. Digital ALREADY works well because the data shows that it will.
Remember, too, that Ethernet is spec'd for a 100-meter channel, and this is the basis for the requirements. Shorter runs tolerate worse and worse noise management, as the received signal is higher and higher. If Ethernet were a 50-meter system, we could re-design the cables to be cheaper. Since audio uses basically no length at all, any crackpot cable will work, as the S/N ratio is huge. Ethernet just looks at the signal relative to the noise, not at the cable and how cool it looks. In my book that's pretty cool technology!
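The length argument reduces to one line of arithmetic, since attenuation in dB scales linearly with length. Assuming a Cat5e-class figure of roughly 22 dB per 100 m at 100 MHz (an illustrative value, not a quoted spec):

```python
def extra_margin_db(atten_db_per_100m: float, length_m: float) -> float:
    """How much less attenuation a short run suffers than the 100 m
    design case; dB loss scales linearly with cable length."""
    return atten_db_per_100m * (1.0 - length_m / 100.0)

# A typical 2 m audio run vs the 100 m worst case the spec is built on
print(f"{extra_margin_db(22.0, 2.0):.1f} dB more S/N margin on a 2 m run")
```

Over 20 dB of extra margin on a short domestic run is why essentially any compliant patch cable delivers bit-perfect Ethernet in a listening room.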
Best,
Galen Gareis