Ethernet Cables and Sound

Yes, they are total overkill, like everything else. All one needs is a ghettoblaster and that’s it.
Please, everyone, send your expensive USB, power, HDMI, Ethernet, and speaker cables to me, because they don’t bring any improvement to audio. Just buy a $10 Cat6 and be happy with it, because there is no better :joy:
Luckily, I actually listen to them and hear huge differences between every Cat cable. But that’s me. I don’t use 100-foot cables, though.

I haven’t been reading this thread so some (or most) of this might have already been discussed.

Noise is the biggest culprit in Ethernet cables: the noise that’s near the frequency of the Ethernet signaling won’t be attenuated much by the transformers at each Ethernet connection.

Ethernet is complicated enough that you need a computer on each end - computers generate noise.

The higher the Ethernet frequency, the more complicated it is to handle things, so those computers make more noise.

There’s almost always a lot of noise in these frequencies coming from whatever your Ethernet source is.

The first thing is that the DAC might be creating or reflecting noise back onto the cable, and that noise, along with any source noise, might be affecting people’s systems via any ground loops (which are basically RF receivers). The (probably) very low-level noise that’s picked up in any particular ground loop depends on the area of the loop: bigger loops, more noise current circulating in the ground loop.
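
To put a rough number on the loop-area point, here is a toy Faraday’s-law calculation; the field strength and frequency are made-up illustrative values, not measurements of any real room:

```python
import math

# Toy illustration of why ground-loop area matters (Faraday's law):
# a sinusoidal magnetic field threading a loop of area A induces an
# EMF with peak value V = 2*pi*f * B_peak * A, i.e. linear in area.
f = 1e6          # interfering field frequency, Hz (assumed)
B_peak = 1e-9    # stray flux density, tesla (assumed, ~1 nT)

for area_cm2 in (10, 100, 1000):     # small vs. large cabling loops
    A = area_cm2 * 1e-4              # cm^2 -> m^2
    v_peak = 2 * math.pi * f * B_peak * A
    print(f"{area_cm2:5d} cm^2 loop -> ~{v_peak * 1e6:.1f} uV peak induced")
```

The absolute numbers are meaningless without a real field measurement; the takeaway is the linear scaling with loop area.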

The thing about RF noise is that any non-linearity in any connected circuit can demodulate any audio-frequency content riding on the RF signal down into the audio band (e.g., a crystal radio, which is just a diode). Guess what? Almost all of our equipment has diodes, transistors, etc. near the inputs and outputs. They, along with any other non-linear idiosyncrasies of a system, can convert audio-frequency modulations in any received RF into simple audio noise, perhaps right at the inputs, or perhaps by modulating power supplies, etc.
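
To see the crystal-radio mechanism concretely, here is a minimal numpy sketch: an AM carrier passed through a square-law nonlinearity (a stand-in for a diode junction) puts the program material right back in the audio band. All frequencies and levels are illustrative:

```python
import numpy as np

# An AM "radio" carrier with a 1 kHz tone riding on it.
fs = 10e6                                 # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)            # 10 ms of signal
audio = np.sin(2 * np.pi * 1e3 * t)       # 1 kHz program material
rf = (1 + 0.5 * audio) * np.sin(2 * np.pi * 1e6 * t)   # 1 MHz carrier

# A square-law nonlinearity, crudely modeling a diode's curvature.
detected = rf ** 2

# Look for energy at 1 kHz; a purely linear circuit would leave none.
spec_rf = np.abs(np.fft.rfft(rf))
spec_det = np.abs(np.fft.rfft(detected))
freqs = np.fft.rfftfreq(len(rf), 1 / fs)
audio_bin = int(np.argmin(np.abs(freqs - 1e3)))
print(f"1 kHz energy in raw RF:          {spec_rf[audio_bin]:.1f}")
print(f"1 kHz energy after nonlinearity: {spec_det[audio_bin]:.1f}")
```

Expanding (1 + 0.5·audio)² shows why: the squared term contains the audio signal directly, no tuner required.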

How loud is this? It depends a lot on the system. My neighbor has a simple, not-well-maintained stereo console. When you walk by it with a cell phone you can hear the phone pinging the towers, etc. Most of our systems are much better than that, but even low-level noise masks details in our systems (or, at times, enhances details, e.g., dither).

Similarly, the noise (conducted or radiated) that gets into the DAC’s case (including the Ethernet-receiving computer’s noise) can be modulated down into the audio band at a number of places in the DAC; in many DACs, that’s the power supply just before the final gain (the DS’s output is a filter, so that helps a little compared to DAC chips; on the other hand, one-bit DACs are the most sensitive to jitter). Any noise that affects the master clock via its power supply, its frequency control, etc. will cause jitter that directly affects the audio quality.
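
A quick way to see why clock noise matters: the numpy sketch below samples a pure tone with jittered sample instants, and the timing error shows up as a raised noise floor around the tone. The jitter magnitudes are arbitrary illustrative values:

```python
import numpy as np

# Sample a 1 kHz tone with Gaussian timing error on each sample instant
# and watch the spectral noise floor rise with the RMS jitter.
fs, f0, n = 48_000, 1_000, 1 << 16
rng = np.random.default_rng(0)
t_ideal = np.arange(n) / fs

for jitter_rms in (0.0, 1e-9, 1e-7):            # seconds of RMS jitter
    t = t_ideal + rng.normal(0.0, jitter_rms, n)
    x = np.sin(2 * np.pi * f0 * t)
    spec = np.abs(np.fft.rfft(x * np.hanning(n)))
    spec /= spec.max()
    tone_bin = round(f0 * n / fs)
    floor = np.median(spec[tone_bin + 100:])     # crude floor estimate
    print(f"jitter {jitter_rms:.0e} s -> floor ~{20*np.log10(floor + 1e-16):.0f} dB")
```

This models sample-clock jitter generically; in a real DAC the path from an EMI source to the clock is the contested part, not the arithmetic.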

It’s much easier to just run some experiments with different cables than it is to predict how much effect any of this will have on your system. But using balanced connections (both digital and analog) can help with ground-loop noise rejection. Using star grounding of your components can help. Which outlets things are plugged into can change the area of the ground loops that go through the AC power lines. I’m sure that some of you can affect things a little by being more careful with the layout of your interconnects (especially unbalanced interconnects), speaker wires, etc. Getting them too close together might cause unwanted crosstalk (probably not), but having them very far apart may cause more system noise from received RF.

FWIW, I’ve had plenty of guests who can hear the difference when I shut down the WiFi in the room. Still, I hardly ever do that, and I’ve also got cordless phones, cell phones, and all sorts of other things that add a little hash into the system.

Also, how I route my Ethernet cable to my NAS has, at times, made a noticeable difference in the sound quality of my system. I lowered that effect a lot by keeping the NAS far away and keeping Ethernet cables as removed from my system as possible.

8 Likes

Thanks Ted.
So if I understood you correctly, Ethernet cables and power supplies along the streaming chain do matter.

Whew! Now back to my regularly scheduled “audio nervosa”.

Thanks, Ted.

:slight_smile:

Scott

1 Like

The noise getting into the system matters. That still isn’t directly the Ethernet cable. It COULD be. And the type of cable that MIGHT impact audio isn’t necessarily well defined. So it’s not so simple to say Cat6 is better than Cat5e.

The internal NEXT of better cable with shorter lays suggests noise immunity, and that’s true for the four internal pairs, but not for noise external to the full-duplex Ethernet system.
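
For a sense of what that internal NEXT advantage looks like on paper, here is a small sketch built on the commonly quoted 100 MHz limit values and the standard 15·log10 frequency scaling; treat the numbers as approximate, since the point is the Cat6-over-Cat5e margin between pairs, not anything about noise entering from outside the jacket:

```python
import math

# Approximate pair-to-pair NEXT loss limits (higher dB = better isolation
# between the pairs inside the jacket). Values and scaling follow the
# commonly quoted TIA limits; a sketch, not the actual spec tables.
NEXT_AT_100MHZ_DB = {"Cat5e": 35.3, "Cat6": 44.3}

def next_limit_db(category: str, f_mhz: float) -> float:
    return NEXT_AT_100MHZ_DB[category] - 15 * math.log10(f_mhz / 100)

for f in (10, 100, 250):   # note: Cat5e is only specified up to 100 MHz
    row = ", ".join(f"{c}: {next_limit_db(c, f):5.1f} dB"
                    for c in NEXT_AT_100MHZ_DB)
    print(f"{f:4d} MHz -> {row}")
```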

I’d make sure the CLOCK circuit block was well shielded. That’s for sure.

If the cable does act like an antenna for external EMI, then, given how good Ethernet already is with basic 5e, giving up a little of that overkill internal network performance suggests that using 6A with external shield technology, not UTP 6A, might improve the noise on the four pairs in the cable.

6A is designed to reject cable-to-cable noise in large GROUPS of cables. The external noise from terrestrial stuff isn’t strong enough to foul up 6A. But that is not how we use 6A at home, as a single run.

Don’t use UTP (unshielded) 6A, as it is designed differently: it reduces INTERNAL and external alien crosstalk for 10G with cable-to-cable spacing, mitigating one cable’s emissions into another through spacing and pair twist length. Shielded versions will improve ingress from sources other than cables in close proximity. That would include cell phone chatter and the like.

Both UTP and shielded types work for CABLE-to-cable and internal cable noise. But we also want to remove ingress noise that comes not from cables but from everything else. That means a shield.

As Ted mentions, noise is coming in past our Ethernet, and that matters too. It is clearly not just the cable where noise MIGHT be getting in. But to settle the FUD, use a 10GXS12 6A cable and you should be set as far as the cable goes for external noise. The internal Ethernet is Cat6, well better than we need for bit-perfect Ethernet.

Hi Ted, do you think many of the issues discussed here would be resolved by converting the signal to optical with this type of setup?

Some here have done optical Ethernet with positive results, and it makes sense too.

1 Like

Sharing my experience with cable below.

https://certicable.com/cat-7a-cables/certicable-baby-blue-cat-7a-cable-1200mhz-s-ftp-lszh-23awg-10g-shielded-rj45-3.html

I received a 3-foot and a 10-foot last night. This is the best cable I have tried yet. The double shielding definitely takes care of noise and hash I did not know existed. I put in the 3-foot section first, from the SGC sonicTransporter i5 to the cable modem/router, replacing the BJC 6A. I noticed a stunning increase in space around the instruments and in the separation of voices, in width as well as depth, and more clarity. Test tracks were from Qobuz.

http://open.qobuz.com/playlist/1952922

Thinking it could not get any better, I next replaced the 5e cable from the Sonore ultraRendu to the PS Audio DSD. I can say I was completely wrong. I got the sense that the soundfield depth further increased, and the live presence was simply uncanny, even when walking away from the listening sweet spot, which is around 8 to 12 feet wide with my line arrays. You could approach either speaker and feel you were walking around instruments or vocalists even 3 to 4 feet from the speaker. I had been thinking of upgrading the modem/router power supply and the SGC power supply but now feel it is totally unnecessary. The ultraRendu has a linear power supply from SGC, where the others were standard issue. I will note I am getting imaging behind the speakers too, and they are only 6 inches from the rear wall.

The bass presence seemed greater at lower volume with the BJC 6A, but I later realized it lacked localization in imaging. Kick the volume up a hair and any perceived advantage disappears; you realize it was fuzz or noise versus the clarity of the Certicable. I am running Iconoclast UPOCC ICs and SPTPC speaker wires. The USB cable to the DSD is this: https://www.smallgreencomputer.com/collections/accessories/products/in-akustik-reference-high-speed-usb-2-0?variant=741325897743

Wow, price is right.

Just curious. For things to be better, they have to measure better… somewhere. So far we are predicting that NOISE sources, which MIGHT be Ethernet-cable sourced, upset the CLOCK and can influence jitter, or accurate digital interpolation.

What I don’t see is ANY evidence of the amount of noise that MIGHT cause this, or ANY data using NOISE from ANY source to validate the CLOCK instability. Zero evidence of a perceived problem based on the theory that the CLOCK is noise-influenced is NOT proof that it is a problem, just that theoretically it could be. We are sitting at theory, near as I can see.

Now we sell galvanic isolation devices to stop this “problem” with no evidence of ANY kind that it is happening. And no, the theory is NOT evidence; it is a set of postulates, somewhat supported by scientific principles, that something COULD be an issue and that we should look for it.

Who has actually measured the CLOCK jitter of ANY digital component over a 24-hour period and correlated that jitter directly with typical terrestrial background EMI and RFI? No, the CLOCK circuit shouldn’t be BLASTED with 100 times the terrestrial variation, but with something close to reality.

If this is done somewhere, and the CLOCK jitter is indeed directly correlated to simulated terrestrial EMI (which is NOT constant at all), then we can say we actually have the problem we are somehow “deciding” we have. But this is not a decision; it is or isn’t actually happening at a measurable level.
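
The analysis side of that experiment could be as simple as the sketch below; the EMI and jitter logs here are synthetic placeholders standing in for real instrument data:

```python
import numpy as np

# Hypothetical 24-hour logs, one reading per minute: ambient EMI level
# and measured clock jitter. The question is whether they track.
rng = np.random.default_rng(1)
minutes = 24 * 60

emi_dbuv = (20 + 10 * np.sin(np.arange(minutes) * 2 * np.pi / minutes)
            + rng.normal(0, 2, minutes))                 # fake diurnal EMI
jitter_ps = 1.0 + 0.002 * emi_dbuv + rng.normal(0, 0.05, minutes)  # fake jitter

r = np.corrcoef(emi_dbuv, jitter_ps)[0, 1]
print(f"EMI vs. jitter correlation: r = {r:.2f}")
# A strong r would support "ambient EMI reaches the clock"; r near zero,
# with adequate measurement resolution, would undercut the theory.
```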

Now, if we do indeed have the issue, reducing incoming EMI is then a priority. But just WHERE is “incoming” coming from? We’ve decided, with no proof but a decision, that it is the cable. Is that really the smoking gun? Is it the largest-caliber bullet in this theory that the CLOCK is being measurably (no measurements so far) impacted by noise?

We need to get SOME basis of measurement and weight for this “problem” before I spend one red cent to eliminate it. If it measures better, it is better. We need those tests. If they are indeed accurate and true, then some may indeed hear improvements, and/or successive improvements may superimpose to make things better. I have no issue with that. One variable by itself can be hard to hear. But ZERO actual evidence of a THEORY causing a problem, and spending money on it?

The discussion should be: given the MEASURED data, how does this impact the accuracy of the circuit? Better is better. Some purists will almost always classify stuff as “inaudible” and leave things less good as a result of that mindset, even with data. I don’t accept that viewpoint and feel we SHOULD do everything we can to improve our tech. The cost to do so drops over time. But the process to leverage improvements has to be REAL, MEASURABLE, and REPEATABLE. Where is the beef in this accusation?

1 Like

One last comment…

ICONOCLAST is 100% designed, every cable, with calculation to industry-accepted methods and measurements. Every single decision was made with PROOF that better was indeed better. Is it past audibility? That is now a VALID question BECAUSE the reference is truly better! But the cable is always better. No question.

I don’t sell pure theory. Copper materials and copper process design, directionality of grains, cryo, and all sorts of stuff I don’t “sell” past the cost to offer the various coppers. I have no audible attributes that COULD make a difference outside of the resistivity, although small, of lower-grain-count copper. But resistivity is rolled into the Rs measurements and is not impacting the results.

My point is, using real knowledge correctly can much improve things. The foundation we build on has to be correct, repeatable, and proven. Theory is fine to INVESTIGATE, but I won’t charge a single penny for anything that does not have the PROVEN alternative available. This is why we offer TPC copper and no one else does… it WORKS as well as any copper to the current standards that infer sound accuracy. Changing the copper is a true change, same as adding a galvanic isolator. But so far, the theory is NOT proven to the outcome.

Be aware of what is on the list of theories, and what is not. When people have unsubstantiated “answers” to unproven theories and want your money… run. Well, unless the answer is “I don’t know,” in which case the attribute should be economically evaluated against spending money on PROVEN improvements.

I love the inquiry, but I want more thought put into proving things over pretending theory is now fact.

3 Likes

Yes, they were reasonably priced as well as good performers. Jay at Audio Bacon did not like the Certicable CAT8. I ran across an audiophile company with a CAT7A they tweaked with a graphite shielding covering. That set me thinking about trying this version from Certicable, since CAT7 and CAT7A are getting all the audiophile pixie dust and price-per-inch attention. I don’t know if Certicable’s present CAT8 is the same cable Jay tried a few years back. The 7A is audiophile-worthy in my system.

Measuring jitter requires very expensive equipment. Listening as you change things is reliable if you’re not interested in convincing anyone else. Turn Wi-Fi on and off, start or stop other tasks on your source device with each cable… Tho I can’t do it as well now, I used to hear jitter pretty easily. I ran enough experiments to make good progress designing equipment. I feel completely comfortable recommending that people listen for themselves. I like to use my time more productively, figuring things out just for my own sake, than trying to convince others with arguments/measurements that they will never believe anyway, tho I do respect those who are more rigorous.
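
For anyone who wants to lean on listening while keeping themselves honest, a blind ABX run scores with simple binomial math. A minimal sketch (the trial counts are placeholders):

```python
from math import comb

# With n blind trials and k correct identifications, how likely is a
# score at least this good by pure guessing (p = 0.5 per trial)?
def abx_p_value(k: int, n: int) -> float:
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

n_trials, n_correct = 16, 13          # placeholder session results
p = abx_p_value(n_correct, n_trials)
print(f"{n_correct}/{n_trials} correct -> p = {p:.3f} under guessing")
# A small p (say < 0.05) suggests the difference was genuinely audible
# in that session; it says nothing about the mechanism behind it.
```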

10 Likes

Although I cannot prove it, I feel that there are enough sources of “noise” that may sum at the input of the DAC to potentially affect the sound quality of the output. Given the enormous variation in network configuration from house to house, it may be impractical to predict what effect changing a network cable or other component will have. I have not reliably heard changes in sound quality with Ethernet cable swaps or from using optical interfaces where others have heard large changes. My Ethernet cables are generic (Rosewill?), but I have tried a few other cables, some of them expensive, though not an extensive number of them. This may very well just be a characteristic of my home network. I have noticed, however, significant changes when trialing different servers or swapping USB cables (which I typically do not use). There are lots of variables here to deal with and, as always, I feel like using my ears is often the best way to decide.

You can’t verify EVERY situation, no, and at any time of day. But we SHOULD be able to PROVE “a” single situation in a lab somehow. Once we DEFINE the level of EMI injected and the outcome, we can decide how beneficial the cost of mitigating the problem really is. Do we see measurable variation, and with what amount of injected noise does it mess up the clock?

If we inject noise into an Ethernet cable, how much EMI really gets OUT of the NIC card and INTO the CLOCK? Where are the leaks into the CLOCK?

I understand that EMI/RFI is there, or stuff wouldn’t work! But deciding what it does to ancillary equipment with no data anywhere seems a less than ideal way to decide if we’re fixing “problems”. Ears are OK, but at some point we need to get real about stuff.

If it is real, there should be a way to measure it. This includes copper grain and pull science in cable, so I’m not ducking out on the problem. I just won’t support “my” theory of what is going on. I don’t know. What we DO KNOW is that we have grain differences and resistivity differences that are truly THERE. That at least means we have differences that somehow change the electromagnetic wave. We need those same differences measured for EMI/RFI, same as for grains and resistivity.

A CLOCK changing with EMI/RFI influence SHOULD be measurable, and yes, the equipment is expensive, but it exists, and a set experiment can be done to show how much, and maybe from where, EMI is an issue. We can define the problem, but we won’t?

I’ll meet you halfway and say “a” given experiment can show true influences and from where, though ALL instances are going to vary. That proven situation, and the worst influencers, can better allow you to make changes to offset the problem. What if the Ethernet cable, as a path of influence in a set experimental situation, can’t be proven to be an influencer no matter the injected noise?

This is like using ICONOCLAST in different amp/cable/speaker networks. The varying characteristics of each network are PROVEN in real measurement. The less reactive the cable the better, but… the change will vary from network to network, though it WILL change. We know that a better-measuring cable DOES make a difference… and although people will argue about the extent to which those changes impact the sound, at least we DO have a CHANGE.

Can we prove that a true change in EMI/RFI, from ANY SOURCE, impacts the CLOCK? If we can, we should do it, so we also know how to DESIGN the CLOCK circuit to mitigate the issue and plug the worst leaks. We don’t need to do it over and over, just to the point where we know the terrestrial noise is indeed a problem and find the best practical way to make it nearly a non-issue.

Galen

1 Like

In my case, the Ethernet cables must run fairly close to power cords and hook into the Wi-Fi router/modems. Curiosity tells me this is probably my contributory noise issue, though the Ethernet cables with little or no shielding that I routed far away were still not as clear as the CAT7A. The Certicable CAT8 used OFC copper; the CAT7 made no claim about the alloy. If it were not for Jay from Audio Bacon’s comparison, I would have tried the CAT8, which he did not like. The CAT7A was spec’d for three digital-signal uses; it just so happened to work for audio too.

I am saving a grand on cables and nearly as much on linear power supplies with my poke-and-hope strategy of try and listen. The two cables I liked, but not as well, are going to AppleTV and FireTV duty for movie streams.

The thing is that there are many paths for any noise/jitter. Directly measuring jitter takes a much more expensive piece of equipment than I can afford. It’s very subtle when measured indirectly, say by looking at the width of the phase-noise spectrum; the other noise in the DS swamps it. The supposed jitter-testing input bit patterns make a lot of assumptions about jitter paths, most of which don’t apply to the DS. It’s easy to look and see if people’s toes start or stop tapping when you change jitter sources.
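
For reference, the indirect phase-noise route works like this: given a measured phase-noise curve L(f) around the clock carrier, the RMS jitter follows from integrating it. A minimal sketch; the clock frequency and the L(f) points are assumed for illustration, not measured values:

```python
import numpy as np

# Estimate RMS jitter from a single-sideband phase-noise curve L(f).
f0 = 22_579_200.0                                # assumed 512 x 44.1 kHz clock
offsets = np.array([1e2, 1e3, 1e4, 1e5, 1e6])    # offset from carrier, Hz
l_dbc = np.array([-90.0, -110.0, -125.0, -135.0, -145.0])  # assumed L(f), dBc/Hz

s_phi = 2 * 10 ** (l_dbc / 10)        # SSB L(f) -> phase PSD, rad^2/Hz
# trapezoidal integration of the phase PSD over the offset range
phase_var = float(np.sum((s_phi[:-1] + s_phi[1:]) / 2 * np.diff(offsets)))
jitter_rms = np.sqrt(phase_var) / (2 * np.pi * f0)
print(f"estimated RMS jitter ~ {jitter_rms * 1e12:.1f} ps")
```

The catch Ted points out is visible here: if the DAC’s own noise dominates the measured spectrum, the jitter term gets buried in the integral.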

2 Likes

I think I’ll go put on a record (analog) and have a beer.

6 Likes

LOL, I thought the same, and I also thought maybe I should dig up a used CD player as well, for good measure.
In actuality, however, I think music streaming and music servers are still in their infancy, and it’s the mavericks testing this stuff and experimenting with various ideas and technologies who will improve this new breed of music reproduction. Just as there were people tinkering with different analog cable topologies, now we have people like @rower30 and @tedsmith, who are brilliant minds IMO, breaking new trails.

1 Like

In the past I tested many high-end Ethernet cables, but I went back to normal Cat6 cables 4 to 5 years ago. It seems a normal cable is good enough for the stream. The trick is to remove the noise from the stream, and a DAC will turn into a super DAC. The detail, the impact, the precision, and the soundstage become seriously better. Music is so much more vivid!