Ethernet Cables and Sound

What did you READ, if anything, that moved the BER and eye-pattern test results such that the Ethernet link falls below bit-perfect transmission? Did you see true improvements in the data-stream bps or the eye-pattern open-eye percentage? I can guarantee you didn’t. We have GOBS of BW with even 5e cable.

Digital cables are tested six ways from Sunday for consistency. Look for the UL verified label in the print legend to make sure the cable you buy is truly CERTIFIED to be what is claimed!

Until the PAM encoding is such that the voltage levels get lost in noise and require higher-BW cables, getting “better” cables won’t do a thing over 5e.

ICONOCLAST shows exactly HOW the designs improve the BW through the audio range and exactly how I do it, while also retaining R, L and C at the same time. Digital cable designs can likewise show how the BW is improved such that higher-level PAM encoding allows faster Ethernet standards.

Shields are AWFUL for performance UNLESS the noise is massively bad with balanced pairs…or the balance of the pairs is awful (use BONDED pairs!!).

An Ethernet permanent link using cabled 4PR23 CAT6 will perform the same as a Mediatwist FLAT CAT 6 design. No difference at all. The advantage of Mediatwist is the simpler construction technique…no cabling process! And it uses longer lays for the same NEXT, which helps keep wire length shorter for attenuation. Mediatwist is superbly durable to abuse as the pairs are “locked in place” inside the jacket and they are BONDED pairs, too. So if you need a tough cable…this is a good choice.

Best,
Galen Gareis


I’m gonna need a glossary to get through the posts to decode all the acronyms!

Nah, you only need two different cables to compare.

From my perspective Ethernet cabling is important, but certainly not the most important thing in audio networks. I recommend first thinking about this… all your equipment generates a certain amount of noise. Your NAS, switches, routers, computers etc. This means most systems won’t achieve the same results with a given network cable. Noise levels, cable lengths and many other aspects differ from system to system. The only important thing is to keep noise away from your audio system and to let the bits and bytes through.

So the first important thing one should do is to reduce noise levels as much as possible. This can be done by creating a dedicated network, replacing switching power supplies with linear types, removing unnecessary equipment like a NAS, and making use of a solid-state drive in your audio streamer. For example, isolate your DAC with a set of low-power optical (glass) converters powered by linear power supplies, and finally add a pair of shielded Ethernet cables. The most important thing about an Ethernet cable is that it doesn’t create noise and that it isn’t sensitive to interference.

Personally I don’t like silver-plated cable and I prefer copper over silver.

Wijnand

I believe BW is bandwidth and BER is Bit Error Rate. I am clueless about PAM encoding.

I also find myself lost in the technical discussion. Not enough knowledge.

I believe data is what @rower30 (Galen) is trying to get everyone to focus on, not abstract terminology. I myself chose a direction of shielding 1.5 years ago. I had zero insight and no one to discuss this choice with. I did the best I could in making that choice. Fast forward 1.5 years and my Ethernet infrastructure works fine and sounds great. Is it optimal? NO… Is Ethernet forgiving? YES.

If you would like a primer on data cable terminology, data, and some insight, Blue Jeans has a primer. Armed with a better understanding of terminology and Galen helping answer questions, choices become easier. My cable did not come w/ data… it came w/ an invoice… BJ cables come w/ data as shown below.
https://www.bluejeanscable.com/networkcablereports.htm


So now go ahead and order a few of the world’s best Ethernet cables from BJC. They cost only around 10 USD apiece, so they are basically free compared to anything in this hobby.

And then tell us, and most importantly yourself, how they compare to your existing cables. No difference in SQ? Not possible.

I (we all) have seen these same arguments so many times from the engineering side and hobbyists alike. Engineers will deny differences even if they hear them. The same thing applies to, for example, speaker measurements. Two almost identical frequency responses from totally different designs, but do they sound the same? No.
Do we buy our speakers or amplifiers or whatever by measurements? No. We buy what sounds best to us.

How does one measure music? Soundstage, depth, tone, PRaT, realism? By ears, the most accurate measurement device there is :slightly_smiling_face:

When we buy new gear do we ask how it measures? :sweat_smile:

Happy ethernet cable testing to all…

Actually both the IT guys and the audiophile guys are right. If we talk about the bit stream there can’t be an audible difference, but if we talk about the noise travelling with the bitstream and influencing the equipment, then there’s a clear audible difference. So let’s get rid of the noise.

Yes, that is a very important thing to do and it also improves SQ. I only use linear power supplies to power my gear, no SMPS. But I still very easily hear changes in Ethernet cables. So they don’t rule each other out.

Ethernet already accounts for noise as ACR, or Attenuation to Crosstalk Ratio. Once the noise is sufficiently below the signal to meet the BER, Bit Error Rate, requirements, the digital error is basically zero. Noise is no longer a problem; even better error rates are possible beyond that, but they are mathematically irrelevant.

When higher data rates are required, the PAM, Pulse Amplitude Modulation, system uses multi-step voltage levels, as high as 16 levels in 10G baseT versus 5 levels for gigabit over category 5e. The finer granularity of the voltage steps requires a broader BW, Band Width, and low noise across that BW.
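To put rough numbers on that, here is a minimal Python sketch comparing the spacing between adjacent voltage levels for a 5-level and a 16-level PAM scheme on the same swing. The 2 V peak-to-peak figure is an arbitrary illustrative value, not a spec number; only the ratio matters.

```python
import math

def pam_level_spacing(levels: int, vpp: float = 2.0) -> float:
    """Spacing between adjacent PAM levels for a given peak-to-peak swing.
    vpp = 2.0 V is an arbitrary illustrative swing, not a spec value."""
    return vpp / (levels - 1)

pam5 = pam_level_spacing(5)    # 5-level signalling, gigabit-class
pam16 = pam_level_spacing(16)  # 16-level signalling, 10G-class

# Worst-case margin per symbol is half the spacing, so the penalty is the
# ratio of spacings expressed in dB.
penalty_db = 20 * math.log10(pam5 / pam16)

print(f"PAM-5 level spacing : {pam5:.3f} V")
print(f"PAM-16 level spacing: {pam16:.3f} V")
print(f"PAM-16 needs ~{penalty_db:.1f} dB more SNR for the same raw error rate")
```

Roughly 11.5 dB more SNR per symbol is needed for 16 levels than for 5 at the same raw symbol error rate, which is exactly why the faster standards demand more BW and lower noise from the cable (the real 10G baseT system also layers coding on top of this, which the sketch ignores).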

Music does not need 10G baseT, 1G baseT or even 100baseT; 10baseT works fine. This is good because NOISE is well managed at any domestic level.

This community wanted demonstrably superior ANALOG cables, and ICONOCLAST provides exactly that, and with measurements. The analog BW is wider than legacy cables and Rs measurements show exactly that. Managing R, L and C is much harder when you have to use multiple small wires in speaker cables, and hard to properly handle in IC, Interconnect, cables. But design DOES change the measurements. All ICONOCLAST cables are measured for Rs to ensure a wider, flatter BW to 20 kHz. Above that? That is debatable on a realistic measured linear BW scale, the same way extra SN, Signal to Noise, ratio beyond the BER requirement is debatable for digital.
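For a sense of what Rs linearity to 20 kHz is about, here is a rough first-order Python sketch of copper skin depth across the audio band and a crude AC/DC resistance ratio. The wire sizes are generic AWG examples picked for illustration, not ICONOCLAST geometry, and the annulus approximation is deliberately simple.

```python
import math

RHO_CU = 1.68e-8            # copper resistivity, ohm*m
MU0 = 4 * math.pi * 1e-7    # permeability of free space

def skin_depth(freq_hz: float) -> float:
    """Skin depth in copper at a given frequency (metres)."""
    return math.sqrt(RHO_CU / (math.pi * freq_hz * MU0))

def rac_over_rdc(radius_m: float, freq_hz: float) -> float:
    """Very crude AC/DC resistance ratio: assume current flows only in an
    annulus one skin depth deep once the skin depth is smaller than the
    wire radius. First-order approximation for illustration only."""
    d = skin_depth(freq_hz)
    if d >= radius_m:
        return 1.0
    full_area = math.pi * radius_m ** 2
    annulus = math.pi * (radius_m ** 2 - (radius_m - d) ** 2)
    return full_area / annulus

print(f"skin depth at 20 kHz: {skin_depth(20_000) * 1e3:.2f} mm")
for label, radius_m in (("12 AWG solid ", 1.03e-3), ("24 AWG strand", 0.255e-3)):
    print(f"{label}: Rac/Rdc at 20 kHz ~ {rac_over_rdc(radius_m, 20_000):.2f}")
```

With a skin depth of roughly 0.46 mm at 20 kHz, a fat single conductor already shows a measurable Rs rise at the top of the band, while a 24 AWG strand does not, which is the basic argument for bundles of small wires despite the extra work of keeping R, L and C in line.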

Digital cables, like ICONOCLAST, already have superb NOISE management, and this is hardly an issue in your home. Industrial settings have far more lax FCC noise limits than home equipment, why? Because industrial users KNOW that noise is manageable and that going way above the requirement just makes your beloved stuff cost more, as overkill has to be paid for.

ONLY when the BER can’t be achieved do industrial users use more than two-pair UTP 5e. Financial firms and the like use massive-BW cables and 10G baseT as they move gargantuan quantities of data. It has nothing to do with noise directly.

We can “manage” noise all we want, but the Ethernet data stream is bit perfect before we even start the project. The data says so, same as the data shows ANALOG cables are indeed improved over legacy cables or I would not be making them.

Unfortunately, the better-DESIGN analog cables do cost more to make, and it is a volume issue, but they won’t ever be as cheap as zip cord. Digital cables enjoy a far and away better set of design data to ensure the proper BW where old analog cable did not. Digital has no reason to be improved past the requirement, as the specs clearly define the BER based on the BW requirements for bit-perfect data.

Analog is different, as there are still BW improvements in Rs that can be measured and managed with careful designs. Here, we have “better”, and it sounds better exactly as the data suggests it will: more open and expansive definition in the upper registers where the BW is more linear. I make no claims past what I can measure. Zero.

I can measure DIGITAL and DO make claims based on what is measured. NOISE is not an issue for Ethernet in your home. Heaven help our bank statements if noise were the unmanaged gremlin we want it to be. It is a gremlin, but it is well away from hurting the data.

I can send a UL-approved digital test report on Ethernet cables to show how exhaustive the testing is, and how precise the technology is that achieves it. Current digital cable is already way high-end. Analog cables on close inspection, not so much, so I changed the DESIGN to improve that, and where it matters. The exact same tests will show the exact same capability on ANY analog cable as to how well it achieves better Rs linearity and impedance.

The harder variable is the cable’s swept impedance stability relative to the speaker load impedance. This problem will remain with passive cables, as mother nature won’t allow a flat impedance through the audio band, and not one that matches a speaker load. Again, I’ve shown this with actual measurements to be so. Simply making Rs more linear is the right thing to do, but there are still non-linearities that make amp-to-cable-to-speaker “networks” sound different. Two cables with the same Rs won’t sound the same unless the swept impedance and reactance are the same…and this can be measured. I have improved the cable’s impedance relative to legacy cable, dropping the impedance, but again, we have hard limits to work with in analog, where the electromagnetic behavior is constantly changing at every frequency!
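A minimal sketch of that point, using made-up but plausible per-metre R, L and C values (not ICONOCLAST measurements): the characteristic impedance Z0 = sqrt((R + jωL)/(G + jωC)) is hundreds of ohms at 20 Hz and only settles toward sqrt(L/C) well above the audio band, so it never matches a 4-8 ohm speaker anywhere in band.

```python
import cmath
import math

# Illustrative per-metre primaries only (NOT measured ICONOCLAST values):
R = 0.010      # ohm/m loop resistance
L = 0.15e-6    # H/m
C = 150e-12    # F/m
G = 0.0        # S/m, dielectric leakage ignored

def z0(freq_hz: float) -> complex:
    """Characteristic impedance of an RLGC transmission line."""
    w = 2 * math.pi * freq_hz
    return cmath.sqrt((R + 1j * w * L) / (G + 1j * w * C))

for f in (20, 1_000, 20_000, 1_000_000):
    print(f"{f:>9} Hz : |Z0| = {abs(z0(f)):7.1f} ohm")

print(f"high-frequency limit sqrt(L/C) = {math.sqrt(L / C):.1f} ohm")
```

With these example values the magnitude runs from roughly 730 ohms at 20 Hz down to the mid-30s at 20 kHz, which is why “dropping the impedance” is about improving the mismatch through the audio band, not eliminating it.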

Measurements can show DIGITAL to already be meeting the requirements, since RF, Radio Frequency, behavior is more linear and predictable. That’s a great thing, too. Working above the needed BER is fine if you have the money, but it won’t change the bit-perfect requirement needed by the error correction system. More is not always better. It won’t be worse, but it won’t be better either.

My take? Make sure you like the DA, Digital to Analog, filter firmware. My DS DAC sounds great with the Snow Mass firmware, not so much with the earlier versions. The Ethernet cable? Doesn’t matter what I’ve used. My PC? For music files I use a spinning rust HDD and W10 on a SSD with no issues.

ICONOCLAST works because the data shows that it will. Digital ALREADY works well because the data shows that it does.

Remember, too, that Ethernet is spec’d for a 100 meter channel, and this is the basis for the requirements. Shorter runs allow worse and worse noise management as the signal is higher and higher. If Ethernet were a 50 meter system, we could re-design the cables to be cheaper. Since audio uses basically no length at all, any crackpot cable will work as the S/N ratio is huge. Ethernet just looks at the signal relative to the noise, not the cable and how cool it looks. In my book that’s pretty cool technology!
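As a toy illustration of that length margin (the dB/100 m figure below is a rough ballpark for 5e-class insertion loss at 100 MHz, used for illustration only, not a quoted spec value):

```python
# Insertion loss in dB scales roughly linearly with cable length.
LOSS_DB_PER_100M = 22.0   # ballpark 5e-class figure at 100 MHz, illustration only

def insertion_loss_db(length_m: float) -> float:
    return LOSS_DB_PER_100M * length_m / 100.0

for length_m in (100.0, 10.0, 2.0):
    loss = insertion_loss_db(length_m)
    amplitude_pct = 100 * 10 ** (-loss / 20)
    print(f"{length_m:6.1f} m: ~{loss:5.2f} dB loss, "
          f"~{amplitude_pct:5.1f}% of launch amplitude arrives")
```

A 2 m domestic run loses well under 1 dB where the 100 m channel the spec is written around loses over 20 dB, which is why the S/N at audio-rack lengths is enormous no matter what the cable looks like.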

Best,
Galen Gareis


I’m pretty sure that when we can rule out the noise, the differences between Ethernet cables will be small. Today the differences between cables are for a big part in how they handle and filter the noise. For example, when you have switching noise at a certain frequency and you add a cable that filters well at that frequency, or whose bandwidth is below that frequency, then this cable suits your system. But when you have already taken care of this noise frequency, then this cable isn’t much better than a cable that does not filter at this frequency.

So when we are able to reduce the noise enough, then we only need a cable that does not create interference or pick up new noise.

As proof of these numbers: I probably have one of the worst-planned music storage systems on the planet, and the SQ to the Bridge 2 is only slightly different than my Nucleus+ > USB to Matrix > I2S to DSJr.
Here’s the route:
Synology DS1513 NAS plugged into a Cyber Power UPS > 25’ cheap CAT5e cable to a full 24 port unmanaged Netgear switch plugged in with all of the other office stuff (3 work stations, printer, router, WAP) > 200’ cheap CAT5e line (buried for 100’) > Netgear 6400 router acting as a WAP > 3’ BJC CAT6 cable > Netgear 8 port unmanaged switch > 6’ BJC CAT6 cable > DSJr Bridge 2. If this isn’t the best chance for things to go haywire then nothing is, and as I said the SQ is only slightly different than the direct USB > I2S connection (< 6 feet total run). The difference is in the Matrix, not the length or quality of the CAT cable, AFAICT.

Thank you Galen Gareis,

I think you are completely right. Noise is no problem for the bitstream at all. This is clear to me, and I’m aware of industrial gear and electromagnetic compatibility (through work).

I remember a test file back in 2011 (or thereabouts) for the Bridge 1 (PWD1) that proved the bitstream was already bit-perfect. From that point we spent years discussing Ethernet cabling and isolators etc.

It took a while to understand what the problem was. It kept me busy for years testing gear and cables. What was clear from day one was that a shorter cable sounded better than a longer cable (crosstalk).

I did test with a few types of NAS, several laptops, desktops, MacBooks, Mac Minis and more. What became clear was that heavier-built PCs sounded better than PCs that produced more processor heat (more heat is more power is more switching noise). At that time Intel processors sounded better than AMD processors; Intel processors stayed cooler than AMD.

From this point I started to build ultimate audio PCs and looked into galvanic isolators, power supplies etc.

Today AMD Ryzen 65 W processors sound incredibly good. They produce far less noise than older processors. Combined with low-noise power supplies, there is far less noise that travels with the bitstream. At this moment I still work with a quad-core Intel i7-7700 processor. It improved the sound of my system very much.

Technically, Ethernet is perfect for data (bit-perfect). It’s the noise (in particular switching noise) that is a problem for our systems.

Best regards,
Wijnand


As a side note - further to Galen’s note about 4800, I called Blue Jeans Cable. They don’t sell this cable.
Also, when I told them that I thought the CAT 6A cable did not sound as good, they had a fit.
I feel the folks who support data need to be accepting of the folks who trust their ears to call out sonic differences. Why certain things sound different can be looked at, but saying they don’t is turtles-all-the-way-down kind of thinking.


…It’s the noise (in particular switching noise) that is a problem for our systems…

Exactly what tests show a problem, and how? Where in the analog side of the DA does this change what we hear? I know the current fad is “noise”, but that’s the problem: I don’t see any concrete data that shows it upsets jitter or any other artifact that would alter the ANALOG signal.

If the bit stream is bit perfect (it is), then the “noise” has to be on the analog side of the DA conversion. Jitter is near nothing nowadays with re-clocking. So I’d love to know what this “noise” looks like as an artifact of what we hear. Or is it a way to get into your back pocket? Somehow it has to change the analog, yes?

Best,
Galen Gareis

Be careful, because it can be turtles all the way down saying it does, too. Digital is not like analog. It is in a very stable part of the RF spectrum that is well understood for bit-perfect requirements. Bits respond to the noise level below the signal, and that in turn sets the bit error rate. The bits have no clue HOW the signal-to-noise level is achieved, just that it is at some level that exceeds the requirements for a BER that is bit perfect.

The DA conversion is where a lot of stuff happens, but the bits fed in are always the same as long as the ACR, attenuation to crosstalk ratio, is acceptable. And with our links being so short, the SIGNAL is HUGE relative to the noise, and that’s good.

I have no issue with measurable differences in cable, but saying it does is just saying it does. Where is the beef behind the FUD surrounding “noise”, and how is it propagated into the DA process such that we hear it?

I can show you the data on ICONOCLAST speaker cable that measures just that;

  • Rs linearity.
  • Impedance matching to the speaker.
  • Phase response improvements with low inductance.

Every one is improved and measurable. And yes, the improvements are in the audio range, not RF silliness.
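As a back-of-the-envelope example of the third item (the inductance values and the 4 ohm load below are hypothetical illustration numbers, not ICONOCLAST data):

```python
import math

SPEAKER_OHMS = 4.0   # idealised resistive load, for illustration only

def current_phase_deg(total_inductance_h: float, freq_hz: float) -> float:
    """Phase angle (degrees) between voltage and current for a simple
    series cable-inductance + resistive-load model."""
    xl = 2 * math.pi * freq_hz * total_inductance_h
    return math.degrees(math.atan2(xl, SPEAKER_OHMS))

# Two hypothetical 3 m runs: a high-inductance zip cord and a low-inductance design.
for label, l_per_metre in (("0.60 uH/m zip cord", 0.60e-6), ("0.08 uH/m low-L   ", 0.08e-6)):
    print(f"{label}: {current_phase_deg(l_per_metre * 3, 20_000):.2f} deg at 20 kHz")
```

Lower series inductance means less phase rotation of the current into the same load at the top of the band; the absolute numbers are small, but they are measurable and they scale directly with L.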

So with digital, we need the same sort of logic to take place. What fundamental issues are at hand? And everyone can’t have a completely different answer to what is better (half say shielded all the way and the other half say UTP). Only one half can be right, as mother nature doesn’t change her spots based on opinion.

Knowing how good Ethernet is, I really don’t have a dog in the fight, as I know 5e UTP on up will get you a bit-perfect data stream, so I agree, get what you feel good with. The prices aren’t bad, and as long as you are aware of how Ethernet works, I’m good with the differences of opinion, simply because Ethernet works the same way bit-stream-wise for audio. Bring me the data that shows it doesn’t. I’ve been doing this for 30-plus years and the cable tests all assuredly show that we are in control of this issue.

Again, I hear FIRMWARE AD and DA changes for sure…same cable, though.

Best,
Galen Gareis


I don’t disagree with what you are saying, Galen.
The point I make is: when you change stuff you hear something. At that point most of us, and definitely me, don’t know how to measure what we heard. I just know I heard a change and can either like it or not.
Figuring out what caused a change in a “custom system” can’t be a couch exercise.
If someone shows me data, and I try it out at home and come to the same conclusion as the data, we are all good.
The challenge is when I hear something which doesn’t line up with the data…the point I make is simply that the data camp shouldn’t rule that out as “not possible”…
And yes, if someone doesn’t even want to look at the data it may very well be turtles all the way down :slight_smile:

By the way, thanks for adding some of the full names and descriptions of acronyms in the recent posts; it helps get me back on the same chapter. I don’t think I’ll ever be on the same page, but it helps!


This helped me understand something - which is 1% more than my current understanding which is 0%



Yes indeed, it is on the analog side where the switching noise reaches the analog waveforms. From there there’s no way to filter the noise. Actually it will be sent along with the music waves to the volume control.

When you look at a digital bitstream on a scope it is really hard to recognise a clean block waveform. Actually it looks like chaos with lots of spikes and different voltage peaks.

It helps a bit when a galvanic isolator is used in the stream. The transformer decreases the voltage of the signal and at the same time the noise. The number of galvanic isolators that can be used is limited, because the signal can lose bits. But with the use of (good) switches, the bitstream signal is regenerated back to its intended form without amplifying the noise along with it. So with galvanic isolation transformers the noise can be smoothed out.

Optical isolation works as well. I use a 10/100 Mbit converter so high-frequency noise can’t pass. The faster the converters, the higher the noise frequency that can travel with the light stream to the other side.

I think the problem must be solved in the digital domain. What we need is a smarter optical conversion where the signal on the audio-system side is free of noise. Normally the light signal transports the noise just like the bitstream, and the converter recreates the bitstream and the noise back into an electrical signal. Because the bitstream is already bit-perfect, it can be interpreted as bits and bytes without looking at the noise. (I mean, for example, a solid-state drive stores bits and not noise, so noise can be separated from the signal.) From that point, in the converter, the bitstream should pass through a chip (one that can only generate clean square waves) fed with clean power, which recreates the stream.

Best regards,
Wijnand