So here is my question. Example #1: Jim owns a $550 Bluesound streamer and goes digital out to a PS DirectStream DAC. He owns a high-end preamp, speakers, and mono amps.
Example #2: Tim owns a $9,999 Lumin X1 streamer and goes digital out to the same PS DirectStream DAC. He owns the same high-end preamp, speakers, and mono amps.
Will there be ANY difference in the sound? And if so, please explain why.
Yes. A lot of difference. Better streamers go to great lengths to reduce jitter and electrical noise. Experience tells many of us that it really matters. Jump over to the Darko Audio podcast and listen to interviews with Paul McGowan, Darren Myers, Garth Powell, Bruno Putzeys, Nuno Vittorino and the guy from Antipodes Audio (sorry for my rudeness that I can’t remember his name). You’ll learn heaps from these top level engineers, and they’re entertaining conversations too!
It may well sound more accurate (to the “input signal”), and it may well sound “better” - not always the same thing.
It will certainly sound different due to use of a different DAC and different analogue stage.
Which reminds me, even if both streamers have identical DAC etc. they will have very different analogue output stages, which is another whole debate full of what sounds accurate, better, different etc.
Listen and “see” which you prefer over an extended period, and ignore the prices, I say.
I suppose it depends on your attention span. If we tested three different streamers with 100 people, a third of them would say there was no difference between any of them. My own sister is convinced there is no actual difference in picture quality between DVD and 4K. Some people are just lucky.
I personally have had four or five different streamers. The last one is easily the best. I’m not even interested in searching further. Because vinyl, you know.
I’d agree with @michael_lichnovsky in that I’d expect the difference to be from better board architecture, lower internal noise, improved power supplies, isolation etc.
I also assume that both digital outputs feed the DirectStream DAC in the same format, e.g. I2S or USB for both 1 and 2.
Which is more appealing to you though is a subjective issue. You might say it’s so close as not to warrant the additional $9450. Only your listening will give you a definitive answer.
I was surprised to find that not only does the streamer make a difference, the power supply, cables, and power cord make as much difference as the streamer itself. Then you have the switch, clock, router, power supply for the router, modem, etc. So everything in the streaming chain makes a difference. Just don’t go there unless you’re prepared to dig a deep rabbit hole!
I’ve heard that if you’re able to connect a streamer to your DAC using AT&T glass, there isn’t much difference between streamers. Apparently the glass fiber connection eliminates any noise traveling from the less expensive streamer to the DAC. Can anyone confirm this?
In my experience different AT&T glass connections matter (depending on your equipment, obviously). In fact, I vastly preferred the $9 Fry’s orange AT&T fiber cable to AudioQuest’s (then) $1000 AT&T glass cable. The Fry’s cable (in my system at the time) had profoundly better defined bass. Its top end was also clearer, but that wasn’t as striking to me at the time. This was with a Philips SACD1000 transport to an EMM DAC6e DAC using EMM’s three AT&T cable connection (one for the clock from the DAC, another for the returning clock and the last for the returning data).
What Ted said. My own experience matches his, in that different AT&T glass yielded very different results. A lot of what’s behind a fiber connection’s performance is the mating angle between the fiber and the receiver, and how much light is reflected. I’m sure there are other factors. We used fiber quite a lot for custom short-range communication in my job. It could be very tricky.
Would you say that, in general, using a glass fiber connection between a streamer or transport and the DAC reduces the difference between a less and a more expensive source component?
Compared to, say, TOSLink or compared to electrically conducting wires?
The quality of the connection matters, but with digital transmission you can choose your error rate by the quality of your connection. For audio the error rate (even without retry logic) is good enough that it doesn’t affect the reliability of the connection. With optical connections, all that’s left is jitter.
With electrical connections there are many more issues: ground loops, noise conduction… The effects of each of these depend greatly on the quality and design of the source, the destination, the cable, the environment, etc. There is much wider variability with electrical connections and much more sensitivity to sources of interference.
Still some hardware is very sensitive to jitter and if that’s the hardware’s weakest point whether the connection is optical or electrical may not matter much at all. Other hardware is much less sensitive to jitter and may or may not be sensitive to electrical interference…
I like AT&T glass, but the connectors/transmitters/receivers are very expensive (with cheaper DACs they can cost more than the whole rest of the DAC).
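To put the “error rate is good enough” point in numbers, here’s a back-of-envelope sketch. The bit error rate (BER) used below is a made-up placeholder, not a measured spec for any link; the point is just how the arithmetic scales.

```python
# Rough estimate: expected flipped bits per hour of 16-bit/44.1 kHz stereo
# PCM at a given raw bit error rate (BER). The BER value below is an
# illustrative assumption, not a spec for AT&T glass or anything else.

def errors_per_hour(ber, sample_rate=44_100, bits_per_sample=16, channels=2):
    """Expected number of flipped audio bits per hour at a given raw BER."""
    bits_per_second = sample_rate * bits_per_sample * channels
    return ber * bits_per_second * 3600

# At a (hypothetical) BER of 1e-12, errors are vanishingly rare:
print(errors_per_hour(1e-12))  # ~0.005 flipped bits per hour
```

In other words, at error rates typical of a healthy digital link, bit errors are effectively a non-issue, which is why the discussion above centers on jitter and electrical noise instead.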
AT&T glass compared to electrical conducting wires.
So if you’re streaming from a hard drive using AT&T glass to a dac, would jitter and noise be a moot point?
Yes, I compared Bluesound Node to Bridge II. Bridge II sounded better, but, the Node provided a better user experience. Then, I compared Node and Bridge II to Aurender N150. The N150 sounded even better and also provided a great user experience. So, I kept the N150 and sold both the Bridge II and Node. Oh, and all of this was done using a DS Sr.
I tried to convey that jitter is always an issue, even with quality glass connections. Roughly speaking, only audiophiles (actually, any real-time use of a connection) care about the absolute levels of jitter. For the digital world (e.g. the people who designed AT&T glass connections), jitter only has to be low enough not to add errors to the digital connection. There are many ways in a digital world to deal with varying levels of jitter.
Any optical connection from a PC to a DAC has the possibility of lowering electrical noise, but the PC can still radiate noise to your preamp, etc. and it can dirty your AC power and that may affect other components in your system…
I like optical connections, but they aren’t a panacea.
I’m going to use my networking background to comment about fiber optics in the audio space and I realize this is “knowing enough to be dangerous” territory… but if you stay till the end I have an experiment that might be interesting to try.
In networking we match the fiber to the tolerances of the receiver, which is matched to the transmitter. This includes landing the signal at the receiver within its optimal power range, given the transmitter’s type and output power. There are additional parts to this match depending on whether it’s a single color or multiple colors (wavelengths).
Single mode and multimode have to do with the way light travels down the fiber optic cable, as well as how focused the TX beam is. Single mode applications usually use lasers, whereas multimode uses LEDs. With an LED, the diodes are cheaper, the fiber optic cable can be plastic, and the core diameter is large. With single mode lasers the beam is very focused, usually requiring glass to keep that beam intact, the core diameter is small, and everything is more expensive because of that.
When you use a single mode optical cable on a multimode system you’re, for lack of a better analogy, sipping soda through a stir straw. This causes attenuation. It works fine for short distances. However, it’s expensive.
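The matching described above is usually worked out as a link budget: transmitter power in dBm minus each loss in dB must land inside the receiver’s sensitivity window. Here is a minimal sketch; every number in it is an assumed placeholder, not a spec for any real transceiver or cable.

```python
# Toy optical link budget: received power = TX power minus the sum of losses.
# All figures are illustrative assumptions, not specs for any product.

def received_power_dbm(tx_dbm, losses_db):
    """Received power in dBm after subtracting each loss (in dB)."""
    return tx_dbm - sum(losses_db)

tx = -15.0          # assumed LED transmitter output, dBm
losses = [
    2.0,            # assumed connector losses, dB
    1.5,            # assumed fiber attenuation over the run, dB
    4.0,            # assumed core-diameter mismatch loss (single mode cable
]                   #   feeding a multimode receiver, the "stir straw" effect)

print(received_power_dbm(tx, losses))  # -22.5 dBm
```

If that result falls outside the receiver’s optimal range (too hot or too faint), the link still may work, but marginally, which is the mismatch the posts below are speculating about.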
Audio SPDIF uses cheap multimode LED diodes. It expects to use a plastic multimode fiber cable.
Now the experiment…
I believe what @tedsmith and others experience by using a glass single mode cable and getting better results is simply that the single mode cable is attenuating the signal. Better glass and construction tolerances may provide better-controlled attenuation, up to the point where the receiver goes out of tolerance. But audio cables don’t normally publish core diameters, so who knows what you’re really buying.
Now, experiment time: you want to know the absolute cheapest way to add attenuation to your optical cable? Find a #2 pencil (or equivalent diameter). Wrap your cable around the pencil. Now do it again. And again. Each wrap adds a little attenuation, dropping the power at the receiver. When you hit the right power level at the receiver, I assume you’ll get the best sound. Much cheaper than buying an expensive mismatched cable, which has the side effect of doing the same thing as a pencil and a few twists.
Don’t most fiber cables have bend radius limits? Do tight wraps damage the cable?
Is the attenuation from the tight bends around the pencil? I assume if you unwrap it, the attenuation goes away.
Yes, absolutely, which is why the #2 pencil is the general hack: its radius will add attenuation on any normal jacketed fiber cable without damaging it. Obviously, don’t do this with raw fiber. And yes, it’s the bends that cause attenuation. Take out the bend and you’re back to normal.
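The pencil experiment is easy to reason about numerically: each wrap adds roughly a fixed macrobend loss, so receiver power drops linearly with wrap count. The per-wrap loss below is a made-up placeholder; the real figure depends on fiber type, bend radius, and wavelength.

```python
# Sketch of the pencil-wrap experiment: each turn around the pencil adds
# some macrobend loss. The 0.5 dB per-wrap figure is an assumption for
# illustration only; measure your own setup.

def power_after_wraps(rx_dbm, wraps, loss_per_wrap_db=0.5):
    """Receiver power (dBm) after adding `wraps` turns around the pencil."""
    return rx_dbm - wraps * loss_per_wrap_db

# Starting from an assumed -20.0 dBm at the receiver:
for n in range(4):
    print(n, "wraps ->", power_after_wraps(-20.0, n), "dBm")
```

The idea is to step the wrap count up until the receiver lands in its sweet spot, then stop; unwrapping restores the original power, as noted above.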
Like a Tuning Ring almost.
And fiber cables are too inexpensive to worry about. Buy a couple and get change from $50.