The input impedance of the next device is usually on the order of 47 kΩ to 100 kΩ, so those differences in the source's output impedance are inconsequential in almost any system.
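To see why, the source's output impedance and the next device's input impedance form a simple voltage divider, and the loss in dB is tiny when the load is hundreds of times larger than the source. A minimal sketch, using assumed example values (100 Ω and 200 Ω sources into a 47 kΩ input; these figures are illustrative, not from any particular gear):

```python
import math

def divider_loss_db(z_out, z_in):
    """Level drop (in dB) when a source with output impedance z_out
    drives a load with input impedance z_in (simple voltage divider)."""
    ratio = z_in / (z_in + z_out)
    return 20 * math.log10(ratio)

# Doubling the output impedance from 100 to 200 ohms into a 47k load
# changes the level by only a few hundredths of a dB -- inaudible.
print(round(divider_loss_db(100, 47_000), 3))
print(round(divider_loss_db(200, 47_000), 3))
```

Either way the loss is under 0.05 dB, which is why the spread in spec'ed output impedance doesn't matter in practice.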
Ahhhh... what? I'm sorry, I don't understand the 47-100k as compared to ohms... and why it wouldn't matter. I'm not an
The differences in the spec'ed output impedance come from measuring the single-ended output vs. the balanced output. I.e., the output impedance is the same from either signal line to ground.
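The point above can be sketched numerically. Assuming a hypothetical 100 Ω per leg (the thread doesn't confirm the actual figure), the single-ended spec measures one leg to ground, while the balanced spec measures leg to leg, putting both legs in series:

```python
# Assumed per-leg output impedance, for illustration only.
z_per_leg = 100  # ohms, each signal line to ground

# Single-ended measurement: one signal line to ground.
z_single_ended = z_per_leg

# Balanced measurement: hot to cold, i.e. both legs in series.
z_balanced = 2 * z_per_leg

print(z_single_ended, z_balanced)  # 100 200
```

So a spec sheet can honestly list both 100 Ω (single-ended) and 200 Ω (balanced) for the same output stage.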
Ok, so you are saying the different numbers are simply an artifact of how it's measured, and the actual output impedance is the same? And what is that number, 100 or 200?
Just to explain why I am asking, so that you may be able to dumb it down
for me: I am getting new ICs, one 1/2 meter and one 1 meter, but only one
at a time, so I asked the maker which I should worry about first, source to
pre or pre to amp, and he said, "whichever has the highest output
impedance would be more important." My pre has an output impedance of
Thank you for your time,