I’m sure I can hear a difference after a month of playing… how about you?
The P20 contains a power amp as well as other circuitry, all of which will change with break-in. I did hear improvements, but other things, like power cords, were breaking in at the same time, so it's hard to say how much was the P20.
So once again I raise the issue: why is it that we audiophiles think that “break-in” must always be a good thing? Even tubes, which we know decline with age, are said to improve with break-in. This concept defies logic. After all, the very term “break-in” would by its construction imply something not so good. But I think I have the answer. Once upon a time a car engine did indeed need a break-in, as engines were manufactured a bit tight (rings, etc.) so that small out-of-trueness errors could wear off (remember, we were told to go easy for the first 500 miles). With today’s manufacturing tolerances, engines are ready to go right off the line. Do not misunderstand, I have no problem with stuff, even ICs, changing as it is used, but why is it always a good thing with audio gear? Can someone come up with just one example where audio gear is said to degrade after 100-200 hours of burn-in?
Who would buy audio equipment in which performance degrades after 100-200 hours break-in? I know of no examples.
I too have wondered why the sound of audio equipment always improves with break-in.
No one claims designers are so brilliant that they can keep track of all of the possible break-in effects and arrange their circuits accordingly.
Yet we accept the fallacy that break-in always improves components. This cannot possibly be the case.
A soulmate, at last! Perhaps the idea of always-good burn-in also came into vogue from Michael Dell. Dell made a fortune by capitalizing on IBM’s stupidity. Way back when, some 1/3 of IBM PCs did not work out of the box. Dell figured he could make big inroads by pretesting PCs, and he was right. But burn-in from that point of view was to detect “infant mortality,” not to make stuff better. Obviously, the equipment that makes it through is superior to units that were either dead on startup or died during testing. The problem we face is, of course, that there is no possible objective and valid A/B test for 100 hours of burn-in. And we all know auditory memories of detail are short.
Computers like to be COLD, audio likes to be WARM. Sure, OK.