Posted: Mon Sep 10, 2007 9:29 am
by dadden
It seems the deal is that the currents in the transformers are not stable until the core is thermally stable. This is also relevant in various other stages of the amplifiers. Connections also become more (or sometimes less) adequate if lightly loaded over time; anyone who has removed spade lugs from DC terminals knows what I am talking about. I have also heard about caps growing crystals in them, which reduces the maximum voltage and current flow of the devices, like memory in batteries. But I was under the impression that you just "get" to replace caps every 10 to 20 years in most audio equipment. Oh yeah, and heat is bad for caps.
Most people I know leave their low voltage devices on all of the time - preamps, players, etc. - and just switch the main amps on and off. Of course they do sound better when they are fully warmed up, which can take quite a while. I'll admit that I notice this sort of behavior much less when the devices have extremely stiff power supplies.
I turn my HT system off because it is in a room that is subject to excess heat and I don't want the A/C to have to work any harder than necessary. I never shut off anything in my Audio Only system. But it is in the living room and the amps are class A/B and don't pump out tons of heat unless signal is running through them.
Posted: Tue Sep 11, 2007 3:00 am
by regman
AHA - so that's why they have temperature ratings on capacitors (being a little facetious here)! The new caps will last a lot longer than the old designs and they are a lot smaller. In the past 20 years there have been major improvements in capacitor technology (dielectrics, insulators, etc.). I rarely see bad caps in modern equipment (more often mismarked, the wrong value, or omitted).
I turn all of my gear off when I am not using it. My Denon sounds fantastic from the moment the relays turn on the speaker circuits. I leave the self-powered sub on, but it is supposed to go into low-power mode when there is no signal applied.
I have honestly never heard of transformers having to heat up for optimum performance either. I have some pretty good analytical gear here - spectrum, real-time, and distortion analyzers - and good stuff too, Tektronix, etc.
I can always be wrong - show me a link to the science.
Leave Your Amp/Audio system on 24/7
Posted: Tue Sep 11, 2007 6:30 am
by n2ubp
Anyone remember the late 60's / early 70's tube TV sets that, when turned off, had a lower voltage running through the tubes? I think it was called "instant on." The claim was it would prolong tube life.
My office UPS logs tell me my electric company delivers a power source that includes voltage dips and spikes due to switchyard changes, load, storms, or the new 16-year-old driver hitting a pole on my street. Even with a UPS on my audio system, I cringe when I think what these artifacts do to our equipment over the short and long term.
Posted: Tue Sep 11, 2007 10:41 am
by regman
Yes I remember that, and in a television servicing class I took back in the 70's they said it was a lie. People simply wanted their sets to come on faster. Tubes rarely have filament failures.
Part of the reason they have such large caps in the power supply sections is to deal with voltage fluctuations, and more and more I see surge suppressors built right into the power supply circuits. Almost all of the voltages in modern amps and TVs are regulated anyway. Sometimes the supplies for the power amp ICs or output transistors are unregulated.
On another, unrelated note, the standard line voltage used to be 115 VAC. By raising it to as much as 123 VAC, they can reduce the amount of current that the transmission lines (the ones on the top of the power poles) carry. Many power supplies these days are so universal that they will function nicely at 50-60 Hz and at voltages from 100-240 V.
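To put rough numbers on the current reduction, here is a quick back-of-the-envelope sketch in Python. The 1500 W load and the helper function are just illustrative assumptions, not measurements from anyone's bench.

# Back-of-the-envelope sketch: for a fixed power draw, current falls as
# line voltage rises (I = P / V). The 1500 W figure is a made-up example load.

def line_current(power_w: float, voltage_v: float) -> float:
    """Return the current in amps for a load drawing power_w watts at voltage_v volts."""
    return power_w / voltage_v

power = 1500.0  # hypothetical household load, in watts
for volts in (115.0, 123.0):
    print(f"{volts:.0f} VAC -> {line_current(power, volts):.2f} A")

# Prints roughly:
#   115 VAC -> 13.04 A
#   123 VAC -> 12.20 A  (about 6-7% less current for the same power)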
The main danger of someone hitting a tree is the 5 kV line breaking and hitting the 240 V lines on the lower part of the power pole. There's not much suppression equipment can do to protect against an extended surge like that into your mains.
Posted: Tue Sep 11, 2007 4:38 pm
by Richard
Yes I remember that, and in a television servicing class I took back in the 70's they said it was a lie.
The theory is that the heaters would not undergo the radical physical changes that occur when going from cold to hot. Leaving the heater on at half power kept the temperature up, which increased the life of the heater as well as minimizing the risk of shedding surface material, which can then short out elements of the electron gun(s). Bear in mind the heater is just one element of a complex cathode ray tube, and this circuit had no bearing on phosphor wear...
Sure, this could be applied to any tube circuit but picture tubes were not your $2-5 6GH8's or 12AX7's that pop in and out...
Another [urban legend?] on this note is that the feds requested or passed a law to have this circuit removed from new product designs years later, during the 70's energy crisis, lest the country waste its resources on TVs...
