Why did the industry choose these seemingly unintuitive values of 110V and 220V for the mains AC? Is there a physical reason why they aren't simply 100V and 200V?
Answer
They did choose nice round metric values of 100V, 200V and 500V.
And then, wanting to increase transmission capacity without really adding copper to the system, they made a series of sub-5% "bumps". (This also made overload-induced voltage drop seem less severe.) The bumps were gentle enough that people's light bulbs (the primary load of the day) didn't burn out noticeably faster, and of course light bulb makers quickly re-tuned their products with each bump.
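To put rough numbers on the "more capacity from the same copper" point (a back-of-the-envelope sketch, assuming a simple resistive feeder): the conductor size fixes the allowable current $I$ and the loss $I^2 R$, while the delivered power is $P = V I$. So each ~5% bump in $V$ buys roughly 5% more deliverable power from the same wire, and for a given load current the fractional drop $I R / V$ shrinks as well.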
By the time Edison was trying to popularize DC, the bumps were up to 110V -- DC, mind you.
When Edison threw in the towel on DC, they chose an AC voltage that would still work with the same DC-era light bulbs; since a resistive filament responds to the RMS value, that meant 110V RMS AC. And this was aggressively marketed to the general public, which is why "110V" stuck in the same way "Xerox copy" stuck.
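A quick check of that equivalence (assuming an ideal sine wave and a purely resistive filament): $V_\text{rms} = V_\text{peak}/\sqrt{2}$ and $P = V_\text{rms}^2 / R$, so a 110V RMS sine wave (about 156V peak) delivers the same heating power, and hence the same brightness, as 110V DC.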
500V streetcar and subway voltage similarly got several bumps to 600V by that time.
All of them have since had several more bumps. Streetcar and subway voltage has bumped to 750VDC without deleterious effect on motor commutators. BART made a leap to 1000VDC but that was too much for the commutators and they had to begrudgingly back off to 900V.
North American AC power long ago bumped to 115V, then in some places 117.5V, and finally 120/240V. There's talk of 125V, and everything in the system is insulated for 125V.
Europe did the same thing; the UK bumped a little further than the mainland (240V vs 220V) and is now nominally un-bumping to a harmonised 230V.