Sorry if this is the wrong place, but I'm interested in a scientific and electrical explanation, not HiFi voodoo magic.
I guess you all have seen the discussions about expensive audio cables and "improved" sound quality.
My question is: what's the best cable for an audio signal? Loudspeaker cables are usually a pair of ~12 AWG conductors terminated with banana connectors.
As far as I understand, this must be affected by all sorts of noise. To minimize noise it's normal to use coax cable for "radio" signals (LF, MF, HF, ...). Wouldn't coax with proper connectors, like APC-7 or equivalent, be best for audio too? Or are the frequencies somehow too low for coax?
The internet was a bit vague on this topic.
What I'm really wondering is why don't we use coaxial cables for loudspeakers in average home stereos?
Let's say a signal level of 50 dBm (100 W).
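As a quick sanity check on that figure (dBm is power referenced to 1 mW), here is a minimal sketch confirming that 100 W does indeed correspond to 50 dBm:

```python
import math

# dBm is power expressed in decibels relative to 1 mW.
power_w = 100.0
power_dbm = 10 * math.log10(power_w / 1e-3)
print(f"{power_w} W = {power_dbm:.0f} dBm")  # -> 100.0 W = 50 dBm
```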
Answer
There are two reasons why speaker cables are not shielded/screened:

1. The signal is so powerful that any induced interference would not be noticed.
2. Speakers are not very sensitive; it takes a lot of power to produce sound from a speaker.

This is why speakers are connected to an amplifier. The input to an amplifier, by contrast, carries a small, sensitive signal, so that connection should use shielded/screened cable (a rough comparison is sketched below).
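A minimal sketch of the difference, using assumed numbers (about 1 mV of picked-up interference on either cable, a ~0.3 V consumer line-level signal, and 100 W into an 8-ohm speaker). The same induced noise is buried far deeper under the speaker-level signal than under the line-level one, and the line-level noise then gets amplified along with the music:

```python
import math

def snr_db(signal_v_rms: float, noise_v_rms: float) -> float:
    """Signal-to-noise ratio in dB for RMS voltages."""
    return 20 * math.log10(signal_v_rms / noise_v_rms)

induced_noise_v = 0.001                # assumed ~1 mV of interference picked up by either cable

speaker_signal_v = math.sqrt(100 * 8)  # ~28 V RMS for 100 W into an 8-ohm speaker
line_level_v = 0.3                     # assumed consumer line-level signal at the amp input

print(f"Speaker cable SNR: {snr_db(speaker_signal_v, induced_noise_v):.0f} dB")  # ~89 dB
print(f"Line input SNR:    {snr_db(line_level_v, induced_noise_v):.0f} dB")      # ~50 dB
```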
P.S. If you have a 100 W amp, the good ones run on roughly 48 V rails, so you have at least 2 A of output current, with peaks higher; brief peaks into a low-impedance load can approach 20 A. To deliver that current to the speaker without significant loss, the cable needs to be thick.
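A rough sketch of the currents and cable losses involved, assuming an 8-ohm speaker, a 3 m cable run, and the typical resistance of 12 AWG copper (about 5.2 mΩ per metre); none of these figures come from the question itself:

```python
import math

power_w = 100.0          # amplifier output power
load_ohms = 8.0          # nominal speaker impedance (assumption)
length_m = 3.0           # one-way cable length (assumption)

# 12 AWG copper is roughly 5.2 milliohms per metre; both conductors carry the current.
resistance_per_m = 0.0052
cable_r = 2 * length_m * resistance_per_m

i_rms = math.sqrt(power_w / load_ohms)   # ~3.5 A RMS
i_peak = i_rms * math.sqrt(2)            # ~5 A for a sine wave; transients can be higher
loss_w = i_rms ** 2 * cable_r            # power dissipated in the cable itself

print(f"RMS current: {i_rms:.1f} A, sine-wave peak: {i_peak:.1f} A")
print(f"Cable loss:  {loss_w:.2f} W of {power_w} W ({100 * loss_w / power_w:.2f} %)")
```

With a reasonably thick cable the loss stays well under one percent; a much thinner cable or a longer run raises the series resistance and starts to waste power and degrade damping.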