My goal is to figure out a relation between output voltage and input SPL:
$$dB(SPL) = f(V_{out})$$
First, I get the sensitivity in volts from this formula:
$$Sensitivity_{dB(V)} = 20 \cdot \log_{10} (Sensitivity_{V/Pa})$$
For example, a microphone's sensitivity is -46 dB(V)/Pa.
$$Sensitivity_{V/Pa} = 10^{-46/20}\ \mathrm{V/Pa} = 5.0119\ \mathrm{mV/Pa} \approx 5\ \mathrm{mV/Pa}$$
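This dBV-to-mV/Pa conversion can be checked with a short script (the -46 dBV/Pa figure is the example value from above):

```python
import math

def dbv_to_mv_per_pa(sens_dbv):
    """Convert a sensitivity rating in dB(V)/Pa to mV/Pa.

    dBV is referenced to 1 V, so undo the 20*log10 to get volts,
    then scale to millivolts.
    """
    return 10 ** (sens_dbv / 20) * 1000

print(round(dbv_to_mv_per_pa(-46), 4))  # 5.0119 mV/Pa
```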
Since 1 Pa corresponds to 94 dB(SPL), can the sensitivity be written as 5 mV per 94 dB(SPL)? And can it then be rewritten as roughly 53 µV per dB(SPL)? In that case the final equation would be:
$$dB(SPL) = \frac{V_{out}}{5.3\times10^{-5}}$$
Something tells me that it doesn't work this way, but I can't figure out where I've gone wrong.
Answer
> For example, a microphone's sensitivity is -46 dB(V)/Pa
-46 dBV is about 5 mV RMS, because \$10^{\frac{-46}{20}}\$ V ≈ 5 mV. Bear in mind we are talking about pure sine waves at 1 kHz (mid band).
This voltage arises from an SPL of 1 Pa RMS (the pascal is a unit of pressure, in newtons per square metre). Because the output voltage is linear in pressure, 2 Pa RMS produces 10 mV RMS, and 0.1 Pa produces 0.5 mV RMS.
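This is where the question's linear scaling breaks down: voltage is proportional to pressure in pascals, but dB(SPL) is logarithmic in pressure, so the conversion has to go through \$20\log_{10}\$ rather than a fixed µV-per-dB factor. A minimal sketch of the conversion these numbers imply, assuming a 5 mV/Pa sensitivity and the 94 dB SPL ≡ 1 Pa reference:

```python
import math

SENSITIVITY_V_PER_PA = 0.005  # 5 mV/Pa, i.e. -46 dB(V)/Pa
REF_SPL_AT_1PA = 94.0         # 1 Pa RMS corresponds to ~94 dB SPL

def spl_from_vout(v_out_rms):
    """Estimate SPL from the microphone's RMS output voltage."""
    pressure_pa = v_out_rms / SENSITIVITY_V_PER_PA        # volts -> pascals (linear)
    return REF_SPL_AT_1PA + 20 * math.log10(pressure_pa)  # pascals -> dB SPL (log)

print(spl_from_vout(0.005))   # 1 Pa   ->  94 dB SPL
print(spl_from_vout(0.010))   # 2 Pa   -> ~100 dB SPL
print(spl_from_vout(0.0005))  # 0.1 Pa ->  74 dB SPL
```

Note how doubling the voltage adds about 6 dB rather than doubling the dB figure, which is exactly why a constant "µV per dB" factor cannot work.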