I am working on an indoor positioning system where I need to:
- Compute distance based on RSSI (I understand this won't be 100% accurate)
- Then do trilateration to pinpoint the location of the wifi signal. This part might be solved via this solution: Trilateration using 3 latitude and longitude points, and 3 distances
I am stuck with (1).
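(For reference, step (2) over local x/y coordinates in metres, rather than latitude/longitude, can be sketched by linearizing the three circle equations into a 2x2 linear system; the function name here is illustrative, not from any library:)

```python
def trilaterate(p1, d1, p2, d2, p3, d3):
    """Planar trilateration: anchors p1..p3 as (x, y) tuples, distances d1..d3."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting circle 2 (and circle 3) from circle 1 cancels the x^2 + y^2
    # terms, leaving two linear equations in x and y.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero if the three anchors are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Anchors at (0,0), (10,0), (0,10); true position (3,4)
print(trilaterate((0, 0), 5.0, (10, 0), 65 ** 0.5, (0, 10), 45 ** 0.5))
# → (3.0, 4.0)
```

With noisy RSSI-derived distances the three circles will not intersect exactly, so in practice a least-squares fit over more than three access points is more robust.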
The relationship between RSSI and distance is given by a formula (source: PPT) with these terms:
- Fm = Fade Margin - ??
- N = Path-Loss Exponent, ranges from 2.7 to 4.3
- Po = Signal power (dBm) at zero distance - get this value by testing
- Pr = Signal power (dBm) at distance - get this value by testing
- F = signal frequency in MHz - 2412~2483.5 MHz for the Ralink 5370
But I am not able to figure out how to calculate the fade margin. From what I have found, fade margin = sensitivity of receiver - received signal.
But then again, how do I get the sensitivity of the receiver?
I have a Ralink RT5370 chipset WiFi dongle with this specification: Ralink 5370 spec
Any suggestions will help!
Notes from http://www.tp-link.sg/support/calculator/ suggest the following fade margin grades:
Excellent: Link should work with high reliability, ideal for applications demanding high link quality. Fade Margin level is more than 22dB.
Good: Link should give you a good surfing experience. Fade Margin level is 14~22dB.
Normal: Link would not be stable all the time, but should work properly. Fade Margin level is 14dB or lower
Answer
Fade margin is the difference in power levels between the actual signal hitting the receiver and the minimum signal the receiver needs in order to work. It gives an indication of likely bit error rates, for instance.
There is a standard formula for the minimum theoretical signal level needed by a receiver at a given data rate: -154dBm + 10·log10(bit rate in bps). At a data rate of 1Mbps, a receiver therefore needs at least -94dBm to stand a reasonable chance of recovering the data.
If the received signal is in fact -84dBm then the fade margin is 10dB i.e. it can allow fading of the received signal up to 10dB.
To apply this to your situation, you need to know the data rate so you can calculate the minimum acceptable receiver power. Since Fm = Pr - Pm (where Pm is the minimum receiver power level, calculated from the bit rate or perhaps marked on the box), and RSSI is equivalent to Pr, you should be able to work this out.
If you look in the link you provided you'll see this: -
Receive Sensitivity: 802.11b: -84dBm@11Mbps
In other words, at 11Mbps, the formula in my answer gives a minimum required receiver power of -154dBm + 10·log10(11,000,000)dB = -154dBm + 70.4dB = -83.6dBm.
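As a quick sanity check, this calculation and the resulting fade margin can be sketched in Python (the function names are mine, not from any library):

```python
import math

def min_rx_power_dbm(bit_rate_bps):
    """Theoretical minimum receiver power: -154 dBm + 10*log10(bit rate)."""
    return -154.0 + 10.0 * math.log10(bit_rate_bps)

def fade_margin_db(rssi_dbm, bit_rate_bps):
    """Fade margin = received power (RSSI) minus minimum required power."""
    return rssi_dbm - min_rx_power_dbm(bit_rate_bps)

print(min_rx_power_dbm(11_000_000))   # ≈ -83.59 dBm, matches the -84 dBm spec
print(min_rx_power_dbm(1_000_000))    # ≈ -94.0 dBm at 1 Mbps
print(fade_margin_db(-70.0, 11_000_000))  # ≈ 13.6 dB of headroom
```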
EDIT
I've been looking into this a little more and there is a simpler formula you can use, based on this document. It is formula #19 on page 3:
RSSI (dBm) = -10·n·log10(d) + A
Where A is the received signal strength in dBm at 1 metre - you need to calibrate this on your system. Because you calibrate at a known distance, you don't need to account for the transmission frequency, which simplifies the equation.
d is distance in metres and n is the propagation constant or path-loss exponent, as mentioned in your question, i.e. 2.7 to 4.3 (free space has n = 2, for reference).
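Inverting that formula gives distance from a measured RSSI: d = 10^((A - RSSI) / (10·n)). A minimal sketch (the function name is mine, and the example A value is an assumption - A must be calibrated on your own hardware):

```python
def distance_m(rssi_dbm, a_dbm, n):
    """Invert RSSI = -10*n*log10(d) + A  =>  d = 10**((A - RSSI) / (10*n))."""
    return 10 ** ((a_dbm - rssi_dbm) / (10.0 * n))

# Hypothetical calibration: A = -40 dBm at 1 m, path-loss exponent n = 2.7
print(distance_m(-67.0, -40.0, 2.7))  # → 10.0 (metres)
print(distance_m(-40.0, -40.0, 2.7))  # → 1.0 (RSSI equals A at 1 m)
```

Because n sits in the exponent's denominator, small errors in the calibrated n produce large distance errors at long range, so calibrate at several known distances if you can.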
Your original formula - if you can supply a source for it, I can check it against data I have.