I wanted to use an accelerometer or IMU to measure speed and displacement over a period of about 1 minute or 400 m. From this answer I found on Stack Overflow, the error propagates at a rate of \$t^2\$ (without considering rotation), so \$60^2 = 3600\$.
The noise density of the ADXL335 accelerometer is about \$200\ \mu g/\sqrt{Hz}\$, so at 500 Hz we get $$\dfrac{200}{\sqrt{500}} = 87.67\ \mu g\ (\mathrm{or}\ 87.67 \times 10^{-6}\ m/s^2)$$
Getting the error over 60 seconds: \$87.67 \times 10^{-6} \times 3600 = 0.32\ m\$.
This looks suspiciously optimistic. Am I correct, or am I doing the calculation wrong?
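For reference, here is the same chain of numbers written out as a quick Python sketch. It simply reproduces the figures above, which is exactly the part I am unsure about:

```python
# Back-of-the-envelope estimate, plugging in the numbers exactly as above.
noise = 87.67e-6      # the per-sample noise figure derived above, taken as m/s^2
t = 60.0              # integration time in seconds
error = noise * t**2  # error growing as t^2, per the linked answer
print(f"estimated displacement error: {error:.2f} m")  # ~0.32 m
```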
Answer
I have tried to do this with an iPhone’s accelerometer/gyroscope, and I can tell you empirically that there will be many orders of magnitude more error than that.
Your statement “without considering rotation” is an important one, as this is a huge factor. One of your difficulties will be removing the gravity vector from the integration. If the accelerometer is tilted even slightly, gravity will introduce a large error in each axis.
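To put a rough number on that: if the device sits at an uncorrected tilt of even 1°, a component of gravity of about \$g\sin(1^\circ) \approx 0.17\ m/s^2\$ leaks into a horizontal axis, and a constant bias like that double-integrates into position error. A quick illustrative sketch (Python; the 1° tilt is an assumed figure, the 60 s window is taken from the question):

```python
import math

g = 9.81                 # gravitational acceleration, m/s^2
tilt_deg = 1.0           # assumed uncorrected tilt error, degrees
t = 60.0                 # integration window from the question, seconds

# Component of gravity leaking into a horizontal axis because of the tilt
bias = g * math.sin(math.radians(tilt_deg))   # ~0.17 m/s^2

# A constant acceleration bias double-integrates into displacement: x = a*t^2/2
drift = 0.5 * bias * t**2
print(f"bias {bias:.3f} m/s^2 -> drift after {t:.0f} s: {drift:.0f} m")  # ~308 m
```

A single degree of tilt therefore swamps the noise-only estimate by roughly three orders of magnitude.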
In my experiment, I was trying to turn an iPhone into a 3D cursor that a user could wave around in their hand for 3D modelling. It would drift off in random directions at a rate of centimetres per second. Lots of low-pass filtering helped a bit, but it was still way off.
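A first-order (exponential) filter is about the simplest form that kind of low-pass filtering can take. A sketch of the idea (illustrative only; the smoothing factor is an arbitrary choice, not a value from my experiment):

```python
def low_pass(samples, alpha=0.1):
    """First-order (exponential) low-pass filter.

    Smaller alpha smooths more aggressively but adds lag;
    alpha = 1.0 passes the input through unchanged.
    """
    filtered = []
    y = samples[0]
    for x in samples:
        y = alpha * x + (1.0 - alpha) * y   # exponential moving average
        filtered.append(y)
    return filtered


# Example: smooth a noisy, roughly constant accelerometer reading
import random
raw = [1.0 + random.gauss(0.0, 0.2) for _ in range(500)]
smoothed = low_pass(raw, alpha=0.05)
```

Note that this only suppresses high-frequency noise; it does nothing about a slowly varying bias or the gravity leakage described above, which is why it only ever helped a bit.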
My point is that even if your accelerometer has low noise, this is a very difficult problem to solve in the real world, because there are many other sources of ‘noise’.
I recommend you buy a commercially produced IMU if you want any chance of achieving this over 400 m. I will be impressed if you can make an accelerometer alone work over a distance of 400 m with less than ±1 km of error.