7

So I have been working on a drone project for a long time, and now I've decided to work on the Kalman filter, which is widely used nowadays, e.g. in ArduPilot. I looked at the source code, and my basic understanding is that, using double integration, we can get linear displacement from the IMU, fused with GPS via a Kalman filter. But when I ran simple code with a 0.1 s sample time, I didn't get the expected result. I ran the code over a 1 meter movement, and the distance I got was sometimes 768 cm, sometimes 2 meters, or even 100 m, so I'm a little nervous about the algorithm.

What is the proper way to get linear displacement from an IMU?

N. Staub
Robokishan
    Welcome to Robotics:SE. What code are you running? Can you edit your question to add the code (and ideally your wiring diagram)? – sempaiscuba Nov 27 '18 at 20:36
  • Yeah sure, it's a simple I2C setup using the Wire library with an STM32F103 (Blue Pill) connected to an MPU6050. The code is very basic right now: just reading the MPU6050 registers to get accelerometer data, then going from acceleration to displacement. – Robokishan Nov 28 '18 at 18:22

2 Answers

11

I thought for sure that there would have been a duplicate question somewhere on the site that answers this question, but I can't find one, so here's a quick description of the method.

  1. Put your IMU in a known starting position and orientation (position + orientation = "pose").
  2. Capture IMU accelerometer and gyroscope readings.
  3. Use numeric integration on the gyroscope output (angle += gyroReading*deltaTime) to get the current orientation of the IMU.
  4. Use the current orientation of the IMU to construct a rotation matrix that will transform the accelerometer readings from the IMU "body frame" of reference to the "world frame" of reference.
  5. Use numeric integration on the transformed accelerometer output (speed += xfmAccelerometerReading*deltaTime) to get the current speed of the IMU in the world frame.
  6. Use numeric integration on the world-frame speed (position += speed*deltaTime, or position += speed*deltaTime + 0.5*xfmAccelerometerReading*deltaTime*deltaTime) to get the current position of the IMU in the world frame.
  7. GOTO: 2.
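As a concrete illustration, the loop above can be sketched in a planar (2-D) simplification, where the orientation is a single heading angle and the rotation matrix collapses to a 2-D rotation. Everything here (function name, sensor layout, units) is illustrative, not any particular autopilot's implementation:

```python
import math

def dead_reckon_2d(samples, dt):
    """Planar dead reckoning. `samples` is a list of
    (accel_x_body, accel_y_body, gyro_z) tuples in m/s^2 and rad/s;
    returns the final (x, y) world-frame position in meters."""
    heading = 0.0            # step 1: known starting orientation
    vx = vy = 0.0            # world-frame velocity
    x = y = 0.0              # world-frame position
    for ax_b, ay_b, gz in samples:       # step 2: capture IMU readings
        heading += gz * dt               # step 3: integrate the gyro
        c, s = math.cos(heading), math.sin(heading)
        ax = c * ax_b - s * ay_b         # step 4: body frame -> world frame
        ay = s * ax_b + c * ay_b
        vx += ax * dt                    # step 5: integrate acceleration
        vy += ay * dt
        x += vx * dt                     # step 6: integrate speed
        y += vy * dt
    return x, y                          # step 7: loop until done
```

A full 3-D version replaces the heading angle with a quaternion or rotation matrix, but the structure is the same.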

This is the simplest way of using an IMU output to get position. All sensors have a bias, though, so when you integrate the output you're left with drift in the speed, position, and orientation estimates (important because they are estimates, not measurements).

You can use the gravitational "down" vector (the only sustainable long-term acceleration) to correct any drift on your x/y rotations. You can use a magnetometer/compass to correct any drift on your z rotation. You can use GPS to correct position drift.
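The gravity-based correction is often done with a simple complementary filter that blends the smooth-but-drifting gyro integral with the noisy-but-drift-free gravity reference. A minimal sketch for the roll axis, assuming the usual axis conventions and an arbitrary blend weight:

```python
import math

def complementary_roll(gyro_x, ax, ay, az, roll, dt, alpha=0.98):
    """One update of a complementary filter for the roll angle (radians).
    gyro_x is rad/s; (ax, ay, az) is the accelerometer reading.
    alpha weights gyro vs. gravity; 0.98 is a common, arbitrary choice."""
    roll_gyro = roll + gyro_x * dt   # smooth, but drifts over time
    roll_accel = math.atan2(ay, az)  # noisy, but anchored to gravity
    return alpha * roll_gyro + (1 - alpha) * roll_accel
```

The same idea with a magnetometer handles yaw, and a Kalman or Madgwick filter is essentially a more principled version of this blend.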

I am a huge fan of the Madgwick filter and plug it every chance I get - it's free, open-source, outperforms the Kalman filter, and it's probably already written for your platform. It will take accelerometer, gyro, and magnetometer readings and give you an orientation estimate.

Chuck
  • 16,061
  • Okay, thanks, but it's not working out: it's only giving me the displacement at a specific time, not the entire path. Can you help me work this out? – Robokishan Nov 30 '18 at 07:37
  • @Robokishan - The current position is generally the only thing you get from an IMU. If you want to record the path, then you need to record the path. Something like pathHistory(:, end+1) = position; or pathHistory.emplace_back(position);, etc. You could preallocate the pathHistory variable, but then you'd need to know how long you're going to record the path. – Chuck Nov 30 '18 at 13:31
  • Okay, got it, thank you. Seems like it's the only option. – Robokishan Dec 02 '18 at 16:48
  • I am trying to follow step 4 but am confused. How do I apply step 4? I am trying to implement this in C++. Please help – Brandon Gorman Mar 19 '19 at 17:10
  • @BrandonGorman - Step 4 requires the output of Step 3, which is the orientation of the IMU. Do you have the orientation? Is your question how to convert Euler angles (or quaternions) to a rotation matrix? – Chuck Mar 20 '19 at 20:49
  • Hi, I would like to follow your steps. I have the raw IMU readings from the accelerometer, gyroscope, and magnetometer (meaning I have orientation, angular velocity, and linear acceleration). I only need to get the linear velocity. What would the steps be in that case, using the Madgwick filter? And last question: is there already a GitHub repo or ROS package that implements all those steps, and is more robust and complex than this simplistic way? – bob Sep 22 '21 at 13:35
  • @bob I would urge you to pull the code and take a look. Steps are still generally as described above. Regarding ROS packages, there's one here. – Chuck Sep 23 '21 at 14:51
  • @BrandonGorman that doesn't solve the integration problem. In order to get velocity you need to integrate the acceleration, and with time that error will accumulate and become huge. So my problem is how to minimize that error. Understand? – bob Sep 24 '21 at 08:49
  • @bob Yeah that's the problem - your integration error is unbounded. There is no solution to this problem, unless you're using some absolute position measurement system (i.e., GPS) to correct for your position estimation errors. The Madgwick filter orientation doesn't suffer this problem because gravity and North can give you orientation fixes. Nothing provides a position fix. – Chuck Sep 24 '21 at 12:13
  • The Madgwick filter can only be used to correct the noise in orientation, but it doesn't solve the integration problem – bob Sep 24 '21 at 15:05
  • @bob - Correct, that's what I'm telling you. There is no solution to the integration problem. Get a higher quality IMU, that has less noise, and you'll still have the integration problem, or get a GPS unit so you can correct the integration problem. You cannot fix the integration problem with an IMU alone. – Chuck Sep 24 '21 at 17:23
  • I've done research, and there are some numerical methods, in terms of optimization functions and so on, that try to minimize the error. One solution is https://robomechjournal.springeropen.com/articles/10.1186/s40648-018-0110-1#rightslink – bob Sep 25 '21 at 06:49
  • But https://robomechjournal.springeropen.com/articles/10.1186/s40648-018-0110-1#rightslink only corrects the error a bit. I'm looking for a better solution in that direction, without an additional sensor. Understand now? The error will still be there, but with some mathematical methods you minimize that error – bob Sep 25 '21 at 06:50
  • @bob if you have a question you can ask the question and everyone on the site will get a chance to see it and respond. Right now I'm the only one that's seeing these comments. – Chuck Sep 25 '21 at 18:41
  • I already asked the question. ok – bob Sep 25 '21 at 18:42
1

I can't comment in this forum yet, so I have to post an answer, but I am looking at the same problem. The MPU9250 DMP output includes a quaternion and a fused quaternion. I can't find an exact description of this, but I am guessing this is the rotation from body frame to world frame. (Quaternions do rotations like a rotation matrix, only the numerical accuracy is better and they don't have gimbal lock.)
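For reference, rotating a vector v by a unit quaternion q is the sandwich product v' = q v q*. A minimal pure-Python sketch, assuming the usual Hamilton (w, x, y, z) convention (whether the DMP output uses this convention is my guess, not documented fact):

```python
def quat_mul(q, r):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q via v' = q v q*."""
    qv = (0.0, v[0], v[1], v[2])          # embed v as a pure quaternion
    q_conj = (q[0], -q[1], -q[2], -q[3])  # conjugate of a unit quaternion
    return quat_mul(quat_mul(q, qv), q_conj)[1:]
```

For example, the quaternion for a 90° rotation about z maps (1, 0, 0) to (0, 1, 0), just as the equivalent rotation matrix would.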

I've also noticed that the IMU data takes a while to stabilize. I start my robot stable and pointing due north and the magnetometers report it is pointing south at first. Within about 10 seconds the raw magnetometer readings swing north. The fused compass reading can take 60 seconds or more to fully agree. I was very confused until I figured this out, but now I can get reliable compass readings. I still have not got good world frame position output and will look at the Madgwick filter. Thanks for that pointer.

The integrations are easy, but you need to subtract gravity. Ultimately that will degrade accuracy because you can never orient the gravity vector perfectly and error from that will accumulate. An IMU cannot keep accurate position for long, but it should be accurate for some period of time.
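To put a rough number on how fast that accumulates: a constant tilt-estimate error leaks a slice of gravity into the horizontal axes, and double integration turns that into quadratic position drift. A back-of-the-envelope model (the formula is a simplification, not a full error analysis):

```python
import math

def tilt_drift(tilt_error_deg, seconds, g=9.80665):
    """Position drift from double-integrating the gravity leakage caused
    by a constant tilt-estimate error. A tilt error of theta leaks a
    horizontal acceleration of ~g*sin(theta); integrating twice from
    rest gives 0.5 * a * t^2 of position error."""
    a = g * math.sin(math.radians(tilt_error_deg))
    return 0.5 * a * seconds ** 2
```

Even a 1° orientation error grows to several meters of position error within about ten seconds, which is why an IMU alone only stays accurate for a short window.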

C. Eliot