
I am doing SLAM with a four-wheeled (two-wheel-drive) differential drive robot driving through a hallway. The hallway is not flat everywhere, and the robot turns by spinning in place, then driving in the resulting direction. The SLAM algorithm does not need to run online.

The robot takes measurements from a strap-down IMU/gyro measuring (ax, ay, az, wx, wy, wz), where ax refers to acceleration in the x direction and wx measures angular acceleration about the x-axis. The LIDAR scans the hallway over a 270-degree arc and measures ranges and angles. However, as far as I know, the hallway has no discernible features except at its corners.

I need to find the best way to fuse the proposed action measured by the wheel encoders with the IMU and LIDAR data. It makes sense to me that I could fuse the yaw from the IMU with the encoder data to get a better sense of heading, but how should I incorporate the LIDAR data?

In essence, what is the appropriate measurement model, and how should I incorporate noise into the motion model, besides just adding zero-mean Gaussian noise with some standard deviation σ?

Addendum

This is somewhat orthogonal to the question, but just as confusing to me. Currently I am using a particle filter to do SLAM, and I am a little confused about whether to represent uncertainty in angular acceleration within the particles themselves. I see two options:

  1. Run a separate navigation filter (an EKF, or anything really) to find a "best-estimate" angular acceleration trajectory first, then use this trajectory as absolute truth for the particle filter, so that any drift in the particles does not come from uncertainty in angular acceleration.

  2. Incorporate the uncertainty into the propagation of the particles themselves. This option appears more sensible, but I am not sure what a principled way to do it is; a minimal sketch of what I mean is below.
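For concreteness, here is a rough sketch of what I imagine option 2 to look like. This is only an assumption on my part: it models one turn-then-drive step with a control (dtheta, d), and the noise gains in ALPHA are made-up placeholders I would still have to tune.

```python
import numpy as np

# Sketch of option 2: draw a fresh motion-noise sample inside every
# particle's propagation step, so the particle spread itself carries the
# motion uncertainty. The control for one step is (dtheta, d): turn in
# place by dtheta, then drive straight by d.
# The noise gains below are placeholders, not tuned values.
ALPHA = (0.05, 0.01, 0.05, 0.01)  # rot-from-rot, rot-from-trans, trans-from-trans, trans-from-rot

def propagate_particle(pose, dtheta, d, rng):
    """Propagate one particle (x, y, theta) through a noisy turn-then-drive step."""
    x, y, theta = pose
    a1, a2, a3, a4 = ALPHA
    # Perturb the commanded rotation and translation for this particle only.
    dtheta_hat = dtheta + rng.normal(0.0, np.sqrt(a1 * dtheta**2 + a2 * d**2))
    d_hat = d + rng.normal(0.0, np.sqrt(a3 * d**2 + a4 * dtheta**2))
    theta_new = theta + dtheta_hat
    return np.array([x + d_hat * np.cos(theta_new),
                     y + d_hat * np.sin(theta_new),
                     theta_new])

# Every particle gets its own noise draw each time step.
rng = np.random.default_rng(0)
particles = np.zeros((500, 3))   # 500 particles, all starting at the origin
particles = np.array([propagate_particle(p, np.pi / 2, 1.0, rng) for p in particles])
```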

  • If I may make a technical correction: the gyro in an IMU measures angular rate, and the accelerometer measures linear acceleration.
    – Sean, Jul 15, 2014 at 14:09

1 Answer

  • Since you have a 2D sensor that you cannot rotate in a controlled way, you can only expect to perform SLAM in a 2D plane.

  • Your best bet is to use the IMU for attitude estimation (roll and pitch only, since you do not have a compass) and correct each slightly tilted 2D laser scan accordingly. If your IMU does not already provide an attitude estimate, I would compute one with a nonlinear complementary filter, since it only requires tuning a single constant; a sketch of this step follows the list.

  • Even though it is theoretically possible to use particle filters for SLAM with laser scans (see gmapping), the current state of the art is graph-based SLAM, i.e. least-squares optimization of the SLAM graph, or in your case pose graph optimization. Have a look at karto in ROS as an exemplary open source implementation; a toy pose-graph example is also sketched below.
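To make the second point concrete, here is a rough sketch, not a drop-in implementation. The single gain ALPHA, the sample period DT, and the axis conventions behind the roll/pitch formulas are assumptions on my side, and treating the gyro rates as direct roll/pitch rates is a small-angle simplification.

```python
import numpy as np

ALPHA = 0.98   # gyro weight; 1 - ALPHA is the accelerometer weight (assumed value)
DT = 0.01      # IMU sample period in seconds (assumed)

def complementary_filter(roll, pitch, gyro, accel):
    """One complementary-filter update. gyro = (wx, wy, wz) in rad/s,
    accel = (ax, ay, az) in m/s^2."""
    wx, wy, wz = gyro
    ax, ay, az = accel
    # Integrate the angular rates (fast, but drifts); small-angle approximation.
    roll_gyro = roll + wx * DT
    pitch_gyro = pitch + wy * DT
    # Gravity direction from the accelerometer (noisy, but drift-free).
    roll_acc = np.arctan2(ay, az)
    pitch_acc = np.arctan2(-ax, np.sqrt(ay**2 + az**2))
    # Blend the two estimates with the single tuning constant.
    return (ALPHA * roll_gyro + (1 - ALPHA) * roll_acc,
            ALPHA * pitch_gyro + (1 - ALPHA) * pitch_acc)

def flatten_scan(ranges, angles, roll, pitch):
    """Rotate the 2D scan by the estimated roll/pitch and keep the projection
    onto the horizontal plane, so the SLAM front end sees a level scan."""
    pts = np.stack([ranges * np.cos(angles),
                    ranges * np.sin(angles),
                    np.zeros_like(ranges)])          # 3 x N points in the laser frame
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    pts_level = Ry @ Rx @ pts
    return pts_level[:2]                             # 2 x N points in the level plane
```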
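And to illustrate the third point, here is a toy 2D pose-graph optimization built on scipy.optimize.least_squares, with made-up relative-pose measurements. Real systems such as karto or g2o add scan matching, loop-closure detection and covariance weighting on top of this core idea.

```python
import numpy as np
from scipy.optimize import least_squares

# Poses are (x, y, theta); each edge stores the measured pose of node j
# expressed in the frame of node i. All numbers are invented for illustration.
poses0 = np.array([[0, 0, 0], [1, 0, 0], [2, 0, np.pi / 2], [2, 1, np.pi / 2]], float)
edges = [(0, 1, 1.00, 0.00, 0.0),
         (1, 2, 1.00, 0.00, np.pi / 2),
         (2, 3, 1.00, 0.00, 0.0),
         (0, 3, 2.05, 0.95, np.pi / 2)]   # slightly inconsistent "loop closure"

def residuals(flat):
    poses = flat.reshape(-1, 3)
    res = [poses[0] - poses0[0]]          # pin the first pose (removes gauge freedom)
    for i, j, dx, dy, dth in edges:
        xi, yi, thi = poses[i]
        xj, yj, thj = poses[j]
        c, s = np.cos(thi), np.sin(thi)
        # Predicted relative pose of j in i's frame.
        pred = np.array([ c * (xj - xi) + s * (yj - yi),
                         -s * (xj - xi) + c * (yj - yi),
                          thj - thi])
        err = pred - np.array([dx, dy, dth])
        err[2] = np.arctan2(np.sin(err[2]), np.cos(err[2]))   # wrap the angle error
        res.append(err)
    return np.concatenate(res)

sol = least_squares(residuals, poses0.ravel())
print(sol.x.reshape(-1, 3))   # optimized poses spread the loop-closure error
```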
