CamDroneLoc

Visual Odometry for Drones using a Particle Filter Approach with Image Encoding Measurements

Authors

Overview

In this project, we present a vision-based method that uses camera images for localization and visual odometry in a mapped environment: each camera frame is passed through an encoder, and the resulting encoding is used as the measurement for a particle filter localization system. We assume a drone with constant zero pitch and roll moving through a mapped environment with a monocular camera facing down at all times (and a shutter speed high enough to avoid motion blur at high speed). We also assume known control inputs (within the limits of reasonable error) that drive the drone along a desired motion.

Image Encoding

We experiment with and make use of three different image encoders to generate inputs for the Particle Filter's measurement model; a minimal sketch of one of them follows the list below.

  • CNN-based Encoder
  • VecKM-based Encoder
  • Histogram of Features Encoder
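As an illustration, the snippet below sketches the simplest of the three ideas, a histogram-based encoding. It is not the exact encoder used in this repository, just a minimal example of turning a camera frame into a fixed-length vector that particles can later be scored against.

    import cv2
    import numpy as np

    def encode_histogram(image_bgr, bins=64):
        """Encode an image as a normalized grayscale intensity histogram (illustrative only)."""
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        hist = cv2.calcHist([gray], [0], None, [bins], [0, 256])
        return cv2.normalize(hist, hist).flatten()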

Particle Filter

Motion Model

The velocity of the drone is updated using local odometry measurements from the drone's IMU.
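A minimal sketch of this prediction step is shown below, assuming 2D particle states (x, y) and a planar velocity estimate (vx, vy) taken from the local odometry; the noise level is illustrative.

    import numpy as np

    def predict(particles, vx, vy, dt, pos_noise=0.05):
        """Propagate each particle with the odometry velocity plus Gaussian noise."""
        n = len(particles)
        particles[:, 0] += vx * dt + np.random.normal(0.0, pos_noise, n)
        particles[:, 1] += vy * dt + np.random.normal(0.0, pos_noise, n)
        return particles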

Measurement Model

The measurement model uses the similarity score between the encoded image vectors to update the particle weights before resampling. The similarity score used with each encoder is listed below, with minimal sketches following the list.

  • CNN-based Encoder: Cosine Similarity
  • VecKM Encoder: Inner Product Sum
  • Histogram of Features Encoder: Histogram Correlation and Intersection
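The snippets below sketch these similarity scores and the corresponding weight update, assuming the camera image and each particle's expected view of the map have already been encoded as 1-D vectors (or histograms). The function names are illustrative, not the exact ones used in the code.

    import numpy as np

    def cosine_similarity(a, b):
        # Used with the CNN-based encodings
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

    def inner_product_sum(a, b):
        # Used with the VecKM encodings
        return np.sum(a * b)

    def histogram_correlation(h1, h2):
        # Pearson correlation between two histograms
        return np.corrcoef(h1, h2)[0, 1]

    def histogram_intersection(h1, h2):
        # Overlap between two (normalized) histograms
        return np.sum(np.minimum(h1, h2))

    def update_weights(weights, scores):
        """Re-weight particles by their similarity scores and renormalize."""
        weights = weights * np.maximum(scores, 1e-12)
        return weights / np.sum(weights)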

Update Step

Systematic resampling is applied every few iterations, using a weighting method that reflects Bayes' theorem.
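A minimal sketch of systematic resampling, assuming the weights are already normalized:

    import numpy as np

    def systematic_resample(particles, weights):
        """Resample particles with one random offset and evenly spaced pointers."""
        n = len(weights)
        positions = (np.random.uniform() + np.arange(n)) / n
        cumulative = np.cumsum(weights)
        cumulative[-1] = 1.0  # guard against floating-point rounding
        indexes = np.searchsorted(cumulative, positions)
        return particles[indexes], np.full(n, 1.0 / n)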

Results

Fast convergence of the particles to the ground-truth drone location is observed both in the visual results and in the odometry readings obtained from the average particle of the Particle Filter. The video embedded below shows this for the Particle Filter model that uses VecKM-based encodings, running in a realistic Gazebo PX4 SITL world.

Video

Executing the Code

To run the Particle Filter model independently of ROS, execute the following command.

    python3 particle_filter.py

To run the Particle Filter model with ROS (to listen to and publish ROS Topics), execute the command below.

    python3 particle_filter_ros.py
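For reference, a rough sketch of how such a ROS wrapper is typically structured is shown below; the topic names and message types are assumptions, not necessarily those used by particle_filter_ros.py.

    import rospy
    from sensor_msgs.msg import Image
    from geometry_msgs.msg import PoseStamped

    def image_callback(msg, pose_pub):
        # Encode the incoming camera frame, run the particle filter update,
        # then publish the estimated pose on pose_pub (omitted in this sketch).
        pass

    def main():
        rospy.init_node("particle_filter_ros")
        pose_pub = rospy.Publisher("/pf/pose_estimate", PoseStamped, queue_size=10)  # assumed topic name
        rospy.Subscriber("/camera/image_raw", Image, image_callback, callback_args=pose_pub)  # assumed topic name
        rospy.spin()

    if __name__ == "__main__":
        main()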
