Particle Filter Tracking Architecture for Use Onboard Unmanned Aerial Vehicles
Ludington, Ben T.
Unmanned Aerial Vehicles (UAVs) are capable of placing sensors at unique vantage points without endangering a pilot, which makes them well suited to target tracking missions. However, performing such a mission can be burdensome for the operator, who must estimate the position of the target from the incoming video stream, update the orientation of the camera, and move the vehicle to an appropriate vantage point. The purpose of the research in this thesis is to provide a target tracking system that performs these tasks automatically in real time. The first task, which receives the majority of the attention, is estimating the position of the target within the incoming video stream. Because of the inherent clutter in the imagery, the resulting probability distributions are typically non-Gaussian and multi-modal. Classical state estimation techniques, such as the Kalman filter and its variants, are therefore unsuitable. The particle filter has become a popular alternative because it approximates multi-modal distributions with a set of weighted samples, and it is used as part of this research. To improve the performance of the filter and to manage its inherently large computational burden, a neural network estimates the performance of the particle filter, and the filter parameters are adjusted in response. Once the position of the target is estimated in the frame, it is projected onto the ground using the camera orientation and vehicle attitude and fed into a linear predictor. The output of the predictor is used to update the camera orientation and the vehicle waypoints. Through offline, simulation, and flight testing, the approach is shown to provide a powerful visual tracking system for use onboard the GTMax unmanned research helicopter.
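The predict/weight/resample cycle the abstract alludes to can be illustrated with a minimal bootstrap particle filter. This is a generic sketch, not the thesis's implementation: a Gaussian likelihood of a noisy 2D position measurement stands in for the image-based likelihood used in the actual system, and all function names, noise levels, and the random-walk motion model are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, measurement,
                         motion_std=1.0, meas_std=2.0):
    """One predict/update/resample cycle of a bootstrap particle filter.

    particles:   (N, 2) array of hypothesized target positions (pixels).
    weights:     (N,) normalized importance weights.
    measurement: observed 2D target position for this frame.
    """
    n = len(particles)
    # Predict: propagate each particle through a random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, size=particles.shape)
    # Update: reweight particles by a Gaussian likelihood of the measurement
    # (a placeholder for the image-based likelihood in the thesis).
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / meas_std**2)
    weights = weights / weights.sum()
    # Resample when the effective sample size drops below half the particles,
    # which concentrates samples on the dominant modes of the distribution.
    if 1.0 / np.sum(weights**2) < n / 2:
        idx = rng.choice(n, size=n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights

# Track a target drifting right at one pixel per frame.
particles = rng.uniform(0.0, 100.0, size=(500, 2))
weights = np.full(500, 1.0 / 500)
for t in range(50):
    true_pos = np.array([50.0 + t, 50.0])
    meas = true_pos + rng.normal(0.0, 2.0, size=2)
    particles, weights = particle_filter_step(particles, weights, meas)
# The state estimate is the weighted mean of the particle set.
estimate = np.average(particles, axis=0, weights=weights)
```

Because the posterior is carried as a weighted sample set rather than a mean and covariance, the same machinery copes with the multi-modal, non-Gaussian distributions that arise from cluttered imagery, at the cost of the per-frame computation the thesis manages by adapting the filter parameters.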