In recent years, there has been an increase in the number of applications using unmanned aircraft systems (UASs). Additionally, researchers have progressively moved towards using vision as a primary source of perception. This is mainly due to cameras becoming cheaper, smaller, lighter, and capable of higher image resolutions. An important UAS application is the tracking of objects via the use of visual information.
This repository consists of the code base and documentation for a vision-based object tracking system based on our 2021 ICUAS paper titled "Vision-Based Guidance for Tracking Dynamic Objects." Specifically, we implement experiments for the diagnosis and analysis of visual tracking techniques under occlusions, along with UAS guidance based on a rendezvous cone approach. Our system contains computer vision algorithms that may be used in various combinations, in pipelines, or standalone, depending on the complexity and requirements of the task.
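As an illustration of the kind of occlusion diagnosis mentioned above, one common heuristic in point-based tracking is to flag a possible occlusion when the fraction of feature points that survive tracking between frames drops below a threshold. The `occlusion_suspected` helper and its threshold below are illustrative assumptions, not the criterion used in the paper's experiments:

```python
def occlusion_suspected(tracked_flags, min_fraction=0.5):
    """Return True when too few feature points survived tracking.

    tracked_flags: per-point booleans, True if the point was reliably
    tracked into the current frame. An empty list (no points left) is
    treated as a suspected occlusion. Threshold is illustrative only.
    """
    if not tracked_flags:
        return True
    surviving = sum(bool(f) for f in tracked_flags)
    return surviving / len(tracked_flags) < min_fraction
```

A tracker can use such a signal to switch from measurement-driven updates to prediction-only updates until the target reappears.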
If you find this project useful, then please consider citing our work:
```bibtex
@inproceedings{karmokar2021vision,
  title={Vision-Based Guidance for Tracking Dynamic Objects},
  author={Karmokar, Pritam and Dhal, Kashish and Beksi, William J and Chakravarthy, Animesh},
  booktitle={Proceedings of the International Conference on Unmanned Aircraft Systems (ICUAS)},
  pages={1106--1115},
  year={2021}
}
```

To run the experiments within this repository, `opencv`, `numpy`, and `pygame` need to be installed along with their dependencies. The `requirements.txt` file (generated by `pip freeze`) may be used as follows. Navigate into the downloaded source folder where `requirements.txt` is located. Then, run the following:

```shell
pip install -r requirements.txt
```

From the source folder, navigate into the experiments folder:

```shell
cd .\vbot\experiments
```

To run the occlusion handling experiment, run the following:

```shell
python -m exp_occ
```

To run the lane changing experiment, run the following:

```shell
python -m exp_lc
```

To run the squircle following experiment, run the following:

```shell
python -m exp_sf
```

The process of running an experiment has the following steps.
- The simulator window pops up.
- The user draws a bounding box around the car (with some extra room).
- The user hits `space` to start the experiment.
- A tracker window appears, displaying the tracking results.
- To stop the experiment, the user selects the simulator window, hits `space`, and closes the window.
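The start/stop flow in the steps above can be sketched as a small state machine. The state names, the `step_control` helper, and the event strings are illustrative assumptions; the actual pygame key handling in the experiments may differ:

```python
SPACE = "space"   # user pressed the space bar
QUIT = "quit"     # user closed the simulator window

def step_control(state, event):
    """Advance the experiment's control state for one input event.

    States: 'awaiting_start' -> 'running' -> 'stopped'.
    Illustrative sketch only; not the repository's actual event loop.
    """
    if event == QUIT:
        return "stopped"
    if event == SPACE:
        if state == "awaiting_start":
            return "running"    # first space press starts the experiment
        if state == "running":
            return "stopped"    # second space press stops it
    return state                # all other events leave the state unchanged
```

In a real pygame loop, the events would come from `pygame.event.get()`, with `pygame.KEYDOWN`/`pygame.K_SPACE` mapped to the `SPACE` event and `pygame.QUIT` mapped to `QUIT`.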