Autonomous Flight with an AR.Drone

The project started during my master's studies at UT-Cluj. The aim was to fly a quadcopter autonomously in a known environment. In my previous project I used a low-cost robot, but this time I went with a ready-to-fly platform, the Parrot AR.Drone 2.0. The drone is designed for mobile applications and games, but it is also well suited for academic purposes.

The AR.Drone has several features that were decisive for this project:

  • it has two cameras (front- and bottom-facing), an ultrasonic sensor, and a WiFi connection

  • low payload capacity (extra payload was not required for our application)

  • the on-board microcontroller is easily programmable (although we did not need this feature)

We were using the Robot Operating System (ROS), which has drivers for the drone: I used the AutonomyLab driver together with OpenCV 2.7 for image processing. As a novice roboticist, it is easier to start with a simple task to get familiar with the system and its functionality, so I created a small program that makes the drone follow a symbol or tag. The tag detection runs on board the drone, and the movement commands are computed from the detected tag position. This test program was mainly a way to get familiar with the AR.Drone driver; the Follower program can be found at the bottom of the page.
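For illustration, a minimal tag-follower node in this spirit is sketched below (this is not the exact Follower code). It assumes the ardrone_autonomy Navdata message, whose tags_count, tags_xc and tags_yc fields report on-board tag detections in a 0–1000 normalised image frame; the gains and topic names are placeholders.

```python
#!/usr/bin/env python
# Minimal tag-follower sketch: keep the detected tag centred in the image
# by commanding yaw and vertical velocity. Gains and topic names are
# illustrative, not the values used in the original project.
import rospy
from geometry_msgs.msg import Twist
from ardrone_autonomy.msg import Navdata

K_YAW = 0.002   # proportional gain on horizontal tag offset (placeholder)
K_Z   = 0.002   # proportional gain on vertical tag offset (placeholder)

class TagFollower(object):
    def __init__(self):
        self.cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('/ardrone/navdata', Navdata, self.navdata_cb)

    def navdata_cb(self, msg):
        cmd = Twist()
        if msg.tags_count > 0:
            # Tag coordinates are reported in a 0..1000 normalised frame;
            # 500 corresponds to the image centre.
            err_x = msg.tags_xc[0] - 500
            err_y = msg.tags_yc[0] - 500
            cmd.angular.z = -K_YAW * err_x   # yaw towards the tag
            cmd.linear.z = -K_Z * err_y      # climb/descend to centre it
        self.cmd_pub.publish(cmd)            # zero command if no tag is seen

if __name__ == '__main__':
    rospy.init_node('tag_follower')
    TagFollower()
    rospy.spin()
```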


Flying in indoor and outdoor corridors

Next, we wanted to fly autonomously along corridors. The drone detects low-dimensional features in the video feed, which close the control loop that guides it through the hallway. We track the vanishing point with an Extended Kalman Filter (EKF). More about the EKF and the vanishing point can be found on this tutorial page.
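As a rough sketch of the perception idea (not the project's exact pipeline), the vanishing point of a corridor frame can be estimated as the least-squares intersection of straight edges found with Canny and the Hough transform; the EKF then smooths this noisy measurement over time. The thresholds below are placeholders.

```python
# Sketch: estimate the vanishing point of a frame as the least-squares
# intersection of detected straight edges. Canny/Hough thresholds are
# illustrative; the EKF tracking step is not reproduced here.
import cv2
import numpy as np

def vanishing_point(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 100)
    if lines is None:
        return None
    # Each Hough line (rho, theta) satisfies x*cos(theta) + y*sin(theta) = rho.
    # Stack the equations and solve A [x, y]^T = b in the least-squares sense.
    A, b = [], []
    for rho, theta in lines.reshape(-1, 2):
        # Skip near-horizontal and near-vertical image lines; the corridor
        # edges that converge at the vanishing point are the diagonal ones.
        if abs(np.sin(theta)) > 0.95 or abs(np.cos(theta)) > 0.95:
            continue
        A.append([np.cos(theta), np.sin(theta)])
        b.append(rho)
    if len(A) < 2:
        return None
    vp, _, _, _ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return vp  # (x, y) in image coordinates
```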

Outdoor railway track follower

We further developed the initial idea of using vanishing points for flight control. A vanishing point appears where parallel lines converge when projected onto a 2D image, so rail tracks are a natural next scenario.
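A toy pinhole-camera example, with made-up intrinsics and standard-gauge rails, illustrates why this works: as two parallel 3D lines recede, their projections converge to the same image point given by K·d, where d is the shared direction.

```python
# Toy pinhole-projection example: two parallel 3D lines (e.g. rails 1.435 m
# apart) project to image lines that meet at the vanishing point K*d, where
# d is the common direction. The camera intrinsics below are made up.
import numpy as np

K = np.array([[560.0,   0.0, 320.0],
              [  0.0, 560.0, 180.0],
              [  0.0,   0.0,   1.0]])     # illustrative camera intrinsics

def project(P):
    p = K.dot(P)
    return p[:2] / p[2]

d = np.array([0.0, 0.0, 1.0])             # rails run straight ahead (+Z)
left_rail  = lambda t: np.array([-0.7175, 0.5, 1.0]) + t * d
right_rail = lambda t: np.array([ 0.7175, 0.5, 1.0]) + t * d

for t in (0.0, 5.0, 50.0, 500.0):
    print(t, project(left_rail(t)), project(right_rail(t)))

# As t grows, both projections approach the vanishing point K*d (normalised):
print("vanishing point:", (K.dot(d) / K.dot(d)[2])[:2])
```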

We compared different image pre-processing methods for edge detection, because rail-track scenes are considerably noisier than indoor corridors. We also extended the control strategy with a PD controller on the drone's yaw angle. The improved perception was tested on both simulated and real video feeds, while the controller was tested in simulation: we used Gazebo to evaluate it on straight and curved rail tracks.
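A hedged sketch of such a PD yaw law in ROS is shown below; the gains, image width, forward speed and topic name are placeholders rather than the values from the paper.

```python
# Sketch of the yaw control idea: a PD law on the horizontal offset of the
# tracked vanishing point from the image centre, published as an angular
# velocity command. Gains, frame width and topic names are illustrative.
import rospy
from geometry_msgs.msg import Twist

KP, KD = 0.004, 0.001          # illustrative PD gains
IMG_CENTER_X = 320.0           # half of an assumed 640 px frame

class YawController(object):
    def __init__(self):
        self.cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        self.prev_err = 0.0
        self.prev_t = None

    def update(self, vp_x):
        """Call once per frame with the tracked vanishing point's x coordinate."""
        now = rospy.get_time()
        err = IMG_CENTER_X - vp_x            # positive -> VP is left of centre
        derr = 0.0
        if self.prev_t is not None and now > self.prev_t:
            derr = (err - self.prev_err) / (now - self.prev_t)
        self.prev_err, self.prev_t = err, now

        cmd = Twist()
        cmd.linear.x = 0.05                  # small constant forward speed
        cmd.angular.z = KP * err + KD * derr # yaw towards the vanishing point
        self.cmd_pub.publish(cmd)
```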

The Git repository can be found here.

Reference publication: Előd Páll, Koppány Máthé, Levente Tamás, Lucian Bușoniu, "Railway Track Following with the AR.Drone Using Vanishing Point Detection," IEEE Int. Conf. on Automation, Quality and Testing, Robotics (AQTR), 2014. (pdf link)