Sensor fusion and machine learning for drone detection and classification

Title Sensor fusion and machine learning for drone detection and classification
Summary Sensor fusion and machine learning for drone detection and classification
Keywords
TimeFrame Spring 2020
References
Prerequisites
Author
Supervisor Cristofer Englund, Eren Erdal Aksoy, Fernando Alonso-Fernandez
Level Master
Status Ongoing


Development and construction of an embedded system that uses a small set of sensors, sensor fusion and machine learning to detect, track and classify objects in the air, especially drones.
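
As an illustration of the tracking part, below is a minimal sketch of one common approach: a constant-velocity Kalman filter that maintains a 2-D track from position measurements. The state layout, time step and noise levels are illustrative assumptions, not a prescribed design.

```python
# Minimal sketch: constant-velocity Kalman filter for a 2-D air-object track.
# All parameters (dt, Q, R, initial state) are illustrative assumptions.
import numpy as np

dt = 0.1                      # assumed time between measurements [s]
F = np.array([[1, 0, dt, 0],  # state transition for state [x, y, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],   # only position (x, y) is measured
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)          # assumed process noise
R = 1.0 * np.eye(2)           # assumed measurement noise

x = np.zeros(4)               # initial state estimate
P = 10.0 * np.eye(4)          # initial state covariance

def kalman_step(x, P, z):
    """One predict/update cycle for a new position measurement z = [x, y]."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - H @ x                    # innovation
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

for z in [np.array([1.0, 0.5]), np.array([1.2, 0.7]), np.array([1.5, 0.9])]:
    x, P = kalman_step(x, P, z)
print("estimated position:", x[:2], "estimated velocity:", x[2:])
```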

The output from the system could be track data in 1, 2 or 3 dimensions (depending on the sensors used) and a classification into one of the following classes: aircraft, helicopter, drone, bird or unknown.
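
As an illustration only, a single output record following this description could be represented as below; the field names and types are assumptions, not a fixed interface.

```python
# Minimal sketch of one system output record: a track point plus a class label.
from dataclasses import dataclass
from typing import Tuple

CLASSES = ("aircraft", "helicopter", "drone", "bird", "unknown")

@dataclass
class TrackReport:
    track_id: int
    timestamp: float              # seconds since epoch (assumed convention)
    position: Tuple[float, ...]   # 1-, 2- or 3-D, depending on the sensor set
    label: str                    # one of CLASSES
    confidence: float             # classifier confidence in [0, 1]

report = TrackReport(track_id=1, timestamp=1586349000.0,
                     position=(120.0, 45.0, 30.0),
                     label="drone", confidence=0.87)
```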

Since a set of simpler sensors gives a shorter detection range, the focus is on detecting drones and distinguishing them from birds. The initial part of the master's thesis will survey current research as well as available technological equipment and solutions.

Such a system could bring both security and financial benefits, e.g. reducing the time an airport has to halt air traffic after a drone is spotted.

The subjects to be studied may include:

  • Which is the best choice of sensors (camera, thermal, radar, transponder data (ADS-B), sound, signal detection (WiFi))?
  • What is the most efficient sensor configuration in terms of quantity and deployment (e.g. is it better to co-locate all sensors, or to use a single camera and three sound sensors deployed some distance apart)?
  • Which types of machine learning algorithms are best suited for the task, and how should training data be collected for them? (One possible fusion approach is sketched after this list.)
  • What kind of hardware is needed to do the calculations and sensor fusion?
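
One candidate approach to the fusion and classification questions above is decision-level (late) fusion, where each sensor drives its own classifier and the resulting class-probability vectors are combined. The sketch below assumes a camera, an audio sensor and a radar with hand-picked reliability weights; all sensor names, weights and probabilities are illustrative assumptions, not results.

```python
# Minimal sketch of decision-level (late) sensor fusion over per-sensor
# class-probability vectors. All numbers below are illustrative assumptions.
import numpy as np

CLASSES = ("aircraft", "helicopter", "drone", "bird", "unknown")

# Hypothetical per-sensor class probabilities (each row sums to 1).
scores = {
    "camera": np.array([0.05, 0.05, 0.60, 0.25, 0.05]),
    "audio":  np.array([0.10, 0.10, 0.50, 0.20, 0.10]),
    "radar":  np.array([0.15, 0.05, 0.45, 0.25, 0.10]),
}

# Assumed per-sensor reliability weights (could instead be learned from
# validation data).
weights = {"camera": 0.5, "audio": 0.2, "radar": 0.3}

fused = sum(weights[s] * p for s, p in scores.items())
fused /= fused.sum()   # renormalise to a probability distribution

print("fused probabilities:", dict(zip(CLASSES, fused.round(3))))
print("decision:", CLASSES[int(np.argmax(fused))])
```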