Lighting up the bicycle roads with drones

Title: Lighting up the bicycle roads with drones
Summary: Lighting up the bicycle roads with drones
Keywords: Drone, machine learning, obstacle detection
Supervisor: Cristofer Englund, Lei Chen
Level: Master
Status: Open

Scope: The thesis is part of an ongoing project to develop drone-based lighting solutions that follow a pedestrian or cyclist and light up the road ahead. The aim is to develop a drone, and/or a swarm of drones, with vision-based tracking algorithms that can recognize a person, keep a predefined distance and angle, and follow that person while walking or biking. The vision-based algorithm should be combined with GPS coordinates to allow for a more robust solution. Successful development will be considered for larger tests in realistic environments.
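The "keep a predefined distance and angle" behaviour could, for example, start from a simple proportional controller on the output of a pedestrian detector. The sketch below is illustrative only: the detector interface (a bounding box centre and height), the gains, and the box-height-to-distance mapping are all assumptions, not project specifications.

```python
from dataclasses import dataclass


@dataclass
class FollowCommand:
    yaw_rate: float       # rad/s, positive = turn right
    forward_speed: float  # m/s, positive = move toward the target


def follow_command(box_cx: float, box_h: float,
                   frame_w: int = 640,
                   target_h: float = 120.0,
                   k_yaw: float = 0.004,
                   k_dist: float = 0.02) -> FollowCommand:
    """Keep the detected person centred and at a fixed apparent size.

    box_cx   horizontal centre of the detection box (pixels)
    box_h    box height in pixels, used as a proxy for distance
    target_h desired box height, i.e. the predefined following distance
    """
    # Horizontal error: how far the target is from the image centre.
    yaw_error = box_cx - frame_w / 2.0
    # Distance error: a box smaller than target_h means we are too far away.
    dist_error = target_h - box_h
    return FollowCommand(yaw_rate=k_yaw * yaw_error,
                         forward_speed=k_dist * dist_error)


# Person detected right of centre (400 > 320) and smaller than the
# target size (100 < 120): the drone should turn right and move closer.
cmd = follow_command(box_cx=400.0, box_h=100.0)
```

In a real system these commands would be sent through the flight-control stack (e.g. ArduPilot's velocity set-points) rather than applied directly, and the gains tuned in simulation first.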

The drones could be based on commercial drones with adaptations, or be built from scratch. Open-source software for drone control, such as ArduPilot, is expected to be used. Simulation is expected to be done at an early stage in an environment such as AirSim or similar. Computer-vision algorithms are expected to be deployed on an NVIDIA Jetson or similar platform.

Tasks: The thesis will consist of the following major tasks and milestones.
• Overall thesis planning and milestones
• State of the art of drone solutions, to decide on the drone components and the hardware and software needed
• State of the art of vision-based recognition and tracking algorithms and their implementations
• Design and train models for pedestrian recognition and tracking from the viewpoint of a drone, and verify the models through simulation
• Assemble the drone, deploy the algorithm, and conduct a demonstration and data analysis

Requirements: The candidates are required to be hands-on with development related to drones and robots. They need experience with machine learning frameworks such as TensorFlow or PyTorch for model training, as well as with Python scripting, e.g., for deploying the algorithms onboard and controlling the drones.

Contacts: Cristofer Englund ( )
Industry contact: Lei Chen ( )