


Master's and Bachelor's Students

Project: Collaborative mapping for object identification and inspection using UAV swarms

BSc students: Kevork Christian Dounato, Jens Høigaard Jensen, Kristoffer Plagborg Bak Sørensen

Collaborative and swarm robotics generally use nature-inspired algorithms to localize, plan and execute specific tasks with UAVs working in unison. This project addresses one area of swarm robotics: collaborative mapping for the inspection of 3D structures.

Project: Payload carrying with multiple quadrotor UAVs

BSc student: David Hjortshøj Felsager

Quadrotor UAVs have a small payload capacity, but they can extend it by working in tandem with other drones. The main challenge of this project is to create a method for controlling multiple drones while they carry heavy objects together.

Project: Inspection Aerial Vehicle design, build and control

BSc student: Andreas Nørtoft Jørgensen

The main motivation behind this project is to build a small-size drone that can fly inside the drone cage at the Deep Tech Experimental Hub. The drone should use a basic controller to follow a designated trajectory as accurately as possible.

Project: Safe trajectory planning with deep imitation learning for autonomous drone racing

MSc student: Micha Heiß

This Master's thesis aims to apply deep imitation learning to the challenge of autonomous drone racing based on stereo vision. The goal is to plan safe trajectories in an online scenario. The deep learning model will learn how to traverse a complex drone racing course at high speed while avoiding collisions. The inputs of the model will include a stereo video feed, depth information and inertial measurement unit (IMU) data. The output of the model is a set of actions for the drone to take as a local trajectory. The model will be trained in simulation and tested both in simulation and in the real world.
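The training principle behind deep imitation learning — fitting a policy to expert state-action pairs — can be sketched as behavior cloning. The snippet below is a minimal illustration with synthetic data and a linear policy (both assumptions for brevity; the thesis targets a deep network fed by stereo images, depth and IMU data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the perception stack: each "observation" is a flat
# feature vector (in the thesis this would come from stereo images,
# depth maps and IMU readings).
n_obs, obs_dim, act_dim = 256, 16, 3   # act_dim: local-trajectory action
W_expert = rng.normal(size=(obs_dim, act_dim))
observations = rng.normal(size=(n_obs, obs_dim))
expert_actions = observations @ W_expert   # expert demonstrations to imitate

# Behavior cloning: minimize the mean squared error between the
# policy's actions and the expert's actions by gradient descent.
W = np.zeros((obs_dim, act_dim))
for _ in range(500):
    pred = observations @ W
    grad = 2.0 / n_obs * observations.T @ (pred - expert_actions)
    W -= 0.1 * grad

final_loss = np.mean((observations @ W - expert_actions) ** 2)
print(f"imitation loss: {final_loss:.6f}")
```

With matched policy and expert classes the loss drives to (near) zero; with a deep network and real sensor data the same loop applies, only the model and optimizer change.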


PhD student: Ilker Bozcan

Term: December 2018 - December 2021

Learning-based Anomaly Detection for Aerial Surveillance

Anomaly detection is the classification of objects and events that are labeled as suspicious. Autonomous surveillance systems should be aware of which entities in the environment are anomalous. In this project, we will develop an autonomous surveillance system using cost-effective visual sensors and state-of-the-art deep-learning object detectors.

The system is suited for anomaly detection on both aerial and ground data. Unmanned Aerial/Ground Vehicles (UAVs/UGVs) are possible robotic platforms for operating such anomaly detection systems. As one use case, UGVs will be used for plant classification, where weeds are considered anomalous. In another use case, UAVs will be used for flying-object detection, where other drones are marked as anomalous.

MSc student: Jakob Grimm Hansen

Jakob performed research in the area of motion planning for autonomous inspection tasks. The idea was to remove the need for human inspection and intervention in hazardous tasks. Jakob concentrated on planning an inspection route around a shipping vessel using an autonomous drone. The planning was performed on-board so that the drone could adapt to dynamically changing environments.

MSc student: Kristoffer Fogh Andersen

I am an MSc student in Computer Technology at Aarhus University, currently in my 3rd semester. I received a BSc in Electrical Engineering from Aarhus School of Engineering in June 2019. I conducted a Research & Development project at AiR Lab, where I explored, implemented and tested various modern control methods for quadrotors. Through my involvement with AiR Lab I gained valuable practical experience in my areas of interest, which include robotics, control theory, machine learning and computer vision.

MSc student: Hanshuo Yang

Gate perception has drawn considerable research effort because it plays a vital role in drone racing. In this Master's thesis, we will mainly focus on the gate perception problem, even though planning and control are basic parts of the solution. To recognize and localize the gate, a convolutional neural network will be used to detect the gate center, while an RGB-D camera will provide the distance and orientation of the gate. Furthermore, we will consider partially occluded and partially visible gates in our experiments. Another problem we will look into is gate detection accuracy under illumination changes. Last but not least, we will also consider multiple gates in the image.
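Once the CNN has predicted the gate-center pixel, the RGB-D depth reading at that pixel can be back-projected into a 3-D position using a pinhole camera model. The sketch below uses hypothetical camera intrinsics (`fx`, `fy`, `cx`, `cy`) and a single detection; it illustrates the geometry only, not the thesis code:

```python
import numpy as np

def gate_center_to_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project the detected gate-center pixel (u, v) with its
    depth reading into a 3-D point in the camera frame (pinhole model)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Hypothetical intrinsics for a 640x480 RGB-D camera.
fx = fy = 460.0
cx, cy = 320.0, 240.0

# Suppose the CNN predicts the gate center at pixel (400, 200) and the
# aligned depth image reports 3.2 m at that pixel.
p_cam = gate_center_to_3d(400, 200, 3.2, fx, fy, cx, cy)
print(p_cam)
```

Gate orientation can then be estimated the same way by back-projecting several points on the gate frame and fitting a plane through them.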

MSc students: Daniel Tøttrup & Stinus Skovgaard

Visual segmentation and recognition of moving objects for drone inspection application.

The aim of this project is to investigate the use of cutting-edge machine learning and image-understanding techniques to optimize UAV-based applications. Daniel and Stinus are working on the project jointly.

BSc student: Nicklas Vraa

Designing and building agile UAVs.

Visiting MSc student, Politecnico di Milano: Deniz Bardakci

The goal of my project is a fully autonomous, intelligent agriculture drone. During its field-coverage duties, it should avoid obstacles by re-planning its local trajectory. For this purpose, we are developing end-to-end planning algorithms at the AiR Lab of Aarhus University. We use the Webots robot simulator to build a test environment suitable for learning from a drone's depth and crash information. Currently, we are evaluating the performance of the algorithm in simulation, and in parallel I am building the real test drone.
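One simple baseline for the local re-planning step is to split the depth image into sectors and steer toward the sector with the most clearance. This is a minimal hand-written sketch, not the project's learned planner:

```python
import numpy as np

def pick_sector(depth_image, min_clearance=1.5):
    """Split the depth image into left/centre/right sectors and return
    the sector to steer toward: centre if the path ahead is clear,
    otherwise the sector with the largest minimum depth (clearance)."""
    h, w = depth_image.shape
    sectors = np.array_split(np.arange(w), 3)            # left, centre, right
    clearance = [depth_image[:, s].min() for s in sectors]
    if clearance[1] >= min_clearance:
        return "centre"                                  # keep flying straight
    return ("left", "centre", "right")[int(np.argmax(clearance))]

# Synthetic 48x64 depth frame: an obstacle 0.8 m ahead in the centre,
# moderate clutter on the right, open space on the left.
depth = np.full((48, 64), 10.0)
depth[:, 24:40] = 0.8
depth[:, 44:] = 6.0
print(pick_sector(depth))
```

A learned end-to-end planner replaces this hand-tuned rule with a network trained on depth frames and crash outcomes, but the interface (depth in, steering decision out) is the same.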

MSc Student and Research Assistant: Theo Martin Morales

The fast and the furious: Artificial Intelligence-based autonomous pilots for drone racing

As intelligent robots become more prominent in the transport and entertainment industries, researchers go to great lengths to bring human-like maneuvering skills to micro unmanned aerial vehicles. Drone racing is a great testbed for that problem: it is a competitive sport in which autonomous drones race against each other while also trying to beat professional pilots. Artificial intelligence, and more specifically computer vision, might be a game changer for this sector, where traditional rule-based approaches suffer from their rigidity. In this thesis, instead of dealing with the perception, planning and control problems separately, a deep learning approach is used: RGB images serve as input to a deep convolutional neural network, which outputs reference velocities for the drone along the x-, y- and z-axes. To keep track of the drone's velocity at all times, visual-inertial odometry is utilized.
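The last step — tracking the network's reference velocities using the velocity estimate from visual-inertial odometry — can be illustrated with a simple proportional controller. The gain and control period below are assumptions for illustration; the thesis's inner-loop controller may differ:

```python
import numpy as np

# The network outputs reference velocities (vx, vy, vz); visual-inertial
# odometry supplies the current velocity estimate. A proportional
# controller closes the loop between the two.
KP = 2.0    # proportional gain on the velocity error (assumed)
DT = 0.02   # control period in seconds (assumed, i.e. 50 Hz)

def velocity_step(v_est, v_ref):
    """One control step: command an acceleration proportional to the
    velocity error and integrate it over one control period."""
    accel_cmd = KP * (v_ref - v_est)
    return v_est + accel_cmd * DT

v_ref = np.array([2.0, 0.0, -0.5])   # network's reference velocity
v = np.zeros(3)                      # drone starts at rest
for _ in range(200):                 # 4 s of closed-loop flight
    v = velocity_step(v, v_ref)

print(v)  # converges toward v_ref
```

In flight, `v_est` would come from the VIO pipeline each cycle rather than from integrating the commanded acceleration, and the acceleration command would be mapped to attitude and thrust setpoints.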

PhD Student: Mohit Mehndiratta

Term: August 2018 - August 2019

Mohit Mehndiratta was born in New Delhi, India, on June 28, 1990. He received a B.Tech. degree in aerospace engineering from Amity University, Uttar Pradesh, India, in 2012. In August 2015, he received an M.Sc. degree in aerospace engineering jointly offered by Technische Universität München, Germany, and Nanyang Technological University, Singapore. He is currently a doctoral student pursuing his research at Nanyang Technological University's School of Mechanical and Aerospace Engineering. His research focus includes nonlinear model predictive control (NMPC), moving horizon estimation (MHE) and unmanned aerial vehicles (UAVs); he is currently working on the design of a VTOL fixed-wing UAV incorporating an NMPC-MHE control framework.

PhD Student: Andriy Sarabakha

Term: August 2018 - August 2019

Andriy Sarabakha was born in Lviv, Ukraine, on November 16, 1986. He received a B.Sc. degree in computer engineering in 2012 from "Sapienza" Università di Roma, Rome, Italy, as well as an M.Sc. degree in artificial intelligence and robotics in 2015 from the same university. Since January 2016, he has been pursuing his research as a Ph.D. student at Nanyang Technological University (NTU), Singapore, at the School of Mechanical and Aerospace Engineering. His research areas are unmanned aerial vehicles, artificial intelligence and fuzzy logic.