
People

Master and Bachelor Students

BSc student: Amalie Kaastrup-Hansen

PhD supervisors: Amir Ramezani Dooraki, Halil Ibrahim Ugurlu

Multi-Task Learning for Autonomous Systems

The goal of this project is to design and train a multi-task learning (MTL) algorithm for image processing that supports an autonomous navigation algorithm. The trained MTL algorithm will be tested in a Webots simulation environment and may optionally be deployed on a physical robot using reinforcement learning.
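
As an illustration of what such a network could look like, here is a minimal multi-task sketch in PyTorch; the two tasks (depth regression and obstacle segmentation), the tensor shapes, and the loss weighting are assumptions for illustration, not the project's actual design.

```python
# A minimal multi-task network sketch: a shared convolutional encoder with one
# head per task, both of which could feed a downstream navigation policy.
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(             # shared image features
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.depth_head = nn.Conv2d(64, 1, 1)      # task 1: per-pixel depth
        self.seg_head = nn.Conv2d(64, 2, 1)        # task 2: obstacle / free space

    def forward(self, x):
        feats = self.backbone(x)
        return self.depth_head(feats), self.seg_head(feats)

# Joint training minimises a weighted sum of the per-task losses (weights are illustrative).
net = MultiTaskNet()
img = torch.randn(4, 3, 128, 128)
depth_pred, seg_pred = net(img)
depth_gt = torch.rand(4, 1, 32, 32)
seg_gt = torch.randint(0, 2, (4, 32, 32))
loss = nn.functional.l1_loss(depth_pred, depth_gt) \
     + 0.5 * nn.functional.cross_entropy(seg_pred, seg_gt)
loss.backward()
```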


BSc student: Malthe Raschke Nielsen

PhD supervisor: Huy Xuan Pham

Autonomous Inspection Robots Using (Un)conventional Sensors

This project aims to train robots in simulation using real-world data collected by event-based cameras. Through object detection, the trained robots will then be transferred to a real-world application.


BSc students: Steven James Hughes, Andreas Calonius Kreth

PhD supervisor: Olaya Alvarez Tunon

Have I been here before? Using cameras for visual place recognition in underwater environments

The aim of this project is to survey open-source, state-of-the-art place-recognition algorithms, implement them in a simulated environment, and compare their performance to determine which are best suited for underwater use.
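
For context, most place-recognition pipelines share a retrieval step like the sketch below. The `describe` callable is a hypothetical stand-in for any global image descriptor (hand-crafted or learned), and the similarity threshold is an illustrative value, not one from the project.

```python
# Place-recognition retrieval sketch: match a query frame against a database of
# previously visited places by global-descriptor similarity.
import numpy as np

def build_database(frames, describe) -> np.ndarray:
    """Stack L2-normalised global descriptors for all mapped frames."""
    descs = np.stack([describe(f) for f in frames])
    return descs / np.linalg.norm(descs, axis=1, keepdims=True)

def recognise_place(query, database: np.ndarray, describe, threshold: float = 0.8):
    """Return (best_index, similarity) if the query matches a known place, else None."""
    q = describe(query)
    q = q / np.linalg.norm(q)
    sims = database @ q                      # cosine similarity to every mapped frame
    best = int(np.argmax(sims))
    return (best, float(sims[best])) if sims[best] >= threshold else None
```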


BSc student: Martin Harder Bruun

PhD supervisor: Olaya Alvarez Tunon

Learning-based VO: What am I learning? Metrics for assessing data selection for VO training

This project aims to assess the information contained in the datasets used for training visual odometry (VO) pipelines. To this end, a set of metrics to apply to the datasets will be developed. The aim of these metrics is twofold: firstly, to assess the quality and variability of the available data, and secondly, to establish a criterion for data splitting in learning-based VO. A learning-based VO algorithm will be selected and a model trained following these criteria. The results inferred from that model will be compared with a model obtained under random data selection, to verify the validity of the developed criteria.
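
As a small illustrative example of the kind of metric the project could develop (an assumption, not the project's actual metric set), per-frame motion content can be measured as mean optical-flow magnitude, which helps flag sequences with too little or too extreme camera motion for VO training.

```python
# Motion-content metric sketch for a VO training sequence, using dense optical flow.
import cv2
import numpy as np

def mean_flow_magnitude(prev_gray: np.ndarray, gray: np.ndarray) -> float:
    """Average dense optical-flow magnitude between two consecutive frames."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    return float(np.mean(np.linalg.norm(flow, axis=2)))

def sequence_motion_profile(frames: list[np.ndarray]) -> np.ndarray:
    """Motion metric for every consecutive frame pair in a sequence of BGR images."""
    grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in frames]
    return np.array([mean_flow_magnitude(a, b) for a, b in zip(grays, grays[1:])])
```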


MSc student: David Felsager

PhD supervisor: Olaya Alvarez Tunon

Design and control of a mini ship propeller cleaning AUV

Regular cleaning of ship hulls is required to protect them from biofouling and to maintain their hydrodynamic efficiency. Existing cleaning robots are usually large and require an experienced pilot, making the cleaning process inefficient and infrequent. The aim of this project is to develop a small, low-cost robot that enables autonomous cleaning of ship propellers, so that cleaning can be done more frequently and reliably, at low cost and with low energy consumption.


BSc students: Kasper Møller Hansen, Jonathan Boel Nielsen

PhD supervisor: Huy Xuan Pham

Robot Localization Using an Ultra-Wideband System

This project aims to analyze, implement, and evaluate an ultra-wideband (UWB) system for robot localization in terms of applicability and precision, in comparison with at least two other methods of position and orientation tracking (motion capture and SLAM).
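
For orientation, the core of UWB localization is multilateration from anchor ranges. The sketch below is a minimal linearized least-squares version with an illustrative anchor layout and noise level; it is not the project's actual pipeline.

```python
# UWB multilateration sketch: estimate a tag position from known anchor
# positions and measured tag-to-anchor ranges.
import numpy as np

def multilaterate(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Estimate a 3D position from >= 4 anchor positions and range measurements."""
    a0, d0 = anchors[0], ranges[0]
    # Subtract the first sphere equation from the others to obtain a linear system.
    A = 2.0 * (anchors[1:] - a0)
    b = (d0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Example: four anchors in a room (metres) and noisy ranges to a tag at (2, 1, 0.5).
anchors = np.array([[0, 0, 0.3], [5, 0, 2.5], [5, 4, 0.3], [0, 4, 2.5]], float)
true_pos = np.array([2.0, 1.0, 0.5])
ranges = np.linalg.norm(anchors - true_pos, axis=1) + np.random.normal(0, 0.03, 4)
print(multilaterate(anchors, ranges))  # ~[2.0, 1.0, 0.5], up to range noise
```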


BSc students: Frederik Toft, Aske Alstrup

PhD supervisor: Abdel Hakim Khaled Saad Amin Amer

Control System for Underwater Drone Navigation

This project addresses the control packages needed for the underwater drone: the drone needs a fully working control software/hardware system that makes it possible to ‘hover’ at a given distance from the seabed.
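
A minimal sketch of such a depth-hold loop is shown below, assuming a downward-facing range sensor and a vertical thrust command; the gains and interfaces are hypothetical and would need tuning on the real vehicle.

```python
# Seabed-distance hold sketch: a PID loop that regulates the drone's measured
# distance to the seabed by commanding vertical thrust.
class PID:
    def __init__(self, kp, ki, kd, out_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-self.out_limit, min(self.out_limit, out))  # saturate the command

# Hypothetical gains and target; real values depend on the vehicle.
depth_hold = PID(kp=40.0, ki=2.0, kd=15.0, out_limit=30.0)
TARGET_ALTITUDE = 1.5  # desired distance above the seabed [m]

def control_step(range_to_seabed: float, dt: float) -> float:
    """Return a vertical thrust command [N] from one range-sensor reading."""
    error = TARGET_ALTITUDE - range_to_seabed
    return depth_hold.update(error, dt)
```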


MSc students: Jens Høigaard Jensen, Kristoffer Plagborg Bak Sørensen

PhD supervisor: Jonas le Fevre Sejersen

Mobile Robotic Localization in Dynamic Environments

This project aims to improve localization in highly dynamic environments by incorporating local communication and collaboration between the actors.

Alumni

MSc student: Micha Heiß

Safe end-to-end trajectory planning with deep imitation learning for autonomous drone racing: with focus on exploring additional inputs

This master's thesis aims to apply deep imitation learning to the challenge of autonomous drone racing based on stereo vision. The goal is to plan safe trajectories in an online scenario. The deep learning model will learn how to traverse a complex drone racing course at high speed while avoiding collisions. The inputs of the model will include a stereo video feed, depth information, and inertial measurement unit (IMU) data. The output of the model is a set of actions for the drone to take as a local trajectory. The model will be trained in simulation and tested both in the simulation and in the real world.
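
A rough sketch of the kind of multi-modal imitation-learning network described above is given below; the architecture, tensor shapes, and behaviour-cloning loss are assumptions for illustration, not the thesis implementation.

```python
# Stereo RGB + depth are encoded by a small CNN, fused with IMU readings, and
# regressed to a short local trajectory of actions.
import torch
import torch.nn as nn

class RacingPolicy(nn.Module):
    def __init__(self, horizon: int = 5, action_dim: int = 4):
        super().__init__()
        # 7 input channels: left RGB (3) + right RGB (3) + depth (1).
        self.encoder = nn.Sequential(
            nn.Conv2d(7, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.imu_encoder = nn.Sequential(nn.Linear(6, 32), nn.ReLU())
        self.head = nn.Sequential(
            nn.Linear(128 + 32, 128), nn.ReLU(),
            nn.Linear(128, horizon * action_dim),
        )
        self.horizon, self.action_dim = horizon, action_dim

    def forward(self, stereo_depth, imu):
        feat = torch.cat([self.encoder(stereo_depth), self.imu_encoder(imu)], dim=1)
        return self.head(feat).view(-1, self.horizon, self.action_dim)

# Behaviour cloning: regress the expert's local trajectory.
policy = RacingPolicy()
images = torch.randn(8, 7, 120, 160)   # batch of stereo + depth stacks
imu = torch.randn(8, 6)                # gyro + accelerometer readings
expert = torch.randn(8, 5, 4)          # expert actions over the horizon
loss = nn.functional.mse_loss(policy(images, imu), expert)
loss.backward()
```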


MSc student: Tobias Damm-Henrichsen

Take-off and landing of UAVs from a moving platform

Quadrotor unmanned aerial vehicles (UAVs) have short flight times due to their limited power source. To expand their capabilities, a landing platform (mothership) is needed that houses and deploys them on demand, and the UAVs must be able to land on and take off from this platform when necessary. The project therefore requires the development of simulation and real-world platforms to demonstrate take-off and landing in challenging conditions. The platform's position (x, y, z) and angular velocity should be disturbed to simulate the surface of a ship or another moving platform. The drone itself can also be disturbed in other ways to simulate changing wind conditions.
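
A minimal sketch of such a platform disturbance is shown below; the wave frequencies, amplitudes, and noise levels are illustrative assumptions, not values from the project.

```python
# Moving-platform disturbance sketch: sinusoidal surge/sway/heave plus noisy
# angular rates, producing the pose to command a simulated landing platform with.
import numpy as np

def platform_pose(t: float, rng: np.random.Generator):
    """Disturbed platform state at time t: position (x, y, z) and angular rates."""
    x = 0.3 * np.sin(0.4 * t)                         # slow surge drift [m]
    y = 0.2 * np.sin(0.3 * t + 1.0)                   # sway [m]
    z = 0.25 * np.sin(1.1 * t) + rng.normal(0, 0.01)  # wave heave [m]
    roll_rate = 0.15 * np.cos(1.1 * t) + rng.normal(0, 0.01)   # [rad/s]
    pitch_rate = 0.10 * np.cos(0.9 * t) + rng.normal(0, 0.01)  # [rad/s]
    return np.array([x, y, z]), np.array([roll_rate, pitch_rate, 0.0])

position, angular_rates = platform_pose(0.5, np.random.default_rng(0))
```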


BSc students: Kevork Christian Dounato, Jens Høigaard Jensen, Kristoffer Plagborg Bak Sørensen

Collaborative mapping for object identification and inspection using UAV swarms

Collaborative and swarm robotics generally rely on algorithms inspired by nature to localize, plan, and execute specific tasks using UAVs working in unison. This project addresses one area of swarm robotics: collaborative mapping for the inspection of 3D structures.


BSc student: David Hjortshøj Felsager

Payload carrying with multiple quadrotor UAVs

Quadrotor UAVs have a small payload capacity, which they can extend by working in tandem with other drones: to carry heavier payloads, UAVs have to collaborate. The main challenge of this project is to create a method of controlling multiple drones when moving heavy objects.


BSc student: Andreas Nørtoft Jørgensen

Inspection Aerial Vehicle design, build and control

The main motivation behind this project is to build a small drone that can fly inside the drone cage at the Deep Tech Experimental Hub. The drone should use a basic controller to follow a designated trajectory as accurately as possible.


PhD student: Ilker Bozcan

Term: December 2018 - December 2021

Learning-based Anomaly Detection for Aerial Surveillance

Anomaly detection is the classification of objects and events that are labeled as suspicious. Autonomous surveillance systems should be aware of which entities in the environment are anomalous. During this project, we will develop an autonomous surveillance system using cost-effective visual sensors and state-of-the-art deep neural networks for object detection.

The system is suited for anomaly detection on both aerial and ground data. Unmanned aerial/ground vehicles (UAVs/UGVs) are possible robotic platforms on which to operate anomaly detection systems. As one use case, UGVs will be used for plant classification, where weeds are considered anomalies. In another use case, UAVs will be used for flying-object detection, where other drones are marked as anomalies.
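
A minimal sketch of the anomaly-flagging step on top of an object detector is given below; the detection structure, the confidence threshold, and the per-scene "expected" class lists are illustrative assumptions, not the project's implementation.

```python
# Anomaly flagging sketch: anything a detector labels with a class that is not
# expected in the current scene is reported as an anomaly.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    score: float
    box: tuple  # (x1, y1, x2, y2) in pixels

# Expected classes differ per use case: crops for the UGV weed scenario,
# birds for the UAV airspace scenario (illustrative list).
EXPECTED = {"crop_plant", "soil", "bird"}

def find_anomalies(detections: list[Detection], min_score: float = 0.5) -> list[Detection]:
    """Keep confident detections whose class is not expected in this scene."""
    return [d for d in detections if d.score >= min_score and d.label not in EXPECTED]

# Example with hypothetical detector output:
frame_detections = [
    Detection("crop_plant", 0.93, (10, 20, 60, 90)),
    Detection("weed", 0.81, (120, 40, 160, 95)),
    Detection("drone", 0.77, (300, 10, 340, 40)),
]
for anomaly in find_anomalies(frame_detections):
    print(f"anomaly: {anomaly.label} at {anomaly.box} (score {anomaly.score:.2f})")
```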


MSc student: Jakob Grimm Hansen

Jakob performed research in the area of motion planning in autonomous inspection tasks. The idea was to remove the need for human inspection and intervention for hazardous tasks. Jakob concentrated on planning an inspection route around a shipping vessel using an autonomous drone. The planning task was performed on-board, in order for the drone to be able to adapt to dynamically changing environments.


MSc student: Kristoffer Fogh Andersen

I am an MSc student in Computer Technology at Aarhus University, currently in my 3rd semester. I received a BSc in Electrical Engineering from Aarhus School of Engineering in June 2019. I conducted a Research & Development project at AiR Lab, where I explored, implemented, and tested various modern control methods for quadrotors. Through my involvement with AiR Lab I gained valuable practical experience in my areas of interest, which include robotics, control theory, machine learning, and computer vision.


MSc student: Hanshuo Yang

Gate perception has drawn considerable research effort because it plays a vital role in drone racing. In this Master's thesis, we will mainly focus on the gate perception problem, even though planning and control are basic parts of the solution. To recognize and localize the gate, a convolutional neural network will be used to detect the gate center, while an RGB-D camera will provide the distance and orientation of the gate. Furthermore, we will consider partially occluded and partially visible gates in our experiments. Another problem we will look into is gate detection accuracy under illumination changes. Last but not least, we will also consider multiple gates in the image.
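
A small sketch of the geometry behind this CNN-plus-RGB-D combination is shown below; the pinhole intrinsics and the `gate_center_net` callable are hypothetical placeholders, not the thesis code.

```python
# Gate localization sketch: the CNN predicts the gate centre in pixel coordinates,
# and the aligned depth image is used to back-project it to a 3D point in the
# camera frame.
import numpy as np

# Illustrative pinhole intrinsics; real values come from camera calibration.
FX, FY, CX, CY = 460.0, 460.0, 320.0, 240.0

def backproject(u: float, v: float, depth_m: float) -> np.ndarray:
    """Pixel (u, v) with metric depth -> 3D point (x, y, z) in the camera frame."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])

def locate_gate(rgb: np.ndarray, depth: np.ndarray, gate_center_net) -> np.ndarray:
    """Detect the gate centre in the RGB image and return its 3D position."""
    u, v = gate_center_net(rgb)                                        # predicted pixel coordinates
    d = float(np.median(depth[int(v)-2:int(v)+3, int(u)-2:int(u)+3]))  # robust local depth
    return backproject(u, v, d)
```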


MSc students: Daniel Tøttrup & Stinus Skovgaard

Visual segmentation and recognition of moving objects for drone inspection applications.

The aim of this project is to investigate the use of cutting-edge machine learning and image understanding techniques to optimize UAV-based applications. Daniel and Stinus are working on the project jointly.


BSc student: Nicklas Vraa

Design and building of agile UAVs.


Visiting MSc student, Politecnico Di Milano: Deniz Bardakci

The goal of my project is to obtain a fully autonomous intelligent agriculture drone. As part of this goal, during its field-coverage duties the drone should avoid obstacles by re-planning its local trajectory. For this purpose, we develop end-to-end planning algorithms at the AiR Lab of Aarhus University. We use the Webots robot simulator to develop a test environment suitable for learning from a drone's depth and crash information. Currently, we are evaluating the performance of the algorithm in simulation, and in parallel I am building the real test drone.


MSc Student and Research Assistant: Theo Martin Morales

The fast and the furious: Artificial Intelligence-based autonomous pilots for drone racing


As intelligent robots become more prominent in the transport and entertainment industries, researchers go to great lengths to bring human-like maneuvering skills to micro unmanned aerial vehicles. Drone racing is a great application for that problem, as it is a competitive sport in which autonomous drones race against each other while also trying to beat professional pilots. Artificial intelligence, and more specifically computer vision, might be a game changer for this sector, where traditional rule-based approaches suffer from their rigidity. In this thesis, instead of dealing with the perception, planning, and control problems separately, a deep learning approach is used: RGB images are fed to a deep convolutional neural network, which outputs reference velocities for the drone along the x-, y-, and z-axes. To keep track of the drone's velocity at all times, visual-inertial odometry is utilized.
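
One way the network's velocity reference could be closed around the VIO estimate is sketched below; the per-axis gains, shapes, and interface are illustrative assumptions, not the thesis implementation.

```python
# Velocity-reference tracking sketch: a simple P-controller on the error between
# the CNN's reference velocity and the VIO velocity estimate.
import numpy as np

KV = np.array([1.2, 1.2, 1.5])  # per-axis velocity-error gains (illustrative)

def acceleration_command(v_ref: np.ndarray, v_vio: np.ndarray) -> np.ndarray:
    """Turn the CNN's (vx, vy, vz) reference and the VIO velocity estimate
    into a body-frame acceleration command for the low-level controller."""
    return KV * (v_ref - v_vio)

# Example: the network requests 3 m/s forward while VIO reports a sideways drift.
print(acceleration_command(np.array([3.0, 0.0, 0.0]), np.array([2.4, 0.3, -0.1])))
```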


PhD Student: Mohit Mehndiratta

Term: August 2018 - August 2019

Mohit Mehndiratta was born in New Delhi, India on June 28, 1990. He received a B.Tech. degree in aerospace engineering from Amity University, Uttar Pradesh, India, in 2012. In August 2015, he received an M.Sc. degree in aerospace engineering jointly offered by Technische Universität München, Germany and Nanyang Technological University, Singapore. He is currently a doctoral student pursuing his research at Nanyang Technological University's School of Mechanical and Aerospace Engineering. His research focus includes nonlinear model predictive control (NMPC), moving horizon estimation (MHE), and unmanned aerial vehicles (UAVs); he is currently working on the design of a VTOL fixed-wing UAV incorporating an NMPC-MHE control framework.


PhD Student: Andriy Sarabakha

Term: August 2018 - August 2019

Andriy Sarabakha was born in Lviv, Ukraine. He received a B.Sc. degree in computer engineering in 2012 from "Sapienza" – Università di Roma, Rome, Italy, as well as an M.Sc. degree in artificial intelligence and robotics in 2015 from the same university. Since January 2016, he has been pursuing his research at the School of Mechanical and Aerospace Engineering of Nanyang Technological University (NTU), Singapore, as a Ph.D. student. His research areas are unmanned aerial vehicles, artificial intelligence, and fuzzy logic.