
Student project proposals

We are currently offering Master's, Bachelor's, and R&D engineering projects within wireless communications, networking, and data analytics.


On-board Monocular and Stereo Depth Estimation for UAV Navigation

The project addresses the problem of depth estimation from a front-facing camera on a resource-limited on-board UAV processor. To save computation and obtain a quick response across the field of view, a learning-based method using a convolutional neural network (CNN) is considered. The depth estimates are expected to be used for reactive, short-term identification of a convex flying corridor.

You will learn CNN architectures and training procedures, and you will gain hands-on experience deploying a neural-network-based algorithm within the Robot Operating System (ROS) on an on-board processor.
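As an illustration of the core building block of such a network, the sketch below implements a single valid-mode 2D convolution in plain Python. This is purely illustrative; a real depth-estimation project would use a framework such as PyTorch or TensorFlow, and all names here are assumptions:

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution (strictly, cross-correlation, as used in
    most deep learning frameworks) over a 2D list-of-lists image."""
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    out = []
    for i in range(ih - kh + 1):          # slide the kernel over every
        row = []                          # position where it fully fits
        for j in range(iw - kw + 1):
            s = 0.0
            for di in range(kh):
                for dj in range(kw):
                    s += image[i + di][j + dj] * kernel[di][dj]
            row.append(s)
        out.append(row)
    return out
```

A depth-estimation CNN stacks many such convolutions (with learned kernels and nonlinearities) to map an input image to a per-pixel depth map.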

Collaboration

This project is a part of the Drones4Energy innovation project, which aims to demonstrate autonomous inspections using UAVs.

Prerequisites

Mathematical modelling, Python programming, Embedded software development.

Contact

Liping Shi (liping@eng.au.dk), Rune Hylsberg Jacobsen (rhj@eng.au.dk)

(Image source: A. Geiger, P. Lenz, C. Stiller, R. Urtasun, Depth Prediction Evaluation)  

Flying Ad Hoc Networks (FANETs)

One of the most important design problems for multi-UAV (Unmanned Aerial Vehicle) systems is communication, which is crucial for cooperation between the UAVs.

The project addresses the problem of forming wireless mesh networks in a swarm of unmanned aerial vehicles (UAVs). Mesh networks are challenged by UAV mobility and interference and must implement self-organizing mechanisms to cope with these dynamics. Important functions of mesh networks are neighbor discovery, path selection, and maintenance of routing information. The study will include experimental work based on the IEEE 802.11s standard, which amends WiFi to support mesh networking. You will also gain hands-on experience with wireless communication on our UAV platform.
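For path selection, 802.11s uses the Hybrid Wireless Mesh Protocol (HWMP), which ranks links by an airtime cost. A minimal sketch of such a metric is shown below; the default overhead and test-frame values are illustrative assumptions (the standard's exact constants are PHY-dependent):

```python
def airtime_metric(rate_mbps, frame_error_rate,
                   overhead_us=335.5, test_frame_bits=8192):
    """Airtime-style link cost as used by HWMP path selection in 802.11s.
    overhead_us is channel-access/protocol overhead (an assumed value here).
    Returns an estimated airtime in microseconds; lower is better."""
    tx_time_us = test_frame_bits / rate_mbps   # bits / (Mbit/s) = microseconds
    return (overhead_us + tx_time_us) / (1.0 - frame_error_rate)
```

A path's cost is the sum of its link metrics, so the protocol prefers routes over fast, reliable links even when they have more hops.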

Collaboration

This project is a part of the Drones4Energy innovation project, which aims to demonstrate autonomous inspections using UAVs.

Prerequisites

Wireless networks, Embedded software development.

Contact

Rune Hylsberg Jacobsen (rhj@eng.au.dk)

Multi-UAV System for Autonomous Exploration

How can an environment be explored with multiple UAVs? The project addresses the problem of multi-agent motion planning for efficiently mapping an unknown environment. It is a sequential decision-making process for each UAV, with a high-dimensional decision space that includes orientation, position, communication actions, and even charge and discharge actions. In other words, the project aims to enable multiple UAVs to know, at any time, which direction to head, where to go, whether they need to communicate with other UAVs, and what to do at different battery levels. The research will involve software development based on ROS (Robot Operating System), the OctoMap mapping library, the PX4 flight control software stack, and the Gazebo robotic simulation platform.
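A common starting point for exploration planning is frontier detection: a UAV heads toward free cells that border unknown space. A minimal sketch on a 2D occupancy grid follows; the cell-value convention is an assumption for illustration (OctoMap stores probabilistic occupancy in 3D instead):

```python
def find_frontiers(grid):
    """Return (row, col) of frontier cells in a 2D occupancy grid.
    Assumed convention: 0 = free, 1 = occupied, -1 = unknown.
    A frontier is a free cell with at least one unknown 4-neighbor."""
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 0:
                continue                      # only free cells can be frontiers
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == -1:
                    frontiers.append((r, c))
                    break
    return frontiers
```

A multi-UAV planner would then assign frontiers to UAVs, trading off travel cost, expected information gain, and battery level.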

Collaboration

This project is a part of the Drones4Energy innovation project, which aims to demonstrate autonomous inspections using UAVs.

Prerequisites

C++ or Python programming, Linux.

Contacts

Liping Shi (liping@eng.au.dk) and Rune Hylsberg Jacobsen (rhj@eng.au.dk)

(Picture reference: Master thesis project, Simon Anh Cao Nguyen)

Inspection Path Planning and 3D Reconstruction

(Figure: 3D point cloud to 3D mesh representation)

The inspection industry has adopted unmanned aerial systems (UAS) as a powerful new tool offering new ways to capture and process data while cutting costs and safety risks.

The project addresses the problem of 3D structural inspection path planning: finding the best UAV viewpoints and a connecting path for a complete inspection of a 3D structure, given a 3D point cloud model or a triangular mesh representation of the structure. 3D image reconstruction also needs to be developed. The path planning should consider the limitations of the UAV dynamics and of its on-board camera, e.g., the field of view (FOV).

(Figure: Sketch path for a complete 3D structure inspection)

The students will start with a UAV simulation platform and learn to use PX4 (UAV flight controller), ROS (Robot Operating System), and Gazebo (robot simulation platform) while simulating the path planning algorithm, and then move on to hardware-in-the-loop simulation and practical testing with a flying UAV.
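Once viewpoints have been selected, connecting them resembles a travelling-salesman problem. The sketch below uses a nearest-neighbour heuristic as a hypothetical stand-in for that step; a real planner would also account for UAV dynamics and obstacle avoidance:

```python
import math

def greedy_tour(start, viewpoints):
    """Order candidate viewpoints with a nearest-neighbour heuristic:
    repeatedly fly to the closest unvisited viewpoint. A simple
    approximation to the TSP-style connecting-path step."""
    tour, pos = [start], start
    remaining = list(viewpoints)
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(pos, p))
        remaining.remove(nxt)
        tour.append(nxt)
        pos = nxt
    return tour
```

The heuristic is fast but not optimal; classical improvements such as 2-opt, or an off-the-shelf TSP solver, can shorten the resulting inspection path.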

Contacts: Liping Shi (liping@eng.au.dk) and Rune H. Jacobsen (rhj@eng.au.dk)

Collaboration

The project is part of the Drones4Energy innovation project. The project may collaborate with another project group that researches 3D point cloud mapping (LiDAR) and free-space recognition.

Prerequisites

Linux, programming, experimental skills

Positioning in Urban Areas for Mobile Robots

The project addresses the problem of high-accuracy positioning in urban areas. GNSS-RTK technology has been integrated into many outdoor applications using mobile robots, autonomous vehicles, and drones. The proposed project aims to evaluate the radio signal impairments of the GNSS-RTK positioning solution caused by interference, fading, and shadowing from the surrounding environment.

The target GNSS-RTK solution is a Network Real Time Kinematic (NRTK) positioning system that receives its RTK service from TAPAS (www.tapasweb.dk) over the Internet. The TAPAS platform consists of a network of 11 reference stations deployed in the city of Aarhus, which improves data from global navigation satellite systems (GNSS) such as GPS and Galileo. The positioning quality (e.g., accuracy and timeliness) may be affected by the communication quality between the on-board dual-band GNSS receiver and the TAPAS platform.

Students working on this project will investigate and characterize the interference sources and the fading and shadowing components, plan the in-field measurements, and evaluate the research assumptions. The project will furthermore investigate adaptive algorithms that let the robot decide when and how to switch from GNSS-based positioning to non-GNSS positioning (e.g., robot state estimation using visual-inertial odometry and wheel odometry, as seen in lunar rovers) while the robot is moving in environments where the GNSS signal is not reliable. To provide explainability for the empirical study, models of the latest GNSS-based positioning solutions will be involved.
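Such a switching policy can be prototyped as a simple rule over the receiver's reported fix quality. The sketch below is a toy decision rule; the fix-type labels and thresholds are illustrative assumptions, not values from the project:

```python
def choose_positioning(fix_type, hdop, num_sats):
    """Toy rule for switching between RTK, standard GNSS, and odometry-based
    state estimation, given the receiver's fix type, horizontal dilution of
    precision (HDOP), and satellite count. All thresholds are assumptions."""
    if fix_type == "rtk_fixed" and hdop < 1.0:
        return "gnss_rtk"                     # centimetre-level fix available
    if fix_type in ("rtk_float", "3d") and hdop < 2.5 and num_sats >= 6:
        return "gnss"                         # usable but degraded GNSS
    return "odometry"                         # fall back to visual-inertial /
                                              # wheel odometry in GNSS-denied areas
```

A practical adaptive algorithm would add hysteresis so the robot does not oscillate between sources when the fix quality hovers near a threshold.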

During the project, students will learn GNSS-RTK technology and the characteristics of GNSS signal transmission, and they will gain experience with a mobile robot platform and development using ROS (Robot Operating System).

Collaboration

This project collaborates with Capra Robotics ApS (www.capra.ooo).

Contacts

Rune Hylsberg Jacobsen (rhj@eng.au.dk) and Liping Shi (liping@eng.au.dk)

Beam Steering and Beamforming Algorithms for Nanosatellite Communications

There is a current need for new techniques to improve the communication links between satellites, and between satellites and ground stations, without changing the pointing of the satellites.

The goal of this project is to develop beam steering and beamforming algorithms for nanosatellites. A depiction of beam steering is shown in the figure: by controlling the phases of the signals feeding the antennas, the direction of the whole radiation pattern can be controlled electronically. This is achieved by providing an input steering angle to the controller C in the diagram. Beamforming is similar, but the shape of the pattern can be designed as well.

The work will focus on beam steering and beamforming algorithms that steer from an input reference angle, provided by the satellite attitude determination and control system (ADCS), to a desired target angle. Reference signals to digital phase shifters will be considered. The algorithms will be developed in a high-level language for design and analysis, and the implementation will be carried out on a low-end platform such as an Arduino or a Raspberry Pi. The student will have the opportunity to work hands-on with our NAN lab facilities and nanosatellite equipment, and to collaborate on current research in satellite projects.
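For a uniform linear array, the per-element phase needed to steer the main beam follows directly from the geometry: element n must be shifted by 2π·(d/λ)·n·sin(θ). A sketch that also quantizes the result for a digital phase shifter (element spacing and shifter resolution here are illustrative assumptions):

```python
import math

def steering_phases(n_elements, d_over_lambda, theta_deg, bits=6):
    """Per-element phase shifts (degrees) that steer a uniform linear
    array's main beam to theta_deg, quantized to the resolution of a
    `bits`-bit digital phase shifter. d_over_lambda is the element
    spacing in wavelengths (0.5 is a typical choice)."""
    step = 360.0 / (2 ** bits)    # smallest phase step the shifter supports
    phases = []
    for n in range(n_elements):
        # progressive phase taper: -360 * (d/lambda) * n * sin(theta)
        phi = (-360.0 * d_over_lambda * n
               * math.sin(math.radians(theta_deg))) % 360.0
        phases.append(round(phi / step) * step % 360.0)
    return phases
```

The quantization step mirrors what the hardware controller would write to the phase shifters; coarser shifters (fewer bits) introduce pointing error and raise sidelobe levels.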

Prerequisites

Programming in a high-level language (e.g., Python) and/or a low-level language (C/C++), Linux, embedded programming.

Contact

Rune Hylsberg Jacobsen (rhj@eng.au.dk)

(image sources: Wikipedia (top), Adaptive beam steering (bottom))

Satellite Image Inpainting using Deep Neural Networks

The amount of publicly available satellite imagery has grown tremendously, as a result of the open data policies from ESA and NASA. However, clouds obstruct the view in the satellite imagery, especially in Northern European countries. To alleviate this issue, recently developed deep learning methods for image inpainting tasks can be employed.

In this project, the aim is to investigate how generative deep neural networks can be used to inpaint the missing parts of satellite imagery after cloud removal. It is important, however, that the inpainted parts are trustworthy representations of the ground. To achieve this, the input image must be combined with historic data, and possibly satellite radar data, and the performance of the deep learning methods used must be evaluated thoroughly.

The satellite imagery used should be from the Sentinel-2 satellite, and the student(s) will have GPU processing power available for training the deep learning models.
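A thorough evaluation should score only the reconstructed pixels; otherwise the untouched, cloud-free parts of the image dominate the metric. Below is a sketch of a masked PSNR in plain Python (the conventions are illustrative; a real pipeline would use NumPy or a deep learning framework):

```python
import math

def masked_psnr(reference, inpainted, mask, max_val=255.0):
    """PSNR computed only over pixels flagged in `mask` (the inpainted,
    formerly cloud-covered region). Images are equally sized 2D lists;
    mask entries are True where pixels were inpainted."""
    squared_error, n = 0.0, 0
    for ref_row, out_row, m_row in zip(reference, inpainted, mask):
        for ref, out, m in zip(ref_row, out_row, m_row):
            if m:
                squared_error += (ref - out) ** 2
                n += 1
    if n == 0:
        raise ValueError("mask selects no pixels")
    mse = squared_error / n
    if mse == 0.0:
        return float("inf")                  # perfect reconstruction
    return 10.0 * math.log10(max_val ** 2 / mse)
```

PSNR alone cannot judge whether an inpainted field looks agronomically plausible, so it would typically be complemented by perceptual metrics and comparison against the historic and radar data mentioned above.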

Contacts: Jacob Høxbroe Jeppesen (jhj@eng.au.dk) and Rune Hylsberg Jacobsen (rhj@eng.au.dk)

Collaboration

This project is part of the Future Cropping Partnership. The results will be presented as part of the outcome of the project, and if the results are promising, our aim is to collaborate on publishing a scientific paper based on the project.

Prerequisites

Machine learning/deep learning, Python programming