
New professor on the data surge: It will only get worse

Daniel Lucani Rötter is a new professor at the Department of Electrical and Computer Engineering at Aarhus University. He conducts research into how we can solve some of the internet's biggest Gordian knots.

"The amount of data will only get worse than it is today, and new compression technologies are a necessary tool to reduce the impact of increasing data traffic," says Daniel Lucani Rötter, newly appointed professor at Aarhus University. Photo: Peer Klercke.

The flow of information is growing exponentially. The world's population generates approximately 2.5 million terabytes of data every single day, and with the several hundred billion additional electronic devices expected to connect to the internet within the next decade, this figure will only continue to rise.

However, it is not only the vastly increasing number of connected devices that constitutes a veritable tidal wave of data. Artificial intelligence, machine learning and cyber-physical systems also contribute extensively to internet traffic.

The energy costs inherent in this data surge are skyrocketing, driven by an ever-increasing effort to install sufficient infrastructure. The electricity consumption of data centers and networks is growing at three times the rate of average worldwide electricity consumption and might account for one fifth of total global consumption by 2030.

The hunt for energy efficiency is real, and one of the solutions is to reduce the height of these data tsunamis with new compression technologies.

"We're standing on an island in the middle of the ocean with ever larger tsunamis approaching from all sides," says Daniel Lucani Rötter, who’s just been appointed a full professor at the Department of Electrical and Computer Engineering at Aarhus University. He conducts research into compressing, transporting, storing and securing data, and is therefore one of the people who are trying to stem the torrents of data that’s already beginning to clog up the net.

"The amount of data will only get worse, and new compression technologies are a necessary tool to reduce the effect of increasing data traffic. By exploiting data similarities and correlations across multiple sources, we are already developing new compression technologies that can cope with new data trends and characteristics, including IoT, cyber-physical systems, and machine learning. This will be one of the keys to damping the effect of these tsunamis,” he continues.

In collaboration with other researchers from Aarhus University, the Massachusetts Institute of Technology (MIT), University of Neuchatel, and Boston University, the professor has pioneered a compression technology called Generalized Deduplication, which delivers fast random access to data and is suitable for low-end sensors, servers, and even network switches with computing capabilities. Moreover, it allows us to compress across data from very different devices with no coordination, thus exploiting data similarities between them.

“Let’s use an example. If you have millions of smart meters in a country, you will find common patterns in each household and across households over the year. People wake up at similar times, turn on the light at similar times, go on vacation at similar times etc. Current compression methodologies do not exploit these similarities due to the limited memory and capability of the meters. Our new method can - even without the need to change the meter itself,” Professor Rötter says.

The technology makes use of so-called deduplication, which is basically about indexing and identifying similarities in data fragments in order to remove all data redundancy. What makes the approach unique is that it is possible to read the compressed content at random and even perform data analysis and learning directly on compressed files. Current approaches require the system to decompress a file as a whole before its contents can be accessed.

"It makes it possible for us to decompress only the part of a file, that we are interested in. This can have a major impact on processing speeds, data availability and the entire infrastructure of cloud-based storage," he says.

Finally, this technology can be deployed in newer, faster switches to compress data in the network itself, an approach called in-network compression. The result is compression at wire speed (10 to 100 times faster than the fastest current algorithms), high compression potential (comparable to the state of the art), and delays of only nanoseconds to microseconds to perform the compression, which was previously completely unheard of.

Daniel Lucani Rötter was born and raised in Caracas, Venezuela. He graduated from Simón Bolívar University with a degree in electrical engineering, and later moved to the US, where he graduated with a PhD in electrical engineering from MIT in 2010. He arrived at Aalborg University in August 2012 and became an associate professor at Aarhus University's Department of Engineering, as it was then known, in 2017.

In addition to being a professor, he is currently the deputy head of department for talent development and external funding, and head of section for Communication, Control and Automation at the Department of Electrical and Computer Engineering.

Professor Rötter is involved in several scientific projects, including a project funded by Horizon 2020 called IoTalentum, aimed at educating young researchers within IoT. He is also the Chief Scientist of Chocolate Cloud, the cybersecurity and cloud storage company he co-founded in 2014.

He currently lives in Skødstrup, Aarhus, with his family.


Contact

Professor Daniel Lucani Rötter
Department of Electrical and Computer Engineering, Aarhus University
Email: daniel.lucani@ece.au.dk
Phone: +4593508763