Categories
Explore project

Certifiable Controller Synthesis for Cyber-Physical Systems

Project type: Explore Project


Summary

As cyber-physical systems (CPSs) become ever more ubiquitous, many of them are considered safety-critical. We want to help CPS manufacturers and regulators establish high levels of trust in automatically synthesized control software for safety-critical CPSs. To this end, we propose to extend the technique of formal certification towards controller synthesis: controllers are synthesized together with a safety certificate that can be verified by highly trusted theorem provers.

Value Creation

From a distant viewpoint, our project aims to increase confidence in safety-critical CPSs that interact with individuals and society at large. This is the main motivation for applying formal methods to the construction of CPSs. However, our project aims to give a unique spin to this. By cleverly combining the existing methods of controller synthesis, (timed automata) model checking, and interactive theorem proving via means of certificate extraction and checking, we aim to facilitate the construction of control software for CPSs that ticks all the boxes: high efficiency, a very high level of trust in the safety of the system, and the possibility to independently audit the software. Given that CPSs have already conquered every sector of life, with the bulk of the development still ahead of us, we believe such an approach could make an important contribution towards technology that benefits people.

Moreover, our approach aims to ease the interaction between the CPS industry and certification authorities. We believe it is an important duty of regulatory authorities to safeguard their citizens from failures of critical CPSs. Even so, regulation should not grind development to a halt. With our work, we hope to remedy this apparent conflict of interests. By providing a means to check the safety of synthesized controllers in a well-documented, reproducible, and efficient manner, we believe that the interaction between producers and certifying bodies could be sped up significantly, while increasing reliability at the same time. On top of that, controller synthesis has already been intensely studied and seems to be a rather mature technology from an academic perspective. However, it has barely set foot in industrial applications. We are confident that formal certificate extraction and checking can be an important stepping stone to help controller synthesis make this jump.

This project also contributes to the objective of DIREC to bring new academic partners together in the Danish ecosystem. The two principal investigators have their specialization backgrounds in two different fields (certification theory and control theory) and have not collaborated before. Thus, the project strengthens the collaboration between the two fields as well as between the two research groups at AU and AAU. This creates the opportunity for new scientific results benefiting both research fields.

Finally, we plan to generate tangible value for industry. There are many present-day use cases for control software of critical CPSs. During our project, we want to aid these use cases with controllers that tick all of the aforementioned “boxes”. This can be done by initiating several student projects and theses supporting theory development, tool implementation, and use case demonstration. The Problem Based Learning approach of Aalborg University facilitates this greatly. Furthermore, those students can use their experience in future positions after graduating.

Participants

Martijn Goorden

Postdoc

Aalborg University
Department of Computer Science

E: mgoorden@cs.aau.dk

Simon Wimmer

Postdoc

Aarhus University
Department of Computer Science

E: swimmer@cs.au.dk


Methodologies for scheduling and routing droplets in digital microfluidic biochips

Project type: Explore Project


Summary

The overall purpose of this project is to define, investigate, and provide preliminary methodologies for scheduling and routing microliter-sized liquid droplets on a planar surface in the context of digital microfluidics.

The main idea is to use a holistic approach in the design of scheduling and routing methodologies that takes into account real-world physical, topological, and behavioral constraints, thus producing solutions that can immediately find use in practical applications.
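As a simplified illustration of the routing half of the problem (an invented example, not the project's methodology): a single droplet moving across the biochip's electrode grid can be routed with an ordinary shortest-path search, while the real difficulty lies in the physical and behavioral constraints the summary mentions, such as spacing between concurrent droplets.

```python
from collections import deque

def route_droplet(grid_w, grid_h, start, goal, blocked):
    """BFS shortest path for a single droplet on a DMF electrode grid.
    Deliberately simplified: real routers must also respect fluidic
    constraints such as minimum spacing between concurrent droplets."""
    frontier = deque([(start, [start])])
    seen = {start} | set(blocked)
    while frontier:
        (x, y), path = frontier.popleft()
        if (x, y) == goal:
            return path
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < grid_w and 0 <= ny < grid_h and (nx, ny) not in seen:
                seen.add((nx, ny))
                frontier.append(((nx, ny), path + [(nx, ny)]))
    return None  # no feasible route on this grid

# Hypothetical 4x3 grid with two faulty (blocked) electrodes.
path = route_droplet(4, 3, (0, 0), (3, 0), blocked={(1, 0), (1, 1)})
print(len(path) - 1)  # 7 electrode activations around the blocked cells
```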

Value Creation

DMF biochips have been in the research spotlight for over a decade. However, the technology is not yet mature enough to deliver extensive automation for applied biochemistry processes or for research purposes. One of the main reasons is that, although rather simple in construction, DMF biochips lack a clear automated procedure for being programmed and used. The existing methodologies for programming DMF biochips require an advanced understanding of software programming and of the architecture of the biochip itself. These skills are not commonly found among the potential target users of this technology, such as biologists and chemists.

A fully automated compilation pipeline able to translate biochemical protocols expressed in a high-level representation into low-level biochip control sequences would open DMF technology to a larger number of researchers and professionals. The advanced scheduling and routing methodologies investigated by this project address one of the main obstacles towards broadly accessible DMF technology. This is particularly relevant for researchers and small businesses that cannot afford the large pipetting robots commonly used to automate industrial biochemical protocols. One or more DMF biochips can be programmed to execute ad-hoc repetitive and tedious laboratory tasks, thus freeing qualified working hours for more challenging laboratory work.

In addition, the scheduling and routing methodologies targeted by this project enable online decisions, such as controlling the flow of a biochemical protocol depending on on-the-fly sensing results from the processes occurring on the biochip. This opens up a large set of possibilities in the biochemical research field. For instance, the behavior of complex biochemical protocols can be automatically adapted during execution using decisional constructs (if-then-else), allowing for real-time protocol optimization and monitoring.

From a scientific perspective, this project would enable cross-field collaboration, develop new methodologies, and potentially re-purpose techniques that are well known in one research field to solve problems of another. For the proposed project, interesting possibilities include adapting advanced routing and graph-related algorithms or applying well-known online-algorithm techniques to manage the real-time flow-control nature of biochemical protocols. The cross-field nature of the project has the potential to provide a better understanding of how advanced scheduling and routing techniques can be applied in a strongly constrained application context such as DMF biochips, thus laying the ground for novel solutions, collaborations, and further research.

Finally, it should be mentioned that the outcome of this project, or of a future larger project based on the proposed explorative research, has concrete business value. Currently, some players have entered the market with DMF biochips built to perform a specific biochemical functionality [12,13]. A software stack that includes compilation tools supporting programmability, enabling the same DMF biochip to perform different protocols, would largely expand the potential market of such technology. This is not the primary aim of this research project, but it is indeed a long-term possibility.


Participants

Project Manager

Luca Pezzarossa

Assistant Professor

Technical University of Denmark
DTU Compute

E: lpez@dtu.dk

Eva Rotenberg

Associate Professor

Technical University of Denmark
DTU Compute

Lene M. Favrholdt

Associate Professor

University of Southern Denmark
Department of Mathematics and Computer Science


Automated Verification of Sensitivity Properties for Probabilistic Programs

Project type: Explore Project


Sensitivity measures how much program outputs vary when changing inputs. We propose exploring novel methodologies for specifying and verifying sensitivity properties of probabilistic programs such that they (a) are comprehensible to everyday programmers, (b) can be verified using automated theorem provers, and (c) cover properties from the machine learning and security literature.

This work will bring together two junior researchers who recently arrived in Denmark and obtained their PhDs working on probabilistic verification.

Project description

Our overall objective is to explore how automated verification of sensitivity properties of probabilistic programs can support developers in increasing the trust in their software through formal assurances.

Probabilistic programs are programs with the ability to sample from probability distributions. Examples include randomized algorithms, where sampling is exploited to ensure that expensive executions have a low probability, cryptographic protocols, where randomness is essential for encoding secrets, and statistics, where programs are becoming a popular alternative to graphical models for describing complex distributions.

The sensitivity of a program determines how its outputs are affected by changes to its input; programs with low sensitivity are robust against fluctuations in their input – a key property for improving trust in software. Minor input changes should, for example, not affect the result of a classifier learned from training data. In the probabilistic setting, the output of a program depends not only on the input but also on the source of randomness. Hence, the notion of sensitivity – as well as techniques for reasoning about it – needs refinement.
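The following toy sketch (an invented example, not taken from the project) illustrates the coupled view of probabilistic sensitivity: running a program twice on nearby inputs while sharing the source of randomness, which is the simplest instance of a probabilistic coupling.

```python
import random

def noisy_sum(start, steps, rng):
    """A toy probabilistic program: a random walk of `steps`
    +1/-1 moves beginning at `start`."""
    pos = start
    for _ in range(steps):
        pos += 1 if rng.random() < 0.5 else -1
    return pos

def coupled_sensitivity(prog, x1, x2, trials=1000):
    """Estimate sensitivity under the identity coupling: both runs
    share the same randomness (same seed), so only the input
    difference can influence the output difference."""
    worst = 0
    for seed in range(trials):
        y1 = prog(x1, 20, random.Random(seed))
        y2 = prog(x2, 20, random.Random(seed))
        worst = max(worst, abs(y1 - y2))
    return worst

# Under this coupling the walk's output moves exactly as far as its
# input: sensitivity 1 with respect to a unit change of `start`.
print(coupled_sensitivity(noisy_sum, 10, 11))  # 1
```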

Automated verification takes a deductive approach to proving that a program satisfies its specification: users annotate their programs with logical assertions; a verifier then generates verification conditions (VCs) whose validity implies that the program’s specification holds. Deductive verifiers are more complete and more scalable than fully automatic techniques but require significant user interaction. The main challenge for users of automated verifiers lies in finding suitable intermediate assertions, particularly loop invariants, such that an automated theorem prover can discharge the generated VCs. A significant challenge for developers of automated verifiers is to keep the amount and complexity of necessary annotations as low as possible.
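A minimal, hypothetical sketch of this VC-generation step for a loop-free, deterministic toy language (the project targets a much richer probabilistic calculus) computes weakest preconditions backwards through the program:

```python
# Each command maps a postcondition (a predicate on program states,
# states being dicts) to its weakest precondition.

def assign(var, expr):
    """wp(var := expr, Q) = Q[var := expr]"""
    def wp(post):
        return lambda s: post({**s, var: expr(s)})
    return wp

def seq(c1, c2):
    """wp(c1; c2, Q) = wp(c1, wp(c2, Q))"""
    def wp(post):
        return c1(c2(post))
    return wp

# Program:  y := x + 1; z := 2 * y   with spec  {x >= 0}  {z >= 2}
prog = seq(assign("y", lambda s: s["x"] + 1),
           assign("z", lambda s: 2 * s["y"]))

post = lambda s: s["z"] >= 2
pre = prog(post)  # the generated verification condition, as a predicate

# Checking the VC on sample states; a real verifier would hand a
# symbolic version of `pre` to an SMT solver instead.
print(all(pre({"x": x}) for x in range(100)))  # True
```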

Previous work [1] co-authored by the applicants provides a theoretical framework for reasoning about the sensitivity of probabilistic programs: the above paper presents a calculus to carry out “pen-and-paper” proofs of sensitivity in a principled and syntax-directed manner. The proposed technique deals with sampling instructions by requiring users to identify suitable probabilistic couplings, which act as synchronization points, on top of finding loop invariants. However, the technique is limited in the sense that it does not provide tight sensitivity bounds when changes to the input cause a program to take a different branch on a conditional.

Our project has four main goals. First, we will develop methodologies that do not suffer from the limitations of [1]. We believe that conditional branching can be treated by carefully tracking the possible divergence. Second, we will develop an automated verification tool for proving sensitivity properties of probabilistic programs. The tool will generate VCs based on the calculus from [1], which will be discharged using an SMT solver. In designing the specification language, we aim to achieve a balance so that (a) users can conveniently specify synchronization points for random samples (via so-called probabilistic couplings) and (b) existing solvers can prove the resulting VCs. Third, we aim to aid the verification process by assisting users in finding synchronization points. Invariant synthesis has been extensively studied in the case of deterministic programs. Similarly, coupling synthesis has been recently studied for the verification of probabilistic programs [2]. We believe these techniques can be adapted to the study of sensitivity. Finally, we will validate the overall verification system by applying it to case studies from machine learning, statistics, and randomized algorithms.

 

Participants

Alejandro Aguirre

Postdoc

Aarhus University
Department of Computer Science

Christoph Matheja

Assistant Professor

Technical University of Denmark
DTU Compute


Understanding Biases and Diversity of Big Data used for Mobility Analysis

Project type: Explore Project


Summary

Our capabilities to collect, store, and analyze vast amounts of data have greatly increased in the last two decades, and today big data plays a critical role in a large majority of statistical algorithms. Unfortunately, our understanding of biases in data has not kept up. While there has been a lot of progress in developing new models to analyze data, there has been much less focus on understanding the fundamental shortcomings of big data.

This project will quantify the biases and uncertainties associated with human mobility data collected through digital means, such as smartphone GPS traces, cell phone data, and social media data.

Ultimately, we want to ask the question: is it possible to fix big mobility data through a fundamental understanding of how biases manifest themselves?

Value Creation

We expect this project to have a long-lasting scientific and societal impact. The scientific impact of this work will allow us to explicitly model bias in algorithmic systems relying on human mobility data and provide insights into which populations are left out. For example, it will allow us to correct for gender, wealth, age, and other types of biases in data used globally for epidemic modeling, urban planning, and many other use cases. Further, having methods to debias data will allow us to understand what negative impacts results derived from biased data might have. Given the universal nature of bias, we expect our debiasing frameworks will also pave the way for quantitative studies of bias in other realms of data science.
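One standard instance of such a correction is post-stratification against known population shares (e.g. from a census); the groups and numbers below are invented for illustration and are not the project's method.

```python
from collections import Counter

def poststratification_weights(sample_groups, population_shares):
    """Per-group reweighting factors so that a biased sample (e.g.
    smartphone-GPS users skewed towards young, wealthy groups)
    matches known population shares."""
    n = len(sample_groups)
    sample_shares = {g: c / n for g, c in Counter(sample_groups).items()}
    return {g: population_shares[g] / sample_shares[g] for g in sample_shares}

# Hypothetical example: 80% of the mobility sample is "young",
# but the census says only 50% of the population is.
sample = ["young"] * 80 + ["old"] * 20
weights = poststratification_weights(sample, {"young": 0.5, "old": 0.5})
print(weights)  # {'young': 0.625, 'old': 2.5}
```

Each individual's contribution to a downstream statistic is then multiplied by their group's weight, up-weighting the under-represented group.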

The societal impact will be actionable recommendations provided to policy makers regarding: 1) guidelines for how to safely use mobility datasets in data-driven decision processes, 2) tools (including statistical and interactive visualizations) for quantifying the effects of bias in data, and 3) directions for building fairer and more equitable algorithms that rely on mobility data.

It is important to address these issues now: in its “Proposal for a Regulation on a European approach for Artificial Intelligence” from April 2021, the European Commission outlines potential future regulations for addressing the opacity, complexity, bias, and unpredictability of algorithmic systems. The document states that high-quality data is essential for algorithmic performance and suggests that any dataset should be subject to appropriate data governance and management practices, including examination for possible biases. This implies that in the future, businesses and governmental agencies will need to have data-audit methods in place. Our project addresses this gap and provides value by developing methodologies to audit mobility data for different types of biases, producing tools from which Danish society and Danish businesses will benefit.

Participants

Project Manager

Vedran Sekara

Assistant Professor

IT University of Copenhagen
Department of Computer Science

E: vsek@itu.dk

Laura Alessandretti

Associate Professor

Technical University of Denmark
DTU Compute

Manuel Garcia-Herranz

Chief Scientist

UNICEF
New York

Elisa Omodei

Assistant Professor

Central European University


Ergonomic & Practical Effect Systems

Project type: Explore Project


Summary

Effect systems are currently a hot research subject in type theory. Yet many effect systems, whilst powerful, are very complicated to use, particularly for programmers who are not experts in type theory. Effect systems with inference can provide useful guarantees to programming languages while being simple enough to be used in practice by everyday programmers.

Building on the Boolean unification-based polymorphic effect system in the Flix programming language, we want to pursue two practical short-term objectives: (a) improve the quality of effect error messages, and (b) develop techniques to improve the performance of Boolean unification and effect inference. This lays the foundation for a more ambitious objective: the Futhark programming language supports a form of referentially transparent in-place updates, controlled by a system of uniqueness types inspired by Clean, but this system is too limited in the presence of polymorphic higher-order functions. Recasting the type system in terms of effects, based on the one in Flix, might provide a more intuitive system.
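To give a flavor of the core operation, here is a minimal Python sketch of the successive-variable-elimination step that underlies Boolean unification. Flix's actual effect inference computes most general unifiers over effect formulas; this toy version only extracts one concrete solution via the same elimination step.

```python
def solve(f, variables):
    """Solve the Boolean equation f = 0 by successive variable
    elimination. `f` maps an assignment (dict var -> bool) to bool.
    Returns a satisfying assignment, or None if f is identically 1
    (i.e. no unifier exists)."""
    if not variables:
        return {} if not f({}) else None
    x, rest = variables[0], variables[1:]
    f0 = lambda env: f({**env, x: False})  # f with x := 0
    f1 = lambda env: f({**env, x: True})   # f with x := 1
    # f = 0 is solvable iff the eliminated equation f0 AND f1 = 0 is.
    sol = solve(lambda env: f0(env) and f1(env), rest)
    if sol is None:
        return None
    # Pick x = 0 when that already makes f false, otherwise x = 1.
    sol[x] = bool(f0(sol))
    return sol

# Toy effect constraint: "eff1 OR eff2 must unify with Pure (false)".
constraint = lambda env: env["eff1"] or env["eff2"]
print(solve(constraint, ["eff1", "eff2"]))  # both effect variables := False
```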

A unique aspect of this project is that it brings together two programming language researchers, one from Aarhus and one from Copenhagen, who are both working on full-blown programming language implementations.

Value Creation

We address value creation following the three outlined categories:

Scientific Value: We see two clear publishable scientific contributions: (a) new techniques to improve the performance of Boolean unification and (b) new applications of type and effect systems based on Boolean unification.

Capacity Building: Flix and Futhark are the two major academic efforts towards building new programming languages in Denmark. Bringing the two research groups together will facilitate knowledge sharing and technology transfer, enabling both projects to thrive and grow even further. This unique opportunity exists because both languages are based on similar technology and because they do not compete in the same space. Success for one is not at the expense of the other, and they can rise together.

Business and Societal Value: A significant amount of research effort has been expended on designing effect systems. Despite widespread belief that such systems can lead to safer programs, few have been implemented in real-world programming languages. By focusing on improving the ergonomics, we want to make these technologies more accessible. As the designers of Flix and Futhark, we are in a great position to conduct such work. With real, full-blown implementations, we can show the way for other mainstream programming languages.

After decades of relative stagnation, programming languages are now rapidly absorbing features previously only seen in obscure or academic programming languages. Java and C# are prominent examples of originally very orthodox object-oriented languages that have been augmented with concepts from functional programming. We believe that effect systems and other advanced type-system features are a logical next step, but before they can be added to mainstream languages, it must be shown that they can be designed and implemented in a form that is palatable to industrial users. Thus, while Flix and Futhark may or may not be the languages of the future, we believe that our research can help shape the direction of future programming languages by providing solid formal foundations and real-world implementations that others can build on directly or indirectly.


Participants

Project Manager

Magnus Madsen

Associate Professor

Aarhus University
Department of Computer Science

E: magnusm@cs.au.dk

Troels Henriksen

Assistant Professor

University of Copenhagen
Department of Computer Science


Hardware/Software Trade-offs for the Reduction of Energy Consumption

DIREC project


Summary

Computing devices consume a considerable amount of energy. Implementing algorithms in hardware using field-programmable gate arrays (FPGAs) can be more energy efficient than executing them in software on a processor.

This project explores classic sorting and path-finding algorithms and compares their energy efficiency and performance when implemented in hardware.

The use of FPGAs is increasing in mainstream computing, and the project may enable software developers to use a functional language to efficiently implement algorithms in FPGAs and reduce energy consumption.

Computing devices consume a considerable amount of energy. Within data centers this has an impact on climate change, and in small embedded systems, i.e., battery-powered devices, energy consumption influences battery life. Implementing an algorithm in hardware (in a chip) is more energy efficient than executing it in software on a processor. Until recently, processor performance and energy efficiency have been good enough to just use software on a standard processor or on a graphics processing unit. However, this performance increase is coming to an end, and energy-efficient computing systems need domain-specific hardware accelerators.

However, the cost of producing a chip is very high. Between fixed hardware and software lies the technology of field-programmable gate arrays (FPGAs). FPGAs are programmable hardware: the algorithm can be changed at runtime. However, FPGAs are less energy efficient than custom chips. We expect that for some algorithms an FPGA will be more energy efficient than the implementation in software. The research question is whether and how it is possible to reduce the energy consumption of IT systems by moving algorithms from software into hardware (FPGAs). We will investigate classic sorting and path-finding algorithms and compare their energy efficiency and, in addition, their performance.

Such results are essential to both data centers and embedded systems. However, the hardware design of these accelerators is often complex, and their development is time-consuming and error-prone. Therefore, we need a tool and methodology that enables software engineers to design efficient hardware implementations of their algorithms. We will explore a modern hardware construction language, Chisel. Chisel is a Scala-embedded hardware construction language that allows hardware to be described in a more software-like, high-level way. Chisel is the enabling technology to simplify the translation of a program from software into hardware. This project will furthermore investigate the efficiency of using the functional and object-oriented hardware description language Chisel to express algorithms efficiently for execution on FPGAs.
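To sketch why such algorithms map well onto FPGAs (an illustration only; the project's implementations would be written in Chisel): a sorting network has a fixed, data-independent pattern of compare-exchange operations, so each comparator can become a small circuit and whole stages can run in parallel. The classic bitonic network, simulated here in Python:

```python
def bitonic_pairs(n):
    """Compare-exchange pairs (i, j, ascending) of a bitonic sorting
    network for n = 2^k inputs. The schedule is fixed in advance and
    does not depend on the data, which is exactly what hardware needs."""
    pairs, k = [], 2
    while k <= n:
        j = k // 2
        while j > 0:
            for i in range(n):
                partner = i ^ j
                if partner > i:
                    pairs.append((i, partner, (i & k) == 0))
            j //= 2
        k *= 2
    return pairs

def network_sort(xs):
    """Software simulation of the network; in hardware each stage of
    comparators would be instantiated once and run concurrently."""
    xs = list(xs)
    for i, j, ascending in bitonic_pairs(len(xs)):
        if (xs[i] > xs[j]) == ascending:
            xs[i], xs[j] = xs[j], xs[i]
    return xs

print(network_sort([5, 3, 8, 1, 7, 2, 6, 4]))  # [1, 2, 3, 4, 5, 6, 7, 8]
```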

Programs running on a general-purpose computer consume a considerable amount of energy. Some programs can be translated into hardware and executed on an FPGA. This project will explore the trade-offs between executing a program in hardware and executing it in software relative to energy consumption.

Scientific Value
The FPGA and software implementations of path-finding algorithms have recently been evaluated through the lens of performance, e.g., [?], whereas sorting algorithms have also been evaluated on energy consumption, e.g., [2]. Here, FPGAs performed better than CPUs in many cases, with similar or reduced energy consumption. The languages used for implementation are Verilog and C, the latter translated to Verilog using Vivado HLS. In this project, we will implement the algorithms in hardware using Chisel and evaluate their performance and energy consumption. DTU and RUC will advance research in the design and testing of digital systems for energy saving. Our proposed approach provides a general software engineering procedure that we plan to validate with standard algorithms used in cloud applications. This research will drive the adoption of modern tools and agile methods in the hardware-design curriculum.

Capacity Building
The project establishes a new collaboration between two Danish universities and is a first step towards building a more energy-aware profile of the Computer Science laboratory FlexLab at RUC. In return, FlexLab makes FPGAs available to the research assistants at RUC. Thus, this project will improve the visibility of energy-aware IT system design nationally and internationally. Through the cooperation between researchers at DTU and RUC, this project will allow Denmark to take a lead in digital research and development for reduced energy consumption. The upcoming research positions at RUC will contribute to building RUC’s research capacity, and the project will also recruit new junior researchers, both directly and in subsequent projects.

Business Value
The changes in the hardware industry indicate that the use of FPGAs will increase: a few years ago, Intel bought Altera (one of the two largest FPGA producers) to include FPGAs in future versions of its processors. Similarly, AMD is aiming to buy Xilinx, the other big FPGA vendor. In addition, one can already rent a server in the cloud from Amazon that includes an FPGA. These changes all point towards FPGAs entering mainstream computing. Many mainstream programming languages like C# or Java already include functional features such as lambda expressions and higher-order functions. The more common languages for programming FPGAs are Verilog, a C-inspired language, and VHDL, a Pascal-inspired language. Therefore, it may be attractive for mainstream software developers to use a functional language to efficiently implement algorithms in FPGAs and thus both increase performance and reduce energy consumption.

Societal Value
Currently ICT consumes approximately 10% of the global electricity and this is estimated to increase to 20% in 2030. Thus, reducing energy consumption of ICT is critical. If successful, this project has the potential to reduce the energy consumption via rephrasing the essential software programs in FPGA units.


Participants

Project Manager

Maja Hanne Kirkeby

Assistant Professor

Roskilde University
Department of People and Technology

E: majaht@ruc.dk

Martin Schoeberl

Associate Professor

Technical University of Denmark
DTU Compute

Mads Rosendahl

Associate Professor

Roskilde University
Department of People and Technology

Thomas Krabben

FlexLab Manager

Roskilde University
Department of People and Technology

Categories
News

Meet Tijs Slaats, who just won a prize for best process mining algorithm


Tijs is Associate Professor at the Department of Computer Science at the University of Copenhagen and Head of the Business Process Modeling and Intelligence research group. In DIREC, he works on the Bridge project AI and Blockchains for Complex Business Processes.

Tijs’ research interests include declarative and hybrid process technologies, blockchain technologies, process mining, and information systems development.  

He co-invented the flagship declarative Dynamic Condition Response (DCR) Graphs process notation and was a primary driver in its early commercial adoption. In addition, he led the invention and development of the DisCoveR process miner, which was recognized as the best process discovery algorithm in 2021. 

Can you tell us briefly about your research and what value you expect to get from it?
We try to describe processes. It can be basic things that we do as human beings. It could be assembling a car at a factory, but it could also be treating patients at a hospital. If a patient is admitted to a hospital, they need help and treatment.

What these examples have in common is that you need to go through a number of steps and activities to reach your goal, and those activities are related to each other. It may be medication that needs to be taken in a certain order.

In our research, we have developed a mathematical method for describing these processes. The reason for doing this is that it gives you the tools to ensure that the process goes the way you want it to.

In the new DIREC project, we take one step further. We have observed that many companies and organizations have large amounts of data on how they have performed their jobs. And we can look at these data and analyze them to see how they actually perform their jobs, because the way many people do their jobs does not necessarily match the way they expect to do it. Maybe they take shortcuts unintentionally.

Our idea is to find these data and analyze them and on that basis we get a model.

It is important that such a model is also understandable to the users, so that they can see how they perform their work. We call this process mining, and it is a reasonably large academic area. Two years ago, I developed an algorithm and entered it in a contest where you compare which algorithm is most accurate at describing these “logs of behaviour”, and we won the contest.
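The most basic building block behind such discovery algorithms can be sketched in a few lines (an illustration only; miners like DisCoveR produce far richer declarative DCR models): counting which activities directly follow which in an event log.

```python
from collections import Counter

def directly_follows(log):
    """Discover a directly-follows graph from an event log: count how
    often activity a is immediately followed by activity b within a
    trace. This is only the simplest ingredient of process discovery."""
    edges = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            edges[(a, b)] += 1
    return edges

# Hypothetical hospital log: each trace is one patient's activities.
log = [
    ["admit", "examine", "medicate", "discharge"],
    ["admit", "examine", "xray", "medicate", "discharge"],
    ["admit", "examine", "medicate", "discharge"],
]
graph = directly_follows(log)
print(graph[("admit", "examine")])  # 3
```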


What results do you expect from your research?
Our cooperation with industry is particularly important. In the project, we collaborate with the company Gekkobrain (https://gekkobrain.com), which works with DevOps. They are interested in analyzing large ERP systems and in finding tools that can optimize a system and find abnormalities. These systems are quite complex, so it is important to be able to identify where things are going wrong.

Gekkobrain has a lot of data because they work with large companies that have huge amounts of log data, and these systems are so complex that it adds some extra challenges for our algorithms. To get access to such complex data is an important perspective.

How can your research make a difference to companies and society?
The biggest impact of our work and models is that you can gain insight into how you perform your work. It gives you an objective picture of what has been done.

Companies can use it to find out if there are places where work processes are performed in an inappropriate way and thus avoid the extra costs.

Can you tell us about your background and how you ended up working with this research area?
I initially got a Bachelor degree in Information & Communication Technology from Fontys University of Professional Education, then worked in industry, where I led the webshop development team of a Dutch e-commerce provider and acted as project leader on the implementation of our product for two major customers: Ferrari and Hewlett Packard.

I decided to move to Denmark after meeting my (Danish) wife. At the time, I was already considering pursuing further education, while my wife was fairly settled in Denmark, so it made sense for me to be the one to move.

I got my MSc and PhD degrees at the IT University of Copenhagen. There I became interested in the field of business process modeling because it allows me to combine foundational theoretical research with very concrete industrial applications. Process mining in particular provides really interesting challenges because it requires learned models to be understandable for business users, something that has only recently come into focus in the more general field of AI. 

After a short postdoc at ITU, I accepted a tenure-track assistant professorship at DIKU, which was a very good opportunity because it offers a (near) permanent position to relatively junior researchers. At the time, this was uncommon in Denmark.


Project will transfer AI from the cloud to the IoT device

13 December 2021


In a new DIREC project, computer science researchers collaborate with industry to develop artificial intelligence that lets IoT devices handle more tasks on their own.

Photo: Kaare Smith, DTU

Digitization of society is one of the prerequisites for achieving the climate goal of a 70 percent CO2 reduction by 2030. Here, small sensors (IoT devices) installed in e.g. buildings, heating systems, and treatment plants will play an important role in managing energy consumption, heat, indoor climate, etc.

In a new project, Embedded AI – supported by the national research center DIREC – researchers will work together with industry to investigate how to develop artificial intelligence (AI) that can be implemented in IoT devices so that they can do more themselves. Today, sensors depend on AI algorithms running on cloud platforms or in decentralized networks (edge computing), where data and commands are sent via the internet or wireless networks.

“It is quite obvious that you will not be able to do the same as with the cloud and the edge, but it will cost less, use less energy, and be able to react faster. It will also increase security and privacy because data can be kept where it is collected. So there are many benefits to embedded AI,” says the project manager, Professor, Section Manager, and Deputy Director at DTU Compute Jan Madsen.

In the project, DTU, Aarhus University, the University of Copenhagen, and CBS collaborate with the pump manufacturer Grundfos Holding, the engine and machine manufacturer MAN Energy Solution, the window manufacturer VELUX, and the technology company Indesmatech.

Move AI from large platforms to small ones
Over the three years, the partners will work on specific problems posed by the four industry partners. These companies are strong representatives of businesses that can strengthen their competitiveness by knowing the right tools and platforms for leveraging embedded AI (eAI) in their products.

The project will examine the process of moving from large platforms to small ones, explore suitable tool platforms, investigate the opportunities that new types of chips provide for embedded AI, and map out how embedded AI may change companies’ business models.

Grundfos is experiencing a knowledge gap


The idea for the DIREC project has come through network meetings, where research institutions and industry talk about future competencies and technology needs. Here, Thorkild Kvisgaard, Head of Electronics, Director Technology Innovation at Grundfos, has participated.

He says the company sees a clear need to move some of its artificial intelligence from the large platforms running on mainframe computers and the like down into more embedded devices (AIoT), even though these will be very resource-limited platforms to work on. Doing so saves energy and avoids having to send data over the internet and depend on internet and cloud solutions that run outside one’s own control.

“It will, of course, turn out that you cannot do quite as much on platforms with limited resources, but we do not know those limits today. And maybe we can do a lot more than we think. If we work with something that is not time-critical, it does not matter that the embedded AI has to spend several minutes figuring something out, if it is a slow and complex process,” says Thorkild Kvisgaard.

“At Grundfos, we have experimented with the technology ourselves, but we are experiencing a gap between what data science experts work with on large cloud platforms and what IoT programmers work with. So we hope that the project will also create a better understanding of each other’s work areas.”

Chip becomes crucial

The industry partner Indesmatech acts as the local office for chip manufacturers, facilitates various development projects with new technology, and helps companies develop technology.

The company looks forward to clarifying the possibilities of working with embedded AI algorithms, explains co-founder of Indesmatech Rune Domsten:

“What is interesting about the Embedded AI project, in addition to the software used for AI, is to investigate which chip and hardware platforms to execute on and use in different situations, because the battery consumption in sensors depends heavily on which chip you use. It can be a question of whether the battery lasts for, e.g., five or ten years.”

Although the industry partners in the DIREC project, as large companies, are already working on AI, the project could also have great significance especially for small companies that lag behind on artificial intelligence, says project manager Jan Madsen:

“While it may seem rather uninteresting from a research perspective to develop small AI algorithms, there are actually major research challenges in developing efficient architectures and methods that can be used in smaller, resource-limited sensors/IoT devices. It can also be what gets a small business started using AI for complex tasks and processes.”

About DIREC – Digital Research Centre Denmark

The purpose of the national research centre DIREC is to bring Denmark to the forefront of the latest digital technologies through world-class digital research. To meet the great demand for highly educated IT specialists, DIREC also works to expand capacity within both research and the education of computer scientists. The centre has a total budget of DKK 275 million and is supported by the Innovation Fund Denmark with DKK 100 million. The partnership consists of a unique collaboration across the computer science departments at Denmark’s eight universities and the Alexandra Institute.

The activities in DIREC are based on societal needs, and research is continuously translated into value-creating solutions in collaboration with the business community and the public sector. The projects operate across industries with a focus on, among other things, artificial intelligence, the Internet of Things, algorithms, and cybersecurity.

Read more at direc.dk

Embedded AI

Partners in Embedded AI:

  • DTU
  • Aarhus University
  • University of Copenhagen
  • CBS
  • Grundfos Holding A/S
  • MAN Energy Solution
  • Indesmatech
  • VELUX

Contact 
Jan Madsen
DTU Compute
Technical University of Denmark
jama@dtu.dk


A new project will make it easier to design and certify IoT systems

8 December 2021


IoT devices are blending into the infrastructure of both society and our personal lives. Many of these devices run in uncontrolled, potentially hostile environments, which makes them vulnerable to security attacks. Moreover, with the increasing number of safety-critical IoT devices, such as medical and industrial IoT devices, IoT security is a public safety issue. Thus, the need for security in these systems has been recognized even at the governmental and legislative level, e.g. in the EU, US, and UK, resulting in proposed legislation to enforce at least a minimum of security considerations in deployed IoT products.

Photo by Søren Kjeldgaard

Professor Jaco van de Pol will lead the DIREC project Secure IoT systems (SIoT), which aims to model security threats and countermeasures for IoT systems and services, to develop secure solutions, and to analyze residual security risks.

“Our goal with the SIoT project is to make it easier to design and certify secure IoT devices. Security and privacy are very important to many people and organizations that use IoT devices for measurements in smart cities, natural environments, logistics chains, and in their private homes. Engineering IoT devices is challenging, since they are physically small and must run on low power. Yet, they must perform accurate measurements and communicate with high efficiency. So how can one achieve security on top of that? We will provide new tools to model security threats, implement countermeasures, and analyze the final security risks.”

Jaco van de Pol continues: “I am happy to be able to work with a team that includes both academic researchers and industrial experts. This will ensure that the project addresses the right questions, and that we can find new solutions by combining the expertise from several disciplines. And we can evaluate the solutions in an industrial setting.”

The strategy is to use algorithms from automata theory and game theory to automate risk analysis and security strategy synthesis. The implementation of the security policies will consider both technical as well as social aspects, in particular usability in organizations and training of people.

For TERMA A/S, who are part of the project, their motivation is to be aware of the landscape in IoT systems in order to make them more cyber-resilient. Samant Khajuria, Chief Specialist Cybersecurity at TERMA A/S, explains:

“When we integrate IoT systems in our line of business, our main purpose is to provide safety for critical systems. Our systems go to both the defense and civilian sectors, such as wind farms, airports, or harbors. We know that IoT devices will sooner or later become obvious pieces of the puzzle in providing good systems in the future, and before integrating them in systems like this, we need to understand the threats and risks. Secondly, we would like to collaborate with universities in Denmark, because the researchers work with this every day. We are merely the users of the technology.”

Jørgen Hartig is Managing Director and Partner at SecuriOT, which is also part of the project. He hopes the project will help create the needed awareness on both sides of the “table” about the Industry 4.0 environment. They often hear customers saying: “Why would the hackers go for us? We do not produce anything interesting…”, “the production has been running for 25 years, and we haven’t had an issue”, or “there are no connections between the IT systems and the OT systems.”

“The last statement will be challenged dramatically in the next 5-10 years. IoT and OT vendors will come out with new technology solutions that will utilize cloud-enabled applications and 5G connections to the factory floor, so there will be no “air-gap” in the future. I am not saying it is wrong, I am just saying that the consumers and IoT vendors need to work with the cyber threats and risks in a structured way.”

According to Gert Læssøe Mikkelsen, Head of Security Lab at the Alexandra Institute, there is a need for improved cyber security in IoT, which is also the reason why they participate in the project:

“We see a need for academic research in close collaboration with industry to deal with this. We hope that the tools and methodologies developed in this project will be deployed and improve the cybersecurity of IoT so we are all ready for the future, where we both expect an increase in the threats from cybercriminals and, as a consequence, an increase in requirements and regulation in this area that the industry must be ready to handle.”


SIoT

In SIoT, the following parties will participate as collaborators:

  • Aarhus University
  • Aalborg University
  • DTU
  • Copenhagen Business School
  • Alexandra Institute
  • Terma
  • Grundfos
  • Develco Products
  • Beumer Group
  • Micro Technic
  • SecuriOT
  • Seluxit

Contact
Jaco van de Pol
Department of Computer Science
Aarhus University
jaco@cs.au.dk


Companies and researchers will develop digital artefacts to support the future hybrid workplace

1 December 2021


What should the next generation of Zoom and Teams look like? This question will be explored by researchers and companies in a new DIREC project led by Associate Professor Eve Hoggan. The project will gather researchers from universities all over Denmark, as well as several industrial collaborators. Hoggan will lead the project REWORK, which will rethink and develop the future of hybrid work forms.

Photo by Søren Kjeldgaard

There are a multitude of reasons to embrace remote and hybrid work. Climate concerns are increasing, borders are difficult to cross, work/life balance may be easier to attain, power distributions in society could potentially be redressed, to name a few. This means that the demand for systems that support hybrid work will increase significantly.

The recent COVID-19 pandemic, and the attendant lockdown, demonstrated the potential benefits and possibilities of remote work practices, as well as the glaring deficiencies such practices bring. Zoom fatigue, resulting from high cognitive loads and intense amounts of eye contact, is just the tip of an uncomfortable iceberg where the problem of embodied presence remains a stubborn limitation.

The research project REWORK: The Futures of Hybrid Work, led by associate professor Eve Hoggan, aims to enrich digital technologies for hybrid work. The goal is to design and develop artefacts and processes to support organizations in exploring and preparing for successful collaboration in the future.

Remote and hybrid work will certainly be part of the future of most work practices, but what should these future work practices look like?

“I think we need to aim higher than merely fixing the systems we already have,” says Associate Professor Eve Hoggan, and continues: “We need to be bolder and consider a different future for our workplace if we want to secure successful collaboration. And that is what REWORK is all about. We will, in particular, focus on the representation of embodiment and physical surroundings in a digital/analog setting, as this is one of the most important obstacles to successful hybrid work.”

Bankdata is a company that needs such tools. For them, it is crucial to be able to attract and retain the best employees. According to Peter Bering, Head of Digitalization at Bankdata, the workplace must be flexible with good opportunities for socializing, and in this regard the company’s digital products play an important role.

“The hybrid workplace is more than just a good video connection. It should also be characterized by a high level of commitment, creativity and cohesion, which is not easy to achieve with the technology we use today. But we are ambitious in this area, and therefore we have decided to engage in – and not least contribute to – the latest research in the field through a collaboration with DIREC,” says Peter Bering.

Lene Bach Graversen, Head of Facility at Arla, hopes that the project will provide more digital tools that can support agile collaboration at a distance.

“Like many other companies, we do not know exactly what will happen in the future. We hope that the feedback and knowledge we gain can direct our focus towards what tools are needed by the employees to optimize their online meetings, which have become a regular part of our work. Many of our employees work both at home and at the office, and we see that it offers advantages as well as disadvantages. We need to look at other available tools and how to develop them so that we can continue to support our employees. The strength of collaborating is that you learn from each other.”

Mads Troelsgaard, CEO and co-founder of SynergyXR, participates in the project to share knowledge with the universities, but also because they want to make their AR/VR and mixed reality platform available to the project. For the past ten years, SynergyXR has developed AR/VR and mixed reality applications for some of the largest companies in the world.

On their platform, you can meet colleagues in HoloLens, with VR glasses, or in a room on a PC, and in this way explain complicated knowledge on a completely different level than is possible on Zoom or Teams. Companies may also upload videos, photos, PDF files, and other materials, and in this way establish their own AR/VR setup. They build ‘the corporate metaverse’, where companies can construct their own metaverse.

“A lot of things appeal to us in this collaboration. We have a platform that is easy to access, which provides the opportunity to meet in a completely new way, and which changes the way companies collaborate at a distance. In addition, we would like to contribute with our many years of experience as tech specialists within XR technology. In return, we hope to gain a lot of new knowledge about what’s happening out there, but also to get feedback on our platform. We can also help train future employees to better understand the potential of XR technology, which is another advantage. In this way, we see a lot of ‘wins’ from the collaboration.”


REWORK

In ReWork, the following parties will participate as collaborators:

  • Aarhus University
  • University of Copenhagen
  • IT University of Copenhagen
  • RUC
  • Alexandra Institute
  • Catch (Center for Art, Design, and Technology)
  • Microsoft Research, Cambridge UK
  • L&T InfoTech
  • Khora
  • Zimulate
  • KeyLoop
  • Studio Koh
  • Synergy XR
  • Lead
  • BEC
  • Cadpeople
  • Bankdata
  • Arla

Contact: 
Eve Hoggan
Department of Computer Science
Aarhus University
M: +45 93 50 85 56
eve.hoggan@cs.au.dk

Follow the project on cs.au.dk/rework or on Twitter @ReWork_Direc