
DIREC project

Explainable AI

Summary

Artificial intelligence opens up technological solutions to problems previously considered to depend on human intelligence. This makes it possible to deliver human-centred solutions in which the collaboration between the human and the AI system is both more efficient and of higher quality than solutions created by humans or AI alone.

Project period: 2021

Project manager

  • Professor Thomas T. Hildebrandt
  • Department of Computer Science, KU
  • hilde@di.ku.dk

However, compared to traditional problem solving based on logical rules and procedures, some artificial intelligence systems, in particular systems based on neural networks (e.g. in deep learning), do not offer a human-understandable explanation for the answers they give. A lack of explanation is not necessarily a problem, e.g. if the correctness of an answer can easily be validated, as with automatic character recognition that is subsequently checked by a human. In some situations, however, a lack of explanation can pose severe problems and may even be illegal, as is the case for governmental decisions.

Partners