Metrology-Inspired Methods to Assess the Biases of Artificial Intelligence Systems

Author: Belkacem Laimouche

Abstract:

As the field of Artificial Intelligence (AI) experiences rapid growth, driven by technological advances that enable increasingly innovative and promising applications, there is a growing need for rigorous methods to assess the performance of AI systems in pursuit of transparency and equity. This article proposes a metrology-inspired statistical framework for evaluating bias and explainability in AI systems. Drawing on the principles of metrology, we present an approach, illustrated with a concrete example, for evaluating the accuracy and precision of AI models and for quantifying the sources of measurement uncertainty that can introduce bias into their predictions. We also explore a statistical approach for evaluating the explainability of AI systems based on their ability to provide interpretable and transparent explanations of their predictions.
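
A minimal sketch of this metrological view, written in Python with entirely hypothetical data, is given below: repeated predictions of a model are treated like measurement results, from which trueness (systematic bias), precision (repeatability), and a simple combined uncertainty are estimated. The variable names, the simulated data, and the quadrature combination are illustrative assumptions, not the article's implementation.

import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical reference ("true") values and repeated model predictions,
# e.g. the same inputs scored by 10 models trained with different random seeds.
reference = rng.normal(loc=50.0, scale=5.0, size=200)
predictions = reference + rng.normal(loc=1.2, scale=2.0, size=(10, 200))

# Trueness: systematic deviation (bias) of the mean prediction from the reference.
bias = np.mean(predictions) - np.mean(reference)

# Precision: spread of repeated predictions for the same inputs (repeatability).
repeatability_sd = np.mean(np.std(predictions, axis=0, ddof=1))

# One simple way to combine the systematic and random contributions in quadrature
# (a simplification of the full GUM treatment of uncorrected bias).
combined_u = np.sqrt(bias**2 + repeatability_sd**2)

print(f"bias (trueness)           : {bias:.3f}")
print(f"repeatability (precision) : {repeatability_sd:.3f}")
print(f"combined uncertainty      : {combined_u:.3f}")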

Keywords: Artificial intelligence, metrology, measurement uncertainty, prediction error, bias, machine learning algorithms, probabilistic models, inter-laboratory comparison, data analysis, data reliability, bias impact assessment, bias measurement.

