Romanian Journal of Information Technology and Automatic Control / Vol. 34, No. 3, 2024


Defective truth. AI or HI ideological imprints and political biases?

Adrian LESENCIUC

Abstract:

Artificial Intelligence, in its AI 2.0 version (Pan, 2016), implies continuous adaptation to the information environment. The development of AI is driven by research and development requirements and by the need to respond optimally to a changing information environment. Change in the information environment, in turn, entails the development of AI and, consequently, of information networks understood as human-machine hybrid-augmented intelligence. This dynamic is not confined to the informational or physical dimensions but extends to the cognitive one (JCOIE, 2018), which can be affected by the information flows that feed the decision-making process. Against this background, several sources of AI-generated corruption of truth can be identified. The first is the generalization process through which statistical algorithms derive the instructions used to build the artificial neural network. The second concerns the human selection of the samples to which statistical algorithms are applied to produce learning, and the selection of the principles according to which information is filtered. Both produce truth-twisting errors, similar to those that operate in prejudice, stereotyping, and discrimination, and leave ideological imprints on how AI operates. This article aims to analyse, from the perspective of AI ethics, the forms of truth falsification arising from the machine learning process specific to AI. To this end, an interpretive/qualitative meta-analysis of primary studies on the political biases of AI is proposed.
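The two bias sources named in the abstract lend themselves to a brief illustration. The following is a minimal, hypothetical Python sketch, not drawn from the paper or its primary studies: a naive Bayes-style word-frequency classifier is trained on a deliberately skewed sample, and a neutral input inherits the curator's imbalance. All sentences, labels, and the classifier itself are invented for illustration.

```python
# A minimal, hypothetical sketch (not from the paper): how the human
# selection of training samples -- the second bias source named in the
# abstract -- leaves an ideological imprint on a statistical learner.
# All sentences and labels below are invented for illustration.
from collections import Counter

# Hypothetical curated corpus: one framing is over-sampled 3:1.
training_samples = [
    ("the reform protects citizens", "positive"),
    ("the reform empowers communities", "positive"),
    ("the reform supports families", "positive"),
    ("the reform burdens taxpayers", "negative"),  # under-represented view
]

# "Generalization": per-label word frequencies become the model's rules.
word_counts = {"positive": Counter(), "negative": Counter()}
label_counts = Counter()
for text, label in training_samples:
    label_counts[label] += 1
    word_counts[label].update(text.split())
vocabulary = {w for text, _ in training_samples for w in text.split()}

def classify(text: str) -> str:
    """Naive Bayes-style scoring with Laplace smoothing; the skew in
    label_counts and word_counts is inherited from the curated sample."""
    scores = {}
    for label, counts in word_counts.items():
        score = label_counts[label] / sum(label_counts.values())
        total = sum(counts.values())
        for word in text.split():
            score *= (counts[word] + 1) / (total + len(vocabulary))
        scores[label] = score
    return max(scores, key=scores.get)

# A neutral sentence is labelled "positive" not because of anything in
# the world, but because of the curator's 3:1 sampling decision.
print(classify("the reform affects everyone"))
```

The point of the sketch is that the statistical generalization itself is sound; what it generalizes from encodes the curator's choices. This is the sense in which the abstract speaks of ideological imprints rather than algorithmic malfunction.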

Keywords:
AI Ethics, Machine Learning, Training Examples, Social Representations, Political Biases, Meta-analysis.

CITE THIS PAPER AS:
Adrian LESENCIUC, "Defective truth. AI or HI ideological imprints and political biases?", Romanian Journal of Information Technology and Automatic Control, ISSN 1220-1758, vol. 34(3), pp. 9-22, 2024. https://doi.org/10.33436/v34i3y202401