Biases and cultural and social responsibility

(Kim) #1

Talking about biases, do you feel that AI and machine learning carry the risk of stalling cultural shifts and progress?

For example, suppose you train your algorithm to reflect the current ratio of women and men in CEO roles. Women will be far less represented, so how easily can that change?

Should the algorithm already be taught to choose 50/50 representation even if that is not currently true? Are we in danger of coding in our own biases and blocking cultural change, or otherwise affecting it?
