dvtt@lemmings.world to News@lemmy.world · English · 10 months ago
Study Reveals Gender Bias in ChatGPT Translations (researchinenglish.com) · 31 comments
AndOfTheSevenSeas@lemmy.world · 10 months ago
Computers do not have the sentience required to be sexist.

knightly the Sneptaur · 10 months ago
They don’t need sentience to be sexist. Algorithmic sexism comes from the people writing the algorithms.

AndOfTheSevenSeas@lemmy.world · 10 months ago
deleted by creator

AndOfTheSevenSeas@lemmy.world · 10 months ago
Interesting, then, that you chose to describe the LLM as sexist rather than the programmers, despite knowing nothing about them.

lolcatnip@reddthat.com · 10 months ago
Programmers don’t program sexism into machine learning models. What happens is that people, who may or may not be programmers, provide them with biased training data, because getting unbiased data is really, really hard.
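The mechanism described above can be sketched with a toy example. This is not the study's methodology or any real model, just a minimal illustration of how a purely statistical system reproduces whatever bias its training text contains: when a source language leaves gender ambiguous, a frequency-based "translator" falls back on the majority pronoun in its corpus.

```python
from collections import Counter

# Toy corpus standing in for training data (an assumption for illustration,
# not any real dataset): text where "doctor" co-occurs with "he" more often
# than "she", and "nurse" the reverse.
corpus = [
    "the doctor said he would call",
    "the doctor said he was busy",
    "the doctor said he agreed",
    "the doctor said she would call",
    "the nurse said she was busy",
    "the nurse said she agreed",
    "the nurse said he was busy",
]

def pick_pronoun(noun: str) -> str:
    """Choose a pronoun for a gender-ambiguous source word by corpus frequency.

    No one programmed a preference here; the function just returns whichever
    pronoun dominates the sentences containing the noun.
    """
    counts = Counter()
    for sentence in corpus:
        words = sentence.split()
        if noun in words:
            counts.update(w for w in words if w in ("he", "she"))
    return counts.most_common(1)[0][0]

print(pick_pronoun("doctor"))  # "he"  -- majority pronoun near "doctor"
print(pick_pronoun("nurse"))   # "she" -- majority pronoun near "nurse"
```

Swap the corpus for a balanced one and the outputs change; the "sexism" lives entirely in the data the system was given, which is the commenter's point.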
This is a nothing argument.
They’re nuts. Easy block, IMO.