vonxylofon@lemmy.world to Today I Learned@lemmy.world • TIL The US Secret Service trains in Tyler Perry's White House, because they can't get a new training facility funded (English)
2 days ago

Yes, the United States Schutzstaffel.
I still fail to see how people expect LLMs to reason. It’s like expecting a slice of pizza to reason. That’s just not what it does.
Although Porsche managed to make a car with the engine in the most idiotic place win literally everything on Earth, so I guess I'm leaving a little possibility that the slice of pizza will out-reason GPT-4.
Join us. Thrive.
With a clear set of criteria, you can easily argue that the designer of the discrimination system is culpable, because they input discriminatory criteria into the system. I'm with you there.
However, with AI, unforeseen discriminatory behaviour can easily emerge. In that case, I would argue that for the purposes of calling a decision discriminatory, it is indistinguishable in practice whether the computer is purely evaluating criteria or making a decision on its own.
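To make that concrete: even when no discriminatory criterion is ever input, discrimination can emerge through a proxy feature. Here's a minimal synthetic sketch (all data and the "postcode" proxy are invented for illustration): the decision rule never sees group membership, yet the outcomes differ sharply between groups because postcode correlates with group.

```python
import random

random.seed(0)

# Synthetic population: group membership is hidden from the decision rule,
# but postcode correlates strongly with group (a proxy feature).
people = []
for _ in range(10_000):
    group = random.random() < 0.5  # protected attribute, never shown to the rule
    in_group_postcode = random.random() < 0.9 if group else random.random() < 0.1
    postcode = 1 if in_group_postcode else 0
    people.append((group, postcode))

# "Model": approves anyone from postcode 0. The designer input only a
# seemingly neutral criterion, yet a disparity emerges.
def approve(postcode):
    return postcode == 0

rate_a = sum(approve(p) for g, p in people if g) / sum(1 for g, _ in people if g)
rate_b = sum(approve(p) for g, p in people if not g) / sum(1 for g, _ in people if not g)
print(f"approval rate, group A: {rate_a:.2f}")  # roughly 0.10
print(f"approval rate, group B: {rate_b:.2f}")  # roughly 0.90
```

The point of the sketch: from the outside, this system is indistinguishable from one whose designer deliberately encoded "reject group A", which is why the criteria-vs-decision distinction stops mattering in practice.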
The same happens e.g. when discovering new proteins using AI. The AI comes up with a protein, you confirm it's better than the previous one, victory. There may be a better one out there, but that's not really a concern here. The same can't be said when targeting a group of people with repressive measures.