Sjmarf@sh.itjust.works to Lemmy Shitpost@lemmy.world · 1 month ago
How to clean a rescued pigeon (sh.itjust.works) · 139 comments
Rivalarrival@lemmy.today · 1 month ago:
You say this like human “figuring” isn’t some “autocomplete bullshit”.
Here we go…
HighlyRegardedArtist@lemmy.world · 1 month ago:
You can play with words all you like, but that’s not going to change the fact that LLMs fail at reasoning. See this Wired article, for example.
Rivalarrival@lemmy.today · edited · 1 month ago:
My point wasn’t that LLMs are capable of reasoning. My point was that the human capacity for reasoning is grossly overrated.

The core of human reasoning is simple pattern matching: regurgitating what we have previously observed. That’s what LLMs do well.

LLMs are basically at the toddler stage of development, but with an extraordinary vocabulary.