• DarkCloud@lemmy.world
    1 day ago

    Do we know how human brains reason? Not really… Do we have an abundance of long chains of reasoning we can use as training data?

    …no.

    So we don’t have the training data to get language models to talk through their reasoning, especially not in novel or personable ways.

    But also - even if we did, that wouldn’t produce ‘thought’ any more than a book about thought can produce thought.

    Thinking is relational. It requires an internal self-awareness. No amount of discussing that in text makes a book suddenly conscious.

    This is the idea that “sentience can’t come from semantics”… More is needed than that.

    • A_A@lemmy.world
      1 day ago

      I like your comment here, just one reflection:

      Thinking is relational, it requires an internal self-awareness.

      I think it’s like the chicken and the egg: they both come together… One could try to argue that self-awareness comes from thinking, in the fashion of: “I think, therefore I am.”