• JackGreenEarth@lemm.ee
    7 months ago

    Well, we haven’t solved the hard problem of consciousness, so we don’t know whether brain size or similarity to a human brain are factors in developing consciousness. But perhaps a more important question is: if it did develop consciousness, how much pain would it experience?

    • ColeSloth@discuss.tchncs.de
      7 months ago

      Physical pain? Zero.

      Now emotional pain? I’m not sure it would even be capable of emotional pain. So much of our emotion is intertwined with chemical balances and releases. If a brain achieved consciousness but had none of those chemicals at all… I don’t know if that’d even work.

      • Warl0k3@lemmy.world
        7 months ago

        While we haven’t confirmed this experimentally (ominous voice: yet), computationally there’s no reason even a simple synthetic brain couldn’t experience emotions. Chemical neurotransmitters are just an added layer of structural complexity, so Church–Turing still holds. Human brains are only powerful because they have an absurdly high parallel network throughput (computational bus might be a better term); the actual neuron part is dead simple. Network computation is fascinating, but much like linear algebra the actual mechanisms are so simple they’re dead boring - but cram roughly 86,000,000,000 of those mechanisms into a salty water balloon and it can produce some really pompous lemmy comments.
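        To give a sense of how dead simple the neuron part really is, here’s a minimal sketch (the `neuron` function and the weight/bias values are illustrative choices, not from any particular library or model): a single neuron is just a weighted sum of its inputs pushed through a threshold. Everything interesting comes from wiring enormous numbers of these together in parallel.

        ```python
        def neuron(inputs, weights, bias):
            """Fire (1.0) if the weighted sum of inputs crosses the bias threshold."""
            activation = sum(i * w for i, w in zip(inputs, weights)) + bias
            return 1.0 if activation > 0 else 0.0

        # A two-input neuron acting as an AND gate: it only fires
        # when both inputs are on, because a single input's weight
        # (0.6) can't overcome the bias (-1.0) on its own.
        and_weights = [0.6, 0.6]
        and_bias = -1.0

        print(neuron([1, 1], and_weights, and_bias))  # fires: 1.0
        print(neuron([1, 0], and_weights, and_bias))  # stays quiet: 0.0
        ```

        That’s the whole mechanism. The hard part isn’t the unit, it’s the throughput of billions of them firing at once.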

        Emotions are holographic anyway, so the question is kinda meaningless. It’s like asking if an artificial brain will perceive the color green as the same color we ‘see’ as green. It sounds deep until you realize it’s all fake, man. It’s all fake.