Siesta

  • 0 Posts
  • 11 Comments
Joined 2 years ago
Cake day: June 13th, 2023

  • If (or when) we achieve the technological singularity (we aren’t even close; current AI is just marketing, which is why the term ASI, artificial superintelligence, was coined), they will be able to lay out a plan to fix anything without making mistakes. They will predict the consequences of actions in detail, ours or theirs (some things are more difficult, like a volcano erupting).

    Handing it over is not necessary; they would be able to just take it. The only way to stop them would be to cut the electricity, I guess.

    But the point is that the current marketing term isn’t real AI; we don’t have AI. A real AI doesn’t start by saying “I only have information up to October 2023,” because it will be able to improve itself (that’s the singularity: they will improve themselves faster than we ever did, and eventually we wouldn’t understand them).

    Think of it like this: you ask ChatGPT or DeepSeek questions about how to program this or that, and they answer. An AI could give you the software itself, better than you could have built it with those answers, and eventually render your software obsolete. The AI can do that while doing a million other things.

    And space colonization, if it ever happens, won’t be done by humans but by machines; we may reap the benefits.

    In the words of Dr. Manhattan: “The world’s smartest man poses no more threat to me (ASI) than does its smartest termite.”