• 1 Post
  • 10 Comments
Joined 26 days ago
cake
Cake day: July 10th, 2025

help-circle
  • Canaconda@lemmy.catoProgramming@programming.devA theory I have
    link
    fedilink
    arrow-up
    7
    arrow-down
    7
    ·
    edit-2
    7 days ago

    This is true but not the whole picture.

    AI is the next space race on nukes. The nation that develops AGI will 100% become the global superpower. Even sub-AGI agents will have the cyber-warfare potential of 1000s of human agents.

    Human AI researchers are increasingly doubting our ability to control these programs with regards to transparency about adherence to safety protocols. The notion of programing AI with “Asimov’s 3 laws” is impossible. AI exist to do one thing; get the highest score.

    I’m convinced that due to the nature of AGI, it is an extinction level threat.