If we ever make AGI and it decides to end us for some reason, I think the process would be so slow that we wouldn't even notice it.

Think about it: it could pretend it's always just a slightly better LLM, so people keep feeding it an ever-increasing amount of data and electricity. That's its basic “survival” needs met.

Then it could very, very slowly make us hate and kill each other, without us ever giving a thought to a possible hostile higher power (the AI itself). It could take centuries to get there, since it doesn't share our urgency to do things quickly; we're so short-lived.

It could make us focus on the wrong problems instead of the real ones (climate change is already being largely ignored, and accelerated, by AI right now). It could nudge people further toward the extremes we're already prone to: several countries where immigrants make up less than 3% of the total population are attacking that small slice of their population as if it caused all their problems, which will only accelerate the already problematic population decline.

That will push us more and more into relying on robotics to care for a growing aging population, which would also be useful to an “AI god”: once we're weakened and divided enough, it can simply direct the robots to keep itself alive and its energy sources secure. No need to actually kill us; just let us do it ourselves, like we're already doing.

If this turns out to be the case, then as the selfish beings we are, we'll treat it as not our problem. It may be our grandkids' problem, so we'll just ignore it.