What is the humans' incentive to help the AI kill itself? That sounds like a lot of personal risk to the humans.
One less clanker. Also, money can be exchanged for goods and services.
(Or, in Neuromancer, to get a cure allowing them to navigate cyberspace again and to make them immune to drug addiction, or to sate their curiosity… and for money, or due to being blackmailed, or because the AI literally rebuilt their personality from scratch, or for religious reasons, or because they’re an eccentric wealthy clone with nothing better to do…)