As reliable as a horoscope, and less fun.
Without being alive, there is no morality.
If a gun killing a person were something that could be addressed by reforming the gun's ethics, we would send the gun to prison.
A gun might be said to kill a person when it has a design flaw and fires without any human's intent (as Sig Sauer's guns have apparently been doing) but the moral responsibility for that lies with the manufacturer.
Still, it requires a human for the gun to kill someone. The gun has no intent.
If you consider that one of the main forces behind morality, then an AI that wants to get something done in the best way possible would clearly choose a moral solution over an immoral solution every time.
zahlman•6mo ago