Fully autonomous weapons would lack the human judgment necessary to evaluate the proportionality of an attack, distinguish civilians from combatants, and abide by other core principles of the laws of war. History suggests their use would not remain limited to narrow circumstances.
It is unclear who, if anyone, could be held responsible for unlawful acts caused by a fully autonomous weapon: the programmer, the manufacturer, the commander, or the machine itself. This accountability gap would make it difficult to ensure justice, especially for victims.
Fully autonomous weapons could also be used in circumstances outside of armed conflict, such as border control and policing. They could be used to suppress protest and prop up regimes. Even force intended to be non-lethal could still cause many deaths.