A stock objection to Calvinism is that it would be unjust (or "monstrous") for God to condemn evildoers whom he predestined to commit evil in the first place. They never had a chance to do otherwise.
Let's assume, for the sake of argument, that determinism (or predeterminism) is incompatible with moral responsibility. Now let's recast the argument by making a comparison.
In the Terminator franchise, a Terminator is a robotic assassin: an artificially intelligent android programmed to kill a particular individual.
(For some reason they are called cyborgs, but from what I can tell, they don't have any human parts. They merely have a human appearance.)
Terminators are like glorified cruise missiles or smart bombs. They don't necessarily need full-blown consciousness. They just need enough (artificial) intelligence to identify the target, ascertain information on the ground, and adapt to varied situations.
They don't need "consciousness" in the sense of the internal dimension, viz. first-person viewpoint. They don't need to know "what it's like to be me."
But since this is all hypothetical, we could endow them with consciousness. That's surplus.
Terminators are fearsome in two respects:
i) They have superhuman strength. They are tireless, relentless, and resourceful. Virtually unstoppable. Humans on the run have to sleep. They don't. Even if you get a head start, they will catch up.
ii) But, if anything, they are even more fearsome in another respect: they are utterly pitiless. That's because they are inhuman. Machines. As such, they are incapable of feeling compassion for another human being. They can't project themselves into our mindset. They don't know what it feels like to be human. You can't appeal to their empathy. There's no hook.
Now, suppose a Terminator is programmed to kill a child, to preempt what he will become. To change the future.
And to make sure it kills the child, it allows itself a margin of error by planning to wipe out an entire classroom full of second-graders.
According to the hypothetical under consideration, the Terminator is amoral. Because its actions are programmed, it isn't blameworthy.
But even if we grant that for the sake of argument, it would be morally imperative to stop the Terminator by any means necessary. Destroy the Terminator before it kills innocent children.
That's despite the fact (ex hypothesi) that the Terminator isn't a morally responsible agent. Even though it's not culpable, it has no right to endanger the kids.
Neutralizing the Terminator isn't punitive. Rather, it's protecting the innocent.
BTW, this isn't just hypothetical. There are some real-world analogues. For instance, people on a psychotic drug-high can be dangerous.
Someone might say that, given a choice, it would be preferable to reprogram the Terminator rather than destroy it. Perhaps so.
However, we don't owe it to the Terminator. A Terminator can, indeed, be reprogrammed. It can be programmed to be a nanny, gardener, chef, quarterback, ballet instructor, or violinist. It can be programmed to be masculine or feminine.
That's because a Terminator is a blank slate. It has raw intelligence. It has great potential. But it has no innate personality or character traits. Its memory is wiped after each mission.
It isn't supposed to be any particular way. Its identity is essentially indefinite. Whatever the programmer wants it to be.

So it wouldn't be wrong to destroy it rather than reprogram it. You wouldn't be wronging the Terminator. It's not as though it deserves better treatment. For its character is supplied by the programmer.