


Once we create autonomous(‑ish) artefacts, who is responsible, who has rights, and who governs so that humans don’t end up in the creature’s position—or in Victor’s?
c. Asymmetry as a design principle

For a "Frankenstein and AI" jurisprudence, one useful move is to build explicit asymmetry into rights-based frameworks:
Humans: Full bearers of rights; AI must not be designed or deployed in ways that undermine human dignity, autonomy, or equality.
AI systems:
No intrinsic rights (for now);
Possible derivative protections (e.g., rules against gratuitous virtual cruelty), justified not by the AI's own interests but by the need to safeguard human character and prevent desensitisation, akin to some animal-law rationales.
In other words: a jurisprudence that learns the lesson Shelley tried to teach. Law should not wait for the creature to become a monster; it should regulate Victor, the lab, and the society that enables him, while keeping the dignity and safety of ordinary humans at the centre of the story.