By Elias Watanabe
A century ago, courts debated whether the state could detain a body without due process. Today, the question looks eerily similar—but the “body” in question is digital. Every person now trails a data double: a shifting dossier of clicks, purchases, health metrics, and geolocation pings. Banks use it to judge creditworthiness, insurers to price risk, employers to screen candidates. Increasingly, this second self is more decisive than the flesh-and-blood one. Yet unlike our physical bodies, our data selves enjoy no clear legal protection.
The Rise of the Data Double
Consider the case of predictive policing. Algorithms fed on historical arrest records identify “high-risk” individuals or neighborhoods. Those flagged may face heightened surveillance—even if they have never committed a crime. Or think of a health app that sells biometric data to third parties. The user never consents to higher insurance premiums, yet their heart rate and sleep cycles can quietly justify them. In both cases, the digital shadow makes decisions before the individual even knows it is cast.
Whose Property Is a Person?
At the root lies an unresolved tension: is data property, identity, or something else entirely? If data is property, individuals should be able to own and trade it. But if data is an extension of the self, selling it looks more like organ trafficking than a market transaction. The law is split. Europe’s GDPR leans toward identity, granting rights to erasure and correction. The U.S., by contrast, often treats data as a corporate asset, with ownership defaulting to the collector. The result is a patchwork that leaves most citizens unprotected.
The Case for Digital Habeas Corpus
Habeas corpus—the right not to be unlawfully detained—was once the cornerstone of physical liberty. A digital analog would recognize the right not to have one’s data self seized, altered, or exploited without due process. Imagine a system in which any algorithm that labels you “high risk” must provide its reasons, and those reasons are subject to appeal. Or one in which biometric datasets cannot be bought and sold without explicit consent, just as bodies cannot be trafficked. The principle is radical, but so was habeas when it first appeared.
The Warning Signs
Ignoring this debate is not a neutral stance. Already, identity theft shows how easily data selves can be hijacked. Deepfake technology suggests futures where reputations can be forged wholesale. And generative AI raises the prospect of “digital ghosts”—versions of ourselves speaking, writing, and acting long after we are gone. Without legal guardrails, our second selves may prove more vulnerable than our first.
We once fought to establish that a person could not be imprisoned without recourse. Today, the same battle looms in the digital sphere. Do we still own our data selves? Or will we discover too late that they were never ours to claim?


