Fair Observer’s Communications and Outreach officer, Roberta Campani, speaks with physicist and former Chief Algorithm Officer Bill Softky about how digital systems are reshaping modern law. Drawing on information theory and decades in Silicon Valley, Softky argues that corporations are exploiting the mechanics of information processing to “hack” legal systems. What began as a technical insight about computers may now help explain why courts increasingly privilege procedural compliance over substantive justice.
Information, surprise and the logic of hacking
Softky begins with first principles. In both brains and computers, small inputs are amplified into large effects. A single corrupted bit can crash a machine or redirect its behavior. That vulnerability, he explains, is the essence of hacking: feeding specially crafted inputs into a system that processes information in predictable ways.
He extends this logic beyond software. Plants, he notes, evolved bright flowers to attract insects, effectively capturing their sensory systems to ensure pollination. Hacking, in this broader sense, is not confined to malicious coders. It is any strategy that exploits how an information-processing system works.
Softky now turns to Claude Shannon, the founder of information theory, who defined information as “change and surprise.” Information is the part of a signal that the receiver did not already know. Whether or not we pay attention, the signal carries measurable informational content. Legal systems, like brains and computers, are also information-processing systems. They receive inputs, apply rules and generate outputs. If their inputs are manipulated, their outputs will be distorted.
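Shannon's idea of information as surprise has a precise mathematical form: the information carried by an event is the negative logarithm of its probability, so rare events carry more bits than expected ones. The following minimal sketch (the function name is illustrative, not from the interview) shows the calculation:

```python
import math

def surprisal_bits(probability: float) -> float:
    """Shannon self-information in bits: -log2(p).

    The less likely an event, the more information (surprise)
    its occurrence conveys to the receiver.
    """
    return -math.log2(probability)

# A fair coin flip (p = 0.5) carries exactly one bit of surprise.
print(surprisal_bits(0.5))   # 1.0
# A one-in-four event carries two bits.
print(surprisal_bits(0.25))  # 2.0
# A near-certain event (p = 0.99) carries almost no information.
print(surprisal_bits(0.99))
```

In this framing, a message the receiver already expected carries close to zero bits, which is why Softky equates information with what the receiver did not already know.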
When contracts become magic incantations
Campani asks how this logic appears in court. Softky describes a Kansas case in which parents sought to sue a software company for allegedly mishandling a child’s data. The central issue was not whether harm occurred, but whether the parents were allowed to bring the case at all. The company argued that a terms-of-service agreement stripped them of that right.
Softky characterizes this as a form of legal hacking. A digital contract, he says, becomes a “magic incantation” that causes rights to vanish. In his view, “merely having your eyes exposed to its pixels causes your legal rights to evaporate.” Courts are asked to accept that exposure to on-screen text equals informed consent.
He contrasts this with older legal traditions. Historically, contracts involved tangible goods and observable use. The principle that use of the product implies consent made sense when someone bought a hammer or stove and used it for months. Software, by contrast, is “a bunch of blinking dots on a screen.” Companies cannot prove that a user read, understood or meaningfully agreed to dense digital terms. Yet courts are urged to assume comprehension based on technical records showing that an email was sent or a box was clicked.
In one case, a company claimed that notifying customers of new terms by email sufficed to bind them. A judge responded bluntly: “I get thousands of emails a day. I can’t possibly read them all.” For Softky, the exchange exposes the absurdity of a system that legally obliges humans to perform impossible cognitive tasks.
From human judgment to automated enforcement
The deeper shift, Softky argues, is historical. Early legal codes, from the Code of Hammurabi in the 18th century BC to English common law in the 12th century AD, were written down but interpreted by what he calls “high bandwidth subtle human beings.” Laws guided human judgment rather than replacing it.
Today, however, written contracts and corporate structures dominate. Enforcement is increasingly automated. Softky contends that this allows “utter piles of nonsense” to acquire legal force because machines and rigid procedures lack contextual understanding. Regulatory capture compounds the problem, as well-resourced actors shape technical rules to their advantage.
He illustrates the broader pattern with examples from California. Cancer-warning placards appear on nearly every building, offering no actionable guidance yet satisfying statutory requirements. Electronic highway signs flash segmented messages that drivers cannot safely read at speed, yet drivers are legally bound by them. In each case, technical compliance substitutes for practical sense. Systems are designed around administrative convenience and technological novelty rather than human cognitive limits.
Recentering law on human limits and intent
Campani presses Softky for solutions. He offers three principles. First, law must recognize biological realities. Human nervous systems process information at finite speeds; attention and memory are limited. Disclaimers cannot neutralize subconscious manipulation in an information-saturated environment.
Second, humans must be reintroduced into enforcement. Automated systems, such as red-light cameras or algorithmic judicial processes, should not operate without meaningful human oversight. Judgment, not mere rule execution, is essential to justice.
Third, courts should prioritize the intent of the law over its letter. Technicalities that block common-sense adjudication undermine the rule of law. Judges should be empowered to consider whether procedural claims align with the substantive purpose of legal protections.
Softky insists that these principles reflect engineering realities. As technology accelerates, the temptation to encode more law into software will grow. Yet the faster systems move, the more carefully governance must account for human limits.
Even corporations depend on predictable legal frameworks to protect capital. If the rule of law erodes into a battlefield of technical hacks, no actor remains secure. In an economy driven by algorithms and data flows, safeguarding justice may require rediscovering an older truth: law is not merely code. It is a human practice, grounded in interpretation, intention and shared cognitive constraints.
[Lee Thompson-Kolar edited this piece.]
The views expressed in this article/video are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.