A more perfect tech
We need a digital bill of rights. And the First Amendment is a good place to start.
WNYC Studios has a podcast called More Perfect, about the amendments to the US constitution, that I’ve been listening to lately. It’s fascinating, and full of metaphors, examples, and quirks that help unpack the US constitution.
In the first episode, Burt Neuborne, the Norman Dorsen Professor of Civil Liberties at New York University School of Law, explains the First Amendment. If you don’t know these 45 words by heart, they are:
Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances.
Neuborne argues that there is a logic to the order of these rights. They aren’t just a list. He points out that these are the “life cycle of a democratic idea.” That is, a citizen may have ideas and beliefs within their mind, where they’re free to form. Then they can try them out, talk about them, tell others, organize, and finally ask government to change.
I think we need an equivalent set of rights for freedom of technology.
Why isn’t this a thing yet?
As Neuborne points out, these rights build upon one another, and recognize the evolving nature of law and justice.
Freedom to imagine
Innovation starts in our heads. When it comes to technology, ideas often have unintended consequences. But they start with an idealistic, often subversive, spark. Startup founders want to bend the world to their ideas, and those ideas are often unpopular or unreasonable at the outset. By definition, what startups do is search for a new business model.
A germinating idea is analogous to the constitutional right to freedom of religion. It gives us permission to believe in a world different from the one we inhabit today.
Why do ideas need defending? Because while it might seem like we can all believe what we want, oppressive regimes give us pause. Simply knowing that certain beliefs aren’t legal curtails our imagination.
In the early stages of an idea, we discuss it with others in private. Here, surveillance has a chilling effect, even if those ideas aren’t ever going to become fact. Criminalizing speculation limits imagination. If you’re considering, say, replacing voting with a blockchain-based, syndicated democracy, is that treason?
I’m reading The Three-Body Problem, a Chinese hard science fiction novel packed with controversial ideas about science, government, theoretical physics, and much more.
If a future government disapproves of the ideas it explores, is that criminal?
This is digital freedom of religion: The right to imagine myriad technological futures.
Freedom to experiment
Once that idea is fully formed, we need to be able to experiment. This is where idea becomes fact. A maker building something, or a hacker unlocking a phone. A researcher testing the security of a device. Unfortunately, such experimentation can be done for nefarious purposes—but we have laws for that. There are ethical guidelines that govern research.
In an earlier era of academic research, we experimented for the pure, unfettered pleasure of finding things out. Lately, in most liberal democracies, belief in the invisible hand of the market means research is often funded for profit, ending in a patent or a company. That’s dangerous: Fundamental research has outcomes that we can’t easily see in advance.
Should you be allowed to unlock a phone to try something out? Are the protections of commercial interests violated by such experimentation—or only when the hacker acts in ways that undermine the intellectual property rights of the phone’s manufacturer?
Microsoft just brought its portfolio of 60,000 patents to the Open Invention Network, an open-source patent consortium. Think about that. A company which has been in front of the world’s courts, aggressively defending its inventions, just made them available to everyone.
The Fourth Amendment to the US constitution protects people against unreasonable search and seizure, and the Fifth against the taking of private property without just compensation. There are shadows of such rights at play in experimentation: Companies have rights, but they shouldn’t be able to curtail experimentation in ways that hurt the public good. And are those rights ever counter to the public interest? Should we have an eminent domain for overreaching intellectual property? How do the courts decide such things in favour of a better society?
This is digital freedom to exercise religion. As a society, we have to say that tinkering is allowed, even without commercial intent. Hacking is okay.
Freedom to document
If you’ve found something cool, you need to be able to share it. Whether publishing source code on GitHub, releasing something as open source, or writing a user manual, you should be able to give your ideas to others.
This is a critical juncture in the development of technology: It’s not just you any more. It therefore warrants a degree of restriction, but these are still just ideas.
In the early days of public key encryption, PGP was classified as a munition—leading its creator, Phil Zimmermann, to publish the source code as a paper book to circumvent export laws. People have tattooed encryption code on themselves, effectively making their bodies munitions under the law.
The DeCSS decryption algorithm had a similar fate:
A T-shirt with the DeCSS algorithm printed on it. Photo by edrabbit on Flickr (https://www.flickr.com/photos/edrabbit/3228250152), licensed CC BY-NC 2.0 (https://creativecommons.org/licenses/by-nc/2.0/).
Similar protections must exist for code shared under Apache, GPL, Creative Commons, and other share-alike licenses. If anything, these must be kept free of commercialization. We need to protect them from those who would privatize what is intended as a public good, while still enforcing the digital equivalent of libel laws.
This is digital freedom of speech. We have to recognize the right of open source, distributed documentation, and shared code to exist without making their authors criminals.
Freedom to distribute
Software often changes the world because of strong network effects: The more people use it, the more useful it becomes. This is true of the Internet, the Web, all social media, tools like Skype, and shared content systems. More than just sharing documentation, we’re now allowing third parties to distribute what we have to say, at scale. We’ve compiled our ideas, turned on our servers, spun up our platforms.
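One rough way to formalize that intuition (an approximation on my part, not something this argument depends on) is Metcalfe’s law, which values a network by the number of possible connections among its users:

```latex
% Metcalfe's law (approximate): a network's value V grows with the number of
% possible pairwise connections among its n users.
V(n) \propto \binom{n}{2} = \frac{n(n-1)}{2} \approx \frac{n^{2}}{2}
```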
Speech is individual. The press is broadcast. The Internet is multi-directional; it can be one-to-one, many-to-many, or anything in between. Just because transmission doesn’t fit a familiar broadcast model doesn’t strip away its rights. By the same token, as our ideas reach others, we need to consider how they affect them. We can be free, but we can’t be reckless with the rights of others.
Code is fundamentally different because it does something. The press might publish an article, but that could at most change minds. Code, on the other hand, can act independently of humans. That raises the bar: Where the press has libel laws, we need to ensure that software is transparent and avoids abusive dark patterns. Authors of code must be accountable; what they build must be signed, and should do what it claims.
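To make “signed and accountable” a little more concrete, here is a minimal sketch of one way code signing can work in practice, using an Ed25519 keypair from Python’s cryptography library. The artifact contents and workflow below are illustrative assumptions, not a standard.

```python
# A minimal sketch of signed, accountable code: the author signs exactly what
# they ship, and anyone can verify the bytes came from the key's holder.
# The artifact and workflow here are illustrative assumptions.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The author generates a keypair once and publishes the public key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# The released artifact (for example, a compiled binary or source tarball) as bytes.
artifact = b"example release contents"

# The author signs the exact bytes they publish.
signature = private_key.sign(artifact)

# Anyone downloading the artifact can check it hasn't been altered.
try:
    public_key.verify(signature, artifact)
    print("Signature valid: this is what the author published.")
except InvalidSignature:
    print("Signature invalid: the artifact was tampered with or mis-signed.")
```

Verification only proves who signed the code, not that the code is benign; accountability still depends on being able to inspect what it does.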
This is digital freedom of the press. Publishing code that does something should have the same protections as publishing words at scale.
Freedom of community
In the constitution, the public has the right to assemble peacefully. That is, people can come together to organize, protest, and combine into a force that cannot be ignored. The constitution enshrines this right, because that’s how we keep government honest. It’s how we evolve and adapt.
This right has a condition, of course. It isn’t a license to assemble no matter what. That assembly must be peaceful — it can’t be a mob. It must abide by the rights of others, because this is the point at which it might infringe on them.
Similarly, in a digital world, we need the right to distribute as long as it doesn’t infringe on the life, liberty, and pursuit of happiness of others. User groups can exist. But law-abiding software distribution, or the delivery of content on public platforms, is more than just protected speech. The fact that public platforms like Twitter, Facebook, or LinkedIn are commercial properties, and aren’t enshrined in the constitution the way the traditional press is, doesn’t absolve us from figuring this out.
User groups are digital freedom of assembly. Digital spaces where technology is shared, improved, discussed, and upgraded are the online equivalent of assembly. Ongoing support matters if software is to remain current.
Freedom to demand upgraded government
The last right in the First Amendment—“to petition the government for a redress of grievances”—isn’t so much a right of citizens as an obligation of government. Congress must listen.
Otherwise, Congress can’t become progress.
This last right forces an upgrade. Once things have been imagined, tested, documented, shared, and a community of support has emerged, governments must remain current.
We’re asking governments to adopt Lean Startup principles. If a better way exists, governments must embrace it. They have to want the future sooner, despite the uncertainty. But we have to give them permission to do so:
As citizens, we have to recognize that digital means experimentation.
We must acknowledge that perfect is the enemy of good enough because a digital tool is constantly updated. The first version won’t work for everyone, or have every feature, and we need to be OK with that.
Given license from the electorate to do so, governments need to try things out, and communicate their goals, progress, and outcomes clearly.
Most importantly, we need to realize that failure isn’t failure if we learn, and we have to stop punishing public servants and elected officials for genuine attempts to improve.
Until now, much of digital government has been abdication. The billions of dollars spent each year on tax-preparation software are a direct consequence of governments ceding their user interfaces to the private sector.
This has horrible side effects. When federal agencies fail to create proper foundational technologies like digital identity, they force Crown corporations to use private-sector tools. Canada Post, for example, currently verifies identity through a foreign credit service that has itself been hacked.
Similarly, we can sign in to sites with Twitter, Google, Facebook, and more using federated authentication models, but we have no unique identity with our own government—which is far better able to identify us in a trusted way that doesn’t try to profit from our identity.
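For what “federated authentication” means in practice, here is a rough sketch of the token exchange at the heart of an OAuth 2.0 / OpenID Connect sign-in. The government identity-provider endpoint, client ID, and secret below are hypothetical placeholders; a real provider would publish its own.

```python
# A rough sketch of federated sign-in (OAuth 2.0 / OpenID Connect
# authorization-code flow). Endpoints and credentials are hypothetical.
import requests

TOKEN_ENDPOINT = "https://id.gov.example/oauth2/token"  # hypothetical government identity provider
CLIENT_ID = "my-service"            # issued when the service registers with the provider
CLIENT_SECRET = "keep-this-secret"  # held server-side, never sent to the browser

def exchange_code_for_identity(authorization_code: str, redirect_uri: str) -> dict:
    """After the user signs in at the identity provider and is redirected back
    with a one-time code, the service trades that code for tokens."""
    response = requests.post(
        TOKEN_ENDPOINT,
        data={
            "grant_type": "authorization_code",
            "code": authorization_code,
            "redirect_uri": redirect_uri,
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
        },
        timeout=10,
    )
    response.raise_for_status()
    # The response typically contains an access_token and an id_token: a signed
    # assertion of who the user is. The service trusts the provider's signature
    # instead of managing passwords itself.
    return response.json()
```

The point of the sketch is the trust relationship: whoever runs the identity provider vouches for who we are, which is exactly the role a government is well placed to play.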
This is digital freedom to petition: We, as citizens, have not only a right, but an obligation, to force governments to adopt proven technologies in a timely manner when they can benefit society as a whole.
The rights of our digital doppelgängers
The transition from an analog to a digital world is the biggest shift human society has ever faced:
Patterns of communication that were once costly are now free—or even compensated when they capture attention for a platform.
Modern software, unlike the machines of the steam age, helps create its own better successors.
In a connected world, everyone is a sensor and a broadcaster, challenging the friction of atoms that gave us an illusion of privacy.
We find our special interests at a global scale, tricking our reptile brains into thinking we’re part of a moral majority.
We’ve gone from finding suspects and collecting data on them to collecting data and finding suspects in it.
Technology knows more about us than any human, and yet our phone doesn’t enjoy attorney-client privilege.
Digital records mean we can’t ever outgrow our past, or shed past opinions; there is no reconciliation.
Yet despite these obvious, tectonic changes to society, we continue to try and apply outdated laws to modern troubles, twisting ideas that are increasingly irrelevant. We need to amend society for the connected age—and the First Amendment needs a technological cousin for our digital doppelgängers.