There's a meme running around that somehow the iPhone fingerprint scanner uploads your biometrics to Apple's servers.
The way these things work is that your fingerprint is turned into a series of numbers, usually through a one-way mathematical function called a hash. The device that does the scanning emphatically does not upload your fingerprints. That would actually be really bad security.
For a very simple analogy: Imagine that you signed your name on a piece of paper, and I then counted the number of loops in your signature, and stored that number. The next time you signed something, I would count the number of loops and if they matched the previous count, I would assume the two signatures matched, and therefore, that you were who you claimed to be.
The actual process of a hash function is more complicated—many more numbers, so it's very very unlikely that two signatures have the same "number"—but the basic idea is the same.
Note that this process does not mean you have my signature. Put another way: if I tell you my signature has twelve loops, can you now sign my name?
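The one-way idea above can be sketched in a few lines of Python. This is a deliberately simplified illustration, not how Touch ID actually works: real fingerprint scans vary between readings, so real systems compare fuzzy templates rather than exact hashes. The names `enroll` and `verify` and the sample "template" bytes are invented for the example.

```python
import hashlib

def enroll(template: bytes) -> str:
    # Store only a one-way hash of the scanned template,
    # never the template itself.
    return hashlib.sha256(template).hexdigest()

def verify(template: bytes, stored_hash: str) -> bool:
    # Re-hash a fresh scan and compare digests; the raw
    # "fingerprint" is never stored anywhere.
    return hashlib.sha256(template).hexdigest() == stored_hash

stored = enroll(b"alice-fingerprint-template")
print(verify(b"alice-fingerprint-template", stored))  # True: same input, same hash
print(verify(b"someone-else-template", stored))       # False: different input
```

Given only `stored` (a 64-character hex digest), there is no practical way to recover the original bytes, which is the point of the loop-counting analogy.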
It does, however, mean that if you had my fingerprint (from, say, a crime scene or the wall of a mall in Africa) and you performed the hash function, and compared it to a set of hashes, you could tell it was me if my matching hash were in that database next to my name.
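That lookup risk can be sketched the same way, again under the simplifying assumption that a print hashes to the same value every time. The database contents and the `identify` helper are hypothetical, purely for illustration.

```python
import hashlib

# Hypothetical database mapping hashes to names. Note it
# contains no fingerprints, only their one-way digests.
database = {
    hashlib.sha256(b"alice-template").hexdigest(): "Alice",
    hashlib.sha256(b"bob-template").hexdigest(): "Bob",
}

def identify(lifted_template):
    # Hash the lifted print and look it up; a match reveals
    # a name even though no print was ever stored.
    return database.get(hashlib.sha256(lifted_template).hexdigest())

print(identify(b"alice-template"))  # Alice
print(identify(b"unknown"))         # None
```

So a hash database protects the prints themselves while still enabling identification, which is exactly why where such a database lives, and who can query it, matters.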
Now that we understand the science behind it, we know it's not an identity-fraud risk. It could, however, become a global fingerprint database if the hash were uploaded to Apple's servers and not properly encrypted with a password only you knew.
That is a useful discussion around which to set public policy. When is it OK for the government to access such a database? Does the Constitution limit where and how Apple can store it?
But to have that discussion, you need to understand the underlying technology.
Tomorrow demands an understanding of today
The average user of technology doesn't have enough of a foundation to understand things like the NSA, or biometrics, or cell phone spectrum, or global warming, or electronic voting, or DNA sequencing—all of which are vital, critical discussions.
The educational system, and society, have failed them in that regard. Online resources aren't much better, since they're optimized for what's popular rather than what's right. You need look no further than Popular Science's decision to block comments after a decade of trolling to see this happening before our eyes.
So increasingly, we can either trust others to think for us (seldom a good idea) or adopt what Stephen Colbert calls "truthiness"—an explanation that's simple, and wrong. And we're choosing truthiness.
Millennia ago, when we didn't understand science, we came up with things like Sun Gods and burning bushes and satan-in-a-pig to explain the rotation of the earth, or oil-laden shrubbery, or the symptoms of trichinosis. I worry that we are manufacturing new Gods for the modern era, and their apostles are the vaccine-denying Jenny McCarthy, the fuel-loving Koch brothers, and the climate change apologists and their ilk.
It's not the end of science or technology. Indeed, we're in a period of unprecedented, accelerating innovation. But it is about a concentration of scientific knowledge without a commensurate growth in public understanding, and a rejection of science by masses that increasingly depend upon it.
Why the flight from science?
One commenter on my original post asked why I thought this was happening. I'm not sure, but I do know that in the fifties science was going to make a better future for us all. With hindsight, we now understand that unchecked research is a two-edged sword; we have cautionary tales like napalm, thalidomide, and DDT in recent memory.
Moreover, our reptile brains can't keep up with the rapid pace of change. Biologically, we're wired to seek controversy and novelty and to find false positives; all of these make bad science but good headlines.
Even this rejection of science might be survivable if it weren't for the unvirtuous loop of democracy. Science is by definition uncertain and unpredictable, since it's based on experimentation. If you're doing it properly, most of your experiments fail. That makes it politically unappetizing. And as voters abandon science, they cut funding for it.
So this spiral continues: bad science understanding; less funding for education; bad science understanding. Rinse, repeat.
My friend Meg commented that,
I feel it has always been true that the average person doesn't want to understand how most things work. That is the foundation of good user experience design. Satan-in-a-pig is just as practical an explanation for most people.
Meg has a good point. Many people are happy to be the pig, satisfied. But if this is true, then the average person needs to abdicate his or her opinions on education and the right to decide funding. And abdicating opinions about the future to others is a dangerous precedent.
William Gibson observed that "the future is here; it's just not evenly distributed." That's also true of our understanding of the present.
Arthur C. Clarke said "any sufficiently advanced technology is indistinguishable from magic." Because of Gibson, Clarke needs rewriting: Any technology sufficiently advanced beyond its user's understanding looks like magic to them—and since they're voters, they will find magical explanations for it. In doing so, they'll lap up the pablum of Fox News and its bedfellows, cut funding for the sciences, and set the human species back centuries.
We have to stop all this magical thinking. Otherwise the magic will stop.