Don't like being stared at? Our primal need for privacy is why we have to fix the Internet
Paper: 'How privacy's past may shape its future'
(NOTE: Hello Red-Tape-ers. I’ve been a bit quiet lately. I am hard at work on a large project that will be published soon, and then you’ll see me back in your inbox more frequently. Thanks for being a loyal reader.)
Ever sit in a coffee shop, staring idly into space, when you get the weird sensation that someone is staring at you? How did you know? What a strange skill to have.
Our innate desire for privacy is indeed powerful. It's also primal.
Privacy is a human right, at least in Europe; a new paper makes the convincing argument that privacy is human nature, too. Carnegie Mellon professor Alessandro Acquisti writes in the new issue of Science that humans have engaged in privacy-seeking behavior throughout history and across the globe. This matters because there are those who think privacy is a modern invention, and perhaps a temporary one at that. Such thinking would be convenient if your technology company were based on invading people's privacy.
"From the Greek philosophers of the fourth century BC ‘discuss(ing) the concept of private life,’ Acquisti writes, "To Chinese thinkers developing ‘a sharp distinction between the concepts of public and private by the third century BC; from lovers in ancient Rome who would need to ‘borrow the house of an indulgent friend’ to escape the prying eyes of their servants, to 1950s Javanese culture, where little physical seclusion was available and people manifested their privacy needs through stiff social interactions. Evidence of privacy mores even arises from within the ancient holy books of monotheistic religions—from the Quran to the Talmud to the Bible."
The piece is co-authored by Laura Brandimarte at the University of Arizona and Stanford’s Jeff Hancock.
I've heard Acquisti and others make this point before, but he felt it necessary to study the subject now, as Congress creeps closer to passing regulations on Big Tech companies. To do that right, policymakers must understand how fundamental privacy is to being human, Acquisti argues.
"There is this implication of presenting privacy as a modern invention … it can be used to convince people not to worry about the current state of privacy. We wanted to (examine) those claims and counter them," he told me.
At stake is the way Congress or other government agencies might choose to implement and enforce any new privacy protections. A clumsy notice-and-consent model has slowly evolved during the Internet's first few decades, with consumers tolerating pop-up boxes and privacy policies they couldn't possibly understand -- and tech companies doing whatever their lawyers feel they can get away with after jamming those notices with fine print. The occasional privacy vigilante might spend days or weeks rejecting permissions and deleting cookies, but as anyone who has tried knows, that task is Sisyphean. Often, software updates wipe out such preferences anyway.
In real life, privacy takes far more subtle, but effective, forms. There’s that sense that you are being stared at, for example. Often, staring back is enough to chase away the space invader. Alternatively, you might decide to engage with the person and smile back. Our lives are full of these small privacy-seeking actions, Acquisti writes.
"We lower our voice during intimate conversations and raise it when we want a large audience to hear us; we cover a document we are reading so it is protected from prying eyes, or raise it up and emphatically show it when we want to make a point."
What Acquisti calls "sensorial" cues that we use to navigate privacy in the real world do not exist online. Despite years of handwringing, we've done almost nothing to provide consumers with any analogous tools for their digital lives. Instead, consumers unconsciously divulge gigabytes of data about themselves to thousands of companies they've never heard of, only to be horrified every once in a while when there's a large data breach or scandal involving a data broker.
In a world without these sensorial cues, a system that relies on consumers to understand what's happening to them and make sensible choices is destined to fail. Consumers can't be "fixed." Instead, the technology must be fixed to fit human nature. It can be done. We've done it before, Acquisti argues. Consider the evolution of automobile safety:
“Thanks to continuous technological improvements, the speed and acceleration of production cars kept growing over time. Once they reached velocities that rendered drivers’ reaction times unreliable tools for avoiding collisions, the solution was not to teach individuals to develop faster reaction abilities, but rather to develop policy interventions (e.g., mandatory safety standards on cars for accident avoidance and damage reduction) and technological fixes (e.g., anti-lock braking systems, airbags) that countered the challenges arising from other technological progress. Better safety in cars was the result of deliberate policy intervention driving investment in technical and infrastructural changes, not merely of more driver education and market forces.”
If privacy is part of what it means to be human, then technology designed to invade that privacy is perhaps a bigger menace than we've recognized so far. This means it's time for dramatic, corrective action. That won't come if efforts to pass a federal privacy law boil down to adding more pop-ups and fine print to our consumer products.
“My concern is that this momentum towards a federal law will be channeled towards notice and consent, perhaps under a different name … rather than going the direction of privacy-enhancing tools,” he said to me.
Acquisti likes the notion of "data fiduciaries" who collect consumer information but have an obligation to use it only in the consumers' best interest -- akin to financial advisors who are legally bound to do what's best for their clients.
“The history of privacy tells us that the drive to maintain a private space may be as universal as the drive to commune (and that the two drives are in fact intimately related); and that humans invariably attempt to carve out those spaces even when odds are stacked against them by surveillance, be it digital or physical,” the paper concludes. “The reason why concerns over privacy endure, despite privacy being repeatedly declared dead, may be in part cultural and in part related to visceral, evolutionary roots. The current state of privacy also tells us, however, that those spaces have become unquestionably harder for individuals to manage. Solutions that predominantly rely on notice and consent mechanisms are inadequate—because they fail to take into account the visceral roots of our sense of privacy, and thus can be easily gamed by platforms and service providers. Understanding and then accounting for those ancestral roots of privacy may be critical to secure its future.”