Facebook data, a powerful weapon, has been stolen and used against us. That's the real threat

You like a cute dog photo. You click a link to get NCAA basketball scores. You comment on a friend's new job. You download an app to navigate a conference, or a family wedding. And suddenly you are dropped into buckets like "a neurotic introvert, a religious extrovert, a fair-minded liberal or a fan of the occult." And a few shady business deals later, you become a pawn in global warfare.
As you contemplate what Facebook might or might not have done wrong during the election, let me clear things up for you: Facebook was just playing by our outdated American rules. So don't blame Facebook, blame us.
Privacy is a boring topic. Hacks are sexy, but fade from the headlines quickly. Those are sideshows to the actual frightening tech problem of our time: unintended consequences. This weekend's Facebook-Cambridge Analytica-Russian election hacking story should make this clear for you.
The problems that arise with reckless data collection and privacy abuse often aren't apparent at the time of collection. Sure, maybe you trust company X to do the right thing with your information, but what if it's acquired by company Y? Or if it's hacked? Or if it becomes a pawn in a nation-state attack? Privacy is a natural resource that's being exploited by large corporations every day, and even the most well-meaning of the bunch can't know what kind of downstream disasters might occur.
A breezy review of what happened, by my reckoning: You know how college basketball coaches use video and data analytics to find that one flaw in an opponent's game that could make the difference on a last-second shot ("Force the point guard to his left! He doesn't like to shoot when dribbling to his left.")? Well, Facebook and other companies do the same thing to you and me. Facebook uses advanced research techniques every day to turn small actions -- liking a puppy photo -- into a detailed analysis of you that can be exploited. "Harmlessly," some would say, to sell you insurance, or anti-wrinkle cream. OK, perhaps politically. ("Stir those puppy-loving environmentalists up with another post about the global warming hoax!") But all's fair in business and politics, right?
If you think that's all crazy talk, if you think that doesn't work, why else do you think Facebook is worth ... HALF A TRILLION DOLLARS? It works. Many people believe it's the most powerful tool -- I would say, the most important natural resource -- of our time.
Now we know that this powerful resource was obtained through fraud and used as a weapon by a foreign power against us. It was harvested through Facebook's platform by a researcher, who in turn gave it to Cambridge Analytica, which helped Donald Trump get elected, with enthusiastic help from Russia's Internet Research Agency.
If nuclear technology were stolen by a "researcher" and given to a foreign power, America would leap to DEFCON 1. This story is a bit more subtle, but only because people still don't recognize the incredible power of Facebook and, more broadly, the rampant privacy violation that is permitted by America's rule of law right now.
Inferences about our lives and our most intimate personal details are indeed a natural resource. They are what makes us most human. Right now, companies acquire this raw material basically for free and exploit it for all it's worth -- consequences be damned. It's not unlike those halcyon days when chemical companies were allowed to use all the water they wanted, for free. The Cambridge Analytica incident is the digital equivalent of a chemical leak into the water supply.
People like me, who harp annoyingly on privacy issues, have always been afraid of this kind of thing. Yes, we know that you clicked "yes" when Facebook asked if it could collect your comments and photos. And maybe, by using their software for "free," you gave the company implicit permission to use the data in other ways. But you certainly didn't give it informed consent to sell your intimate details to the highest bidder. You'd be shocked, embarrassed, and angered by what actually happens to the data, the way you are angered when you realize all the things Equifax has done with your data. (Only Facebook suffers far less regulation than Equifax.) Still, at least you've heard of Facebook.
Had you heard of Cambridge Analytica before this weekend? Only if you were a political nerd. And this is the point. Facebook, and all large data collectors, store this radioactive material on their servers and might genuinely do their best to keep it safe. But hackers can take it, as we've seen a billion times. Worse yet, a foreign power might take it and use it against us. Facebook might willingly give it to a firm under false pretenses.
Facebook has invented a disease that it can't be trusted to contain. The disease must be destroyed before it destroys us.
Not long ago, America and the world stared down a frightening threat and did the right thing through a mixture of regulatory action and corporate responsibility. Subliminal television ads were all the talk during the 1960s. Single-frame ads inserted into TV shows, whizzing by far too fast for our conscious brain but capable of injecting messages directly into our subconscious, caused a lot of thoughtful debate. In 1974, the FCC issued a policy statement saying "subliminal perception" is "contrary to the public interest." A book about the topic stirred up widespread anger and Orwellian warnings. After a few flirtations, TV executives abandoned the idea, for obvious reasons.
Today's data buckets are yesterday's subliminal ads. They are contrary to the public interest, and now we have direct evidence that they are dangerous. As others have pointed out, U.S. consumers can delete apps like Facebook, but in many parts of the world, it's the only way for people to communicate. And it's unrealistic to drop out of the data society anyway.
Now is the time to recognize that the exploitation of our innermost selves is akin to exploitation of the air and water, and to act with the same national focus. The problem is that big, and that important. Once upon a time, America took swift action for future generations to ensure our rivers weren't polluted. A mixture of sensible regulation, corporate self-restraint and employee conscience is needed to control this disease before our minds are permanently polluted.
Here's how Nuala O'Connor, President and CEO of the Center for Democracy & Technology, sees it:
"Now is a time of reckoning for all tech and internet companies to truly consider their impact on democracies worldwide. While the misuse of data is not new, what we now see is how seemingly insignificant information about individuals can be used to decide what information they see and influence viewpoints in profound ways."
"Internet users in the U.S. are left incredibly vulnerable to this sort of abuse because of the lack of comprehensive data protection and privacy laws, which leaves this data unprotected. The level of online microtargeting through major communications platforms that is possible today makes increased transparency essential."
"Every individual must have greater agency and control over their personal data by default, which is clearly not reality today. Communications technologies have become an essential part of our daily lives, but if we are unable to have control of our data, these technologies control us. For our democracy to thrive, this cannot continue."
