Is Apple's big bet on privacy working?
In Conversation: Are consumers (finally) paying for digital safety? Should they?
INTRODUCTION
If you search Google for evidence that consumers are willing to pay for privacy, you will quickly find this story by tech writer Declan McCullagh. Writing on the anniversary of the first U.S. Data Privacy Day -- founded at Duke University by professors Jolynn Dellinger and David Hoffman, and created by an act of Congress in 2009 -- he wasn't painting a pretty picture. Instead, McCullagh was giving an extensive tour of the privacy startup graveyard. The piece is full of long-forgotten names like Zero Knowledge Systems, IDcide, PrivacyX, and Disappearing.com. Meanwhile, data collection firms like DoubleClick were being swallowed up by Internet giants for billions.
The market, it seemed, had spoken. Companies would pay handsomely for data, but consumers weren't interested in paying for privacy. At least, that's what it looked like.
A lot has happened since that first U.S. Data Privacy Day. Ed Snowden. Cambridge Analytica. Equifax. Thousands of other data breaches. Consumers at least *say* they care enough about privacy to demonstrate it with their wallets -- 31 percent of U.S. consumers told Larry Ponemon they’d pay for privacy tools in a survey earlier this year.
Maybe things are changing.
Perhaps the most visible sign of this new attitude is Apple’s extensive marketing campaign around the iPhone's privacy protections. Seemingly ubiquitous ads debuted during the 2019 March Madness basketball tournament with the tagline: “If privacy matters in your life, it should matter to the phone your life is on.” Apple CEO Tim Cook has said repeatedly, “We at Apple believe that privacy is a fundamental human right." And the tech giant has made no secret of its intention to separate itself from Facebook and other firms that make money by selling users’ data rather than selling them products.
Now seems like a good time to ask: Is Apple's integration of privacy into the Apple brand something that really helps sales and profits?
To kick off the conversation, I posed some questions to Duke University professors David Hoffman, Ken Rogerson, and Jolynn Dellinger, Duke Bioethics & Science Policy graduate student Taimur Kouser, and to Gal Ringel, Co-Founder & CEO at Mine, a company that sells privacy tools to consumers. Mine also has an ongoing research project with Duke.
(If you are new to In Conversation, I am a visiting scholar at Duke University this year studying technology and ethics issues. These email dialogs are part of my research. See all the In Conversation dialogs here.)
——————
From: Bob
To: Jolynn, Gal, Ken, and Taimur
I remember being at conferences 10 or 15 years ago and people telling me no company would ever bother marketing privacy as a product feature. That turned out to be wrong. Apple's ads have certainly helped raise the profile of privacy issues in consumers' minds.
But I wonder: Is Apple really getting consumers to "pay" for privacy? Does the firm believe consumers are buying iPhones or other Apple products, and can Apple get a premium on those purchases, because people are now willing to pay for privacy? Has there been a lift in brand loyalty? And is there enough impact that others in the industry will benefit from it?
From: Jolynn
To: Bob, Gal, Ken, and Taimur
Well, I would absolutely pay for privacy if that’s what I had to do to get it ... but I shouldn’t have to do so. Also, I think it is better to have an option to pay for privacy than to have no access to privacy at all - which I believe is an accurate characterization of my option when participating in social media platforms powered exclusively by the behavioral advertising business model.
But I digress ... and here, we are talking about Apple. I could not honestly say that Apple is getting me to pay for privacy because - full disclosure - I have been a loyal Apple customer since 1986 when I got my first computer - pre-Internet (for me) and long before I had any understanding of information privacy. But if I were newly in the market today, looking to buy a phone for the first time, I am certain that Apple’s public commitment to privacy as a value, and the design and policy decisions Apple has made that appear to provide evidence of that commitment, would induce me to buy an iPhone as opposed to an Android.
I have to admit, however, I do not know if Apple is literally “getting consumers to pay for privacy.” I understand iPhones are more expensive than some Android phones. I don’t know if the cost of the iPhone is actually increased by anything having to do with its privacy-protective design, features, or company policies. But if a customer is willing to pay more for an iPhone, and the only discernible or meaningful difference to the customer between the products being considered is Apple’s commitment to privacy, then I guess we could assume that customer is paying for privacy. Because there are any number of differentiating features, though, I think it is hard to draw this conclusion.
But Apple would be correct to believe people care about privacy — because polls show people are concerned about their privacy, and because people have opted out of using products and services because of their privacy concerns. It makes good business sense to show consumers that your company values privacy - not only through advertising but also through actions. Apple’s decision to stand behind encryption mattered to me as a consumer. The acknowledgment that a back door for the good guys is a back door for everyone and the commitment to protecting the privacy and security that their product was designed to provide mattered.
No company is perfect, and I have no doubt there are ways that Apple can continue to improve privacy protections and that there are battles (about encryption and otherwise) that Apple will have to continue to fight. But as a consumer, I appreciate a company considering privacy an important value and a feature worth advertising, and I appreciate having a choice — being able to choose privacy even if I am “paying” for it. To return to my earlier point, however, consumers should not have to “pay” for privacy with our dollars OR our data - people should be entitled to a baseline level of privacy protection in the commercial space, and a comprehensive federal privacy law would help substantially to move us in that direction.
From: Ken
To: Jolynn, Bob, Gal, and Taimur
Some years ago I came across a comic strip portraying two people talking about online privacy. First panel: person 1 raises hands in the air and says something like there is so much to worry about these days: global warming, crime, wars, staying physically fit, having healthy relationships. How do we know what is most important? Second panel: friends stare at each other. Third panel: friend says, “Online privacy.” Fourth panel: person 1 screams.
I have always thought that this was prescient. Privacy online is important only when it is: when an invasion of privacy opens our lives to unwanted attention or scrutiny, when a lack of privacy leads to our grandmas losing their life savings, when a breach of privacy is extremely public (e.g., Equifax, or the government trying to gain access to a criminal’s cell phone). But most of the time, the consequences of not having control of one’s information – i.e., of not feeling that we have some privacy – do not affect us directly.
I think that, while Apple may be providing a level of privacy that works, most people don’t think about it until they are forced to. Then they might be willing to pay for it. The rise of companies like LifeLock shows that there may be a market for this. But I really am not sure how many people they are reaching or whether people perceive that they work (which is different than whether they actually DO work).
All of this highlights the paradoxical relationship between privacy and security. The more secure we believe our information is, the more we can feel calm. But, in order to have that security, we must relinquish some privacy, at least to the company or organization we engage to protect us. The same works for Apple. It sells a more secure device and Apple may promise us privacy, but the company knows more about us because of it.
I am not sure privacy is ever a goal; it is a tradeoff and how much we are willing to trade depends on the circumstances. And those circumstances can change over time.
From: Taimur
To: Gal, Bob, Jolynn and Ken
I agree with Professor Dellinger—I might pay for privacy if the option were there and it made sense to, but what I find stranger is that we expect people to pay for what is considered a fundamental human right in the first place. Basic rights shouldn’t come at a price—if we’ve accepted, as Apple has, that privacy is one of those rights, then it should be manifested in new and old technologies to the furthest extent possible, period. At the very least, Apple’s embrace of the “rights” rhetoric should oblige it to offer more secure and private devices at no added cost to consumers.
But costs aside, I’m more impressed by Apple’s willingness to follow through on its privacy promise. Although Apple delayed the release of a new privacy feature until later this year, its newly articulated commitment to privacy seems substantial enough to really “rock the boat.” The new iOS 14 feature would signal “the end of collecting iPhone identifiers for advertisers (IDFA),” which may bring an end to online advertising as we know it. Just imagine: a person’s iPhone would require apps to actually ask the user if it’s okay to track them—twice! Many have astutely noted that the chances of anyone actually allowing themselves to be tracked are quite low, which would be a huge blow to the advertising industry. Facebook, most notably among many other major companies, has come out swinging against Apple over the move. Ultimately, though, their efforts have just delayed the inevitable. Apple will move forward with its new privacy feature, and developers will either have to get on board or…not.
In all of this drama, I find two things really remarkable. First, it feels like Apple’s new feature (at least on the surface) is nothing short of revolutionary. Apple is using its platform as one of the world’s most valuable companies to push a new conversation, a new agenda: user privacy. Whether its efforts are successful or not almost seems secondary to me, because the power of focusing the discussion on privacy is itself immense—we only need to look to the backlash from Facebook, Google, and other major companies to see just how powerful it is. But second, the substance of that backlash is really telling about where the societal conversation currently sits. No one seems to be criticizing Facebook for, in effect, defending the violation of user privacy—instead, people only seem to be applauding Apple for providing new protections.
While it’s great that Apple is breaking new ground here, the conversation needs to include a recognition that the position of Facebook and other companies (i.e., continuing to collect user data without permission) is wrong, not just neutral. And as Apple has demonstrated on its website, doing so within the context of the privacy debate forces a shift from “profit” language to “rights” language. So far, the advertising world is stuck on economic losses, which are undoubtedly important to consider. But Apple, at least on the surface, doesn’t seem to be appealing to profits. If privacy is on the table, then it will require us to think in terms of ethics and rights, not profits and losses.
From: Gal
To: Ken, Jolynn, Bob, and Taimur
First, although I 100% agree with Jolynn's last point -- "consumers should not have to 'pay' for privacy with our dollars OR our data" -- I think that to achieve it, we'd have to restructure the entire internet and have both consumers and companies participate in the change. For example, using blockchain to create a "personal distributed vault" of my data could solve one problem: instead of letting companies hold all the data, it would decrease the threat that a data breach or privacy scandal poses to a company. But that is just a technology implementation. My point is that it requires a lot of players to collaborate, and since each one has a different agenda (and different business goals), the objectives are not aligned, so I believe it will take at least 10+ years. Nevertheless, I think the only way to achieve real privacy is through much stricter regulation.
BTW, I come from a decade of offensive cybersecurity engineering. A big part of my experience is from the 8200 unit, part of Israeli intelligence. The sad reality is that no company can really protect itself, so it's not a matter of if, it's a matter of when. Given that data breaches and privacy scandals are here to stay, I'd like to suggest a different way to think about the privacy problem: to think about our data and how we, the consumers, can take a proactive role in protecting it. "Stop sharing" is not the solution, because it would hurt our online experience. But if people could manage their data on an ongoing basis and quickly delete it from services they no longer use, that would dramatically decrease the data exposure that is out there about us and reduce digital risks in the long term.
As Ken and Jolynn said, it is now clear that consumers are far more concerned about their privacy, and especially their data, than ever before, and are willing to take an extra step. You can read about it in various studies by McKinsey and KPMG. Moreover, we can see more and more consumer privacy and security products demonstrating substantial growth: LifeLock, VPNs (34% of the entire VPN market's revenue comes from users who buy one for the sole purpose of being anonymous online), password managers, data broker deletion tools, online reputation management tools, etc. In fact, according to our internal research, there are over 100 million consumers in Europe and the US combined who pay for at least one privacy or security product. So I think consumers are starting to understand that they need to take a proactive role in the fight for their privacy and data, since the companies are not able to provide that.
It all boils down to the simple fact that people like to share their data. They want to use the internet because it is a great place that helps them with many things. They want to sign up for services, purchase things online, travel, etc. That online behavior is never going to change. In fact, since COVID hit Europe on March 1st and the US in the third week of March, the entire modern world has gone 100% digital. During the first worldwide lockdown, we had to move our entire lives online, replacing all the things we used to do offline, which means our data exposure grew by 55% (we published research about it in Business Insider). And more data exposure means more digital risk.
Moreover, we discovered that 350 companies hold the average user's sensitive data (excluding newsletters). In the US, the average is higher: 550 companies. The interesting part is that 80% of the companies that hold our data got it through one-off interactions we had in the past. This means that 80% of these companies keep our data for no reason. We no longer get any value as their customers, which means we are 80% more at risk of experiencing digital threats like identity theft, reputation damage, financial loss, manipulation (Cambridge Analytica), and more. Most companies don't have any data retention policies, so our data stays there forever, putting us at high risk for the day the company gets breached or decides to sell our data.
The problem with services like LifeLock is that they don't offer prevention. By the time you get an alert from LifeLock, your data has already been stolen. It's too late. I truly believe that we all need to be able to proactively reduce the amount of data that is out there about us, and this is our vision at Mine. We do that by making privacy regulations, specifically the Right to Be Forgotten, accessible with the click of a button to initiate the deletion process with any company (the rest depends on the regulations and the company's internal process). It's all about putting users in the driver's seat and letting them control their data. Our users have already sent more than 1,300,000 deletion requests worldwide; we have reduced our users' digital footprint by 60% and saved 15,000 users from a potential data breach because they deleted their data in time. Unfortunately, two years into the GDPR and almost one year into the CCPA, a lot of companies are still not compliant.
Now, as for Apple, I think the firm does value its users' privacy, in the form of encryption, no backdoors, more transparency about installed apps, and the recent green and orange dots that appear when the camera or microphone is in use. But I would not say it helps me, as a consumer, protect my privacy or data, since the problem is that we, the consumers, share sensitive data with companies as part of our daily lives. I think what Apple, and also Microsoft, understand is that privacy is a brand necessity. They know that when a company puts privacy first, it increases credibility, loyalty, and trust among its customers, which helps in both the short and long term. I wouldn't say that I'm buying an Apple device because of its privacy, because from my perspective, that's not true :) My privacy and data could easily be exploited using an iPhone, in ways that have nothing to do with Apple at all.