What is the original sin of the Internet?
In Conversation: The quickest way to right the Web's wrongs
INTRODUCTION
Perhaps you were a holdout against the notion that something is terribly wrong with the Internet. The events surrounding Jan. 6 -- both the plot to attack the Capitol and tech giants' reaction to it -- surely make that position harder to defend. The argument that democracy is under assault via technology seems easy to make.
If the Internet is really broken, what went wrong? Was there a fundamental mistake, some kind of original sin, that led to the mess of disinformation and Facebook angst and techno-exhaustion we all seem to feel? For this big, complex discussion I've enlisted the help of Richard Purcell, the first privacy executive at Microsoft; Khiara Bridges from UC Berkeley; Anne Klinefelter from the University of North Carolina; and Jolynn Dellinger and David Hoffman from Duke.
Today’s question around the original sin of the Internet is also the subject of a podcast I’ll be hosting next month.
(If you are new to In Conversation, I am a visiting scholar at Duke University this year studying technology and ethics issues. These email dialogs are part of my research and are sponsored by the Duke University Sanford School of Public Policy and the Kenan Institute for Ethics at Duke University. See all the In Conversation dialogs at this link.)
FROM: Bob Sullivan
TO: Khiara, Anne, Jolynn, Richard, and David
Looking backward isn't always worthwhile, but sometimes it is. When you make a mistake partway through a long mathematical calculation, you can't simply erase the final answer and correct it. You have to trace your steps back to the original error and calculate forward anew. I think it's time we did that with the Web.
So what's the Original Sin of the Internet? Nearly all business models it supports require spying on consumers and monetizing them. I heard a tech executive recently describe this version of the original sin as, "This notion that widespread invisible collection of your data is just the way the Internet works." While most people were busy searching out cute cat videos, the Internet was built on the idea that companies would just load up on data, all the time, and keep it forever. "Nobody really understood at the beginning what that would turn into."
And I fear that so many other ills -- whether it be Covid-19 misinformation, or data-driven discrimination, or our newly fragile ability to hold free and fair elections, or just the inability to be left alone -- flow from that error.
What do you think?
FROM: David Hoffman
TO: Bob, Khiara, Anne, Jolynn, and Richard
Bob, thanks for kicking this conversation off. For me, the original mistake in the mathematical equation was the idea that we could allow the collection of large amounts of personal information about people without a commensurate increase in protections to make certain that people at risk are not taken advantage of.
By way of a metaphor, it is as if we decided to throw a party at a club and originally invited fifty people and hired one person to work security. The party was then so successful that we decided to hold it in a sports stadium and to invite 50,000 people but still only hired one person to work security. That is a recipe for making the most vulnerable people at the party into prey. Now let’s say that we allow some people at the party to know exactly where each person parked and to be able to track when they will be walking to their cars and are most vulnerable.
That is the situation we have now with the internet. Individuals have little to no idea what malicious actors can learn about them and how they can use it to harm them and society. Whether the threat comes from buying data from data brokers, from third-party code installing tracking software on our phones and computers, or from cybersecurity attacks, we have not created law enforcement and regulatory organizations with sufficient authority and resources to protect individuals (and especially poor and non-white communities) from harm. In the U.S. we convinced ourselves that the Federal Trade Commission could enforce its unfair-or-deceptive-trade-practices authority with roughly the same resources it had thirty years ago, and that the market would largely self-regulate to punish bad actors. That clearly has not worked. If we had passed a strong U.S. federal privacy law in 1990 and created a metric for scaling the Federal Trade Commission's resources at a rate similar to the scaling of data collection, then our mathematical solution might be much closer.
FROM: Anne Klinefelter
TO: Bob, David, Khiara, Jolynn, and Richard
You ask what went wrong with the Internet. I agree the business model is deeply flawed, and it is increasingly used to play on our very human insecurities. Even when we pay money for internet services, those interactions allow us to be tracked, profiled, monetized, and manipulated by layers of actors. And it is not just the internet. We are spied upon through all sorts of technologies, even in what we still call real life.
No doubt the harms I experience and try to avoid are not as bad as those suffered by people with fewer resources than I have. I can afford to limit some of my engagement and create workarounds to distract the trackers. But none of us is an algorithm. And, frankly, the collective harm to our society is as big a problem as our individual harms. So, we do need to increase protections for each and all of us.
David is right. We need more regulation to respond, as law does, to new harms. We are overdue. Stronger federal privacy law is a great goal, and I support more funding for and encouragement of the FTC, perhaps with broader authority. I think it is realistic to also support state efforts to identify problems and develop innovative solutions.
FROM: Khiara Bridges
TO: Bob, David, Anne, Jolynn, and Richard
I think that at the dawn of the internet, it was impossible to predict just how huge and vicious a beast it ultimately would become. Because of that, I think that the most dangerous sin of the internet is not an original one; rather, it is our refusal to competently regulate it today. As I type this email, we know that the lack of regulation has led to the destabilization of democracy. The failure to regulate has exacerbated the marginalization of already marginalized groups—like the poor and racial and religious minorities. It has allowed individuals to construct realities—around viruses and vaccines, voter fraud, the issue of child sex trafficking, etc.—that are entirely fact-free. All of these things are crystal clear to us. Yet, we have not taken decisive steps to fix it—to atone for the sins of the internet.
I fear that we won’t regulate until we believe that regulation can generate a profit for a few lucky actors. By then, it really may be too late.
FROM: Jolynn Dellinger
TO: Bob, David, Anne, Khiara, and Richard
I think comments of the type you mention that this “is just the way the internet works” and that “nobody really understood at the beginning” both reflect an absence of intentionality and moral agency. These statements echo the old idea that technology drives us, and that we are just along for the ride; increasingly, we are hanging on for dear life and dealing with the inevitable wreckage. The Internet is a virtual reality but it is also a tool, and it is up to us how we choose to use it and how we choose to let it loose in the world. I think part of the original sin of the Internet is our collective failure to realize our responsibilities in this area – as lawmakers and regulators, as corporations, as innovators, as technologists, and as citizens.
There are many different ideas to take up in this line of thought, but for now I will focus on power and the commercial use of data. In unequal relationships, power differentials are easily exploited. While relationships of unequal power are just part of life – parents and kids, employers and their employees, a government and its citizens – implicit duties of care, laws, and regulations often protect the more vulnerable person in a relationship. In the case of the Internet and commercial data, we decided to let the more powerful entities, the companies, decide for themselves how they would treat those affected by their products and services. We didn't insist on guardrails.

While we can fall back on "no one really understood at the beginning" how that would turn out, I think it never required a feat of imagination to conclude that for-profit businesses that exist to make money would optimize for exactly that: profitability. Other values like privacy, justice, fairness, and equality would fall by the wayside, especially where providing any of those things costs money.

Add to this the information asymmetry that necessarily characterizes people's relationship with online platforms and apps when it comes to the collection and use of data, and you get a situation in which the more vulnerable parties – the individuals – can't even use the one option available to them in a so-called "free" market: an informed choice about participation. If we don't know what is happening with our data, we cannot vote with our feet, leave or choose another service, or make any other kind of informed decision. Add to this the proliferation of data brokers, businesses that exist to trade in our personal data but that don't even have a direct relationship with us. How could we possibly be expected to assert our preferences effectively (assuming we could learn enough to have a preference) when there is no relationship in the first place?
Americans are culturally dedicated to the notion of freedom – we love our free market, our freedom of speech, our freedom of choice, and getting things for free (undoubtedly a key to the success of the Internet’s dominant business model). But at the same time we admit there is no such thing as a free ride or a free lunch; we criticize free-loaders; we caution that “you get what you pay for”; and we understand at some level the oft-cited concept that if a service is free, we are not the customer, we are the product.
The power and information asymmetries at play in the context of many Internet services and the prevailing business model built upon the collection and unrestrained use of data make us fundamentally less free in the ways that matter. We can make a different choice.
FROM: Richard Purcell
TO: Jolynn, Khiara, Anne, David, and Bob
Perhaps the original sin of the Internet is the universal and constant original sin of humanity – we put our own wants ahead of our collective needs. In truth, why are we surprised that bad things happened as a result? Engineering the internet took hard work, brilliant minds, and naïve souls. Legal, cultural, and social leaders were not only absent, they largely abdicated. Commercial leaders followed their self-interest, naturally, and so it all began. I do not take this view lightly, as I was among the complicit. I accepted the self-interested argument that individuals could control their own personal information. That, of course, is and always has been a myth.
But, wow, we thought we were gods. The power of the internet to communicate a tightly controlled narrative is awesome. We had control and we took it. Coincidentally, money poured into the technology companies, allowing them broad influence over public policy, a trend that only accelerated with Search, Social Media, Cloud and AI. Many good people, including some on this thread, worked hard to stem the tide. Perhaps we accomplished some good, generated some trust, influenced some changes.
Now, we reflect on more than two decades of this asymmetrical power relationship, in which power accrues to those who control the narrative. Can you see the parallels with the rise of religion as a violent force? Or with slavery as an accepted evil? The original sin does not lie with any singular action; it is in us every day. And no singular action remedies our condition. It's back to basics now. We must follow centuries of teaching. We must exert public control through strong, meaningful legal standards. We have to exert social control through education, with standards of awareness, values, and skills. We have to exert individual control through normative expectations and standards. Western democracy is replete with examples of how we've fallen down; it's our turn to show history we can recall our better angels and recover through law, justice, and community. We must be impatient, demanding laws that work, fair treatment that matters, and self-control that recognizes we are, actually, all equal.
What do you think? Add your thoughts below:
This is all very thought-provoking commentary. The idea of "correcting the wrong" came up in many of the responses, and so I wonder: where do policies like the GDPR fall in relation to the notion of correction?
In fairness, most of these issues have crept up on us. Way back when, I think we were all so astounded at what an incredible thing this Web was. It's amazing to think back on just how fast it grew. We debated how large our graphics could be at MSNBC.com, based on what percentage of our users had 28.8 modems and how many had 56.6. It took 3 minutes to download a 30-second audio clip. And then it took 30 seconds. And then we had this 15 fps black-and-white 2-inch-square video. And then, amazingly, you could play it wirelessly! And fly-out menus. And carousels for covers. There were no online stores, so no worries about somebody keeping a record of your purchases. In fact, we WANTED somebody to keep track of everything; that's why we had web logs, sites where folks showed us what they had found on the web.
Then we had cookies, and they were great because you didn't have to sign in every time you visited a site.
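For anyone who never saw what sits under that convenience, here is a minimal, hypothetical sketch of the mechanism, using only Python's standard http.cookies module; the session_id name and value are made up for illustration, not taken from any real site. The site sets a cookie once, and the browser sends it back on later visits so you stay recognized without signing in again.

```python
# Minimal, hypothetical sketch of a session cookie (values are made up).
from http.cookies import SimpleCookie

# First visit: the server picks an identifier and asks the browser to keep it.
response_cookie = SimpleCookie()
response_cookie["session_id"] = "abc123"          # illustrative value only
response_cookie["session_id"]["max-age"] = 86400  # remember this browser for a day
print(response_cookie.output())  # the "Set-Cookie:" header sent to the browser

# Later visit: the browser sends the cookie back automatically, so the site
# recognizes the visitor without asking them to sign in again.
request_cookie = SimpleCookie()
request_cookie.load("session_id=abc123")
print(request_cookie["session_id"].value)  # -> abc123
```

That same small convenience, extended to third parties, is what later let trackers follow us from site to site.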
I don't think anyone worried that having access to all of this amazing stuff would push us into silos. It seemed as though it would broaden all of our horizons. Imagine, somebody in rural Georgia (I live there) could read the NYTimes without having to drive to a newsstand in Atlanta!
So when did it change? FB certainly was a big part. Suddenly everybody had a web site, and so did all of our friends and relatives. Then we all got smartphones so we could carry our web site around with us. And Amazon came along and won us all over to online shopping. I think those were the steps into the swamp we're in now: it became very easy to post things online and share them with friends or relatives or strangers anywhere, we could take those thoughts with us everywhere, and we suddenly were all buying things, so our preferences for how we spent our money became known and valuable.
I think the first step is just to agree that we need some regulation. Yeah, we need a privacy law. Yes, we need to have some way of seeing our digital identity and either correcting mistakes, or adding context.
And we need some way of discouraging disinformation and bad behavior. I think paring back §230 is worth trying. If it means that FB and Twitter don't allow me to post as many snarky comments as I do now, I'm willing to give that up if it also means QAnon has a much harder time spreading. If FB and Amazon yank their "if you like this, you may like that..." algorithms, I'll survive.
But I think we all need to agree to TRY, without panicking over possible consequences. If we abolish/tweak §230 and lots of good stuff disappears, then tweak it again. Should we limit anonymity? Yeah, in some instances. Can I predict the consequences? Nope. But Bob and I worked on the Microsoft campus where they had the motto of "Ready, Shoot, Aim." At the time I thought it was stupid given how many bad initial releases we got of Microsoft software. But over time I've come to appreciate it more: if you want to be certain what you're doing is perfect, you won't get much done. We need to try some things, look at the result, and tweak them some more.
I actually live in Marjorie Greene's district, so the threat of disinformation is very real. Way back in '96 when I started working at MSNBC.com we thought if everybody had access to all the info in the world, everyone would make smart decisions based on the best info. Sadly, we've learned that is not the case, that the sensational lie often is more convincing than the mundane truth.
We don't need one fix; we need a dozen or hundreds.