We arrived at this situation 'because there were no rules to stop it'
Debugger podcast -- Original Sin of the Internet, Part 2
This is the beginning of the transcript for part two of our Debugger podcast on "The Original Sin of the Internet." The technologists who built the Internet were naive, and their failure of imagination led us all down a path to rampant fraud, abuse, and misinformation. In part 2 of this episode, Jessica Rich, Ari Schwartz, and Tim Sparapani join me to discuss why things went so wrong. Click play below, visit the podcast home page, or skim the transcript below of my discussion with Jessica Rich.
Bob Sullivan: Is there an original sin of the way that the internet works? And if you believe that, what might it be?
[00:00:08] Tim Sparapani: It's a fabulous question. I think we trusted. Too much
[00:00:15] Ari Schwartz: I think there’s been an attitude for a long time that if it's legal, it's okay, instead of looking at what's moral and ethical. So there was a lot of, “Are you saying I can't do that, or that I shouldn't do that?” And if it was “I shouldn't,” then sometimes they would go ahead and do it anyway, if it was what they needed to do to make money.
[00:00:37] Jessica Rich: Our concepts of privacy need to be updated. And I also think if we had passed a gosh-darned law in 2000, even a bad one, we'd be better off than we are today.
[00:00:51] Bob Sullivan: Gosh-darned laws are my favorite kind of laws.
[00:00:54] Bob Sullivan: Welcome back to Debugger, a podcast about technology and democracy brought to you by Duke University's Sanford School of Public Policy and the Kenan Institute for Ethics. I'm your host, Bob Sullivan. Was there an original sin built into the internet at its creation? I'm exploring that question in this first episode. It's such a thick topic, we cut it into two parts. This is part two. If you haven't heard part one, you should really go back and do that now.
The internet has given us so many wonders: instant connection with old friends halfway around the world, life-saving long-distance surgeries … cat memes. But it's also given us Cambridge Analytica and COVID disinformation and more people who think the earth is flat and fewer people who trust, well, almost anything. Certainly democracy. We're going to talk about how to fix that. But first, I think it's important to go back in time and understand where the mistakes were made, kind of like long division, where you have to work backwards and find your mistake and then work forwards again. So I'm talking to people who were there at the beginning to see what they think went wrong, to find out what their most teeth-clenching shoulda-coulda-woulda, I-told-you-so moments were. And are.
Jessica Rich is the perfect person for this conversation. She was at the Federal Trade Commission when it created its privacy enforcement group. She eventually rose to lead the agency’s Bureau of Consumer Protection. Along the way, she's tangled with all the big tech firms in and out of court: Apple, AT&T, Uber, Google, Amazon, Microsoft. But back in the 1990s, she was a new FTC lawyer and quickly alighted on the idea that privacy was going to be a big deal. She’s spent most of her life trying to convince other people that privacy is a big deal.
I started by asking her about all the companies that vacuum up information whenever you open a smartphone or look at a webpage, and the entire industry that grew up around that, leading to all the situations we have today. It all happened while she was running around regulating and suing many of the major firms doing those very things.
[00:03:16] Bob Sullivan: So as a consumer, when you log in to read the news and you look at a simple webpage, you have no idea how many companies you're dealing with. How did we ever arrive at that situation?
[00:03:27] Jessica Rich: Well, we arrived at it because there were no rules to stop it. And, uh, it was a great way for companies who had no relationship with consumers to grab hold of all their information anyway. So, very early on, we saw that hundreds of companies were collecting information at one single website. And for the most part, it has been the position of all of those companies that in order for consumers to exercise any control over it, they have to go opt out one by one, but they don't even know about most of those companies.
So this is something that happened because there was no law preventing it.
[00:04:10] Bob Sullivan: And that wasn't for lack of trying. Way back at the beginning of internet time, it was decided that Rich's agency, the FTC, would basically be in charge of privacy issues. But the tools Jessica Rich had were pretty clunky for dealing with them.
Remember Richard Purcell from part one, when he described the mistake of borrowing old business models from TV and radio for the web? Well, we made the mistake of borrowing old laws and trying to apply them to the web too, instead of coming up with new rules that would apply to this world.
Is there an original sin of the internet, how it works, how it was designed? And if there is, what do you think it is?
[00:04:48] Jessica Rich: Well, I know there are tech answers to that question that are definitely beyond my experience. But my focus is consumer protection, law and policy. So my mind immediately goes to how different things would be if we had passed a privacy law early on. The debate about whether to pass a privacy law started in the mid-nineties, right after the internet exploded on the scene. The privacy issues were obvious from the start. Data was being collected in new ways. Instantaneously, invisibly, you didn't know who was on the other end. Your kids could be giving information. Many issues were identified early on. The FTC made the first of many legislative proposals to Congress in 2000 and proposed some basic principles to govern data collection. And these principles, many of them remain at the heart of laws and proposals that are still being discussed today. And I believe the world would be very, very different if a law had been passed back in 2000. We'd have rules that everyone would be able to count on, both consumers and businesses. We wouldn't have the chaos, the confusion, the Wild West we have today.
The tech platforms would look very different. Ad networks, data brokers, they wouldn't have the enormous influence in the marketplace they have today. There would be fewer data breaches and abuses, more transparency and accountability. And if the FTC, which was the agency that would have been given this authority, had been given rule-making authority as was proposed, the law would have been able to evolve with technological changes.
So to the extent it wasn't perfect at the time, it could have grown with technology. So for me, that's the original sin from a policy perspective.
[00:06:39] Bob Sullivan: I asked her to imagine how the world might be different today had we passed a law 20 years ago. What kind of things spring to mind?
[00:06:49] Jessica Rich: Oh, lots of things spring to mind. So I think the tech platforms, which have built their empires on unfettered collection and use of data, would likely be much smaller and much less powerful in the marketplace. And of course, that's one of the big issues we're debating today: what to do about these ginormous tech platforms. I think data brokers and ad networks wouldn't have grown the way they have, either, because consumers would have been able to cut off sharing of data with them.
[00:07:19] Bob Sullivan: Is there one incident that, when you think about it, uh, you know, whether you were at the FTC or later, you're just like, God, why did this happen? This shouldn't have happened. We absolutely could have stopped this?
[00:07:33] Jessica Rich: Cambridge Analytica. As described by many, it was a use of information contrary to the settings and choices and preferences that consumers had expressed. And that kind of abuse would have been illegal under the privacy law that was proposed in 2000.