How to Reclaim Your Online Privacy
We talk to the Signal Foundation’s Meredith Whittaker about how the surveillance economy is newer than we all might realize—and what we can do to fight back.
On this week’s episode of Have a Nice Future, Gideon Lichfield and Lauren Goode talk to Meredith Whittaker, president of the Signal Foundation, about whether we’re really doomed to give up all of our private information to tech companies. Whittaker, who saw what she calls the “surveillance business model” from the inside while working at Google, says we don’t need to go down without a fight, and she outlines strategies for getting our privacy back.
Show Notes
Here’s our coverage of Signal, including how to use the app’s encrypted messaging. Also check out the WIRED Guide to Your Personal Data (and Who Is Using It).
Lauren Goode is @LaurenGoode. Gideon Lichfield is @glichfield. Bling the main hotline at @WIRED.
How to Listen
You can always listen to this week’s podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here’s how:
If you’re on an iPhone or iPad, just tap this link, or open the app called Podcasts and search for Have a Nice Future. If you use Android, you can find us in the Google Podcasts app just by tapping here. You can also download an app like Overcast or Pocket Casts, and search for Have a Nice Future. We’re on Spotify too.
Transcript
Note: This is an automated transcript, which may contain errors.
Gideon Lichfield: I’m trying to remember, is it “PRIV-acy” or “PRY-vacy”?
Lauren Goode: Hi, I’m Lauren Goode.
Gideon Lichfield: And I’m Gideon Lichfield. This is Have a Nice Future, a show about how fast everything is changing.
Lauren Goode: Each week we talk to someone with big audacious ideas about the future, and we ask them, is this the future we want?
Gideon Lichfield: This week our guest is Meredith Whittaker, the president of the Signal Foundation, which runs the Signal messaging app and also works on the bigger problem of online privacy.
Meredith Whittaker (audio clip): We don’t wanna just simply own our data, that’s a very simplistic palliative. I think we want to take back the right to self-determination from a handful of large corporations who we’ve seen misuse it.
Gideon Lichfield: Lauren, do you worry about your privacy online, or have you basically given up by now?
Lauren Goode: Can both be true? Have I given up on online privacy, or have I just given up?
Gideon Lichfield: Have you given up? Have you given up on everything? Are you just sailing off into the sunset now?
Lauren Goode: Yes, yes. I think about online privacy all the time, literally—I would say almost every day—both in the acute sense, like, should I download this app so I can write about it for WIRED, and should I be using a burner account to do that? And then I think about it in a broad sense too, like, why am I still getting spammed with photos of wedding dresses from 2019, which is literally a whole story that I wrote for WIRED about my canceled wedding and how the internet wouldn’t let me forget. But I don’t think there’s any going back to a pre-digital era. I think we have to come up with new lifeboats for this digital ocean that it feels like we’re drowning in sometimes.
Gideon Lichfield: I like this metaphor. Lifeboats.
Lauren Goode: Yeah, I just came up with that. Thank you.
Gideon Lichfield: It’s very good.
Lauren Goode: What about you? Do you worry about this a lot?
Gideon Lichfield: Yes, in that I really keep my social media locked down, and I use burner email addresses for everything I sign up for. But then every so often, you read a story like the kind we publish a lot on WIRED about how some supposedly good piece of tech is actually a terrific privacy nightmare—like normal security cams or education tech or software for detecting child sexual abuse material—where your data can then get scooped up, hacked, leaked, or given to law enforcement. Or some innocuous app like a game or a fitness app is actually capturing data about you, that is then sold to third-party brokers, who can then sell it on to anyone else who wants it.
Lauren Goode: I remember writing a lot about health and fitness apps in the early 2010s, only to see most of them get acquired or sunsetted or absorbed into something else by the late 2010s—and then realizing at that moment: Oh right, that data is still out there and just lives somewhere else now. It’s in Google’s cloud or it’s owned by an apparel company or by private equity, and it feels like there’s nothing we can do about that, right? It’s just floating around on the internet, these little digital footprints that you left years ago and maybe didn’t think about much at the time.
Gideon Lichfield: Well, so this is what makes Meredith, our guest, interesting, because she’s worked deep inside the surveillance economy, and she does not think that we are necessarily screwed.
Meredith Whittaker (audio clip): I draw a lot of hope from history that even in the darkest times, people find ways to resist, and people find ways to create worlds that are at least more livable than the world they’re handed.
Lauren Goode: From what I know about Meredith, she’s well qualified to have this conversation. She spent a lot of time at Google, which is a place that relies very heavily on what she calls the “surveillance business model,” which is the way businesses use and sell our data to make money.
Gideon Lichfield: Exactly. She worked at Google for 13 years, and while she was there in 2018, she helped lead that massive employee walkout over how Google handled several sexual harassment cases. And now she’s leading the Signal Foundation, which runs the Signal app. So she’s well versed in the subject of privacy and has experience in activism.
Lauren Goode: I know Signal is very popular among journalists. Like people often say, “DM me for Signal,” because it’s a really secure way to communicate with sources. Do you use Signal, Gideon?
Gideon Lichfield: I use it obviously to buy my drugs and to order hits on my enemies, and to plot the overthrow of the government every so often.
Lauren Goode: Right, right. You haven’t done one of those in a little while now.
Gideon Lichfield: This job doesn’t leave much time. Anyway, what makes Signal interesting is that it was one of the first apps to offer end-to-end encryption, where the company can’t read the contents of your messages. But now lots of other apps offer end-to-end encryption as well. What makes Signal different is that it still collects almost no metadata—like who you’re sending messages to, or the timestamps on them—and a lot about you can be reconstructed from that kind of metadata. So it really is a lot more private than the other apps.
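The distinction Gideon is drawing—message contents versus metadata—is easy to see in code. Here’s a minimal sketch in Python using the PyNaCl library. It is not Signal’s actual protocol (Signal uses the more sophisticated double-ratchet Signal Protocol); it only shows that a relaying server handles ciphertext it cannot read. Notice, though, that even in this sketch the server still learns who is messaging whom and when—exactly the metadata Signal goes out of its way not to collect.

```python
# A minimal sketch of the end-to-end principle, using PyNaCl (Python
# bindings for libsodium). This is NOT Signal's protocol -- Signal uses
# the double-ratchet Signal Protocol -- it just shows that a server
# relaying end-to-end encrypted messages never sees plaintext.
# Install with: pip install pynacl
from nacl.public import Box, PrivateKey

# Each party generates a keypair; private keys never leave their devices.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob with her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# The server relays `ciphertext` as an opaque blob. Without a private
# key it cannot recover the message -- but it still observes the
# metadata: sender, recipient, timestamp, message size.

# Bob decrypts with his private key and Alice's public key.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```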
Lauren Goode: But Signal, at the end of the day, is still just a messaging app, and the privacy problem we’ve been talking about extends to everything across the internet, not just messaging. So I’m curious how we get from having this very private messaging to private everything else?
Gideon Lichfield: Well, that is exactly what I wanted to ask Meredith, and that conversation is after the break.
[Break]
Gideon Lichfield: Meredith Whittaker, welcome to Have a Nice Future.
Meredith Whittaker: Gideon, I’m so happy to be here. Thank you.
Gideon Lichfield: Some of the guests that we have on this show are here to tell us about their vision of the future and how wonderful it’s going to be, and then our job is to ask them if this is really the future we want. And I feel like you’re here to tell us about a future that we can all agree we probably don’t want, which is one of total surveillance.
Meredith Whittaker: Yeah, I don’t think any of us want that, and I think there are happily many ways to avoid it, but they will take a bit of work.
Gideon Lichfield: My cohost Lauren sometimes likes to say that we’re like frogs boiling in surveillance water, and that in the last 15 or 20 years, we’ve just gradually come to accept that privacy is dead, that every single thing we do online and increasingly offline just generates data for big tech companies to feed on. And you started at Google in 2006, you left in 2019, so you’ve sort of watched that water go from room temperature to boiling point. Was it a slow realization for you or something that you clocked all at once?
Meredith Whittaker: Well, I think I was sensitive to privacy because I was in my late teens during 9/11, and then I watched the Patriot Act, I kind of listened to the critique of that, I watched the FISA court basically stand up a regime of unaccountability around mass surveillance. I was on my way to a Tor Project development meeting when the Snowden papers dropped, so I’ve been sensitive to these issues for a long time.
Gideon Lichfield: Right, of course—you’re referring to the Edward Snowden leaks, which were a decade ago now. Those showed how much government surveillance there was—but now we also have massive corporate surveillance as well, right?
Meredith Whittaker: I’m not that interested in separating them. I see surveillance as a tool of power. So governments use it for different modes of social control—you have surveillance used to get an edge on international negotiations, you have surveillance used as a sort of information advantage in a contest of power, and that can be deployed in many ways, from crushing dissident groups to a corporate negotiation in which you know what your adversary is gonna say, so you certainly have an advantage there in getting what you want. So I don’t think it’s useful in this context to cleave corporate from government surveillance. I think those two things are inextricably connected at this point. And that’s one of the problems we’re facing.
Gideon Lichfield: Right. A lot of people might say, “I know that these big tech companies are sucking up all of my data. I know that the government could track my cell phone if it wanted—there’s nothing I can do about it. Unless I’m breaking the law or something, it doesn’t do me any harm, and maybe I even get some good things out of it.” So what do you say to those people? What are the reasons that they should, nonetheless, worry?
Meredith Whittaker: Well, I haven’t heard that many actual people say that. I’ve heard people in general characterize it that way, and I think there’s a lot going on there—one is that I don’t think we can project this sort of neoliberal individualist frame onto human use of technology. It’s not really a choice. We could live in some sort of ideological purity in a cabin in the woods and make sure our assistant is the only one who has email and shuttles that back to us on notes of paper. But ultimately to live in the world, we need to interact with digital technology; we have risk-assessment algorithms now that use a lack of social media presence as an indicator of risk. You can’t interact with government services without creating some ID on an online portal run by some vendor that has terrible security practices, et cetera, et cetera, et cetera. So we’re compelled, in order to participate in everyday life, to interact with and use these services. And in so doing, we give companies and governments the right to create data about us that they use to mean whatever they want. So I don’t see—I don’t see the sweeping privacy nihilism, so much as the fact that we are effectively coerced into interacting with these technologies and have very little agency.
Gideon Lichfield: Signal tackles that problem from the messaging point of view. DuckDuckGo gives you a search engine that doesn’t collect data about you. How feasible is it, do you think, to extend this kind of privacy principle across the digital economy in general?
Meredith Whittaker: Hypothetically, from a technological perspective, it is very feasible, but it would require a radical change of the economics, the political economy that is governing technology. We could not continue to have a model of tech that rests on monetizing surveillance in order to make the billions of dollars a year that are required to maintain the infrastructures, to maintain the staffing, to maintain the data pipelines and to maintain the software, which … Software is never done, right? So you would have to be able to do all of that in a way that severed the dependence on monetizing surveillance, and we don’t have that model right now, and we don’t have incentives for undermining that model from an economic perspective at this moment.
Gideon Lichfield: Do you see a way to create that model? Do you have an idea of what that model might be like? What a world without that kind of surveillance capitalism, what it might even look like?
Meredith Whittaker: I mean, we lived in it for hundreds of thousands of years, right? The iPhone was 2007.
Gideon Lichfield: One tends to forget, right?
Meredith Whittaker: I got a Hotmail account in high school, it wasn’t—this is very recent. This is not inevitable.
Gideon Lichfield: True. What is the core reason that people should be worried about this inability to do anything about the lack of privacy?
Meredith Whittaker: You really have to look at the power relationships between the handful of large companies that at this point are kind of the paragons of the surveillance business model, and the governments that they trade data with. So you have a scenario in which large corporations are run by a handful of people at the top who have two objective functions, two main goals, and those goals are: Increase revenue forever exponentially, and increase growth forever exponentially. Literally the definition of metastasis. And if they don’t do that, they’re gonna be replaced—those people, the executives at the top—and the business model of tech is some form of monetizing surveillance data. So you create algorithms with it that can claim to do things like assess whether someone’s a good worker, assess whether they get access to benefits, et cetera, et cetera. You also use it to sell advertising profiles—advertisers get access to people like me based on these profiles—and that’s still the cash cow of the surveillance business model. And the real danger here is that the interests of those corporations and potentially the governments they collaborate with, which are not all aligned with the public good—many of whom are increasingly authoritarian—will have access, have leverage via that data, and will use it in ways that harm the people who ultimately created that data.
Gideon Lichfield: You cowrote an article in The Nation a couple of years ago calling for a “militant progressive vision” for tech, and you said a broad coalition of groups—tech workers, gig workers, community activists, tenants, sex workers, anyone else who stands to lose out from Big Tech surveillance—would need to come together to work against this tendency. So how do you see that coalition coming together? Is it happening? And what do you think it can realistically do to blunt the power of these Big Tech companies?
Meredith Whittaker: Yeah, absolutely. And I’ll just caveat there to say that the term “militant” is used in a more academic sense to mean self-governing and ready for action, not armed. That would mean this coalition would be ready to fight back against bad laws like the UK’s Online Safety Bill, or the Kids Online Safety Act being floated in the US, and the many other anti-privacy, pro-surveillance pieces of legislation that we’re seeing. I think the vision there was also speaking to the fact that the coalition—or the implicit coalition—that is backing surveillance is very powerful and actually very widespread, right? You have governments, you have corporations, you have other actors who benefit from access to the surveillance and from this business model. And for the people who are often harmed by this, the mechanisms of surveillance, of targeting, of blocking and blacklisting can work similarly on seemingly disparate populations. So you have a coalition of people who may not see themselves as linked by a common harm but in fact are, and that is a natural starting place for pushing back against a lot of the harms of these tools, which apply to all of them—not necessarily equally, but acting on them using the same mechanisms and the same logics.
Gideon Lichfield: OK, so how can that coalition come together or work together?
Meredith Whittaker: Well, I think it’s already happening in some senses. I would say that what is in effect often resistance to surveillance or resistance to tech doesn’t always go under that name. So I would point to—in Virginia, I think it was around 2016 or 2017, there was a big wave of teacher strikes, right? And these were protesting bad conditions in education; they were protesting a lot of what we hear from teachers who have just been defunded and left to manage with very few resources. But core to that wave of strikes was a complaint about surveillance technology—a complaint about a health care app that they were all being forced to use, one that was going to track their movements, track their activities, and if they consented to this invasive surveillance of the details of their everyday life, they would get some discount on their health insurance. And so we can see that as an example of this kind of pushback against invasive surveillance, against the overreach of the power of these tech companies. But it’s, again, not always narrated that way.
Gideon Lichfield: One of the other arguments that I hear people sometimes raise around the surveillance economy is that they say, “Well, maybe it’s too late, maybe privacy is already dead. We can’t stop this juggernaut. But if surveillance is happening anyway, maybe there are things we can do to protect people from the harms that surveillance might bring about.” And so for example, just as we banned health insurers from pricing coverage based on preexisting conditions, we could ban car insurers from pricing coverage based on how you drive—which is now a thing that cars do, they will surveil your driving. What do you think of that view, that you might be able to protect people from the harms of surveillance, even if you can’t stop surveillance itself so much?
Meredith Whittaker: It relies on an understanding of regulatory enforcement that is counterfactual. These agencies are vastly understaffed. I don’t know—how do you detect harms, what does it look like to prevent them, particularly in a world where these tech companies now spend more on lobbying than oil and gas and tobacco combined? So sure, we can have a post-hoc approach, but what happens to the people it doesn’t catch? What happens when they do it anyway? What happens when insurance companies now exercise their right to use, say, social media data in pricing their plans?
Meredith Whittaker: I think particularly in the context of these AI and algorithmic systems that are trained on this data and used to make decisions about people’s access to care or access to education or what have you, it’s very difficult for the people harmed—in almost every case, difficult to impossible—to trace the decision that profoundly impacted their life back to a specific proprietary system sold by X vendor, using an API from a third-party company, wrapped in some skin, that told the bank manager in the background that they weren’t gonna get a loan, where the source of that decision wasn’t even communicated to them. We don’t instrument these systems for accountability. So what you’re talking about is a nice hypothetical, but it’s absolutely impractical in the world we actually occupy right now.
Gideon Lichfield: What kinds of things do you think regulation needs to aim for?
Meredith Whittaker: Well, I think we should be looking at the cloud monopolies—
Gideon Lichfield: Like Amazon for instance, like AWS.
Meredith Whittaker: Amazon, Google Cloud, Microsoft’s Azure, these large companies that are running cloud infrastructures that at this point are the foundation for most of our—what we used to call IT infrastructure. And we’ve outsourced that to these large companies, which means that our government infrastructure, our corporate infrastructure, so much of our life is dependent on these resources. So I think there’s a real question around cloud monopolies and how you curb the centralized power of these companies by addressing the monopolization of the infrastructure market.
Gideon Lichfield: So breaking them up.
Meredith Whittaker: I’m intentionally, if you caught that, maybe not using that term, because I think part of what you’re dealing with, that makes it not that simple, is an economy of scale. It’s kind of like cutting the arms off a starfish. Right? So you break them up, but the way this technology works “best” is at scale.
Gideon Lichfield: Right. And even if you didn’t have these monopolies, the basic approach of the surveillance economy would still apply: the same technology, the same use of data. So whether it’s concentrated in the hands of three companies or 10 doesn’t seem to really change the equation.
Meredith Whittaker: Yeah. I mean, the fundamental engine of the surveillance business model remains.
Gideon Lichfield: If you are successful in your mission, if the Signal Foundation is successful in its mission, what do you think the future, the hopeful future looks like in, I don’t know, 20 or 30 years?
Meredith Whittaker: Well, I think to answer that question, I have to stop focusing so much on tech and just talk about being a human in the world. I think in 20, 30 years it would be great to have a world where people were able to have most of the resources they need to live healthy, happy lives. I think we will need to tackle climate to be able to get there, and we will need to fundamentally rethink the role of computational technology in our lives. Where is it serving us, and where is it ramifying and exacerbating historical power asymmetries that in this halcyon future we don’t wanna see replicated?
Gideon Lichfield: What is a way that we can have this computational technology in our lives? In other words, not throw it out, but without it exacerbating these power asymmetries.
Meredith Whittaker: Well, I think that is a political question, not a technological question. Right? It has to do with governance. Who gets to decide what they do, who gets to decide who they do it for, and who it’s done on, and are there computational technologies that aren’t serving us?
Gideon Lichfield: So it’s about much more than simply who owns or controls your data. It’s about, what is technology allowed to be used for and not?
Meredith Whittaker: Yeah, and who gets to use it. I mean, even the term “your data,” I think needs to be contested. Right? What has happened is not so much that this natural off-gassing of a thing called “data” that we all do was suddenly captured by companies smart enough to capture it. What has happened is, we’ve given the authority to define us to a handful of companies. And that definition is called data. That profile of Meredith Whittaker that often has more credibility than I would have myself in determining who I am is suddenly in the hands of a company who gets to create that however they want. And I think that meaning-making—that epistemic authority, to be a little academic about it—that is extraordinarily concerning. And I think something we want to—we don’t want to just “own” our data. That’s a very simplistic palliative. I think we want to take back the right to self-determination from a handful of large corporations who we’ve seen misuse it.
Gideon Lichfield: We often like to ask our interviewees at the end, what keeps them up at night? I mean [laughs], it sounds like maybe everything.
Meredith Whittaker: What helps me sleep, maybe?
Gideon Lichfield: Is there—but is there anything in particular that keeps you up right now or that you’re thinking about?
Meredith Whittaker: I actually sleep well, thankfully, but I do—
Gideon Lichfield: It’s funny, so many of the people we’ve spoken to tell us they sleep well, and we’re like, what are we doing wrong? We sleep terribly. Yeah.
Meredith Whittaker: Well, it’s not that I don’t worry, it’s just that I don’t think any of this sort of dystopia is inevitable. The thing that right this moment is keeping me up at night is the wildly misguided provisions in the UK’s Online Safety Bill that would, if they were implemented to their full extent, mandate that everyone’s devices contain a mass surveillance application that would scan every message they sent before they sent it against an opaque database of prohibited speech. And if their speech was prohibited, they would be flagged. And who knows what would happen then. It is a bill purportedly addressing online abuse, but the solutions they provide are some of the thinnest, flimsiest, technically unsound science fiction I’ve seen.
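To make concrete what “scan every message against an opaque database of prohibited speech” means, here is a deliberately oversimplified sketch in Python. Everything in it is hypothetical—it is not drawn from the bill’s text or any vendor’s design, and real client-side-scanning proposals typically use perceptual hashes (robust to small edits) rather than the exact hashes used here. The point it illustrates is Whittaker’s: the scan runs on your own device, before encryption, against a list you cannot inspect.

```python
# A hypothetical, oversimplified sketch of client-side scanning -- NOT
# the Online Safety Bill's actual design. Real proposals typically use
# perceptual hashes, not exact hashes like these.
import hashlib

# The "opaque database": the user sees only hashes, never what they match.
PROHIBITED_HASHES = {
    hashlib.sha256(b"example banned phrase").hexdigest(),
}

def report_to_authority(digest: str) -> None:
    # Hypothetical flagging hook -- "who knows what would happen then."
    print(f"flagged: {digest}")

def encrypt_and_send(plaintext: str) -> None:
    # Stand-in for the real end-to-end encrypted send path.
    print("sent (encrypted)")

def send_message(plaintext: str) -> None:
    # The scan happens on the user's device, BEFORE encryption -- which
    # is why critics describe it as mass surveillance built into every
    # phone rather than a check on some server.
    digest = hashlib.sha256(plaintext.encode()).hexdigest()
    if digest in PROHIBITED_HASHES:
        report_to_authority(digest)
    else:
        encrypt_and_send(plaintext)
```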
Gideon Lichfield: And final question: You’ve already said you’re pretty optimistic, but how do you stay hopeful when you are looking at these enormously powerful forces that you are arrayed against?
Meredith Whittaker: Oh, I read a lot of history. I kind of keep a foot in the scholarly work I’ve been doing for many years. And you read history and, again, it’s never inevitable that the forces that have power at a given moment will win and prevail. I also believe that people are smart. I don’t have a view of people that these topics are over their heads, that they’re not gonna be able to understand them. And what I have seen, even with this sort of micro example of the Online Safety Bill, is that once people started talking about the stakes—even if it was sort of explaining complex concepts—you saw people get it. So I do, I have a lot of hope in people. Every cook can govern [laughs], and you know, I draw a lot of hope from history, that even in the darkest times, people find ways to resist, and people find ways to create worlds that are at least more livable than the world they’re handed.
Gideon Lichfield: Well, you’ve given me more hope that we can have a nice future, even though we’ve been talking about a very bleak subject. So Meredith, thank you for joining us on Have a Nice Future.
Meredith Whittaker: Thank you so much, Gideon. It’s been a pleasure.
[Break]
Lauren Goode: Gideon, I think I now count three guests in, what, five episodes so far of Have a Nice Future who have told us they sleep well. So I’m proposing that for our next guest, we interview a sleep specialist, because I need to understand what I’m doing wrong here.
Gideon Lichfield: Same. If I had to think all day about how to stop us from turning into a total surveillance society—I mean, any further than we already are—I wouldn’t be sleeping at all.
Lauren Goode: So do you believe Meredith?
Gideon Lichfield: Do I believe she sleeps?
Lauren Goode: Well, let’s say we believe that she sleeps well. Do you believe her when she says there is hope for saving some element of our privacy, that technically the approach that Signal uses could be applied across more of the internet?
Gideon Lichfield: So I feel like I had these assumptions going in, which I stated in the interview: Privacy is dead, there’s nothing we can do to get it back, but maybe we can protect people from some of the harms of not having privacy. And I like that she really challenged me on those assumptions. She said, first of all, not having privacy is really, really new. We’ve forgotten how recently we lost it, so we shouldn’t just roll over and assume we can’t get it back. And then, as to the theory that we could nonetheless protect people from the harms of not having privacy, she said this rather guarded thing. She said, “I think it relies on an understanding of regulatory enforcement that is counterfactual.” Which was a very diplomatic way of her saying, Gideon, you are smoking crack. Government agencies like the FTC or the FCC are never going to protect people from predatory practices by Big Tech companies that have vastly more resources.
Lauren Goode: Yeah. I have to be honest, when she said that, that it’s counterfactual, it made me want to Google the definition of “counterfactual” again, just so I could better understand what she was saying.
Gideon Lichfield: It means wrong.
Lauren Goode: It just means wrong. So yeah. So it sounds like you’re saying that she convinced you not to be totally fatalistic about privacy, but … ?
Gideon Lichfield: But I still don’t know if I see the path from here to there, or in other words, from a world in which both the generation and exploitation of data are still increasing at an exponential rate to one where citizens have real control over how their data’s used and governments tell tech companies to get in line.
Lauren Goode: Yeah. I thought it was interesting how she used the word “metastasis,” that that’s literally what’s happening when you have these growth goals in place.
Gideon Lichfield: Yes, cancerous industry.
Lauren Goode: Right. And it doesn’t sound like you were totally swayed by the example she used of teachers striking against the health care app that was sucking up their data, that that’s the thing that’s going to move the needle.
Gideon Lichfield: I don’t know, you could say it’s still early days for those kinds of protests. A few years ago the teachers might not even have had enough awareness of how their data might be used to even mount a strike. So perhaps I should be more optimistic. What about you?
Lauren Goode: Yeah, I have to say, I think this is one of my favorite episodes of Have a Nice Future so far, because after listening to you and Meredith talk about this, I did feel a little bit more optimistic somehow. I liked what she said about how she assumes that most people are actually smart and aware, to some extent, of what’s happening with regard to data collection. It’s just that the current forces are much bigger than any one individual. But it made me think, OK, at an individual level and then maybe at a collective level, there is some way to push back against what the tech companies want us to believe is an inevitability.
Gideon Lichfield: Right. Because a thing I didn’t ask her directly, but I can imagine what she might have answered, is that the evidence suggests that most people don’t care about privacy, because they are happily giving over their data in exchange for free social media and free email and all these other free services. But I suspect what Meredith would answer is, well, it’s not that they don’t care about privacy, it’s that nobody has given them the choice.
Lauren Goode: I thought what she said about taking a closer look at the cloud monopolies was really interesting. She noted that we used to call them IT infrastructure, and now these cloud companies—Amazon Web Services, Microsoft Azure, Google Cloud—they’re basically the underpinnings of Web 2.0 and of many if not all of the mobile apps that we use, and they have the ability to hoover up so much data and control it in a sense. But she said we shouldn’t necessarily look to break them up. She was careful not to say that. Instead it’s like we should rethink these, but how exactly do we do that?
Gideon Lichfield: Right. She was saying this common refrain “break up Big Tech” doesn’t really get you anywhere, because big tech broken up into slightly smaller tech still has the same basic business model and the same basic problems. I think she was saying we need to get more comfortable with deciding as a society that certain uses of tech are simply not allowed. And maybe an example of that would be collecting data from a health care app, which is then given to insurance companies in order to determine your premiums. We should just decide that that is not allowed.
Lauren Goode: Yeah. Like around this idea that we don’t really have a model for severing surveillance from the current monetization schemes of the internet—I was wondering if there are other parts of the world where this is being embraced more that we should be looking to as a model. Should Europe, which tends to have more stringent privacy laws in place than the United States, be a model for us? Because I tend to look at something like GDPR, Europe’s sweeping privacy law around how we need to opt in to being tracked on the internet, and just think, well, now all we do is click “accept all cookies” all day long, but we’re still having basically the same internet experience.
Gideon Lichfield: Yeah. I think we should do an episode on GDPR, because it is this landmark privacy law, and yet for most people the experience of it is simply more annoying pop-ups in their web browsers. It’s not clear how it helps you gain back control of your data, and it doesn’t solve the basic problem, which is that the surveillance economy, regardless of how much data you are sharing, is still predicated on this exchange of data for services. So I don’t think there is a good example in the world yet.
Lauren Goode: I think you’re right that we could do an entire other podcast on GDPR, but we’re probably getting into the weeds on that. It sounds like generally you were pretty optimistic after talking to Meredith, and that she does have some really good ideas for how we should at least be thinking about a privacy-centric future.
Gideon Lichfield: I guess I would say that Meredith embarrassed me a little bit into being optimistic. She basically said, stop being such a fatalist, stop assuming that all is lost. It is still worth fighting for things, even when it looks glum. And so I think that means remembering, as Meredith reminded me, that the surveillance economy is very new. I think she was also pointing out that it too may not last very long before something else comes along and replaces it. Whether that will be something better or not, I dunno.
[Music]
Gideon Lichfield: That’s our show for today. Thanks for listening. Have a Nice Future is hosted by me, Gideon Lichfield.
Lauren Goode: And me, Lauren Goode.
Gideon Lichfield: And if you like the show, we’d really appreciate if you … told everyone else! You can leave us a rating and a review wherever you get your podcasts. And make sure you’re subscribed to get every episode when they’re out each week.
Lauren Goode: You can email us at [email protected]. Tell us what you’re worried about, what excites you, the question you have about the future, and we’ll ask our guests.
Gideon Lichfield: Have a Nice Future is a production of Condé Nast Entertainment. Danielle Hewitt and Lena Richards from Prologue Projects produce the show.
Lauren Goode: We’ll see you back here next Wednesday and until then, have a nice future.