It’s been a wild last year or so in tech. We’ve seen a marked rise in the development of artificial intelligence, large language models and prolific growth of augmented reality systems. At the same time, it can feel like we’re moving backwards as concerns continue to rise about user privacy and the methods by which personal data is collected and monetized. Our guest this week points out that protecting privacy requires tech companies to ditch traditional business models that monetize user surveillance. Meredith Whittaker is president of Signal and serves as chief advisor to the AI Now Institute. She joins WITHpod to discuss the rise of big tech, the internet’s trajectory from commercialized to open and back again, concerns about tech’s role in American democracy, her thoughts on proposed TikTok bans and more.
Note: This is a rough transcript — please excuse any typos.
Meredith Whittaker: The business model in tech is monetizing surveillance. And if you want to focus seriously on privacy, which requires, you know, not adopting that business model, not monetizing data you collect, either through training AI models or selling ads or what have you, then you cannot be pressured by commercial incentives to put profit and growth above that mission, which is, you know, ultimately what a commercial entity is bound to do.
Chris Hayes: Hello, and welcome to “Why Is This Happening?” with me, your host, Chris Hayes.
It’s been a very weird last year or so in tech, in the sort of broad universe of Silicon Valley and the internet and the digital world. There was this enormous bubble that happened during COVID and huge skyrocketing of valuations, lots of money sloshing around, lots of huge expansions and hiring. And there were these two huge bubbles that formed.
One was the Bitcoin and cryptocurrency bubble, which has, I would say, subsequently popped and also reinflated. So that’s one. And then remember, there was the whole metaverse bubble. Mark Zuckerberg changed the name of the company to Meta because they were going to put all this investment, I think it was literally billions of dollars of investment, into the metaverse, which is going to be this virtual world.
There were people on the metaverse, which I never actually quite understood, who were buying property on the metaverse for like six figures. It wasn’t real. I mean, I guess everyone knew that, but I never quite understood it. It seemed like a ludicrous bubble, which has obviously since popped and, I think, been shown to be exactly that. In the wake of all that, there’s been a lot of retrenchment. There’s been big layoffs in Silicon Valley.
There’s been a kind of creeping sense that maybe we’re at the end of some kind of era, Web 2.0, it’s often called. And then, of course, there’s now the rise of this new technology that everyone’s going all in on, which is AI, artificial intelligence, large language models, generative AI, ChatGPT, huge amounts of investment going into that. The chip maker NVIDIA, which makes the chips necessary to have sufficient computing power for large language models, has seen an enormous run-up in its stock price and its projections of future earnings.
And, you know, I feel kind of two ways about this. One is this technology seems incredibly promising in certain ways and sort of astonishing. And also, it does feel like Silicon Valley scrambling around to find the next big thing. And the last next big thing was an enormous bust. So should we think the same thing’s going to happen here? And so I wanted to get a kind of big picture view of where we are at this moment in the internet and in tech. One of my favorite thinkers on this, both a sort of theorist and a practitioner, is Meredith Whittaker.
Now, she actually runs a tech company, which is the Signal app. Signal’s a messaging service, encrypted messaging service. You might use it. It’s also a nonprofit. It’s probably the biggest nonprofit tech company out there and we get to talk about that. But she’s also a really interesting thinker and writer. She also co-founded and was chief advisor at the AI Now Institute. She was at Google where she was very active on AI issues as well. And so for the kind of broad view of where we are right now, this very strange moment in tech, I thought it’d be a great time to check in with Meredith Whittaker.
Meredith, it’s great to have you on the program.
Meredith Whittaker: So happy to be here, Chris. Thank you.
Chris Hayes: I want to start really broad. I just read this really interesting, I think it was a speech you gave, or an essay, about the TikTok fight, which I want to get to in a second. But even before the specifics of the TikTok fight, I wanted to talk about, at the broadest level, a thing that I’ve been wrestling with in my generational cohort. And I think we’re broadly similar cohorts, which is I used to be really excited about technology.
I used to really think, like, I was of the first Wired magazine subscriber generation. I convinced my parents to ditch AOL and get, like, an ISP. And I, like, had Mosaic and I was on the internet early and it was mind blowing and explosive. I had the first iPhone. And every new technology seemed promising and cool to me. And now every new technology seems desperately scary. And I don’t know if I’m just getting older or not, but you’re so connected to all this and think so deeply about this. Like, do you feel the same way as someone who lives in it?
Meredith Whittaker: Well, you are getting older. I’m getting older, too, but I don’t think that’s the reason. I think the answer to that question is complicated and it boils down to an answer that’s not really about tech, but more about the business models that we organize tech within and the incentives of the people who currently have the power to decide what tech gets built and who it serves.
So I have actually spent a lot of time over the last year or so going back to the archives and reading a lot of the debates in the 1990s, the legislative proposals, the different civil society actions and what have you, that were informing the conversation on how what they called the internet or the information superhighway or cyberspace was going to be commercialized. Because we had a big moment in the ‘90s where the Clinton administration had, you know, determined that the internet was a big platform for them and they were going to rest a lot of promises of economic prosperity on the idea of this commercial internet.
So I’ll fast forward through that, but basically what happened is the neoliberal consensus, which was very alive in the ‘90s (and that’s a fancy word that means basically we trust the market for everything and we deeply distrust government), determined, and the Clinton administration codified, that platforms or internet companies would not have any restrictions on privacy, and it very explicitly endorsed surveillance advertising as the business model. And this was, of course, important to the advertising industry, who didn’t want to lose another source of revenue.
You know, they supported newspapers. They supported a lot of other media. They didn’t want another media platform that wouldn’t have, you know, them involved. But that combination effectively created this incentive structure for mass surveillance as the economic engine of the tech industry, because on the one hand, you have no restrictions on privacy so commercial surveillance can go well beyond what a government can legally collect and create about citizens.
And on the other hand, you have advertisement as the revenue source. And of course, you know, what do advertisers want? They want to know their customers and they want to reach the people who are going to buy the thing that they are selling. So the more information you gather on the subjects, the better for advertising, or at least that’s the theory. And I think that really set in place the cornerstones that created the collateral consequences we’re living now, where we have a handful of large, you know, surveillance driven tech companies located largely in California that are controlling core infrastructures and platforms and media services for populations across the globe.
Chris Hayes: Part of what I keep coming back to because you talk about the ‘90s is the reason that this sort of question of the model, the business model and the market mechanics, as opposed to the tech itself, is a really important one, is that we live through a period where we went from this very commercialized internet to a more open internet. So the sort of gated community walled garden of Prodigy, CompuServe and AOL, and you would go on one of these services and everything was sort of in this universe, right? It was exploded by the open internet.
And there was this moment where it felt like we had moved from essentially platform dominance, AOL, CompuServe and Prodigy to something far more open. Anyone could go to anyone’s website. I could respond to you. There are all these open protocols. And then now to go back into a world in which everything seems so platform dominated, I think because I had the experience of watching it go from closed to open back to closed, I have this very tangible sense of what’s lost.
Like every time that someone wants me to download one of their apps, I’m like, no, the whole point of the internet is that I don’t have to download your goddamn app. I could just go to your website to make a reservation or whatever. We figured this out. Why are we moving backwards?
Meredith Whittaker: Yeah, I mean, I completely agree, and again, I think we’re around the same generation. So I did have an experience of getting on all the message boards and the blogs and, you know, having a very well populated RSS feed. And now you’re living through what is sort of, you know, trash feeds and bot armies. It feels pretty devastating in some ways.
Chris Hayes: Yeah.
Meredith Whittaker: You know, I think that kind of open close dynamic is a tricky one, in part because, you know, what we were talking about was openly available protocols or standards, right? But they still were gated on the ability to pay for the servers, to pay for the bandwidth, to set up the material infrastructure.
Chris Hayes: Totally.
Meredith Whittaker: So there’s a way that we confused everyone being able to do it if they could afford it and had the skills for a sort of democratic equity --
Chris Hayes: Yeah, great point.
Meredith Whittaker: -- which was not what we were dealing with. And I think, again, really feeds into some of the consequences we’re living through now because, you know, we left that wide open for capture and monopolization.
Chris Hayes: Yeah, this is a really important point, just to stay here for a second, which is that in some of the sort of halcyon days that I often recall, it’s a much smaller percentage of people online. It’s a much more elite group, way more white and male, and much more focused on tech nerds. And so in that little universe, if you had a high degree of disposable income, could pay for a computer and a good internet connection, had tech knowledge, maybe could code a little bit, then it was like this utopian open protocol world for everyone. But this was a tiny little sliver. This was not a mass phenomenon, and it’s important to check myself on that, because it wasn’t really small-d democratic in any broader sense, even if it felt that way to the little community that was involved in it.
Meredith Whittaker: Yeah, I mean, it was certainly fun for us, right?
Chris Hayes: Yeah. Right. Yeah.
Meredith Whittaker: And I think that, you know, there were also community networks that were being built where there were new structures of, you know, a kind of micro social wage contributing to supporting the network infrastructure. Like, there were models that were trialed, you know, in the ‘80s and ‘90s. It was simply that there was a huge amount of pressure from the advertising and tech industries to codify this very particular form of, you know, surveillance advertising, or ultimately what became surveillance advertising, as the commercial engine. I should note, you know, Matthew Crain is the scholar that I reference a lot for this. So people interested in this history should look up his work.
Chris Hayes: Yeah, he’s great. I’ve been reading him for the book that I’m working on, which brings me to the question of Signal, which I want to start with just because I was writing about Signal in the last chapter of this book I’m writing, which is about attention. And one of the things I was writing about is that we use physical metaphors all the time, you know, the information superhighway or surfing the web, right, these different physical metaphors. You know, one of them is that, like, we move through public and private spaces all the time in real life, like some spaces are commercial and some are not. Some are publicly held, some are privately held.
Increasingly, the internet is all commercially held, like you’re always in a mall or you’re always in someone’s building, you know, Meta’s building or, you know, Musk’s building or whatever. And one of the things that is fascinating to me about Signal is it’s a place you can go on the internet. It’s a protocol and an app that you can use that is non-commercial.
Meredith Whittaker: Yeah.
Chris Hayes: And just the idea of that, like it’s amazing how little of that there is. It’s just a non-commercial app. But just explain the basics of the philosophy and how it works that you’re running this app that is not there to make money.
Meredith Whittaker: Yeah. Well, I mean, that is a deep choice for us. It wasn’t simply that we thought, hey, being a nonprofit is a virtuous corporate model. Let’s adopt it to be virtuous. It was really based on an understanding that the business model in tech is monetizing surveillance. And if you want to focus seriously on privacy, which requires, you know, not adopting that business model, not monetizing data you collect either through training AI models or selling ads or what have you, then you cannot be pressured by commercial incentives to put profit and growth above that mission, which is ultimately what a commercial entity is bound to do.
You have certain objectives that you need to meet and those involve revenue and growth and you promise your board. And then if you’re not meeting those, you’re going to have to meet them somehow. So privacy would always be subtended to those objectives. And so being a nonprofit was really important for Signal as a prophylactic to push back on any possibility of pressure, you know, to commercialize and thus to adopt a surveillance business model and whittle away the promises we make to people who really do rely on Signal for life or death situations when digital security is linked to physical safety.
Chris Hayes: Can you say a little bit more about that or just what Signal offers its users and what the app does?
Meredith Whittaker: Yeah. So Signal is the largest truly private messaging app in the world. We have tens and tens and tens of millions of users at any given time, and we are committed to collecting and creating as little data about people who use Signal as possible. So our threat model, which is how we think about, you know, threats, I guess it’s a security term, is really to even view ourselves as an adversary. So we can’t access the data, which means if someone comes with a subpoena or a warrant, we can give them very, very little.
The sum total of what we have is we know that a given phone number uses Signal. It’s a messenger app. You have to register with a phone number. We know when that phone number registered for Signal. We know the last time they logged in, but we know nothing else. So unlike Meta or Apple or others, we can’t give your contact list. We can’t give your profile photo. We can’t give the members of your groups. We can’t say when you contacted whom and, you know, how many conversations you had. We don’t know what you said or when.
And all of that, you can imagine, is incredibly sensitive information for, say, dissidents or people doing human rights work in authoritarian regimes, people organizing labor unions, you know, journalists talking with their sources, whistleblowers, et cetera.
Chris Hayes: That’s what I use it for. Yeah. Yeah.
Meredith Whittaker: Yeah.
Chris Hayes: Yeah.
Meredith Whittaker: So it’s a core infrastructure for dissent and privacy. And I think it’s really important to note that, you know, Signal has something really special that no other competitor has in the truly private messaging space, which is that we have huge adoption around the world. And of course, a messenger, a communications network, doesn’t work if your friends don’t use it. You can be deeply ideologically committed to privacy and, you know, you can’t really use Signal if you don’t have a friend on it, right? So, you know, we are very committed to maintaining the app at a standard and a kind of usability.
That means, yes, the ideological purists can adopt it, but also their dad can adopt it, you know, their mom, you know, their landscaper, you know, whatever other human being who may have other commitments, but nonetheless can pick it up and use it easily when it’s really important and, you know, when they just want to talk about something casual.
Chris Hayes: And I want to just come back around to what you said at the beginning and re-hit the point for folks that are not embedded in this conversation about tech, right, which is that that information about your users, all the data that comes with them, right, who they’re connected to, mapping them, you can use it to map them, you could geolocate, you could figure out what their social networks look like, what kind of person, what kind of things they might want to buy, what their interests are, politics.
All that data around them is both a threat to their privacy, but also is commercially valuable. In fact, it’s the main commercial value of these folks to a platform, which is why, to go back to the original thing, choosing to be non-commercial means that you aren’t driven by the desire to take that data and monetize it, because that’s the inevitable incentive structure of the current commercialized internet.
Meredith Whittaker: Yeah, beautifully put. And it means, you know, in brass tacks, it means I don’t have some guy on my board when we have a quarterly meeting who’s like, you know, you’re not meeting revenue targets and we really need to consider this privacy thing. Is it actually working out right? Like that would be the real pressure we would face. And at some point we would have to choose profit or privacy. If we were for profit, we would obviously choose profit.
Chris Hayes: There are ways in which the commercialization even comes back to the surveillance stuff, but just to continue with this trend of the commercialized internet in its 2024 form and why it feels crappy. You know, even outside of the surveillance question, the spamification of the internet is to me so intense these days. It feels like the ratio of signal to noise is getting out of whack, things I want to see and things I don’t want to see. It feels like it’s just spam.
Meredith Whittaker: Nudes in bio everywhere.
Chris Hayes: Yeah, exactly. Right. Yeah, nudes in bio, exactly. And, you know, that to me also is a little bit of part of the way the commercial internet functions. I mean, even just Google as a product, to me, has gotten worse. I know you worked at Google, if I’m not mistaken, at some point in your career.
Meredith Whittaker: Yeah, for a long time.
Chris Hayes: Yeah, for a while. They seem like they’re spamming their own users more and more, and partly that feels to me like people having those quarterly meetings on revenue, and it’s like, instead of three sponsored links, what if we went to five? Instead of five, what if we went to seven? And it’s like, okay, but at a certain point you’re now spamming me and I can’t use Google anymore. So something’s wrong here.
Meredith Whittaker: Yeah. And, you know, there’s the reality that Google was built for an open web, where they were crawling it and indexing content, you know, for better or for worse, and there’s a lot of worse. And now there’s not that much to crawl, right?
Chris Hayes: Right.
Meredith Whittaker: You’re talking about paywalls. You know, not paywalls for crawling, but you know, you’re talking about walled gardens. You can’t link out to a Facebook post from a Google result. So I think it’s a combination of just the real way a corporation works, where every director is in some petty fiefdom trying to make sure that they’re working on the hot new things so that they can get their OKRs met, so that they can get their promotion so that whatever, whatever, right. Like there’s a lot of just sort of petty self-interest that tracks to different broad incentives.
Chris Hayes: Well, it also strikes me, and Signal’s an app that I use and really like. I wonder if we’re at a tipping point where the user experience is degrading such that it starts to take a toll commercially, right. That you’re juicing revenue through all these different means of selling your users’ data or, essentially, my obsession, their attention, right. What you’re doing with their attention and spamming them, right? Abusing their attention, right, as opposed to conserving it.
You use Google because it conserves your attention. The thing that made it so great was I need to focus my attention on the thing I want to know and you provide that to me. Once you start abusing my attention and wasting it, the value proposition of the product has declined. And it seems to me like that even commercial equation for the customer has changed in all kinds of parts of the internet where I wonder if it starts to have commercial consequences. You know, they’re going to lose users and --
Meredith Whittaker: And use losers actually.
Chris Hayes: And use losers. They also do that. It’s true. There’s going to be bad commercial consequences, even in the own sort of neoliberal terms of how all this is structured.
Meredith Whittaker: I think that is right. I think it’s hard to measure that for a number of reasons, in part because we’re dealing now with infrastructures that we often don’t really have a choice to use, right? What is the alternative to Google in a world where encyclopedias are out of business? Libraries are now closing on weekends because they’re defunded.
Chris Hayes: Right.
Meredith Whittaker: I think there is a way that the users would love to go somewhere else. Where else is there? And this gets back to why there’s only one Signal, right? There’s only one, you know, non-profit, non-commercial, mass private messaging service. And there are very, very few non-profit, non-commercial tech endeavors that actually build tech at scale. It’s because it’s incredibly expensive.
Chris Hayes: That’s the point.
Meredith Whittaker: And it’s incredibly expensive.
Chris Hayes: And you just gave me my next question, right?
Meredith Whittaker: Yeah. Well, we’re just vibing right now because we’re on the same page. But, you know, I’ll just say Signal costs around $50 million a year to develop and maintain. And so there’s a weird split, you know, when I say that number.
Chris Hayes: Wow.
Meredith Whittaker: All the non-profit people are like WTF. That is so much money.
Chris Hayes: Yeah.
Meredith Whittaker: All the tech people are like, whoa, you are --
Chris Hayes: That’s nothing.
Meredith Whittaker: -- so lean, right?
Chris Hayes: Right. Yeah, correct.
Meredith Whittaker: So, you know, we’re dealing with a paradigm --
Chris Hayes: Yeah. Yeah.
Meredith Whittaker: -- that is so capital intensive. And how is it able to be so capital intensive? Because that surveillance business model that has been developed by these, you know, network monopolies is very, very profitable.
Chris Hayes: More of our conversation after this quick break.
(ADVERTISEMENT)
Chris Hayes: Can we talk a little bit more about the surveillance model? Because one of the things I think about the surveillance model is it doesn’t actually work that well. This is one of the great, to me, scams of the modern internet, is ad tech. You know, the idea is you have this incredibly granular sense of who your customers are and I know that you’re Chris Hayes and you live in Brooklyn and you like these things and you like to work out and you like basketball and you’re a dad and, you know, you’re into politics or whatever it is. And so I can sell you these things.
And there’s some level at which that’s true. Like my Instagram feed shows me some dumb gadget to buy for my home gym. Totally. Yes. And then I buy it because it’s like, you got me. But a lot of it doesn’t work. And also, it doesn’t seem to actually remember when I’ve bought a thing, so it keeps advertising it to me.
And I’m just not clear that, like, there’s this vision of it as all this, like, amazingly efficient machine of we surveil you. We take your information and we sell it to advertisers who sell you exactly what you want. And it actually all feels incredibly schlocky to me and like pretty kludgy, like it’s not actually operating with the degree of sophistication that it’s, quote unquote, supposed to.
Meredith Whittaker: You know, I think that is largely correct. And I think what it does do well is sort of it has captive audiences, right?
Chris Hayes: Yeah.
Meredith Whittaker: So if you spam a billion people with something, a handful of them are going to click. Maybe one of their kids is going to accidentally, you know, move through some Venmo flow and buy something. You know, occasionally there’s something cute on Instagram and I’m sluggish enough to buy it, right? But it’s not that, you know, the ads on Twitter actually make me happy, or that I feel so anonymous when I’m advertised, like, an I-love-shamrocks St. Patty’s Day extra-large T-shirt. And I’m like, you just do not know me, do you?
Chris Hayes: Right. That’s my point.
Meredith Whittaker: Thank you.
Chris Hayes: Right, exactly. But it’s funny to me how often that’s the case. And also, I guess what I would say is that I didn’t think that the cutting edge of internet commerce would feel like infomercials on a Saturday morning in 1984, which is what they all feel like. It doesn’t feel slick and glossy. It feels like just a sea of schlock is what it’s being used to sell.
Meredith Whittaker: Remember, like the AOL homepage from back in the day --
Chris Hayes: Right.
Meredith Whittaker: -- when you were, like, a kid in the ‘90s and you were just like, whoa, I can’t pay attention to anything. Everything is flashing and glowing. And, you know, it feels very similar. And I think Tim Hwang, who is a researcher, wrote a really lovely little book a couple of years ago called “Subprime Attention Crisis,” which was, you know --
Chris Hayes: I love that book.
Meredith Whittaker: -- looking at this --
Chris Hayes: Yeah, I love that book.
Meredith Whittaker: -- at this very issue. I don’t want to dismiss the kind of effectiveness of this out of hand, because I do think the power here is the ability, correctly or not, to begin to segment people, like classify them into different categories and then make guesses about, you know, Meredith, lady in New York does a lot of yoga. She’s maybe going to like this, you know, Pilates mat or, you know, whatever it is.
But of course, you know, that’s kind of trying to create different segments and then sort of order them and influence them. And, you know, we don’t see behind the scenes on how easily those objectives are met. I do think, you know, the one thing that the revival of neural networks that happened in the early 2010s did is begin to sort of probabilistically determine those segments. Like, that’s why AI got popular, is because these surveillance companies were like, oh, dang, we have the data and we have the compute, and we can now make these old probabilistic techniques do new things.
And you know what that’s useful for? That’s useful for sort of building heuristics around ad targeting. That’s useful for crafting engagement-driven algorithms. I think it’s not a, you know, coincidence that there was a big announcement, kind of guilelessly made, about the Google Brain team, who had come out of the lab and into the real world of Google doing this ML stuff after Jeff Dean had pushed forward a couple of advances. And they were like, we are redoing the YouTube algorithm.
And that was right around the same time you started to get all the problems of, like, hey, why am I watching, you know, a lecture by a feminist philosopher, and then I’m being recommended Jordan Peterson on autoplay, right?
Chris Hayes: Right.
Meredith Whittaker: And so, you know, I think they’re trashy. I think they are broadly inaccurate, but I think the ability to sort of probabilistically create categories into which you situate people and then to sort of sort them and order them and try to influence behavior based on those categories is something we need to take seriously as a, you know, a bad thing to have in the hands of a few corporate actors who are, you know, broadly unregulated and, you know, well, they are regulated, but not nearly enough. And of course, there’s no federal privacy law in the U.S.
Chris Hayes: Yeah, I mean, I totally agree. I think there’s an interesting intellectual debate between the vision of Shoshana Zuboff, who wrote the book “The Age of Surveillance Capitalism.”
Meredith Whittaker: Yeah.
Chris Hayes: And the Tim Hwang vision of it, which I think is just an interesting one. In some ways, it’s an empirical one, like how well does this stuff work? But that point about segmentation, this is something I’m pretty obsessed with, that the segmentation for consumer marketing to me is less problematic than what it does to, like, the public realm in the broadest sense of democratic discourse, which is to just completely fracture and balkanize it in a way that’s opaque.
So if you’re reading the “National Review” and I’m reading “The Nation,” right, well, at the very least, we could swap magazines.
Meredith Whittaker: Right.
Chris Hayes: In some ways, it’s transparent what was published in “The National Review” that you might have read and what was published in “The Nation” that I read (ph), you know, a left-leaning publication, or right-leaning publication. The black box opacity of the algorithm is like, I don’t frickin’ know what you’re seeing. You don’t know what I’m seeing. No one knows what anyone else is seeing.
I mean, in the specific sense of one thing after another, there’s one video that you can say, well, that has one million views. But in the specific sense of what information a person is being fed over time, I find it so unnerving that there’s no actual definitive answer --
Meredith Whittaker: Yeah.
Chris Hayes: -- that it’s all just a black box.
Meredith Whittaker: Yeah. Yes. Period. Exactly. And it’s a black box that is, again, heuristically determined by a handful of companies to maximize engagement in order to maximize ad revenue. So it’s not, you know, there’s no balanced diet here. We’re just being fed Skittles.
Chris Hayes: Yeah. Yeah.
Meredith Whittaker: And I also think we do need to mention the role of these platforms in hollowing out the possibility for those two physical magazines sitting on a table together. There’s, you know, the decimation --
Chris Hayes: Destroyed. It’s destroyed all of, yeah.
Meredith Whittaker: Of local media, of, you know, anything but the, like, handful of large media houses, the FT, “The Wall Street Journal,” “The New York Times.” You know, the practices and journalistic norms and ethics that undergirded the fourth estate are all but hollowed out.
And I have this experience frequently of being on Twitter and, you know, some guy will quote tweet me and I’ll look at who it is and I’ve never heard of this dude before. And he has like twenty five million followers. Like there’s an incredible parallel ecosystem that just does not ever permeate into my world, but is somehow, you know, clearly more persuasive than a lot of what I do follow and know to some people or, you know, at least has, you know, bought a bunch of bots to follow it.
Chris Hayes: Yeah. Nudes in bio.
Meredith Whittaker: Nudes in bio.
Chris Hayes: So, to me this segues to TikTok nicely, because I want to put aside the question of the forced sale of TikTok, or shutting it down if there’s not a sale, and the policy question around that, which you have really interesting and strong feelings on, which I want to talk about. Before we even get to that, just TikTok qua TikTok. Like, let’s say it was an American company, okay.
Meredith Whittaker: Yeah.
Chris Hayes: Let’s say it was American. It doesn’t matter. As a technology, it feels really unnerving to me, like a kind of apex predator. There’s a great book about slot machines, “Addiction by Design,” by a sociologist named Natasha Dow Schull. And it’s basically about how they kind of outcompete everything else on a casino floor, and over the years they’ve taken up more and more of casino floors because they’re just better at retaining gamblers and making money. They’re sort of the best attentional machine devised. And TikTok feels like that, and I actually think the physical scrolling, it’s like a slot machine.
Meredith Whittaker: Yeah.
Chris Hayes: And I think that’s not an accident. And yeah, it just feels like, you know, fentanyl is to heroin or whatever metaphor you want to use, that it’s just better at retaining attention in a way that others aren’t and it kind of freaks me out how good it is.
Meredith Whittaker: Yeah. No, I mean, I feel like actually the sensory overload of walking into a casino or one of those out of time, loud, blaring, flashing environments is my visceral reaction opening TikTok.
Chris Hayes: Yes. It’s very casino like, yes.
Meredith Whittaker: Like, damn, who is that person? What are they doing and why are they selling like crystal healing? What is going on with that dance? You know, I have no love for TikTok. I think what they’ve done is sort of take a paradigm that Instagram is trying, YouTube is trying, you know, all the platforms are trying.
Chris Hayes: Oh, they’re all copying it now, just straight up. Yeah.
Meredith Whittaker: Yeah. IV feed, you know. So, yeah, I don’t see great benefits from any of these platforms. In fact, you know, I’ve argued the platform form itself, sort of centralized overseers and ultimately centralized modes of control over our global information ecosystem, is a really terrifying place to be, you know, for a number of reasons at this time in the world.
We’re facing imminent climate crisis. We have rising networked authoritarianism around the world. And, you know, the way that good journalism acts as an immune system to some of the kind of authoritarian propaganda, you know, we don’t have that muscle and it’s, you know, almost atrophied completely. So, I have no love for TikTok and I do think there’s something really annoying about it that I can’t really spend much time there or on Instagram anymore. Any of the kind of, you know, the sort of visual feeds.
Chris Hayes: Yeah. Well, the worst feeling is to not like it and still do it, which I think is the worst feeling and the feeling I think increasingly people have about the internet, actually, which I think gets at part of the weird situation we’re in. You’ve come out strongly against the legislation that passed the House on a bipartisan, sort of very interesting, strange-bedfellows basis, you know, both for it and against it, that would essentially force ByteDance, which is the parent company, to sell TikTok to a purchaser that was not essentially controlled by the Chinese government.
And you think this is a very bad idea. Just to put my cards on the table, I feel like I’m a swing voter on this, or like I feel cross-pressured on it. Some arguments in both directions I find persuasive. So I don’t have settled opinions. But what is your argument here?
Meredith Whittaker: Yeah, I mean, look, my argument is coming from the perspective of someone who, you know, starts and ends with the conclusion we are in a really bad place and anything we do here is going to be harm reduction, not actually attacking the root cause, right.
Chris Hayes: Right.
Meredith Whittaker: You know, my position is these platforms should not exist in their current form and they’re, you know, dangerous to democracy. They are dangerous to journalism. They’re just simply too tempting for any centralized government to try to manipulate and control for us to keep them around in their current form. But, you know, the argument on the TikTok ban in particular has a couple of points. One, there’s a lot of noise being made about privacy and about TikTok being, you know, kind of the vector for the Chinese state into data gathering on U.S. citizens. I see no evidence that TikTok offers special access, right.
There are thousands of data brokers across the world, and, you know, a number based in the U.S., that Chinese operatives can access. You know, there is way, way too much data out there. And I don’t see that as a barrier to entry, and I’ve seen no evidence, right. You know, people will kind of gesture to, like, if you had classified information. And I see that as kind of a NatSec red herring in this case, given my understanding of how these platforms work.
I also think that, you know, this is clearly a protectionist policy, even if it is also a NatSec policy in good faith, which, you know, for some it may be. But I think there’s a real concern I have, particularly as we face the very likely possibility of a Trump presidency that is right now preparing to gut the administrative state, pool all power in the executive branch, you know, that would be ready on day one to issue a federal abortion ban, like real drastic measures that you may or may not agree with.
Chris Hayes: Right.
Meredith Whittaker: But nonetheless, I think, you know, a government in waiting that is preparing to pull that much power under the presidency, you know, to be ushering in a ban that gives the president the power to determine a platform is, you know, foreign adversary controlled gives them a really big stick to discipline any platform in the world such that it complies with certain mandates on acceptable speech and expression.
And that gets to the, you know, the kind of uncomfortable third point, which is that we’re already living in a time where in many states, access to LGBTQ resources is banned and criminalized. We’re seeing books being banned. We’re seeing pedagogy that engages race and racism banned. We’re seeing access to abortion resources banned. So, when we think about harmful content or, you know, off-limits speech, obviously child exploitation and other things really do fall into that category.
We’re seeing a move that is very intentional and very well organized to include a lot more expression into that bucket. And I think it’s concerning to me that that is happening at the same time that we’re seeing a full frontal campaign to bring every platform under U.S. jurisdiction. So, you don’t have the benefit even of a kind of cross jurisdictional agonism where maybe the Chinese state doesn’t care about our ability to access LGBTQ resources.
So there is at least a way to access them there, even as the U.S. state, you know, is clamping down on Meta and what have you. And look, I recognize all of this is very hypothetical. You know, I can’t prove that this is what’s going to happen. But I also think the stakes are so high that it is irresponsible for me as an executive, you know, thinking about these issues and leading Signal not to take them very seriously and prepare for them.
Chris Hayes: We’ll be right back after we take this quick break.
(ADVERTISEMENT)
Chris Hayes: So this point that my understanding is the way the legislation works is the president deems, because it’s not specifically TikTok, right --
Meredith Whittaker: Yeah.
Chris Hayes: -- it’s that the president can deem a network or a platform whatever.
Meredith Whittaker: Foreign adversary, which is like China, Iran, Russia, North Korea right now. So, you know, the classic slate.
Chris Hayes: Yeah. Although that’s moved around a little bit.
Meredith Whittaker: I know. I was like, wait, what was the ‘90s --
Chris Hayes: Yeah. It wasn’t the axis of evil, right? China got in there. So, yes, there’s something ominous about the power that gives the president to make that determination, particularly in the context of imagining Donald Trump wielding it as a tool for disciplining networks.
Meredith Whittaker: Right.
Chris Hayes: Right.
Meredith Whittaker: And we recognize that there is a deep tradition of almost total deference in the courts to the executive branch when it comes to matters of national security.
Chris Hayes: Right. Yeah, I find that pretty persuasive. I mean, to me, the best argument isn’t the privacy one. It is actually just the mercantilist trade one, which is like, look, if you can’t sell a Ford car in China, then China can’t sell cars here. And actually, that’s kind of how it works with cars right now. And like, you know, if Facebook can’t operate there, they can’t run TikTok here. Like now it’s very different, a car and a platform because it implicates speech and information and democracy in a way that a car doesn’t.
But that argument, I think, is a decent one. It’s not, like, hysterical, it’s just much more clear-eyed and more honest. Like, they don’t let us sell our cars there. We won’t let them sell their cars here.
Meredith Whittaker: Yeah. And I think, like, let’s just put ourselves in the shoes of a European watching that and like, huh, which one do we want because we don’t have any (inaudible). Right?
Chris Hayes: We don’t have any, right. That’s a great point.
Meredith Whittaker: Because, you know, the top four platforms by users are U.S.-based. TikTok is way behind YouTube and WhatsApp and Instagram. So we’re talking about the U.S. having control over the platforms that constitute a global information ecosystem, not simply the U.S. So I think, you know, yes, if we’re just going to put our blinkers on and look at, you know, like, tit-for-tat kind of trade issues, that’s a kind of metaphorical way to understand this. But I think the stakes obviously redound well beyond --
Chris Hayes: Yeah.
Meredith Whittaker: -- are we driving a Ford or, I don’t know the name of a Chinese car company because they can’t sell them here, but I’m sure they’re --
Chris Hayes: Exactly, but they’re very good and very cheap, very successful.
Meredith Whittaker: Yeah, I would probably drive one.
Chris Hayes: At some point that’s going to fall apart, actually, which is sort of next interesting frontier. So let’s talk about AI because you also have been thinking about AI for a very long time, involved in AI discourse. The place I want to start is just I feel torn. I have two conflicting impulses on AI right now. One is that like, oh, it’s very clearly a pretty impressive technology. And when we say AI, large language models that are useful to produce things that sound like humans, I guess would be one way of thinking about it, right.
That can do the kinds of knowledge work that only humans could do until very recently. Like, please read this chapter and summarize its main points for me. You couldn’t get a computer to do that two years ago, really. And now you kind of can. There’s a whole bunch of other things. Read this brief and try to write a brief that’s a counter-response to it. Five years ago, it’d be very hard to get a computer to do that. Now, a computer can do it and, like, it’s not bad.
So at one level, I’m like, yeah, this is an obviously serious, real technology with real impacts and could be enormously revolutionary. And then also the rush that everybody, just the gold rush feel of it has a very crypto feel to me where I’m like, is this just some nonsense bubble? And I feel conflicted because with crypto, I basically never got the use case. It never seemed like a useful technology to me other than for the purpose of paying ransom. Like that was like the one actual use case for it. Whereas AI doesn’t feel that way. AI feels like a real technology with enormous potential and use cases and also a very frothy bubble around it right now. Help me figure out how to feel.
Meredith Whittaker: Yeah, I mean, there’s a lot going on. You know, I mean, it’s certainly a bubble, you know, not all bubbles pop and crash and go away, but all bubbles do pop at some point, right. The dotcom bubble pops and then sort of cemented the ad tech, you know, the surveillance advertising industry in its wake.
Chris Hayes: How useful do you think the technology is?
Meredith Whittaker: To do what? Is that really, like, summarize a brief?
Chris Hayes: Yeah, to do a lot of the quote, unquote knowledge work that a lot of essentially what we call white collar professionals are paid to do, which is to generate marketing memos and to read briefs and write response briefs or to come up with an ad campaign for a product, you know, all that sort of stuff.
Meredith Whittaker: So I think it can write an e-mail prompt. An e-mail prompt is probably like fine if you’re sending an e-mail and no one reads an e-mail anyway so you just have bots prompting each other forever, right. You know, it can come up with a marketing memo, but you’ll definitely need to get an intern to correct it because it’s going to be boring and anodyne.
Chris Hayes: Right.
Meredith Whittaker: It can summarize a brief, but like you better not be my lawyer.
Chris Hayes: Right.
Meredith Whittaker: Because that summary is very likely to be wrong.
Chris Hayes: Right.
Meredith Whittaker: Very likely to miss key nuance. And a real issue here is it’s not accurate. It is not trustworthy for any domain where facts actually matter. And I want to contrast that with the fact that it is so expensive, right. We’re talking a hundred million dollars to do one training run of, you know, one of these advanced models. There’s discussion of billion-dollar training runs.
And that doesn’t even get into the cost of the labor required to basically tame these systems into some sort of form that will produce, you know, effectively like kind of acceptable liberal discourse that isn’t spewing slurs or otherwise, you know, making it unsafe to use in your chatbot business or whatever.
Chris Hayes: Right.
Meredith Whittaker: And then it costs a lot to run. So running one of these is much more computationally intensive than like a regular information retrieval search query. So we’re talking about a really expensive set of technologies that only a handful of companies actually have the resources to produce and use. And we’re talking about that like, okay, is that worth it for an e-mail prompt?
Chris Hayes: Right.
Meredith Whittaker: Is it actually? Like is that actually going to be a return on investment? Is there a problem big enough that this solves that it’s actually worth all the money? And so that’s where I see the bubble crashing down.
Chris Hayes: Okay, so that actually clarified this for the first time and I feel like I understand because that actually syncs up the two things that are in tension, which is the float of all the money rushing in is obscuring the resource cost. Like the reason that it seems because you could just go to ChatGPT and like mess around and be like, do this, do that. And it’s like, wow, this is kind of cool. I had it write a stand-up comedy routine in the voice of Ulysses S. Grant about the Battle of Vicksburg and it was like pretty good.
But whatever that costs, electricity, computing power, all that, is completely hidden because of how much money is rushing in, so that no one’s paying an actual price for it. It’s a little like, you know, Uber before Uber had to be profitable, where it’s like, oh, I just went 25 miles for $7, like, this works.
Meredith Whittaker: Yeah.
Chris Hayes: It’s sort of doing that right now is what you’re saying.
Meredith Whittaker: Yes. And, you know, we have to see ChatGPT as an advertisement that Microsoft is running, a very, very expensive advertisement that does two things. One, it marketed Microsoft as an AI leader, which it certainly was not considered before it ran that ChatGPT ad. And two, it’s a really good ad for their GPT APIs, which they sell as part of their Azure cloud services, right.
So, you know, they’re saying you can play with GPT. It costs us, you know, $100 million a year to run ChatGPT because we’re having to pay for all the compute and we’re having to maintain the system. But, you know, what that’s really doing is luring cloud customers in, where we market to them like, hey, you should buy access to our GPT services that you can effectively white-label for your fake AI startup, which isn’t really building AI. It’s just rewrapping a Microsoft API or an Amazon API or a Google API. So that’s the go-to-market strategy. Either you go to market through a cloud company --
Chris Hayes: Okay.
Meredith Whittaker: -- or you go to market because you’re Meta and you have a billion users on your platforms and you can just sort of use AI to reach those users, you know, basically to calibrate your advertising systems or, you know, give advertisers new tools or what have you. But there isn’t really another go-to-market strategy here. So this is why I’ve written with Amba Kak and Sarah Myers West and others at AI Now that AI is really a big tech technology.
And this is why OpenAI is bolted onto the side of Microsoft. This is why, you know, Inflection just had a weird, you know, it looks like kind of an acquisition, not quite an acquisition.
Chris Hayes: Didn’t Microsoft steal their CEO or something?
Meredith Whittaker: It was super weird. It was, like, their CEO and then most of the staff, and then there’s some investor payment coming. But basically it looks like Microsoft is sort of buying their model, so they’ll now be able to, you know, you can license an Inflection API through Azure. You can license an OpenAI model, you know, a GPT model, through Azure. You can license a Mistral model through Azure, right. But ultimately, like, the business model of these startups is to bolt themselves onto one of the big companies because there’s no other way to go to market.
Chris Hayes: And that’s because the computing power you need is so intensive.
Meredith Whittaker: Yeah. You know, it’s so intensive and you don’t have the cloud businesses and the economies of scale, right. Microsoft has data centers across the world with, you know, hundreds of thousands of SREs maintaining those --
Chris Hayes: What are SREs?
Meredith Whittaker: Site Reliability Engineers. Sorry. It’s like, you know, like they’re the guys who are like making sure the servers run at the low level. And then you have like DevOps who are crawling through the server rooms, like getting the rat out of the --
Chris Hayes: Right.
Meredith Whittaker: -- tunnel that like mess up the wiring, all that. You know, it’s expensive.
Chris Hayes: There’s actual physical infrastructure here, and a ton of it, and it’s extremely power intensive. There’s a “Times” article about how all this is coming online. It needs 24/7 power. It’s pushing power consumption up at a time when we’re in a very precarious situation because we’re doing this kind of grid transition to lower-carbon energy. So there’s that part of it, too.
Meredith Whittaker: And then, I mean, it’s four times more power intensive to run a data center with GPUs, which are the AI chips at this point, than it is with CPUs, which is what everyone was using.
Chris Hayes: Wow.
Meredith Whittaker: So you’re having to retrofit these data centers for these sort of hyperscale compute environments. And, you know, Northern Virginia recently said, we cannot accept any more data centers unless they bring their own power source, because Dominion Energy, which supplies their grid, doesn’t have the power. So, you know, we’re hitting hard caps here. And again, I want to go back to this being sort of the only go-to-market strategy, because no one else has the sort of cloud business model or the global reach.
So, you know, if I want to stand up a global app, I want it to get to users everywhere. I want that to be high availability and have, you know, the features like video sharing and all of this stuff that takes a huge amount of bandwidth and you need your servers proximate to your users or it’s going to round trip around the globe, and then we wouldn’t be able to be having this conversation.
Chris Hayes: Right.
Meredith Whittaker: Microsoft, Google and Amazon have 70 percent of that market, followed by some Chinese companies, right. So you don’t have the opportunity. You know, you can’t just do that yourself if you are any other, smaller company.
Chris Hayes: Okay. Again, I’m a little slow here, but what I’m hearing is that --
Meredith Whittaker: Well, I talk fast.
Chris Hayes: What you’re saying is, in a tech world that is already dominated by giants, it is conveniently the case that this technology is the most favorable to massive, well-resourced incumbents. And it’s probably not an accident, the enthusiasm for this technology by those same well-resourced, massive incumbents, because they’re aware that this is not, like, a technology where there’s some upstart that’s going to upend things. They see it as a means of further consolidating their sort of massive incumbency.
Meredith Whittaker: Yeah, exactly. And, you know, it’s a really good narrative. So, you know, I think it’s important to get just a little bit of the history here.
So there’s a paper in 2012 that basically, kind of, like, launched AI, machine learning, whatever, onto the scene. You know, basically it showed that you could use these GPUs, these powerful parallel-processing chips, and huge amounts of data via the ImageNet data set, which was scraped from the web, you know, the collateral of the surveillance advertising business model --
Chris Hayes: And human production like cost.
Meredith Whittaker: Exactly. And, you know, it was only possible due to Amazon Mechanical Turk workers. In fact, like, Fei-Fei Li says she was about to abandon creating ImageNet until a grad student told her, oh, there’s this microtask service. You can get them to label it instead of grad students and it will take less than 100 years.
So, you know, it’s very contingent. But, you know, this paper showed that techniques that were developed in the late 1980s could do new things when you had compute, huge amounts of compute and huge amounts of data. So it was basically favoring an industry that had already sort of consolidated many of these resources in a way that had no real competition. And I think it’s really notable that when this came out, we weren’t really talking about AI.
We were talking about machine learning. We were talking about neural networks. We were using kind of technical terms of art. But the AI narrative was kind of bolted onto that with the superintelligence, with this idea of building AGI, which I find to be, you know, a really powerful marketing narrative if what you want to do is sell the derivative outputs of your kind of surveillance business model, these models created by the data and the compute, as intelligent, as capable of solving problems across a billion different markets, from education to health care to, you know, whatever.
So I think we need to trace also the history of that term AI and particularly like how it became favored now and how much benefit we’re giving to these technologies because, you know, we recite the word intelligent when we speak about them.
Chris Hayes: Yeah, I mean, I had a conversation with Kate Crawford, who co-founded a center with you, if I’m not mistaken.
Meredith Whittaker: Yeah. She’s at Microsoft. Yeah.
Chris Hayes: At Microsoft, where we talked about the kind of like the magician trick feeling, you know, that like, yeah, it looks like he made it disappear, but it didn’t actually disappear. There’s just a lot of computing power running. I guess I do feel, though, that and I totally understand that history, right, that the surveillance tech model, huge data sets, huge amounts of computing power, you have these two things. Okay, what can you do with them? Okay, you can do large language models.
And there was something utterly ingenious, right, about the chatbot and the packaging, right, to go back literally to the Turing test, Alan Turing, sort of one of the original thinkers on artificial intelligence. And his test is, can you interact with an artificial intelligence, a robot, and not know that it’s a robot, not know that it’s a computer? And to go back to the Turing test and come up with, you know, a chatbot that you interact with that does uncannily feel like something’s on the other side of it.
I guess my question is, so then what next? Like, your view on this is that it’s pretty cynical, that it’s sort of mostly marketing mumbo jumbo to repackage a set of resources and business advantages that already existed. But it also seems to me like maybe it won’t stay that way. I guess maybe what you’re saying is it’s not actually that useful in the end, but isn’t it going to get better and then be really useful?
Meredith Whittaker: It is useful, but it’s not necessarily useful to all of us, right? I think, you know, you can look at the writer’s strike and see who it’s useful for and who it may threaten, right. It doesn’t, you know --
Chris Hayes: Yes, I mean, that’s the worry.
Meredith Whittaker: Replace a human being, right. We know that the scripts it produces are going to be like worse than the worst Netflix dreck, right. And that it will still require human writers with sensitivity and ingenuity to get in there and rewrite that script because it’s going to be just trash unless they do. But it does allow the studios to give those, you know, people who used to be called writers the title of, you know, editorial assistant, hire them through some gig platform and degrade the skill that they put into that, right.
So I think, you know, again, these are really neat. But given, you know, the centralized nature of the resources required, given the cost and given, you know, what they’re actually good at, which is, I think, you know, they’re very good surveillance techniques, right. AI is very good as a surveillance tool. It’s very good in military applications. It’s very good in employee surveillance, right.
But it’s not necessarily good at things that are, you know, particularly beneficial for the many. You know, I do think we’re going to have to do a lot of pruning of this very weedy garden of claims and hype and really get down to, like, what problem does this actually solve? While keeping in mind that the issue is not necessarily the tech itself, it’s how the structures that create it are shaped, the corporations and the governments that have the ability to determine who it serves and who it subjects.
Chris Hayes: Meredith Whittaker is president of the Signal app. She’s also co-founder and chief advisor to the AI Now Institute. That was such a great conversation, Meredith. Thank you so much.
Meredith Whittaker: Thank you so much, Chris. I genuinely enjoyed it.
Chris Hayes: Once again, great thanks to Meredith Whittaker. By the way, just a plug. I use the Signal app and I think it’s great. You can check it out and download it and use it yourself. You can e-mail us at withpod@gmail.com. You can get in touch with us using the #WITHpod. You can follow us on TikTok by searching for WITHpod. You can follow me on X, Threads or Bluesky @chrislhayes.
“Why Is This Happening?” is presented by MSNBC and NBC News, produced by Doni Holloway and Brendan O’Melia, engineered by Bob Mallory and featuring music by Eddie Cooper. Aisha Turner is the executive producer of MSNBC Audio. You can see more of our work, including links to things we mentioned here by going to nbcnews.com/whyisthishappening.