This week, the team dives into why disinformation and the AI industry's battles have quickly moved to the center of the ongoing conflict between the US and Iran. They also discuss how prediction markets like Polymarket and Kalshi are increasingly facing insider trading accusations and ethical questions. Also, how did Paramount beat Netflix in its bid for Warner Bros.? Plus: Hosts Zoë Schiffer, Brian Barrett, and Leah Feiger share their predictions for the future.
Articles mentioned in this episode:
- X Is Drowning in Disinformation Following US and Israeli Attack on Iran
- How Journalists Are Reporting From Iran With No Internet
- Anthropic Hits Back After US Military Labels It a ‘Supply Chain Risk’
- A Former Top Trump Official Is Going After Prediction Markets
- Everything Larry and David Ellison Will Control If Paramount Buys Warner Bros.
You can follow Brian Barrett on Bluesky at @brbarrett, Zoë Schiffer on Bluesky at @zoeschiffer, and Leah Feiger on Bluesky at @leahfeiger. Write to us at uncannyvalley@wired.com.
How to Listen
You can always listen to this week’s podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here’s how:
If you’re on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts and search for “uncanny valley.” We’re on Spotify too.
Transcript
Note: This is an automated transcript, which may contain errors.
Brian Barrett: Hey, it’s Brian. Zoë, Leah, and I have really enjoyed being your new hosts these past few weeks, and we want to hear from you. If you like the show and have a minute, please leave us a review in the podcast or app of your choice. It really helps us reach more people. And for any questions and comments, you can always reach us at uncannyvalley@WIRED.com. Thank you for listening—on to the show.
I’m so excited that I am in New York doing this in-person with Leah Feiger. Zoë, you’re still on the screen.
Zoë Schiffer: I know I am. Brian, should we brag about why you’re in New York?
Brian Barrett: Yes, we absolutely should. Leah was honored last night at the Front Page Awards as Journalist of the Year.
Zoë Schiffer: Correct. She got a literal physical award.
Brian Barrett: She did. Not only that, Zoë, you don’t know this, but I do because I was there.
Zoë Schiffer: Wow.
Brian Barrett: She got introduced by our editor in chief, Katie Drummond. She made a video about her achievements last year, and she gave a lovely speech.
Zoë Schiffer: I love this.
Leah Feiger: It was really nice. To be clear, to me, this is an award for all of WIRED.com. And in my video, I did not mention myself once because this is about WIRED and all about WIRED.
Brian Barrett: It’s about Leah, and it should be.
Zoë Schiffer: Welcome to WIRED’s Uncanny Valley. I’m Zoë Schiffer, director of business and industry.
Brian Barrett: I’m Brian Barrett, executive editor.
Leah Feiger: And I’m Leah Feiger, senior politics editor.
Zoë Schiffer: This week, we’re diving into the ongoing conflict in the Middle East, particularly as the AI industry has been entrenching itself with the Department of Defense. We’ll also discuss what’s going on with prediction markets, and what we make of Paramount and Warner Bros.’ potentially historic merger.
Leah Feiger: Let’s jump right into what’s going on with Iran. It has been nonstop since the US and Israel began coordinated military strikes on Iran on Saturday. Iran has responded with its own attacks on US bases in countries across the Gulf. Things have escalated really, really quickly.
Archival audio: Iran’s Supreme Leader, Ayatollah Ali Khamenei, has been killed in today’s joint attack by the US and Israel.
Archival audio: Iranian officials say airstrikes hit an elementary school Saturday, killing more than 160 people, mostly children.
Archival audio: US embassies across the region are now telling Americans to shelter in place—
Leah Feiger: I know we were all working on this this weekend, but I was sort of stunned by how quickly disinformation became the center of this conflict. WIRED reviewed hundreds of posts on X, some of which racked up millions and millions of views, that promote misleading claims about the locations and scale of the attacks. Our colleague David Gilbert reported on some very specific examples, and the range was wild: from AI-generated images, to video game scenes being passed off as real footage, to countries getting mistaken for each other. To me, it’s a combination of—obviously, there’s a lot of disinformation out there, but it’s also because this is just chaos.
Brian Barrett: Yeah. I think to me, the disinformation itself is maybe less surprising than the lack of urgency around fixing it or doing something about it—which, I guess I shouldn’t be surprised at that either. But I feel like every time anything happens, you get sort of—it’s almost the same lineup of the video game footage and—
Leah Feiger: No, the blog writes itself.
Brian Barrett: Yeah, it really does. And X got rid of most of its trust and safety team. They’ve got community notes that they append to some of these posts, but by the time a community note gets on there, the post has already been viewed 4 million times. And it’s below the post anyway, so you’ve already seen it, and it doesn’t really seem to stop these things from getting distributed.
Zoë Schiffer: This is the culmination of years of product and policy decisions. It’s what happens when you make the platform hostile to journalists. You get rid of most of your fact-checking team and content moderators. You rely on community notes, which have proven time and time again that they’re really effective for certain things, but during breaking news they’re woefully inadequate. And you pay people for traffic, which incentivizes quick hot takes, whether or not those takes are actually grounded in reality. So yeah, I mean, we should continue to report the story. We should not continue to be surprised by this story.
Leah Feiger: And that’s absolutely right. I mean, there’s been some amazing WIRED coverage of how Iranian journalists and activists and just ordinary citizens are trying to get information on the ground and actually get it out of the country. So it makes me think, who is this disinfo for, right? Is it just to create more chaos, muddy the waters? Whatever it is, X is an absolute cesspool. It’s already so difficult to trust the numbers and facts and figures coming out. Who’s actually able to get internet access? It’s constant.
Brian Barrett: I think one thing it’s for—I think a lot of this comes from accounts with blue checks, and blue checks can monetize content. So a lot of it is really—I hate this phrase because it’s targeted at journalists a lot, but I think in this case it’s true—for the clicks. But your point is right: Iran has 4 percent internet connectivity right now. So there is all this narrative happening around the country. The journalists left X, but the politicians are still there. I think there were a couple of high-profile instances of legit politicians freely posting, commenting on things that were fake as though they were real. And that shapes public opinion. And public opinion really matters in a time like this. We’re in a war that is not authorized by Congress, that threatens to spill out into a much bigger conflagration.
Leah Feiger: I mean, it doesn’t really seem like this is ending anytime soon, if our defense secretary Pete Hegseth’s recent comments are to be believed. I believe CNN has been reporting at this point that over a thousand people have been killed in the fighting in the Middle East from strikes. A number of those are US service members. And WIRED’s own core interests are caught up in this as well: We’re talking about trade. We’re talking about oil. We’re talking about data centers, and how all of this is going to be swept up. I don’t know. I don’t really see a world where we’re not going to be talking about Iran anytime soon.
Brian Barrett: It’s interesting. Molly Taft, our great climate writer, has written about how oil and gas prices spiked, which you would assume: The Strait of Hormuz is not officially closed, but it’s basically closed because the Iranian military has said, “Don’t go in there.” But it has downstream effects too. Fertilizer prices are going through the roof. There’s a story about this on WIRED on Wednesday, because the Middle East supplies a huge amount of the world’s fertilizer. Now, you may have noticed that it’s also just about springtime, which is kind of when US farmers need fertilizer the most.
Leah Feiger: Great.
Brian Barrett: Yeah.
Leah Feiger: This is good. It’s just great all around. I mean, there are so many knock-on effects. I have to also shout out CNN’s world team, which has been doing the most unbelievable updates, minute to minute, hour by hour. It’s just this little corner on the side of my screen, and they’re getting really into the nitty-gritty of what this means. It’s not just these different strikes. There’s, for example, right now an 18-hour traffic jam in Lebanon as people are trying to get out. These are very, very specific things. But yeah, Brian, I think about the fertilizer. I think about all of these knock-on effects and just the full, full spiral the entire world is being pulled into right now. Already this morning on a bunch of different travel accounts, people were starting to talk like, “Can we be in Europe this summer?” And I’m like, “Whoa, this has hit the influencer spaces.” People are having a conversation about World War III. They’re sharing that Sex and the City meme of Sarah Jessica Parker: “What do you mean, World War III? And that made me think, what about World War Me?” The fact that this has made it into the public lexicon so quickly, I just don’t see it ending anytime soon.
Zoë Schiffer: I mean, I think within this conversation, we have to talk about the AI angle, because the conflict is happening on the heels of the Department of Defense making, and then potentially breaking, deals with top AI companies. So this past Friday, OpenAI struck a deal with the Department of Defense right as Anthropic was going head-to-head with that same department over concerns about how its technology would be used. It wanted a couple of conditions, including a ban on surveillance of American citizens and a ban on using its technology to build fully autonomous weapons. The DOD was not a fan of putting those conditions in the contract. And then on Saturday evening, the day the Iran strikes began, Sam Altman started an Ask Me Anything thread on X saying basically that the deal that he’d cut with the Pentagon was rushed. The optics didn’t look great, to say the least. But ultimately he defended the company’s decision by explaining that their goal was to de-escalate things between the AI industry—obviously, Anthropic—and the DOD.
Leah Feiger: I mean, our conversations from last week’s episode feel so prescient now. We were like, “This is it, guys. This could be real bad.” Days before the strikes. It’s so much worse now. It is so much worse. I mean, can we talk about Sam Altman’s AMA?
Zoë Schiffer: Yeah. I mean, so it’s so interesting because I feel like during our last conversation, we were talking about the optics and the branding. And I feel like yet again, Anthropic has really come out on top. I was thinking of these events in terms of recruiting, which sounds so dumb, but it’s like there is such an intense talent war taking place among the major AI labs. And Anthropic, I feel like continues to position itself as the good, the level-headed AI firm. And OpenAI continues to kind of blunder in these moments. And whether or not you believe Sam Altman, it comes out looking a little sloppier and a little less like it has a firm set of values it’s following. And I think that’s actually really going to matter in terms of who is able to get top research talent to join their labs.
Brian Barrett: The public perception thing is really interesting to me, because I feel like—and I’m curious what you guys are seeing too—it feels a little bit off. I think there’s this sense that Anthropic is the—if not the woke AI—it’s coded blue now, right? But first of all, Anthropic products were used extensively in the initial strikes on Iran and continue to be, right? It’s a six-month phase-out.
Leah Feiger: This feels a little bit like we’re giving them almost too much credit and maybe they’ve won this culture perception war here, but in the long term, they were part of this. They said, “Please don’t use our robots to kill people without a human clicking the button.” That’s the conversation here.
Brian Barrett: But I think that conversation too—isn’t it more “they’re not ready yet” than “it can never do this”?
Zoë Schiffer: Yeah. They’re basically saying, “Look, it’s not fully reliable.” But it’s interesting, because the reason that Anthropic was able to cut these deals in the first place was that their models were being run on Amazon’s secure servers. Amazon was already FedRAMP-authorized. And so they basically had access to contracts that would allow their models to be run on classified systems, which OpenAI did not have. I also was having conversations with people inside OpenAI who were kind of pushing back at this idea that this moment is a real sea change for the company, because, as you might remember, OpenAI famously had kind of a blanket ban on military use, and then they started to redefine that policy over time. But people at the company were like, “Look, we were a very small AI lab. We had a blanket ban.” But then they started to have these conversations internally of, is it a blanket ban on all uses, or could you allow the Pentagon to use ChatGPT to summarize their emails but not build autonomous weapons? And they started to need to really define that. But because they’d come out so strong by being like, “We’re building AI for the benefit of all of humanity,” as soon as they started to tweak their policies, people were really calling them out for being hypocrites, which I think they’re getting more and more sensitive to over time.
Leah Feiger: We had talked earlier about how tech workers at Google and OpenAI were circulating tons of letters calling for clearer limits on how their employers were working with the military and DOD after the US’s strikes on Iran. When it comes to recruitment, when it comes to actually keeping people in house, do things like this matter?
Zoë Schiffer: Yes, 100 percent.
Leah Feiger: OK. OK.
Zoë Schiffer: I think it’s a really good question, but I think that if you talk to researchers—and not everyone feels this way—there are a ton of researchers, and I think it makes sense because a lot of these people come from academia, who tend to be a little more idealistic. They do not want anything to do with military use. And they really want their company to say firmly, “We’re not going to be involved in autonomous weapons.” You can no longer join a frontier AI lab and say, “I’m not going to work with the military at all,” because they’re all trying to get these government contracts. But I think that there is a pretty large contingent of people who are very disgusted with the idea that their cool, nerdy, cutting-edge AI research could be used in—basically could be used to kill people. They just don’t want that. And I think that that actually matters.
Leah Feiger: Do you think they’ll quit over it though?
Zoë Schiffer: I mean, I think we’re already seeing some people quit OpenAI and join Anthropic.
Brian Barrett: Yeah. Yeah. I think that’s absolutely true. And I think you have to remember too, it’s a weird market in the sense that anywhere you go, if you’re there for a decent stretch—let’s say you’re vested—you’re going to have enough money that you are never going to have to work again. So you’re not really trapped. There are so many places that are going to give you so much money that I feel like that’s less of a consideration as well.
Zoë Schiffer: Yeah. I mean, to be clear, I think we should underline that point. We’re not talking about people quitting and giving everything up. They’ve made generational wealth. They can now quit their current job that’s paying them millions and millions of dollars and then go to another job that will also pay them millions and millions of dollars. So the world is their oyster.
Brian Barrett: I want to go back really quickly to one point, Zoë, that you made a little bit ago, in terms of the messaging: how OpenAI’s early messaging backed them into this corner that now they’re kind of unwinding. I think messaging has a lot to do with the US military’s interest in this stuff too. If you go around saying from the very start, “I’m building something that is more powerful than nukes and is going to irrevocably change society,” the US government and the US military and other governments and other militaries are going to be interested. If someone were to say, “Hey, by the way, I’m building a nuclear bomb in my garage,” they are also going to get a call from Pete Hegseth, right? So they have built this image, and right or wrong, however strong you think AI is going to be at a certain point, if that’s the message you’re projecting, it should not be that surprising that it’s come to a head like this.
Zoë Schiffer: Yeah. I mean, and it’s also worth saying really clearly that we’re talking about fully autonomous systems. We already have fully autonomous weapons. They’re being deployed in Ukraine. They’re little drones and they’re not run with LLMs. The models are simpler, but it makes sense that—especially when we’re talking about a system that is more advanced and less reliable, there’s a little hesitation to just say, “Carte blanche, do whatever you want.” But that’s in fact what the Pentagon is saying is required for a government contractor at this point.
Leah Feiger: I have to say that watching the Pentagon respond to all of this has been wild, really, really wild. They very much view this technology as theirs. The idea of this ownership, the idea that, no, this was made in America—you made it, but it’s for us. This is very much for us. And watching that reaction play out in press conferences and on X—I mean, it was really wild to see Emil Michael, for example, just go on rants. Zoë, are you going to say it? Are you going to say—
Zoë Schiffer: Emil Michael, you should turn off your views on LinkedIn because people can see when they look at your phone.
Brian Barrett: Yeah, that’s a tip for Emil Michael if you’re listening. And then, if you listen to Uncanny Valley, which you should.
Leah Feiger: And this is really the Trump official that’s kind of leading the war against Anthropic here. He has these deep ties to the tech world, and he’s positioned himself as public enemy number one of anyone who’s trying to cross the Pentagon. So the fact that, one, he has maybe not the best opsec in the world, but two, is comfortable making this such a public battle, is very much something to take note of, because it’s not just a message to Anthropic in my mind. It’s a message to anyone else who would dare question these policies.
Brian Barrett: I want to go to something top of mind. Have you all checked your Kalshi or Polymarket portfolios lately? How are we doing?
Leah Feiger: Well, I haven’t invested yet in our Survivor winners, but soon, soon.
Brian Barrett: No, in terms of things that are surprising but shouldn’t be anymore: Prediction markets obviously have sort of taken over so much of our lives in so many ways—I mean the general “our,” not the three of us.
Leah Feiger: The three of us actually text every single morning and go, “What’d you make on Kalshi last night?”
Brian Barrett: Yeah. But surprising, not surprising, there’s so much betting going on around the Iran war to continue that thread. Right now, one of the top bets on Polymarket is “Will the Iran regime fall by June 30th?” Total bets around 7 million dollars in that market alone.
Leah Feiger: That’s so upsetting, Brian.
Brian Barrett: Yeah.
Leah Feiger: These are people’s lives. I don’t know. I understand that so much of this has become a gamified version of itself. I understand that the stock market gamifies the way we do so much of all of this, but this feels extra gamified to me.
Brian Barrett: Well, and in terms of people’s lives, I mean, there was a big controversy just earlier this week about how Kalshi settled a bet or resolved a market. There was a 54 million dollar market on the fate of Iran’s supreme leader. I believe that they phrased it—Leah, correct me if I’m wrong—they phrased it as, like, he’ll be out of power.
Leah Feiger: That was exactly what happened.
Brian Barrett: And then he was blown up by a missile.
Leah Feiger: So technically out of power, but that wasn’t the bet.
Brian Barrett: Because you can’t bet on deaths—they’re like, “We don’t allow you to bet on deaths here.” So he’s out of power, but not. So they are invariably betting on whether people will die, just finding cute ways around it and then having a hard time resolving these markets. That’s a problem. The fact that there was a 54 million dollar bounty out on this guy collectively from bettors, including one user called MAGAMyMan who had won $553,000 on the timing of all this—it is outrageous. And I think whether or not any of this is insider trading, it’s grotesque.
Leah Feiger: It’s grotesque. That’s the word. That’s the word that I was looking for, for sure. It’s the gamified grotesque, like a very, very narrow look at the value of people’s lives.
Zoë Schiffer: On slightly lighter Polymarket and Kalshi news, we are seeing a lot of markets that have to do with what looks like internal company data—like the launch of GPT-5, or Sam Altman being ousted as CEO years ago. And last week, Kate Knibbs reported a story that OpenAI had actually fired an employee for insider trading on prediction market platforms like Polymarket. OpenAI’s CEO of applications, Fidji Simo, disclosed this in a note to staff. She said that the employee in question, quote, used confidential OpenAI information in connection with external prediction markets. But we’re seeing this at a bunch of other firms. Kate Knibbs reached out to a lot of big tech companies. Very few would actually comment on the record, but data suggests that this is far from the first time this has happened. There’s a pretty famous example of the so-called Google Whale, which was a pseudonymous account on Polymarket that made over a million dollars trading on Google-related events.
Brian Barrett: And like multiple, right?
Zoë Schiffer: Yes.
Brian Barrett: It’s not just that they took a random shot. And there’s just no real interest in cracking down. I mean, Kalshi has recently taken action against two people—they found two instances of insider trading, one person who was running for office, one person who was tied to a YouTube account—and suspended their accounts for a couple of years or so. But those are tiny enforcement actions, I think sort of meant to show, “Hey, we’re doing something,” when really they’re not.
Zoë Schiffer: I think we’re also like, we’re looking to the platforms and the companies to crack down themselves at this point because the Trump administration has taken a much friendlier stance toward prediction markets than the Biden administration did. And why might that be?
Leah Feiger: I mean, look, the Trump family has these ties to the prediction market world that really remind me of their investments in the crypto world too. In so many ways, these are things that don’t necessarily have the government regulations that they could or should have. Truth Social, the social media platform that is majority-owned by Trump and his family, is planning its own prediction market offering. It’s going to be called Truth Predict. And Donald Trump Jr. is an advisor to both Kalshi and Polymarket already. His venture capital firm has invested in Polymarket. They are very much in this. These are the exact same conversations that we were all having last year, or even just a couple of months ago, about the Trump family being in the crypto world. I wouldn’t go out on a limb and say they’re personally gamifying everything that’s happening in Iran, but they’re benefiting from it, they’re profiting off of it, or at the very least they’re seeing that this is a ripe market for expansion and going, “I want in on that.” Forget the fact that their family—their father—is the one that’s helping make these world decisions.
Brian Barrett: And this is where that insider trading thing gets potentially even scarier. There’s no indication that these people are insider trading off of big global events yet, but there are people who have a vested interest in doing that, who would know. And even just the perception that policy could be made based off of looking to score a quick buck is damaging in and of itself. And I think that’s sort of where we’re headed.
Zoë Schiffer: Before we go to break, I want to give listeners a little insight into a very different business story. This one is about a transaction that we’re keeping a very close eye on, which seems to be very near the finish line. So late last week, Warner Brothers agreed to be acquired by Paramount Skydance in a 110 billion dollar deal with Paramount agreeing to pay a seven billion dollar termination fee if federal regulators don’t approve the merger. So Larry and David Ellison are continuing to become even bigger media moguls than they already are.
Leah Feiger: I mean, let’s spell this out. They already are running CBS. This would give them control of CNN.
Brian Barrett: Well, and let’s not forget that Larry Ellison has a huge stake in TikTok. No, it is sort of this ongoing consolidation of media in the hands of Trump allies, which will surely be fine.
Leah Feiger: I’m talking to friends that are reporters at CBS, at CNN, who are freaking out, you guys. They’re looking at this and, forget even the fact that—is Bari Weiss soon to be their boss? Are the Ellisons going to be canning anyone who’s ever spoken out against Trump on CNN? Jake Tapper, are you out of here, buddy? But people are talking a lot in these newsrooms about the overlap already between CBS and CNN. These are very similar products here. There’s an incredible financial team at CBS; there’s an amazing biz team at CNN who are constantly breaking news, who’ve done just amazing reporting over the years. But if you’re looking at them side by side, do you need to have all of this in your portfolio at the exact same time? You just doubled the number of journalists you have, and the journalists there are flipping out. Everyone is really, really scared about what’s to come.
Zoë Schiffer: Right, because we’re not even talking about the seemingly ideological purge that Bari Weiss has already kind of overseen at CBS. I mean, I’m thinking of that Clare Malone article in The New Yorker where I think the quote was like the “de-Baathification of CBS,” which will never leave my head. But I think that they are really trying to remake these news companies supposedly in their mind to make them less lefty, I guess. But I think the result is that lots of people are losing their jobs and the potential for that is only going to spike.
Leah Feiger: It’s very scary. I also—I mean, Paramount wasn’t even the preferred partner here for the deal, guys. This was supposed to be Netflix. People thought that was a done deal at a certain point. WBD, the parent company of CNN, was questioning the Saudi financing backing part of the deal, and whether the Ellisons could really guarantee that they could even put up the billions of dollars for this. There were a lot of questions before we got to this point.
Brian Barrett: I think if you’re Netflix and you’re looking at it, what seems to have happened is, one, Paramount overpaid by miles and miles—
Leah Feiger: My God.
Brian Barrett: And as for Netflix, they are relatively careful—they know how to spend their money. But then also, Susan Rice sits on the Netflix board, and Trump had a meeting with the Netflix CEO prior to this, had reportedly put pressure on him to remove—I’m not saying that that was the sticking point. I think at a certain point, if you’re Netflix, you’re saying, “Well, if I do own this, am I going to get past a Trump regulatory approval process? And even if I do, what kind of pressures am I going to see on the other end of it? Is the juice just not worth the squeeze?” And I think either way, as someone who enjoys seeing movies in a theater, I’m broken up about this no matter what. I think Netflix would have been a tough situation for other reasons. But yeah, I think there was no real path, given how much money the Ellisons have and how much Trump did not want the Netflix part to happen.
Leah Feiger: The pressure was real. Honestly, we’ve talked about so many diverse topics this week. I keep getting back to all of this as just the world of Trump. This is Trump’s pressure. This is Trump’s family getting in on the markets. It’s endless. I can’t imagine what the US business landscape—what the US media landscape—looks like without Trump’s fingerprints on everything right now.
Brian Barrett: In terms of everything, can I list off some of the IP that the Ellisons will now own?
Leah Feiger: Don’t do it. God.
Brian Barrett: This isn’t everything, but it is—so CBS and CNN, we talked about. HBO, DC Comics, Harry Potter, Star Trek, Looney Tunes, two dozen cable networks that your mom watches.
Leah Feiger: I mean, they’re going to de-woke Star Trek. Shout out to my mom. She’ll be really, really upset if anything happens to that IP.
Zoë Schiffer: They can have Harry Potter. I felt like that franchise was very disappointing.
Leah Feiger: It’s kind of mind-boggling though. It’s really giving Veruca Salt, right?
Brian Barrett: Yeah. Willy Wonka, also a Warner Bros. Discovery property.
Leah Feiger: Fantastic. I’m going to have to pay a million dollars now that we’ve made that reference.
Brian Barrett: Well, Leah, save some of that energy for after the break when we are inaugurating a new segment called Futurecast. We’re going to share some of our predictions related to tech and beyond. Stay with us.
Leah Feiger: So guys, this week we’re switching it up and doing our first Futurecast segment. This is our time to bring predictions to the table. What do we think is coming down the pipeline next week, next year, next month? Who wants to go first?
Zoë Schiffer: OK. I’m going to think through this in real time, because I had one and then I changed my mind. But I think open models are potentially an existential threat for OpenAI and Anthropic. What we’re seeing right now is this question around whether scaling laws are going to hold: If you throw more compute at these frontier models, are you able to train smarter and smarter models that leap forward in terms of their intelligence every time there’s a new release? Despite what happened with OpenAI’s GPT-5, where it was kind of hailed as the coming of AGI and then was pretty disappointing, the models are continuing to advance really rapidly. I think Anthropic’s coding models are a really good example of that. But we also know that if you throw enough tokens at a model, you can extract the model parameters and essentially build a frontier model that looks a lot like Claude. We’re seeing places around the world, I’m not going to name any names, don’t come at me, do this. And so their open models are getting really, really advanced really quickly. And if you can essentially access Claude without paying Anthropic, and instead you’re paying, I don’t know, DeepSeek, like a nominal sum—
Brian Barrett: Not that we’re naming names, definitely not naming names.
Zoë Schiffer: Sorry, that just slipped out. I think that you’re going to do that. And if Anthropic is able to keep advancing, that might not be an existential threat yet, but I feel like in the next couple of years, it might change.
Brian Barrett: So can I ask a quick follow-up? Because this was Meta’s strategy for a long time, right? They were saying, “We’re going to go all in on Llama, we’re going to have an open weight model,” and that’s going to undercut the businesses of these other folks. I don’t mean trying to rip off the model, they were building it themselves, but they have since abandoned that. So what’s different?
Zoë Schiffer: Yeah, I think that that’s a really good point. I mean, I think one difference like you alluded to is if you’re trying to build the open model yourself versus say distilling someone else’s frontier model, that’s a really big difference in terms of the cost of compute that you’re throwing at it and in terms of how quickly you can advance and surpass the model you trained on originally. And so, I think if you’re willing to take a different tack than Meta did, you can move much more rapidly.
Leah Feiger: Interesting. That’s a good one. I’m going to think about that.
Brian Barrett: And I have a Futurecast.
Leah Feiger: Take it away.
Brian Barrett: Do we use Futurecast as a noun, a verb, or both?
Leah Feiger: I think both. Yeah.
Brian Barrett: OK. Well, I’m going to futurecast something now. No, I think there’s probably already one out there somewhere. I think there will be a well-known, highly visible, red-pilled prediction market out there that will let you bet on violent crimes and let you bet on yourself doing violent crimes. And no one will regulate it because it will be on some random island somewhere that’s mostly inhabited by gazelles or penguins. And so people are just going to make a ton of money off themselves doing horrible things.
Zoë Schiffer: My God.
Leah Feiger: At the very least, you should get the IP for that.
Brian Barrett: I know.
Leah Feiger: This is a movie.
Brian Barrett: This is also my spec script. This is also my side project.
Leah Feiger: Yikes. Yikes, Brian.
Brian Barrett: Futurecast.
Leah Feiger: Futurecast. OK. Mine is actually kind of a little sad too. I’ll be totally honest. I’ve been thinking a lot this week about Iran. I’m thinking a lot about the timeline here and how chaotic and unclear the ending of all of this is. And obviously, the US has gone into all of this with their partner, Israel and Benjamin Netanyahu. Bibi wins consistently when he is putting the country at war. And throughout history, that is very much like you’re a country at war, you’re going to win your election. People keep the incumbent. It’s trustworthy. It’s like just stay the course. I’m looking at this. I’m looking at the midterms. Stats are bad for Republicans right now.
Brian Barrett: Yeah.
Leah Feiger: They’re theoretically going to get crushed in the midterms. Again, I say theoretically, we’re many months out and a lot can change. To be a country at war, regardless of how it started and who started it, right now there’s only a couple of US service members that have died, but each death is a tragedy. I’m waiting for that number to go up. I’m waiting for the US to say, now this is … We have to. We have to do it to honor the military. We have to do it to protect US interests abroad in a very, very scary way. So my Futurecast is that this is going to be used to keep Trump in power, to keep Republicans in power for a little bit longer.
Brian Barrett: I think that’s probably right. Despite how unpopular this war is in the US right now.
Leah Feiger: Yep.
Brian Barrett: I also think it creates a pretext for declaring a national emergency of some kind or another—
Leah Feiger: Absolutely.
Brian Barrett: That would potentially unlock efforts, illegal but whatever, to make it harder to vote in areas that would not vote for Trump.
Leah Feiger: But a government at wartime is— this is an entirely different situation right now. We have to look at our midterms polls and these close races way closer. We’re looking at folks that are going to be pitching their experience, their knowledge, their age in some ways. “No, I know how this goes. I am right in there with the president. I know what we need.” Brian, you mentioned US farmers needing fertilizer. This all ramps up to the federal government. And I don’t know if I see a world where the GOP is not going to be taking advantage of that.
Brian Barrett: Anyone have anything nice to say, before we sign off?
Leah Feiger: It’s so nice having you in New York, Brian.
Brian Barrett: It’s great to be in New York and Leah won a big award. OK. That’s our show for today. We’ll link to all the stories we spoke about today in the show notes. If you have any comments, you can find the episode transcripts at WIRED.com to discuss. Uncanny Valley is produced by Kaleidoscope Content. Adriana Tapia produced this episode. It was mixed by Amar Lal at Macro Sound. It was fact checked by Matt Giles. Pran Bandi is our New York studio engineer. Kimberly Chua is our digital production senior manager. Kate Osborn is our executive producer, and Katie Drummond is WIRED’s global editorial director.