Comedian Jon Stewart again defended Spotify podcaster Joe Rogan on Friday, saying that "truth" is sometimes "shifting sand" and that today's "misinformation" can become tomorrow's fact.
"The New York Times was a giant purveyor of misinformation and disinformation" during the Iraq War in 2003, Stewart said on his podcast, "The Problem With Jon Stewart." "That's as vaunted a media organization as you can find. But there was no accountability for them."
He pointed out that at the time, he was a vocal critic of the war and of the "intelligence" about Iraqi leader Saddam Hussein's weapons capabilities, which was presented as a threat to the United States and its allies in the region and used as a pretext for U.S. military action.
In today's climate, Stewart said, he might have faced censorship from Viacom, which owned the Comedy Central network that hosted his "Daily Show," just as some artists and celebrities do now on their respective platforms.
"Where I get nervous is, in the runup to the Iraq War, and in the prosecution of the Iraq War, I was vocal, and sometimes cursed about that," he said. "But the mainstream view, The New York Times' view, was '[Iraq has] weapons of mass destruction, they have these tubes that can only be used for nuclear war.' Couldn't I have gone down, fallen down, this [way] if Viacom or Comedy Central had wanted to censor me?"
Stewart said that truth is sometimes like "shifting sands," and what was "misinformation" one day can turn out to be fact the next.
Rogan, whose podcast is the most popular in the world with more than 11 million listeners and viewers on Spotify, has come under fire for hosting guests who question prevailing narratives on topics such as COVID-19 vaccines and transgender athletes.
Singer-songwriter Neil Young recently pulled his music catalog from the platform because it would not remove Rogan's podcast.
Stewart said he knows Rogan and brings a bias to how he sees the situation, but he also believes there should be more "engagement" between people to clarify certain topics for the audience, rather than summarily censoring the content.
He also said that the algorithms used by Big Tech companies, the mathematical formulas that recommend content similar to what a user is already viewing, may push a person further down a "rabbit hole" of misinformation.