You have probably heard the quote attributed to Nazi propaganda minister Joseph Goebbels: "If you tell a lie big enough and continue to repeat it, eventually people will come to believe it." This has become a staple tactic in political rhetoric -- an obvious recent example being Donald Trump's oft-repeated declaration that he won the 2020 presidential election, despite bipartisan analyses across the United States demonstrating unequivocally that this is false. (The tactic works; a huge number of Trump supporters still believe the election was stolen.)
It turns out that the "illusory truth effect" or "truth-by-repetition effect," as the phenomenon is called, still works even if the claim is entirely implausible. A study by psychologist Doris Lacassagne at the Université Catholique de Louvain (in Belgium) recently presented 232 test subjects with a variety of ridiculous statements, including "the Earth is a perfect cube," "smoking is good for the lungs," "elephants weigh less than ants," and "rugby is the sport associated with Wimbledon." In the first phase of the experiment, participants were asked to rate the statements not for plausibility, but for how "interesting" they were. Afterward, the volunteers were given lists of statements to evaluate for plausibility, having been told ahead of time that the second list would mix statements repeated from the first list with completely new claims.
The results were a little alarming, and support Goebbels's approach to lying. The false statements -- even some of the entirely ridiculous ones -- gained plausibility from repetition. (To be fair, the ratings still had average scores on the "false" side of the rating spectrum; but they did shift toward increasing veracity.)
The statements that showed the greatest shift were the ones that required at least a vague familiarity with science or technical matters, such as "monsoons are caused by earthquakes." It took only a few repetitions to generate movement toward the "true" end of the rating scale, which is scary. Not all the news was bad, though; although 53% of the participants showed a positive illusory truth effect, 28% showed a negative effect -- repetition actually made their plausibility ratings of false statements decrease. (I wonder if this was because people who actually know what they're talking about become increasingly pissed off by seeing the same idiotic statement over and over. I suspect that's how I would react.)
Of course, recognizing that statements are false requires some background knowledge. I'd be much more likely to be fooled by a false statement about (for example) economics, because I don't know much about the subject; presumably I'd be much harder to fool about biology. It's very easy for us to see some claim about a subject we're not that familiar with and say, "Huh! I didn't know that!" rather than checking its veracity -- especially if we see the same claim made over and over.
I honestly have no idea what we could do about this. The downside of the First Amendment's free-speech guarantee in the Constitution of the United States is that, with a limited number of exceptions -- defamation, true threats, incitement to violence, and obscenity come to mind -- people can pretty much say what they want on television. The revocation of the FCC's Fairness Doctrine in 1987 meant that broadcasters were no longer required to give a balanced presentation of all sides of the issues, and set us up for the morass of partisan editorializing that the nightly news has become in the last few years. (And, as I've pointed out more than once, it's not just outright lying that is the problem; partisan media do as much damage by what they don't tell you as by what they do. If a particular news channel's favorite political figure does something godawful, and the powers-that-be at the channel simply decide not to mention it, the listeners will never find out about it -- especially given that another very successful media tactic has been convincing consumers that "everyone is lying to you except us.")
It's a quandary. There's currently no way to compel news commentators to tell the truth, or to force them to tell their listeners parts of the news that won't sit well with them. Unless what the commentator says causes demonstrable harm, the FCC pretty much has its hands tied.
So the Lacassagne study seems to suggest that as bad as partisan lies have gotten, we haven't nearly reached the bottom of the barrel yet.