“We also know how cruel the truth often is, and we wonder whether delusion is not more consoling.” – Henri Poincaré
This observation by Henri Poincaré—later referenced by Carl Sagan in The Demon-Haunted World—hits harder today than perhaps either man intended. It suggests that delusion isn’t just an error in judgment; it is a refuge. I can attest to this personally. I have indulged in more than a few delusions in my day, using them as necessary self-defense mechanisms to protect my own integrity and self-worth when the reality was simply too sharp to hold.
But what happens when that personal mechanism becomes industrialized? I was forced to confront this question while watching a video by the YouTube baseball commentator Foolish Bailey, who dissects the bizarre rise of “AI Slop” on Facebook. His video, Facebook A.I. Slop (Starring: Your Favorite Athlete!), exposes how our digital ecosystem is flooded with low-quality, AI-generated hallucinations: countless fake stories about celebrities, often laced with bizarre religious imagery and manufactured outrage aimed squarely at American audiences.
Bailey traces the mechanics of how click-farms in countries like Vietnam generate this “Slop” to arbitrage the difference in ad revenue between the developing world and the US. But while the economics of the scam are fascinating, the psychology is terrifying.
Why does this “Slop” work? It generates massive engagement not because people are gullible, but because the content is engineered to bypass the intellect and strike directly at the emotions. It taps into biases, fears, and desires that reality often leaves unfulfilled.
We’re not just dealing with an information problem here; we’re watching a cultural shift toward what Bailey calls a “Post-Truth” society. In this environment, objective fact is secondary to emotional resonance. We aren’t clicking because we believe the AI image is real; deep down, most of us know it isn’t. We’re still clicking because the delusion feels better than scrolling through a feed of nothing that interests us.
The Economics of the Lie
Foolish Bailey notes that this phenomenon is, at its core, a brutal form of economic arbitrage. Click-farms in countries like Vietnam generate this “Slop” to exploit the gap between low production costs in the developing world and the high ad-revenue potential of American attention. From a purely capitalist perspective, it is efficient. Why not siphon ad money from Americans who are seemingly begging to be distracted?
But to view this strictly as a scam is to miss the uncomfortable symbiosis at play. A scam implies an unwilling victim. What we are seeing here is a transaction.
The content—whether it is bizarre AI-generated celebrities or manufactured “anti-Pride” outrage—is not designed to inform. It is designed to trigger. It bypasses our logic centers and hits the raw nerve of our biases. Bailey correctly identifies this as the hallmark of a “Post-Truth” society, where objective reality is discarded the moment it interferes with emotional validation.
With so much content being shoveled at us every day, we’ve moved into a space where how we feel trumps whether something is factual. Indeed, the engagement metrics don’t lie: even when users know the images are fake, they engage anyway. The lie offers a dopamine hit the truth simply cannot match. Bald nonsense though they may be on the surface, these lies console us, while the truth merely demands we pay attention.
After all, why dwell on inconvenient truths and cruel realities when there are fascinating lies to occupy us instead? If nothing else, slop featuring people we recognize and subjects that interest us lets us stay at least a little delusional. Consuming such content, low-quality and low-effort as it may be, serves as a self-defense mechanism. Unfortunately, while much of it is harmless, enough of it warps our view of objective reality.
This creates what academic researchers call “epistemic dysfunction.” It’s not just that we’re distracted; our very mechanisms for understanding the world have broken down. We become trapped in social-media-driven echo chambers that reinforce what we want to believe, and social engagement metrics make us more vulnerable to misinformation, polarizing us further with every like and share.
We become more than victims of the algorithm; by continuing to engage with these platforms, we become its willing accomplices.
The Courage to Remain Unconsoled
The stakes here, at both a personal and societal level, are incredibly high. Organizations like the AP warn that conspiracies fueled by AI and socially engineered content have become a “clear danger to democracy,” fracturing trust in authority. Meanwhile, a New Yorker piece argues that the very idea of shared, objective truth is eroding, posing the same threat to democratic stability.
So, are we cooked? “Candid cynicism” seems fair. We certainly need systemic repairs: platform accountability above all, along with algorithms optimized for something other than raw engagement. We need the “AI babysitters” that seem to be becoming a cottage industry. But technological patches are just band-aids on a psychological wound.
As Carl Sagan suggested back in the mid-1990s, we need to double down on media literacy—teaching people how to verify, question, and research. He was right, but perhaps he was also too optimistic. He assumed that if people could find the truth, they would want it.
Yet today we face a harder problem. We need more than media literacy; we need emotional literacy. We need the internal fortitude to sit with the “cruel truth” that Poincaré warned us about, without immediately reaching for the anesthetic of a comforting lie.
Honestly, even well-meaning interventions in education and potential algorithm changes may not be enough to restore a shared reality on their own. After all, our own content consumption habits, what we choose to engage with, define our reality. If we cannot bear the weight of a complex, often disappointing reality, no amount of fact-checking tools or algorithm tweaks will save us. We will simply find new ways to deceive ourselves. The challenge goes beyond seeing the world as it is. We must find the courage to live in it without the consolation of delusion.
