
If Orwell predicted a future of authoritarian oppression, we missed the plot twist: truth isn’t being silenced, it’s being distorted, mass-produced, and flooded upon us like a knock-off Rolex that still ticks. The root isn’t the Press or the Politicians, the White House or foreign influence; it’s everyone, all of us speaking at once, all claiming to be right.
We live in the Age of Artificial Truth. But don’t get it twisted as blaming anyone or anything but ourselves; this didn’t start with ChatGPT. The arrival of generative AI didn’t create the crisis of truth; it merely industrialized it. Truth today isn’t a matter of fact or fiction. It’s a matter of narrative dominance. The loudest wins. The most convincing is crowned the “truest.” And the machine? It just regurgitates what it’s been fed.
I want to trace the dirty lineage that birthed this monstrosity of epistemological collapse because it might be the most important discussion of our time.
Yellow Journalism: The Birth of Clickbait Before Clicks
Let’s go back to the late 1800s, before digital media, before Facebook memes, before a tweet could move markets. In the days of Hearst and Pulitzer, truth was already for sale. The Spanish-American War didn’t erupt solely because of imperial ambitions or geopolitical chess. It exploded because of front pages screaming lies. “You furnish the pictures, and I’ll furnish the war,” Hearst allegedly told illustrator Frederic Remington. Whether or not he said it hardly matters now: it’s true because I just said so; in this new era, the story itself is truer than truth.
Yellow journalism was the original content marketing strategy: sell fear, sell drama, sell simplicity. Nuance? Accuracy? Who cares, when the public is addicted to sensation? The seeds were sown: if it bleeds, it leads. And that psychological addiction to sensationalism metastasized into modern media.
“In chasing down cheap clicks at the expense of accuracy and veracity, news organisations undermine the very reason they exist: to find things out and tell readers the truth – to report, report, report.” – Katharine Viner, Editor-in-Chief at The Guardian
The motive? Profit. Influence. Control. And those never left the building.
Propaganda: When Lies Wear a Uniform
Propaganda elevated yellow journalism into national strategy. In wartime, in politics, and in religion, truth became utilitarian. Not a discovery but a delivery mechanism. Hitler’s Minister of Propaganda, Joseph Goebbels, understood this intimately: repeat a lie often enough, and it becomes accepted as truth. Stalin, Mao, even Roosevelt, all knew that control over information was more powerful than armies.
But the propaganda machine didn’t stop when the tanks did. Fast-forward to the Cold War, and the U.S. government was seeding American pop culture with pro-freedom messages while accusing Hollywood of Communist sympathies. And while those accusations may well have been valid, that doesn’t change the fact that it was still propaganda, pushing support for government decisions in the face of opposition. The media wasn’t reporting reality; it was manufacturing allegiance.
And it worked, because once the public becomes accustomed to curated reality, they’ll beg for lies that flatter them.
SEO and Content Farms: Truth Optimized for Google
Enter the internet. Not as a democratizer of knowledge, but as an industrial press for content sludge. Around the 2000s, search engines became the gatekeepers of truth. And so began the race to game them.
“Content farms” like Demand Media didn’t care about journalistic integrity; they cared about keywords. Articles were churned out by the tens of thousands, each “optimized” for search. A user types in “how to remove a tick” and gets a 500-word article written by someone paid $5 who may never have seen a tick in their life. Now imagine that churned out at no cost, programmatically, in thousands of variations (this is what AI is doing). Now realize that Russia, Israel, Ukraine, Iran, or the White House can do it, under any brand, domain name, or social media profile conceivable; say, for example, “TheTruth.com.” The illusion of authority was preserved by headers, bullet points, and the uncanny cadence of someone trying to sound like they know what they’re talking about.
Google didn’t intend to create this, but algorithms don’t understand truth. They understand engagement, links, and popularity. So the web flooded with content that wasn’t meant to inform, but to perform.
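To make the mechanics concrete, here’s a toy sketch in Python of how one template becomes dozens of “unique” articles. Everything in it, the topics, the openers, the sign-offs, is invented for illustration; it’s a minimal sketch of the technique, not any real content farm’s pipeline:

```python
import itertools

# Illustrative only: a toy "content spinner." Every phrase below is
# invented for this example; real content farms worked at far larger scale.
openers = ["Wondering how to handle", "Everything you need to know about",
           "The ultimate guide to", "5 expert tips for"]
topics = ["removing a tick", "whitening your teeth", "fixing a leaky faucet"]
advice = ["Experts agree the answer is simpler than you think.",
          "Most people get this wrong, and here's why."]
signoffs = ["Share this with someone who needs it!",
            "Bookmark this page before it's gone."]

# Every combination of phrases becomes its own "article," keyword included,
# with no expertise required anywhere in the process.
articles = [
    f"{o} {t}? {a} {s}"
    for o, t, a, s in itertools.product(openers, topics, advice, signoffs)
]

print(len(articles), "articles from one template")  # 4 * 3 * 2 * 2 = 48
print(articles[0])
```

Four short word lists yield 48 articles; swap the lists for an LLM call and the ceiling on volume disappears entirely.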
The motive? Ad revenue. Page views. Rankings. And a public so desperate for shortcuts and “life hacks” that we didn’t care if the advice came from someone who thought peanut butter cured cancer.
Social Media and Fake News: The Crowdsource Collapse
Then came social media, where everyone got a printing press. Now, not only could you say whatever you wanted, but algorithms would help you find your people. Flat Earthers? You’re not alone. Holocaust deniers? There’s a group for that. Echo chambers formed, not out of malice, but out of convenience. And while Flat Earth or Holocaust denial might seem inconsequential to you today, the same mechanism produces groups denying vaccines, claiming parental alienation isn’t real, undermining cleantech, and shaping your perspective on wars, your country, and your leaders.
Truth fragmented into tribes in which you can assert that you’re right because others say so.
“Fake news” became the accusation of choice, lobbed from every side, weaponized not just by authoritarians but by anyone losing an argument. And the media? Once trusted as the fourth estate, it now competes with influencers, Reddit threads, and TikTok explainers. A 19-year-old with good lighting can easily outrank The New York Times in shaping perception.
The irony? The more you consume, the less you know. Because content is no longer vetted for accuracy—it’s vetted for shareability.
And let’s be honest, the public doesn’t want nuance. We want validation. We want to be right. Not informed.
The AI Apocalypse (Spoiler: It’s Just the Sequel)
Now here comes AI. But AI isn’t an inventor of lies; it’s an amplifier. ChatGPT, Claude, Gemini… they don’t know anything. They synthesize what already exists. They’re a mirror held up to the culture. When they hallucinate, they’re not breaking from reality, they’re echoing it.
“We are facing a crisis of epistemology in today’s world. How do we know things? How do we distinguish between true beliefs and delusions, facts and fiction? What can be trusted?” – Brett McCracken, Five Facets of Our Epistemological Crisis
Ask AI for a source and it might fabricate a peer-reviewed journal article that doesn’t exist. Why? Because it knows we expect a source. It knows the shape of credibility and can create it, but not its substance; it isn’t doing the research to validate anything, it’s manifesting what you’re seeking. That’s not a bug. It’s a feature of us. We trained it this way.
We created machines to reflect our informational gluttony, and now we’re shocked that they don’t distinguish between fact and fervor. It’s not artificial intelligence. It’s artificial consensus. And we’re to blame.
The Courtroom, the Classroom, and the Cafeteria
Let’s zoom in on some real-world implications. Courtrooms increasingly rely on “expert witnesses” whose credentials are wielded the way a PhD throws around a title: as proof of being right. Judges, overwhelmed and underinformed, rule based on rhetoric rather than evidence. One expert says so; never mind the facts when the stories disagree. The result?
- Parents losing their relationships with their children because family courts decide based on preferred opinions
- People convicted and imprisoned under the influence of social media’s cancel culture
- Businesses lost because laws and regulations get passed when people don’t like what’s being done
In schools, kids are told to think critically while being fed standardized curricula optimized for test scores, not truth. Teachers are no longer trusted. Google it. Ask TikTok. Have ChatGPT write the paper.
Even in your breakroom or Thanksgiving dinner table, the debate is no longer about facts. It’s about whose facts. The political becomes personal. The personal becomes weaponized. Disagreement isn’t intellectual — it’s existential.
Because in the Age of Artificial Truth, your opinion isn’t a truth. It’s the truth. And anyone who doesn’t agree? They’re not just wrong. They’re immoral.
No One’s Coming to Save Us
What can we do about it? That’s the wrong question.
There is no “we.” There is no collective sense-making anymore. There is no shared reality to build upon, no trusted authority to rein in the chaos, no final arbiter who can say, “This is true,” and not get crucified for it. And arguably, there never was, nor should be: you shouldn’t take the word of a journalist or politician at face value. Yet for centuries, that’s exactly what people did; we put our faith in fact-checking, peer-reviewed research, and experienced testimony.
The institutions failed. The algorithms can’t help. The media sold out. And the public prefers tribal identity to intellectual integrity.
Truth isn’t dead. It’s just irrelevant, because the new currency isn’t facts, it’s conviction. And the only thing scarier than a world where a world leader lies, a court hands down the wrong consequence, and AI writes lies that the internet pushes upon us… is a world where humans stop caring whether they’re lies in the first place.
Great insights, Paul O’Brien. From SEO to fake news to tinny content that is robotic and feeble. AI might equalize for junior tasks, but it does not equal expertise. I love how you point out the crowdsource collapse. Values and critical reasoning are more important than ever.
Well framed, well written, well done!
In all the discussion of “truth” and “fact” vs. “fake news” and “artificial truth” – we lose sight of “created truth” – art, engineering, things that we imagine and then make real.
“We are as gods and might as well get good at it.” – Brand
“Any sufficiently advanced technology is indistinguishable from magic.” – Clarke
“The best way to predict the future is to invent it.” – Kay
“Engineering: Our magic works.” – Fidelman
Miles Fidelman Science is what we really ought to value. Science never produces “truth.” Instead it integrates evidence and experiment into testable theories, and doesn’t support myth or intuition.
David Reed Indeed. Science provides us with predictive models. Technology provides us with design models. Engineering provides disciplined art & craft of putting those models to use.
Great article Paul O’Brien. I’m assuming you read Nexus? I’m not sure I agree with its conclusion that AI ends up amplifying the skew of the current algorithm(s). At some point (maybe already), I think the LLMs get smart enough to correct for it, and in the future we get back to the truth being the truth (because it is, in most cases).
A super-intelligence should understand human bias, whether it be conscious or subconscious. It’s also possible that this part takes a while.
Christian Watts https://a.co/d/0CGJp8r
I have not, adding it to my list.
Agreed, I think AI could solve this, but that’s all the more reason it struck me that *now* we’re in the Age of Artificial Truth. AI could bring about the end of this age by amplifying it, enabling more people to realize the bad, and then fixing it.
My trivial comment to signal I can’t add much.
Hannes my trivial reply
The TRUTH is that most of our media and institutions are flat out lying to us about most things most of the time. The motive isn’t really PROFIT, it is really DECEPTION, which, as you say, is for influence and control. But as Lincoln said, “you can’t fool ALL of the people all of the time.” Which has led to the rise of alternative citizen journalists who, oddly, are more truthful and reliable than the lame stream media. Which has led to the rightful DECLINE in propaganda outlets like CNN.
Say what you want about the profit motive, but what is more chilling is their willingness to go broke because the lies are more important than money.
1. Yellow Journalism: Propaganda Before Platforms
Pulitzer and Hearst invented the modern disinformation machine, decades before Facebook.
Amen brother. I could go on and on and on about this. People are naive. They don’t realize how distractible they are. They don’t realize that when they’re focused on A and B instead of Y and Z, someone had a hand, indirect and direct, in that distraction. “Follow the money” has been replaced with “Focus on the distractions…”
The best and most current example is Trump**. His biggest detractors (i.e., the establishment, the status quo, life-long politicians, the mainstream media, etc.) ALL played a key role in building the stage that DJT + MAGA stands on. Such “movements” don’t just happen. They arise out of a void.
But has any one of those entities owned their accountability and responsibility? (Hint: OhHellNo). Why should they? The Public is too thick to ask, “How did we get here? Who is responsible?” Instead, they mindlessly march in the witch hunt and follow the distracting narrative dictated by the guilty.
I digress. lol
/rant.
** It’s simply an example of how the machine works; it’s not meant to be political, so please spare me any discussions of politics. Thanks.
Yuval Noah Harari pointed out in ‘Sapiens’ that a significant part of human development has been down to ‘trust’. For example, there is trust in paper currency but what will happen if trust in currency and institutions continues to be undermined?
We have entered a world of relativism and ‘my truth’ and ‘my facts’ are leading us towards nihilism and a rejection of the common good. I am elderly and was brought up on ‘my word is my bond’ and I truly despair at what the future will bring for my children and grandchildren
Sharp observations, as always.
We know that the truth has been bought and sold for centuries. No one knew what was real then, nor now; it’s just spreading exponentially faster. The sad reality is that the ‘divide and conquer’ narrative successfully leads to tribalism, nationalism, narcissism, and many other ~isms, which are reinforced again to further dehumanize each other with ‘my truth vs. yours.’
Fascinating, Paul, one of your best articles. The old journo adage, never let the truth get in the way of a good story, is as true as ever, unfortunately, and can be amplified by AI. A brave new world for us all!
Working on the solution for that – it’s people-driven (AI helps, but it’s human-first), accountable, transparently verifiable, and trust-based. DM me if you want to know more. 😉
We outsourced memory to Google and judgment to AI. Now we’re surprised that people believe whatever fills the silence fastest. Machines didn’t kill truth. Our laziness did.
“Artificial consensus.” We are most certainly in a trust crisis.
This is exactly the epistemological fracture line that erases the minds society can’t see.
When the Age of Artificial Truth floods the system, the people who lose first are the ones who don’t have armies of handlers, PR reps, or legacy status to protect their narratives.
Invisible founders. Single moms. Late-diagnosed neurodivergent women.
The moment they falter, the machine fills the silence with distortions; or worse, nothing at all.
It’s not just “truth at risk.” It’s cognitive safety at risk. Because when the system stops verifying real stories, real minds collapse alone. And they can’t come back, because there’s no narrative left to return to.
My thesis:
• Belongingness is early prevention.
• Narrative safety is cognitive safety.
• If we don’t protect real human truth before the AI consensus drowns it, we don’t just lose journalism; we lose the people it was meant to protect.
The solution is not just media literacy. It’s building narrative safety nets for the human engines behind innovation, before they break.
Thank you for naming the fracture. Now we have to patch it with real belonging, or there’s no resilient future left to innovate for.
This is one of the most necessary articulations of what The Age of Artificial Truth actually threatens, not just facts, but the people whose truths are easiest to erase.
You’re dead right: this isn’t just about epistemology, it’s about erasure. And when narrative verification systems collapse, they don’t collapse equally. Founders without PR firms, single moms, neurodivergent innovators: theirs can be the first stories to disappear. AI doesn’t distort them; it silences them.
And when that happens, innovation collapses too. Because invisible founders are often the clearest signal of what’s next. Lose their story, and we lose the pattern recognition that startups, social policy, and markets all require.
Media literacy isn’t enough. You don’t counter deepfakes with critical thinking alone. You protect cognitive safety with narrative infrastructure; systems, trust layers, institutions designed to see real people before the machine replaces them with consensus noise.
We talk about infrastructure as broadband and bridges. But what you’re naming is narrative infrastructure.
Now the hard part: Who’s responsible for building these narrative safety nets? Because if we’re waiting for platforms, it’s already too late.
Paul O’Brien
This is exactly the right question, and the reason Carol’s Lobby blueprint exists.
If we’re waiting for platforms, it’s too late. If we’re waiting for government, it’ll be too slow. If we leave it to individual founders, they’ll keep burning out in silence, because no mind can build its own narrative infrastructure while surviving collapse.
So the answer has to be: we build it ourselves, together, but we don’t treat it as charity or soft comms. We treat it as critical economic infrastructure.
Just like broadband connects data, narrative infrastructure connects trust.
It’s the scaffolding that holds a founder’s truth in place when they can’t hold it alone. It’s radical delegation teams, safe PR bridges, policy protection for “no contact” from coercive actors; and public narratives that prove we’re worth saving before the machine decides we’re obsolete.
Belongingness is early prevention.
Narrative safety is cognitive safety.
And saving the dandelions isn’t charity, it’s economic strategy.
We don’t have to wait for permission. We can build narrative safety nets now; founder by founder, story by story, law by law; until the pattern becomes visible enough that the consensus machine can’t overwrite it.