LLMs are coherence engines, not truth engines

Large Language Models (LLMs) such as GPT are best understood as coherence engines. They are trained to produce outputs that are statistically and rhetorically coherent given an input. Their function is not truth, but the generation of sequences of tokens that resemble what humans would plausibly say next. This includes the use of tools: a tool call is emitted because it is the coherent next step, not because the model has verified anything.
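To make the point concrete, here is a deliberately tiny sketch, a toy bigram model rather than a real LLM, with a made-up corpus invented purely for illustration. It shows that a model trained only on frequency will confidently emit whatever its training data makes most plausible, true or not:

```python
from collections import Counter

# Toy corpus containing a popular falsehood more often than the truth.
corpus = (
    "the moon is made of cheese . "
    "the moon is made of cheese . "
    "the moon is made of rock ."
).split()

# Count bigrams: how often each word follows each preceding word.
bigrams = Counter(zip(corpus, corpus[1:]))

def next_word(prev: str) -> str:
    """Return the statistically most plausible continuation of `prev`."""
    candidates = {b: n for (a, b), n in bigrams.items() if a == prev}
    return max(candidates, key=candidates.get)

# The model completes with the *frequent* answer, not the *true* one.
print("the moon is made of", next_word("of"))  # → cheese
```

A real LLM is vastly more sophisticated, but its training objective is the same in kind: maximise the plausibility of the next token given the data. There is no term in the loss function for correspondence to reality.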

As a result, an LLM can generate arguments that are internally consistent, convincing, and persuasive, yet entirely false. The model has no access to the world, no sensory grounding, no lived experience, and no intrinsic way to check correspondence between its outputs and reality.

LLMs ≠ truth.

Humans aren't truth engines either

The uncomfortable part is that this critique applies almost as well to humans. Human cognition is also coherence-driven. We construct narratives, causal explanations, identities, and moral frameworks that hang together, rather than ones that are objectively correct.

We also continuously prioritise narrative consistency and social acceptability, and reinforce biases that fit our existing beliefs. Most of what we call "belief" is post-hoc rationalisation of events combined with habit and social signalling. Whether we like it or not, we are story-making machines operating under evolutionary constraints.

humans ≠ truth.

Truth as grounding, not absolute truth

We don't have access to absolute truth. The best we have is grounded truth, which emerges from shared experience, empirical interaction with the world, and feedback loops between belief and consequence. Even in science, we oscillate between building coherent models to explain an observation (the path of coherence) and attempting to falsify the models we have just built (the path of truth) through further observation. Since there is no absolute truth, we rely on falsification against grounded truth, in this case the measurement of our interactions with the world.

No one in science creates a coherent model, has the rest of the community announce "that must be right", and thereby stops all questioning in that field. Instead, science works because it builds institutional scaffolding that forces grounding through measurement, replication, falsification, and peer review. Without grounding, both humans and LLMs drift into elegant nonsense.

ground truth ≠ absolute truth.

A key difference between humans and LLMs

Humans can ground, LLMs cannot (yet). To put it bluntly, humans can test beliefs against reality, suffer consequences for being wrong, and update beliefs through embodied experience.

LLMs cannot do any of these things autonomously. Their grounding is always borrowed from human-produced data and human-validated feedback. Even when plugged into tools or sensors, that grounding remains mediated. So while humans and LLMs both generate coherence, only humans have the capacity for grounding through lived interaction.

coherence ≠ coherence + ground truth.

A key danger

The risk with LLMs is not that they lie, but that they speak with fluent confidence in domains where humans already confuse coherence with truth. When an LLM produces a well-structured argument, people instinctively grant it authority they would not give to a rambling human. Why? Because it is a computer, and we are so used to deterministic systems getting things right that many assume a nondeterministic system built on computers, which an LLM is, must also be right. It is not. LLMs are coherence engines, not truth engines.

coherence ≠ truth.

But the horse has bolted

As I explained before, "If you can gain control over language, medium, and tools then you can change a person's reasoning of the world around them". Without the armour of critical thinking, I suspect that in the West these LLMs will become the new "grounded" truth for many, if not most, regardless of whose beliefs they may be skewed by — think Grok and Elon Musk.

We've already seen what this looks like with X and Truth Social, which for many have become the "grounded" truth for news, rather than traditional media, which is often derided as fake. The only difference is scale: it won't just be news but everything. That's the world you have to be thinking about and adapting to.

In this new world ...
truth = coherence + creator's ground truth
... and that should terrify you.

Originally published on Medium.