The Undiscussed Threat of Large Language Models

4 min read · Feb 18, 2025


“The past was erased, the erasure was forgotten, the lie became the truth.” (George Orwell, 1984)

In an era where technology shapes our reality, we find ourselves at a crossroads. Large language models (LLMs) are sold to us as revolutionary tools capable of reshaping the way we perceive information. But beneath their polished promises lies a profound threat to critical thinking and to our connection to authentic sources.

If a million people wrote on the internet that the Earth is flat, LLMs would, after some “training”, tell us that it is flat, and the next generations would not question this “fact”.

Imagine a world where history is not written by victors but rewritten by algorithms. LLMs, in their quest to provide answers, often average out information without discerning truth from fiction. This statistical smoothing can lead to a dangerous consensus: the crowd is always right, even when it isn’t. As George Orwell warned in his dystopian masterpiece, “1984,” “The past was erased, the erasure was forgotten, the lie became the truth.” In this digital age, such manipulation is no longer fiction but a looming reality.
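To make the “statistical smoothing” concrete, here is a deliberately oversimplified sketch (my illustration, not how production LLMs actually work): a toy “model” that completes a prompt with whatever continuation appears most often in its training corpus. Real LLMs learn far richer statistics, but the same pressure applies: what dominates the data dominates the output.

```python
# Toy illustration of "majority becomes truth" in a frequency-based model.
# This is NOT a real LLM; it only shows how popularity in training data
# can override correctness in the output.
from collections import Counter

def train(corpus: list[str]) -> Counter:
    """Count how often each statement appears in the training corpus."""
    return Counter(corpus)

def complete(model: Counter, prompt: str) -> str:
    """Return the most frequent training statement that starts with the prompt."""
    candidates = {s: n for s, n in model.items() if s.startswith(prompt)}
    return max(candidates, key=candidates.get) if candidates else prompt

# Hypothetical corpus: the false claim simply outnumbers the true one.
corpus = ["the earth is flat"] * 1_000_000 + ["the earth is round"] * 1_000
model = train(corpus)

print(complete(model, "the earth is"))  # -> "the earth is flat"
```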

[Image: George Orwell, 1984]

Orwell further cautioned us with the chilling notion that “Who controls the past controls the future. Who controls the present controls the past.” This highlights the peril of allowing technology to dictate historical narratives. When history is malleable and facts become fluid, society risks losing its anchor to reality. The implications are staggering. With every query answered without source citations, we edge closer to a future where any narrative can be constructed and accepted as truth.

The film “Don’t Look Up” offers a modern parallel to Orwell’s warnings. It portrays a society so consumed by media spin (or imagine an LLM taking center stage) and denial that it ignores an impending catastrophe. A key line from the film captures this sentiment:

“If we can’t all agree at the end of the day that there’s some truth… what’s the point?” (Don’t Look Up)

This reflects how easily truth can be overshadowed by popular opinion and misinformation — a theme deeply resonant with Orwell’s vision.

[Image: still from Don’t Look Up]

This opens the door to unprecedented manipulation — severing people from their history and roots. Wealthy individuals investing in nuclear bomb shelters in remote locations may sound like science fiction, yet this underscores a growing fear of societal collapse fueled by misinformation.

It’s crucial to recognize that LLMs are not artificial intelligence in the truest sense; they lack genuine understanding and consciousness. Instead, they are sophisticated tools for packaging averaged internet data into language that most people would trust — a reflection of popular opinion rather than objective reality.

When a lie is mixed with a set of facts and communicated in a language the recipient understands, the person is likely to accept the lie as truth. The presence of factual information lends credibility to the whole, leading them to trust the entire narrative, including the false elements. This demonstrates how blending truth with deception can effectively obscure reality and influence beliefs. So the hallucinations of LLMs are not just a side effect: they can easily become reality for many people, simply because the model wraps them in fluent language and in facts that society already accepts.

“The ideal subject of totalitarian rule is not the convinced Nazi or the dedicated communist, but people for whom the distinction between fact and fiction… no longer exists.” (Hannah Arendt, The Origins of Totalitarianism)

[Image: Hannah Arendt]

As we stand on this precipice, we must heed Orwell’s cautionary books, Arendt’s imperative, and the lesson of “Don’t Look Up” — and take decisive action.

We need to strengthen limits on the use of these technologies and introduce robust safeguards to ensure they do not become instruments of mass manipulation. The outputs of LLMs should be viewed critically — as mere aggregations of existing content, not definitive truths.

To be continued…

Written by Dr. Alexey Minin

Consultant on digital economics, ecosystems, and digital business models. PhD in AI @ TUM, honored professor.
