Generative Artificial Intelligence Is a Denial of Computer Science

Once stated, this observation seems self-evident. Yet someone still had to think it, and dare to write it down.

Computing is the science of automated information processing. Its aim is to design, formalise, and implement processes, often complex or repetitive ones, in a systematic manner. The results are reliable and reproducible.

Text generation, whatever the language used, relies on programs that draw upon massive corpora in which information is unstructured and unorganised. The user designs processes, often quick or simple ones, aimed at generating outputs that are only loosely specified. The results are generally approximate, random, and inconsistent.
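As a toy illustration of that contrast, consider the sketch below. The function names, vocabulary, and weights are invented for this example and do not come from any real system.

    import random

    def sort_records(records):
        # Classical computing: the same input always produces the same output.
        return sorted(records)

    def generate_token(vocabulary, weights, rng):
        # Generative sampling: the output depends on a random draw.
        return rng.choices(vocabulary, weights=weights, k=1)[0]

    records = ["beta", "alpha", "gamma"]
    print(sort_records(records))  # ['alpha', 'beta', 'gamma'] on every run
    print(sort_records(records))  # identical, by construction

    vocabulary = ["cat", "dog", "fish"]
    weights = [0.5, 0.3, 0.2]
    rng = random.Random()  # unseeded: a fresh, uncontrolled draw
    print(generate_token(vocabulary, weights, rng))  # may differ between runs
    print(generate_token(vocabulary, weights, rng))  # may differ again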

The difference is stark

The randomness involved in the functioning of generative programs can be controlled, thereby improving reproducibility. However, how the information extracted from the training data is represented remains a highly active field of academic research.
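A minimal sketch of the two usual levers, assuming nothing beyond the Python standard library: a fixed seed makes the random draws repeatable, and a temperature of zero reduces sampling to a deterministic greedy choice. The toy logits and the function below are invented for illustration and are not taken from any particular model or library.

    import math
    import random

    def sample_next_token(logits, temperature, rng):
        # Temperature 0: deterministic greedy choice (argmax), no randomness involved.
        if temperature == 0.0:
            return max(range(len(logits)), key=lambda i: logits[i])
        # Otherwise: weight each index by its temperature-scaled softmax score,
        # then make a weighted random draw.
        scaled = [value / temperature for value in logits]
        highest = max(scaled)
        weights = [math.exp(value - highest) for value in scaled]
        return rng.choices(range(len(logits)), weights=weights, k=1)[0]

    logits = [2.0, 1.5, 0.3]

    rng = random.Random(42)  # fixed seed: the stochastic run becomes reproducible
    print(sample_next_token(logits, temperature=0.8, rng=rng))

    print(sample_next_token(logits, temperature=0.0, rng=rng))  # greedy, always index 0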

Generation is useful as a component in very specific applications, for example to build flexible human–machine interfaces driven by natural language (one such use is sketched below), or to handle weird edge cases.
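One hedged sketch of such component use, where generation merely translates free-form text into a structured command and everything around it remains deterministic. The call_language_model stub, the prompt, and the command schema are all invented for this example.

    import json

    ALLOWED_ACTIONS = {"set_volume", "play", "pause"}

    def call_language_model(prompt):
        # Hypothetical stand-in for a text-generation call; a real client would differ.
        return '{"action": "set_volume", "value": 40}'

    def natural_language_command(user_request):
        # Generation is used only to translate free text into JSON...
        prompt = (
            "Translate the request into JSON with keys 'action' and 'value'.\n"
            "Request: " + user_request
        )
        raw = call_language_model(prompt)
        # ...and everything after that point is strict, deterministic validation.
        try:
            command = json.loads(raw)
        except json.JSONDecodeError:
            return None  # reject output that is not well-formed JSON
        if command.get("action") not in ALLOWED_ACTIONS:
            return None  # reject actions outside the fixed whitelist
        return command

    print(natural_language_command("Could you turn the sound down to 40 percent?"))

The point of such a design is that the generative step is confined to one narrow translation task, while acceptance of its output is decided by ordinary, verifiable code.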

The enthusiasm for generative AI is therefore not tied to any classical or reasoned use of this technology; it has no technical justification. Yet the interest bubble is massive, despite a low adoption rate (source: Internet).

The outsized enthusiasm for generative AI rests on psychological mechanisms

Decision-makers were understandably impressed, even stunned, by the performance of ChatGPT. They soon wanted to apply it everywhere, to efficiently replace those pesky humans with their slow, squishy brains. Political, economic, and social interpretations of this trend are predictable and unsurprising. The zeitgeist: since computing already automates and replaces part of human labour, why stop now, especially when these new tools appear to perform at human or even superhuman levels? The question is legitimate.

This effect is well known and well documented. Subconsciously equating the behaviour of a computer with that of a human being, and believing what a program displays on a screen, has a name: the ELIZA effect.

It doesn't work.
It's clearly a mistake.

In the beginning was the Word,
and the Word was with God,
and the Word was God.
(Prologue to the Gospel according to John)

The link is obvious.

translated with ChatGPT and manually proofread


Thomas


We are still not machines. Our texts are thought out and written without any content generation programs. All financial support is welcome.