Turing’s Question in Today’s Digital World
The original question that the British logician and mathematician Alan Turing asked himself back in 1950 was: Can machines think?
To him, this question was not precise: to answer it properly, we must first define what a "machine" is and what "thinking" means. To reformulate it, he proposed a game — the well-known "Imitation Game." In this game there are three characters, whom we can call A, B, and C. One of A and B is a man and the other a woman, and C, the interrogator, can be of either gender. The aim of the game is for C to find out who is the man and who is the woman. To do so, C asks questions like "how deep is your voice?", "how long is your hair?", and so on.
What Turing then asked was: what if we put a machine in the place of A or B? Would the interrogator be wrong (or right) as often as when the game is played between humans? For the machine to pass as A or B before the interrogator C, it must "act" as a human being would act.
Nowadays this question, in its original form, is rarely discussed. But back then it was a revolutionary line of inquiry, one that helped lay the groundwork for today's digital world.
Now, 75 years later, Turing's dream is realized by machines that can speak human language, understand questions, provide advice, and even talk to us like a friend. At this pace, it seems we could do almost anything with AI within a few years — but at what cost?
Some Facts About the Use of AI and Its Consequences
AI applications such as Large Language Models (LLMs) require computing, storage, and transmission capacities which are provided by data centres. But the energy consumption of these centres is enormous: in 2020, it was around 16 billion kilowatt-hours in Germany — about 1% of the total German electricity requirement. For 2025, an increase to 22 billion kilowatt-hours is predicted.
The water consumption of these facilities also affects the environment. Cold water cools data centres by absorbing the heat from the computing equipment, and it is estimated that a data centre requires two litres of cooling water per kilowatt-hour consumed.
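To get a feel for these numbers, here is a back-of-the-envelope calculation based on the figures above. It is a rough illustration only; the Olympic-pool conversion (roughly 2.5 million litres per pool) is my own addition for intuition, not part of the cited estimates.

```python
# Rough estimate of German data-centre cooling-water use,
# derived from the figures quoted above (illustration only).

ENERGY_2020_KWH = 16e9   # ~16 billion kWh consumed in 2020
ENERGY_2025_KWH = 22e9   # predicted consumption for 2025
LITRES_PER_KWH = 2       # estimated cooling water per kilowatt-hour
LITRES_PER_POOL = 2.5e6  # assumed volume of an Olympic swimming pool

for year, kwh in (("2020", ENERGY_2020_KWH), ("2025", ENERGY_2025_KWH)):
    litres = kwh * LITRES_PER_KWH
    pools = litres / LITRES_PER_POOL
    print(f"{year}: {litres:.2e} litres, about {pools:,.0f} Olympic pools")
```

By this sketch, the 2020 figure alone implies on the order of 32 billion litres of cooling water per year.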
On the other side of the story, many efforts are under way to reduce the environmental impact of AI. See, for example, research at the Technical University of Munich (TUM), where Prof. Felix Dietrich and colleagues implemented a probability-based training method for Hamiltonian Neural Networks that cuts training time (and thus energy use) by a factor of more than 100 without losing accuracy [1].
Other Implications That Deserve Reflection
- Lack of self-confidence: When we rely too heavily on external sources, we lose confidence in our own problem-solving abilities. Checking every single step with a tool takes a toll on our self-esteem.
- New trends: LLMs are trained on data sets from the past, so we won't get anything genuinely new by replicating old patterns. When trying to innovate or solve new problems, we need to craft our prompts accordingly.
- Laziness in developing the craft: It's easier to ask a model for a piece of work than to go through the mental strain the work actually requires. A recent paper from MIT, Your Brain on ChatGPT [2], analyzed the cognitive impact of writing with AI tools.
Over four months, the study observed 54 participants, some of whom used ChatGPT heavily for writing tasks. The main finding was that LLM users consistently under-performed at the neural, linguistic, and behavioural levels. Specifically:
- Brain activity was reduced by 47% among ChatGPT users.
- Behaviourally, 83% of the heavy ChatGPT users could not quote any information from what they had just written.
- At the linguistic level, the writing lacked individuality and scored lower in subjective quality.
What does this mean? Using ChatGPT to write for us is easier and more time-efficient, but in the long term it produces a decline in cognitive fitness, called cognitive debt. We are trading the ability to think for ourselves for a quick solution.
Today’s Questions
I can’t imagine Turing foresaw all the consequences of human-machine interaction, nor do I believe his intention was to lead us into a machine-dependent world. Perhaps Turing's questions today would include:
- Could we train neural networks more energy-efficiently?
Some progress has been made, but in Germany alone, data centres used ~16 TWh in 2020, projected to rise to 22 TWh in 2025 [3].
- Do I really need AI for this task?
Many people are unaware of the environmental impact of their usage. Prompts that a three-minute Google search could answer should be reconsidered, and society might benefit from a basic "AI Efficiency Crash Course" from high school onwards.
- What should we do with the time we save?
AI speeds up problem-solving, which could free us to tackle deeper issues. Machines can now carry out computations far faster and more effectively than human beings, which could help crack many unsolved problems; perhaps that is a good use for AI, reserving electrical resources for large-scale calculations. Of course, the time saved raises its own question: what do we do with it? Back in the 1900s, scientists spent entire days surrounded by books or immersed in labs. Today, we have everything at our disposal. Does that mean we have more time to focus, and to think? Probably yes. Use that time wisely.

References
- [1] F. Dietrich et al., “Energy-Efficient Neural Network Training via Probabilistic Hamiltonian Learning,” Technical University of Munich, 2024.
- [2] MIT Study, Your Brain on ChatGPT: Cognitive Debt from AI-Assisted Writing, 2024. [Pending link]
- [3] German Federal Environment Agency (UBA), “Energy Consumption of Data Centers,” 2023 Report.