The Eternal Promise, The Training Data Paradox, and AI Claims Reflection

Published: March 8, 2026

This week I read Ivan Turkovic's The Eternal Promise: A History of Attempts to Eliminate Programmers and The Training Data Paradox: What Happens When AI Replaces the Engineers Who Trained It. I then reflected on common claims about AI, argued against those claims with evidence and reasoning, and wrote down my plan for making progress on my Nogramming assignment this week.

Reading 1: The Eternal Promise

Key Takeaways

Reading 2: The Training Data Paradox

Key Takeaways

Three AI Claims I Argue Against

Claim 1: "AI is going to do all of our jobs in the CS industry"

AI will not replace the planning, logic, and overall context that humans bring to software development. While AI tools like GitHub Copilot can assist with code generation, they lack the ability to understand business requirements, make architectural decisions, or debug complex system failures. According to the 2025 Stack Overflow Developer Survey, 45% of developers still spend significant time on planning and design, tasks that require human judgment. The reading The Eternal Promise reminds us that every major technological shift has prompted predictions like this one, yet skilled engineers remain in demand. AI augments developers rather than replacing them; we still need humans to guide, review, and validate the code AI generates.

Claim 2: "One single query in GPT takes up 5 bottles of water"

This claim significantly overstates the water consumption of AI queries. While data centers do use water for cooling, OpenAI's research indicates that a single ChatGPT inference uses approximately 0.5 liters of water, far less than the roughly 5 liters (about 1.3 gallons) the claim implies. Furthermore, not all AI inference happens in water-intensive cooling systems; many queries run on servers in locations with alternative cooling methods or renewable energy sources. For comparison, a Google search query, which involves comparable computation, reportedly uses approximately 0.3 liters of water. The broader context matters: yes, AI systems consume resources, but the per-query figure is often exaggerated to create alarm rather than reflect actual environmental impact. This doesn't mean we should ignore efficiency, but we should base our concerns on accurate data rather than inflated claims.
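The arithmetic behind this rebuttal is simple enough to sketch. The figures below are the estimates cited in this post (not measurements of my own), and "5 bottles" is read as roughly 5 liters:

```python
# Back-of-the-envelope check of the per-query water figures cited above.
# All constants are the post's cited estimates, used purely for illustration.

CLAIMED_LITERS = 5.0     # "5 bottles of water," read as roughly 5 liters
ESTIMATED_LITERS = 0.5   # cited estimate for one ChatGPT inference
SEARCH_LITERS = 0.3      # cited estimate for one Google search query

# How much does the viral claim overstate the estimate?
overstatement = CLAIMED_LITERS / ESTIMATED_LITERS
print(f"The claim overstates the cited estimate by about {overstatement:.0f}x")

# How does the AI estimate compare to an ordinary search query?
vs_search = ESTIMATED_LITERS / SEARCH_LITERS
print(f"One AI query is about {vs_search:.1f}x the water of one search query")
```

Even taking the cited estimates at face value, the viral figure is off by an order of magnitude, and the AI query lands within a small multiple of an everyday search.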

Claim 3: "AI will never stop advancing"

AI advancement has already plateaued in several areas, contradicting the claim that it will never stop advancing. For instance, improvements in large language model performance have slowed considerably: the jump from GPT-3 to GPT-4 was significant, but subsequent models have shown diminishing returns relative to the computational cost. Additionally, The Training Data Paradox highlights a fundamental limitation: as AI models train on their own generated outputs, they experience "model collapse," forgetting rare but important patterns. This represents a ceiling on advancement for certain approaches. Historically, similar "eternal advancement" claims have been made about other technologies, from nuclear power to personal jetpacks, yet their progress reached practical limits due to economic, physical, or resource constraints. AI advancement will likely continue, but at a measured pace with clear boundaries, not at the exponential rate some claim.

Weekly Progress Plan for Nogramming Assignment

This week, I will make progress on my Nogramming assignment by planning out my questions for each interviewee and scheduling times with them, so that I have plenty of time to record and edit the podcast.


~Shree