My ENGL 170 Blog

The Efficiency Paradox: Why Some Struggles Aren't Worth the Cost

January 18, 2026

In "The Cost of Human Relevancy," a recent post on Gabriel's English Blog, my classmate Gabriel argued that we should fiercely defend our right to do "expensive" cognitive work. He suggests that by using AI to bypass the struggle of understanding or summarizing, we are essentially choosing "intellectual atrophy." While Gabriel’s call for deep engagement is noble, I believe it rests on a flawed assumption: that all struggle is created equal. In the modern world, refusing to use efficiency-boosting tools isn't a sign of intellectual strength; it’s a failure to prioritize where our limited mental energy should actually go.

The Myth of the "Right" Kind of Hard

Gabriel’s argument relies on the idea that the "struggle" itself is where the growth happens. However, in education and professional life, there is a major difference between constructive struggle and empty labor. Constructive struggle means grappling with a complex ethical dilemma or a deep scientific contradiction. Empty labor is the hours spent formatting a bibliography, summarizing a 50-page report just to find one relevant fact, or agonizing over basic sentence structure.

When we insist on doing everything the "expensive" way, we aren't necessarily getting smarter; we are just getting tired. As noted in the post "Why AI is the Future, Not the Enemy," a professional who insists on manual data processing rather than using AI is wasting cognitive resources that could be spent on actual analysis. If we spend all our energy on the "atoms" of the task, we have nothing left for the "architecture" of the solution.

The Extended Mind: AI as a Cognitive Exoskeleton

To understand why this "struggle" is often unnecessary, we can look to philosophers Andy Clark and David Chalmers and their theory of "The Extended Mind." They argue that our minds do not stop at our skulls; rather, we regularly delegate cognitive tasks to our environment. When we use a notebook to remember a date or a calculator to solve a problem, those tools become an active part of our thinking process. By this logic, AI isn't "replacing" our thoughts—it is extending them.

Using AI to handle the "noise" of a task allows us to create a more powerful cognitive loop where the human provides the intent and the machine provides the processing power. Banning these tools doesn't protect the mind; it simply forces it to work within a smaller, less capable boundary.

The Theory of Cognitive Load

This extension of the mind is supported by John Sweller’s "Cognitive Load Theory." Sweller’s research suggests that the human brain has a limited "working memory." If we overload that memory with "extraneous" tasks (like the mechanical struggle of writing or summarizing), we actually inhibit "germane" learning—the actual understanding of the core concept. AI removes this extraneous load, allowing the student to focus on the big-picture mental models that lead to true expertise. When the mechanical noise is reduced, the brain can finally dedicate its full capacity to high-level thinking.

Conclusion: Choosing Our Battles

Gabriel is right that we shouldn't have a "blind worship of bits." But we also shouldn't have a blind worship of "atoms" just because they are heavy. The goal of the modern student shouldn't be to avoid AI to keep their mind "pure"; it should be to use AI to free up capacity for high-level executive function. We should have the courage to say, "I am using this tool for this task so that I can save my struggle for something that actually matters." True relevancy isn't found in how hard we work, but in the quality of what we produce and the depth of the problems we choose to solve.


Sources