Dear reader: after you finish this post, please read its sibling, *AI and Cognitive-Developmental Decay Progress*. Or, better yet, read the two side by side, one paragraph at a time.
I just finished fixing my toilet. I used AI to help me. In retrospect, I think I could have done it without AI. The fix was obvious, and I would have noticed it without assistance if I had paid 25% more attention.
While it's amazing that 4o could talk me through this minor plumbing problem with an understanding of what I was showing it via live video and respond to me like a daemon plumber, I can't help but feel a little silly about this interaction.
And lately I have noticed more moments like this: moments where the assistance I receive from AI feels either like overkill or like it might be limiting my growth.
Will I ever learn to read faster if I can lock in my attention using TTS via Speechify?
Will I get better at parsing the 30% of important-to-me information in a journal article within a minute if I too often start with an AI-generated summary?1
Will I notice the subtle visual cues that hide the answer to a problem in my built environment, if the AI can do it faster than I can? Or if I know I don't have to *look* that hard, because my pretty-good-but-sometimes-hallucinating generated guardian angel can do it for me?
Still, I am very bullish on AI. Not anti-AI by any means. I mean, I was using AI to fix my toilet. I do think AI can contribute to a person’s cognitive-developmental progress.
But increasingly often, when I hand a task over to my AI, I catch myself looking longingly at it as it gets to do the work.
To avoid leaving this on the kind of ambiguous film ending that my mother finds so annoying, I'll note that I think the solution is honing the skill of knowing when to use AI and when to intentionally do without it.
With AI and other changes to technology, I wonder which skills we shouldn't bother learning, which skills are still worth learning, and which skills aren't worth learning for their own sake but are worth it for how much they help us generalize to other capabilities.

Some people point to cursive writing as a skill that fewer people have nowadays but that we shouldn't particularly regret losing. That falls in the first bucket. The ability to write strong essays, even if AIs become faster and better at it, would at a minimum land in the third bucket because of how much essay-writing helps us learn critical thinking. Better AIs give each of us the opportunity to spend less time on the skills that aren't worth learning and more time on the ones that are, with the challenge of figuring out which skills are which.

1. Maybe I'm spoiling myself by reading the abstract too?