In the grand scheme of things, I’m probably not the best “AI user.” I use Gemini to critique my thinking, reword things, or generate new ideas. I use Copilot to speed up coding, but I’m not having it write libraries overnight for me. I’ve stuck AI directly in the “tool” category, useful for specific tasks, the way a screwdriver is useful for specific jobs.
In the last couple of weeks, I’ve seen some videos and articles scattered about the internet, basically saying that AI is taking over learning. Or, at least, that AI is making YouTube tutorial content and other course-style content irrelevant. Maybe so, but I’m not sure if that’s good or bad, and that’s what I want to flesh out here.
An early study
A study from MIT suggests that using LLMs could harm learning. The researchers conducted a longitudinal study with 54 participants over four months to compare the neural and behavioral effects of different writing methods. Participants were split into three groups:
Brain-only: No external digital tools.
Search Engine: Using traditional search engines for research with no AI.
LLM (ChatGPT): Using an AI assistant to help write the essay.
The team used EEG (electroencephalography) to monitor brain activity in real time and applied NLP (Natural Language Processing) to analyze the resulting essays. In the final phase, they “swapped” conditions—making LLM users write “brain-only” and vice versa—to see how previous reliance on AI affected their unassisted performance.
So, what did they find?
When the LLM group had to switch to “brain-only” writing, they struggled with engagement and couldn’t perform at the level of those who had practiced “brain-only” writing.
The EEG data indicated LLM users exhibited the weakest neural connectivity and the lowest cognitive load.
The LLM users reported the lowest sense of ownership over their work.
The LLM users had the worst recall of arguments and facts from their own essays. That is, because they didn’t struggle to find the information, their brains didn’t encode it into long-term memory.
The LLM group “produced statistically homogeneous essays within each topic, showing significantly less deviation compared to the other groups,” (133) meaning that there was little uniqueness in their writing.
We could argue about the generalizability of the study’s findings, given that the participants were drawn from a WEIRD (Western, Educated, Industrialized, Rich, and Democratic) population in Boston, and we could quibble with some of the study’s other methods. Even still, the findings are really interesting!
As a result of the findings, the researchers “support an educational model that delays AI integration until learners have engaged in sufficient self-driven cognitive effort” (141).
I suspect that we will see more academic studies on the relationship between AI and learning in the coming years, but if we are to take the findings at face value, we at least need to be careful about substituting AI for critical thinking.
My own experience
In the last week, I felt like learning a new programming language: Dart. Dart, in combination with Flutter, allows you to build elegant cross-platform mobile applications, something that Python, my love language, has yet to venture into. (Yes, I know there’s Kivy, but honestly, it’s only ok.)
To start learning Dart, I followed the website documentation and the Flutter documentation, and pieced together a simple mobile application. It was pretty fun!
Because I didn’t know all the syntax yet, I kept getting stuck and going back to Google or Copilot for guidance. It felt like I was wasting time, but I was still learning.
The day after I got a very basic mobile app running, I thought to myself, “I wonder if I could use an LLM to learn Dart.” I opened up Gemini, and as if it was reading my thoughts, I noticed this button:

“Help me learn?” I thought, “Yes, that’s why I’ve come here!” As it turns out, Gemini has a guided learning feature that knocked my socks off. My prompt: “Help me learn Dart by comparing it to Python.” Nothing more, nothing less.
What followed was a true back-and-forth about all of the similarities and differences between Python and Dart—from data types to variables to functions to classes and everything in between. What was awesome about using this learning feature is that Gemini wasn’t simply spitting information at me; it was actually guiding me and then checking my comprehension.
After it explained function basics in Dart, it asked me this:

I answered, and it gave me feedback immediately about the syntax error I made. Wow. What a freaking time we live in.
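To give a flavor of the kind of side-by-side comparison the session walked through, here’s a minimal sketch I put together myself (my own illustration, not Gemini’s actual output): a Python function with a default parameter, with a rough Dart equivalent in the comments.

```python
# Defining a function with a default parameter in Python.
def greet(name, greeting="Hello"):
    return f"{greeting}, {name}!"

# A rough Dart equivalent uses braces for named parameters
# and explicit types (illustrative only):
#
# String greet(String name, {String greeting = 'Hello'}) {
#   return '$greeting, $name!';
# }

print(greet("Dart"))           # Hello, Dart!
print(greet("Dart", "Howdy"))  # Howdy, Dart!
```

The broad strokes carry over almost one-to-one; it’s details like the braces-for-named-parameters syntax that tripped me up, and exactly the kind of thing the guided session caught in my answers.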
This guided session helped me connect dots that reading the documentation alone hadn’t, connections that would otherwise have taken hours of coding to make. The next time you’re trying to learn something new, I recommend you give this a shot.
The intersection of AI and learning
Will AI help us learn, or shut off our brains? I imagine the answer is, “It depends.” On what? On the way we use the tool. If we outsource all our thinking to AI, my hunch is that the MIT study’s findings are pretty accurate. If, instead, we use AI as a collaborative partner, we can enable deeper and faster learning.
Does AI make courses or YouTube tutorials irrelevant? I don’t think so, even if some creators out there seem to. Perhaps the basic “to-do” app tutorials go away, but expertise is still needed to guide folks coming into tech, and I think there’s still a place for that content online.
Either way, I’m excited to see where all of this takes us. Happy coding! 😁
📝 In Case You Missed It…
I added a section under faith for prayers. Here’s one I love from St. Thomas Aquinas: Grant me grace.
I also added a new section for software architecture patterns under engineering. The first (short) one I’ve added there is for the model-view-controller software architecture pattern.
📚 What I’m Consuming
How to beat decision fatigue (article) punched me in the face. I now know I need to be careful when I see the warning signs of decision fatigue. 😬
20 habits of high-performing leadership teams (article) is a quick list of good things to remember. Maybe easier to spot are the signs that the team might not be well-functioning.
14 more lessons from 14 years at Google (article) has some great takeaways. My favorites are 8 and 10.
Peace be with you,
Jacob
psst… hey, could you forward this to someone you think would find it valuable? I’d greatly appreciate it!
