Why Overusing AI Like ChatGPT Can Harm Your Learning—and How to Use It Wisely
Introduction
If you have relied on ChatGPT or similar large‑language models (LLMs) for most of your studying over the past year, you may be weakening the very cognitive skills studying is meant to build. A recent MIT paper titled Your Brain on ChatGPT provides evidence that heavy AI use can diminish brain activity, memory retention, and critical‑thinking skills.
The MIT Study
- Three groups: (1) LLM‑only, (2) Search‑engine‑only (no AI), (3) Brain‑only (no external help).
- Method: Participants wrote essays while their brain activity was recorded with EEG.
- Findings:
  - The LLM group showed significantly lower electrical activity, weaker connectivity, and reduced engagement.
  - Their essays were more generic and their memory recall was poorer.
  - Even after stopping AI use, the LLM group’s brain metrics did not rebound to the levels of the other groups, indicating a lingering negative effect.
How AI Undermines Deep Learning
- Learning requires active information processing – the mental effort of organizing, comparing, and integrating new data into existing schemas.
- Traditional sources (books, lectures, Google searches) force this processing because the information is not pre‑packaged for the brain.
- LLMs let you skip the processing step: you ask a question, the model returns a ready‑made summary, and you feel you have “learned” without the mental work.
The Illusion of Learning
- Understanding a passage does not guarantee retention or the ability to apply the knowledge.
- When you repeatedly offload the processing to AI, you create a false sense of mastery.
- Over time, your brain stops developing the habit of deep processing, making future topics feel perpetually overwhelming.
Risks of Hallucinations
- LLMs generate text based on statistical likelihood, not on grounded truth.
- Without prior expertise, you cannot spot factual errors, so you may internalize misinformation.
- Even sophisticated models struggle with nuanced, context‑specific reasoning.
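The point about statistical likelihood can be made concrete with a toy next‑token sampler. This is purely illustrative (real LLMs learn distributions over enormous vocabularies, and the prompt, tokens, and probabilities below are invented for the example), but it shows the mechanism: the model samples what is likely in its training text, not what is true.

```python
import random

# Toy "model": continuation probabilities reflect how often phrases
# co-occur in training text, not factual correctness. The numbers here
# are made up for illustration.
next_token_probs = {
    "The capital of Australia is": {
        "Canberra": 0.60,   # correct answer
        "Sydney": 0.35,     # common misconception, well represented in text
        "Melbourne": 0.05,
    }
}

def sample_next(prompt: str, rng: random.Random) -> str:
    """Sample one continuation weighted by likelihood, not by truth."""
    dist = next_token_probs[prompt]
    tokens, weights = zip(*dist.items())
    return rng.choices(tokens, weights=weights, k=1)[0]

rng = random.Random(0)  # fixed seed so the sketch is reproducible
samples = [sample_next("The capital of Australia is", rng) for _ in range(100)]
# A sizable minority of samples are confidently wrong - that fraction is
# exactly what a reader without prior expertise cannot detect.
print(samples.count("Sydney"), "wrong answers out of", len(samples))
```

Scaling this intuition up, a fluent but false statement is not a malfunction; it is the sampling process working as designed, which is why prior domain knowledge remains the only reliable filter.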
Why Expertise Still Matters
- Experts can craft precise prompts, evaluate AI output, and spot hallucinations.
- AI actually highlights gaps in expertise: generic answers are easy to obtain, but nuanced, high‑quality responses require domain knowledge.
- A telling anecdote: a data scientist spent hours prompting ChatGPT for a dashboard solution yet could not articulate a coherent strategy without a human discussion.
Practical Strategies to Use AI Effectively
- Treat AI as an assistant, not a brain replacement.
- Use AI for low‑effort tasks – quick overviews, resource gathering, or checking alternative perspectives.
- Always perform the deep‑processing step yourself:
  - Summarize the AI’s output in your own words.
  - Identify gaps, contradictions, and missing angles.
  - Connect the information to a larger framework you already know.
- Progressive deepening:
  - Start with a broad AI‑generated summary.
  - Challenge it with targeted questions.
  - Move to primary sources (journal articles, textbooks) for detailed study.
- Build a habit of cognitive effort – recognize the feeling of “overwhelm” as a signal that genuine learning is about to begin.
Building Cognitive Resilience
- Practice cognitive offloading awareness: notice when you are about to let the AI do the thinking for you.
- Schedule dedicated “thinking blocks” where you process information without AI assistance.
- Use spaced repetition and active recall to cement knowledge after the AI‑assisted research phase.
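The expanding‑interval idea behind spaced repetition can be sketched in a few lines. This is a minimal illustration only (the doubling factor and the example card are invented here; real tools such as Anki use more refined schedulers like SM‑2):

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class Card:
    """One fact or question to practice with active recall."""
    prompt: str
    interval_days: int = 1                      # wait this long before the next review
    due: date = field(default_factory=date.today)

def review(card: Card, recalled: bool, today: date) -> None:
    """Expand the interval on successful recall; reset it on failure.

    Doubling is an illustrative choice, not a tuned parameter.
    """
    if recalled:
        card.interval_days *= 2                 # remembered: wait longer next time
    else:
        card.interval_days = 1                  # forgotten: start the ladder over
    card.due = today + timedelta(days=card.interval_days)

card = Card("What did the MIT EEG study compare?")
review(card, recalled=True, today=date(2024, 1, 1))   # next review in 2 days
review(card, recalled=True, today=date(2024, 1, 3))   # then in 4 days
print(card.interval_days, card.due)  # prints: 4 2024-01-07
```

The key design point is that each successful recall pushes the next review further out, so effortful retrieval happens right before you would otherwise forget, which is where the retention benefit comes from.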
Final Thoughts
AI can accelerate information gathering, but it cannot replace the mental work that creates lasting expertise. By consciously pairing AI assistance with rigorous personal processing, you protect your brain’s plasticity, maintain critical‑thinking abilities, and stay competitive in a future where baseline AI literacy will be universal.
Using AI responsibly means letting it handle the tedious parts while you retain the hard, effortful processing that builds true understanding and expertise.
Frequently Asked Questions
Who is Justin Sung on YouTube?
Justin Sung is a YouTube channel focused on evidence‑based learning and study techniques; this article summarizes one of its videos.