Riding Before Walking: What AI Is Teaching Us About Learning
ChatGPT is already changing education. The real question isn't whether AI will help students learn faster—it's whether we'll use it to build deeper thinkers and more independent learners, or just quicker ones.

A reflection on speed, struggle, and the surprising wisdom of moving slowly
I'm always learning something new. Recently, I've been diving deep into emotional intelligence and cognitive science—specifically how they impact education and learning experiences. It's fascinating territory, but also overwhelming. The research spans decades, crosses multiple disciplines, and includes hundreds of papers I'd never have time to read thoroughly.
So I've been feeding dense academic papers into Gemini—on working memory, emotional regulation, and cognitive load theory. In minutes, I get clean summaries highlighting key findings, methodologies, and practical implications. What might have taken me weeks to process now happens in hours.
And here's the thing—it doesn't feel hollow at all. It feels like having a research partner who can help me synthesize information faster so I can focus on the bigger questions: How do these findings connect? What patterns emerge across studies? How can I apply this to improve my teaching?
But then I read a Time article about ChatGPT in schools, and I started wondering: Is my experience fundamentally different from what students are doing? The piece interviewed a high school student who described using AI as "an extra teacher"—helping her decode tough readings, work through math problems, and polish her assignments. Reading her words, I felt that familiar educator's dilemma: excitement mixed with unease.
And yet, something about this speed unsettles me. I worry that we're moving so fast with AI in education that we're bypassing the most important part of learning: learning how to learn.
In other words, we might be teaching students to ride before they've learned how to walk.
When Fast Feels Like Progress
Let me be clear—I'm not anti-AI. I've watched my own students use ChatGPT to break through walls that used to stop them cold. A confusing prompt that once would have left them staring at a blank screen? Now they ask the AI to rephrase it, offer examples, explain it differently. The feedback loop speeds up dramatically, and they stay in motion instead of getting stuck.
That momentum is powerful. I've seen students who struggled with confidence suddenly willing to tackle challenging assignments because they know they have support. It's like giving them a continuous push on the bike—the momentum is exhilarating, but are they learning to pedal on their own? The AI becomes a patient tutor, available 24/7, never frustrated by the same question asked five different ways.
In many ways, this is what I've always wanted education to be: accessible, flexible, empowering. So why do I feel conflicted?
The Danger of Skipping the Struggle
The problem isn't that AI helps students learn faster. The problem is that it might be helping them avoid productive friction, the struggle where understanding is forged and real growth happens.
Think about learning to ride a bike. You don't just hop on and start pedaling. First, you develop balance by walking, running, maybe using a balance bike. You build the foundational skills that make riding possible.
I watched this happen with my oldest daughter. She spent months on a balance bike—no pedals, just pushing herself along with her feet, learning to steer, learning to trust her body's sense of equilibrium. When we finally brought out a real bicycle, I braced myself for the usual routine: running alongside her, holding the seat, the inevitable falls and tears.
But she just... rode. No drama, no wobbling, no need for me to chase after her with my hand on the bike seat. The balance bike had given her everything she needed. She'd built the foundational skill that made pedaling almost automatic.
And that's where the metaphor became real for me.
That's what I worry we're doing with AI in education. Students get clean, polished explanations and move on. But in fast-forwarding past the cognitive wrestling match—the confusion, the false starts, the gradual emergence of understanding—they also bypass the chance to develop their own reasoning, their own voice.
The Time article touched on this concern: students can easily become reliant on AI-generated answers without building the foundational skills they need to grow. While ChatGPT helps them finish assignments, it might be keeping them from truly learning the content.
The Lost Art of Productive Confusion
I remember staring at sketches for a community center, utterly stuck. My designs were all lifeless boxes; my references on sustainable architecture felt disconnected from what I was trying to achieve. The frustration was a physical weight—I'd pace around my studio, questioning whether I was even cut out for this work.
But after three days of wrestling with the problem, something shifted. I was staring out the window, watching how a tree's branches distributed weight, when suddenly a completely new structural approach clicked into place. That connection was mine. It wasn't handed to me in a formatted response—it was earned by working through the fog of confusion.
In that extended period of uncertainty, I developed what I now recognize as intellectual resilience—the ability to sit with not knowing and keep working through it. Would I have developed that capacity if I'd had immediate access to a clear, well-structured solution? I doubt it.
This isn't nostalgia for the "good old days" of education. It's recognition that some aspects of learning can't be optimized or hacked. Sometimes, thinking slow is the point.
The New Role of Educators
So what do we do? Reject AI out of fear? Embrace it uncritically? Neither feels right.
Instead, I think we need to coach students in a new kind of literacy: not just how to use AI, but how to use it wisely. This means teaching students when to engage with AI and when to step back and wrestle with problems themselves.
Here's what I'm experimenting with in my own teaching:
Before AI: Have students spend time with the problem first. Let them experience their initial confusion, form hypotheses, try approaches that might not work.
With AI: Use it as a thinking partner, not an answer machine. Ask it to help brainstorm approaches, provide analogies, or explain concepts from different angles.
After AI: Most importantly, have students reflect on what they learned and how their thinking changed. What did they understand before the AI interaction? What did they understand after? What questions do they still have?
The High-Powered Calculator Analogy
I've started thinking of AI like a high-powered calculator. Calculators didn't destroy mathematics education—they freed us to focus on higher-order thinking instead of computational drudgery. But we still teach students the "walking" of basic arithmetic before we give them the "bicycle" of advanced computational tools, because understanding foundational concepts is essential for mathematical reasoning.
Similarly, AI shouldn't replace the foundational skills of thinking, questioning, and reasoning. It should enhance them once they're in place.
A student who uses ChatGPT to check their logic after working through a problem? That's powerful. A student who uses it to skip the thinking entirely? That's concerning.
Building Real Understanding in an AI World
This brings us to what students truly need in an age of artificial intelligence. The Time article's student said something that stuck with me: she described ChatGPT as helping her "understand" difficult material. But I wonder if what she's really getting is the illusion of understanding—the feeling that comes from reading a clear explanation, which is different from the deep comprehension that comes from working through confusion yourself.
Real understanding is messy. It's recursive. It involves dead ends and breakthroughs, confusion and clarity, failure and success. It can't be delivered in a clean, formatted response.
What students need isn't just access to answers—they need the skills to generate questions, to sit with uncertainty, to build their own mental models of how the world works. They need to develop what I call "intellectual agency"—the confidence that they can figure things out for themselves.
Of course, I should acknowledge that this "productive struggle" isn't universally accessible. For students with learning disabilities, language barriers, or debilitating anxiety, the cognitive wrestling match can become an insurmountable wall. For them, AI might be the essential tool that finally allows access to learning. But even in these cases, the goal must be to use AI as a bridge to build foundational skills, not as a permanent bypass around them.
Learning to Walk with AI
Here's the paradox: the students who will ultimately be most effective with AI are those who first learned to think without it. They'll know what questions to ask, how to evaluate responses, when to dig deeper, and when to trust their own judgment. They'll use AI as a collaborator, not a crutch.
This doesn't mean we should ban AI from classrooms or pretend it doesn't exist. It means we need to be more intentional about how we integrate it into learning. We need to preserve space for struggle, for confusion, for the slow development of understanding—while also teaching students to harness AI's power responsibly.
ChatGPT might help us move faster—but the best learning still takes time. Sometimes, learning to walk—really walk—is more important than racing ahead.
AI is already changing education. The real question is: will we use it to build deeper thinkers and more independent learners—or just quicker ones?
I know which future I'm walking toward.
Guiding Principles for the Path Ahead
For Educators:
- Use AI as a thinking partner, not an answer machine
- Preserve space for productive confusion and struggle
- Teach students when to engage AI and when to step back
For Students:
- Build foundational thinking skills before leaning on AI
- Use AI to check your logic, not replace your reasoning
- Remember: understanding feels different from reading explanations
Choosing the Path Forward: AI is a powerful tool, but learning to think independently remains irreplaceable. The goal isn't to reject AI—it's to use it wisely while preserving the cognitive growth that comes from wrestling with difficult problems yourself.