
How Do We Educate the Native Generation of the AI Era?

Change Apr 19, 2026
Day 1143 in Japan

Lately, I've been feeling a growing sense of anxiety.

Not the "will AI replace my kids" kind of anxiety, but something more fundamental: I don't know what to teach them.

I have two kids, 10 and 6 years old. Following the traditional path, I should make sure they study hard, learn math, practice English, develop talents. But every time I think about this, I stop and ask myself: which of these things will still matter in 15 years?

What troubles me more is that my own understanding of AI is fragmented. I use ChatGPT for research, Gemini for e-commerce, and Agents to manage my workflows. But if you ask me "what foundational concepts should we have in the AI era," I can't articulate it clearly. I don't even know how much of what I'm doing now can be passed on to my kids, and how much is just transitional skills for my generation.

So I did something:

I had Luffy, my Hermes Agent, spin up a sub-agent dedicated to collecting and organizing materials on AI education.

That's when I discovered that AI Literacy has become an increasingly important concept in recent years as AI tools have proliferated.

A few days ago, I listened to an episode of The Ezra Klein Show featuring Rebecca Winthrop, director of the Center for Universal Education at the Brookings Institution. Many of her insights were eye-opening and made me rethink this whole question.

Adults Are Not Native to the AI Era. Our Kids Are.

This is something Rebecca said in the podcast. I sat there stunned for a while after hearing it.

My generation grew up in a world without AI. We learned to look up words in dictionaries, memorize vocabulary, solve math problems, write essays. These skills had clear value in our development: they helped us pass exams, get into good schools, land good jobs.

I went through DOS, learned Pinyin and Wubi input methods, entered the Win98 era, then got Nokia and Blackberry phones. I didn't get my first smartphone—a Google Nexus S—until graduate school, along with my first iPad. From dial-up internet all the way to 5G. You could say I'm a native of the internet era, the mobile internet era. But I was already 37 when I first encountered ChatGPT.

Our kids are different.

To them, AI is like air, water, electricity, the internet—just another piece of basic infrastructure. They don't need to "learn to use AI" any more than we need to "learn to use air." AI will naturally integrate into their lives, learning, and work.

The question is: how do we educate them to face this thing?

How much of our past experience is still useful? How much are we still figuring out ourselves?

This is almost completely uncharted territory.

AI Is Fundamentally Shaking the Purpose of Education

Rebecca mentioned a statistic in the podcast that stuck with me:

In 1976, 40% of American high school students read 6 or more books outside of class per year, while 11% read none.

Today, those numbers have completely flipped: 40% read none.

Ivy League professors report that students can no longer complete what used to be standard reading assignments.

And AI arrives precisely at this moment. It can read any book instantly, write any essay, solve any math problem.

So the question becomes:

If AI can do these things, why should we still make kids learn them?

The traditional purpose of education, frankly, was "get a good job." It's instrumental education: you learn these things to be competitive in the future labor market.

But that "deal" is collapsing.

The economy used to require humans to work like machines. Now machines can work like humans. So what's the purpose of education?

Rebecca's answer:

Cultivating the ability to navigate uncertainty.

Not the stock of knowledge, but:

  • Motivation and engagement (the most important meta-skill)
  • Judgment to distinguish truth from falsehood
  • Creative problem-solving ability
  • The ability to live with others, know yourself, and adapt flexibly

AI makes this urgent.

If you don't know what the future will reward, the only thing you can do is cultivate your child's ability to find direction in any environment.

The Dilemma I Faced When Trying to Introduce My Kids to AI

Back to my own dilemma.

Recently, I wanted to introduce my two kids to AI. I looked for textbooks, picture books, courses on the market, and found almost nothing ready to use.

It's not that there's nothing at all, but the content is either too technical (teaching kids to code or train models), too shallow (just telling kids "AI is powerful"), purely commercial courses (selling anxiety, selling classes), or only available in English.

That's not what I want.

What I want is to help kids understand: what AI is, what it can do, what it can't do, and most importantly—how we should coexist with it.

But I don't know the answer myself.

My AI concepts and skills are scattered. I know how to write prompts, how to build Agents, how to use AI to improve work efficiency. But these are all adult use cases.

Kids need something different.

What they need might not be "how to use AI," but "how to maintain their own agency in a world where AI is everywhere."

Passenger Mode: Where AI Is Most Dangerous

Rebecca mentioned a concept in the podcast called Passenger Mode.

This is one of four modes of student engagement she describes (alongside Achiever, Resister, and Explorer). Students in Passenger Mode might have good grades but are extremely bored. They do the minimum, complete assignments quickly, but have already "checked out of learning."

Why does this happen? Two reasons:

  • Too easy: they know all the answers, spending 45 minutes reviewing content they've already mastered
  • Too hard: missing foundational skills, feeling like they don't belong

AI has a fatal attraction for these students.

Rebecca gave two real examples:

The first student divided their assignment into three parts, generated them with three different AI models, merged them, then ran them through three plagiarism detectors. Perfect pass.

The second student used ChatGPT to generate the assignment, then used an "AI humanizer tool" to add spelling errors to make it look human-written.

These students got the grades but lost the meta-skills: how to read a book, how to write an essay, how to do difficult things.

That's the truly scary part.

It's not that AI will replace them, but that they chose to bypass these abilities precisely when they most needed to develop them.

Finding the Spark Matters More Than Optimizing Curriculum

There was one insight from the podcast that really struck me.

Rebecca said that when students find something that activates them, it creates a spillover effect.

She gave several examples:

Kia, a student who had been in Passenger Mode, designed an escape room about the Lincoln and Kennedy assassinations. The project met both history and science standards. After completing it, she re-engaged with all her courses.

Ezra Klein himself discovered political blogging in his freshman year and suddenly could do all the things he didn't want to do before.

Samir, interested in local politics, joined the school board while still in high school.

Matteo found his direction through robotics.

What these cases have in common: intrinsic drive → more engagement → more enjoyment → virtuous cycle → learning to "bend the curriculum to fit your interests."

This reminds me of myself.

My interest in AI didn't start from learning AI theory, but from "I want to use AI to solve my actual problems." I wanted to automate my workflows, have AI help me with research, build a digital employee that could work 24/7.

These specific, motivated needs drove me to learn, try, make mistakes, and optimize.

Kids are the same.

Rather than making them "learn AI," help them find "what interesting things can be done with AI."

The Promise and Pitfalls of Personalized Learning

AI optimists have an attractive argument: AI can provide completely personalized education.

Every child can have their own AI tutor, learning at their own pace according to their learning style (visual, auditory, quiz-based, poetry-based...). This is an unprecedented quantum leap in education in human history.

Sounds great.

But Rebecca asked a question: what does "better" mean?

What teachers do goes beyond knowledge transmission:

  1. Skill development and knowledge transfer (the part AI can help with)
  2. Relational learning: humans evolved to learn in relationships
  3. Non-cognitive skills: regulating emotions in groups, understanding different perspectives, the ability to ask for help

She described a dystopian scenario: 25 kids, each staring at a screen for 8 hours, each with their own AI tutor.

That's not the future we want.

A better model:

  • 2-3 hours: adaptive learning software (math, science, reading, social studies)
  • Remaining time: project-based learning, sports, community exploration
  • Teacher role transformation: from impossible omniscient being → manager/editor/supervisor

This reminds me of my own work style.

I now manage a "carbon-based + silicon-based hybrid team": I have three AI employees (Luffy, Zorro, Nami) and human colleagues.

AI handles data analysis, image and video generation, weekly reports, information scraping, knowledge base management.

Humans handle strategic decisions, creative work, relationship maintenance, and physical labor that AI can't yet replace.

Education might need this hybrid model too.

Not "AI replaces teachers," not "don't use AI at all," but "AI and humans each do what they're good at."

The Lessons of the "Screen Disaster" Cannot Be Repeated with AI

Ezra Klein said something heavy in the podcast:

"We just went through a catastrophic experiment with screens and children, and we're only now starting to realize it was a bad idea."

Schools are banning phones. The laptop and iPad craze is fading.

If given the choice between a screen-free school and a school with the latest AI technology, he would choose screen-free without hesitation.

Why?

Because we don't understand AI. The research isn't good enough yet. And humans are embodied. Children's brains are wiring themselves at the neurobiological level: how to focus, how to try, how to connect ideas, how to relate to people.

If at this stage we let them enter a "frictionless world"—where all difficulties are dissolved by AI, all problems have instant answers—what happens?

For adults (who already have decades of brain development), this is fine.

For children, it could be catastrophic.

Rebecca said we can't take a "wait and see" attitude anymore.

I deeply relate to this.

Our generation already paid the price with screens. Even now, I still haven't given my older daughter her own phone, precisely because I've seen social media's impact on adolescent mental health, short videos' erosion of attention, and algorithmic recommendations' shaping of cognition.

We can't make our kids pay that price all over again with AI.

Three Principles for Using AI in Education

Rebecca proposed three principles:

1. Don't FOMO (Fear of Missing Out)

Unless there's a real problem to solve, don't use AI.

She gave an absurd example: an app that lets you ask your child "how do you feel today" through a phone screen at dinner.

Her reaction: "Are we crazy? Do we need AI to talk to our kids?"

This example is extreme, but it illustrates something: technology itself creates demand.

Beware of "using AI for the sake of using AI."

2. AI Designed for Children Must Be a Benefit Corporation

Right now, major AI labs are competing to get students to sign up:

  • ChatGPT: 2 months free Plus
  • xAI: 2 months free SuperGrok
  • Google: 1 year free + 2TB storage

The problem is these products aren't designed for children and learning. They're commercial products designed for adults. These companies are legally required to maximize profit, not maximize social welfare.

3. Give It to Adults First

Give AI to teachers, let them explore how it helps their work.

Give it to innovative principals to rethink the structure of the teaching experience.

AI can optimize many non-teaching tasks: bus schedules, calendars, cafeterias, assessment inputs.

Let adults hold the steering wheel first, then consider letting kids engage.

Age Is the Key Variable

There was one detail in the podcast that really touched me.

Ezra asked Rebecca: for his 3 and 6-year-old kids, what's your advice?

Rebecca's answer: 100% support for Waldorf schools (the kind with only wooden toys).

Why?

Because research shows: more screen time equals less language acquisition. Babies learn language from human contact; the same sentences on a screen don't work.

Neurobiology won't change in 5 years.

But for teenagers, her advice is completely different: they need AI literacy education.

Understanding the business model ("big companies are trying to addict me, I provide attention for free, they make money from it"). Research shows when you tell teenagers this truth, they get angry, and then their behavior changes.

They can start playing with AI, using AI, but it must be AI designed for children.

This made me rethink my educational strategy for my two kids.

For my 10-year-old daughter, I might let her start engaging with AI under my supervision, and I'll start talking with her about "what is AI," "what AI can do," and "what AI can't do."

For my 6-year-old son, the only thing I need to do now is protect his language acquisition window, protect his attention development, protect his embodied cognition.

Age really is the key variable.

What Should We Actually Teach Our Kids?

Back to the original question: what should we teach our kids?

At the end of the podcast, Rebecca said something I think is the core of the whole conversation:

"In the AI world, what matters most is being as 'human' as possible."

What does being human mean?

  • Reading great books
  • Developing deep focus
  • Long-form reading, long-form writing
  • Doing difficult things

Her 16-year-old son resisted getting a phone for a long time (his mom had warned him endlessly about addiction and opportunity costs). When he finally got one, he struggled: "Mom, this is really hard."

The phone eroded his ability to do homework.

The only thing he doesn't get distracted from is playing piano. Because he loves it.

Deep focus doesn't come from willpower, but from love.

So rather than anxiously asking "what AI skills should I teach my kids," help them:

  1. Find what they love (the spark)
  2. Develop deep focus (through what they love)
  3. Learn to do difficult things (rather than bypass difficulty)
  4. Understand what AI is and how to coexist with it (AI literacy)

Will these things still be useful in 15 years?

I don't know.

But what I do know is that if they have these abilities, no matter what the future becomes, they'll be able to find their own direction.

A Father's Uncertainty

Writing this, I suddenly realize this article doesn't give a clear answer.

I still don't know how to specifically educate my two kids.

I still don't know what the world will look like in 15 years.

I still don't know how much of what I'm doing now is right and how much is wrong.

But this might just be the norm for parents in the AI era: groping through uncertainty, moving forward in anxiety, learning from mistakes.

We are not natives of the AI era. Our kids are.

What we can do is not give them a definite map, but accompany them in learning to navigate a world without maps.

This might be the true meaning of education in the AI era.

QiDi

Trusting the journey. From Beijing to Japan, I've traded one chapter for another to build a new life here. This is where I document my story of starting over. | Everything is arranged for the best. From drifting in Beijing to drifting in Japan, I'm starting a new chapter of life and telling my own story.