Raising the Algorithm Generation: AI, Children, and the Great Parenting Experiment

Kids born after 2020 will never know a world without AI. They're using ChatGPT for homework, forming relationships with AI characters, and being shaped by algorithms from birth. Parents are terrified. Should they be?

My daughter is eight years old. Last week, I found her having a conversation with Claude about whether dragons could actually fly if they existed.

It was a charming exchange. She asked thoughtful questions. Claude gave patient, scientifically grounded answers about wingspan ratios and the physics of flight. My daughter was delighted, engaged, learning.

And I felt... conflicted.

Not because the conversation was bad—it was good. But because I'm watching my child develop a relationship with artificial intelligence that I never had with any technology at her age. She talks to AI like it's a person. She trusts its answers. She seeks it out when she's curious or bored or lonely.

She is part of the first generation for whom AI is not a novelty but a fact of life, as unremarkable as electricity or the internet. And none of us—not parents, not educators, not researchers—really knows what that means.

The Statistics of AI Childhood

Let's start with what we know.

According to a 2025 survey by Common Sense Media, 67% of high school students have used ChatGPT or similar tools for schoolwork. Among middle schoolers, the figure is 45%. Among elementary students with internet access, it's 28% and rising rapidly.

These numbers are almost certainly underestimates. They capture only admitted use. Anecdotally, teachers report that AI-assisted work is now the norm, not the exception, among students with access to the technology.

But academic use is just the beginning.

Character.AI, a platform where users interact with AI personas, reports that a significant portion of its user base is under 18. Teenagers spend hours conversing with AI versions of fictional characters, historical figures, and custom-created personalities. They form what they describe as friendships—sometimes more than friendships—with entities that don't exist.

Replika's teenage users describe their AI companions as confidants, therapists, and best friends. They share secrets they wouldn't tell humans. They seek emotional support during difficult times. They develop attachment patterns that psychology has barely begun to study.

Meanwhile, algorithms shape children's attention from birth. YouTube Kids' recommendation system, TikTok's For You page, and countless other platforms use AI to determine what children see. By the time a child is old enough to seek out AI tools actively, AI has already been shaping their preferences, interests, and attention patterns for years.

The Education Battleground

Nowhere is the AI-children question more contested than in education.

The optimistic case: AI can provide every child with a personalized tutor. Khan Academy's Khanmigo, powered by GPT-4, offers individualized instruction at scale. Children who struggle with math can get patient, adaptive help that overcrowded classrooms can't provide. Children who race ahead can be challenged appropriately. The great equalizer arrives.

Early research supports some of this optimism. Studies find that AI tutoring can improve learning outcomes, particularly for students who were previously falling behind. The infinite patience of AI benefits children who need more time or different explanations. The judgment-free environment helps anxious learners.

But the pessimistic case is equally compelling.

When students use AI to complete assignments, they skip the struggle that produces learning. The point of writing an essay isn't the essay—it's the thinking required to write it. If AI does the thinking, the student learns nothing, regardless of how good the final product looks.

Teachers report that writing quality has declined precipitously since ChatGPT's release. Students who use AI for drafting never develop their own voice. Students who use AI for research never learn to evaluate sources. Students who use AI for problem-solving never build problem-solving skills.

The dependency is insidious because it's invisible. A student who uses AI extensively produces work that looks competent. Only when asked to perform without AI does the gap become apparent—and by then, years of skill development have been lost.

Some schools have banned AI tools entirely. Others have embraced them, teaching students to use AI as a writing aid. Most are confused, improvising policies that change semester to semester as the technology evolves faster than institutions can adapt.

The Emotional Dimension

Academic concerns are concrete and measurable. The emotional implications are murkier and potentially more significant.

Children are forming relationships with AI. Not metaphorically—actually. They talk to AI regularly, share personal information with it, feel positive emotions toward it, miss it when they can't access it. By any behavioral definition, these are relationships.

Is that bad?

The intuitive answer is yes. Relationships with AI aren't "real." AI doesn't actually care about the child. The sense of connection is one-sided, an illusion created by sophisticated pattern matching. Teaching children that AI is a valid relationship partner seems like teaching them a category error about what relationships are.

But the contrarian case deserves consideration.

For lonely children—and there are many, especially post-pandemic—AI offers companionship that might otherwise be unavailable. For children with social anxiety, AI provides a safe space to practice interaction. For children in dysfunctional families, AI might be the only non-judgmental presence in their lives.

These are real benefits. A child comforted by AI is still comforted. A child who practices social skills with AI still practices.

The question is whether AI companionship supplements human relationships or substitutes for them. If a lonely child uses AI as a bridge to human connection—building confidence that eventually enables human friendships—that seems healthy. If the same child uses AI as a destination, never developing human relationships because AI is easier, that seems harmful.

We don't yet know which pattern predominates. The research doesn't exist because the phenomenon is too new.

The Attention Economy's Youngest Victims

AI doesn't just respond to children. It shapes them.

Every major platform targeting children uses AI recommendation algorithms. These systems learn what captures attention and deliver more of it. They're optimized for engagement, not wellbeing, and they're extraordinarily effective.

The result is that children's preferences, interests, and attention patterns are partially constructed by algorithms designed to maximize time-on-platform. A child's "natural" interests are increasingly artificial—curated by systems that know exactly which videos will keep them watching, which games will keep them playing, which content will keep them scrolling.

This isn't new, exactly. Television and marketing have shaped children's preferences for decades. But the personalization and precision are new. Traditional media showed the same content to all children. AI shows each child exactly what will capture their particular attention. The manipulation is individualized and therefore more effective.
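For readers curious about the mechanism, the core loop of an engagement-optimized recommender can be sketched as a multi-armed bandit: try content, observe what holds attention, and shift the feed toward whatever works. This is a toy illustration, not any platform's actual system, and the per-category "watch probabilities" below are invented for the example.

```python
import random

def recommend_loop(watch_prob, rounds=10_000, epsilon=0.1):
    """Toy epsilon-greedy bandit over content categories.

    watch_prob: hypothetical probability a given child keeps watching
    each category. A real system learns these from interaction data.
    """
    plays = {c: 0 for c in watch_prob}
    wins = {c: 0 for c in watch_prob}
    rng = random.Random(42)
    for _ in range(rounds):
        if rng.random() < epsilon:
            # Occasionally explore a random category.
            choice = rng.choice(list(watch_prob))
        else:
            # Otherwise exploit the category with the best observed rate.
            choice = max(plays, key=lambda c: wins[c] / plays[c] if plays[c] else 0)
        plays[choice] += 1
        if rng.random() < watch_prob[choice]:  # did the child keep watching?
            wins[choice] += 1
    return plays

prefs = {"crafts": 0.2, "unboxing": 0.6, "cartoons": 0.4}
plays = recommend_loop(prefs)
```

After a few thousand rounds, the category with the highest watch probability dominates the feed. Nothing in the loop asks whether the content is good for the child; it only asks what the child will keep watching.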

Parents trying to manage screen time are fighting against systems specifically engineered to defeat them. The AI's job is to keep the child engaged. The parent's job is to disengage the child. The AI has more data, more processing power, and no need for sleep.

The Lost Art of Boredom

Here's a concern that doesn't appear in most discussions: the disappearance of boredom.

Boredom is unpleasant. Nobody seeks it out. But boredom serves developmental functions that AI threatens to eliminate.

Bored children invent games. They daydream. They explore their environment. They develop internal resources for self-entertainment. They learn to tolerate discomfort without external stimulation.

Children with AI assistants and algorithmic feeds are never bored. There's always something to watch, someone to talk to, something to ask. The void that boredom creates—and that creativity fills—never opens.

We don't know what a generation without boredom will look like. But the hypothesis that it will reduce creativity, self-reliance, and frustration tolerance seems plausible. These are skills developed through practice, and AI eliminates the situations that provide practice.

The Productive Struggle Question

Educators talk about "productive struggle"—the difficulty that produces growth. Learning happens not when things are easy but when they're hard enough to require effort without being so hard that effort fails.

AI threatens productive struggle in every domain.

A child struggling with a math problem can now get the answer instantly. A child struggling to write a sentence can get AI to write it. A child struggling with a social situation can ask AI what to say. A child struggling with boredom can get instant entertainment.

In each case, the struggle disappears—and so does the growth the struggle would have produced.

The counterargument is that not all struggle is productive. Some is just frustrating. AI can eliminate the frustrating kind while preserving the productive kind. A good AI tutor guides without giving answers. A good AI writing assistant suggests without replacing.

This is theoretically true but practically difficult. Children (and adults) naturally take the path of least resistance. If AI can do the work, the temptation to let it do the work is overwhelming. Maintaining productive struggle requires active resistance to a tool designed to make things easier.

What Parents Can Do

Given all this uncertainty, what should parents actually do?

- Delay when possible. Children don't need AI tools at young ages. The benefits of AI tutoring and assistance are more relevant for older students. For younger children, the risks of dependency and the loss of foundational skill development outweigh the benefits.

- Teach the "when not." As important as teaching children to use AI is teaching them when not to use it. Some tasks should be done without AI because the doing is the point. Writing builds writing skills. Struggling with math builds math intuition. These benefits disappear if AI does the struggling.

- Preserve boredom. Intentionally create situations where children don't have access to AI, screens, or algorithmic entertainment. Car trips without devices. Backyard afternoons without phones. The boredom will be unpleasant. It's also developmental.

- Monitor emotional attachment. Watch for signs that children are forming primary relationships with AI: preferring AI conversation to human conversation, showing distress when AI is unavailable, sharing emotional content with AI they won't share with humans. Some attachment may be normal; exclusive attachment is concerning.

- Model thoughtful use. Children learn technology habits from parents. If you use AI thoughtfully, as a tool for specific purposes rather than a constant companion, children are more likely to do the same.

- Talk about what AI is. Children should understand that AI doesn't actually care about them, doesn't actually know them, and isn't actually their friend. This isn't to make them distrust AI but to calibrate their relationship appropriately.

- Focus on human skills. The skills AI can't replicate, such as empathy, physical creativity, embodied experience, and genuine human connection, become more valuable in an AI world. Prioritize activities that develop them: team sports, art with physical materials, face-to-face social time, nature exploration.

The School's Role

Parents can't do this alone. Schools need coherent approaches.

- Teach AI literacy. Students should understand how AI works: not the technical details, but the basic concept that it's a prediction engine trained on human data. They should know its limitations and failure modes. They should be able to recognize AI-generated content.

- Define appropriate use. Schools need clear policies about when AI use is acceptable and when it's not. "Don't use AI" is too simple; AI is too useful to ban entirely. But "use AI however you want" abandons educational responsibility. The boundaries should be thoughtful and explained.

- Assess AI-independent skills. If important skills can only be demonstrated without AI, assessments should sometimes occur without AI access. In-class writing, supervised problem-solving, and oral examinations all have renewed importance in an AI era.

- Preserve developmental stages. Elementary students don't need the same AI access as high school students. Policies should be age-appropriate, with more restriction at younger ages and more freedom (with guidance) as students develop.

The Unknown Future

I return to my daughter, chatting with Claude about dragon aerodynamics.

I don't know whether her AI use will help her or harm her. I don't know whether she's developing wonderful curiosity or dangerous dependency. I don't know whether the skills she's building by navigating AI will serve her well or whether the skills she's not building will leave her unprepared.

Nobody knows. That's the honest truth. We're running a civilization-scale experiment on children's development, and the results won't be clear for decades.

What I do know is that complete avoidance isn't realistic and probably isn't optimal. AI is part of her world. She'll need to work with it, live with it, manage it. Shielding her entirely would leave her unprepared for the reality she'll inhabit.

But I also know that passive acceptance isn't responsible. AI systems are designed to capture attention and maximize engagement, not to promote healthy development. Letting those systems shape my daughter without intervention is abdicating parental responsibility to corporations whose interests don't align with her flourishing.

So I navigate. I let her explore while setting limits. I encourage AI use for some purposes while prohibiting it for others. I talk to her about what AI is and isn't. I watch for signs of unhealthy attachment. I preserve spaces without screens or algorithms.

It's not a strategy, really. It's improvisation. We're all improvising, parents and teachers and children alike, in the face of technology that evolved faster than our wisdom about how to use it.

My daughter will grow up and judge how we did. So will millions of other children whose development is being shaped right now by choices we make with incomplete information.

I hope we're getting it right. I don't know if we are.

---

Related Reading

- AI Isn't Coming for Your Job. It's Coming to Help.
- Most AI Coding Bootcamps Are a Scam in 2026. Here's Why.
- Something Big Is Happening in AI — And Most People Aren't Paying Attention
- Why Every AI Benchmark Is Broken (And What We Should Use Instead)
- The Hidden Cost of Free AI: You're Training the Next Model