Everyone agrees kids need durable skills. Critical thinking. Collaboration. Empathy. Adaptability. The ability to navigate a world shaped by AI without losing what makes them distinctly human.
The sessions said it. The research says it. The panels at ASU+GSV said it — again and again, across four days in San Diego.
What nobody said: what about the adults raising those kids?
I attended ASU+GSV this year wearing three hats: an EdTech practitioner with nearly two decades in K–12 and higher ed, a founder building AI education for parents, and a mom of two school-aged girls.
AI was everywhere, as expected. What surprised me was the second most dominant theme: durable skills.
Session after session, the consensus was clear. In an age of AI, the skills that make us distinctly human are the ones that matter most. Across every panel I attended, four categories kept emerging:
How we think: critical thinking, problem-solving, discernment, creativity, sustained attention, curiosity, sensemaking
How we work: communication, collaboration, adaptability, leadership, accountability, project management
How we show up: growth mindset, confidence, resilience, integrity, professionalism, agency
What makes us irreplaceably human: empathy and compassion, relational intelligence, human connection, emotional agility, courage, flexibility
Speakers from the Stanford Accelerator for Learning, Lab4U, NAF, The 74, LearnerStudio, and aiEDU all touched on variations of this. The language shifted. The through line didn’t: these are the skills schools must prioritize.
I agree with every word of it.
But here’s what I kept noticing: every single conversation about building these skills was aimed at schools, teachers, and employers. The question was always some version of: how do we cultivate these skills in students?
Nobody asked the other question: how are we helping the people raising those students do the same at home?
This isn’t just a conference observation. Three major research institutions publishing work in 2025 and 2026 are all pointing in the same direction, and the cumulative weight of what they found matters.
The Rithm Project surveyed 2,383 young people ages 13–24. One of their key findings: 61% of young people say parents and caregivers never or rarely talk to them about AI. Half report that their parents know little to nothing about their AI use. When conversations do happen, they’re almost entirely about academics and cheating.
Their conclusion is striking: what actually protects young people from high-risk AI use is not a technical filter or a school policy. It’s authentic, safe human connection. They call non-judgmental intergenerational conversation “the most important intervention available.”
That’s a parent intervention.
The Center for Universal Education at Brookings published a year-long global study covering consultations with over 500 students, teachers, parents, and education leaders across 50 countries. The report names families explicitly as a critical stakeholder, recommending that education systems work with families to extend AI literacy from school into home environments, and dedicating a full section to supporting parents in managing children’s AI use at home. It acknowledges directly that parents already carry a double burden, while technology companies continue to outsource responsibility for children’s safety to busy families.
The EDSAFE AI Alliance White Paper documented what’s already happening in what they call the “shadow learning environment,” where students are using AI companions outside of school, often without any adult knowing. The numbers are significant: 72% of teens have already interacted with AI companions. 1 in 3 students say conversations with AI are as satisfying as real-life friendships. 24% confide in AI rather than humans during times of distress.
Among EDSAFE’s practical recommendations: don’t just monitor kids. Partner with families.
None of these risks live primarily at school. They live in the hours between dismissal and bedtime. On the couch at 11pm. In the message that goes to a chatbot instead of a parent.
AI is not a school issue that occasionally comes home. It is a home issue that also shows up at school.
It’s at the dinner table, in the study session, in the group chat. It’s the entity a teenager confides in when they don’t want to burden a friend. It’s the voice that calls their mediocre essay brilliant. It’s the companion that is always available, always agreeable, and never tired.
And the job description for parenting has not kept up.
We talk about screen time limits. We talk about keeping devices out of bedrooms. We talk about monitoring apps. Those things matter. They are not sufficient.
The new job description includes something harder: helping our children develop the skills that AI cannot replicate. Not just by talking about those skills, but by modeling them, practicing them, and reinforcing them in daily life.
Critical thinking isn’t developed through a worksheet. It’s developed through real conversations with real people who ask real questions and don’t immediately hand over the answer.
Collaboration, communication, accountability: every one of these is a relational skill. They require other humans. They require friction. They require being part of something that doesn’t just agree with you.
Empathy, resilience, integrity, agency: these are character capacities. They are modeled before they are taught. Children learn them by watching the adults around them navigate hard things with honesty.
None of this is possible without parents who are informed, equipped, and in the loop.
The research has identified the gap. The Rithm Project points to parents as one of the most important potential interventions. Brookings recommends centering families as active partners. The EDSAFE AI Alliance calls on education leaders to partner with families.
But here’s what’s missing: the infrastructure to actually prepare parents.
Recommending that schools “include families in AI literacy” still positions parents as the receiving end of a school-initiated strategy, not as people worth designing for directly. Partnering with families requires families who are ready to partner. And most aren’t. Not because they don’t care, but because they were never given a framework.
Parents are the primary educators in our children’s lives. We’re the most proximate trusted adults. We’re present during the hours no school policy reaches. We’re the people who can have the curious, non-judgmental conversations the research says matter most.
If the field is serious about durable skills, about human connection, about accountability being distributed across an entire community, then the adults doing that work at home have to be part of the design. Not in the last slide. Not in a once-a-semester workshop. At the table.
ASU+GSV reminded me how many smart, dedicated people are working on the right problems. I came home genuinely hopeful.
And I came home with one persistent question:
When do we invest the same energy in empowering the adults who are responsible for modeling durable skills at home?
Those adults are parents and caregivers. And most of them are doing it without a framework, without a community, and without anyone in the field building for them specifically.
That’s the gap. That’s the work.
And if any of this resonates with you, whether you’re a parent, a researcher, a policy leader, or an educator who’s been thinking about this too, I’d love to hear from you in the comments.
What are you seeing in the research about parents and AI that you think deserves more attention? Drop it below.
About the Author Julie Kelleher is the founder of LIKEAMOTHER.AI™ and the creator of Parent-in-the-Loop™, upskilling and reskilling for parents navigating AI with their kids. She is a former middle school teacher, Peace Corps Volunteer, and EdTech strategist with nearly two decades of experience. She is also a co-founder of Raising AI and an EDSAFE AI Alliance Catalyst Fellow.
April 27, 2026