Welcome to School’s Out Field Notes
I’m Jason Padgett, wearer of many hats and power user of multimodal AI tools.
By day, I serve as Director of Partnerships and Growth for CitizenAI, a vertical AI company building chat-based support tools for social workers, case managers, recovery coaches, and the people they serve.
Nights and weekends, I host the AGI Podcast Network, including The Weekly Blitz, The Vibe, and our flagship education show, School’s Out Saturdays. That is where we talk with students, teachers, administrators, workforce leaders, and edtech builders about how to navigate the age of intelligence in a way that prepares students for their future, not our past.
Professionally, that’s the short version.
Personally, I’m a husband, father, grandfather, volunteer educator, and pseudo-nerd.
I have never been formally employed as a teacher, but education has been part of my life for a long time. Around 2010, I taught recovery and wellness once a week at Hope Academy, a charter high school in Indianapolis for students working through drug and alcohol issues. More recently, I’ve taught basic AI literacy through the Wabash Area Lifetime Learning Association, and I’ve been a guest speaker at Purdue University, Ivy Tech Lafayette, and through the Skyepack Career+Pathways Program.
My wife was an English teacher at Hope Academy when we met. Her mother was a teacher. Her aunt was a teacher. Many of my dear friends, like Ronda Swartz, work in education.
So yes, education is near and dear to my heart.
My older children graduated from Purdue University, one with a PharmD and one with a bachelor’s in biochemistry. My youngest is a freshman at Carmel High School.
That means this conversation is not abstract for me. It is professional, personal, and sitting at my kitchen table.
I believe we are standing at the edge of a major change in how we work, live, learn, create, and engage with technology and one another. Our education systems have to navigate that change in a way that prepares students for their future, not our past.
That is why Field Notes exists.
A weekly note about AI, education, literacy, and practical use, written as a conversation starter and a compass for the people doing the work in the field.
Teachers. School leaders. Parents. Workforce partners. Higher ed faculty. Builders.
The people close enough to students to know what is actually happening.
And with that, let’s start with a bridge.
Access Isn’t Readiness
Access is the easy part.
A student can open an AI tool in ten seconds. That does not mean they know how to question the answer, explain the process, check the source, spot the shortcut, or decide when the tool is doing too much.
I call that the readiness gap.
And I think that gap is going to matter more than whether a school blocks one tool, approves another tool, or spends the next semester rewriting an acceptable use policy that will be outdated by the time everyone has finished arguing over the commas.
Students are already using AI.
Some are using it well. Some are using it poorly. Some are using it like a tutor. Some are using it like a vending machine. Prompt in, finished answer out.
Those are not the same behavior.
That is where school still matters. Maybe more than ever.
Before the question becomes, “Did students use AI?” I think the better question is:
How did they use it, and are they ready to use it well?
Pearson and AWS make a similar point in their AI readiness report. Their focus is higher education and work, but the warning applies earlier than college. Access to AI tools does not equal applied readiness. Readiness means students can use the tool, apply judgment, communicate clearly, think ethically, and work well with intelligent systems.
That does not start magically at age eighteen.
It starts with practice.
Source: Pearson and AWS, AI Readiness: Building the Bridge from Higher Education to Work
The Bridge Crew
Most teachers did not sign up to become AI readiness officers.
Yet here they are, holding the clipboard, checking the ropes, watching the bridge move while students are already trying to cross it.
That is a lot to ask of people who were already dealing with attendance issues, learning gaps, behavior challenges, grading piles, parent emails, district requirements, and the quiet exhaustion that comes from caring about kids inside a system that often runs on fumes.
So let’s be real.
Teachers should not have to build this bridge alone.
This is where I think the conversation needs to mature a bit. The question cannot stay stuck at “Should AI be allowed in school?” That question had its moment, but the genie is out of the bottle and teenagers do not typically wait for committee approval before trying new technology.
The better question is:
How do we redesign teaching, learning, and assessment now that AI is already in the room?
Lord Jim Knight’s Cambridge piece, Rethinking education in the age of AI, gets closer to the conversation I think we need. It points toward personalization, project-based learning, oral explanation, digital portfolios, and assessment that asks students to defend their thinking.
That sounds like the right direction to me.
Not because AI replaces teachers.
Lord help us if that is the takeaway.
AI might help with planning, feedback, examples, translation, tutoring, and seeing patterns across student work. But teachers still set the direction. Students still need to do the thinking. Families, employers, higher ed, and communities still have to help define what readiness actually means.
That is the bridge crew.
AI can be part of the crew.
It should not be the captain.
Source: Lord Jim Knight, Rethinking education in the age of AI
Projects
Here is the plank I keep coming back to: small projects.
Not giant, grant-funded, semester-long productions with matching T-shirts and a drone shot for the district website.
I’m talking about one assignment, already on the calendar, turned slightly toward the real world.
A worksheet answer becomes a guide for a younger student.
A history response becomes a short briefing for a mayor.
A math problem becomes a comparison sheet for a family choosing between two options.
A science explanation becomes a one-page field guide.
A career class research assignment becomes a first-week survival guide for someone starting that job.
That shift matters because AI changes the value of the final answer. If the final answer is all we care about, students will be tempted to outsource the thinking. Frankly, adults will too. We should not pretend this is only a student problem.
But when students have to make something for a real audience, explain their choices, and show what changed along the way, the assignment starts to reveal more than the final product.
In the first Field Notes issue, published through my No-Code CAIO newsletter on LinkedIn, I wrote about the difference between support and substitution.
Support helps the student think.
Substitution replaces the thinking.
The tricky part is that the final product can look the same.
A polished essay does not prove the student wrestled with the idea.
A clean slide deck does not prove the student understands the topic.
A correct answer does not prove the student can explain the process.
That is why projects matter.
They give students a reason to care, and they give teachers more places to look for evidence of learning.
A Small Experiment
Try this tomorrow.
Pick one assignment you already planned to give.
Then ask one question:
Who could this work actually be for?
That’s it.
Do not redesign the whole unit. Do not build a six-week project. Do not create a rubric that needs its own legal counsel.
Just give the assignment a real audience.
Then, if students use AI, ask for a short process note with the final work.
- What did you ask AI to help with?
- What did it give you that was useful?
- What did you change, reject, or question?
- What part of the thinking is still yours?
- Who is this work for?
That is not a full AI policy.
It is not a magic fix.
It is one plank.
But one plank is enough to start building.
One Question For You
Where could one assignment in your classroom, school, program, or organization become a small project with a real audience?
That is the question I’d love to hear educators answer.
Not in theory.
This week.