The Learning Network
Annual insights generated from sideby session transcripts and Small Wins · Academic Year 2025–26
Bar lengths and percentages reflect evidence observed across sessions — not evaluative judgments about educator capability. Absence of evidence reflects what we haven't yet seen, not what isn't there.
Learning focal points
What are educators focused on and grappling with? What are they trying to solve?
Top 3 themes in learning goals
Educators consistently set goals oriented toward using AI to better reach their students — not to adopt technology for its own sake. The entry point is almost always a specific student or group: multilingual learners, students with IEPs, students who need more challenge. This learner-centered motivation is the strongest driver of engagement across the corpus.
Educators name differentiation as both their highest-priority instructional goal and their most time-intensive challenge. AI is emerging as a practical response — generating leveled texts, scaffolded prompts, and alternative explanations. The goal is less about innovation and more about feasibility: doing the differentiation that educators already believe in, at a pace that doesn't exhaust them.
A growing number of educators are shifting from managing AI use to actively teaching it. Goals in this category focus on equipping students to evaluate accuracy, detect bias, and understand what AI can and cannot do. This is strongest among secondary educators and those with more AI experience — but it is present across grade levels.
Top 3 tensions, challenges, and questions educators are grappling with
This is the most frequently named tension, appearing in 67% of sessions. Educators want students to genuinely think, but they aren't yet sure how to design tasks that require authentic reasoning when AI can produce plausible-looking responses. This isn't framed as a fear of cheating — it's framed as a design challenge they don't yet have language or tools for.
Unclear or absent district- and school-level AI policies were named in 51% of sessions. Educators describe a landscape where one teacher bans AI use and the next actively incorporates it — with students experiencing this inconsistency directly. Several educators described waiting for guidance before acting. Others described acting without guidance and feeling unsupported.
Educators describe wanting to learn alongside colleagues but lacking structures to do so. Professional development time is oriented toward compliance or content delivery rather than collaborative inquiry. Educators who are making progress describe finding informal peer partners independently — not through school structures.
Areas educators most want to explore next
Patterns of practice
What are the observable patterns in educator behaviors?
Top 3 practices educators report trying
The most widely reported practice: using AI to produce leveled versions of texts, tiered task prompts, and alternative explanations for concepts. Reported most frequently by elementary and special education teachers. This is the practice with the most consistent follow-through between sessions.
Educators describe using AI in planning time to generate ideas, prototype unit structures, surface possible misconceptions, and pressure-test learning sequences. This is more of a private practice — it happens before instruction, so students don't see it. Several educators describe this as the practice that converted them from skeptics to regular users.
A smaller but growing group of educators has moved beyond using AI themselves and has begun designing explicit student experiences with AI tools. Most described doing this once or twice, with significant reflection about what they'd change next time. This practice is most developed among educators in grades 5–12.
Top 3 identified next implementation steps
This step was cited by 38 educators as their intended next move. Most describe wanting students to have co-ownership of the guidelines — not just rules handed down by the teacher. Several are planning to use sideby sessions to prototype these agreements before introducing them to students.
Educators are naming a desire to learn with colleagues — sharing what works, what flopped, and what questions they're carrying. This is not yet institutionalized; it's happening informally when educators find their own partners. There is appetite for structures that make this easier.
Educators describe wanting to redesign assessments so AI can't shortcut them — not by banning AI, but by requiring students to make their reasoning visible at multiple points. This is especially strong among educators who described feeling caught flat-footed by AI use in their current assignments.
Top 3 shifts in practice — what changed over the year
Deeper Learning focal points
What are the more specific connections to AI for Deeper Learning across the crew?
Top 3 Deeper Learning outcomes most present in conversations, goals, and Small Wins
This is the most consistently present Deeper Learning thread across the entire corpus. Educators are grappling with how to design experiences where students must do the analytical work — not outsource it. The educators furthest along in this area are beginning to build tasks that require visible thinking at multiple stages, not just in a final product.
Granular focus areas within this outcome:
- Verifying AI-generated facts and detecting AI-generated misinformation
- Identifying AI bias — strongest in ELA, social studies, and history classes
- Designing tasks where students must justify reasoning, not just produce answers
"A student came to me and said — 'I think the AI got this wrong.' That was the moment I realized something was actually shifting."
Grades 6–8 educator · Spring 2026 session
Strong presence in goal-setting and Small Wins. Educators report using AI most concretely to create multiple versions of materials for diverse learners. Success signs include multilingual learners accessing grade-level content, students with IEPs having scaffolded entry points, and advanced learners receiving extension tasks — all without the educator having to manually produce each version.
Granular focus areas within this outcome:
- Leveled text generation for multilingual learners and emergent readers
- Scaffolded task prompts that create pathways for students who frequently struggle
- Less evidence yet of AI-supported data analysis informing instructional decisions
"For the first time, my ELL students could access the same content as everyone else. I used AI to create four different versions of the reading — it took me twelve minutes."
Elementary educator · Winter 2025 session
This outcome is present but early-stage across the crew. Educators express high belief in the importance of student metacognition about AI — knowing when to use it, when not to, and what it reveals about one's own thinking — but structured routines for teaching this are not yet widely established. The belief is ahead of the practice.
Granular focus areas:
- Students reflecting on how AI shaped their thinking process
- Students setting personal guidelines for their own AI use
- Educators modeling their own AI metacognition for students — early signals, not yet systematic
Mindset & beliefs
Where are educators in their relationship to AI for Deeper Learning? What momentum and hesitation are visible?
Sentiment toward AI — distribution across the crew
Sentiment toward Deeper Learning: broadly positive (82%), with consistent tension between belief in DL outcomes and pressure to cover content and raise test scores.
Top 3 themes in stated beliefs, assumptions, and values
Equity-as-motivation is the strongest entry point for AI adoption across this crew. Educators who had not engaged deeply with AI tools became curious when the frame shifted from technology adoption to access and equity. This belief is present across experience levels and school types.
A growing reframe, visible in approximately 44% of sessions by spring 2026, compared with earlier in the year when most AI conversations were framed around risk and policy. This shift represents a meaningful movement in professional identity — from gatekeeper of AI use to teacher of it.
A strong and recurring belief, particularly among educators in their first year of AI engagement. These educators describe wanting time and permission to experiment, make mistakes, and reflect — without the pressure of being evaluated on outcomes. The sideby session structure surfaces directly in some transcripts as a container that enables this kind of learning.
Top 3 shifts in thinking
Sample ideas & Small Wins
Small Wins curated from the crew's submissions that illuminate these insights most clearly.
"I asked AI to help me rewrite our current science unit text at three different reading levels. Within twenty minutes I had versions for my below-grade readers and my students with IEPs that still held the same key concepts. My students who normally need the most support were participating in the full class discussion for the first time in months."
"We used AI to generate a 'first draft' argument on a contested historical question, then students had to identify what was missing, what was unfair, and what they would argue differently. Three students who rarely participate in discussion were the most vocal critics of the AI's version. One said: 'It sounds confident but it doesn't actually know what it's talking about.'"
"I redesigned a lab report to include a mandatory 'thinking trace' where students had to document their reasoning process at three points — before AI use, during, and after. Students who used AI extensively actually produced better reflections because they had to articulate what changed and why. The AI use became visible data, not hidden."
"I used my sideby sessions this year to experiment with AI the same way I wanted teachers to. I made mistakes in front of my coaching clients and talked through them out loud. Three teachers told me it was the first time they felt like they had permission to not know what they were doing. That shifted something in the room."
"I introduced a simple routine: before we use any tool — AI or otherwise — we ask 'what are we trying to figure out, and does this tool help us figure it out ourselves or figure it out for us?' My second graders now ask this question without me prompting it. It's become part of how we talk about thinking."
Recommended leadership actions
Where does this data indicate opportunities for those leading and resourcing this work?
Educators consistently describe wanting to learn alongside colleagues but lacking structures to do so. The most growth we observed happened in relationships, not in solo exploration. Consider designating regular "AI pedagogy inquiry" time — even 40 minutes monthly — with optional partnerships across grade levels or departments. sideby can scaffold these conversations.
Educators experiencing the most progress this year are those who have a clear framework for thinking about AI in relation to student learning — not just rules about what's allowed. Introducing the AI for Deeper Learning framework as a shared professional vocabulary, before it becomes a compliance tool, is likely to accelerate thoughtful adoption.
Policy ambiguity is the single largest structural barrier named across the corpus. The most effective path forward is not top-down policy but a co-created framework that gives educators and students shared language and shared agency. Teams that built classroom-level agreements reported significantly more confidence and less anxiety.
Educators who adopted AI practices most readily did so because they connected it to equity goals they already held. These educators are potential peer leaders for colleagues who are resistant or uncertain. Making their rationale and practice visible — through storytelling, Small Wins features, or crew showcases — is more likely to shift reluctant colleagues than technical training.
This is the most consistently named unmet learning need. Educators need concrete frameworks for redesigning tasks so that AI use produces evidence of thinking rather than obscuring it. A workshop or inquiry-cycle design anchored in the assessment redesign work educators are already doing would gain immediate traction.
Using AI to produce leveled texts and scaffolded task versions is the most consistently practiced and most positively reported action across the crew. It is well-aligned to existing instructional priorities, requires minimal technical fluency, and has produced direct success signs for students who are frequently excluded from full participation. This is the practice most ready to spread intentionally across schools and grade levels.