The Learning Network — sideby Insights Report 2025–26


Annual insights generated from sideby session transcripts and Small Wins  ·  Academic Year 2025–26

100 educators  ·  8 schools  ·  384 sessions analyzed  ·  247 Small Wins submitted  ·  Sept 2025 – May 2026
94 — educators with at least one session
3.8 — avg sessions per educator
247 — Small Wins submitted
72% — of educators showed practice shifts

Bar lengths and percentages reflect evidence observed across sessions — not evaluative judgments about educator capability. Absence of evidence reflects what we haven't yet seen, not what isn't there.


Section 1

Learning focal points

What are educators focused on and grappling with? What are they trying to solve?

Top 3 themes in learning goals

Theme 1  ·  Most frequently stated
Building personal AI fluency in service of students

Educators consistently set goals oriented toward using AI to better reach their students — not to adopt technology for its own sake. The entry point is almost always a specific student or group: multilingual learners, students with IEPs, students who need more challenge. This learner-centered motivation is the strongest driver of engagement across the corpus.

Evidence across sessions: 78%
Theme 2
Using AI to make differentiated instruction more sustainable

Educators name differentiation as both their highest-priority instructional goal and their most time-intensive challenge. AI is emerging as a practical response — generating leveled texts, scaffolded prompts, and alternative explanations. The goal is less about innovation and more about feasibility: doing the differentiation that educators already believe in, at a pace that doesn't exhaust them.

Evidence across sessions: 65%
Theme 3
Helping students think critically about AI-generated content

A growing number of educators are shifting from managing AI use to actively teaching it. Goals in this category focus on equipping students to evaluate accuracy, detect bias, and understand what AI can and cannot do. This is strongest among secondary educators and those with more AI experience — but it is present across grade levels.

Evidence across sessions: 41%

Top 3 tensions, challenges, and questions educators are grappling with

Student thinking vs. AI thinking — where's the line?

The most frequently named tension, appearing in 67% of sessions. Educators want students to genuinely think, and aren't yet sure how to design tasks that require authentic reasoning when AI can produce plausible-looking responses. This isn't framed as a fear of cheating — it's framed as a design challenge they don't yet have language or tools for.

Policy ambiguity is creating paralysis and inconsistency

Unclear or absent district and school-level AI policies were named in 51% of sessions. Educators describe a landscape where one teacher bans AI use and the next actively incorporates it — with students experiencing this inconsistency directly. Several educators described waiting for guidance before acting. Others described acting without guidance and feeling unsupported.

No protected time to learn and experiment together

Educators describe wanting to learn alongside colleagues but lacking structures to do so. Professional development time is oriented toward compliance or content delivery rather than collaborative inquiry. Educators who are making progress describe finding informal peer partners independently — not through school structures.

Areas educators most want to explore next

  • AI-supported formative assessment
  • Teaching students to critique AI outputs
  • Scaffolding for multilingual learners w/ AI
  • Classroom-level AI use agreements
  • AI and project-based learning design
  • Using AI for student feedback loops

Section 2

Patterns of practice

What are the observable patterns in educator behaviors?

Top 3 practices educators report trying

Using AI to generate differentiated learning materials — 52 educators

The most widely reported practice: using AI to produce leveled versions of texts, tiered task prompts, and alternative explanations for concepts. Reported most frequently by elementary and special education teachers. This is the practice with the most consistent follow-through between sessions.

Reported practice: 52%
Using AI as a lesson-planning and unit-design partner — 44 educators

Educators describe using AI in planning time to generate ideas, prototype unit structures, surface possible misconceptions, and pressure-test learning sequences. This is more of a private practice — it happens before instruction, so students don't see it. Several educators describe this as the practice that converted them from skeptics to regular users.

Reported practice: 44%
Introducing AI tools directly to students with structured guidance — 31 educators

A smaller but growing group of educators has moved beyond using AI themselves and has begun designing explicit student experiences with AI tools. Most described doing this once or twice, with significant reflection about what they'd change next time. This practice is most developed among educators in grades 5–12.

Reported practice: 31%

Top 3 identified next implementation steps

Developing classroom-level AI use agreements with students

Cited by 38 educators as their intended next step. Most describe wanting students to have co-ownership of the guidelines — not just rules handed down by the teacher. Several are planning to use sideby sessions to prototype these agreements before introducing them to students.

38 educators committed to this step
Building peer-learning routines around AI discovery

Educators are naming a desire to learn with colleagues — sharing what works, what flopped, and what questions they're carrying. This is not yet institutionalized; it's happening informally when educators find their own partners. There is appetite for structures that make this easier.

29 educators named this intention
Designing formative checkpoints that require students to show their own thinking

Educators describe wanting to redesign assessments so AI can't shortcut them — not by banning AI, but by requiring students to make their reasoning visible at multiple points. This is especially strong among educators who described feeling caught flat-footed by AI use in their current assignments.

24 educators named this intention

Top 3 shifts in practice — what changed over the year

Using AI to save time for myself → Using AI to serve more students more specifically
Experimenting alone → Beginning to share AI discoveries and failures with colleagues
Treating AI use as invisible or implicit in student work → Beginning to teach AI use explicitly as part of the learning design

Section 3

Deeper Learning focal points

What are the more specific connections to AI for Deeper Learning across the crew?

Top 3 Deeper Learning outcomes most present in conversations, goals, and Small Wins

Fostering Critical Thinking & Problem Solving w/ AI
Distinguishing student reasoning from AI reasoning

The most consistently present Deeper Learning thread across the entire corpus. Educators are grappling with how to design experiences where students must do the analytical work — not outsource it. Educators furthest along in this area are beginning to build tasks that require visible thinking at multiple stages, not just a final product.

Granular focus areas within this outcome:

  • Verifying AI-generated facts and detecting AI-generated misinformation
  • Identifying AI bias — strongest in ELA, social studies, and history classes
  • Designing tasks where students must justify reasoning, not just produce answers

"A student came to me and said — 'I think the AI got this wrong.' That was the moment I realized something was actually shifting."

Grades 6–8 educator  ·  Spring 2026 session

Supporting All Students to Meet Academic Standards w/ AI
AI as a differentiation and access tool — sustainably

Strong presence in goal-setting and Small Wins. Educators report using AI most concretely to create multiple versions of materials for diverse learners. Success signs include multilingual learners accessing grade-level content, students with IEPs having scaffolded entry points, and advanced learners receiving extension tasks — all without the educator having to manually produce each version.

Granular focus areas within this outcome:

  • Leveled text generation for multilingual learners and emergent readers
  • Scaffolded task prompts that create pathways for students who frequently struggle
  • Less evidence yet of AI-supported data analysis informing instructional decisions

"For the first time, my ELL students could access the same content as everyone else. I used AI to create four different versions of the reading — it took me twelve minutes."

Elementary educator  ·  Winter 2025 session

Fostering Students' Learning Skills & Mindsets w/ AI
Student metacognition about AI use — early, promising, unstructured

This outcome is present but early-stage across the crew. Educators express high belief in the importance of student metacognition about AI — knowing when to use it, when not to, and what it reveals about one's own thinking — but structured routines for teaching this are not yet widely established. The belief is ahead of the practice.

Granular focus areas:

  • Students reflecting on how AI shaped their thinking process
  • Students setting personal guidelines for their own AI use
  • Educators modeling their own AI metacognition for students — early signals, not yet systematic

Section 4

Mindset & beliefs

Where are educators in their relationship to AI for Deeper Learning? What momentum and hesitation are visible?

Sentiment toward AI — distribution across the crew

Cautiously optimistic: 62%
Anxious or resistant: 24%
Enthusiastically engaged: 14%

Sentiment toward Deeper Learning: broadly positive (82%), with consistent tension between belief in DL outcomes and pressure to cover content and raise test scores.

Top 3 themes in stated beliefs, assumptions, and values

"AI is most valuable when it helps me reach the students I was already struggling to reach"

Equity-as-motivation is the strongest entry point for AI adoption across this crew. Educators who had not engaged deeply with AI tools became curious when the frame shifted from technology adoption to access and equity. This belief is present across experience levels and school types.

"The risk isn't that students use AI — it's that we don't teach them how to use it well"

A growing reframe, visible across approximately 44% of sessions by spring 2026, compared to earlier in the year when most AI conversations were framed around risk and policy. This shift represents a meaningful movement in professional identity — from gatekeeper to teacher of AI use.

"I need to experience this as a learner myself before I can facilitate it with students"

A strong and recurring belief, particularly among educators in their first year of AI engagement. These educators describe wanting time and permission to experiment, make mistakes, and reflect — without the pressure of being evaluated on outcomes. The sideby session structure surfaces directly in some transcripts as a container that enables this kind of learning.

Top 3 shifts in thinking

AI is a threat to academic integrity, so ban it or strictly control it → AI requires a redesign of assessment, not just a prohibition
I'm not a tech person — this isn't for me → This is actually a pedagogy question, and pedagogy is my expertise
I'll wait until there's more guidance from district or admin → I'll start experimenting and bring my questions back to my team

Section 5

Sample ideas & Small Wins

Small Wins curated from the crew's submissions that illuminate these insights most clearly.

Elementary educator  ·  Supporting All Students  ·  Feb 2026

"I asked AI to help me rewrite our current science unit text at three different reading levels. Within twenty minutes I had versions for my below-grade readers and my students with IEPs that still held the same key concepts. My students who normally need the most support were participating in the full class discussion for the first time in months."

AI for differentiation  ·  Multilingual learners  ·  All students accessing content
Middle school humanities educator  ·  Critical Thinking  ·  March 2026

"We used AI to generate a 'first draft' argument on a contested historical question, then students had to identify what was missing, what was unfair, and what they would argue differently. Three students who rarely participate in discussion were the most vocal critics of the AI's version. One said: 'It sounds confident but it doesn't actually know what it's talking about.'"

Critical AI evaluation  ·  Student voice and agency  ·  Equity — student engagement
High school science educator  ·  Assessment design  ·  Jan 2026

"I redesigned a lab report to include a mandatory 'thinking trace' where students had to document their reasoning process at three points — before AI use, during, and after. Students who used AI extensively actually produced better reflections because they had to articulate what changed and why. The AI use became visible data, not hidden."

Assessment redesign  ·  Student metacognition  ·  Visible thinking protocols
Instructional coach  ·  Professional learning  ·  April 2026

"I used my sideby sessions this year to experiment with AI the same way I wanted teachers to. I made mistakes in front of my coaching clients and talked through them out loud. Three teachers told me it was the first time they felt like they had permission to not know what they were doing. That shifted something in the room."

Educator as learner  ·  Modeling vulnerability  ·  Coaching culture
K–2 educator  ·  Learning skills & mindsets  ·  Feb 2026

"I introduced a simple routine: before we use any tool — AI or otherwise — we ask 'what are we trying to figure out, and does this tool help us figure it out ourselves or figure it out for us?' My second graders now ask this question without me prompting it. It's become part of how we talk about thinking."

Metacognition routines  ·  Student agency  ·  Early childhood AI literacy

Section 6

Recommended leadership actions

Where does this data indicate opportunities for those leading and resourcing this work?

Recommended supports for educators
Create protected inquiry time with structured peer partnerships

Educators consistently describe wanting to learn alongside colleagues but lacking structures to do so. The greatest growth we observed happened in peer relationships, not solo exploration. Consider designating regular "AI pedagogy inquiry" time — even 40 minutes monthly — with optional partnerships across grade levels or departments. sideby can scaffold these conversations.

Recommended supports for educators
Offer a shared language and framework — not a policy mandate

Educators experiencing the most progress this year are those who have a clear framework for thinking about AI in relation to student learning — not just rules about what's allowed. Introducing the AI for Deeper Learning framework as a shared professional vocabulary, before it becomes a compliance tool, is likely to accelerate thoughtful adoption.

Organizational structures & conditions
Develop a school-level AI use framework co-created with teachers and students

Policy ambiguity is the single largest structural barrier named across the corpus. The most effective path forward is not top-down policy but a co-created framework that gives educators and students shared language and shared agency. Teams that built classroom-level agreements reported significantly more confidence and less anxiety.

Organizational structures & conditions
Surface and celebrate the equity-motivated educators — they are your pathfinders

Educators who adopted AI practices most readily did so because they connected it to equity goals they already held. These educators are potential peer leaders for colleagues who are resistant or uncertain. Making their rationale and practice visible — through storytelling, Small Wins features, or crew showcases — is more likely to shift reluctant colleagues than technical training.

Priority areas for further professional learning
Assessment redesign for an AI-present classroom

This is the most consistently named unmet learning need. Educators need concrete frameworks for redesigning tasks so that AI use produces evidence of thinking rather than obscuring it. Workshops or inquiry cycles anchored in the assessment redesign work educators are already doing would gain immediate traction.

Practices ready for broader adoption
AI-generated differentiated materials — the highest-confidence, highest-impact practice

Using AI to produce leveled texts and scaffolded task versions is the most consistently practiced and most positively reported action across the crew. It is well-aligned to existing instructional priorities, requires minimal technical fluency, and has produced direct success signs for students who are frequently excluded from full participation. This is the practice most ready to spread intentionally across schools and grade levels.