Offline-First Classrooms with a Local LLM
Third period, a small rural middle school. The district's AI policy says student data may not leave the state. The teacher, who just discovered block-based coding with her students, wants an AI assistant that can help a twelve-year-old debug a Scratch project. The IT coordinator has read the privacy policy of every AI company the vendor pitches promote. None gives a comfortable answer. The teacher's workaround so far has been to do the debugging herself, one student at a time.
The policy gap
Educational data protection is a patchwork: FERPA federally, COPPA for the youngest students, a half-dozen state-level statutes, plus district-level AI policies. Most of them say the same thing in different words. Student data cannot leave the school's control. Commercial AI providers, almost without exception, process data in their own environments with their own privacy terms. The gap is real. Teachers need tools. Districts need policy compliance. Vendors sell something between the two and paper over the difference.
Remote and rural schools face a second problem: the connection. A full day of student AI usage assumes bandwidth that many schools do not have. When the internet drops, cloud-hosted AI drops with it.
What Crow does
Crow installs on a small box in the server closet or a cart that rolls between rooms. A single mini PC with a recent GPU (or even a recent CPU with a small model) runs an entire classroom. Four bundles cover the common use cases:
- Kolibri brings an offline-first library of educational content, tuned to the state's standards. Videos, practice problems, reading passages, all on the local disk.
- Maker Lab adds a scaffolded AI tutor designed for kids. Age-banded personas, a hint ladder that teaches problem-solving instead of giving the answer, classroom mode for the teacher to monitor.
- Scratch Offline runs the Scratch coding environment on the local server, so students can save and share projects inside the school LAN.
- vLLM or Ollama hosts a local language model. The model lives on the classroom server. Every inference stays on the LAN.
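From any client on the LAN, talking to the model is one HTTP call. Below is a minimal sketch against Ollama's `/api/generate` endpoint; the hostname and model name are illustrative placeholders, not Crow defaults:

```python
import json
import urllib.request

# Hostname and model are assumptions for illustration; substitute
# whatever your classroom server actually runs.
OLLAMA_URL = "http://classroom-server.local:11434/api/generate"

def build_generate_request(prompt: str, model: str = "llama3.2") -> dict:
    """Request body for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_model(prompt: str, model: str = "llama3.2") -> str:
    """Send one prompt to the LAN-hosted model and return the reply text.

    The request never leaves the local network: the URL resolves to the
    classroom server, not a cloud endpoint.
    """
    body = json.dumps(build_generate_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read())["response"]
```

Because the endpoint speaks plain HTTP on the LAN, the same call works from a teacher's laptop, a student Chromebook web app, or a cron job, with no API key and no outbound traffic.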
The classroom server announces itself on the local network. Student Chromebooks find it automatically via LAN discovery. No public URL, no outbound call to an AI provider, no data leaving the building.
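Crow's exact discovery mechanism isn't detailed here, but one simple way this kind of LAN announcement can work is a periodic UDP broadcast beacon. A sketch under that assumption, with a hypothetical service name and port:

```python
import json
import socket

DISCOVERY_PORT = 50505  # hypothetical; not a documented Crow port

def encode_beacon(host: str, port: int) -> bytes:
    """Build the datagram the server would broadcast every few seconds."""
    return json.dumps(
        {"service": "crow-classroom", "host": host, "port": port}
    ).encode("utf-8")

def parse_beacon(datagram: bytes):
    """Return (host, port) if the datagram is a classroom beacon, else None."""
    try:
        msg = json.loads(datagram)
    except (ValueError, UnicodeDecodeError):
        return None  # not JSON; some other device's traffic
    if msg.get("service") != "crow-classroom":
        return None
    return msg["host"], msg["port"]

def announce_once(host: str, port: int) -> None:
    """Broadcast one beacon on the local subnet (server side)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(encode_beacon(host, port), ("255.255.255.255", DISCOVERY_PORT))
```

Broadcast never leaves the subnet, which is the point: a Chromebook can find the server, and nothing outside the building can.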
A Thursday afternoon
A student, stuck on the last step of their Scratch project:
Student: my cat sprite keeps moving off the screen and i can't
click it
Maker Lab: Sounds like the sprite is running out of the stage
bounds. Before I show you the answer, can you find the
"if on edge, bounce" block? It is in the Motion
category.
Student: found it
Maker Lab: Drag it into your forever loop. Run the project. Does
the cat stay on the stage now?
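The exchange above is the hint ladder at work: each turn moves one rung down, from a nudge toward the right block to a concrete step, without ever handing over a finished project. A toy sketch of that escalation; the rungs and function here are invented for illustration, not Maker Lab's real logic:

```python
# Hypothetical hint ladder for the sprite-bounds problem.  Rungs go
# from least to most concrete; there is deliberately no "here is the
# whole script" rung.
HINT_LADDER = [
    "Describe what you expect the sprite to do, and what it does instead.",
    "Look in the Motion category for a block about edges. Which one might help?",
    "Drag 'if on edge, bounce' into your forever loop, then run the project.",
]

def next_hint(level: int) -> tuple[str, int]:
    """Return the hint for the current rung and the rung to use next time.

    The ladder never runs past its last rung: the tutor repeats the most
    concrete hint rather than writing the project for the student.
    """
    rung = min(level, len(HINT_LADDER) - 1)
    return HINT_LADDER[rung], rung + 1
```

Keeping the ladder as data rather than prompt-engineering magic also makes it auditable: a teacher can read exactly what the tutor is allowed to reveal, and in what order.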
A teacher, from the staff laptop, checking the class status:
Teacher: status of period 3
Maker Lab: 18 students active in the last 20 minutes. 14 are
making progress on their Scratch assignments; 3 are
stuck on the same sprite-bounds issue and have asked
for help; 1 has not started. Privacy settings per
district policy.
No student data leaves the box. The teacher sees what she needs. The district compliance officer has an answer when asked.
Tradeoffs, honestly
Local models are slower than frontier cloud models. A kid asking a clever question gets a good answer, and gets it in five seconds instead of two. For the classroom use case, the latency is a fine trade for the data-residency win.
The server needs a person. Someone in the district has to be responsible for it: updates, backups, occasional troubleshooting. An $8,000 mini PC plus an afternoon a month of IT time is the real price tag; it replaces a few hundred dollars of cloud-AI subscription plus ongoing policy risk.
Finally, the model is only as good as the curriculum. Kolibri plus Maker Lab is a strong starting kit; the teacher is still the teacher. The AI is a scaffold for the lesson, and the lesson is hers.
Start here
Talk to your IT coordinator before installing. The hardware is modest; the policy conversation is the real work. Once you have clearance, the install takes under an hour: getting started with Crow.
Next post in this series: MCP, the protocol argument.