Professional Services on a Server You Can Point To
You are a therapist. Your practice grew, you hired two associates, and the state licensing board just updated its language around electronic records. Your client intake form has a question about where notes are stored. Last year you answered "Notion." This year you need a better answer. Your existing tools are convenient; your existing tools are also owned by companies whose SLAs are written for engineers rather than for licensees of a regulated profession.
The data-residency question
Every regulated profession faces a variant of the same question: where, specifically, does client data live? The cloud-first generation of tools had one answer: "many places." For a while, that was fine. Now the question has teeth again. HIPAA covered entities owe a documented answer. Law firms owe one to their ethics regulators. Accounting firms owe one to the PCAOB. Educational services owe one to FERPA. Each regime has slightly different scope; all of them want you to be able to point at an actual machine.
"Many places" becomes a liability the first time a regulator, an auditor, or a plaintiff's counsel asks you to demonstrate control.
What Crow does
Crow runs on a server you can point at. Literally. The machine can be in your office closet, in a rented colocation rack, or on a single-tenant cloud instance in the jurisdiction your regulator cares about. You own the disk. You hold the encryption keys. You control who reaches the server.
Three layers matter for compliance work:
- Network boundary. Tailscale makes the instance tailnet-only by default. Clients never touch the server directly; you reach it from your devices and nowhere else. The public blog is the single optional exception, path-scoped so only /blog is exposed.
- Encrypted-at-rest storage. SQLite data files live on disk; full-disk encryption on your server is your baseline. MinIO object storage binds to the tailnet IP, so files never traverse the public internet.
- Auditable moderation actions. Destructive actions queue in a log for 72 hours before executing. Every one is attributed, timestamped, and reversible within the window. The log doubles as an audit trail if a regulator ever asks "who deleted what, when?"
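The third layer amounts to a delayed-execution queue. The sketch below is illustrative only, not Crow's actual implementation; every class and field name here is an assumption.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

HOLD = timedelta(hours=72)  # destructive actions wait this long before executing


@dataclass
class PendingAction:
    actor: str          # who requested the action (attribution)
    action: str         # e.g. "delete note 7441-03"
    queued_at: datetime
    reversed: bool = False

    def executable_at(self) -> datetime:
        return self.queued_at + HOLD

    def can_execute(self, now: datetime) -> bool:
        # Runs only after the hold window, and only if nobody reversed it.
        return not self.reversed and now >= self.executable_at()


class ModerationLog:
    def __init__(self) -> None:
        self.entries: list[PendingAction] = []

    def queue(self, actor: str, action: str) -> PendingAction:
        entry = PendingAction(actor, action, datetime.now(timezone.utc))
        self.entries.append(entry)  # the log itself is the audit trail
        return entry

    def reverse(self, entry: PendingAction, now: datetime) -> bool:
        # Reversal succeeds only inside the 72-hour window.
        if now < entry.executable_at():
            entry.reversed = True
            return True
        return False
```

Because entries are appended and never deleted, answering "who deleted what, when?" is a scan over `self.entries`, reversed actions included.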
A scenario
A client requests their file under a privacy statute. Your workflow:
You: pull everything related to client 7441 for an access request
Crow: Found 24 research notes, 3 source documents, 11 blog drafts
(status: private), and 47 memory entries. Bundled as
client-7441-access-20260420.zip. Storage: local, 4.2MB.
Attribution and timestamps included per audit schema.
One request, one export, one documented chain of custody. Your AI did the retrieval. You did the review. The regulator sees a clean answer to a clean question.
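A chain of custody is easiest to defend when the export bundle documents itself. Here is a minimal sketch of such a bundle, using only the standard library; the function name and manifest fields are assumptions, not Crow's export format.

```python
import json
import zipfile
from datetime import datetime, timezone
from pathlib import Path


def export_access_request(client_id: str, items: dict[str, bytes], out_dir: Path) -> Path:
    """Bundle every record for one client into a zip with a manifest.

    `items` maps relative filenames to file contents. The manifest records
    which client, which files, and when, so the zip carries its own
    chain of custody.
    """
    stamp = datetime.now(timezone.utc)
    out = out_dir / f"client-{client_id}-access-{stamp:%Y%m%d}.zip"
    manifest = {
        "client": client_id,
        "exported_at": stamp.isoformat(),
        "files": sorted(items),
    }
    with zipfile.ZipFile(out, "w") as zf:
        for name, data in items.items():
            zf.writestr(name, data)
        zf.writestr("manifest.json", json.dumps(manifest, indent=2))
    return out
```

The manifest rides inside the archive rather than beside it, so the bundle you hand a regulator and the record of what it contains cannot drift apart.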
Tradeoffs, honestly
Compliance is a discipline. A platform cannot give it to you; it can only make the discipline easier to practice. Running your own server reduces one risk and shifts others onto you. You are now responsible for patching, for backups, for the incident-response plan if the box is stolen. Pair the primary with a second instance in a different physical location for disaster recovery. Test the restore quarterly.
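The quarterly restore test can itself be scripted. Since Crow's data files are SQLite, a sketch using Python's standard-library binding of SQLite's online backup API looks like this; the function name is mine, and a real disaster-recovery drill would restore onto the second machine, not the same disk.

```python
import sqlite3


def backup_and_verify(src_path: str, dst_path: str) -> bool:
    """Snapshot the live database, then prove the copy is sound.

    sqlite3's backup() takes a consistent snapshot even while the source
    is in use; the integrity check afterward is what turns "we have a
    backup" into "we have a backup we know restores".
    """
    src = sqlite3.connect(src_path)
    dst = sqlite3.connect(dst_path)
    try:
        src.backup(dst)
        (status,) = dst.execute("PRAGMA integrity_check").fetchone()
        return status == "ok"
    finally:
        src.close()
        dst.close()
```

Run it on a schedule, and treat any non-"ok" result as an incident, not a warning.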
Your AI provider is a separate surface. If you pipe client data to a commercial LLM, review the provider's data-processing addendum. For the highest-sensitivity work, run a local model (Ollama or vLLM) and keep every token inside your network. The tradeoff is quality: a local model is slower and often less capable than a frontier one. Many practitioners run a local model for intake and a cloud model for non-client research.
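The split-model practice in the last sentence reduces to a routing rule: anything touching client data stays local, everything else may use the stronger model. A sketch, with hypothetical model identifiers:

```python
LOCAL_MODEL = "ollama:llama3"      # stays on your network (hypothetical name)
CLOUD_MODEL = "cloud:frontier-xl"  # leaves your network (hypothetical name)


def pick_model(contains_client_data: bool) -> str:
    # Client data never leaves the tailnet. Everything else may use the
    # cloud model, subject to the provider's data-processing addendum.
    return LOCAL_MODEL if contains_client_data else CLOUD_MODEL
```

The value of writing the rule down, even this simply, is that it becomes a control you can show an auditor rather than a habit you hope your associates share.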
Finally, document your controls. An auditor who sees "runs on Tailscale with MinIO encrypted at rest, backed up nightly to a Hetzner box in the same jurisdiction" will accept that answer. One who sees "I use Crow" will ask for more. Your Nest settings panel can export a control inventory for exactly this purpose.
Start here
Read the Tailscale setup guide first. It governs the network boundary and is the single most important step: getting started with Crow.
Next post in this series: offline-first classrooms with a local LLM.