# Cloud Deploy

> **Archived — Legacy Deployment Path**
>
> This page documents the Render + Turso deployment method, which is no longer supported. Turso cloud database support has been removed from Crow. Multi-device sync is now handled by Hypercore P2P replication with local SQLite.
>
> For new deployments, use Oracle Cloud Free Tier, a Home Server, or Managed Hosting.
## Recommended: Oracle Cloud Free Tier

For new cloud deployments, we recommend Oracle Cloud Free Tier — it never sleeps, uses local SQLite (no external database), and is genuinely permanent. The guide below documents the legacy Render + Turso path, which is no longer supported and has significant limitations.
> **Don't want to manage infrastructure?** Try managed hosting — $15/mo, no setup required.
## Legacy: Render + Turso Deployment

### Render Free Tier Limitations
- **Sleeps after 15 minutes of inactivity** — the first request after sleep takes ~30 seconds
- **Ephemeral disk** — local files are lost on redeploy, requiring Turso as an external database
- **Two dependencies** — you need both a Render account and a Turso account
- **Cold starts** — every time the service wakes from sleep, all MCP connections must reconnect
For a free server that avoids all of these issues, use Oracle Cloud instead.
### Step 1: Create a Turso Database

- Sign up at turso.tech (free tier works)
- Create a database named `crow`
- Copy your Database URL (starts with `libsql://`)
- Create an auth token and copy it
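Before pasting the URL into Render, a quick local sanity check can catch a bad copy-paste. This is a sketch — the function name is ours, not part of Crow, and the example hostname is a placeholder; it only verifies the URL uses the `libsql://` scheme noted above:

```shell
# validate_turso_url: local sanity check for a copied Turso database URL.
# Turso database URLs use the libsql:// scheme; anything else was copied wrong.
validate_turso_url() {
  case "$1" in
    libsql://*) echo "looks valid" ;;
    *)          echo "invalid: Turso database URLs start with libsql://" ;;
  esac
}

# Placeholder hostname — substitute the URL you copied from turso.tech
validate_turso_url "libsql://crow-yourname.turso.io"
```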
> **Security note:** Your Turso credentials (database URL and auth token) grant full access to your Crow database. Treat them like passwords — never share them publicly or commit them to code. See the Security Guide for more details.
### Step 2: Deploy to Render

- Fork the Crow repository on GitHub
- Go to Render Dashboard → New → Blueprint
- Connect your forked repo — Render will detect the `render.yaml`
- Set the required environment variables:
  - `TURSO_DATABASE_URL` — your Turso database URL
  - `TURSO_AUTH_TOKEN` — your Turso auth token
- Click Apply — Render will deploy automatically
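Once the service is up, you can confirm from the Render shell that both variables actually reached the environment. A minimal sketch (the helper name is ours) that prints only variable names, never values:

```shell
# check_turso_env: verify the two required Turso variables are present.
# Prints only names, never values, so it is safe to run in a shared shell.
check_turso_env() {
  missing=""
  [ -n "$TURSO_DATABASE_URL" ] || missing="$missing TURSO_DATABASE_URL"
  [ -n "$TURSO_AUTH_TOKEN" ]   || missing="$missing TURSO_AUTH_TOKEN"
  if [ -n "$missing" ]; then
    echo "missing:$missing"
    return 1
  fi
  echo "ok"
}
```

Run `check_turso_env` after opening the shell; `ok` means both variables are set.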
### Step 3: Initialize the Database

After deployment, open the Render shell for your service and run:

```
npm run init-db
```

Or trigger it via the health endpoint — the database tables are created automatically on first request.
### Step 4: Connect Your AI Platform

Once deployed, visit `https://your-service.onrender.com/setup` to see:
- Which integrations are connected
- Your MCP endpoint URLs for each platform
- Instructions for adding API keys
Then follow the platform-specific guide for your AI platform.
### Step 5: Add Integrations (Optional)
> **Security note:** API keys are like passwords — each one grants access to a service on your behalf. Only add keys for services you actually need, and never share them. If a key is ever exposed, revoke it immediately at the service's website and create a new one. See the Security Guide for step-by-step instructions.
Add API keys for external services in your Render dashboard under Environment:
| Integration | Environment Variable | Get Key |
|---|---|---|
| GitHub | `GITHUB_PERSONAL_ACCESS_TOKEN` | GitHub Settings |
| Brave Search | `BRAVE_API_KEY` | Brave API |
| Slack | `SLACK_BOT_TOKEN` | Slack Apps |
| Notion | `NOTION_TOKEN` | Notion Integrations |
| Trello | `TRELLO_API_KEY` + `TRELLO_TOKEN` | Trello Power-Ups |
| Google Workspace | `GOOGLE_CLIENT_ID` + `GOOGLE_CLIENT_SECRET` | Google Cloud Console |
See the full list on the Integrations page.
After adding keys, Render restarts automatically. Refresh your `/setup` page to confirm they're connected.
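If you want to double-check from the Render shell which of these variables the service can see, a small sketch (the function name is ours; the variable list mirrors the table above) reports presence without ever printing a secret:

```shell
# report_integration_keys: show which optional integration variables are set.
# Prints only "set" / "not set" — never the values themselves.
report_integration_keys() {
  for var in GITHUB_PERSONAL_ACCESS_TOKEN BRAVE_API_KEY SLACK_BOT_TOKEN \
             NOTION_TOKEN TRELLO_API_KEY TRELLO_TOKEN \
             GOOGLE_CLIENT_ID GOOGLE_CLIENT_SECRET; do
    eval "val=\${$var:-}"
    if [ -n "$val" ]; then
      echo "$var: set"
    else
      echo "$var: not set"
    fi
  done
}
```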
## What's publicly accessible after deployment?

When deployed to Render, your instance is on the public internet. Here's what that means:

- **Blog** (`/blog`) — Public, but only posts you explicitly publish with `public` visibility appear
- **Crow's Nest** (`/dashboard`) — Blocked from public IPs (returns 403). Only accessible from your Tailscale network or localhost
- **MCP endpoints** — Protected by OAuth 2.1. Only authorized AI clients can access your tools
- **Setup page** (`/setup`) — Shows connection status but never exposes API keys

Nothing personal is visible unless you publish it. See the Security Guide for details.
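You can spot-check this yourself from any machine outside your Tailscale network. The sketch below (the helper name is ours; the expected codes come from the list above) pairs each route with the status a public request should receive:

```shell
# expected_public_status: the HTTP status a request from the public
# internet should receive for each route listed above.
expected_public_status() {
  case "$1" in
    /blog)      echo 200 ;;        # public content only
    /dashboard) echo 403 ;;        # blocked outside Tailscale/localhost
    /setup)     echo 200 ;;        # status page, no secrets
    *)          echo "unknown" ;;
  esac
}

# Compare against a live check (placeholder URL — substitute your service):
#   curl -s -o /dev/null -w '%{http_code}' "https://your-service.onrender.com/dashboard"
```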
## Verify

```
curl https://your-service.onrender.com/health
```

Visit `/setup` on your deployed URL to see integration status and endpoint URLs.
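Because the free tier sleeps after 15 minutes, the first health check after a sleep can take ~30 seconds to answer. A small retry loop rides that out — this is a sketch, with a function name and retry counts of our choosing:

```shell
# wait_for_wake: poll a health endpoint until it returns HTTP 200,
# covering Render's ~30-second cold start after the service has slept.
# Default 12 tries (5s timeout + 3s pause each) allows up to ~90 seconds.
wait_for_wake() {
  url="$1"
  tries="${2:-12}"
  i=1
  while [ "$i" -le "$tries" ]; do
    code=$(curl -s -o /dev/null -w '%{http_code}' --max-time 5 "$url" 2>/dev/null)
    if [ "$code" = "200" ]; then
      echo "awake after $i attempt(s)"
      return 0
    fi
    i=$((i + 1))
    sleep 3
  done
  echo "gave up after $tries attempt(s)"
  return 1
}

# Placeholder URL — substitute your deployed service:
#   wait_for_wake "https://your-service.onrender.com/health"
```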
**Try it out** — after connecting your AI platform, say:

> "Remember that today is my first day using Crow"
>
> "What do you remember?"
**Many integrations?** If you have several integrations enabled, use the `/router/mcp` endpoint instead of connecting each server individually. It consolidates all tools into 7 category tools, reducing context window usage by ~75%. See the Context & Performance guide.
Now connect your AI: Claude · ChatGPT · All platforms