
Cloud Deploy

Archived — Legacy Deployment Path

This page documents the Render + Turso deployment method, which is no longer supported. Turso cloud database support has been removed from Crow. Multi-device sync is now handled by Hypercore P2P replication with local SQLite.

For new deployments, use Oracle Cloud Free Tier, a Home Server, or Managed Hosting.

Recommended: Oracle Cloud Free Tier

For new cloud deployments, we recommend Oracle Cloud Free Tier: it never sleeps, uses local SQLite (no external database), and is genuinely permanent. The guide below documents the legacy Render + Turso method, which has significant limitations and is kept for reference only.

Don't want to manage infrastructure?

Try managed hosting — $15/mo, no setup required.


Legacy: Render + Turso Deployment

Render Free Tier Limitations

  • Sleeps after 15 minutes of inactivity — first request after sleep takes ~30 seconds
  • Ephemeral disk — local files are lost on redeploy, requiring Turso as an external database
  • Two dependencies — you need both a Render account and a Turso account
  • Cold starts — every time the service wakes from sleep, all MCP connections must reconnect

For a free server that avoids all of these issues, use Oracle Cloud instead.

Step 1: Create a Turso Database

  1. Sign up at turso.tech (free tier works)
  2. Create a database named crow
  3. Copy your Database URL (starts with libsql://)
  4. Create an auth token and copy it

Security note: Your Turso credentials (database URL and auth token) grant full access to your Crow database. Treat them like passwords — never share them publicly or commit them to code. See the Security Guide for more details.
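If you prefer the command line, the Turso CLI can produce both values. A minimal sketch: the commented commands require a prior `turso auth login`, the `crow` database name matches Step 1, and the URL shown is a placeholder, not a real endpoint:

```bash
# Fetch credentials with the Turso CLI (uncomment after `turso auth login`):
# TURSO_DATABASE_URL="$(turso db show crow --url)"
# TURSO_AUTH_TOKEN="$(turso db tokens create crow)"

# Placeholder value for illustration:
TURSO_DATABASE_URL="libsql://crow-example.turso.io"

# Sanity-check the URL format before pasting it into Render:
case "$TURSO_DATABASE_URL" in
  libsql://*) echo "URL format OK" ;;
  *)          echo "unexpected URL format: $TURSO_DATABASE_URL" >&2; exit 1 ;;
esac
```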

Step 2: Deploy to Render

  1. Fork the Crow repository on GitHub
  2. Go to Render Dashboard → New → Blueprint
  3. Connect your forked repo — Render will detect the render.yaml
  4. Set the required environment variables:
    • TURSO_DATABASE_URL — your Turso database URL
    • TURSO_AUTH_TOKEN — your Turso auth token
  5. Click Apply — Render will deploy automatically
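For reference, a Render Blueprint for this kind of setup generally has the shape sketched below. This is an illustration only; the actual `render.yaml` ships in the repository, and the service name, plan, and commands here are assumptions:

```yaml
services:
  - type: web
    name: crow          # assumed service name
    runtime: node
    plan: free
    buildCommand: npm install
    startCommand: npm start
    envVars:
      - key: TURSO_DATABASE_URL
        sync: false     # Render prompts you for the value at deploy time
      - key: TURSO_AUTH_TOKEN
        sync: false
```

The `sync: false` entries are what cause Render to ask for your Turso credentials during the Blueprint deploy instead of storing them in the repo.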

Step 3: Initialize the Database

After deployment, open the Render shell for your service and run:

```bash
npm run init-db
```

Or trigger it via the health endpoint — the database tables are created automatically on first request.
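To confirm the tables exist, you can query `sqlite_master`. A sketch: the remote form uses the Turso CLI, and the local form below runs the same query against a throwaway SQLite file (the `memories` table name is an assumption for illustration, not Crow's actual schema):

```bash
# Remote check (requires `turso auth login`):
# turso db shell crow "SELECT name FROM sqlite_master WHERE type='table';"

# Same query locally against a throwaway database:
db=/tmp/crow-schema-demo.db
rm -f "$db"
sqlite3 "$db" "CREATE TABLE memories (id INTEGER PRIMARY KEY, body TEXT);"
sqlite3 "$db" "SELECT name FROM sqlite_master WHERE type='table';"   # prints: memories
```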

Step 4: Connect Your AI Platform

Once deployed, visit https://your-service.onrender.com/setup to see:

  • Which integrations are connected
  • Your MCP endpoint URLs for each platform
  • Instructions for adding API keys

Then follow the platform-specific guide for your AI platform (Claude, ChatGPT, or another supported client).

Step 5: Add Integrations (Optional)

Security note: API keys are like passwords — each one grants access to a service on your behalf. Only add keys for services you actually need, and never share them. If a key is ever exposed, revoke it immediately at the service's website and create a new one. See the Security Guide for step-by-step instructions.

Add API keys for external services in your Render dashboard under Environment:

Integration         Environment Variable                        Get Key
GitHub              GITHUB_PERSONAL_ACCESS_TOKEN                GitHub Settings
Brave Search        BRAVE_API_KEY                               Brave API
Slack               SLACK_BOT_TOKEN                             Slack Apps
Notion              NOTION_TOKEN                                Notion Integrations
Trello              TRELLO_API_KEY + TRELLO_TOKEN               Trello Power-Ups
Google Workspace    GOOGLE_CLIENT_ID + GOOGLE_CLIENT_SECRET     Google Cloud Console

See the full list on the Integrations page.

After adding keys, Render restarts automatically. Refresh your /setup page to confirm they're connected.
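To see at a glance which keys are set in a given shell (for example, Render's web shell), you can loop over the variable names from the table above. A small sketch that reports presence without ever printing the values:

```bash
# Report which integration keys are present (values are never printed).
for key in GITHUB_PERSONAL_ACCESS_TOKEN BRAVE_API_KEY SLACK_BOT_TOKEN \
           NOTION_TOKEN TRELLO_API_KEY TRELLO_TOKEN; do
  if [ -n "$(printenv "$key")" ]; then
    echo "$key: set"
  else
    echo "$key: missing"
  fi
done
```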

What's publicly accessible after deployment?

When deployed to Render, your instance is on the public internet. Here's what that means:

  • Blog (/blog) — Public, but only posts you explicitly publish with public visibility appear
  • Crow's Nest (/dashboard) — Blocked from public IPs (returns 403). Only accessible from your Tailscale network or localhost
  • MCP endpoints — Protected by OAuth 2.1. Only authorized AI clients can access your tools
  • Setup page (/setup) — Shows connection status but never exposes API keys

Nothing personal is visible unless you publish it. See the Security Guide for details.

Verify

```bash
curl https://your-service.onrender.com/health
```

Visit /setup on your deployed URL to see integration status and endpoint URLs.
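Because the free tier sleeps, the first request after idle can take around 30 seconds, so a one-shot curl may fail where a retry would succeed. A small polling sketch (the URL is a placeholder for your own service):

```bash
# Retry the health endpoint until the service wakes from cold start.
wait_for_health() {
  url="${1:-https://your-service.onrender.com/health}"
  attempt=1
  while [ "$attempt" -le 6 ]; do
    if curl -fsS --max-time 10 "$url" >/dev/null 2>&1; then
      echo "healthy after attempt $attempt"
      return 0
    fi
    attempt=$((attempt + 1))
    sleep 5
  done
  echo "service did not respond" >&2
  return 1
}
```

Call it as `wait_for_health https://your-service.onrender.com/health`; six attempts spaced five seconds apart comfortably covers the wake-up delay.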

Try it out — after connecting your AI platform, say:

"Remember that today is my first day using Crow" "What do you remember?"

Many integrations?

If you have several integrations enabled, use the /router/mcp endpoint instead of connecting each server individually. It consolidates all tools into 7 category tools, reducing context window usage by ~75%. See the Context & Performance guide.

Now connect your AI: Claude · ChatGPT · All platforms
