# Context Hub

> A personal Model Context Protocol (MCP) memory layer that Claude, ChatGPT, Cursor, Perplexity, and Claude Code all read from and write to. One npm command stands up a shared context server, backed by Cloudflare Workers + D1, so your AI clients stop forgetting who you are.

Context Hub is open source (MIT) and runs on Cloudflare's free tier. It solves the "I have to re-explain myself to every AI tool" problem by giving every MCP-compatible client access to the same memory store, decisions log, and project rules. The CLI is `create-context-hub` on npm.

## Install & Quick start

- [Install command](https://contexthub.tryrehearsal.ai/): `npx create-context-hub` — about 4 minutes on a fresh machine; provisions Cloudflare D1, deploys the MCP server, and prints connector instructions for every supported client
- [npm package](https://www.npmjs.com/package/create-context-hub): Source for the one-command CLI on the npm registry
- [GitHub repository](https://github.com/JaipuriaAI/context-hub): Full source, MIT licensed; contributions welcome via PRs and issues

## Use case deep-dives

- [Own your AI memory before a vendor wipes it for you (Markdown)](https://contexthub.tryrehearsal.ai/use-cases/own-your-ai-memory/llms.md): ChatGPT lost users' saved memories twice in 2025. Context Hub keeps the rows in your own Cloudflare account, so a vendor backend bug never again costs you months of context
- [Sync Claude.ai memories into Claude Code (Markdown)](https://contexthub.tryrehearsal.ai/use-cases/sync-claude-to-claude-code/llms.md): Save a decision in the browser, ship it from the terminal — both clients read the same MCP memory store
- [Build a personal MCP memory server for $0/month (Markdown)](https://contexthub.tryrehearsal.ai/use-cases/personal-mcp-memory-server/llms.md): Deploy a durable AI context layer on Cloudflare Workers + D1 without paying for hosted memory products like Mem0 or Letta
- [Share context between ChatGPT, Perplexity, and Cursor (Markdown)](https://contexthub.tryrehearsal.ai/use-cases/share-context-multi-client/llms.md): Use Context Hub as the bridge when research, planning, and implementation happen in different AI tools

## What Context Hub does (HTML pages)

- [Homepage](https://contexthub.tryrehearsal.ai/): Product overview, install command, supported clients, feature list
- [Use cases index](https://contexthub.tryrehearsal.ai/use-cases): All four workflow deep-dives in one place
- [About the author](https://contexthub.tryrehearsal.ai/about/mayank-bohra): Author bio, credentials, verified social profiles (E-E-A-T author entity)

## Concepts referenced

- Model Context Protocol (MCP): Anthropic's open protocol for connecting AI clients to external context sources. Context Hub is an MCP server.
- Cloudflare Workers + D1: Serverless runtime plus SQLite-on-the-edge that host the Context Hub backend. The free tier covers personal use comfortably.
- Cross-client memory: The product's core promise — the same row store is readable from Claude.ai, Claude Code, ChatGPT, Cursor, Perplexity, and any custom MCP client.
## Optional

- [Author website](https://mayankbohra.com): Mayank Bohra's portfolio and writing
- [Rehearsal AI](https://tryrehearsal.ai): The AI interview-practice product shipped by the same author
- [Highlyt](https://highlyt.app): The author's other in-progress project