Feedbot listens to your Telegram groups, classifies every message with an LLM, and serves the structured backlog to Claude Code, Cursor, and Windsurf over MCP. Self-host in 30 seconds.
curl -fsSL https://get.feedbot.dev | sh
Feedbot lives between your community and your tooling. It picks the signal out of the noise so your engineers can focus on shipping.
Connect a Telegram group
Generate an invite link in the dashboard, drop the bot into a group, and Feedbot starts listening. Replies in chat are tracked back to the original feedback.
Bring your own OpenAI or Anthropic key. Each message becomes a typed feedback row (bug · feature · question) with severity, tags, and a one-line summary. Monthly budgets cap surprises.
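To make the row shape concrete, here is a hypothetical sketch; the endpoint, token, and exact field names are assumptions, not a documented API. Only type, severity, tags, and summary come from the description above.

# Hypothetical sketch: pulling classified rows over HTTP
# (endpoint, token, and field names are illustrative)
curl -s https://feedbot.example.com/api/projects/demo/feedback \
  -H "Authorization: Bearer $FEEDBOT_TOKEN"
# One row might look like:
# {"type": "bug", "severity": "high", "tags": ["telegram", "login"],
#  "summary": "Invite link expires before new members can join"}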
Every Feedbot install ships an MCP server at /mcp/. Drop the URL into Claude Code and ask "what bugs are open in project X?" The model reads, filters, and triages from your IDE.
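As a sketch, registering the endpoint with a recent Claude Code CLI might look like this; the hostname is a placeholder, and your install may also require an auth header.

# Register the Feedbot MCP endpoint with Claude Code (hostname is a placeholder)
claude mcp add --transport http feedbot https://feedback.example.com/mcp/
# Then ask Claude Code: "what bugs are open in project X?"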
A single curl … | sh bootstraps a full Docker stack: API, dashboard, bot, Postgres, and TLS termination. No YAML, no .env editing, no Kubernetes.
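A minimal sketch of the flow; the container name filter is an assumption about how the installer names its services.

# One-line install
curl -fsSL https://get.feedbot.dev | sh
# Verify the stack came up (name filter assumes a "feedbot" prefix)
docker ps --filter "name=feedbot"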
SMTP, Telegram, domain, HTTPS: all configured in the dashboard. Save once, and the orchestrator restarts only what changed.
Replies in Telegram are tracked back to the original issue. The bot acknowledges messages with reactions and posts updates when the team marks an item done.
Streamable-HTTP MCP server at /mcp/. Works with Claude Code, Claude Desktop, Cursor, Windsurf, and any other client that speaks MCP.
LLM keys stay yours, Fernet-encrypted at rest. Per-project budget caps prevent runaway bills.
Development happens in the open: public roadmap, ADRs, and a changelog. MIT licensed forever. No "open core" pivot, no feature gating between free and paid; the cloud is ops-as-a-service.
Same product, two delivery modes. Migrate either direction with a Postgres dump.
Self-hosted: MIT licensed. Bring your own Docker host.
Cloud: managed by us. EU data residency. Pricing TBD at GA.
Yes — forever, MIT. We don't believe in feature-stripping the OSS to push people to a paid tier. The cloud is ops-as-a-service, not a different product.
Anything that runs Docker. We test on Coolify, Dokploy, plain Docker behind nginx-proxy-manager, and Hetzner / DigitalOcean VMs. See the self-host guide.
Yes. Postgres dump, then restore. Same schema on both sides. A detailed migration guide ships with cloud GA.
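A minimal sketch of that migration, assuming direct Postgres access on both sides; connection strings and database name are placeholders.

# Dump the source database in custom format
pg_dump --format=custom --file=feedbot.dump "postgres://user:pass@old-host:5432/feedbot"
# Restore into the target instance
pg_restore --clean --no-owner --dbname="postgres://user:pass@new-host:5432/feedbot" feedbot.dump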
You bring your own OpenAI or Anthropic key. Costs are tracked per-project in llm_calls; monthly budget caps stop calls when you hit your limit.
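As an illustrative check of that tracking, a monthly spend query might look like this; the column names are assumptions, only the llm_calls table is named above.

# Hypothetical query; column names are assumptions
psql "$DATABASE_URL" -c "
  SELECT project_id,
         date_trunc('month', created_at) AS month,
         sum(cost_usd)                   AS spend
  FROM llm_calls
  GROUP BY 1, 2
  ORDER BY month DESC;"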
Streamable-HTTP MCP at /mcp/. Drop the URL into Claude Code, Cursor, or Windsurf and the model can list, read, filter, and update feedback rows on your behalf, with the same auth as the API.
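For Cursor or Windsurf, which read MCP servers from a JSON config file, a sketch might look like this; the file path follows Cursor's convention, the URL is a placeholder, and a token header may be needed depending on your auth setup.

# Sketch of a project-level Cursor MCP config; URL is a placeholder
mkdir -p .cursor
cat > .cursor/mcp.json <<'EOF'
{
  "mcpServers": {
    "feedbot": { "url": "https://feedback.example.com/mcp/" }
  }
}
EOF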
Not at launch; we're focused on shipping the product. If you need a signed DPA, EU residency, or a custom SLA, email us at [email protected].
30 seconds to spin up. No card required.