3 Servers · 107 Tools · Open Source

MCP Service

Open-source MCP servers by the Flaremity team, deployed on Cloudflare Workers. Connect your AI assistant to Atlassian, GitLab, and Datadog through the Model Context Protocol.

41 Atlassian · 44 GitLab · 22 Datadog

MCP Servers

Three production-ready MCP servers, each deployed as a stateless Cloudflare Worker. Zero data stored. Instant global access.

41 tools

MCP Atlassian

27 Jira + 14 Confluence

Connect your AI to Jira and Confluence. Manage issues, search pages, transition workflows, and more.

Issues · Search · Sprints · Pages · Spaces · Comments
Dual (Token + OAuth)
44 tools

MCP GitLab

25 Read + 19 Write

Full GitLab integration. Projects, merge requests, pipelines, issues, and CI/CD job logs.

Projects · MRs · Pipelines · Issues · Labels · Groups
PAT (Personal Access Token)
22 tools

MCP Datadog

Read-only · 11 domains

Monitor your infrastructure with AI. Dashboards, logs, metrics, incidents, APM traces, and more.

Monitors · Dashboards · Logs · Metrics · Incidents · Hosts
Dual-key (API Key + App Key)

How It Works

MCP servers on Cloudflare Workers — the simplest way to give AI access to your tools.

What is MCP?

The Model Context Protocol (MCP) is an open standard that lets AI assistants connect to external tools and data sources. Think of it as USB-C for AI — a universal interface between AI models and the services they need.

Instead of building custom integrations for each AI platform, MCP provides a single protocol that works with Claude, ChatGPT, Cursor, and any MCP-compatible client.
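Connecting is mostly a matter of pointing your client at a server's URL. For a programmatic connection, the SDK's Streamable HTTP transport looks roughly like the sketch below; the endpoint URL is a placeholder, and authentication details (API token, OAuth, PAT, or Datadog's dual keys) vary by server and are omitted here.

```ts
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Placeholder endpoint; each server exposes its own Streamable HTTP route.
const transport = new StreamableHTTPClientTransport(
  new URL("https://your-mcp-server.example.workers.dev/mcp")
);

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Discover the tools the server exposes, then call one by name.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
```

Most MCP-compatible clients (Claude, ChatGPT, Cursor) handle this handshake for you once the server URL is configured.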

Why Cloudflare Workers?

Every MCP server runs as a stateless Cloudflare Worker.

🌍
Edge-First
Deployed to 300+ data centers. Your MCP server runs close to your users, worldwide.
📦
Stateless
No database, no sessions, no data retention. Pure proxy between AI and APIs.
⚡
Instant Deploy
Push to deploy in seconds. No containers, no VMs, no infrastructure to manage.
🚀
Zero Cold Start
Workers start in under 1ms. No boot time, no warming, always ready.

Architecture

Every request flows through the same simple pipeline.

Client (AI assistant: Claude, ChatGPT, Cursor) → MCP Server (Cloudflare Worker, stateless proxy) → Service (external API: Jira, GitLab, Datadog)
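As a concrete illustration of that pipeline, the stateless proxy pattern reduces to a single fetch handler. This is a simplified sketch rather than the actual server code: the UPSTREAM_BASE_URL binding and the pass-through routing are assumptions made for the example, and the real servers speak MCP via the SDK instead of raw path forwarding.

```ts
export interface Env {
  // Illustrative binding: the external REST API being proxied (Jira, GitLab, Datadog).
  UPSTREAM_BASE_URL: string;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Client → MCP Server: the AI assistant's request arrives at the edge.
    const incoming = new URL(request.url);

    // MCP Server → Service: forward the call with the caller's own credentials.
    // No database, no sessions, nothing is written anywhere.
    const target = new URL(incoming.pathname + incoming.search, env.UPSTREAM_BASE_URL);
    const upstream = await fetch(new Request(target.toString(), request));

    // Service → Client: stream the response straight back.
    return new Response(upstream.body, upstream);
  },
};
```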

Tech Stack

Every server shares the same battle-tested stack.

📝
TypeScript
Strict mode, zero runtime dependencies (except MCP SDK). Full type safety from schema to response.
☁️
Cloudflare Workers
V8 isolate runtime. 300+ edge locations, sub-millisecond cold starts, 100k+ requests/day free.
🔗
MCP SDK
Official @modelcontextprotocol/sdk with Streamable HTTP transport via Cloudflare Agents.
📦
Bun
Fast package manager and test runner. Replaces npm for installs and vitest for testing.
🛡️
Zod
Runtime schema validation for all tool inputs. Type-safe parameters with zero boilerplate (see the example below).
🔄
Stateless Design
No database, no sessions, no KV storage. Every request is independent. Zero data retention.
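To show how the MCP SDK and Zod entries above fit together, here is a minimal, hypothetical tool registration. The tool name, parameters, and placeholder response are invented for illustration; none of the 107 real tools is reproduced here.

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({ name: "example-server", version: "1.0.0" });

// Zod validates arguments at runtime and supplies the TypeScript types.
server.tool(
  "search_example",
  "Search an external service and return matching items",
  {
    query: z.string().describe("Free-text search query"),
    limit: z.number().int().min(1).max(50).default(10),
  },
  async ({ query, limit }) => {
    // A real tool would call the upstream API (Jira, GitLab, Datadog) here.
    return {
      content: [{ type: "text", text: `Would search for "${query}" (limit ${limit})` }],
    };
  }
);
```

The same pattern repeats across all three servers: one Zod schema per tool, one handler, no shared state.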
Powered by Cloudflare Workers
300+ data centers. Zero cold starts. Every MCP server runs at the edge.