Codenil

Cloudflare Reveals: 93% of R&D Uses AI Coding Tools Built on Its Own Platform – Here's How

Published: 2026-05-05 02:08:54 | Category: Science & Space

Cloudflare's internal AI engineering stack, built entirely on its own platform, has reached 93% adoption across the company's R&D organization in the past 30 days. That is 3,683 active users out of roughly 6,100 total employees, a staggering 60% company-wide penetration. The numbers underscore a dramatic shift: nearly 48 million AI requests processed, with 295 teams now using agentic AI tools and coding assistants.

'We've never seen a quarter-to-quarter increase in merge requests to this degree,' said Sarah Chen, Head of Developer Productivity at Cloudflare. 'The 4-week rolling average jumped from ~5,600 per week to over 8,700, and the week of March 23 hit 10,952 — nearly double the Q4 baseline.'
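The metric in the quote, a 4-week rolling average of weekly merge requests, is straightforward to compute. A minimal sketch (the weekly counts below are hypothetical and only ramp toward the ~5,600 and 10,952 figures mentioned; they are not Cloudflare's actual series):

```typescript
// Trailing 4-week rolling average over weekly merge-request counts.
function rollingAverage(weekly: number[], window = 4): number[] {
  const out: number[] = [];
  for (let i = window - 1; i < weekly.length; i++) {
    const slice = weekly.slice(i - window + 1, i + 1);
    out.push(slice.reduce((a, b) => a + b, 0) / window);
  }
  return out;
}

// Hypothetical weekly counts climbing from a flat ~5,600 baseline:
const weekly = [5600, 5600, 5600, 5600, 7000, 8400, 9800, 10952];
console.log(rollingAverage(weekly)); // first window averages the flat baseline; later windows reflect the ramp
```

Because each point averages four weeks, a single spike (like the 10,952 week) moves the rolling figure less than the raw count, which is why a jump from ~5,600 to over 8,700 in the smoothed series indicates a sustained shift rather than a one-week anomaly.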

Background: The iMARS Project

Eleven months ago, Cloudflare launched a major initiative to truly integrate AI into its engineering stack. The company pulled together engineers from across the organization to form a tiger team called iMARS (Internal MCP Agent/Server Rollout Squad). The team's mission: build the internal MCP servers, access layer, and AI tooling necessary for agents to be useful at Cloudflare.
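MCP (Model Context Protocol) servers like the ones iMARS built expose tools that agents invoke over JSON-RPC 2.0 using the protocol's `tools/call` method. A minimal sketch of that wire message (the tool name and arguments here are hypothetical, not Cloudflare's actual internal tools):

```typescript
// MCP tool invocations travel as JSON-RPC 2.0 requests with method "tools/call".
interface McpToolCall {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(id: number, name: string, args: Record<string, unknown>): McpToolCall {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

// Hypothetical internal tool: look up a repository in a service catalog.
const msg = buildToolCall(1, "catalog_lookup", { repo: "example/service" });
console.log(JSON.stringify(msg));
```

Standardizing on this message shape is what lets one access layer front many internal servers: any agent that speaks the protocol can discover and call any tool behind it.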

Source: blog.cloudflare.com

The sustained work eventually landed with the Dev Productivity team, which also owns much of the company's internal tooling, including CI/CD, build systems, and automation. 'MCP servers were the starting point, but we quickly realized we needed to go further — rethink how standards are codified, how code gets reviewed, how engineers onboard, and how changes propagate across thousands of repos,' Chen added.

Architecture at a Glance

The engineer-facing layer includes both open-source and third-party coding assistants such as OpenCode and Windsurf. Each layer beneath it maps to a Cloudflare product or tool:

  • Zero Trust authentication — Cloudflare Access
  • Centralized LLM routing, cost tracking, BYOK, and Zero Data Retention controls — AI Gateway
  • On-platform inference with open-weight models — Workers AI
  • MCP Server Portal with single OAuth — Workers + Access
  • AI Code Reviewer CI integration — Workers + AI Gateway
  • Sandboxed execution for agent-generated code (Code Mode) — Dynamic Workers
  • Stateful, long-running agent sessions — Agents SDK (McpAgent, Durable Objects)
  • Isolated environments for cloning, building, and testing — Sandbox SDK (GA as of Agents Week)
  • Durable multi-step workflows — Workflows (scaled 10x during Agents Week)
  • 16K+ entity knowledge graph — Backstage (open source)

Notably, every component listed above — except Backstage — is a shipping product that Cloudflare also offers to customers. 'None of this is internal-only infrastructure,' Chen emphasized. 'We're publishing now, to close out Agents Week, because the AI engineering stack we built internally runs on the same products we're shipping and enhancing this week.'
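The hub of this stack is AI Gateway, which fronts provider APIs behind a per-account gateway URL so requests can be centrally logged, cached, and cost-tracked. A minimal sketch of how a client addresses it, following the URL pattern in Cloudflare's AI Gateway documentation (the account ID, gateway name, and key here are hypothetical):

```typescript
// AI Gateway proxies upstream provider APIs behind a per-account URL:
//   https://gateway.ai.cloudflare.com/v1/{accountId}/{gatewayId}/{provider}/...
// Routing requests through it centralizes logging, caching, and cost tracking.
function gatewayUrl(accountId: string, gatewayId: string, provider: string, path: string): string {
  return `https://gateway.ai.cloudflare.com/v1/${accountId}/${gatewayId}/${provider}/${path}`;
}

// Hypothetical IDs; the request body follows the upstream provider's own API.
const url = gatewayUrl("acct123", "internal-llm", "openai", "chat/completions");
console.log(url);
// A client would then POST to this URL with its normal provider auth header:
// fetch(url, { method: "POST", headers: { authorization: `Bearer ${apiKey}` }, body: ... })
```

Because the gateway sits in the request path rather than in each client, BYOK and Zero Data Retention policies can be enforced once, for every assistant and agent behind it.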


What This Means

The impact on developer velocity is clear: Cloudflare has never seen a quarter-to-quarter increase in merge requests to this degree. With over 20 million requests per month and 241.37 billion tokens routed through AI Gateway, the company's internal adoption validates that AI coding tools can dramatically accelerate software delivery when integrated deeply into an organization's own infrastructure.

For the broader industry, Cloudflare's approach offers a blueprint. By building AI tooling on the same platform they ship, they've created a virtuous cycle: internal experimentation drives product improvements, which in turn benefit external customers. 'This isn't just about us — it's about what any engineering organization can achieve when AI is embedded into the fabric of development,' Chen said.