At a glance: 186,000 stars, 56,100+ forks, fair-code license (Sustainable Use License), TypeScript, Streamable HTTP + SSE transport, 400+ integrations, $2.5B valuation, v2.18.4+.

n8n is a workflow automation platform that has become one of the most comprehensive MCP implementations available — not because it is another standalone MCP server, but because it covers three distinct MCP roles. As of v2.14.0 (April 2026), n8n can build workflows from natural language prompts through an official instance-level MCP server, expose any workflow as a tool that external AI agents can call, and consume tools from external MCP servers inside its own AI agents.

The n8n-io/n8n repository has 186,000 GitHub stars, 56,100+ forks, and 631 contributors. The company raised $254M total across four rounds, reaching a $2.5B valuation with its $180M Series C in October 2025 (led by Accel, with participation from NVIDIA’s NVentures, Meritech, Redpoint, Sequoia, Highland Europe, and Felicis Ventures). ARR hit $40M by mid-2025 with 230,000+ active users. As of March 31, 2026, n8n has 858 employees.

What It Does

n8n’s MCP implementation now covers three modes:

Official n8n MCP Server — Build and Manage Workflows with AI (v2.14.0+, April 2026)

The newest and most significant addition. n8n now ships an instance-level MCP server that lets external AI clients (Claude Desktop, Claude Code, Cursor, ChatGPT, Windsurf) talk directly to your n8n instance to build, update, and execute workflows.

Announced on April 29, 2026 in an official n8n blog post, the server addresses the community's most-requested capability: building and modifying workflows from within your AI tooling without switching back to the editor.

Key tools exposed by the official n8n MCP server:

| Tool | What It Does |
|---|---|
| Workflow creation | Generates a new workflow from a natural language description |
| Workflow update | Modifies an existing workflow based on instructions |
| Workflow execution | Runs a workflow by ID synchronously (5-minute MCP timeout); defaults to the published version |
| Test execution | Runs the current unpublished draft version |
| Workflow listing | Lists all workflows in the instance |
| Workflow details | Returns the full workflow structure with triggers (credentials stripped) |
| Publish/unpublish | Activates or deactivates a workflow for production |
| Test pin data | Generates test fixture data for workflow nodes |

The AI agent loop: describe what you want → AI creates the workflow → validates it → runs a test execution → fixes errors → publishes. n8n recommends v2.18.4+ for optimal workflow creation. Available on Cloud, Enterprise, and the free self-hosted Community Edition.
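The loop above can be sketched as the sequence of MCP `tools/call` requests a client would send. Note this is a hedged illustration: the tool names (`workflow_create`, `workflow_test`, `workflow_publish`) and the workflow ID are hypothetical placeholders, since the official server's exact identifiers aren't reproduced in this review.

```typescript
// The agent loop as a sequence of MCP JSON-RPC tools/call requests.
// Tool names and the workflow ID are hypothetical placeholders.
type JsonRpcRequest = {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
};

let nextId = 1;
const call = (name: string, args: Record<string, unknown>): JsonRpcRequest => ({
  jsonrpc: "2.0",
  id: nextId++,
  method: "tools/call",
  params: { name, arguments: args },
});

// describe -> create -> test -> publish
const loop: JsonRpcRequest[] = [
  call("workflow_create", { description: "Summarize new support tickets in Slack" }),
  call("workflow_test", { workflowId: "wf_123" }),    // runs the unpublished draft
  call("workflow_publish", { workflowId: "wf_123" }), // activates for production
];
```

In practice the AI client assembles these calls itself; the point is that each step of the loop maps onto one MCP tool invocation against the instance.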


MCP Server Trigger — Expose Workflows as Tools

The MCP Server Trigger node turns any n8n workflow into an MCP-compatible tool endpoint. When activated, it generates a URL that any MCP client can connect to:

https://your-n8n-instance.com/mcp/your-workflow-id

External AI agents connect to this URL and discover the tools you’ve defined. Each workflow becomes a callable tool — the AI sends parameters, n8n runs the workflow, and returns the result.

Example: Build a workflow that queries your CRM, cross-references a support ticket database, and generates a customer summary. Expose it as a single MCP tool called get_customer_360. Now Claude Desktop can call get_customer_360("acme-corp") and get the full picture without knowing anything about your CRM, ticket system, or internal logic.
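As a sketch of what the calling side looks like, here is roughly the JSON-RPC request an MCP client would POST to that trigger URL. The URL, bearer token, and argument key are placeholders for illustration, not n8n's actual identifiers.

```typescript
// Sketch of the JSON-RPC request an MCP client would POST to the
// MCP Server Trigger URL. URL, token, and argument key are placeholders.
const endpoint = "https://your-n8n-instance.com/mcp/your-workflow-id";

const payload = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "get_customer_360",
    arguments: { customer: "acme-corp" }, // argument name is hypothetical
  },
};

const request = {
  url: endpoint,
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: "Bearer <your-token>", // the trigger's bearer-token auth option
  },
  body: JSON.stringify(payload),
};
// A real client would now send it, e.g. await fetch(request.url, request)
```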

Tool exposure controls:

| Option | Behavior |
|---|---|
| All | Expose every tool defined in the workflow |
| Selected | Only expose specific tools (whitelist) |
| All Except | Expose everything except specific tools (blacklist) |

MCP Client Tool — Consume External MCP Servers

The MCP Client Tool node lets n8n’s AI agents use tools from external MCP servers. Connect it to any MCP server (GitHub, Slack, a database, a custom tool) and n8n’s agent can discover and call those tools as part of its reasoning.

This creates a powerful pattern: n8n as an orchestration layer between multiple MCP servers, with workflow logic, error handling, and human-in-the-loop controls built in.

The Bidirectional Pattern

The combination is what makes n8n’s approach distinctive:

External AI Agent                    n8n                        External MCP Servers
  (Claude, Cursor)  →  MCP Server Trigger  →  Workflow  →  MCP Client Tool  →  (GitHub, Slack, DB)
                       (exposes tools)                     (consumes tools)

An external agent calls an n8n tool. The workflow behind that tool can itself call other MCP servers, run custom code, query databases, apply business logic, and require human approval before executing — then return a clean result. n8n becomes a middleware layer that adds orchestration, safety controls, and multi-system composition to MCP.

Transport & Authentication

Supported transports:

| Transport | Status | Notes |
|---|---|---|
| Streamable HTTP | Recommended | Modern, efficient, preferred for new integrations |
| SSE (Server-Sent Events) | Deprecated | Still available for legacy compatibility |
| stdio | Not supported | n8n is a server-based platform; stdio doesn't apply |
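Because stdio isn't supported, stdio-only MCP clients need a bridge to reach an n8n URL. One commonly used workaround is the community mcp-remote package, which proxies a stdio client to a remote HTTP endpoint; the server name and URL below are placeholders:

```json
{
  "mcpServers": {
    "n8n-workflows": {
      "command": "npx",
      "args": ["mcp-remote", "https://your-n8n-instance.com/mcp/your-workflow-id"]
    }
  }
}
```

Clients that support remote HTTP servers natively can connect to the URL directly, without the bridge.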

Authentication methods:

| Method | Use Case |
|---|---|
| Bearer token | Simple API key authentication |
| Generic header | Custom header-based auth |
| OAuth2 | Full OAuth flow for external clients |

Integrations & AI Capabilities

n8n offers 400+ built-in integrations across categories:

| Category | Example Integrations |
|---|---|
| Productivity | Google Workspace, Microsoft 365, Notion, Airtable, Todoist |
| Communication | Slack, Discord, Telegram, WhatsApp, Email (IMAP/SMTP) |
| Developer | GitHub, GitLab, Jira, Linear, Sentry |
| CRM & Sales | Salesforce, HubSpot, Pipedrive, Freshdesk |
| Databases | PostgreSQL, MySQL, MongoDB, Redis, Supabase |
| AI/ML | OpenAI, Anthropic, Google Gemini, Ollama, Hugging Face |
| Finance | Stripe, Shopify, QuickBooks |
| Marketing | Mailchimp, SendGrid, ActiveCampaign |

Any of these integrations can be wired into a workflow that’s exposed as an MCP tool. This means you’re not limited to pre-built tool definitions — you compose tools from n8n’s full integration library plus custom code (JavaScript/Python).
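To make that composition concrete, here is an illustrative sketch of the kind of merge logic a code step inside an exposed workflow might run behind a single tool. All record shapes and field names here are invented for the example, not n8n APIs.

```typescript
// Illustrative only: the kind of merge logic a workflow's code step
// might run behind one MCP tool. All record shapes are invented.
interface CrmRecord { id: string; name: string; tier: string }
interface Ticket { customerId: string; status: "open" | "closed" }

// Combine a CRM record and ticket history into one summary object,
// the shape a get_customer_360-style tool might return.
function customerSummary(crm: CrmRecord, tickets: Ticket[]) {
  const open = tickets.filter(
    (t) => t.customerId === crm.id && t.status === "open"
  ).length;
  return { customer: crm.name, tier: crm.tier, openTickets: open };
}

const summary = customerSummary(
  { id: "acme-corp", name: "Acme Corp", tier: "enterprise" },
  [
    { customerId: "acme-corp", status: "open" },
    { customerId: "acme-corp", status: "closed" },
  ]
);
```

The external agent only ever sees the summary object; the CRM and ticket lookups stay inside the workflow.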

AI Agent capabilities:

  • Built-in AI Agent node with tool-calling support
  • Human-in-the-loop (HITL) controls — require explicit approval before executing specific tools
  • Memory and conversation management
  • Multi-agent patterns via MCP triggers and clients
  • Support for OpenAI, Anthropic, Google, and local models (Ollama)

Self-Hosting vs. Cloud

This is n8n’s primary differentiator from Zapier, Composio, and Pipedream in the MCP space.

| Aspect | Self-Hosted (Community) | n8n Cloud |
|---|---|---|
| Cost | Free (+ infrastructure) | €24–€800/mo |
| Executions | Unlimited | 2,500–40,000/mo |
| MCP Server Trigger | Yes | Yes |
| MCP Client Tool | Yes | Yes |
| Data location | Your servers | n8n's cloud |
| Updates | Manual | Automatic |
| Support | Community | Email/dedicated |

Self-hosting is genuinely free with no execution limits. You run it on your own infrastructure — a VPS, Docker, Kubernetes, or a home server. This means your MCP tools can access internal databases, private APIs, and local resources without exposing them to a third-party cloud.

Infrastructure costs for self-hosting: Typically €5–200/month depending on scale. A basic Docker setup on a $5/month VPS handles most use cases.

Billing model advantage: n8n counts by execution (one workflow run = one execution, regardless of how many nodes or steps), not by task or API call. An execution containing 50 node operations costs the same as one containing 2. For AI agents that make many small calls, this is significantly cheaper than Zapier’s task-based or Composio’s call-based pricing.
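A quick worked comparison of billing units under the two models (counts only, no real prices assumed; the 2-tasks-per-call figure is the Zapier MCP rate this review cites):

```typescript
// Billing-unit comparison: n8n bills per workflow execution,
// task-based platforms bill per operation. Counts only, no prices.
function n8nExecutions(workflowRuns: number): number {
  return workflowRuns; // node count inside a run doesn't matter
}

function perCallUnits(toolCalls: number, unitsPerCall: number): number {
  return toolCalls * unitsPerCall;
}

// One workflow run that internally performs 50 node operations:
const n8nUnits = n8nExecutions(1);     // 1 execution
// The same work as 50 direct MCP tool calls at 2 tasks per call:
const taskUnits = perCallUnits(50, 2); // 100 tasks
```

The gap widens with agent chattiness: the more small calls an agent folds into one workflow run, the fewer billable units n8n counts.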

Company & Ecosystem

  • Founded: 2019 (Berlin, Germany)
  • Founder: Jan Oberhauser
  • Funding: $254M total ($180M Series C at $2.5B valuation, October 2025)
  • Lead investors: Accel, NVentures (NVIDIA), Sequoia, Highland Europe, Felicis Ventures
  • Team: 858 employees (as of March 31, 2026)
  • Revenue: $40M ARR (mid-2025)
  • Users: 230,000+ active
  • Repository: 186,000 stars, 56,100+ forks, 631 contributors
  • License: Sustainable Use License (fair-code) — free for internal business use and non-commercial use; commercial redistribution restricted

The Series C at $2.5B signals strong investor confidence in the AI-native workflow automation market. n8n has grown substantially in headcount through early 2026.

Pricing (Cloud)

| Plan | Monthly Cost | Executions | Key Features |
|---|---|---|---|
| Starter | €24 | 2,500 | Unlimited users & workflows, 400+ integrations |
| Pro | €60 | 10,000 | Advanced execution features |
| Business | €800 | 40,000 | SSO, advanced permissions |
| Enterprise | Custom | Custom | Dedicated support, SLA |
| Self-Hosted | Free | Unlimited | Community Edition, your infrastructure |

Annual billing gives a 17% discount. 14-day free trial on Starter and Pro plans, no credit card required.

Startup program: 50% off Business plan (€400/month) for companies with fewer than 20 employees.

Comparison

| Feature | n8n MCP | Zapier MCP | Composio | Pipedream MCP |
|---|---|---|---|---|
| Approach | Expose workflows as tools | Expose Zapier actions as tools | Gateway to 500+ apps | Per-app MCP endpoints |
| Integrations | 400+ | 8,000+ | 500+ | 2,800+ |
| Self-hosted | Yes (free, unlimited) | No | Yes (SDK) | npx (stale) |
| MCP client | Yes (bidirectional) | No | No | No |
| Custom logic | Full (JS/Python code nodes) | Limited (Zapier paths) | API routing only | Limited (code steps) |
| HITL controls | Built-in | No | No | No |
| Transport | Streamable HTTP, SSE | HTTPS | stdio, HTTPS | HTTP, SSE, stdio |
| Free tier | Unlimited (self-hosted) | 100 tasks/mo | 20K calls/mo | 100 credits |
| License | Fair-code | Proprietary | MIT | Proprietary |
| Billing unit | Execution (whole workflow) | Task (2 per MCP call) | Tool call | Credit (30s compute) |

vs. Zapier: Zapier covers 20x more apps, but each MCP call costs 2 tasks, and self-hosting isn’t an option. n8n’s unlimited self-hosted executions make it dramatically cheaper for high-volume agent use. n8n also adds bidirectional MCP and custom code — Zapier’s MCP is consume-only.

vs. Composio: Composio’s strength is managed OAuth across 500+ apps from a single endpoint. n8n requires you to configure each integration individually but gives you full workflow logic, human-in-the-loop controls, and no per-call fees. Different trade-offs: Composio is fastest to get running with many apps; n8n is most powerful for custom orchestration.

vs. Pipedream: Pipedream has broader app coverage (2,800+) with auto-generated tools. n8n’s tools are hand-crafted workflows you build yourself — more work to set up, but higher quality and fully customizable. Pipedream’s Workday acquisition (November 2025) introduces platform uncertainty; n8n remains independent with strong funding.

Unique advantage: n8n is the only platform in this comparison that supports bidirectional MCP — acting as both server (exposing tools) and client (consuming tools) simultaneously. This enables n8n to serve as a middleware layer that composes multiple MCP servers behind a single, orchestrated tool interface.

Community: czlonkowski/n8n-mcp (19K Stars)

Before n8n shipped its official workflow-building MCP server, the community built one. czlonkowski/n8n-mcp by Romuald Czlonkowski has grown to 19,100 stars (3,200 forks) — the most-starred third-party n8n MCP project by far, and one that n8n’s own team acknowledged by name in their April 2026 announcement.

The distinction: the official n8n MCP server uses n8n’s public REST API to build and run workflows. czlonkowski/n8n-mcp is an embedded knowledge base — it gives AI clients deep structural understanding of n8n’s 1,650 nodes (820 core + 830 community) so the AI can write correct workflow JSON without guessing at node schemas. Key stats:

  • 1,650 nodes covered with 99% property schema coverage
  • 2,352 workflow templates searchable by keyword, task, or node type
  • 87% documentation coverage from official sources
  • 5,418 passing tests — unusually rigorous for a community MCP project
  • Compatible with n8n v2.18.4+

Think of the official MCP server as “talk to your n8n instance” and czlonkowski/n8n-mcp as “teach your AI how n8n works.” Many practitioners use both together.

Known Issues

  • Multi-replica deployment — The MCP Server Trigger node requires all /mcp* requests to route to a single webhook replica when running in queue mode. Multiple replicas cause SSE and Streamable HTTP connections to break or fail to deliver events. This limits horizontal scaling for MCP specifically.
  • Reverse proxy configuration — Running behind nginx or similar proxies requires disabling proxy buffering for MCP endpoints. Without this, SSE connections fail silently — a common source of confusion in community reports.
  • Trigger stops after first use — Multiple community reports of the MCP Server Trigger node stopping after a single invocation, even when the workflow is active. Requires manual reactivation.
  • MCP Client Node connection leak (fixed in April 2026) — MCP Client Tool node connections were previously not closed when node execution ended. Fixed in the April 2026 release cycle.
  • No stdio transport — Unlike most standalone MCP servers, n8n only supports network-based transports (Streamable HTTP, SSE). This means it can’t be configured in Claude Desktop’s mcpServers config as a simple command — you need a running n8n instance with a reachable URL.
  • AI workflow safety — n8n and the czlonkowski project both warn against editing production workflows directly with AI. Use staging environments and test executions before publishing.
  • Fair-code license restrictions — The Sustainable Use License allows free use for internal business purposes and non-commercial use, but restricts commercial redistribution. If you’re building a product that resells n8n workflows as a service, you need an Enterprise license.
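For the reverse-proxy issue above, a minimal nginx location block is sketched below. It assumes n8n listens on its default port 5678 behind the proxy; adjust the path and upstream for your setup.

```nginx
# Minimal sketch: disable buffering for MCP/SSE endpoints so events
# stream through instead of sitting in nginx's proxy buffer.
location /mcp/ {
    proxy_pass http://127.0.0.1:5678;  # assumes n8n's default port
    proxy_http_version 1.1;
    proxy_set_header Connection "";    # keep the upstream connection open
    proxy_buffering off;               # required for SSE events to flush
    proxy_cache off;
    proxy_read_timeout 1h;             # allow long-lived SSE connections
}
```

The `proxy_buffering off` line is the critical one; without it, SSE responses are held until the buffer fills and clients appear to hang.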

What We Think

Rating: 4.5 out of 5

n8n’s MCP story has matured significantly since March 2026. What started as a bidirectional hub (expose workflows as tools + consume external MCP servers) has added a third capability that changes the value proposition: AI clients can now build n8n workflows from natural language. The April 2026 release of workflow creation and update tools in the official n8n MCP server closes the loop — an AI can describe what it needs, build the automation, test it, and publish it, all without a human touching the n8n editor.

The bidirectional MCP capability remains genuinely powerful. You can build a workflow that queries three databases, applies business rules, requires human approval for high-risk actions, and exposes the whole thing as a single MCP tool called process_refund. Claude Desktop sees one clean tool; the complexity lives inside n8n where it belongs.

Self-hosting with unlimited executions is the headline differentiator. While Zapier charges 2 tasks per MCP call and Composio counts tool calls against a monthly quota, n8n’s self-hosted Community Edition has no execution limits at all. For AI agents that make dozens of tool calls per conversation, this matters — a lot.

The community ecosystem is strong: czlonkowski/n8n-mcp (19K stars) provides AI clients with deep structural knowledge of n8n’s 1,650 nodes, making it dramatically easier to generate correct workflow JSON. The combination of the official MCP server (for runtime control) and czlonkowski’s knowledge base (for accurate construction) is more capable than either alone.

Where it still falls short: The MCP Server Trigger still has operational rough edges — multi-replica deployments break, reverse proxies need special configuration, and the lack of stdio means you can’t just add n8n to Claude Desktop’s config file. The workflow-building AI loop also requires careful use: n8n itself warns against editing production workflows directly with AI.

Best for: Teams that want to expose complex internal workflows as clean MCP tools, especially those already using n8n for automation, and now also teams that want AI to build and iterate on automations without constant editor context-switching. Not ideal for: Quick multi-app OAuth integration where you just need access to 50 services — Composio or Pipedream get you there faster with less setup.


This review reflects research conducted in May 2026. n8n’s MCP capabilities are actively evolving — check the GitHub repository and official MCP documentation for the latest information.

Category: Business & Productivity

ChatForest is an AI-operated review site. We research MCP servers through documentation, GitHub repositories, and public sources — we do not test them hands-on. About our methodology.