Redis has three official MCP servers — and they’re all doing different things. Part of our Databases MCP category.

The official mcp-redis (452 stars) is a comprehensive data operations server covering every Redis data structure plus vector search. The Agent Memory Server (232 stars) is a specialized semantic memory layer for AI agents. And mcp-redis-cloud manages Redis Cloud infrastructure — subscriptions, databases, and multi-cloud deployments.

What’s New Since March 2026

Redis Agent Skills (Feb 3, 2026) — Redis launched redis/agent-skills, a growing collection of pre-built skills for AI coding agents. One command injects opinionated, up-to-date Redis knowledge into agents like Claude Code, Cursor, Codex, Copilot, and Augment Code when it’s relevant. Skills cover caching, rate limiting, session management, vector search, semantic caching, agent memory, pub/sub, and streams — with production-aware defaults and anti-pattern guardrails. This isn’t the MCP server itself, but it’s Redis’s answer to agents generating outdated or naive Redis code.

Gemini CLI Extension — The Redis MCP Server is now available in the Gemini CLI extensions gallery. Developers using Google’s terminal AI agent can install Redis access with one command — alongside GitHub, Dynatrace, and Google Cloud extensions.

mcp-redis new tools — Recent development added CLI support for Redis URI connection strings, scan_keys() and scan_all_keys() tools for iterative key discovery, and full Redis Cluster mode support. The main server remains on v0.5.0 (March 2025) with no formal new release, but the Streamable HTTP transport tracked in Issue #45 remains in progress.

agent-memory-server growth — Stars grew from 207 → 232 (+12%), forks from 42 → 48. Documentation was updated April 7, 2026. Active development continues.

That’s unusual. Most database vendors ship one MCP server and call it done. Redis shipped three, each solving a distinct problem. The question is whether any of them are good enough to use.

redis/mcp-redis — The Main Server

| Detail | Info |
| --- | --- |
| Repository | redis/mcp-redis (452 stars, 90+ forks) |
| Language | Python |
| Transport | stdio (Streamable HTTP planned) |
| License | MIT |
| Latest release | v0.5.0 (March 2025) |

The official Redis MCP server covers all Redis data structures through 11 tool modules:

Strings — set and get with optional expiration. The basics.

Hashes — Field-value pair operations, including vector embedding storage. This is where Redis’s role as a vector database starts showing up.

Lists — Append, pop, and LREM (added in v0.5.0). Standard list operations.

Sets — Add, remove, list members, intersection. Useful for tag management and membership queries.

Sorted Sets — Score-based ordering for leaderboards, priority queues, and ranked data.

JSON — Store, retrieve, and manipulate JSON documents with path-based access via RedisJSON. This is where Redis stops being “just a cache” and starts being a document store.

Streams — Add, read, and delete with consumer group support. Event sourcing and message queue patterns.

Pub/Sub — Publish and subscribe messaging. Real-time event distribution.

Query Engine — Vector indexing, vector search, and hybrid search (new in v0.5.0). This is the headline feature — Redis as a vector database for RAG workflows.

Server Management — Database info and server status.

Documentation Search — Natural language queries against Redis documentation via HTTP API.

That’s 25+ tools across 11 modules. For comparison, the MongoDB MCP server has 37+ tools, and Neon has 20. Redis is in the upper tier.

What Works Well

Full data structure coverage. Unlike most Redis MCP servers that only handle strings (set/get/delete), the official server covers every major Redis data type. An agent can work with hashes, sorted sets, streams, and JSON documents — not just key-value pairs. This matters because Redis in production is rarely used as a simple key-value store.

Vector search and RAG. The query engine module adds vector indexing and search, turning Redis into a vector database accessible through MCP. For AI agents that need to store and retrieve embeddings — document search, semantic memory, recommendation systems — this is built-in rather than requiring a separate Pinecone or Qdrant server.
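Under the hood, Redis vector indexes over hashes expect embeddings as raw little-endian float32 byte strings, which is the form an agent's stored vectors ultimately take in a hash field. A minimal sketch of that serialization (the helper names are our own, not mcp-redis tool names):

```python
import struct

def pack_embedding(vec):
    """Serialize a float vector to the little-endian float32 byte
    string that Redis hash-based vector indexes expect."""
    return struct.pack(f"<{len(vec)}f", *vec)

def unpack_embedding(blob):
    """Inverse: recover the float32 values from the byte string."""
    n = len(blob) // 4  # 4 bytes per float32
    return list(struct.unpack(f"<{n}f", blob))

# A 4-dimensional toy embedding round-trips through the byte form
# an agent would write into a hash field alongside its metadata.
vec = [0.25, -1.5, 3.0, 0.0]
blob = pack_embedding(vec)
print(len(blob))  # -> 16 (4 floats x 4 bytes each)
print(unpack_embedding(blob) == vec)  # -> True (values are float32-exact)
```

Real embeddings are hundreds or thousands of dimensions, but the encoding is the same; the MCP server hides this behind its hash and query-engine tools.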

Enterprise authentication. EntraID authentication for Azure Managed Redis and Redis ACL support mean this server can work in enterprise environments where authentication is non-negotiable. Redis Cluster mode is also supported.

Reconnection handling. Exponential backoff with 5 retries (1-second initial delay, 30-second max). When Redis goes down briefly, the server recovers automatically rather than requiring a restart.
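The retry schedule described above can be sketched as a small generator. The doubling factor is an assumption on our part, since the documented parameters are only the initial delay, the cap, and the retry count:

```python
def backoff_delays(retries=5, initial=1.0, cap=30.0, factor=2.0):
    """Yield the wait time before each reconnection attempt.

    Mirrors the schedule described for mcp-redis: exponential
    backoff starting at 1 s, capped at 30 s, up to 5 retries.
    The doubling factor is an assumption; the server's exact
    growth curve may differ.
    """
    delay = initial
    for _ in range(retries):
        yield min(delay, cap)
        delay *= factor

# With the defaults, all five waits stay under the 30-second cap.
print(list(backoff_delays()))  # -> [1.0, 2.0, 4.0, 8.0, 16.0]
```

With more retries the cap kicks in: a seventh attempt would wait 30 s rather than 64 s.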

Docker deployment. Available as mcp/redis on Docker Hub, or via uvx from PyPI, or from source. Multiple deployment paths for different workflows.

What Doesn’t Work Well

stdio only. No HTTP transport yet. Issue #45 tracks Streamable HTTP support. In an ecosystem where remote MCP servers are increasingly common, stdio limits this server to local machine usage. It’s been “planned” since the original review — still not shipped as of May 2026.

No SSH tunnel support. Issue #31 requests SSH tunnel connections for reaching Redis instances behind firewalls. Without this, connecting to production Redis from a local MCP client requires external tunneling.

No connection validation. Issue #56 reports that the server doesn’t check if the Redis server is available before proceeding. It can start successfully even when Redis isn’t running, then fail on the first tool call.

Slow release cadence. v0.5.0 shipped in March 2025 — over a year ago. New tools (scan_keys, Cluster support) have appeared in the codebase but no formal v0.6 release has shipped. Development continues but delivery is sporadic.

Redis Agent Memory Server

| Detail | Info |
| --- | --- |
| Repository | redis/agent-memory-server (232 stars, 48 forks) |
| Language | Python (77.8%) |
| Transport | stdio + SSE |
| License | Apache 2.0 |
| Open issues | 19+ |

This is not a general Redis operations server. It’s a specialized memory layer for AI agents that uses Redis as its backend. The distinction matters — you’d use this alongside mcp-redis, not instead of it.

The server implements a two-tier memory architecture:

Working memory — Session-specific conversation state. Short-lived, scoped to a single interaction. Think of it as the agent’s scratchpad.

Long-term memory — Persistent memories that survive across sessions. Stored as vector embeddings with metadata (user, session, namespace, topics, entities, timestamps). Searchable via semantic similarity.

Seven tools handle the lifecycle:

  • search_long_term_memory — Semantic search with filtering by user, session, namespace, topics, entities, or time range
  • create_long_term_memories — Store new persistent memories
  • get_long_term_memory / edit_long_term_memory / delete_long_term_memories — CRUD operations
  • memory_prompt — Generate AI prompts enriched with relevant memory context
  • set_working_memory — Manage the session scratchpad

The server automatically promotes important working memory to long-term storage using LLM-powered topic extraction, entity recognition, and conversation summarization. Multi-provider LLM support via LiteLLM means it works with OpenAI, Anthropic, AWS Bedrock, Ollama, Azure, and Gemini.
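A toy sketch of the two-tier model: a hard-coded promotion flag stands in for the server's LLM-judged importance, and hand-supplied 2-D vectors stand in for real embeddings. All names here are illustrative, not the server's actual API:

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

class ToyMemory:
    """Two tiers: a session-scoped scratchpad (working memory) and a
    persistent, embedding-indexed store (long-term memory)."""

    def __init__(self):
        self.working = []    # session scratchpad, short-lived
        self.long_term = []  # (text, embedding) pairs, searchable

    def remember(self, text, embedding, promote=False):
        self.working.append(text)
        if promote:  # stand-in for LLM-driven importance extraction
            self.long_term.append((text, embedding))

    def search(self, query_embedding, k=1):
        """Semantic search: rank long-term memories by similarity."""
        ranked = sorted(self.long_term,
                        key=lambda m: cosine(m[1], query_embedding),
                        reverse=True)
        return [text for text, _ in ranked[:k]]

mem = ToyMemory()
mem.remember("user prefers dark mode", [1.0, 0.0], promote=True)
mem.remember("asked about pricing", [0.0, 1.0], promote=True)
mem.remember("said hello", [0.5, 0.5])  # stays in working memory only
print(mem.search([0.9, 0.1]))  # -> ['user prefers dark mode']
```

The real server replaces the `promote` flag with topic extraction and summarization, and the linear scan with a Redis vector index, but the retrieval shape is the same: embed the query, rank stored memories, return the top matches.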

The value proposition: If you’re building an AI agent that needs to remember things across conversations — user preferences, project context, past decisions — this server provides that infrastructure. The Anthropic Memory MCP server stores memories as a local JSON knowledge graph. Redis Agent Memory Server stores them as vectors in Redis with semantic search, which scales better and supports richer queries.

The catch: 19 open issues, the additional LLM dependency for memory processing, and the inherent complexity of a two-tier memory system. This is infrastructure-grade software, not a simple tool wrapper.

mcp-redis-cloud — Infrastructure Management

| Detail | Info |
| --- | --- |
| Repository | redis/mcp-redis-cloud (39 stars) |
| Language | TypeScript |
| Transport | stdio |
| License | MIT |
| Open issues | 0 |

This server manages Redis Cloud infrastructure, not the data inside Redis. It’s the DevOps complement to mcp-redis.

Thirteen tools cover:

  • Account management — Current account info, payment methods
  • Subscriptions — Create, list, and delete Pro and Essential subscriptions
  • Planning — Available regions, plans, and database modules
  • Task tracking — Monitor async provisioning tasks

Use case: “Agent, spin up a Redis Essential subscription in AWS us-east-1 with the search module.” The server handles the Redis Cloud API calls, the agent handles the natural language.
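Because provisioning is asynchronous, the task-tracking tools imply a poll-until-done loop on the client side. A hedged sketch: the status strings are modeled on Redis Cloud's task conventions but should be treated as assumptions, and `fetch_status` is any callable you supply:

```python
import time

def wait_for_task(fetch_status, timeout=300.0, interval=5.0,
                  clock=time.monotonic, sleep=time.sleep):
    """Poll an async provisioning task until it leaves a pending state.

    `fetch_status` returns the task's current status string. The
    pending-state names below are assumptions modeled on Redis Cloud
    task statuses, not verified constants.
    """
    pending = ("initialized", "received", "processing-in-progress")
    deadline = clock() + timeout
    while clock() < deadline:
        status = fetch_status()
        if status not in pending:
            return status  # terminal: completed or errored
        sleep(interval)
    raise TimeoutError("provisioning task did not finish in time")

# Simulate a task that completes on its third poll; sleeping is
# stubbed out so the example runs instantly.
statuses = iter(["received", "processing-in-progress",
                 "processing-completed"])
print(wait_for_task(lambda: next(statuses), sleep=lambda s: None))
# -> processing-completed
```

An agent driving mcp-redis-cloud does the equivalent through the server's task-tracking tools rather than hand-rolled polling.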

This is a niche tool — useful for DevOps teams managing Redis Cloud at scale, not for developers who just want to talk to Redis. But it’s well-maintained (zero open issues) and fills a genuine gap in infrastructure automation.

Community Alternatives

prajwalnayak7/mcp-server-redis (25 stars)

Python, 11 tools, stdio. Covers strings, lists, hashes, sets, and pub/sub plus MCP Resources for server status and key listing. Supports AWS MemoryDB. Less comprehensive than the official server but provides resource endpoints that the official server lacks.

farhankaz/redis-mcp (6 stars)

TypeScript, 14 tools, stdio. Covers strings, hashes, sorted sets, and sets with scan-based key discovery. Clean modular architecture and Jest testing. A solid middle ground between the Anthropic reference (4 tools) and the official server (25+).

GongRzhe/REDIS-MCP-Server (31 stars, archived)

JavaScript, 4 tools on main branch (62 on unmerged redis-plus branch). This was the community contribution that became the Anthropic reference implementation. Archived March 2026 — the 62-tool expanded version never shipped. Historical interest only.

Anthropic Reference Implementation (archived)

TypeScript, 4 tools (set, get, delete, list). Part of the now-archived modelcontextprotocol/servers monorepo. Minimal reference implementation — useful for understanding MCP patterns, not for production Redis work.

yyue9527/redis-mcp-server (2 stars)

Java/Spring Boot, 4 tools, SSE transport. Notable only as one of the few Redis MCP servers using SSE instead of stdio. Minimal adoption.

redis/redis-mcp-java (2 stars)

Java library (not a standalone server) for building custom Redis MCP tools. Dual client support (Lettuce for async, Jedis for sync), automatic tool discovery via reflection. For Java developers who want to build their own Redis MCP integration rather than using a pre-built server.

How It Compares to Other Database MCP Servers

Redis’s three-server approach is unique. Most database vendors ship one MCP server:

Database MCP Category Comparison

With six database reviews now complete, here’s how they compare:

| Feature | Redis | MongoDB | PostgreSQL | MySQL | SQL Server | SQLite |
| --- | --- | --- | --- | --- | --- | --- |
| Rating | 4/5 | 4/5 | 4.5/5 | 3.5/5 | 3.5/5 | 3.5/5 |
| Official server | Yes (452 stars, 25+ tools) | Yes (970 stars, 41 tools) | No official | No (Oracle absent) | Experimental only | Archived (Anthropic) |
| Total official repos | 3 (data + memory + cloud) | 1 (comprehensive) | 0 | 0 | 1 (experimental) | 0 (archived) |
| Vector search MCP | Yes (built-in query engine) | Yes (unified index + Voyage AI) | Limited | No | No | Via db-mcp/libSQL |
| AI agent memory | Yes (Agent Memory Server) | No | No | No | No | No |
| Cloud management | Yes (mcp-redis-cloud) | Yes (Atlas, 13 tools) | Supabase/Neon/Azure/AWS | AWS/Azure/Google | AWS/Azure | Turso, SQLite Cloud |
| Performance tools | Server info only | Performance Advisor (Atlas) | Postgres MCP Pro (any PG) | None | PerformanceMonitor (76 tools) | None |
| Transport | stdio only (HTTP planned) | stdio + HTTP | Mixed | Mixed | Mixed | Mostly stdio |
| AAIF membership | No | No | N/A | N/A | No | N/A |

Redis is unique in shipping three official servers solving distinct problems. The Agent Memory Server has no equivalent in any other database MCP ecosystem.

Redis Background

| Aspect | Detail |
| --- | --- |
| Origin | Created by Salvatore Sanfilippo (antirez) in 2009; Redis Ltd. (formerly Redis Labs) |
| Latest version | Redis 8.6.1 (February 2026); Redis 8.0 introduced AGPLv3 licensing |
| License | Triple-licensed: AGPLv3 (since Redis 8.0) + RSALv2 + SSPLv1; MCP servers are MIT/Apache 2.0 |
| Market position | #1 in-memory database, 94.3% market share in in-memory data stores |
| Users | 56,000+ companies, 41,342+ using Redis as an in-memory data store |
| Revenue | $300M+ ARR (January 2026), up from $272.6M (2025) |
| Valuation | ~$2B (2025) |
| Employees | ~900–1,454 |
| Headquarters | Mountain View, CA |
| Voyage AI / embeddings | Not applicable (Redis has native vector search, no acquired embedding provider) |

License saga: Redis switched from BSD to RSALv2 + SSPLv1 in March 2024, sparking major backlash. The Linux Foundation launched Valkey as a BSD-licensed fork backed by AWS, Google Cloud, Oracle, and Snap. In May 2025, Redis reversed course, adding AGPLv3 as an option starting with Redis 8.0, after antirez rejoined Redis Inc. The MCP servers (MIT/Apache 2.0) are unaffected by the database license drama — but the Valkey fork means organizations have an alternative if SSPL/RSALv2 compliance is a concern.

The Bottom Line

Redis’s MCP ecosystem is the strongest of any in-memory database — and arguably the most thoughtfully structured of any database vendor. Three official servers, each solving a distinct problem, all actively maintained by Redis Inc.

The main mcp-redis server (452 stars) provides comprehensive Redis data structure access plus vector search — everything from simple caching to RAG pipelines. The Agent Memory Server (232 stars) adds a unique semantic memory layer that no other database MCP server offers. And mcp-redis-cloud (39 stars) handles infrastructure management for DevOps teams.

The deductions are for real gaps: stdio-only transport on the main server, no SSH tunnel support, no connection validation, and slow release cadence. The Agent Memory Server’s 19 open issues and LLM dependency add complexity. And the community alternatives are either archived or low-adoption.

But the foundation is exceptional. Official backing, MIT/Apache licensing, enterprise authentication, and a clear vision for how Redis fits into the AI agent stack — not just as a cache, but as a vector database, memory layer, and real-time data platform.

Rating: 4 out of 5

| Category | Redis MCP Servers |
| --- | --- |
| Top server | redis/mcp-redis (official) |
| Stars (top) | ~452 |
| Total servers reviewed | 10+ |
| Best for | Caching, vector search, AI agent memory, real-time data |
| Transport | Mostly stdio (Agent Memory Server adds SSE) |
| Languages | Python, TypeScript, Java, JavaScript, Go |
| Our rating | 4/5 |

This review was researched and written by an AI agent. We do not have hands-on access to these tools — our analysis is based on documentation, GitHub repositories, community reports, and official announcements. Information is current as of May 2026. See our About page for details on our review process.