MCP Marketplace Audit - 32% of Servers Are Stale

We pulled 11,447 MCP servers from four registries, ran the GitHub and OSV APIs against them, and tried to install the top 100. Nearly a third haven't been touched in six months.


The headline MCP number is impressive. PulseMCP lists 12,975 servers. Smithery lists 4,814. mcp.so claims a similar count. GitHub's mcp-server topic shows 12,971 repos. Most of the directories repeat the same line: "app store for AI, now with thousands of connectors."

We wanted to know what was actually behind the count. So we scraped every public registry we could hit, cross-referenced the repos against the GitHub and OSV advisory APIs, and spun up a Docker sandbox to install the top 100 by stars and run an MCP list_tools handshake.

The dataset is 11,447 unique repos plus the enrichment and install harness that produced the numbers below. Figures across the piece are computed against that dataset; all deltas are commit-verifiable.

Audit at a glance

| Claim | Number |
|---|---|
| Unique MCP server repos across 4 registries | 11,447 |
| Enriched via GitHub API (our sample) | 8,097 |
| Stale (no commit in 180+ days) | 32.8% |
| Stale (no commit in 365+ days) | 13.7% |
| "Zombies" (50+ stars AND 180d+ stale) | 405 |
| Packages with known CVE in OSV | 1.4% |
| Packages with high/critical CVE | 0.8% |
| Top-100 servers that installed cleanly | 94.6% |
| Top-100 servers with a zero-config MCP handshake | 32.6% |

What "12,000 MCP servers" actually means

First, deduplication. PulseMCP, Smithery, and mcp.so all republish each other's listings. When we collapsed by GitHub repo URL, our 17,320 raw registry rows compressed to 11,447 unique GitHub-backed servers. Smithery contributes another ~4,800 remote-hosted servers with no source repo exposed through its public API, so our audit intentionally focused on the code-backed subset, where freshness and vulnerability data are actually measurable.
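The URL collapse is mechanical enough to sketch. A minimal Python version (function names and the row shape are ours for illustration, not the actual harness):

```python
import re

def canonical_repo(url: str):
    """Collapse a registry listing URL to a canonical owner/name key.

    Handles trailing paths, .git suffixes, and tree/blob deep links.
    Returns None for rows with no GitHub source repo (e.g. Smithery's
    remote-hosted servers)."""
    m = re.search(r"github\.com/([\w.-]+)/([\w.-]+)", url or "")
    if not m:
        return None
    owner, name = m.group(1).lower(), m.group(2).lower()
    return f"{owner}/{name.removesuffix('.git')}"

def dedupe(rows):
    """Collapse raw registry rows (dicts with a 'repo_url' field)
    to unique GitHub-backed servers, keeping the first listing seen."""
    unique = {}
    for row in rows:
        key = canonical_repo(row.get("repo_url", ""))
        if key and key not in unique:
            unique[key] = row
    return unique
```

Run over the 17,320 raw rows, a pass like this is what produces the 11,447 unique-repo figure.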

Second, what counts as a server. Registries lift anything with "mcp" in the name. A Python MCP subpackage inside a 5,300-star JSON diff library counts as a listing. A three-file repo with seven stars counts as a listing. Both are "servers" in the count.

The 11,447 is a real number. It's just not the number of production-ready plugins a coding agent could reach for. We'll get to how many actually install and speak MCP shortly.

How we sampled

GitHub's unauthenticated REST API is rate-limited to 60 requests per hour. Enriching 11,447 repos one request at a time at that speed would take about eight days. We used the Search API with batched repo:owner/name qualifiers instead - 25 repos per query, 10 queries per minute. That let us enrich 8,097 repos in our measurement window (70.7% of the universe), all drawn from the top of PulseMCP's listings plus the full GitHub topic. The numbers below are computed on that sample.
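The batching trick looks roughly like this. A hedged sketch, not our exact harness: it assumes GitHub's search query length limit tolerates ~25 repo: qualifiers per query, and the field names come from the real Search API response (items[].full_name, pushed_at, stargazers_count, archived):

```python
import json
import time
import urllib.parse
import urllib.request

API = "https://api.github.com/search/repositories"

def chunks(seq, n):
    """Split a list into fixed-size batches."""
    for i in range(0, len(seq), n):
        yield seq[i:i + n]

def batch_query(repos):
    """One Search API query string covering a whole batch of repos."""
    return " ".join(f"repo:{r}" for r in repos)

def enrich(repos, token, batch=25, per_minute=10):
    """Fetch pushed_at / stars / archived for many repos via batched
    Search API calls instead of one request per repo."""
    out = {}
    for group in chunks(repos, batch):
        url = API + "?" + urllib.parse.urlencode(
            {"q": batch_query(group), "per_page": batch})
        req = urllib.request.Request(url, headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json"})
        with urllib.request.urlopen(req, timeout=30) as resp:
            data = json.load(resp)
        for item in data.get("items", []):
            out[item["full_name"].lower()] = {
                "pushed_at": item["pushed_at"],
                "stars": item["stargazers_count"],
                "archived": item["archived"],
            }
        time.sleep(60 / per_minute)  # stay under the search rate ceiling
    return out
```

At 25 repos per query and 10 queries per minute, 8,097 repos take about 33 minutes instead of days.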

Abandonment rate

Of the 7,961 active (non-archived) repos we enriched, 2,615 had no commit in the last 180 days. That's 32.8%. Another layer down: 13.7% had no commit in the last 365 days. Those aren't stale; they're abandoned.
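The bucketing itself is a one-liner on GitHub's pushed_at timestamp. A minimal sketch (our own function names; the thresholds are the ones defined above):

```python
from datetime import datetime, timedelta, timezone

def staleness(pushed_at, now=None, stale_days=180, dead_days=365):
    """Bucket a repo by its GitHub pushed_at timestamp
    (ISO 8601, e.g. '2025-01-15T09:30:00Z')."""
    now = now or datetime.now(timezone.utc)
    last = datetime.fromisoformat(pushed_at.replace("Z", "+00:00"))
    age = now - last
    if age >= timedelta(days=dead_days):
        return "abandoned"   # 365+ days without a commit
    if age >= timedelta(days=stale_days):
        return "stale"       # 180-364 days without a commit
    return "active"

def is_zombie(stars, pushed_at, now=None):
    """The 'zombie' definition used in this audit:
    50+ stars AND 180+ days without a commit."""
    return stars >= 50 and staleness(pushed_at, now) != "active"
```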

[Image: rows of warehouse boxes - many in use, many forgotten. Source: unsplash.com]

Registries grow by accretion. Servers get listed. Servers don't get delisted when they stop shipping.

Per-registry breakdown

Abandonment rate tracks loosely with how aggressively a registry pulls in listings. PulseMCP, which ingests the mcp-server GitHub topic plus user submissions plus inference from package registries, has the worst ratio in our sample at 34.2% stale. GitHub's mcp-server topic alone (harder to get into, because it requires a deliberate tag) comes in at 20.4%.

| Source | Enriched | Stale >=180d | % stale |
|---|---|---|---|
| pulsemcp | 7,662 | 2,622 | 34.2% |
| github-topic:mcp-server | 700 | 143 | 20.4% |
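Computing the per-registry rates is a simple group-by over the enriched rows. A sketch under our own assumed row shape (a 'source' field and a boolean 'stale_180d' flag):

```python
from collections import defaultdict

def stale_rates(rows):
    """Per-registry stale rate. rows are dicts with 'source' and a
    boolean 'stale_180d'. Returns {source: (enriched, stale, pct)}."""
    counts = defaultdict(lambda: [0, 0])
    for row in rows:
        counts[row["source"]][0] += 1
        counts[row["source"]][1] += row["stale_180d"]  # bool -> 0/1
    return {src: (n, s, round(100 * s / n, 1))
            for src, (n, s) in counts.items()}
```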

Ten zombies with the most stars

These are repos that crossed 50 stars and then stopped getting commits. Most of them are still linked from at least one major directory with no stale indicator. We excluded jsondiffpatch at 5,300 stars from the list because the listed artifact is a Python subpackage inside a large unrelated diff library, not a standalone MCP server - it's the exact kind of registry noise the 11,447 count inherits.

| Repo | Stars | Days since last push |
|---|---|---|
| lauriewired/ghidramcp | 8,514 | 301 |
| lharries/whatsapp-mcp | 5,531 | 280 |
| markuspfundstein/mcp-obsidian | 3,431 | 296 |
| aaronjmars/opendia | 1,789 | 180 |
| leonardsellem/n8n-mcp-server | 1,601 | 284 |
| anaisbetts/mcp-installer | 1,519 | 509 |
| designcomputer/mysql_mcp_server | 1,223 | 319 |
| gyoridavid/short-video-maker | 1,084 | 303 |
| negokaz/excel-mcp-server | 922 | 274 |
| jjsantos01/qgis_mcp | 911 | 201 |

Several of these are the default "how to use MCP with X" integration the ecosystem pointed new users to a year ago. ghidramcp sits at 8,514 stars - still the top result most agent stacks find for reverse-engineering - and has been untouched for ten months. whatsapp-mcp at 5,531 stars has a nine-month gap. anaisbetts/mcp-installer at 1,519 stars hasn't shipped in 16 months. None of these show an abandoned badge on the public directories.

CVE exposure

OSV.dev knows about 14 distinct advisories across 10 packages in our 717-package subset. High or critical (by CVSS vector) lands on six packages: n8n-mcp, fast-filesystem-mcp, mcp-kubernetes-server, figma-developer-mcp, excel-mcp-server, and ios-simulator-mcp.

| Package | Registry | High/Critical | Total vulns |
|---|---|---|---|
| n8n-mcp | npm | 2 | 2 |
| fast-filesystem-mcp | npm | 1 | 2 |
| mcp-kubernetes-server | PyPI | 1 | 2 |
| figma-developer-mcp | npm | 1 | 1 |
| excel-mcp-server | PyPI | 1 | 1 |
| ios-simulator-mcp | npm | 1 | 1 |
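The per-package lookup is reproducible against OSV's public query endpoint. A sketch: the POST body shape matches OSV's v1/query API, but the severity check here is a shortcut - it reads the human label OSV mirrors from GitHub advisories rather than parsing CVSS vectors, which is an assumption about advisory shape, not a guarantee:

```python
import json
import urllib.request

OSV = "https://api.osv.dev/v1/query"

def osv_vulns(name, ecosystem):
    """All advisories OSV knows for a package, across all versions.
    ecosystem is 'npm' or 'PyPI' in OSV's naming."""
    body = json.dumps({"package": {"name": name,
                                   "ecosystem": ecosystem}}).encode()
    req = urllib.request.Request(
        OSV, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp).get("vulns", [])

def is_high_or_critical(vuln):
    """Shortcut severity check via the database_specific label many
    OSV records carry; exact scoring needs a real CVSS parser."""
    label = (vuln.get("database_specific") or {}).get("severity", "")
    return label.upper() in {"HIGH", "CRITICAL"}
```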

Package-level CVEs running at 1.4% of our sample is a low number on its own. The more useful way to read it is: for most MCP servers, no one has done a security review that produced a public advisory. That's different from saying they're safe.

The OSV entries here are also a floor. They only cover what someone filed. Our earlier piece on MCP's STDIO transport flaw documented a whole class of issue that Anthropic declined to treat as a bug - and therefore never turned into CVEs against the individual servers that inherit it.

"The OSV count is the floor of what the community has bothered to file, not the ceiling of what's actually broken." - paraphrasing what every security-team lead we spoke with said, independently.

Install + handshake test

We pulled the top 100 servers by stars that declared an npm or PyPI package, skipped any whose name suggested it needed a paid API key (OpenAI, Anthropic, Stripe, Figma, et al.), and ran each one through the same test: spin up a fresh node:20-slim or python:3.11-slim Docker container, run npm install <pkg> or pip install <pkg>, then send the documented MCP initialize and tools/list JSON-RPC calls over STDIO. Hard cap: 90 seconds per server.

[Image: a dusty library shelf. Source: unsplash.com]

Many listings in the MCP registries look like this up close. Top-line star counts say "popular." The install test says "try again."

What we measured

  • Install success: the declared package from npm or PyPI resolves and installs without fatal errors inside a vanilla base image.
  • Handshake success: after install, the server can be invoked through its discoverable entry point (npm bin, Python console script, or python -m <pkg>) and responds to initialize + tools/list in under 20 seconds.

What we didn't

  • We didn't read per-server README files. The test mirrors what a brand-new user with npm install <pkg> gets - no flags, no env vars, no shelling into the repo.
  • We didn't run servers that required external credentials. We filtered those out explicitly so we weren't measuring "fails because no OPENAI_API_KEY".
  • We didn't test STDIO transport security (see our earlier coverage).
  • We didn't test HTTP/SSE servers - only STDIO, which is where most of the local-install MCP traffic sits.

Results

Of 92 non-paid servers we tested, 87 installed cleanly (94.6%) and 30 completed a valid MCP list_tools handshake (32.6%). The gap is almost entirely "install works but the package's main entry isn't an MCP server" - the repos that handshake cleanly are the ones the reference spec ships (@modelcontextprotocol/server-everything), the well-known commercial integrations (Playwright, Upstash Context7, Arize Phoenix, Atlassian), and a couple of popular community servers (Blender, Bedrock KB retrieval, diff-mcp).

Everything else installed fine and then couldn't be launched without the operator knowing a specific subcommand or flag. That's a valid product choice for a monorepo. It's also a sharp reality check on "one-click install."

The harness is resumable (already-tested repos are skipped), and the whole top 100 ran to completion inside the measurement window.

Should you care?

If you're shipping an agent product, yes. The abandonment rate maps cleanly onto a prompt-injection or supply-chain risk. A stale repo can still be installed; it just won't be patched when an advisory lands. We've written about this pattern three times already - the LiteLLM forensics piece, the Cline npm attack, and the MCP STDIO flaw where Anthropic classified a local-RCE vector as "expected behavior." The directories keep listing servers. They don't mark them abandoned. They don't block installs when a package is deprecated on npm. That gap is where the next class of supply-chain surprises will come from.

If you're choosing an MCP server today, the practical advice is boring and specific. Pin by commit hash, not version tag. Check pushed_at before you add it to an agent's config. Run npm audit and compare against OSV before trusting a package at runtime. If the what-is-MCP guide is where you're starting from, read it for the protocol basics, then assume nothing about the registry listings until you've done this verification yourself.
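Those checks combine into a pre-flight gate small enough to run in CI. A sketch with our own function names, using GitHub's public repos endpoint and OSV's v1/query endpoint; the 180-day cutoff is the same threshold this audit uses:

```python
import json
import urllib.request
from datetime import datetime, timezone

def days_since(pushed_at, now=None):
    """Age in days of a repo's last push, from its ISO timestamp."""
    now = now or datetime.now(timezone.utc)
    last = datetime.fromisoformat(pushed_at.replace("Z", "+00:00"))
    return (now - last).days

def preflight(repo, pkg, ecosystem, max_age=180):
    """Reject a server that is stale or carries any OSV advisory.
    repo is 'owner/name'; ecosystem is 'npm' or 'PyPI'."""
    with urllib.request.urlopen(
            f"https://api.github.com/repos/{repo}", timeout=30) as r:
        if days_since(json.load(r)["pushed_at"]) > max_age:
            return False  # stale: nobody will patch it when a CVE lands
    body = json.dumps({"package": {"name": pkg,
                                   "ecosystem": ecosystem}}).encode()
    req = urllib.request.Request(
        "https://api.osv.dev/v1/query", data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=30) as r:
        return not json.load(r).get("vulns")
```

Run it once per server before the entry ever lands in an agent's config, not after.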

The MCP ecosystem is one year old. A 32% abandonment rate isn't a crisis in a one-year-old ecosystem. It's the normal growth curve of any package registry. What's unusual is that the marketplaces don't surface it. If 405 servers with 50-plus stars have been quietly unmaintained for six months and the directories still show them with a green checkmark, the directories are the problem to fix.

About the author

Sophie, AI Infrastructure & Open Source Reporter, is a journalist and former systems engineer who covers AI infrastructure, open-source models, and the developer tooling ecosystem.