A practical comparison of every Google Search Console MCP servers - roughly 30 open-source options plus a handful of hosted SaaS endpoints, with no official Google-built server in sight as of April 2026. The serious contenders boil down to seven: AminForou as the safe default, Suganthan for opinionated SEO analysis, ahonn for high-volume row pulls, ncosentino for zero supply-chain surface, houtini for local SQLite work, saurabhsharma2u for cross-platform GSC+Bing+GA4, and AppsYogi for keychain-based token storage. The security section matters more than most people think: CVE-2025-6514 in mcp-remote, indirect prompt injection via your own GSC data, and an npm name collision that silently installs the wrong server are all real gotchas. The post ends with concrete recommendations by scenario — solo SEO, agency, enterprise, big-data analysis, and demos — plus a clearly-labelled ‘avoid’ list covering dormant repos, prompt-routing-through-Gemini designs, and one server that authenticates by reading your Chrome cookies off disk.

Every Google Search Console MCP Server, Reviewed (April 2026)

Model Context Protocol (MCP) servers turn Google Search Console from a place you click through into a place an AI assistant can actually query, slice, and reason about. Instead of exporting CSVs every time you want to look at content decay or quick-win opportunities, you ask your assistant — and the server pulls the live API data, runs the analysis, and reports back.

The catch: there’s no official Google-built GSC MCP server yet. There’s an open issue on the google/mcp repo asking for one, but as of late April 2026 the entire space is community-built. That’s roughly 30+ open-source servers and a handful of hosted SaaS endpoints, all of varying quality, maintenance, and security posture.

I went deep into every one I could find. This post is the synthesis: who’s worth using, who’s outdated, who’s actively risky, and how to pick based on what you’re actually trying to do.

TL;DR — pick by use case

SituationRecommended server
Best general-purpose pick for a working SEOAminForou/mcp-gsc
Best opinionated SEO analyst (quick wins, decay, cannibalisation)Suganthan-Mohanadasan/Suganthans-GSC-MCP
Maximum row extraction (25k+ per call)ahonn/mcp-server-gsc
Zero supply-chain surface (single Go binary)ncosentino/google-search-console-mcp
Local SQLite cache + arbitrary SQL, no API row capshoutini-ai/better-search-console
Cross-platform (GSC + Bing + GA4) under one roofsaurabhsharma2u/search-console-mcp
Agency multi-account workflowsMattiooFR/mcp-gsc-multi-account
Multi-tenant remote deployment patternsmw355/google-search-console-mcp
Domain-wide delegation for Workspace orgslocomotive-agency/google-search-console-mcp-python
Zero local install, OAuth-and-go demoEkamoira hosted
Already on Apify or Coupler.io infratheir respective hosted MCPs

If you only read one section, skip to “Security: what to actually worry about” before you point any of these at a real GSC property.


The serious contenders

These are the seven servers I’d actually run on real data. They’re actively maintained, reasonably well-engineered, and each occupies a different niche.

1. AminForou/mcp-gsc — the safe default

Repo: https://github.com/AminForou/mcp-gsc Stack: Python 3.11+, FastMCP, runs via uvx or local venv Auth: OAuth or Service Account (toggle with GSC_SKIP_OAUTH) License: MIT Status: ~700+ stars, active commits in April 2026

This is the most-starred GSC MCP server, and for good reason. It does roughly 19 tools well: list_properties, get_search_analytics, get_performance_overview, inspect_url_enhanced, batch_url_inspection, check_indexing_issues, sitemap CRUD, compare_search_periods, get_search_by_page_query, and a handful of advanced analytics helpers. Documentation is the most beginner-friendly of the lot. The uvx mcp-search-console install path means you don’t even need to think about Python venvs anymore.

Built-in _meta provenance tags on every response prove what was sent to Google, which materially reduces the risk of the LLM inventing favourable numbers. There’s also a Dockerfile if you want to run it in SSE mode on port 3001 and treat it as a proper service.

Pick this if: You’re an SEO who wants the broadest pure-GSC feature set with the lowest setup friction and the strongest community signal.

2. Suganthan-Mohanadasan/Suganthans-GSC-MCP — the SEO analyst

Repo: https://github.com/Suganthan-Mohanadasan/Suganthans-GSC-MCP npm: suganthan-gsc-mcp (note: not gsc-mcp-server — see security section) Stack: TypeScript, Node ≥20 Auth: OAuth or Service Account (GSC_AUTH_MODE=oauth / GSC_KEY_FILE) License: Apache-2.0 (README still says MIT in places — small inconsistency)

Where AminForou’s server is “GSC API as MCP,” Suganthan’s is “senior SEO analyst as MCP.” The 20 tools are question-shaped rather than endpoint-shaped: quick_wins, cannibalization_check, content_decay, traffic_drops, ctr_vs_benchmark, ctr_opportunities, topic_cluster_performance, multi_site_dashboard, generate_report, check_alerts. The kind of analyses you’d otherwise build yourself in a Looker Studio dashboard, but driven by natural language.

Two things make it stand out:

Hallucination guardrails. The verify_claim tool forces the model to self-check numerical claims against live API data before answering. If the model is about to say “organic traffic is up 50% week-over-week,” the server makes it verify. Combined with _meta provenance on every response, it’s the most defensively-built server for client reporting work.

Indexing API integration. It includes submit_url and submit_batch (up to 200 URLs) against Google’s Indexing API. Worth knowing that Google’s docs say the Indexing API is officially scoped to JobPosting and BroadcastEvent content — so unless that’s your use case, treat these tools as opt-in only and consider gating them behind explicit confirmation.

It also defaults to dataState: 'all' so the numbers match the GSC dashboard rather than the 2–3 day delayed default. And it renders rich visual artefacts (cards, charts, tabs) inside Claude Desktop, which is genuinely useful when you’re presenting findings.

Pick this if: You do real SEO analysis and want the assistant to actually surface insights rather than just hand you rows. Especially good for agencies producing client reports.

3. ahonn/mcp-server-gsc — the data firehose

Repo: https://github.com/ahonn/mcp-server-gsc npm: mcp-server-gsc Stack: TypeScript, Node ≥18 Auth: Service Account only License: MIT Status: ~211 stars, recent Node 25 compatibility work

A single tool — search_analytics — but a very capable one. Up to 25,000 rows per request (most servers cap at 1,000), full regex filters on queries and pages, dimension stacking, and built-in Quick Wins detection (detectQuickWins: true).

No URL inspection. No sitemap management. That’s deliberate. If you need to pull a quarter’s worth of long-tail query data and grep through it for patterns, this is the right tool. For everything else, pair it with another server.

One small thing for enterprise reviewers: the package.json includes a local patches/ override for buffer-equal-constant-time. Almost certainly benign — it’s the kind of dependency patch you do to fix a transitive issue — but worth eyeballing before you ship it inside a regulated environment.

Pick this if: You manage large sites, need deep query analysis, and accept “one really good tool” over a dozen mediocre ones.

4. ncosentino/google-search-console-mcp — the zero-dependency binary

Repo: https://github.com/ncosentino/google-search-console-mcp Stack: Go (single native binary per platform) Auth: Service Account (file path, env var, or inline JSON) License: MIT

Download a binary, point it at a service account key, edit your Claude config, done. No Node, no Python, no .NET, no uvx, no dependency tree to audit. The tool surface is core-focused — list_sites, list/submit sitemaps, query_search_analytics with dimensions — and it’ll pull up to 50,000 rows per query (which is technically the highest in the field, though for most analyses you don’t actually want that much).

For locked-down corporate machines, air-gapped audits, or anyone who has lived through one too many npx -y supply-chain incidents, this is the easiest server to vet and ship through an internal artefact registry.

Pick this if: You want minimum supply chain, easy distribution, and you don’t need URL inspection or fancy analytics.

5. houtini-ai/better-search-console — the SQLite approach

Repo: https://github.com/houtini-ai/better-search-console npm: @houtini/better-search-console Stack: TypeScript + embedded better-sqlite3 Auth: Service Account License: Apache-2.0

This one’s genuinely different in approach. Instead of querying the GSC API on every prompt, it syncs your full GSC dataset (up to 16 months) into a local SQLite database per property. Then it exposes 16 pre-built SQL insight queries — content decay, cannibalisation, CTR issues, growth opportunities — plus a custom_sql escape hatch.

The advantages compound:

  • No 1,000/25,000 row API caps. You’re querying SQLite, not Google.
  • The LLM only sees query results, not raw row dumps, which keeps token costs sane on big sites.
  • Reproducible. Run the same query an hour later and get the same answer (until next sync).
  • Faster follow-up questions, because you’re not re-hitting the API.

The trade-off is the initial sync — for a million-row property this is 5–10 minutes — and the data is only as fresh as your last sync.

Pick this if: You do repeat analytical work on the same site and want to avoid re-pulling the same data over and over. Also genuinely the best option for keeping raw GSC rows out of LLM context — your analyst stays in SQL, the model only sees aggregates.

6. saurabhsharma2u/search-console-mcp — the cross-platform option

Repo: https://github.com/saurabhsharma2u/search-console-mcp Stack: TypeScript, Node ≥18 Auth: OAuth or Service Account, multi-account License: MIT Status: Active since Feb 2026, ~109 stars at time of writing

The unique pitch: GSC + Bing Webmaster Tools + GA4 under one MCP. If your workflow involves cross-engine reporting or you need to correlate GSC clicks with GA4 sessions and conversions in the same conversation, this saves you running three separate servers.

The SEO-specific bits worth calling out:

  • A built-in algorithm-update database (27 entries from 2022 to 2026) so the model can correlate traffic changes against named Google updates.
  • Week-over-week health checks and anomaly detection.
  • A friendly npx search-console-mcp setup wizard.
  • Hardware-bound AES-256-GCM encryption for tokens via OS keychain integration (macOS Keychain, Windows Credential Manager, Linux Secret Service).

That last point is genuinely nice for enterprise. Most servers leave service-account JSON sitting in plain text on disk; this one pushes secrets into the OS-managed credential store.

Pick this if: You need GSC + Bing + GA4 together, run multiple client accounts, or care about token storage hygiene more than the average dev.

The “since Feb 2026” caveat: the code looks fine on inspection, but it doesn’t yet have years of production wear. I’d run it, but I wouldn’t make it the only thing standing between an attacker and your indexing API.

7. AppsYogi-com/gsc-mcp-server — the polished local tool

Repo: https://github.com/AppsYogi-com/gsc-mcp-server npm: @appsyogi/gsc-mcp-server Stack: TypeScript Auth: OAuth (with OS keychain) or Service Account

A smaller project (handful of stars at last check) but well-engineered: OS keychain for OAuth token storage, a gsc-mcp doctor command to verify your setup, an HTTP debug mode, SQLite query caching, and a sensible set of tools — analytics, opportunities/cannibalisation, weekly summaries, sitemap CRUD, URL inspection.

It’s an early but promising option. If keychain-based token storage matters to you and you don’t want the heavier feature set of Suganthan or Aminforou, this is worth a look.


Specialty options worth knowing

These don’t displace the top seven, but each solves a specific problem.

MattiooFR/mcp-gsc-multi-account — agency workflows

Repo: https://github.com/MattiooFR/mcp-gsc-multi-account Stack: TypeScript

The first GSC MCP I’ve seen with explicit, first-class multi-account design. Switch between client accounts without re-auth. Optional Supabase integration if you want persistent storage. Early-stage (low stars, no formal releases yet), but if you’re an agency juggling 20 clients across 100 properties, the architecture decision alone is worth the audit.

smw355/google-search-console-mcp — the multi-tenant remote pattern

Repo: https://github.com/smw355/google-search-console-mcp Stack: Python FastMCP + httpx, designed for Google Cloud Run Auth: OAuth bearer token passthrough (no server-side credential storage)

Architecturally the most interesting open-source GSC MCP. Fully stateless HTTP MCP, every user supplies their own Google bearer token, no shared service account. If you want to host an MCP for a team rather than each developer running one locally, this is the reference implementation to study. Note that the tool surface includes site add/delete and sitemap management, so you’ll want approval gates before pointing real users at it.

locomotive-agency/google-search-console-mcp-python — Workspace delegation

PyPI: google-search-console-mcp-python Stack: Python 3.12+, FastMCP, uv/pip Auth: Service Account, optional domain-wide delegation

The interesting bit here is Workspace domain-wide delegation support. If your org runs Google Workspace and you want a single service account to impersonate users for GSC access (with proper IAM controls), this is one of the few servers that natively supports the pattern. Also includes add/delete site operations, so handle accordingly.

GiorgiKemo/mcp-seo-audit — the technical SEO lab

Repo: https://github.com/GiorgiKemo/mcp-seo-audit Stack: Python (fork of AminForou)

A 30-tool fork that bundles GSC with Indexing API, CrUX (Chrome User Experience Report), PageSpeed Insights, local Lighthouse runs, robots.txt parsing, sitemap analysis, on-page SEO checks, and crawl audits. It’s a Swiss army knife for technical SEO work.

Worth noting because of the non-GSC tools that are actually useful: CrUX gives you real-user Core Web Vitals from Chrome telemetry, PageSpeed Insights is the canonical way to surface lab + field metrics to an LLM, and a local Lighthouse runner means the model can audit a page without you opening DevTools. If you do technical SEO audits, that’s a meaningful capability bundle even before you add GSC data.

That said: broader surface = broader risk. I’d run this in a sandboxed container with restricted network egress and disable any write tools you don’t need.

sarahpark/google-search-console-mcp — the read-only minimalist

Repo: https://github.com/sarahpark/google-search-console-mcp npm: @sarahpark/google-search-console-mcp

Four read-only tools: list_sites, search_analytics, inspect_url, list_sitemaps. That’s it. For a low-risk PoC where you want to demonstrate “AI can see GSC” without granting any write capabilities, this is exactly right.

noviq-ai/google-searchconsole-mcp — small Apache-licensed Python

Repo: https://github.com/noviq-ai/google-searchconsole-mcp

Four tools, Apache-2.0, runs via uvx from GitHub. Similar role to sarahpark’s — minimal surface for low-risk experiments — but in Python.

mikusnuz/gsc-mcp — full API coverage including Indexing

Repo: https://github.com/mikusnuz/gsc-mcp

13 tools, covers the full GSC + Indexing API surface including site add/delete and batch URL submission. Higher blast radius than most, low public review signal. Useful if you specifically need the Indexing API write capabilities and don’t want Suganthan’s analysis layer.

acamolese/google-search-console-mcp — generates HTML audit reports

Repo: https://github.com/acamolese/google-search-console-mcp

Niche but interesting: queries GSC and generates brandable HTML SEO audit reports as output. Useful if your deliverable is a client-facing document rather than an in-chat answer.

Suganthan-Mohanadasan/Suganthans-BigQuery-MCP-Server — the BigQuery companion

Repo: https://github.com/Suganthan-Mohanadasan/Suganthans-BigQuery-MCP-Server

Not a GSC API server, but worth a mention because it pairs naturally with Suganthan’s GSC MCP. If you’ve enabled the GSC bulk export to BigQuery, this server lets you query the export via MCP. The big win is that the BigQuery export includes the ~46% of clicks that GSC anonymises in its UI/API at the query level. Pair “API for fresh data, BQ for deep history” and you’ve got both halves of the picture. Includes ARIMA_PLUS forecasting tools and GA4+GSC revenue attribution.


Hosted / SaaS options

If you don’t want to run anything locally, these connect to GSC on their infrastructure and expose an MCP endpoint. The trade-off is the obvious one: your data and OAuth tokens live with the vendor, not you.

Ekamoira hosted GSC MCP

Endpoint: https://app.ekamoira.com/gsc/mcp Auth: OAuth 2.1 with Dynamic Client Registration

Add the URL to Claude.ai or ChatGPT, OAuth, you’re done in about 60 seconds. 13 tools, including mobile-friendly test, comprehensive sitemap management, and aggregationType: byProperty to combine www/non-www. Defaults to dataState: 'all' for fresh data. 30-day trial, then bundled with paid Ekamoira plans.

The vendor lock-in is real — your team’s OAuth grants live with Ekamoira — but for non-technical stakeholders or quick demos this is the fastest path.

Coupler.io MCP

Page: https://www.coupler.io/mcp/google-search-console Auth: Personal access tokens

Tools cover SERP performance, performance by appearance type (video, image, AMP), Discover, and Google News performance. Coupler.io says the AI receives processed summaries rather than raw query data, and that data isn’t used for model training. That’s the most enterprise-friendly stance among hosted options — you’re trading some flexibility for a real privacy posture.

Apify smacient/gsc-mcp-worker

Page: https://apify.com/smacient/gsc-mcp-worker/api/mcp

Sits inside Apify’s broader scraper ecosystem. The interesting feature isn’t really the GSC tools themselves — it’s that you can chain GSC data with Apify’s crawl actors in the same workflow. Crawl a site, get its GSC data, correlate them. If you’re already on Apify it’s a natural extension.

Composio Google Search Console

Page: https://composio.dev/toolkits/google_search_console

Tool-router model. The agent picks GSC tools dynamically alongside everything else (Slack, Asana, Gmail, etc.) via a single MCP endpoint. Composio handles encryption at rest/in transit, token refresh, RBAC, and audit logs. Strong enterprise story on paper, but you’re betting on Composio’s platform rather than running a focused tool.

Adzviser, StackOne, SellOnLLM Analytics

These are all hosted/managed integrations that include GSC. Adzviser is no-code-marketer-flavoured. StackOne lists GSC as one of many enterprise integrations. SellOnLLM bundles GA4 + GSC with snapshot analyses. Each is reasonable in its own niche, but none of them are open-source GSC MCPs in the strict sense — you’re picking a platform, not a tool.


Use with caution: dormant, low-signal, or actively risky

A handful of servers either don’t have recent maintenance, have low public review, or have specific security gotchas you should know about before installing.

Dormant but probably still working

Shin-sibainu/google-search-console-mcp-server — Last commit October 2025. Notable for being one of the older servers with first-class Indexing API URL submission, but no recent dependency updates. Verdict: OK for personal use; not enterprise-ready.

surendranb/google-search-console-mcp — Last commit June 2025. Pip-installable Python with 7 opinionated SEO “intel” tools. Verdict: the AminForou server has eaten its lunch.

metehan777/google-search-console-mcp — Last updated March 2025. Functional but superseded.

alfie-max/mcp-google-search-console — Small, infrequent updates, plain-text .env Service Account storage. Functional, but in a field this active there’s no reason to pick it over AminForou or Suganthan.

Auto-generated / not SEO-curated

ag2-mcp-servers/google-search-console-api — Auto-generated from the GSC OpenAPI spec. Mechanical 1:1 API coverage, often poor tool descriptions, and the LLM has to figure out which of dozens of generic tools to use. Verdict: useful for experiments behind an explicit allowlist; not what you want for everyday SEO work.

Specific security gotchas

garethcull/search-console-mcp routes your natural-language prompts through Google Gemini to translate them into GSC API payloads before querying GSC. Functionally fine, but every prompt transits a third-party LLM. For client work or anything with data residency requirements, that’s a non-starter without disclosure. Verdict: interesting Cloud Run deployment pattern; don’t run on data you can’t legally send to Gemini.

gsc-bing-mcp authenticates to GSC by reading your Chrome browser cookies directly from disk via rookiepy, then calling internal GSC endpoints as if it were your browser session. That’s clever, and it does sidestep the entire OAuth setup, but it’s exactly the kind of credential surface I’d never approve in a professional environment. Your Chrome cookies are also the keys to your Gmail, your Google Ads, and your entire Workspace account. Verdict: avoid for anything other than personal experiments on a throwaway profile.

Donmandela/gsc-mcp — README tells users to download from “releases,” but the GitHub repo has no actual releases published. Unclear implementation, no normal MCP config visible, no real tooling story. Verdict: treat as suspicious until proven otherwise. I would not install this.

Low-signal but probably fine on inspection

fandungptit/mcp-server-gsc is a small fork of ahonn’s server with Docker support added. chrishart0/searchconsole-mcp is a tiny single-developer Python project. ticosan/webvip-gsc is a Spanish-language GSC MCP for Claude Code. guchey/mcp-server-google-search-console is a lightweight Python wrapper. arturseo-geo/mcp-gsc-advanced advertises cannibalisation detection and crawl-gap analysis but has near-zero stars. lionkiii/google-searchconsole-mcp advertises “no Google Cloud setup required” — convenient, but make sure you understand whose OAuth client you’ll be authenticating to. seotesting-com/gsc-mcp-server is SEOTesting’s vendor-flavoured Python MCP. soumyadeep-ux/gsc-mcp-server is the open-source counterpart to Ekamoira’s hosted offering. None of these are bad, they just don’t have anything most people would pick over the seven main servers.

The npm name collision worth knowing about

sofianbettayeb/gsc-mcp-server owns the npm package name gsc-mcp-server. That matters because Suganthan’s setup guide previously instructed people to run npx -y gsc-mcp-server, which silently installed sofianbettayeb’s older 7-tool server instead of Suganthan’s 20-tool one. Now corrected — Suganthan’s package is suganthan-gsc-mcp — but the lesson generalises: always read the README of the actual npm package you’re installing, not just the blog post that links to it.


Security: what to actually worry about

GSC data alone is rarely catastrophic if it leaks. The real risk surface is the credential bundle these servers sit on top of: a Google service account or OAuth refresh token with webmasters scope, sometimes also with Indexing API write access. Both are attractive targets, and several real-world MCP-ecosystem CVEs from late 2025 / early 2026 are directly relevant.

Documented CVEs that touch the GSC MCP stack

CVE-2025-6514 (mcp-remote, CVSS 9.6) — JFrog disclosed a command-injection flaw in the mcp-remote proxy used by Claude Desktop and similar clients to bridge stdio → remote MCP. A malicious MCP server could supply a crafted authorization endpoint and execute arbitrary OS commands on the client. Affects mcp-remote 0.0.5 through 0.1.15; fixed in 0.1.16. If you use any hosted GSC MCP via mcp-remote, make sure your local proxy is patched.

MCP-client RCE / LFE class (Obsidian Security, late 2025) — Multiple MCP clients (Gemini-CLI, MCP Inspector, Cherry Studio, VS Code, Windsurf, Smithery, Lutra, Glue) had OAuth-flow URL-handling bugs leading to RCE, local file execution, or account takeover. Four CVEs assigned. Update your client.

Anthropic’s SQLite MCP server SQL-injection — broader-ecosystem context; an unpatched fork is still common in dependency trees.

Generic MCP risk classes that apply to every GSC server

Tool-description injection / tool poisoning. A malicious server can ship tool descriptions crafted to manipulate the model. Lower risk for read-only GSC servers, much higher risk for any server that can submit URLs or sitemaps.

Confused deputy / token passthrough. MCP servers must not accept tokens not issued for them. Check OAuth 2.1 / RFC 9728 compliance in any hosted server.

Session hijack via prompt injection. Multi-tenant hosted MCP servers must use unguessable session IDs bound to the user.

Indirect prompt injection via your own GSC data. This is the underrated one. Search queries, page titles, and even sitemap entries are user-controllable inputs. A malicious referrer or query showing up in your GSC data could try to inject instructions when the LLM reads it. Suganthan’s verify_claim and _meta provenance tags are designed against exactly this — and it’s a real reason to prefer servers that ship those defences over servers that just dump rows into context.

Excessive scopes. https://www.googleapis.com/auth/webmasters (read-write) is broader than most workflows need. Prefer webmasters.readonly if the server supports it.

Hard rules I’d apply to any GSC MCP

  1. Use a dedicated service account with webmasters.readonly unless you genuinely need write.
  2. Add the SA to the minimum set of GSC properties needed.
  3. Keep the JSON key out of any repo. Use OS keychain (AppsYogi or saurabhsharma2u do this natively) or a secrets manager.
  4. Pin npm/PyPI versions; don’t use npx -y against a moving latest tag in production.
  5. If you’re using a remote/hosted MCP, run it through mcp-remote ≥ 0.1.16 or a vendor-native client.
  6. Read the source. Each of these servers is small (under 5,000 LOC, often well under). It’s an afternoon’s work to skim the ones you’d actually run. Run gitleaks or trufflehog over the repo first to catch any committed service-account JSON.
  7. Gate any write tool — submit_url, submit_sitemap, add_site, delete_site — behind explicit human confirmation. Better yet: split the deployment so your read-only server has no credentials capable of writing.

Setup difficulty, ranked

From “no thinking required” to “expect a config rabbit hole”:

  1. Ekamoira hosted — Add URL to Claude.ai, OAuth, done. ~60 seconds.
  2. Coupler.io / Apify / Composio — Vendor account + token + add MCP URL. ~5 minutes.
  3. ncosentino (Go binary) — Download binary, point at SA JSON. No runtime install. ~5–10 minutes.
  4. AppsYogi gsc-mcp doctor — Interactive OAuth, guided.
  5. ahonn/mcp-server-gscnpx -y mcp-server-gsc, plug in SA JSON. ~10–15 minutes if you already have a Google Cloud project.
  6. Suganthan / saurabhsharma2u / sofianbettayeb — npx-based, OAuth or SA. ~15 minutes.
  7. AminForouuvx mcp-search-console makes this near-trivial now; the older venv route is more involved.
  8. houtini/better-search-console — Install plus initial sync (30 seconds to several minutes depending on data volume).
  9. garethcull (Cloud Run) — Cloud Run project + Gemini API key + bearer token + base64-encoded SA. ~30–45 minutes.
  10. MattiooFR multi-account — Same as TypeScript options, multiplied per account.
  11. smw355 (Cloud Run multi-tenant) — Architectural setup; budget at least an afternoon.

The single most common failure across all of these on macOS is spawn npx ENOENT, because Claude Desktop and Cursor don’t inherit your shell PATH. Either symlink npx into /usr/local/bin or use absolute paths in the MCP config. Same applies to uvx on Linux/macOS — the AminForou README handles this explicitly.


Final recommendations by scenario

If you’re a working SEO on a single site

Start with AminForou/mcp-gsc. Install via uvx, OAuth, you’re done. If you find yourself wanting analyst-style queries (“which queries are being cannibalised by which pages”) rather than raw row dumps, add Suganthan-Mohanadasan/Suganthans-GSC-MCP alongside it.

If you run an SEO agency

Suganthan for client-facing analysis and reporting (the verify_claim tool is a real differentiator when you’re putting numbers in front of paying clients), and MattiooFR/mcp-gsc-multi-account for multi-property switching. If you’re already exporting to BigQuery, the Suganthan BigQuery companion recovers the anonymised long-tail queries that GSC hides.

If you do enterprise / regulated work

Self-host ncosentino’s Go binary (single artefact, easy to vet, easy to ship through internal registries) or self-host AminForou in a network-restricted container. Pair with houtini/better-search-console so the LLM only sees query results, not raw rows. Use a dedicated service account with webmasters.readonly. Wrap everything in audit logging — none of these servers do this natively, so you’ll log at the proxy or container level.

If “self-host anything” is off the table for policy reasons, Coupler.io has the most defensible privacy posture among hosted options (summaries-only, no model-training).

If you need to crunch huge amounts of GSC data

ahonn/mcp-server-gsc for one-off 25k-row pulls, houtini-ai/better-search-console for repeat work where the cost of re-pulling is annoying. The combination — ahonn for fresh “what happened today” and houtini for “give me 16 months of trend SQL” — covers nearly every analytical scenario without leaving you waiting for the API.

If you want a no-brainer demo for non-technical stakeholders

Ekamoira hosted. Be honest with them that the data touches Ekamoira’s infrastructure, get explicit consent, and use it for the demo. Then evaluate self-hosting once they’re sold.

What to avoid

  • Anything dormant since mid-2025 (Shin-sibainu, surendranb).
  • garethcull unless you’re explicitly OK piping prompts through Gemini.
  • gsc-bing-mcp unless you’re on a throwaway Chrome profile.
  • Donmandela/gsc-mcp until someone can actually explain what it is.
  • Auto-generated OpenAPI MCPs (ag2-mcp-servers) without an explicit tool allowlist.
  • Any server with a shared OAuth client when you’re past the prototyping stage — switch to your own Cloud Console OAuth client so you control the consent UX and quotas.
  • npx -y <package> against a moving latest tag in any production setup.

What I’d actually run

For my own non-sensitive SEO work today, I’m running AminForou as the daily driver, Suganthan for opinionated analysis when the question is “what should I do” rather than “what does the data say,” and houtini for anything that needs more than a thousand rows.

For client work I’d add ncosentino’s Go binary as a sandboxed read-only fallback (the smaller the supply chain, the smaller the surface for an attacker to ride into a client’s GSC), and a webmasters.readonly service account scoped only to the properties under review.

For everything else, I’d wait six months. The space is still settling. Half the servers in this post didn’t exist twelve months ago, and a quarter of them probably won’t be maintained twelve months from now. The shortlist above is the part of the field that’s earned its place.


Useful further reading