Google Search Console (MCP)
Query and act on Google Search Console data for any of your sites directly from the BlackOps MCP server. Indexing diagnostics, search performance, URL inspection, period comparison, sitemap status — plus sitemap submission and indexing requests. Per-site, encrypted at rest, never returned to the browser.
What it does
The MCP server exposes seven GSC tools — five read, two write — for any AI assistant connected to BlackOps (Claude, ChatGPT, etc.). Each tool takes a domain argument matching a BlackOps site you own; the underlying GSC site URL is resolved server-side from your saved settings.
Read
- gsc_index_status — bulk URL inspection grouped into buckets: indexed, crawled-not-indexed, discovered-not-indexed, not-found, excluded
- gsc_search_analytics — top queries, pages, countries, devices with clicks, impressions, CTR, average position
- gsc_url_inspection — per-URL deep dive: verdict, coverage state, last crawl, page-fetch state, robots, canonical, mobile, structured data
- gsc_compare_periods — two date ranges side-by-side with delta math, winners and losers
- gsc_sitemaps — submitted sitemaps with status, last-read date, errors, warnings, submitted vs indexed counts. Pass includeUrls: true to also fetch each sitemap.xml and return the actual URL list (or child sitemaps for sitemap-index files)
Write
- gsc_submit_sitemap — submit or force a refetch on a sitemap. Idempotent; resubmitting is the "refetch please" use case
- gsc_request_indexing — push up to 10 URLs to Google's crawl queue per call. Every URL is HEAD/GET pre-validated; invalid URLs are skipped with a reason and don't burn quota. 200/day per-site quota. Every response includes quotaRemaining, quotaTotal, quotaResetAt. Pass skipValidation: true only for intentionally-unreachable URLs
Every response is shaped as { summary, structured, raw } — a one-paragraph opinionated headline the AI can quote verbatim, a pre-aggregated payload it can chart or filter, and the full GSC API response if it needs to dig deeper.
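The three-part envelope can be consumed at whichever level of detail a caller needs. A minimal sketch, assuming only the three documented keys (the sample field values inside summary/structured/raw are illustrative, not real API output):

```python
# Hypothetical sketch of consuming the { summary, structured, raw } envelope
# every GSC tool returns. Only the three top-level keys come from the docs;
# the sample values are made up for the example.

def pick_detail_level(response: dict, need: str):
    """Return the cheapest view of a tool response that satisfies `need`."""
    if need == "headline":
        return response["summary"]     # one-paragraph quotable string
    if need == "chart":
        return response["structured"]  # pre-aggregated payload, ready to plot
    return response["raw"]             # full GSC API response for deep dives

resp = {
    "summary": "27 of 31 URLs indexed (87%).",
    "structured": {"indexed": 27, "total": 31},
    "raw": {"inspectionResults": []},  # truncated for the example
}

assert pick_detail_level(resp, "headline").startswith("27 of 31")
assert pick_detail_level(resp, "chart")["indexed"] == 27
```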
Setup
If you already have a service account configured for the GA4 MCP, you can reuse the same service account JSON. You only need to grant it access to your GSC property and paste the same JSON into the GSC card.
1. Service account + Search Console API
- Open the Google Cloud service accounts page and create a service account, or reuse the one you set up for GA4.
- Enable the Google Search Console API for that project (APIs & Services → Library → search "Search Console API" → Enable).
- Generate a JSON key for the service account and download it. The same JSON works for GA4 and GSC if both APIs are enabled in the project.
2. Grant the service account access to your GSC property
Search Console's "Add user" UI rejects service account email addresses with a "not a valid email" error. There are two supported workarounds:
Option A — DNS verification (recommended for Domain properties)
- Enable the Site Verification API in your Google Cloud project.
- Have the service account request a verification token via siteVerification.webResource.getToken with verificationMethod: "DNS_TXT".
- Add the returned google-site-verification=… string as a TXT record at your domain root. Multiple Google verification TXT records can coexist.
- Once the record propagates, call siteVerification.webResource.insert from the service account — it becomes a verified DNS owner alongside any existing owners.
- Then call webmasters.sites.add with siteUrl: "sc-domain:yourdomain.com" to associate the property. The service account is now a siteOwner on the property.
Option B — wait for OAuth (v2)
We're shipping an OAuth-based connect flow in v2 so customers can grant access by signing in with the Google account that already owns the GSC property. If you don't want to manage DNS, this is the path to wait for.
Note your GSC site URL string — this is what GSC uses internally, and it's exactly what you'll paste into BlackOps in step 3:
- Domain property (covers all subdomains + protocols): sc-domain:yourdomain.com
- URL-prefix property: the full URL with trailing slash, e.g. https://yourdomain.com/ or https://www.yourdomain.com/
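The two shapes are easy to mix up, and a mismatch surfaces later as a 404 from Google (see Troubleshooting). A small sketch of the distinction, with a helper name that is illustrative rather than part of any BlackOps API:

```python
# Sketch of the two valid GSC site-URL shapes described above.
# classify_site_url is a hypothetical helper, not a BlackOps function.

def classify_site_url(site_url: str) -> str:
    """Classify a GSC site URL string, or raise if it matches neither shape."""
    if site_url.startswith("sc-domain:"):
        return "domain"      # covers all subdomains + protocols
    if site_url.startswith(("http://", "https://")) and site_url.endswith("/"):
        return "url-prefix"  # exact prefix match, trailing slash required
    raise ValueError(f"not a valid GSC site URL: {site_url!r}")

assert classify_site_url("sc-domain:yourdomain.com") == "domain"
assert classify_site_url("https://www.yourdomain.com/") == "url-prefix"
```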
3. Save credentials in BlackOps
- Go to Admin → Sites and open the site you want to enable.
- Open the Analytics tab. Scroll past the GA4 card to Google Search Console (MCP).
- Paste your GSC Site URL (from step 2) and the service account JSON.
- Click Save GSC Credentials. The badge changes to Configured.
- Flip the GSC MCP tools toggle from Disabled to Enabled.
The JSON is encrypted with AES-256-GCM and is never sent back to the browser. To rotate, paste new JSON and save. To remove, click Clear. To pause access without deleting credentials, flip the toggle off — the tools will return a structured "not enabled" response instead of throwing.
Example chats
Ask in natural language. The AI picks the right tool and arguments for you.
Indexing audit
"Pull the sitemap for blackopscenter.com, then run an indexing audit on every URL in it."
→ gsc_sitemaps lists submitted URLs, then gsc_index_status buckets each one. The summary calls out the indexed ratio and flags it if it's under 50%.
Top performing queries
"What are the top 25 queries driving impressions to benenewton.com over the last 90 days, and what's the average position?"
→ gsc_search_analytics with dimensions: ["query"]. Use ["query", "page"] if you want to see which page each query lands on.
Per-URL diagnostic
"Inspect https://blackopscenter.com/blog/some-post in Search Console. Why isn't it indexed yet?"
→ gsc_url_inspection returns the full indexing verdict, last crawl date, page fetch state, robots state, Google's chosen canonical, mobile usability, and structured data verdict — enough for the AI to tell you exactly what's blocking.
This month vs last month
"Compare query performance for blackopscenter.com between last month and the month before. Show me winners and losers."
→ gsc_compare_periods with two date ranges. Returns sorted winners (positive click delta), losers, and net change. Position deltas are sign-flipped so positive = ranking improvement.
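The sign flip on position deltas deserves a concrete example, since a lower position number is better. A sketch, with illustrative field names:

```python
# Sketch of the per-query delta math, including the position sign flip so
# that positive always means "improved". Field names are illustrative.

def query_delta(p1: dict, p2: dict) -> dict:
    """Compare one query's metrics across period 1 (older) and period 2 (newer)."""
    return {
        "clicksDelta": p2["clicks"] - p1["clicks"],
        # Lower position is better, so flip the sign: moving from position
        # 8.0 to 5.0 yields +3.0, a ranking improvement.
        "positionDelta": p1["position"] - p2["position"],
    }

d = query_delta({"clicks": 40, "position": 8.0}, {"clicks": 55, "position": 5.0})
assert d["clicksDelta"] == 15     # a "winner" by click delta
assert d["positionDelta"] == 3.0  # positive = ranking improvement
```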
Resubmit a stale sitemap
"Resubmit https://www.blackopscenter.com/sitemap.xml. It hasn't been fetched in months."
→ gsc_submit_sitemap looks up the previous fetch, submits, and reports how stale it was. Stale (>30 days) resubmissions get an "expect re-indexing activity" nudge. Idempotent — resubmitting an already-submitted sitemap is the canonical "please refetch" request to Google.
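The staleness check behind that nudge is simple date arithmetic. A sketch, assuming only the documented 30-day threshold:

```python
# Sketch of the ">30 days" staleness check. The threshold is from the doc;
# the helper itself is illustrative.

from datetime import date

STALE_AFTER_DAYS = 30

def is_stale(last_read: date, today: date) -> bool:
    """True if the sitemap's last-read date is more than 30 days old."""
    return (today - last_read).days > STALE_AFTER_DAYS

assert is_stale(date(2026, 1, 1), date(2026, 4, 1))       # months old: stale
assert not is_stale(date(2026, 3, 25), date(2026, 4, 1))  # one week: fresh
```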
Force-index high-value URLs
"Find the eight highest-impression pages on benenewton.com that aren't indexed yet, then request indexing on them."
→ gsc_search_analytics + gsc_index_status identifies the candidates, then gsc_request_indexing pushes them. Capped at 10 URLs per call, 200/day per-site quota. The first call for a given site includes a one-time disclosure paragraph about Google's Indexing API TOS scope (see "About the Indexing API" below). Subsequent calls are operational only and surface remaining quota.
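Batching candidates under the two documented limits (10 URLs per call, 200/day quota) can be sketched like this; the planner function is illustrative, not part of the MCP:

```python
# Sketch of splitting indexing candidates into calls that respect both the
# 10-URL-per-call cap and the remaining daily quota. plan_indexing_calls
# is a hypothetical helper.

MAX_PER_CALL = 10

def plan_indexing_calls(urls: list[str], quota_remaining: int) -> list[list[str]]:
    """Split candidate URLs into per-call batches that fit within quota."""
    allowed = urls[:max(quota_remaining, 0)]  # never overrun the daily quota
    return [allowed[i:i + MAX_PER_CALL] for i in range(0, len(allowed), MAX_PER_CALL)]

batches = plan_indexing_calls([f"/post-{n}" for n in range(23)], quota_remaining=15)
assert [len(b) for b in batches] == [10, 5]  # 15 quota left → two calls
```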
Multi-step workflows
Like the GA4 tools, the GSC tools chain with the rest of the BlackOps MCP — turn ranking signals into editorial decisions, not dashboard reads.
"Find queries on benenewton.com where my position is between 5 and 15. Pick three with the most impressions, then draft tweets that link to those posts."
"List every URL on blackopscenter.com that's 'Crawled — not indexed', then save them as a Brain note tagged 'indexing-debt' for me to triage."
"Compare the last 28 days of GSC data with the same period a year ago. If clicks are down more than 20%, draft a journal entry exploring why."
Calling tools directly
If you're scripting against the MCP yourself, the JSON-RPC arguments look like this:
{
"name": "gsc_index_status",
"arguments": {
"domain": "blackopscenter.com",
"urls": [
"https://blackopscenter.com/blog/post-a",
"https://blackopscenter.com/blog/post-b"
]
}
}
{
"name": "gsc_search_analytics",
"arguments": {
"domain": "benenewton.com",
"startDate": "2026-04-01",
"endDate": "2026-04-28",
"dimensions": ["query", "page"],
"rowLimit": 100
}
}
{
"name": "gsc_compare_periods",
"arguments": {
"domain": "blackopscenter.com",
"period1Start": "2026-03-01",
"period1End": "2026-03-31",
"period2Start": "2026-04-01",
"period2End": "2026-04-30",
"dimension": "query",
"rowLimit": 50
}
}
{
"name": "gsc_submit_sitemap",
"arguments": {
"domain": "blackopscenter.com",
"sitemapUrl": "https://www.blackopscenter.com/sitemap.xml"
}
}
{
"name": "gsc_request_indexing",
"arguments": {
"domain": "blackopscenter.com",
"urls": [
"https://www.blackopscenter.com/blog/post-a",
"https://www.blackopscenter.com/blog/post-b"
]
}
}
{
"name": "gsc_sitemaps",
"arguments": {
"domain": "blackopscenter.com",
"includeUrls": true
}
}
About the Indexing API
Per Google's documentation, the Indexing API is officially scoped to JobPosting and BroadcastEvent content types. In practice, Google processes requests for general content without enforcement, and BlackOps treats this as supported use. If enforcement changes, the tool will be updated. The gsc_request_indexing tool surfaces this disclosure in its summary on the very first call for each site, so operators can make an informed choice; subsequent calls are operational only.
If the policy ambiguity is a deal-breaker, the alternative is the "Request Indexing" button in the Google Search Console web UI, which uses Google's internal API and is sanctioned for general content. There's currently no official public API equivalent.
How it works under the hood
When the MCP receives a gsc_* call, it:
- Calls GET /api/v2/search-console/gsc-credentials?domain=… with the user's bearer token.
- The endpoint validates the token, confirms the user owns the site, checks the gsc_mcp_enabled flag (returning 403 if off), decrypts the stored ciphertext, and returns the GSC site URL + service account JSON.
- The MCP constructs a Google Auth client with read-only Webmasters scope and runs the requested operation against the Search Console API.
- The Google client is cached per-domain for the MCP process lifetime. To pick up rotated credentials, restart the MCP server or clear and re-save credentials.
Troubleshooting
"GSC MCP tools are not enabled for this site"
The per-site feature flag is off. Open Admin → Sites → Analytics tab → flip the GSC MCP tools toggle to Enabled.
"GSC not configured for this site"
Either the GSC site URL or the service account JSON is missing in site settings. Both are required.
"User does not have sufficient permission for site 'sc-domain:…'"
The service account isn't a user on the GSC property yet. Run the DNS verification + sites.add flow from step 2 above.
"not a valid email" in Search Console's Add user dialog
Expected — Search Console's UI rejects service-account emails. Use the DNS verification path described in step 2 instead.
"Server cannot decrypt credentials"
The EXT_TOKEN_ENCRYPTION_KEY environment variable is not set on the BlackOps app. Contact the platform admin.
404 from Google for the site URL
The GSC Site URL string doesn't match a property the service account has access to. Confirm Domain (sc-domain:…) vs URL-prefix (full URL with trailing slash) and that ownership is verified.
Limitations
- gsc_index_status is bounded by Google's URL Inspection quota. Up to 50 URLs per call; large audits should batch.
- gsc_search_analytics rows are capped at 1000 per call by GSC.
- gsc_request_indexing is capped at 10 URLs per call and 200/day per-site. Every URL is HEAD/GET pre-validated (5s timeout, parallel) before submission — invalid URLs are returned as skipped_invalid with a reason and do not count against quota. Every response surfaces quotaRemaining, quotaTotal, quotaResetAt (next midnight Pacific), and skippedInvalidCount. At ≤20 remaining the summary nudges "pace yourself." At 0 it returns "Quota exhausted" — never throws or retries. Use skipValidation: true for intentionally-unreachable URLs (auth, maintenance pages).
- gsc_request_indexing shows a one-time TOS disclosure on the first call for each site (see "About the Indexing API"). Tracked via sites.gsc_indexing_disclosure_shown_at. Reset that column to null to surface it again.
- Sitemap submission is idempotent. Submitting an already-submitted sitemap is the canonical "refetch please" request and is not an error.
- GSC data is typically 2–3 days behind real-time. Today's data won't be there until tomorrow at the earliest.
- Cross-site comparison parallel to ga4_site_comparison is not in v1 — call gsc_search_analytics per site instead.