The "Discourage search engines from indexing this site" checkbox in Settings > Reading is the single most common reason a brand new or freshly migrated WordPress site silently disappears from Google. One checkbox erases months of work.
Why this matters
When the box is ticked, WordPress does two things in tandem. It injects <meta name="robots" content="noindex, nofollow"> into the head of every page, telling Googlebot "do not include in results, do not follow links". And it serves a virtual robots.txt that returns Disallow: / for all user agents, blocking crawl entirely. Together those signals create a hard wall: even with strong backlinks, your URLs will not surface in search.
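With the box ticked, the two signals look like this in practice (exact quoting and spacing vary slightly by WordPress version). In the page head:

    <meta name='robots' content='noindex, nofollow' />

And in the virtual robots.txt served at the site root:

    User-agent: *
    Disallow: /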
The business impact is severe. A new launch waiting to rank even for branded queries stays invisible for weeks until someone notices. A site that has just migrated from staging to production starts losing its historical rankings as soon as Google recrawls, typically shedding them within 2-4 weeks as noindexed pages are dropped from the index. In the worst cases, an agency keeps billing an SEO retainer while the box sits silently ticked - and nobody checks.
How to detect
Three quick checks. First, open Settings > Reading and look at the checkbox itself. Second, view the source of the homepage (Ctrl+U) and search for noindex - if it appears inside a robots meta tag, the flag is set. Third, visit /robots.txt directly - a bare Disallow: / with no other rules means the entire site is blocked from crawling.
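If you prefer to script the last two checks, here is a minimal sketch in Python (standard library only). example.com is a placeholder for your own domain, and the string matching is deliberately rough - it only flags the obvious cases described above.

    import re
    import urllib.request

    SITE = "https://example.com"  # placeholder - replace with your site

    def fetch(url):
        # Send a browser-like User-Agent so basic bot filtering doesn't skew the result
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.read().decode("utf-8", errors="replace")

    html = fetch(SITE + "/")
    robots_txt = fetch(SITE + "/robots.txt")

    # Check 1: any robots meta tag that carries a noindex directive
    meta_tags = re.findall(r"<meta[^>]*>", html, re.IGNORECASE)
    has_noindex = any("robots" in t.lower() and "noindex" in t.lower() for t in meta_tags)
    print("noindex robots meta tag:", has_noindex)

    # Check 2: a bare "Disallow: /" line, which blocks crawling of the whole site
    blanket_block = bool(re.search(r"^Disallow:\s*/\s*$", robots_txt, re.MULTILINE))
    print("blanket Disallow: / in robots.txt:", blanket_block)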
A confirming signal in Google Search Console: under Pages, a large bucket of URLs labeled "Excluded by 'noindex' tag" or "Blocked by robots.txt" is the smoking gun. The URL Inspection tool will explicitly say "Indexing allowed? No: 'noindex' detected in 'robots' meta tag".
How to fix
Navigate to Settings > Reading in the WordPress admin. Scroll to the "Search Engine Visibility" section near the bottom and untick the box labeled "Discourage search engines from indexing this site". Click Save Changes.
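If you manage the site from the command line, WP-CLI can flip the same underlying setting; this assumes WP-CLI is installed and the command is run from the WordPress root (or pointed at it with --path):

    wp option update blog_public 1

blog_public is the option the checkbox writes: 0 means discourage, 1 means allow indexing.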
WordPress immediately stops emitting the noindex tag and reverts the virtual robots.txt to its open state. If you run a page cache plugin (WP Rocket, W3 Total Cache, LiteSpeed Cache, Cloudflare APO), purge the cache so visitors and bots receive freshly generated HTML rather than the cached noindex version.
Common mistakes
First mistake: assuming unticking is enough. Google does not notice the change until it recrawls, and the next Googlebot pass might be days or weeks away depending on the site's crawl history. Speed it up via Search Console > URL Inspection > Request Indexing on key pages, especially the homepage and primary category pages.
Second mistake: forgetting the staging environment. If you maintain a staging.example.com subdomain, that environment should keep the box ticked - otherwise Google may index the staging URLs and surface them instead of the production pages. Remember that the setting lives in the database (the blog_public option), so it travels with every database copy; that is exactly how a staging-to-production migration carries the noindex flag live. When promoting staging to production, untick the box on production and confirm staging is still hidden.
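On the staging copy itself, the same WP-CLI option can be used to re-apply the block, for example after refreshing staging from a production database dump (which would otherwise carry blog_public = 1 across):

    wp option update blog_public 0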
Third mistake: layered noindex from SEO plugins. Yoast SEO, Rank Math, SEOPress, and All in One SEO each add their own noindex controls. If you cleared the core checkbox but pages still emit noindex, open the SEO plugin's settings - look under "Search Appearance" (Yoast) or "Titles & Meta" (Rank Math) for post types or taxonomies set to noindex. The plugins write their directives into the same robots meta tag, so the view-source check from the detection step catches them too. Both layers must allow indexing before a page can rank.
Verifying the fix
Verify in three places. First, view source on the homepage and confirm there is no noindex in the robots meta tag. Second, hit /robots.txt and confirm there is no blanket Disallow: /. Third, run the homepage URL through Search Console URL Inspection - the expected status is "URL is on Google" or, for fresh sites, "URL is not on Google" but with "Crawl allowed? Yes" and "Indexing allowed? Yes".
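For reference, a healthy default install serves a virtual robots.txt roughly like the following - the Sitemap line appears on WordPress 5.5+ and may be replaced or extended by an SEO plugin, and example.com stands in for your domain:

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://example.com/wp-sitemap.xml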