Why is my Webflow site blocked for bots and why is the sitemap verification in Google Search Console not working?

TL;DR
  • Check and update robots.txt in Webflow SEO settings to allow crawling.
  • Disable "Disable indexing" and remove password protection if enabled.
  • Ensure site is published to a custom domain and sitemap URL is accessible at yourdomain.com/sitemap.xml.
  • Fix any 403 or 404 errors by verifying sitemap URL and site accessibility.
  • Use Google Search Console's "Inspect URL" tool to request re-indexing after changes.

Your Webflow site may be blocked for bots if certain settings or files prevent search engine access, which leads to sitemap verification issues in Google Search Console.

1. Check Robots.txt Settings

  • In Webflow, go to Project Settings > SEO.
  • Look for the robots.txt input box.
  • If you see lines like User-agent: * followed by Disallow: /, all bots are blocked from crawling your site.
  • Solution: Remove or modify these rules to allow crawling. A basic crawl-friendly version is:

        User-agent: *
        Disallow:
  • Save the changes and re-publish the site.
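
To confirm the published rules actually allow crawling, you can fetch the live robots.txt and test it with Python's standard-library parser. A minimal sketch, with yourdomain.com standing in as a placeholder for your real custom domain:

    from urllib.robotparser import RobotFileParser

    # "yourdomain.com" is a placeholder; substitute your real custom domain.
    domain = "https://yourdomain.com"

    parser = RobotFileParser()
    parser.set_url(f"{domain}/robots.txt")
    parser.read()  # downloads and parses the live robots.txt

    # Check whether Googlebot may fetch the homepage and the sitemap.
    for path in ("/", "/sitemap.xml"):
        allowed = parser.can_fetch("Googlebot", f"{domain}{path}")
        print(f"Googlebot {'may' if allowed else 'may NOT'} fetch {path}")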

2. Ensure Site Is Not Password-Protected or Disabled for Indexing

  • In Project Settings > General, check the “Disable indexing” option. If this is enabled, Webflow adds a "noindex" meta tag and blocks bots.
  • Also make sure the site is not password protected, which would prevent bots from accessing any content.
  • Solution: Turn off both settings if you want your content to appear in search engines.
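
To verify that the published pages no longer carry a noindex directive, you can fetch the homepage and look for it in both the X-Robots-Tag header and the HTML. A rough sketch (a simple string check, not a full HTML parse), again with yourdomain.com as a placeholder:

    import urllib.request

    # "yourdomain.com" is a placeholder; substitute your real custom domain.
    req = urllib.request.Request(
        "https://yourdomain.com/", headers={"User-Agent": "Mozilla/5.0"}
    )
    with urllib.request.urlopen(req) as resp:
        robots_header = resp.headers.get("X-Robots-Tag", "")
        html = resp.read().decode("utf-8", errors="replace").lower()

    if "noindex" in robots_header.lower():
        print("X-Robots-Tag header contains noindex")
    if 'name="robots"' in html and "noindex" in html:
        print("The page HTML appears to contain a robots noindex meta tag")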

3. Confirm Sitemap URL and Hosting Status

  • Webflow automatically generates a sitemap at yourdomain.com/sitemap.xml.
  • Make sure you've published your site to a custom domain and not just the Webflow staging domain (e.g., yoursite.webflow.io).
  • In Google Search Console, submit the sitemap as sitemap.xml under your verified custom domain.
  • Visit https://yourdomain.com/sitemap.xml in your browser to ensure the sitemap loads.
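
Beyond loading the sitemap in a browser, you can confirm it is well-formed XML and count its entries with a short script. A minimal sketch, assuming the standard sitemap namespace and using yourdomain.com as a placeholder:

    import urllib.request
    import xml.etree.ElementTree as ET

    # "yourdomain.com" is a placeholder; substitute your real custom domain.
    with urllib.request.urlopen("https://yourdomain.com/sitemap.xml") as resp:
        body = resp.read()

    # fromstring raises ParseError if the response is not valid XML
    # (e.g., an HTML error page served at the sitemap URL).
    root = ET.fromstring(body)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    urls = [loc.text for loc in root.findall(".//sm:loc", ns)]
    print(f"Sitemap parsed successfully with {len(urls)} <loc> entries")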

4. Check for 403 or 404 Errors on Sitemap Submission

  • If Google Search Console shows a 403 Forbidden or 404 Not Found error:
  • Double-check the sitemap URL.
  • Make sure the site is published and accessible.
  • A 403 usually means the server itself is refusing the request (for example, password protection or a firewall rule blocking bots); a robots.txt block is typically reported by Search Console as its own error rather than as a 403.
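
One quick way to tell whether bots specifically are being rejected is to request the sitemap with two different User-Agent strings and compare the status codes. A hedged sketch, with yourdomain.com as a placeholder:

    import urllib.error
    import urllib.request

    # "yourdomain.com" is a placeholder; substitute your real custom domain.
    SITEMAP_URL = "https://yourdomain.com/sitemap.xml"

    def status_for(user_agent: str) -> int:
        req = urllib.request.Request(SITEMAP_URL, headers={"User-Agent": user_agent})
        try:
            with urllib.request.urlopen(req) as resp:
                return resp.status
        except urllib.error.HTTPError as err:
            return err.code

    # If the browser-like request returns 200 but the Googlebot one returns 403,
    # something on the server side is blocking bots specifically.
    print("Browser-like UA:", status_for("Mozilla/5.0"))
    print("Googlebot UA:", status_for("Googlebot/2.1 (+http://www.google.com/bot.html)"))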

5. Wait for Google to Reprocess Changes

  • After updating robots.txt, sitemap, or indexing settings, Google may take some time to reprocess.
  • You can use the “Inspect URL” tool in Google Search Console to manually request indexing.
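
Requesting re-indexing itself is manual in the Search Console UI, but if you want to script the status check, Google's URL Inspection API can report how a URL is currently indexed. A hedged sketch, assuming you already hold an OAuth 2.0 access token with the Search Console (webmasters) scope; the token and URLs below are placeholders:

    import json
    import urllib.request

    ACCESS_TOKEN = "ya29.your-oauth-token"  # placeholder OAuth 2.0 token
    body = {
        "inspectionUrl": "https://yourdomain.com/",
        "siteUrl": "https://yourdomain.com/",  # must match your verified property
    }

    req = urllib.request.Request(
        "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect",
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)

    # The verdict is nested under inspectionResult.indexStatusResult.
    print(result["inspectionResult"]["indexStatusResult"].get("coverageState"))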

Summary

Your Webflow site may be blocking bots due to a restrictive robots.txt, the "Disable indexing" setting, or an incorrectly submitted sitemap. Fix these by allowing crawlers in the SEO settings, ensuring the site is live and not hidden from indexing, and submitting the correct sitemap URL.
