Is anyone else experiencing issues with their robots.txt file in Webflow? I'm having trouble submitting my sitemap to Google and testing the site for rich results because the site is blocked by robots.txt. I've tried editing the robots.txt file and deleting text from the SEO tab, but neither has worked. Even manually requesting indexing for each URL in Google Search Console doesn't resolve the issue. Any suggestions?

TL;DR
  • Disable “Prevent search engines from indexing” in Webflow's General settings and make sure the robots.txt field in the SEO tab doesn't contain Disallow: /.
  • Republish your site to all domains, verify the live robots.txt file is correct, then resubmit your sitemap and pages in Google Search Console.

You're experiencing issues with robots.txt in Webflow preventing Google from crawling your site, which in turn affects sitemap submission and rich result testing.

Here’s how to resolve robots.txt and indexing issues in Webflow:

1. Check Robots.txt in Webflow Settings

  • Go to Project Settings > SEO tab in your Webflow dashboard.
  • Scroll to the "Robots.txt" section and look for directives such as Disallow: / or other disallow rules that block crawlers.
  • If the field isn't empty, delete or modify any blocking rules unless they're intentional (see the examples below).
  • If the field is empty but the live file still serves Disallow: /, continue to the next steps.
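For reference, this is what a fully blocking robots.txt looks like; it tells every crawler to stay out of the whole site:

```
User-agent: *
Disallow: /
```

A permissive file, by contrast, has an empty Disallow value (the Sitemap line is optional, and the domain here is a placeholder):

```
User-agent: *
Disallow:

Sitemap: https://yourdomain.com/sitemap.xml
```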

2. Confirm the Site Is Published

  • Make sure you've published your site to your custom domain (not just the Webflow subdomain).
  • Webflow only updates robots.txt and sitemap.xml on the published version, so changes won't take effect until you republish.
  • Use the “Publish to selected domains” button and ensure both the Webflow subdomain and custom domain are selected.

3. Test the Live robots.txt File

  • Visit your live site’s robots.txt file by going to https://yourdomain.com/robots.txt.
  • Confirm whether it still includes Disallow: / or other unexpected rules.
  • If it still shows incorrect content, wait a few minutes (Webflow's servers can take time to propagate changes), then hard-refresh or check in an incognito window; a quick programmatic check is sketched below.
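If you'd rather verify this in a script than in a browser, here is a minimal Python sketch (the domain is a placeholder) that fetches the live file while asking caches for a fresh copy, then uses the standard-library parser to check whether Googlebot may crawl the homepage under those rules:

```python
import urllib.request
from urllib.robotparser import RobotFileParser

# Placeholder domain: substitute your own custom domain.
ROBOTS_URL = "https://yourdomain.com/robots.txt"

# Fetch the live file, asking intermediate caches not to serve a stale copy.
req = urllib.request.Request(ROBOTS_URL, headers={"Cache-Control": "no-cache"})
with urllib.request.urlopen(req) as resp:
    body = resp.read().decode("utf-8", errors="replace")
print(body)

# Feed the fetched rules to the parser and ask whether Googlebot
# is allowed to crawl the homepage under them.
parser = RobotFileParser()
parser.parse(body.splitlines())
print("Googlebot allowed:", parser.can_fetch("Googlebot", "https://yourdomain.com/"))
```

If this prints "Googlebot allowed: False", the live file is still blocking crawlers and the steps below should help.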

4. Ensure “Disable Indexing” Is Off

  • In Project Settings > General, scroll down to “Advanced Publishing Options”.
  • Uncheck “Prevent search engines from indexing this site” if it’s enabled.
  • If this box is checked, Webflow automatically serves a robots.txt containing Disallow: /, as shown below.
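Per the behavior described above, an enabled checkbox means the published site serves a file equivalent to:

```
User-agent: *
Disallow: /
```

This overrides whatever you typed into the SEO tab, which is why an apparently empty robots.txt field can still block the whole site.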

5. Resubmit in Google Search Console

  • Go to Google Search Console, select your property, and use “URL inspection” to re-test any pages showing as blocked.
  • Also resubmit the sitemap under Sitemaps > Add a new sitemap; you can first confirm the sitemap itself is reachable with the sketch below.
  • Google may take anywhere from a few hours to a few days to re-crawl your site, so don't expect changes to show immediately.
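Before resubmitting, it can save a round trip to confirm the sitemap is reachable and well-formed. A small Python sketch, assuming Webflow's generated sitemap lives at /sitemap.xml and using a placeholder domain:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder domain; Webflow serves the generated sitemap at /sitemap.xml.
SITEMAP_URL = "https://yourdomain.com/sitemap.xml"

with urllib.request.urlopen(SITEMAP_URL) as resp:
    print("HTTP status:", resp.getcode())
    data = resp.read()

# A well-formed sitemap parses as XML and lists its pages in <url> entries.
root = ET.fromstring(data)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
print("URL entries found:", len(root.findall("sm:url", ns)))
```

A 200 status and a non-zero entry count mean the file is ready for Search Console; a parse error or zero entries suggests the site hasn't been republished since your changes.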

Summary

The likely cause is the “Prevent search engines from indexing” option or a stale Disallow: / in robots.txt. Disable that setting, clear or correct the robots.txt field in the SEO tab, republish the site, and re-test in Google Search Console once the live robots.txt is clean.
