How can I correctly set up the robots.txt file in Webflow to ensure certain pages are not ignored by search engines?

TL;DR
  • Edit your Webflow robots.txt via Project Settings > SEO and remove any Disallow directives that block pages you want indexed.
  • Ensure "Disable SEO indexing" is off for key pages, then publish and test using Google's Robots Testing Tool.

To ensure certain pages are not ignored by search engines in Webflow, configure your robots.txt file so it doesn’t block those pages. Here’s how:

1. Understand What robots.txt Controls

  • The robots.txt file tells search engine crawlers which parts of your site to ignore or crawl.
  • If a page is listed under a Disallow: directive, it won’t be crawled.
  • If a page isn’t listed, it’s typically eligible for crawling unless it’s blocked by other means (like a noindex meta tag or password protection); see the short example after this list.

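For example, a minimal robots.txt might look like the sketch below (the /private path is just a placeholder):

```
# Applies to all crawlers
User-agent: *
# Anything under /private is skipped; every other path stays eligible for crawling
Disallow: /private
```
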
2. Access Your Webflow robots.txt Settings

  • Go to Project Settings.
  • Open the SEO tab.
  • Scroll to the robots.txt section where you can edit the file directly.

3. Ensure You’re Not Blocking Important Pages

  • Remove any Disallow: lines that list pages you want indexed.
  • Example: if you previously had Disallow: /blog but want your blog posts indexed, delete that line (see the before/after sketch below).

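As a sketch of that before/after (the /admin and /blog paths are placeholders):

Before (blog posts are blocked from crawling):

```
User-agent: *
Disallow: /admin
Disallow: /blog
```

After (only /admin stays blocked, so /blog can be crawled again):

```
User-agent: *
Disallow: /admin
```
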
4. Use Correct Syntax for Exclusions Only

To block certain pages while leaving the rest of your site crawlable:

  • Only use Disallow: for pages or folders you don’t want crawled, such as login or admin pages.

  • Example of a safe configuration:

    ```
    User-agent: *
    Disallow: /admin
    Disallow: /signup
    ```

    This tells search engines to skip /admin and /signup, but everything else is allowed.

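If you need to keep most of a folder blocked while still exposing a single page inside it, major crawlers such as Googlebot and Bingbot also honor an Allow: directive (it’s part of the current robots.txt standard, RFC 9309). A sketch, using a hypothetical /admin/help page:

```
User-agent: *
# Block the admin area...
Disallow: /admin
# ...but let crawlers reach this one page inside it (the more specific rule wins)
Allow: /admin/help
```
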
5. Check Custom Page Settings

  • Go to the Page Settings for any key page.
  • Ensure “Disable SEO indexing” is turned off if you want the page indexed.
  • Even if the page is allowed in robots.txt, Webflow adds a noindex meta tag when this setting is enabled, which tells search engines not to index it.

6. Test After Publishing

  • After publishing your site, check the live robots.txt file: yourdomain.com/robots.txt.
  • Use Google’s Robots Testing Tool (in Google Search Console) to test if a page is blocked.

Summary

To ensure important pages aren't ignored, remove unnecessary Disallow: entries from your robots.txt in Webflow and avoid disabling indexing in individual page settings. Publish and verify the live file, then test with Google’s tools for peace of mind.
