To ensure certain pages are not ignored by search engines in Webflow, you need to properly configure your robots.txt file so it doesn’t block those pages. Here's how to do that:
If a page's URL matches a Disallow: directive in your robots.txt, it won't be crawled. Remove any Disallow: lines that list pages you want indexed. For example, if your robots.txt contains Disallow: /blog and you want your blog posts indexed, you need to delete that line.
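For instance, a robots.txt like the sketch below (assuming your posts live under a /blog folder) keeps crawlers out of every blog post until the Disallow: /blog line is removed:
```
User-agent: *
Disallow: /blog
```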
To protect certain pages but allow others, only use Disallow: for pages or folders you don't want indexed, like login or admin pages.
Example of a safe configuration:
```
User-agent: *
Disallow: /admin
Disallow: /signup
```
This tells search engines to skip /admin and /signup, but everything else is allowed.
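If you don't need to block anything at all, the minimal configuration below leaves every page crawlable (an empty Disallow: value blocks nothing):
```
User-agent: *
Disallow:
```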
Also check each page's settings in Webflow: Webflow adds a noindex meta tag if the page's indexing is turned off there, which tells search engines to ignore the page.
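The standard robots noindex meta tag looks like this in a page's <head>:
```
<meta name="robots" content="noindex">
```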
To ensure important pages aren't ignored, update your robots.txt in Webflow by removing unnecessary Disallow: entries, and avoid disabling indexing in individual page settings. Publish the site, verify the live robots.txt file, then test with Google's tools for peace of mind.
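As a quick check, you can fetch the published file directly (swap example.com for your own domain) and confirm only the intended Disallow: lines are present, then run the affected URLs through the URL Inspection tool in Google Search Console:
```
curl https://example.com/robots.txt
```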