If your Webflow site isn't being indexed by Google even after you've followed SEO best practices, a hidden setting or configuration issue may be blocking crawling or indexing.
1. Check Site Visibility Settings in Webflow
- Go to Project Settings > SEO tab.
- Ensure “Disable Webflow subdomain indexing” is unchecked if you’re still using the Webflow.io domain.
- Ensure “Disable Website Indexing” is unchecked. If enabled, this adds a `noindex` meta tag to all site pages.
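For reference, the tag that setting injects looks something like the snippet below (illustrative markup; the exact attribute order may differ):

```html
<!-- Added to every page while "Disable Website Indexing" is on (illustrative) -->
<meta name="robots" content="noindex">
```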
2. Inspect Published Robots.txt
- Navigate to Project Settings > SEO tab and scroll to the robots.txt field.
- Check for disallow rules like `Disallow: /` or rules that block key folders (e.g., `/` or `/blog`).
- If using a custom robots.txt, make sure it's not blocking important areas.
3. Check Individual Page Settings
- In the Webflow Designer, go to each page’s Page Settings (gear icon).
- Ensure “Hide this page from search engines” is not enabled; this adds a page-level `noindex` tag.
4. Use Google Search Console
- Add and verify your site in Google Search Console.
- Use the URL Inspection Tool on any affected pages:
- If it says “Excluded by ‘noindex’ tag” or similar, that confirms an issue.
- You can also Request Indexing to re-submit after making changes.
- Check Coverage and Sitemaps reports for indexing errors.
5. Confirm Sitemap Configuration
- Webflow auto-generates a sitemap (
yourdomain.com/sitemap.xml
) if enabled. - Go to SEO settings in Webflow and confirm “Auto-generate sitemap” is turned on.
- In Search Console, ensure you’ve submitted the sitemap URL.
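If you want to see exactly which URLs the generated sitemap exposes, a quick check like the sketch below works (Python standard library only; `yourdomain.com` is a placeholder for your own domain):

```python
# Minimal sketch: fetch a Webflow-generated sitemap and list the URLs it contains.
# Python 3 standard library; replace "yourdomain.com" with your own domain.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://yourdomain.com/sitemap.xml"  # placeholder domain
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

# Each <url><loc> entry is a page the sitemap is advertising to search engines.
for loc in tree.getroot().findall("sm:url/sm:loc", NS):
    print(loc.text.strip())
```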
6. Check for Canonical Tag Conflicts
- Webflow sets canonical tags automatically.
- In Page Settings, avoid custom canonical URLs pointing to non-indexed or irrelevant domains unless intentional.
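For reference, a healthy canonical tag simply points at the page's own URL on the domain you want indexed; a tag pointing somewhere else is worth a second look (illustrative markup and URLs):

```html
<!-- Typical: the canonical points to the page's own URL on your custom domain -->
<link rel="canonical" href="https://yourdomain.com/blog/post-title">

<!-- Worth reviewing: the canonical points at a different (possibly non-indexed) domain -->
<link rel="canonical" href="https://your-project.webflow.io/blog/post-title">
```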
7. Make Sure Your Site Is Live & Accessible
- Ensure you're publishing to your custom domain and it loads without issues.
- SSL must be enabled in Hosting Settings—Google may deprioritize non-HTTPS sites.
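One quick way to verify both points from outside the Webflow UI is to request the homepage over HTTP and HTTPS and confirm you land on an HTTPS URL with a 200 status; a minimal sketch (Python standard library, placeholder domain) is shown below:

```python
# Minimal sketch: confirm the site resolves over HTTPS and returns a 200 status.
# Python 3 standard library; "yourdomain.com" is a placeholder for your own domain.
import urllib.request

for url in ("http://yourdomain.com/", "https://yourdomain.com/"):
    # urlopen follows redirects, so the HTTP request should land on the HTTPS URL;
    # a 4xx/5xx response raises urllib.error.HTTPError instead of returning.
    with urllib.request.urlopen(url, timeout=10) as response:
        print(f"{url} -> {response.geturl()} ({response.status})")
```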
8. Look For Third-Party Script Conflicts
- Rarely, third-party scripts or SEO tools can inject `noindex` meta tags.
- Use Inspect Element in the browser or view the site source to search for `<meta name="robots"` and check its value.
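If it's easier to check from a script than from the browser, a small sketch like this (Python standard library; the URL is a placeholder for one of your published pages) downloads the page and reports any robots meta tag it finds:

```python
# Minimal sketch: fetch a published page and report any robots meta tag in its HTML.
# Python 3 standard library; replace the URL with one of your own pages.
import re
import urllib.request

URL = "https://yourdomain.com/"  # placeholder

html = urllib.request.urlopen(URL, timeout=10).read().decode("utf-8", errors="replace")

# Find <meta ... name="robots" ...> tags regardless of attribute order or case.
for tag in re.findall(r'<meta[^>]*name=["\']robots["\'][^>]*>', html, re.IGNORECASE):
    content = re.search(r'content=["\']([^"\']*)["\']', tag, re.IGNORECASE)
    print("robots meta found:", content.group(1) if content else tag)
```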
9. Give It Time
- Indexing can take several days to weeks after a site's launch.
- High competition or low-quality signals can delay indexing despite correct setup.
Summary
Check for `noindex` tags, robots.txt blocks, individual page settings, and Search Console feedback to find the indexing issue. Use the URL Inspection Tool and verify your sitemap is submitted. Most indexing issues trace back to a robots.txt misconfiguration or hidden `noindex` tags.