If you're seeing the "Excluded by ‘noindex’ tag" error in Google Search Console for pages on your Blogger site, it means that the page has been explicitly marked with a noindex
directive, instructing Google not to index it. Here's how you can fix this issue:
1. Check for the Noindex Meta Tag in Blogger
The first thing to do is ensure that your Blogger page doesn't have a noindex
tag set unintentionally.
Steps to check for and remove the noindex tag:
Access the HTML of Your Post/Page:
- Go to your Blogger Dashboard.
- Select the Post or Page where the issue is occurring.
- In the post editor, click on HTML to view the HTML code of your content.
Search for the Noindex Tag:
- Look for a line of code that looks like this: <meta name='robots' content='noindex'/>
- This tag tells Google not to index the page.
- Remove this line if you want Google to index the page.
Save Your Changes:
- After removing the tag, switch back to the Compose view and save the post or page.
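If you would rather check a page programmatically than eyeball its markup, a small script can scan the HTML for a robots noindex meta tag. Here is a minimal sketch using only Python's standard library (the function names are illustrative, not part of any Blogger API):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of every <meta name="robots" ...> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())

def has_noindex(html: str) -> bool:
    """Return True if the page HTML carries a robots noindex directive."""
    finder = RobotsMetaFinder()
    finder.feed(html)
    return any("noindex" in d for d in finder.directives)

# Example: a page blocked from indexing vs. one that is not.
blocked = "<html><head><meta name='robots' content='noindex'/></head></html>"
allowed = "<html><head><title>My post</title></head></html>"
print(has_noindex(blocked))  # True
print(has_noindex(allowed))  # False
```

You could feed this the saved HTML of a post, or the source of the live page, to confirm whether the tag is really gone after editing.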
2. Check Blogger’s Search Preferences (Meta Tags Settings)
Sometimes, the noindex
tag might be added through Blogger's settings, especially if you're having issues with pages like the homepage, tag pages, or archive pages.
Steps to Check Search Preferences:
In your Blogger Dashboard, go to Settings.
Scroll down to Search Preferences.
Check the Meta Tags section and ensure that Custom Robots Tags is set to No.
- If it’s set to Yes, remove any noindex tag from the custom robots section.
Example of a valid setting:
- Custom Robots Tags: No.
Save any changes made in the settings.
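Some Blogger themes also inject noindex into particular page types (archive or search pages, for example) directly in the theme XML, independently of the settings above. If the setting looks correct but the tag still appears, search your theme's code (Theme > Edit HTML) for a conditional block along these lines (an illustrative pattern, not copied from any specific theme):

```xml
<b:if cond='data:blog.pageType == &quot;archive&quot;'>
  <meta content='noindex' name='robots'/>
</b:if>
```

Removing or adjusting such a block only affects the page types named in the condition, so main posts and pages are unaffected.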
3. Check Robots.txt File (Blogger Default)
Sometimes Blogger automatically generates a robots.txt
file that could block certain pages from being indexed.
Steps to Review and Edit Robots.txt:
Go to your Blogger Dashboard > Settings > Search Preferences.
In the Crawlers and Indexing section, click on Edit robots.txt.
Ensure the file isn't blocking important pages from being crawled by search engines. Here’s an example of a correctly configured robots.txt:
User-agent: *
Disallow: /search
Allow: /

- Disallow: /search keeps crawlers out of Blogger's internal search result pages (such as /search/label/... URLs), so they are not crawled or indexed.
- Allow: / ensures that the main website content is crawlable and indexable.
Save any changes.
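To confirm that a given URL isn't blocked by rules like the ones above, you can feed the robots.txt contents to Python's built-in parser. A minimal sketch (the blog address is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Blogger's typical default rules for general crawlers.
rules = """\
User-agent: *
Disallow: /search
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Internal search/label pages are blocked; regular posts are crawlable.
print(parser.can_fetch("Googlebot", "https://example.blogspot.com/search/label/news"))   # False
print(parser.can_fetch("Googlebot", "https://example.blogspot.com/2024/05/my-post.html"))  # True
```

This is a quick way to sanity-check edits to the file before saving them in Blogger.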
4. Use Google Search Console to Request Re-Crawling
After you’ve removed the noindex tag, use Google Search Console to ask Google to re-crawl and re-index your page.
Steps to Request Re-Crawling:
- Go to Google Search Console.
- In the left-hand sidebar, click on URL Inspection.
- Enter the URL of the page that was excluded.
- If the page has been updated and is now free of the noindex tag, click Request Indexing.
- Google will queue your page for re-crawling.
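Before requesting indexing, keep in mind that a noindex can also be delivered in the X-Robots-Tag HTTP response header, where it never appears in the page's HTML. Blogger rarely sets this itself, but checking the response headers rules it out. A minimal sketch (the header values are illustrative):

```python
def noindex_in_headers(headers: dict) -> bool:
    """Return True if any X-Robots-Tag header value contains 'noindex'."""
    for name, value in headers.items():
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            return True
    return False

# Example response headers (illustrative values):
print(noindex_in_headers({"Content-Type": "text/html",
                          "X-Robots-Tag": "noindex, nofollow"}))  # True
print(noindex_in_headers({"Content-Type": "text/html"}))          # False
```

In practice you would pass in the headers from a real response (for example, the result of an HTTP HEAD request to the affected URL).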
5. Monitor the Page in Google Search Console
- After fixing the issue, it may take a few days for the page to be re-crawled and indexed.
- Check the Page indexing (formerly Coverage) report in Google Search Console to see if the page is now included in the index.
6. Additional Considerations
- Pages with Low-Quality Content: If the page has very little or duplicate content, Google may still choose not to index it.
- Ensure the page has valuable and unique content.
- Check for Other Technical Issues: If there are any other technical issues with the page, like slow load times, broken links, or server errors, it could affect indexing as well.
Next Steps
- Once the changes are made, Google should eventually remove the "Excluded by 'noindex' tag" status from the affected pages.
- If the issue persists, double-check your settings and content quality.
Let me know if you need further help with any specific step!