We all know that search engine crawlers index your whole website, caching its pages for the search index. The process is fully automated but can take some time. Moreover, there are certain pages on a website that you may not want shown in search results, such as archive pages, label pages, and so on. So the admin of a website excludes these pages using robots.txt or noindex robots header tags.
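For context, the two mechanisms work differently: robots.txt blocks crawling of matching paths, while a noindex robots tag lets the page be crawled but asks engines not to list it. A minimal sketch of each (the `/search` path here is just an illustrative example, not taken from any specific blog):

```
# robots.txt — placed at the site root; blocks crawling of matching paths
User-agent: *
Disallow: /search
```

```html
<!-- robots meta tag — placed in the page's <head>;
     the page can still be crawled, but engines are asked not to index it -->
<meta name="robots" content="noindex">
```

Blogger's Custom Robots Tags feature, used in the steps below, manages the second mechanism for you on a per-post basis.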
Steps to Prevent Search Engines from Indexing a Specific Post or Page:
- First of all, you need to enable custom robots header tags on your Blogger blog.
- After that, open the Post or Page editor from your Blogger blog’s dashboard.
- Under Post Settings, you will see Custom Robots Tags. Click on it.
- Untick the ‘default’ and ‘all’ boxes.
- Tick the ‘noindex’ box (‘none’ is stricter still: it is equivalent to noindex plus nofollow) and click Done.
- Now publish the post or page whenever you are ready.
- You’re all set. That particular post or page will now be excluded from search results: the noindex tag tells search engine spiders not to add the page to their index.
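If you want to verify the result, view the published page’s HTML source: Blogger should emit a robots meta tag in the page head reflecting your settings, along these lines (the exact attribute order and quoting may vary by template):

```html
<meta content='noindex' name='robots'/>
```

Seeing this tag confirms the per-post setting took effect; search engines will drop the page from results the next time they recrawl it.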