1. Block pages and directories with the robots.txt file
The disallow directive of the robots.txt file is an easy way to exclude single files or whole directories from indexing. To exclude individual files, add this to your robots.txt file:
User-agent: *
Disallow: /directory/name-of-file.html
To exclude whole directories, use this:
User-agent: *
Disallow: /first-directory/
Disallow: /second-directory/
If you use this method, double-check your robots.txt file to make sure that you do not exclude directories that you want to see in Google's search results.
Note that your website visitors can still see the pages that you exclude in the robots.txt file.
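If you want to verify your rules programmatically, the short sketch below (written in Python with the standard-library robotparser module; the example.com URLs are placeholders) fetches a live robots.txt file and tests individual URLs against it:
from urllib import robotparser

# Fetch and parse the site's robots.txt file (example.com is a placeholder domain).
parser = robotparser.RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

# can_fetch() returns False for URLs that a Disallow rule blocks for this user agent.
print(parser.can_fetch("*", "https://www.example.com/first-directory/page.html"))
print(parser.can_fetch("*", "https://www.example.com/allowed-page.html"))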
2. Block individual pages with the meta robots noindex tag
The meta robots noindex tag enables you to tell search engine robots that a particular page should not be indexed. To exclude a web page from the search results, add the following code to the <head> section of the page:
<meta name="robots" content="noindex, nofollow">
In this case, search engines won't index the page, and they also won't follow the links on the page. If you want search engines to follow the links on the page, use this tag instead:
<meta name="robots" content="noindex, follow">
The meta robots noindex tag only influences search engine robots. Regular visitors of your website can still see the pages.
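To confirm that the tag is actually in place, you can fetch a page and look for it. The sketch below uses only Python's standard library; the URL is a placeholder:
from html.parser import HTMLParser
from urllib.request import urlopen

class RobotsMetaFinder(HTMLParser):
    """Collects the content attribute of a page's meta robots tag, if present."""
    def __init__(self):
        super().__init__()
        self.robots_content = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.robots_content = attrs.get("content")

html = urlopen("https://www.example.com/").read().decode("utf-8", errors="replace")
finder = RobotsMetaFinder()
finder.feed(html)
print("meta robots tag:", finder.robots_content)  # e.g. "noindex, nofollow" or None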
3. Block pages with the correct server header status
The status code in the server header tells browsers and search engine robots how they should treat a URL. A regular web page is delivered with a "200 OK" status code. To keep a page out of the index, you can answer requests with a different status code, for example:
- 301 Moved Permanently: this request and all future requests should be sent to a new URL.
- 403 Forbidden: the server refuses to respond to the request.
For search engine optimization purposes, use a 301 redirect if you want to make sure that visitors of old pages are sent to the new pages on your website.
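As an illustration, the sketch below (Python's built-in http.server; the paths and port are made-up examples) answers an old URL with a 301 redirect, a private directory with 403, and everything else with a normal 200 OK:
from http.server import BaseHTTPRequestHandler, HTTPServer

class StatusDemoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-page.html":
            # 301: the page has moved permanently; browsers and robots follow the Location header.
            self.send_response(301)
            self.send_header("Location", "/new-page.html")
            self.end_headers()
        elif self.path.startswith("/private/"):
            # 403: the server refuses to serve this request.
            self.send_response(403)
            self.end_headers()
        else:
            # 200: the normal case.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html><body>OK</body></html>")

HTTPServer(("localhost", 8000), StatusDemoHandler).serve_forever()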
4. Password protect your web pages
If you password protect your pages, only visitors who know the password will be able to view the content. Search engine robots won't be able to access the pages. Password protected pages can have a negative influence on the user experience, so you should test this thoroughly. Details on how to password protect a page can be found in your web server's documentation.
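As a rough illustration of what happens behind the scenes, the sketch below implements HTTP Basic authentication with Python's built-in http.server (the username, password and port are made-up; on a real site you would normally use your web server's own mechanism, such as an .htaccess/.htpasswd pair on Apache):
import base64
from http.server import BaseHTTPRequestHandler, HTTPServer

EXPECTED = "Basic " + base64.b64encode(b"editor:secret").decode()

class ProtectedHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.headers.get("Authorization") != EXPECTED:
            # No (or wrong) credentials: ask the browser for a password.
            # Search engine robots stop here because they cannot answer the challenge.
            self.send_response(401)
            self.send_header("WWW-Authenticate", 'Basic realm="Members only"')
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>Protected content</body></html>")

HTTPServer(("localhost", 8000), ProtectedHandler).serve_forever()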
5. Use cookies and JavaScript to present your content
Cookies and JavaScript can also help you to keep search engine robots away from your door. For example, you can hide content by making it accessible only to user agents that accept cookies.
You can also use complex JavaScript to render your content. Most search engine robots do not execute complex JavaScript code.
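Here is a minimal sketch of the cookie approach, again with Python's built-in http.server (the cookie name and port are made-up examples): clients that do not send the cookie back, which includes most robots, only ever see a placeholder page.
from http.server import BaseHTTPRequestHandler, HTTPServer

class CookieGateHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        cookies = self.headers.get("Cookie", "")
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        if "seen=1" not in cookies:
            # First visit, or a client that ignores cookies (like most robots):
            # set the cookie and serve a placeholder instead of the real content.
            self.send_header("Set-Cookie", "seen=1; Path=/")
            self.end_headers()
            self.wfile.write(b"<html><body>Please enable cookies and reload.</body></html>")
        else:
            self.end_headers()
            self.wfile.write(b"<html><body>Full content for cookie-accepting visitors.</body></html>")

HTTPServer(("localhost", 8000), CookieGateHandler).serve_forever()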
Blocking Google can be helpful for some pages. In general, though, you want Google to index your pages. The tools in SEOprofiler help you to make sure that Google and other search engines can index your web pages correctly.
If you haven't done it yet, create your SEOprofiler account now. If you want to save time, order the full version for our special price.
Create a free trial account now or get the full version and save 99%.
Article by Axandra, SEO software