Using Robots.txt to Protect WordPress from Google

I’ve seen several people lately talking about using robots.txt files to improve their Google rankings and get pages out of the supplemental index. This matters especially if you use WordPress: you can quickly end up with far too many pages that just list out your content, soaking up link juice while the posts themselves sit there without enough PR to get indexed. Archives pages, Tags pages, Categories pages, feed pages… they add up quickly, and all of them show pretty much the same content.
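
As a rough sketch, a robots.txt along these lines would keep crawlers out of most of those duplicate listing pages. The paths here assume WordPress’s default tag, category, and feed prefixes; your permalink structure may differ, so treat this as a starting point rather than a drop-in file:

    User-agent: *
    # WordPress internals - nothing here worth indexing
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    # Duplicate listings of the same posts
    Disallow: /tag/
    Disallow: /category/
    Disallow: /feed/
    Disallow: /comments/feed/

Date-based archive pages depend entirely on your permalink settings (something like a Disallow: /2007/ line per year, for instance), so add those to match your own URLs. The whole point is to let Google spend its crawl, and your PR, on the posts themselves instead of on a dozen listing pages that all repeat the same excerpts.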

Read more about this at Earners Blog, and at Shoemoney. Both of these guys saw a huge increase in traffic after they put a few simple exclusion rules in place. Each has a different list that worked for him, so check out both posts, and make sure you read the comments; that’s where the real gold is at Shoemoney’s site.

One thought on “Using Robots.txt to Protect WordPress from Google”

  1. Stuart

    I actually found a much easier way to do this, Jason.

    In .htaccess you can just put:

    Options -Indexes

    That will stop anyone, including the search engines, from being able to browse your directory structure. So it’s also good for security :)

    Cheers!
    Stu
