Robots.txt

Discussion in 'SEO, Traffic and Revenue' started by Nick, Jun 4, 2009.

  1. Nick

    Nick Regular Member

    What areas of your site (if any) do you 'ask' the search engine spiders to not crawl, via a robots.txt file?
     
  2. theboss

    theboss Newcomer

    Nope, I want it to find everything :P
     
  3. Peggy

    Peggy Regular Member

    No, you don't, really. Most members don't want their profile info listed in the search engines ;)

    You want the spiders to concentrate on content, not your registration page, usercp page, etc.

    This is what my robots.txt file looks like -
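    The file itself isn't quoted in the thread, but a typical vBulletin robots.txt along the lines described here (registration, user CP, and other non-content pages blocked) might look like the sketch below. The paths are stock vBulletin script names chosen for illustration, not necessarily this poster's exact rules:

    ```text
    User-agent: *
    # Block account and utility pages that carry no search-worthy content
    Disallow: /register.php
    Disallow: /login.php
    Disallow: /usercp.php
    Disallow: /profile.php
    Disallow: /sendmessage.php
    Disallow: /search.php
    # Block scripts that duplicate thread content or serve internal requests
    Disallow: /ajax.php
    Disallow: /misc.php
    Disallow: /printthread.php
    ```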

     
  4. Demo

    Demo Regular Member

    Mostly vBulletin files that can create duplicate content, or that aren't content-related, like ajax.php
     
  5. Bundy

    Bundy Admin Talk Staff

    Yeah, you want a lot blocked off. Not that many bots really pay attention to the file anymore, though :/

    You also never want to mention your admincp or modcp areas or files in your robots.txt. The file is publicly readable, so listing those paths just advertises where they are. It's a big no-no.

     
  6. Michael Biddle

    Michael Biddle vBSEO Mentor

    Just an FYI, do not copy Kevin's above. It looks like he is using vBSEO without an extension (.html). If you use his file, all of your content would be blocked.
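    This kind of mistake is easy to check mechanically: Python's standard urllib.robotparser will tell you whether a given path is blocked by a set of rules before you deploy them. The rules and URLs below are invented for illustration, not anyone's actual file:

    ```python
    from urllib.robotparser import RobotFileParser

    # Hypothetical rules: if your forum serves content under /threads/
    # (extensionless vBSEO-style URLs), a Disallow like this blocks it all.
    robots_txt = """\
    User-agent: *
    Disallow: /register.php
    Disallow: /threads/
    """

    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())

    # Every content URL under /threads/ is blocked for all crawlers:
    print(rp.can_fetch("*", "/threads/robots-txt.12345/"))  # False
    # Pages not matching a Disallow rule remain crawlable:
    print(rp.can_fetch("*", "/forumdisplay.php?f=2"))       # True
    ```

    Running a handful of your real thread URLs through can_fetch() is a quick sanity check before copying someone else's robots.txt.
    
    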
     
  7. Bundy

    Bundy Admin Talk Staff


    Yes! Thank you. Hahaha, every single one of your files would be blocked ;)

    I use no extensions at all. I actually use briansol's setup.

    I probably should have mentioned that.
     
  8. doodles

    doodles Adept

    Mine

     
  9. Nick

    Nick Regular Member

    We don't use extensions here (unless I forgot to change some of them...), so I'm assuming it's safe to use your robots.txt?
     
  10. Bundy

    Bundy Admin Talk Staff

    Yeah, it looks like your setup is pretty much the same as mine, except you use keywords in your URLs and I use all numbers for the most part. But that won't matter at all ;)
     
  11. Michael Biddle

    Michael Biddle vBSEO Mentor

    Looking at your URLs, you can use his as well.
     
