Only 63% of URLs indexed?

Discussion in 'SEO, Traffic and Revenue' started by Nick, Nov 8, 2009.

  1. Nick

    Nick Regular Member

    Joined:
    Jul 27, 2008
    Messages:
    7,441
    Likes Received:
    218
    In my Google Webmaster Tools console, one site shows roughly 80% of its submitted URLs indexed (this forum has vBSEO), and another shows roughly 63% (this forum does not have vBSEO).


    Why are so few of the URLs indexed?
     
  2. p4guru

    p4guru Addict

    Joined:
    Oct 22, 2009
    Messages:
    82
    Likes Received:
    7
    Looks about right, I think; no idea why more aren't indexed. One of my forums with vBSEO is in the same range.

    One reason could be duplicate meta tag descriptions on some pages.
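    If you want to check for that yourself, a rough Python sketch like this would do it (the URL list is a made-up placeholder, and it assumes the requests library is installed):

        import requests
        from collections import defaultdict
        from html.parser import HTMLParser

        class MetaDescriptionParser(HTMLParser):
            """Grab the content of a <meta name="description"> tag, if any."""
            def __init__(self):
                super().__init__()
                self.description = None

            def handle_starttag(self, tag, attrs):
                a = dict(attrs)
                if tag == "meta" and (a.get("name") or "").lower() == "description":
                    self.description = a.get("content") or ""

        # Made-up URLs -- replace with pages from your own sitemap.
        urls = [
            "http://example.com/forum/thread-1.html",
            "http://example.com/forum/thread-1-print.html",
        ]

        pages_by_description = defaultdict(list)
        for url in urls:
            parser = MetaDescriptionParser()
            parser.feed(requests.get(url, timeout=10).text)
            if parser.description:
                pages_by_description[parser.description].append(url)

        # Any description shared by more than one URL is a duplicate.
        for description, pages in pages_by_description.items():
            if len(pages) > 1:
                print("Duplicate description:", pages)

    Webmaster Tools flags some of these under HTML suggestions too, but a script lets you check the whole sitemap at once.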
     
  3. kev

    kev Regular Member

    Joined:
    Mar 9, 2009
    Messages:
    1,224
    Likes Received:
    61
    Site 1 vbulletin forum with VBSEO sitemap - Sitemap Index Total: 1,111,382, Indexed: 406,461

    Site 2 vbulletin forum with VBSEO sitemap - Sitemap Index Total: 15,956, Indexed: 7,950

    Site 3 vbulletin forum with VBSEO sitemap - Sitemap Index Total: 5,567, Indexed: 3,399

    Site 4 wordpress with sitemap - Sitemap Total: 869, Indexed: 658

    Site 5 wordpress with sitemap - Sitemap Total: 719, Indexed: 352

    One of the things that affects how many pages the search engines index is how many backlinks you have. The more backlinks you have, the more pages tend to get indexed.

    If you look at my sites list, site 1 has more backlinks than site 2, but site 2 has a higher ratio of indexed pages.
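    For reference, here are the indexed ratios worked out from the list above (a quick Python sketch; the numbers are just copied from my post):

        # (total URLs in sitemap, URLs indexed) per site, from the list above
        sites = {
            "Site 1": (1111382, 406461),
            "Site 2": (15956, 7950),
            "Site 3": (5567, 3399),
            "Site 4": (869, 658),
            "Site 5": (719, 352),
        }

        for name, (total, indexed) in sites.items():
            print(f"{name}: {indexed / total:.1%} indexed")

        # Site 1: 36.6% indexed
        # Site 2: 49.8% indexed
        # Site 3: 61.1% indexed
        # Site 4: 75.7% indexed
        # Site 5: 49.0% indexed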
     
  4. vlauria

    vlauria Addict

    Joined:
    Nov 25, 2009
    Messages:
    51
    Likes Received:
    15
    First Name:
    Vincent
    One thought I have is that, depending on how the forum without vBSEO generates its links, the 5,131 URLs in the sitemap could include duplicate content: slightly different URLs pointing to the same page, with Google choosing one version and dropping the other.
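    To illustrate: stock vBulletin can reach the same thread through several URL variants. A rough Python sketch of grouping them by thread id (the example URLs are made up, and which parameters are ignorable is an assumption about a typical setup):

        from urllib.parse import urlparse, parse_qs

        def thread_key(url):
            """Reduce a stock-vBulletin thread URL to its thread id, ignoring
            parameters like highlight= or the s= session id that don't change
            the content being served."""
            parts = urlparse(url)
            query = parse_qs(parts.query)
            if parts.path.endswith("showthread.php") and "t" in query:
                return ("thread", query["t"][0])
            return ("other", url)

        # Made-up examples: three URLs, one thread.
        urls = [
            "http://example.com/forum/showthread.php?t=123",
            "http://example.com/forum/showthread.php?t=123&highlight=seo",
            "http://example.com/forum/showthread.php?t=123&s=abc123",
        ]

        groups = {}
        for url in urls:
            groups.setdefault(thread_key(url), []).append(url)

        for key, dupes in groups.items():
            if len(dupes) > 1:
                print("Same content, multiple URLs:", dupes)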

    The number of backlinks, as kev pointed out, could be a factor. Your internal linking structure can be one as well: Google doesn't guarantee to crawl all pages, and it tends to crawl links at a 'higher level' (closer to the root) more frequently. It also has a better chance of finding pages that have more internal links pointing at them. And the rate at which the content on a page changes affects the crawl rate, which could affect whether Google comes back to check for updates on, say, page 2, 3, 4, etc.
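    That 'higher level' idea is just click depth: the fewest clicks from the homepage, which you can compute as a breadth-first search over the internal link graph. A toy Python sketch (the link graph here is a made-up stand-in for a real crawl of your forum):

        from collections import deque

        # Hypothetical internal link graph: page -> pages it links to.
        links = {
            "/": ["/forumdisplay.php?f=1", "/forumdisplay.php?f=2"],
            "/forumdisplay.php?f=1": ["/showthread.php?t=123"],
            "/forumdisplay.php?f=2": ["/showthread.php?t=456"],
            "/showthread.php?t=123": ["/showthread.php?t=123&page=2"],
            "/showthread.php?t=123&page=2": [],
            "/showthread.php?t=456": [],
        }

        def click_depths(graph, root="/"):
            """Breadth-first search: depth = fewest clicks from the homepage."""
            depths = {root: 0}
            queue = deque([root])
            while queue:
                page = queue.popleft()
                for target in graph.get(page, []):
                    if target not in depths:
                        depths[target] = depths[page] + 1
                        queue.append(target)
            return depths

        for page, depth in sorted(click_depths(links).items(), key=lambda kv: kv[1]):
            print(depth, page)

    Pages that come out at depth 3 or more, like page 2 of a thread in this toy graph, are exactly the ones that get crawled least often.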

    And lastly, one thought that comes to mind is server speed. We noticed that with some simple code optimization we could serve pages faster and have Google crawl more pages on our site. The two were inversely proportional: when we halved the time it took to serve a page, Google crawled twice as many pages per day! We verified this in Webmaster Tools.
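    If you want a rough before-and-after number for your own pages, even something this simple works (Python, assumes the requests library is installed; the URL is a placeholder):

        import time
        import requests

        # Placeholder -- point this at one of your own forum pages.
        URL = "http://example.com/forum/showthread.php?t=123"

        def average_serve_time(url, samples=10):
            """Fetch a page several times and average the wall-clock time."""
            total = 0.0
            for _ in range(samples):
                start = time.perf_counter()
                requests.get(url, timeout=30)
                total += time.perf_counter() - start
            return total / samples

        print(f"average: {average_serve_time(URL):.3f} s per request")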
     
