Sometimes the search engines are prevented from crawling your website for one reason or another. You’ll want to make sure you track down and eliminate these obstacles.
1. Your site requires a cookie for navigation. Bots don’t accept and store cookies the way a regular browser user can, so any page that can only be reached with a cookie set is effectively invisible to them.
2. Framed websites. Back in the day (10+ years ago) when I first started designing & programming websites, I loved using frames. But I had no idea that frames would hinder my sites’ ability to rank in the search engines.
3. Long, complicated URLs such as http://www.website.com/page.php?ID=HuUj=987sj=%site%. Crawlers can struggle with URLs stuffed with query-string parameters and session IDs; one common fix is a server-side URL rewrite, sketched after this list.
4. Login pages. Bots can’t fill in a username and password, so anything hidden behind a login will never be crawled.
5. Redirect pages (Google hates these, especially pages that exist only to bounce visitors to another URL)
6. Poor linking structure on your website. Each page on your domain should be linked to from the home page (or a sitemap), or the bots may have a difficult time crawling it because they won’t be able to find it. Ideally, you want a very convenient linking structure in which each page on your domain is accessible from every other page. This is why WordPress blogs are favored by the search engines: their layout allows for exactly this kind of linking. (A minimal sitemap sketch follows this list.)
7. You accidentally have the “I would like to block search engines” option selected under the “Privacy” tab in your WordPress dashboard. Sometimes this option is selected automatically, so always double-check that it is turned off (a quick way to verify is shown after this list). Similar options exist on other platforms such as Blogger.
8. You have a robots.txt file preventing the search engines from crawling certain pages. Usually this is done purposefully to prevent unwanted content from being indexed, but sometimes it might be a mistake.
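On that last point, it’s worth opening yourdomain.com/robots.txt and reading it line by line. As a rough illustration (the directory name here is made up), a file like the following would quietly keep crawlers out of a whole section of your site, and a bare “Disallow: /” would keep them out of everything:

    # Applies to every crawler
    User-agent: *
    # Keeps bots out of everything under /members/
    Disallow: /members/
    # A lone slash, as in the commented-out line below, would block the ENTIRE site
    # Disallow: /

If a Disallow line covers pages you actually want ranked, remove it or narrow it down, and the crawlers will be able to reach those pages again.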
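While you’re at it, a quick way to verify point 7 is to view the source of your home page. Depending on your WordPress version, that privacy setting typically adds a robots meta tag along these lines (the exact markup may vary), which tells the search engines not to index or follow anything:

    <meta name="robots" content="noindex,nofollow" />

If you spot that tag and you do want to be indexed, go back to the Privacy tab and turn the option off.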
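For the linking structure in point 6, an XML sitemap gives the bots a guaranteed way to find every page even if your internal links miss one. Here is a minimal sketch following the sitemaps.org format; the URLs and dates are placeholders you’d swap for your own pages:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.website.com/</loc>
        <lastmod>2012-01-15</lastmod>
      </url>
      <url>
        <loc>http://www.website.com/about-us.php</loc>
        <lastmod>2012-01-10</lastmod>
      </url>
    </urlset>

Save it as sitemap.xml in your site’s root and submit it to the search engines (WordPress sitemap plugins can generate and update one for you automatically).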
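And for the long, messy URLs in point 3, the usual fix is a server-side rewrite so that visitors and bots see a short, readable address while your script still receives its parameters. This sketch assumes an Apache server with mod_rewrite enabled, and the file and parameter names are just examples:

    # Turns a clean address like /page/987 into page.php?ID=987 behind the scenes
    RewriteEngine On
    RewriteRule ^page/([0-9]+)/?$ page.php?ID=$1 [L,QSA]

Drop that into your .htaccess file, then link to the clean /page/987 style URLs throughout your site.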