Why Google is not indexing all pages of my website

Keeping all the pages of your website indexed in Google can be a challenging task, especially now that Google is becoming stricter about SEO practices. These days, the biggest search engine is continuously pushing all blogs to follow its best SEO practices.

Most websites depend heavily on organic traffic from Google. And a search engine can send traffic to your site only if it has crawled your website and holds your pages in its index.

But if you notice that Google has de-indexed some of your pages from its search results, then as a webmaster you have reason to worry.

How to check if my blog is indexed by Google

To check whether your blog and its pages have been indexed by Google, follow the steps below:


Go to google.com and type site:example.com in the search bar. See the example below. Replace the word example with your domain name. I have used .com because my blog uses the .com extension and a vast number of sites are on this popular extension; you need to replace it with your actual TLD.

Google no indexing website

If nothing comes up, then try searching site:www.example.com

If you still cannot see any results, it means Google has not indexed your website and your site is missing from Google's index. Continue reading below to learn how to resolve the indexing issue.

site:domain inurl:slug

If you want to check specific pages of your blog in Google, try this method. It points out whether the specific slug or URL is present in the search engine's index.

Google index slug

site:domain.com filetype:filetype

To check for specific filetypes, you can use the style shown above. Here filetype stands for a file type such as html, xml, pdf, etc.

Google index crawl


Log in to Search Console and check how many pages your sitemap has submitted for crawling. Sitemaps can also be viewed on your blog itself, or from the WordPress SEO plugin that you are using.

Index Status Report

Head over to Search Console to check the URLs crawled by Google. This information is displayed in the Index Status report, which is very useful for judging the performance of your blog in organic search. It displays the information in the form of a graph.

• It tells you the total number of URLs in Google's index.

• It also tells you how many URLs are blocked by your robots.txt file.

• You can also see how many pages of your blog you removed using the URL removal tool.

• Use this report to spot any sudden increase or decrease in the number of pages in Google's index.

• If you see a sudden spike in the number of pages, check whether Google is crawling your pages twice (for example, due to www and non-www versions).

• The graph lets you review and fix any such anomalies.

Why does my website not show up in Google Search

There can be multiple reasons why Google has not indexed all of your pages, or even a single page. Below is a list of 14 reasons.

1. Your website is very new

If you purchased your domain a few hours ago and have added only one or two articles, chances are that Google does not know about your domain yet. In that case, you need to be a little patient, and try posting your links on Facebook or Twitter.

2. Blocked by robots.txt

If you have accidentally blocked certain pages of your blog from being crawled, they will not be present in Google's cache.

Robots.txt is a file designed specifically for search engines; a search engine crawls any site based on the rules defined in that file.

If you see that your robots.txt file is blocking some URLs that it should not, you should remove those rules from it.

You can check out our robots.txt file and modify yours accordingly.
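For illustration, here is a minimal robots.txt sketch (the /private/ path is only a placeholder): compliant crawlers will skip anything matched by a Disallow rule, so an overly broad rule here can keep whole sections of your blog out of the index.

```text
# Example robots.txt — the Disallow path below is illustrative
User-agent: *
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
```

If a page you want indexed falls under a Disallow rule like this, delete or narrow that rule and let Google recrawl the page.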

3. Your blog does not have sitemap.xml

A sitemap is a must for any blog. A sitemap contains the URLs of all the posts and pages that you want crawled and indexed. If your blog is missing the sitemap.xml file, you should build it now. For WordPress, there are plenty of plugins available, such as All in One SEO or the Yoast SEO plugin. Check here for Google's sitemap policy.
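For reference, a minimal sitemap.xml following the standard sitemap protocol looks like this (the URLs are placeholders for your own pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/sample-post/</loc>
  </url>
</urlset>
```

An SEO plugin will generate and update a file like this automatically, so you only need to write one by hand on a non-WordPress site.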

4. Internal Duplicate content

If you have duplicate content on your website, Google might de-index your site. If you are writing the same exact wording for every article and page of your site, then your site holds no importance, neither in the eyes of any search engine nor for any human user. If that is the case, you should fix it.

Confused? If no website owner deliberately creates pages with similar content, how can duplicate content exist on a site? The answer is that Google treats the www, non-www, http, and https variants as different URLs. If your site opens under the four different versions below, each URL staying as typed in the browser while serving the same content, you might get penalized for duplicate content.
• http://domain.com
• http://www.domain.com
• https://www.domain.com
• https://domain.com

The fix is to set up a permanent 301 redirect from www to non-www, or from non-www to www. Also, if you have recently moved your site to the secured https version, consider a 301 permanent redirect from http to https.
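On an Apache server, such redirects are often done in .htaccess. Here is a sketch, assuming mod_rewrite is enabled and using example.com as a placeholder domain (this version redirects everything to the https, non-www variant):

```apache
# Illustrative .htaccess rules — replace example.com with your own domain
RewriteEngine On

# Redirect http to https (R=301 makes it a permanent redirect)
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]

# Redirect www to non-www
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]
```

If you prefer the www version as canonical, swap the host names in the rules accordingly.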

5. External Duplicate content

Many new bloggers, in order to post appealing content and quickly make their sites content-rich, simply copy articles from other websites on the internet and paste them on their blogs. This is against SEO guidelines, and if you are doing this, expect to be penalized sooner or later.

The solution is to not copy articles from anywhere; write in your own words instead. If you have already copied articles, either delete those posts from your blog or edit them and add fresh new content.

After editing your old posts, consider submitting a reconsideration request.

6. Discourage Indexing Settings is On

If your blog is on WordPress, go to Settings –> Reading and see whether you have ticked the checkbox Discourage search engines from indexing this site.

Here, the solution is straightforward: just untick this box so that your blog gets included in Google's index.
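When that box is ticked, WordPress adds a noindex robots meta tag to every page. The exact output varies by WordPress version, but it looks roughly like this:

```html
<!-- Emitted in the page head while "Discourage search engines" is on -->
<meta name='robots' content='noindex, nofollow' />
```

You can confirm the setting took effect by viewing your page source after unticking the box: the tag should be gone.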

Discourage Search Engines

7. Crawl Errors

If your blog has lots of crawl errors, those pages might get de-indexed. You can check all the crawl errors for your website in Google Search Console (GSC): log in to GSC and open the Coverage report.

Crawl Errors in Google

The solution is to check all the errors and fix them.

8. NoIndex, NoFollow tag

If your blog has a noindex,nofollow robots meta tag, no search engine will index your blog. This meta tag looks like this:

<meta name="robots" content="noindex, nofollow">

This tag clearly orders the biggest giant neither to index that page nor to follow its links.

The solution is to remove this meta tag. Also check that your website uses proper meta tags.
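If you do want a page indexed and its links followed, you can either drop the robots meta tag entirely (index, follow is the default behavior) or state it explicitly:

```html
<!-- Explicitly allow indexing and link following (also the default) -->
<meta name="robots" content="index, follow">
```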

9. Page load time is too high

If your website is coded poorly and takes minutes to load, the spiders and bots crawling your website will time out.

The solution is to code properly and reduce page-load times.

If you are on WordPress, consider installing a cache plugin such as W3 Total Cache.

10. Your web host is down

If your web hosting company's servers suffer frequent downtime, or if they have gone down for a longer duration, then obviously the biggest giant does not want to show a "This site cannot be reached" status to its users.

The solution is to move your blog to a reliable web host such as A2 Hosting.

11. Your website has no value

If your website lacks proper content and is filled with grammatical mistakes and improper English, you should be ready to face a penalty from the major giant.

Suppose you have recently added forums to your blog, and non-English-speaking users are spamming them with junk URLs and bad language; you might get penalized soon.

The solution is to remove all such thin-quality content, either by deleting those pages or by updating them with proper language and grammar. For forums this is difficult, as it requires a lot of manual work. I would suggest that you manually approve registered users and remove the ones who post thin content.

12. Manual Penalty

Log in to GSC and head over to the Manual Actions section to see whether someone from the official search team has penalized you.

Manual action penalty Google

The solution is to work through the suggestions listed in the Manual Actions section and fix them. After fixing the issues, file a reconsideration request and get your site crawled again so that it can be added back to Google's index.

13. Ajax and JavaScript issues

Gone are the days when people could ignore JavaScript validation. Now, failing to adhere to JavaScript standards and leaving JavaScript errors open in the browser can lead to indexing issues. If your site is not optimized for Ajax and JavaScript, consider optimizing it, or Googlebot may not follow your site.

14. Not mobile-friendly

The number of mobile users is increasing day by day, so Google has started paying close attention to the mobile user experience. Your site should be optimized for mobile devices, or it might land in trouble.
