Re-indexing was ordered, but the robot did not find all the pages on the site

Last update: 30.08.2022

Firstly, it is worth noting that the robot only indexes pages at nesting levels (NL) 1, 2 and 3. If a page is more than two clicks away from the home page, it will not be indexed.
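The depth rule above can be illustrated with a minimal breadth-first sketch (the link graph, URLs and function name here are purely illustrative, not the indexer's actual code): the home page is level 1, each click away adds one level, and anything past level 3 is unreachable for the robot.

```python
from collections import deque

def nesting_levels(links, home="/"):
    """Breadth-first walk over an internal-link graph.

    `links` maps each page URL to the internal links found on it.
    The home page is nesting level 1; pages one click away are
    level 2, two clicks away are level 3, and so on.
    """
    levels = {home: 1}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in levels:
                levels[target] = levels[page] + 1
                queue.append(target)
    return levels

# Toy site: /deep is three clicks from the home page (level 4),
# so a robot limited to levels 1-3 would never reach it.
site = {
    "/": ["/about", "/blog"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": ["/deep"],
}
levels = nesting_levels(site)
```

Pages missing from the result, or mapped to a level above 3, are the ones to link closer to the home page (or list in a sitemap).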

Possible reasons why 2nd- and 3rd-level pages are NOT indexed are as follows.

  1. The system code is installed incorrectly. Use the Check section to verify that the code is installed correctly on the website.

  2. The BODY and/or HTML tags are opened and/or closed more than once.

  3. The acceptable waiting time for a response from your server was exceeded. There may be a problem with your hosting.

  4. Incorrect site structure, e.g. too many internal links on one page. The indexing robot limits the number of internal links it follows deeper into the site: 500 for the home page and 150 for pages at the 2nd nesting level.

  5. Pages are blocked in robots.txt.

  6. Pages are blocked from indexing by meta tags.

  7. The system code is located in a non-indexable part of the document (inside <noindex> or <script> blocks, or in comments). Make sure the pages are indexed by Google, otherwise the site will not pass moderation.

  8. The site runs on a specific CMS that the robot cannot index correctly.
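Reasons 5 and 6 are easy to check yourself. A short sketch using only the Python standard library (the sample rules, HTML and class name are illustrative assumptions, not the robot's real checks):

```python
import urllib.robotparser
from html.parser import HTMLParser

# Reason 5: parse robots.txt rules and test whether a given URL is allowed.
rp = urllib.robotparser.RobotFileParser()
rp.parse("User-agent: *\nDisallow: /blog/\n".splitlines())
blocked_by_robots = not rp.can_fetch("*", "https://example.com/blog/post-1")

# Reason 6: scan the page for <meta name="robots" content="noindex ...">.
class MetaRobotsCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

checker = MetaRobotsCheck()
checker.feed('<html><head><meta name="robots" content="noindex,follow">'
             '</head><body>...</body></html>')
```

If either `blocked_by_robots` or `checker.noindex` comes back true for a missing page, that rule or tag is the likely cause.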


  • Try creating a sitemap and linking to it from the home page.

  • Practice shows that the most common situations are #2, #3 and #6 (many webmasters accidentally forget to close a <noindex> tag somewhere).

  • Situation #7 is very rare.

  • If you have checked everything, found no errors, and still do not understand why the robot does not index pages on your site, give an example of a page whose code the robot does not add to the system.
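For the sitemap suggestion above, a minimal sitemap.xml can be assembled from a flat list of page URLs, so every page is discoverable regardless of nesting level (the URLs below are placeholders for your own):

```python
# Pages to list in the sitemap; replace with your site's real URLs.
pages = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/blog/post-1",
]

# Build a minimal sitemap.xml per the sitemaps.org 0.9 schema.
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + "".join(f"  <url><loc>{url}</loc></url>\n" for url in pages)
    + "</urlset>\n"
)
```

Save the result as sitemap.xml at the site root and add a link to it from the home page.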
