Thursday, April 4, 2013

6:15 AM

Google: Blocked URL Count Updates Slowly

Blocking URLs in the robots.txt File

A Google Webmaster Help thread features a webmaster trying to reduce the number of blocked URLs reported in the Google Webmaster Tools Index Status report.

To make a long story short, they used robots.txt to block hundreds of thousands of pages. Eventually they deleted the pages and removed the blocking lines from their robots.txt file. But Google still shows those URLs as blocked in the Index Status report.
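For context, blocking pages at this scale is typically done with Disallow rules in robots.txt. A minimal sketch, assuming the pages lived under a hypothetical path such as /old-section/ (the path is an illustration, not from the thread):

    # Block all crawlers from the section of now-removed pages
    User-agent: *
    Disallow: /old-section/

Deleting that Disallow line lets crawlers back in, but as the thread shows, Google only updates the blocked-URL count after it actually re-crawls the affected URLs.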

Google Blocked URL Count

Google's John Mueller explained that it may take quite a while for Google to re-crawl the pages and see that they are no longer there.

He said:
It's likely going to take quite some time for those URLs to either drop out of the index or be re-crawled again, so I would not expect to see that number significantly drop in the near future (and that's not a bad thing, it's just technically how it works out).
Note also that the Index Status report lags by about a week or so.