Google: Blocked URL Count Updates Slowly
To make a long story short, they used robots.txt to block hundreds of thousands of pages. Then they eventually removed the pages altogether and deleted the lines in robots.txt that blocked them. But Google still shows those URLs as blocked in the Index Status report.
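As a minimal sketch of the situation described above, the snippet below uses Python's standard urllib.robotparser to confirm that a URL which was disallowed under the old robots.txt is no longer blocked once the Disallow line is removed. The rules, the /archive/ path, and the example.com URL are all hypothetical placeholders, not details from the actual site; the point is only that the robots.txt change itself takes effect immediately, even though the report lags.

```python
from urllib import robotparser

# Hypothetical robots.txt as it looked while the pages were blocked.
old_rules = [
    "User-agent: *",
    "Disallow: /archive/",  # section that held the now-deleted pages
]

# Hypothetical robots.txt after the cleanup: the Disallow line is gone.
new_rules = [
    "User-agent: *",
    "Disallow:",
]

def is_blocked(rules, url, agent="Googlebot"):
    """Return True if the given robots.txt rules block the URL for the agent."""
    parser = robotparser.RobotFileParser()
    parser.parse(rules)
    return not parser.can_fetch(agent, url)

url = "https://example.com/archive/old-page.html"  # placeholder URL
print(is_blocked(old_rules, url))  # True  - blocked under the old rules
print(is_blocked(new_rules, url))  # False - no longer blocked
```

A check like this only tells you what the live robots.txt allows; as Mueller explains below, the blocked-URL count in the Index Status report updates on Google's own recrawl schedule.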
Google's John Mueller explained that it may take quite a while for Google to recrawl those URLs and see that the pages are no longer there.
He said:
It's likely going to take quite some time for those URLs to either drop out of the index or be re-crawled again, so I would not expect to see that number drop significantly in the near future (and that's not a bad thing, it's just technically how it works out).

We also know that the Index Status report lags by about a week or so.