If you have your website or blog registered under Google Webmasters, it is always nice to see no crawl errors, as shown here. 'No errors in the last 90 days' is possible only if you keep an eye on Google Webmasters almost every day, or at least once a week. Most people just register with Webmasters and forget about it, or in the worst case, create a website (their own or a free one) and never register it with Google Webmasters at all.
I have seen multiple times that even if you keep a close watch on Webmasters for errors and keep your website clean by fixing all crawl errors and other issues, there is a chance the same crawl errors will reappear after some time. When an error is listed under the current status, it stays there until the issue is fixed under Google Webmasters. The first thing most of us do is try 'Fetch as Google' on the link. Unless the underlying issue is fixed, the marked crawl error will reappear the next time Googlebot crawls that URL.
Also check: How to fetch your website as Google
While doing so, if Google fetches the link without any issue, you can submit it for indexing. If the fetch fails, try again after some time, or manually check whether the link actually exists. If it exists and the page loads without any errors or issues, fetch again, and once it succeeds, go ahead and resubmit the link for indexing. Googlebot may also report an error for a link that no longer exists. Say you had a link that was already indexed by Googlebot, and after indexing you modified that link and resubmitted the new link for indexing. The new link will be indexed, but at the same time the old, already indexed link still has to be removed from Google's cache and index. The old URL that was indexed and has now been modified should be removed from Google as soon as possible.
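The manual check described above can be scripted. Here is a minimal sketch in Python using only the standard library (`check_url` and `needs_cleanup` are hypothetical helper names for illustration, not part of any Google tool):

```python
import urllib.request
import urllib.error

def check_url(url, timeout=10):
    """Return the HTTP status code for a URL, or None if it is unreachable."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code          # 404, 500, etc. still tell us the page's state
    except urllib.error.URLError:
        return None            # DNS failure, connection refused, timeout

def needs_cleanup(status):
    """A page that is dead (4xx/5xx) or unreachable should be removed
    or redirected before resubmitting anything for indexing."""
    return status is None or status >= 400
```

If `needs_cleanup(check_url(url))` comes back False, the page loads fine and you can go ahead with resubmission; otherwise fix, redirect, or remove the URL first.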
Also check: How to remove URL from Search Engine Results
Failing to do so can cause crawl errors that keep being reported under Google Webmasters long after the new link is indexed. It is always better to remove the old link, webpage, or outdated content from both your website and Google. You can also redirect it to a new, updated webpage if you want to.
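Whether the old URL actually answers with a redirect can be verified without following it. Here is a minimal sketch, again in standard-library Python (`redirect_target` is a hypothetical helper name, not a Google API):

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Make urllib report 3xx responses instead of silently following them."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None   # returning None turns the redirect into an HTTPError

def redirect_target(url, timeout=10):
    """Return (status, Location header) for a URL without following redirects."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url, timeout=timeout)
        return resp.status, None                    # no redirect, e.g. a plain 200
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")    # 301/302, or a real error
```

A healthy permanent redirect shows up as a 301 status with the new URL in the Location header; a plain 404 means the old page was removed without a redirect and will keep producing crawl errors.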
You will have to remove or redirect all dead webpages to avoid such crawl errors. You don't have to worry about links containing session IDs and the like. Also, even after you fetch as Google, the errors will still be listed under Google Webmasters for a day or so, so wait at least 12 to 24 hours for Google to update. Once everything is fixed, no issues or errors should remain listed.