To fix this error, first check whether the blocked URLs are important pages. If important pages were accidentally blocked in your robots.txt file, simply update the file and remove the rules blocking those URLs. Then confirm those URLs are no longer blocked using the robots.txt Tester in the old version of Google Search Console. … [Read more...] about How to Fix Index Coverage Errors in Google Search Console
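Before and after editing robots.txt, you can verify whether a given URL is blocked without waiting on the Search Console tool. A minimal sketch using Python's standard-library `urllib.robotparser`; the robots.txt contents and example.com URLs below are hypothetical stand-ins for an accidental block:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt where "Disallow: /blog/" is the accidental block.
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /blog/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls under the wildcard group, so /blog/ pages are blocked.
print(parser.can_fetch("Googlebot", "https://example.com/blog/post-1"))  # False

# After removing the accidental "Disallow: /blog/" rule:
fixed = RobotFileParser()
fixed.parse("User-agent: *\nDisallow: /private/\n".splitlines())
print(fixed.can_fetch("Googlebot", "https://example.com/blog/post-1"))  # True
```

This only checks the parsing of the rules themselves; Google's own tester is still the authority on how Googlebot interprets edge cases.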
I suspect the normal crawl errors work fine for most issues, but the news-specific errors are gone. As I covered at Search Engine Land weeks ago, those specific errors included: Article disproportionately short, Article fragmented, Article too long, Article too short, Date not found, Date too old, Empty article, Extraction failed, No sentences found, Off-site redirect, Page too large, Title not allowed, Title not found, Uncompression failed, and Unsupported content type. These were also removed from the Google help document. … [Read more...] about Google Search Console Crawl Errors For News Publishers Gone
In Google Search Console, some features of the old version are going away as soon as March. But Google brought new features to the URL inspection tool, consolidated some messages, and added a security feature to the tool. Google also ran lots of tests, made some changes around local search, and much more. … [Read more...] about February 2019 Google Webmaster Report
As the BrightEdge graph above conveys, the number of keywords ranked on page one dropped from about 9K to 6K in one week. These ranking shifts resulted in immense losses in Google traffic. Bing wasn’t impacted. … [Read more...] about A Super Fresh Google Index? Server Errors & Rankings Impacts
- Ability to download all crawl error sources. Previously, you could download a CSV file that listed URLs that returned an error along with the pages that linked to those URLs. You could then sort that CSV by linking source to find broken links within your site, and you had an easy list of sites to contact to fix links to important pages of your site. Now, the only way to access this information is to click on an individual URL to view its details, then click the Linked From tab. There seems to be no way to download this data, even at the individual URL level. (Update 3/17/12: This detail is still available from the API-based crawl errors feed.)
- 100K URLs of each type. Previously, you could download up to 100,000 URLs with each type of error. Now, both the display and download are limited to 1,000. Google says “less is more” and “there was no realistic way to view all 100,000 errors—no way to sort, search, or mark your progress.” Google is wrong. There were … [Read more...] about Google Webmaster Tools Revamps Crawl Errors, But Is It For The Better?