Ability to download all crawl error sources. Previously, you could download a CSV file that listed URLs that returned an error along with the pages that linked to those URLs. You could then sort that CSV by linking source to find broken links within your site, and you had an easy list of sites to contact to fix links to important pages of your site. Now, the only way to access this information is to click an individual URL to view its details, then click the Linked From tab. There seems to be no way to download this data, even at the individual URL level. (Update 3/17/12: This detail is still available from the API-based crawl errors feed.)

100K URLs of each type. Previously, you could download up to 100,000 URLs with each type of error. Now, both the display and the download are limited to 1,000. Google says “less is more” and “there was no realistic way to view all 100,000 errors—no way to sort, search, or mark your progress.” Google is wrong. There were … [Read more...] about Google Webmaster Tools Revamps Crawl Errors, But Is It For The Better?
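The sort-by-linking-source workflow described above is easy to reproduce once you have any crawl-errors export in hand. A minimal sketch in Python; the column names `URL` and `Linked From` are assumptions for illustration, not Google's actual export schema:

```python
import csv
from collections import defaultdict
from io import StringIO

# Hypothetical crawl-errors export; real column names may differ.
sample = """URL,Linked From
/old-page,/blog/post-1
/missing.html,/blog/post-1
/old-page,/about
"""

# Group broken URLs by the page that links to them, so each
# linking source yields a fix-list of outbound broken links.
by_source = defaultdict(list)
for row in csv.DictReader(StringIO(sample)):
    by_source[row["Linked From"]].append(row["URL"])

for source in sorted(by_source):
    print(source, "->", ", ".join(by_source[source]))
```

Each output line is one page to contact or fix, with every broken link it carries listed beside it.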
Server response codes may look like errors at first glance; they’re most noticeable when what the user wants to happen, doesn’t. On closer inspection, these informational codes exist for every properly functioning online interaction. Server response codes, also called status codes, are feedback that your website is built correctly and your web server is functioning as intended. … [Read more...] about Beginner’s guide to server response codes
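A quick way to see status codes as ordinary feedback rather than errors is to request two URLs, one that exists and one that doesn't, and compare what the server says. A self-contained sketch using only Python's standard library, with a throwaway local server standing in for a real site:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen
from urllib.error import HTTPError

class Handler(BaseHTTPRequestHandler):
    # Serve 200 for the home page, a proper 404 for anything else.
    def do_GET(self):
        if self.path == "/":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")
        else:
            self.send_error(404)

    def log_message(self, *args):  # silence per-request logging
        pass

def fetch_status(url):
    """Return the HTTP status code, even for error responses.

    urllib raises HTTPError for 4xx/5xx answers, but the error
    object still carries the status code the server sent.
    """
    try:
        return urlopen(url).status
    except HTTPError as err:
        return err.code

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

ok_status = fetch_status(base + "/")            # 200: page exists
missing_status = fetch_status(base + "/nope")   # 404: page not found
server.shutdown()
```

Both responses are the server working exactly as intended; only one of them is what the visitor hoped for.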
Someone at Microsoft has a violent allergy to standards of any kind. The best example ever is the way .NET-based websites handle broken links. Instead of delivering a nice, normal ‘404’ response code that would tell a browser or search bot that the link’s busted, .NET returns a ‘200’ or ‘302’ code, depending on just how deranged the developer was at the time. … [Read more...] about 5 Whopping Lies That Keep SEO At Status Quo
To work around this ASP.NET deficiency, use ISAPI_Rewrite to create rewrite rules that direct traffic for the deleted pages to a custom 404 page called 404.aspx. This won’t work if somebody types in a bad URL, but as a practical matter we’re most interested in trapping errors caused by inbound links to deleted pages. By using wildcards, it’s easy enough to rewrite all your deleted URLs to the custom 404 page. When trapping errors, we silently rewrite the URL to avoid returning a redirect. Here’s sample rewrite code for the httpd.ini configuration file: … [Read more...] about URL Rewriting & Custom Error Pages In ASP.NET 2.0
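The teaser cuts off before the actual sample, so the following is only a sketch of what such rules look like in ISAPI_Rewrite's httpd.ini format; the paths are made-up examples, not the original post's code:

```ini
[ISAPI_Rewrite]

# Hypothetical example: silently rewrite requests for a deleted
# section to the custom 404 page. A RewriteRule without the R
# (redirect) flag rewrites internally, so no 302 is returned.
# I = case-insensitive match, L = stop processing further rules.
RewriteRule /old-products/.* /404.aspx [I,L]
```

For this to help search bots, 404.aspx itself must still send a real 404 status (e.g. by setting Response.StatusCode in the page) rather than the default 200.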