In the battle between search engines and some mainstream news publishers, ACAP has been lurking for several years. ACAP — the Automated Content Access Protocol — has constantly been positioned by some news executives as a cornerstone for reestablishing the control they feel they have lost over their content. The reality, however, is that publishers already have more control, even without ACAP, than is commonly believed. In addition, ACAP currently provides no “DRM” or licensing mechanisms over news content. But the system does offer some ideas well worth considering. Below, a look at how it measures up against the current systems for controlling search engines. ACAP started development in 2006 and formally launched a year later with version 1.0 (see ACAP Launches, Robots.txt 2.0 For Blocking Search Engines?). This year, in October, ACAP 1.1 was released and has been installed by over 1,250 publishers worldwide, says the organization, which is backed by the European … [Read more...] about ACAP Versus Robots.txt For Controlling Search Engines
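To make that existing control concrete: a publisher can already turn a specific crawler away with plain robots.txt. A minimal sketch (the path here is hypothetical) that blocks Googlebot from an archive section while leaving every other crawler unrestricted:

    User-agent: Googlebot
    Disallow: /archive/

    User-agent: *
    Disallow:

An empty Disallow value means “nothing is blocked,” so in this sketch only Googlebot is shut out.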
After a year of discussions, ACAP — the Automated Content Access Protocol — was released today as a sort of robots.txt 2.0 system for telling search engines what they can or can’t include in their listings. However, none of the major search engines support ACAP, and its future remains firmly one of "watch and see." Below, more about the how and why of ACAP. Let’s start with some history. ACAP got going in September 2006, backed by major European newspaper and publishing groups that in particular felt Google was using content without proper permission and wanted a more flexible means of granting it than allowed by the long-standing robots.txt and meta robots standards. These two standards are documented at robotstxt.org, and ACAP often refers to them as the "Robots Exclusion Protocol" or REP, though within the SEO world, they’re generally known by their actual names. Robots.txt was born in 1994 as a way to block content on a server-wide basis; meta robots … [Read more...] about ACAP Launches, Robots.txt 2.0 For Blocking Search Engines?
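Where robots.txt works server-wide, the meta robots tag mentioned above operates at the page level. A standard example, placed in a page's <head>, that keeps that single page out of the index and stops links on it from being followed:

    <meta name="robots" content="noindex, nofollow">

ACAP's complaint, as described above, was not that these controls fail, but that two blunt on/off switches were not expressive enough for the licensing terms publishers wanted to convey.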
The Search Console (or Google Webmaster Tools as it used to be known) is a completely free and indispensably useful service offered by Google to all webmasters. Although you certainly don’t have to be signed up to Search Console in order to be crawled and indexed by Google, it can definitely help with optimising your site and its content for search. Search Console is where you can monitor your site’s performance, identify issues, submit content for crawling, remove content you don’t want indexed, view the search queries that brought visitors to your site, monitor backlinks… there’s lots of good stuff here. Perhaps most importantly, though, Search Console is where Google will communicate with you should anything go wrong (crawl errors, manual penalties, an increase in 404 pages, malware detected, etc.). If you don’t have a Search Console account, then you should get one now. You may find that you won’t actually need some of the other fancier, … [Read more...] about Google Search Console: a complete overview
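As a small illustration of the “submit content for crawling” feature: Search Console accepts XML sitemaps in the sitemaps.org format. A minimal sketch, with a placeholder URL and date:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2016-05-01</lastmod>
      </url>
    </urlset>

Once a file like this is live on your server, you point Search Console at its URL and Google queues the listed pages for crawling.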
Please note: there is now a more up-to-date version of this guide – Google Search Console: a Complete Overview – published in May 2016. Google Webmaster Tools (GWT) is the primary mechanism for Google to communicate with webmasters. It helps you to identify issues with your site and can even let you know if it has been infected with malware (not something you ever want to see, but if you haven’t spotted it yourself, or had one of your users tweet at you to let you know, it’s invaluable). And the best part? It’s absolutely free. If you don’t have a GWT account, then you need to go get one now. This guide to Google Webmaster Tools will walk you through the various features of this tool, and give you insight into what actionable data can be found within. (For more in-depth help, go to Google’s Webmaster Help.) Verification: Before you can access any data on your site, you have to prove that you’re an authorized … [Read more...] about Google Webmaster Tools: An Overview
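One common way to prove you are authorized is the meta-tag method: Google issues you a verification token, and you add it to the <head> of your home page. A sketch, with the token value as a placeholder:

    <meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />

Alternatives include uploading a Google-provided HTML file or adding a DNS record, so you can pick whichever part of your site you find easiest to edit.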
How good is the quality of your website, really? Here is a detailed list of questions every website owner should ask themselves. If the answer to every question below is yes, give yourself a big ole pat on the back. Most websites have flaws for a number of reasons, mostly related to limited resources. This 50-question questionnaire might trigger something here and there for some of you, or bring a forgotten item from the long to-do list back to mind. Is content structurally separate from navigational elements? Is the website optimized for mobile? How compliant is the website with W3C coding standards? Valid HTML/CSS? Are ‘alt’ tags in place on all significant images? Are text-based alternatives in place to convey essential information if this is featured within images or multimedia files? Navigation: Are links labeled with anchor text that provides a clear indication of where they lead, without overusing exact-match anchor text? Depth … [Read more...] about 50 Questions to Evaluate the Quality of Your Website
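To make two of those checklist items concrete, here is a hypothetical snippet (file names and URL are made up) showing a significant image with a descriptive ‘alt’ attribute, and a link whose anchor text says where it leads without exact-match stuffing:

    <img src="/img/q1-revenue.png" alt="Bar chart of revenue by quarter, 2015 to 2016">
    <a href="/reports/2016-q1/">Read the full Q1 report</a>

Compare that with an empty alt attribute on a meaningful image, or anchor text like “click here” — exactly the kind of thing the questionnaire is designed to catch.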