Knowing how to submit websites and individual pages to search engines is an essential skill for SEO professionals and webmasters alike. Whether you’re building a new website or simply adding new content, knowing the ins and outs of indexation is key.

What You’ll Need Before Submitting

First, you’ll need access to edit your website. Some people may object and claim backend web access is not necessary to submit a website to search engines. Well, they’re right. However, there are some cases where you’ll need access to a website’s backend.

Situations Where You’ll Need Backend Access

- The website doesn’t have a sitemap.
- The website doesn’t have a robots.txt file.
- The website doesn’t have Tag Manager or a way to verify Google Search Console/Bing Webmaster Tools access.

If your client or IT team doesn’t allow you to have access to their backend, or your CMS has certain limitations, see if you’re able to obtain FTP access. This … [Read more...] about How to Submit Websites & Pages to Search Engines: A Simple Guide
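To illustrate the first two backend items above, here is a minimal sketch of a robots.txt file that references a sitemap. The domain and sitemap path are placeholders, not values from the article:

```
# Hypothetical robots.txt for example.com -- URL and path are placeholders
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

The Sitemap directive lets crawlers discover the sitemap without any submission step, which is one reason backend access matters when these files are missing.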
Google's JohnMu warned one webmaster over the New Year's break in a Google Webmaster Help thread to never, ever list URLs with session IDs in the Sitemap XML file.

John said:

"If you are not submitting clean URLs in your Sitemap file, you'd be better off not using a Sitemap file. With session-IDs in there, it'll cause more problems (with us crawling and indexing those URLs) than if you just let us crawl your website normally (especially if you really have a clean URL structure). So my advice would be to either delete the Sitemap file, or make sure that the submitted URLs are really exactly the same, clean ones that we find while crawling."

To most of us, this is obvious. But sometimes the obvious needs to be said. Sending Google duplicate URLs for the same landing page is asking for trouble. Why hand Google duplicate content on a silver platter? That is what you are doing when you list these URLs in a Sitemap file. If you have duplicate content on your site, and you don't block it … [Read more...] about Never List URLs With Session IDs In A Google Sitemap
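To make John's point concrete, here is a sketch of a Sitemap entry using the clean, canonical URL, with the session-ID variant he warns against shown only as a comment. The example.com URLs are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Good: the clean, canonical URL Google would also find while crawling -->
  <url>
    <loc>https://www.example.com/products/widget</loc>
  </url>
  <!-- Bad: the same page with a session ID appended; never submit this:
       https://www.example.com/products/widget?sessionid=abc123 -->
</urlset>
```

Every session-ID variant submitted is another duplicate of the same landing page handed to Google's indexer.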
Back in November 2009, Google News announced they were “in the midst of an exciting transition period” that included a change to the News Sitemap Protocol. News publishers have through April 2010 to modify their News Sitemaps to accommodate the new format. What’s so exciting and transitional? I asked Google, thinking that they were changing the protocol to prepare for some exciting new things in Google News. I was a bit disappointed in the answer, then, when they told me the exciting transition was simply the change to the protocol itself. The changes do make things a bit easier for News publishers, though, in a couple of ways:

- You can now reference your News Sitemap in your robots.txt file or ping Google with its location, rather than submitting via Google Webmaster Tools (I would still recommend submitting via Webmaster Tools the first time for the benefit of the parsing error information).
- You can now combine articles of multiple types into one News Sitemap. … [Read more...] about The Latest On Google News Sitemaps
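The first change above can be sketched as a one-line robots.txt addition. The domain and sitemap path are placeholders, and the ping endpoint shown in the comment is Google's generic sitemap ping URL, not something specified in this article:

```
# Hypothetical robots.txt entry pointing crawlers at a News Sitemap
Sitemap: https://www.example.com/news-sitemap.xml

# Alternatively, ping Google with the sitemap's location, e.g. by fetching:
# http://www.google.com/ping?sitemap=https://www.example.com/news-sitemap.xml
```

Either route removes the need to resubmit through Webmaster Tools each time the file changes.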
Since Google launched XML Sitemaps back in 2005, they’ve added specialized formats to enable site owners to submit content other than web pages. Until now, site owners have had to create separate Sitemaps for each content type. That’s now changed: you can now create a single XML Sitemap that contains any combination of these content types. The Google Webmaster Central blog post doesn’t mention News Sitemaps, so presumably news content can’t be mixed with the other types.

Great news for site owners? Possibly. It may be easier to create and maintain a single Sitemap in some cases, but the lowest-overhead way to create and maintain Sitemaps generally is via a script that creates and updates the file automatically. And it might be easier to keep track of things separately.

Indexing metrics

Certainly, from a metrics perspective, it may make sense to keep content types separated. When you submit an XML Sitemap to Google Webmaster Tools, you can see a report of the … [Read more...] about Is the Sitemaps Alliance Over?
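A combined Sitemap of the kind described above might look like the following abridged sketch, which mixes an image entry and a video entry in one file by declaring both extension namespaces. The URLs are placeholders, and some child elements the extensions normally require are omitted for brevity:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/page-with-image</loc>
    <image:image>
      <image:loc>https://www.example.com/photo.jpg</image:loc>
    </image:image>
  </url>
  <url>
    <loc>https://www.example.com/page-with-video</loc>
    <video:video>
      <video:title>Example clip</video:title>
      <video:content_loc>https://www.example.com/clip.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```

One file is simpler to generate, but as noted above, per-type files keep the indexing reports in Webmaster Tools separated.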
This new method for submitting URLs to Google is limited, so you should use it when it’s important that certain pages be crawled right away. Although Google doesn’t guarantee that they’ll index every page that they crawl, this new feature does seem to at least escalate that evaluation process. To better understand how this feature works, let’s take a look at how Google crawls the web and the various ways URLs are fed into Google’s crawling and indexing system.

How Google Crawls & Indexes the Web

First, it’s important to know a bit about Google’s crawling and indexing pipeline. Google learns about URLs through all of the ways described below and then adds those URLs to its crawl scheduling system. It dedupes the list, rearranges the URLs in priority order, and crawls in that order. The priority is based on all kinds of factors, including the overall value of the page, based in part on PageRank, as well as how often the content … [Read more...] about Fetch, Googlebot! Google’s New Way To Submit URLs & Updated Pages
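The dedupe-then-prioritize step described above can be sketched in a few lines of Python. The priority scores here are invented stand-ins for signals like PageRank and update frequency, not anything Google has published:

```python
def build_crawl_queue(discovered_urls, priority_scores):
    """Deduplicate discovered URLs, then order them by priority (highest first).

    priority_scores is a hypothetical mapping standing in for signals such as
    PageRank and content-change frequency; unknown URLs get a default of 0.
    """
    unique_urls = set(discovered_urls)  # dedupe: each URL queued at most once
    return sorted(unique_urls,
                  key=lambda url: priority_scores.get(url, 0.0),
                  reverse=True)

# URLs arrive from many sources (links, Sitemaps, URL submission),
# so duplicates are common in the discovered list.
discovered = [
    "https://example.com/",
    "https://example.com/new-post",
    "https://example.com/",          # duplicate from a second discovery path
    "https://example.com/old-page",
]
scores = {
    "https://example.com/": 0.9,
    "https://example.com/new-post": 0.7,
    "https://example.com/old-page": 0.2,
}
queue = build_crawl_queue(discovered, scores)
print(queue)
```

The sketch mirrors the pipeline in the text: duplicates collapse first, then the remaining URLs are crawled in descending priority order.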