Referring domains: 25 · Organic traffic: 8,314 (data from the Content Explorer tool).
But here’s the thing: it takes time for search engines to discover (and index) new websites.
This means that if you search for your two-day-old website in Google, chances are you’ll be disappointed. In fact, 9 times out of 10, you’ll find nothing.
However, even a few weeks after going live, some people still won’t see their website in Google/Bing/Yahoo (which account for 98% of all US searches).
If you’re reading this article (which you clearly are), you’re probably one of these people. Now, you want to know the fastest way to submit your website to the search engines.
If so, you’ve come to the right place; here’s what we’ll be discussing in this article:
- How to submit your website to Google, Bing, and Yahoo (and why you don’t always have to);
- How to make sure it’s actually indexed (and what to do if it isn’t);
- Why submitting your website to Google ≠ rankings (and how to fix this);
- Why you still need to submit your website to search engines.
Let’s start with how to submit your website.
How to submit your website to Google, Bing, and Yahoo
It’s very easy to submit websites to Google/Bing/Yahoo.
Just watch our quick video tutorial or follow the written steps below.
The quickest option is each search engine's URL submission page: all you have to do is enter a URL and click submit.
Yahoo is powered by Bing’s index, so there’s no need to submit your website(s) to Yahoo; submission to (and indexation in) Bing automatically covers Yahoo. This means that Google and Bing are your only priorities, unless you’re targeting China, in which case Baidu (76%+ market share) takes precedence. We won’t be focusing on Baidu in this article, but here’s a good guide to Baidu submission.
However, this only allows you to submit one URL at a time, which can be quite a hassle if you want to submit every page on a large website.
We therefore recommend submitting via Google/Bing Webmaster Tools.
How to Submit Your Website Via Google/Bing Webmaster Tools (Recommended)
Google/Bing Webmaster Tools allows you to submit multiple pages at once by submitting your sitemap(s). Sitemaps also have the added benefit of supplying a few other useful pieces of information about your web pages to the search engines, including:
- Change Frequency — gives search engines an indication as to how frequently a page/post’s content is likely to change;
- Last Modified Date/Time — tells search engines when the page/post was last modified;
- Priority — tells search engines how important you deem a particular page/post to be compared to other pages on the same domain.
An example of a sitemap from websitesetup.org.
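To make those fields concrete, here’s a minimal sitemap with a single (hypothetical) URL entry showing all three, following the sitemaps.org protocol:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page's full URL (replace with your own) -->
    <loc>https://www.yourdomain.com/blog/example-post/</loc>
    <!-- Last Modified Date/Time -->
    <lastmod>2017-05-01</lastmod>
    <!-- Change Frequency -->
    <changefreq>monthly</changefreq>
    <!-- Priority, relative to other pages on the same domain (0.0–1.0) -->
    <priority>0.8</priority>
  </url>
</urlset>
```

A real sitemap simply repeats the `<url>` block once per page; your plugin or generator will produce this for you.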
I won’t go into the intricacies of creating a sitemap in this post. However, if you’re using WordPress (or another CMS), there are many plugins/tools for generating sitemaps automatically. We recommend Yoast SEO. For static websites, this sitemap generator works well. If you prefer a more manual/customized approach, follow this guide by Screaming Frog.
Let’s start with Google.
First, sign up for Google Search Console and verify ownership of your website. Once verified, click on the property name and go to Crawl > Sitemaps (on the side menu), then click the Add/test sitemap button and enter your sitemap URL (hint: this is usually something like www.yourdomain.com/sitemap.xml).
Some plugins will automatically create multiple sitemaps for the various sections of your website (e.g. post_sitemap.xml and page_sitemap.xml). This isn’t a problem; just add them all within Search Console.
That’s it — your website is now submitted.
It’s a similar process for Bing.
Sign up for Bing Webmaster Tools, click “add your site”, then fill in the form. This will ask for your URL (i.e. domain), sitemap URL, and a few other pieces of information.
Click “submit” and you’re done.
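Both engines also let you notify them of a new or updated sitemap by fetching a simple “ping” URL. Here’s a rough sketch of how those ping URLs are built (the endpoint paths below are the ones Google and Bing have documented, but double-check they’re still current before relying on them; the sitemap URL is a placeholder):

```python
from urllib.parse import quote

# Hypothetical sitemap URL -- replace with your own.
SITEMAP_URL = "https://www.yourdomain.com/sitemap.xml"

def ping_urls(sitemap_url):
    """Build the sitemap 'ping' URLs for Google and Bing.

    Fetching each returned URL (e.g. with urllib.request.urlopen)
    asks that engine to re-read your sitemap.
    """
    encoded = quote(sitemap_url, safe="")  # URL-encode the sitemap address
    return {
        "google": "https://www.google.com/ping?sitemap=" + encoded,
        "bing": "https://www.bing.com/ping?sitemap=" + encoded,
    }

for engine, url in ping_urls(SITEMAP_URL).items():
    print(engine, url)
```

This is handy for automating resubmission after you publish new pages, but it’s a supplement to (not a replacement for) adding the sitemap in Webmaster Tools.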
Remember, submission to Google/Bing doesn’t guarantee that your website will be indexed, so it’s important to verify this.
How to check if your website is indexed (and what to do if it isn’t)
Using a site: search in Google/Bing is the quickest way to check for indexation.
Here’s the syntax: site:http://www.yourdomain.com/page-to-check/
Using the “site:” operator to check whether or not my list of link building strategies has been indexed in Google. Learn more about Google’s advanced search operators here.
If you want to check for indexed pages across a whole site, just use the site: operator with your domain (not an individual page).
No results? Your website/web page probably hasn’t been indexed.
Don’t worry if this remains the case for the first few days after submitting your website/web page. If it still hasn’t been indexed after a week or two, there may be an issue.
If you submitted via Google/Bing Webmaster Tools, you can also see a rough indication as to how many pages on your website are now indexed within Search Console.
Screenshot taken from Google Search Console. This report can be found under Crawl > Sitemaps.
If you notice a discrepancy between the number of web pages submitted and the number indexed (or if your website/web page isn’t showing up in a site: search), check the affected web pages for these common errors:
- The web page has a noindex tag — this tells Google, and other search engines, not to index a page. Check the page’s HTML for either <meta name="googlebot" content="noindex"> or <meta name="robots" content="noindex">, and check the HTTP response headers for x-robots-tag: noindex; remove the directive (if required) to allow indexing;
- Crawling blocked by robots.txt file — most websites have a robots.txt file; this gives robots/crawlers/spiders a set of rules to follow, such as where they can and can’t crawl. You can check whether a URL is blocked by robots.txt with Google’s Robots Testing Tool. Simply enter a URL and it’ll return allowed or blocked.
- Indexing blocked by .htaccess file — .htaccess is a configuration file for websites running on Apache-based web servers (roughly 50% of websites). If your .htaccess file contains the following line of code: Header set X-Robots-Tag "noindex, nofollow", it should be removed in order to allow Googlebot to index your website.
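The first and third checks can be scripted. Here’s a small sketch that scans a page’s HTML and HTTP response headers for the noindex signals described above (the sample HTML and headers are made up for illustration; in practice you’d fetch them with a library like requests):

```python
import re

def noindex_signals(html, headers):
    """Return a list of reasons a page may be excluded from the index.

    `html` is the page source; `headers` is a dict of HTTP response headers.
    """
    reasons = []
    # Look for <meta name="robots"/"googlebot" ... content="...noindex...">.
    # (A sketch: assumes the name attribute comes before content.)
    meta = re.compile(
        r'<meta[^>]+name=["\'](robots|googlebot)["\'][^>]+content=["\'][^"\']*noindex',
        re.IGNORECASE,
    )
    if meta.search(html):
        reasons.append("noindex meta tag")
    # The same directive can arrive as an HTTP header.
    xrt = {k.lower(): v for k, v in headers.items()}.get("x-robots-tag", "")
    if "noindex" in xrt.lower():
        reasons.append("x-robots-tag: noindex header")
    return reasons

# Example: a page blocked by both mechanisms.
html = '<html><head><meta name="robots" content="noindex"></head></html>'
print(noindex_signals(html, {"X-Robots-Tag": "noindex, nofollow"}))
# → ['noindex meta tag', 'x-robots-tag: noindex header']
```

An empty list means neither noindex signal was found, so you’d move on to the robots.txt and .htaccess checks.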
If the checks above don’t uncover any issues, it may be the case that the page isn’t indexed because it doesn’t provide enough value. It may also point to a bigger technical problem, in which case, it may be worth employing the skills of a technical SEO expert.
Still, even if you don’t encounter any indexation problems, you’ll likely find that your website doesn’t appear on the first page for the search queries you’re targeting.
For example, if you had a page about YouTube SEO, it’s extremely unlikely that you’ll magically rank #1 in Google for the term “YouTube SEO” just because you’ve submitted it to the search engines.
Indexation may be quick and easy, but ranking takes time and effort.
Why submitting your website to Google ≠ ranking in Google (and how to fix it)
Most Google searches return thousands, if not millions, of results.
25M+ results for the term “youtube SEO” in Google.
This number also corresponds (roughly) to the number of indexed pages related to your search.
But most people never click beyond the first page of results (which only shows 10 web pages), meaning there will be little/no traffic for those ranking in positions 11+. Therefore, simply being indexed won’t necessarily bring organic traffic.
If you want organic traffic, you need to rank in the top 10 (or ideally, the top 3).
So, assuming your page is objectively relevant to the search query you want to rank for (and at least slightly better than the current top 10 ranking pages), how do you rank in the top 10?
In short, you need links.
Google counts links from other websites as votes. So, when they’re presented with two (or more) pages on the same topic, the page with the most links (i.e. votes) will usually rank higher than the page(s) with fewer links.
This means that in general, the more links you have, the higher you’ll rank.
You can explore the SERPs for your target keywords, and see how many links the top 10 ranking pages currently have, using Ahrefs Keywords Explorer.
Here’s how to do it:
- Go to Keywords Explorer and enter the keyword(s) you want to rank for;
- Scroll down to the SERP overview to see the top 10 ranking pages.
Screenshot taken from Ahrefs’ Keywords Explorer. It shows the top 10 ranking pages for the keyphrase “YouTube SEO”.
In this example, if you wanted to rank #1, you would need 882+ backlinks — this is because the web page currently ranking #1 already has this number of backlinks.
It’s perfectly possible for one website to outrank another with fewer backlinks, assuming those backlinks are of extremely high quality and are highly relevant to the subject matter of the page itself. Always aim for quality over quantity when building links.
However, links aren’t the only factor of importance; you also need to make sure your on-page SEO is up to scratch and that your keyword targeting is on-point (there’s no point pursuing keywords if they’re too competitive).
Here are a few articles to help you with these steps:
- The Noob Friendly Guide To Link Building;
- Deconstructing Linkbait: How to Create Content That Attracts Backlinks;
- Broken Link Building: How to Build Quality Backlinks by Fixing the Web
- On Page SEO: A (2M Keyword) Data Driven Analysis;
- On-Page SEO in 2016: The 8 Principles for Success (Whiteboard Friday);
- On-Page Ranking Factors
- How To Do Keyword Research in 2017;
- How To Gauge Keyword Difficulty And Find The Easiest Keywords To Rank For
- 4 Ways to Find Untapped Keyword Ideas With Great Traffic Potential
- Long Tail Keywords: how to get TONS of traffic from ‘unpopular’ search queries
But, given the fact that submitting a website to Google is unlikely to result in overnight rankings or traffic (thanks to the need for links and many other factors), why should you even bother submitting to search engines?
Why it’s still good practice to submit your website to search engines (even though it isn’t always necessary)
Google wasn’t built to rely on manual submissions (in fact, many people don’t even realize manual submission is a thing).
That’s why they rely heavily on a number of other data sources to discover new websites/web pages (and keep their index fresh), such as their own crawling operations.
Basically, crawling is when Google looks for new links being added to websites then follows them to see what they lead to. If they lead to something useful, they’ll add those pages to their index (hint: this is another reason why links are so important!)
For example, when someone tweets a link to a web page, Google may see that link and decide to add that web page to their index.
Matt Cutts explains more about crawling and how it works in this video.
It’s also theorized that Google uses many other data sources, such as Chrome browser usage statistics, Google Analytics data, domain registration data, and more.
With all this data, search engines are pretty good at discovering new websites on their own.
However, there are still many reasons why you should directly submit your website to search engines, such as:
- It’s better to be safe than sorry — Google will probably be able to find your website/web page(s) without the need for manual submission, but is “probably” good enough? My thinking is that it’s better to be safe than sorry, especially when submitting your website only takes a minute or two;
- Google/Bing can’t figure out everything via crawling — If you submit your website via Google/Bing Webmaster Tools, you’ll have the opportunity to supply Google/Bing with a few useful pieces of information about your website. This is information they cannot possibly obtain via crawling alone;
- It helps you to improve your website — Google/Bing will offer some insight into how they view your website via Webmaster Tools. There are also various tools for testing your web pages and they’ll even alert you if potential problems/errors occur on your website.
In fact, even if your website is already indexed, there are still times it may make sense to submit/resubmit manually including when:
- You change/update a page and want search engines to notice — Search engines can’t possibly re-crawl the entire web in a day, so chances are they won’t usually re-crawl your updated page right away. You can use the Fetch as Google tool within Search Console to ask Google to prioritize the re-crawl;
- You fix errors on your website — Sometimes Google will alert you to crawl errors or 404s within Search Console. It makes sense to ask Google to recrawl these URLs after you fix the errors.
It’s highly likely that Google/Bing will discover and index your website, regardless of whether or not you choose to submit. However, we still recommend submitting manually via Google/Bing Webmaster Tools as it’s simply not worth leaving it up to chance.
It’s also important to remember that indexation is only part of the battle; even if your website is indexed, you probably won’t rank unless you have a high-quality, topically relevant page. You also need links.
Links are one of Google’s top 3 ranking factors. If you have links from powerful, relevant websites, Google will not only know your web page exists but will also be much more likely to rank it.