The biggest breakdown in website architecture is a navigation structure that prevents visitors from easily finding the information they need and blocks search engines from indexing your content. It doesn't matter how perfectly your site is optimized if your navigation fails to get searchers and search engines to your content. Setting up your internal navigation structure and links correctly ensures proper search engine spidering and helps visitors find the information they need quickly.
Most people think of their website's navigation as little more than the main navigation bar that displays at the top or side of each web page. That's a significant part of it, but in truth, there is a lot to know about how every link on your site should work in order to maintain a structurally sound, search engine friendly, and user-optimized website. But the main navigation is a good place to start…
Build an Efficient Navigational Structure
When it comes to navigation, what is “efficient” for one site may not be efficient for another. Each site is unique and will have unique navigational characteristics. Here are some points to consider:
- Should your navigation be on the top, the side, or a combination of both?
- Should you include only main categories or sub-categories as well?
- Should you use drop down or fly out menus?
- Do you have enough room to fit your key pages in the navigation?
- If not, what should be moved “off” the main navigation?
- What other information needs to be presented in the navigational area?
These are just a few key questions that need to be answered before determining the best navigational setup for your site.
However you choose to lay out your site navigation, there are some "essential" navigational items to consider:
Logo Home Link
You should always include your logo with a home link at the top of every webpage. This helps identify your site and provides continuity from page to page.
Home Link
Besides your logo image home link, you need a "home" link in the navigation bar. Not everyone knows logos link back to the home page, so including this obvious link ensures anyone can find their way there.
About Us Link
Don’t make it too difficult for people to learn more about you. Including a link to the About Us page in the main navigation helps build trust and credibility.
Contact Us Link
Provide a clear and obvious link to your Contact Us page so visitors don't have to dig for it. Hidden contact information is a common frustration for visitors. While it might cut down on the "unnecessary" calls, it also cuts down on sales from visitors who won't do business with someone they can't easily communicate with.
Many businesses don't post their phone number; instead, they redirect visitors to web forms. This can reduce the need for manpower, but customers need to feel secure that if they have a problem, a real person can be reached. Forms and email are too impersonal for some people's liking.
Site Search
Giving visitors a way to search your site via in-site search allows them to bypass the "search by navigation" option and get directly to the information they are looking for. Every click you eliminate between a visitor and the information they want improves your odds of converting them.
Shopping Cart Link
For e-commerce sites, having shopping cart/checkout/basket links helps keep visitors engaged in the shopping process, making it easier to convert them from browsers to customers.
Navigation Design Matters
The links on your navigation bar are only one aspect of building an effective navigational structure. How those elements all function together makes a big difference as well. Recently there has been a move by web designers to cram more and more links into their main navigation, which reduces the number of clicks it takes to get to content. Reducing clicks is almost always a good thing, but sometimes what seems to serve the visitor can end up hurting the overall performance of the site.
There are two types of navigational menus that I'm not a very big fan of, typically because, when incorrectly implemented, they can cause problems for both searchers and search engines.
Sitemap Menus
A sitemap menu is essentially a top-tier navigation menu that has way too many links (i.e., a link to almost every page of the website). This is usually done via drop-down or fly-out menus. The problem with sitemap menus is that by linking to every page from every page, there is virtually no page hierarchy: sub-categories end up on the same level as their parent categories. Visually, we can still see the hierarchy, but the search engines can't. This can pose problems when you want to use that hierarchy for good site architecture.
Flyout Menus
The problem with flyout menus is they are often difficult to use. The most common frustration is with disappearing flyouts. The visitor places their mouse over a category, then moves to the right where the sub-categories appeared. If their mouse drifts too high or too low along the way, the flyout closes. So frustrating! Home Depot, for example, fixed this usability issue by adding a delay before the flyout menu closes.
Minimize Footer Navigation Links
In the early days of web design, footer links were used to duplicate the main navigation in case the visitor had images turned off. Today, very few navigation menus use images, so there is no need for this duplication. Unfortunately, for many site developers, the practice still holds.
Recently, the footer has been used as an alternative to putting a sitemap-style menu in the main navigation: the same giant list of links is simply recreated in the footer. Unfortunately, the problems are the same regardless of where you do it.
The Home Depot doesn’t have EVERY link in their footer, but it sure is a big list that could easily be truncated.
The best use for your footer navigation is to present a few key pages and links without interfering with the visitor's primary shopping experience. It's really a good place for "housekeeping" links. But again, there's no need to display them all; just get visitors to the main pages.
Institute Site-Wide Breadcrumbs
Breadcrumbs are a great navigational tool that helps both visitors and search engines easily navigate your site.
First, breadcrumbs give visitors a visual indicator of where they are on your site and a convenient way to move back up the hierarchy without returning to the main navigation. When you use appropriate keywords in your breadcrumbs, you also give visitors an idea of what a page is about. This allows you to use headings for more than just category titles, since the breadcrumb covers that for you, giving you the freedom to make headings more compelling.
Using those keyword indicators in your breadcrumbs also helps your SEO. These breadcrumbs are keyword-rich on-page links to the page on your site that best represents those keywords. Outside of your main navigation, breadcrumbs give you some of your best internal linking opportunities.
In addition, breadcrumbs help build a stronger hierarchy among your category, sub-category, and product pages. This internal cross-linking helps the search engines understand the relationships between these pages and determine the importance, level, and value of each one.
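A breadcrumb trail is ultimately just a row of keyword-rich internal links. Here's a minimal sketch; the category names and URLs are placeholders:

```html
<!-- Minimal breadcrumb sketch — categories and URLs are placeholders -->
<a href="/">Home</a> &gt;
<a href="/kitchen/">Kitchen</a> &gt;
<a href="/kitchen/cookware/">Cookware</a> &gt;
Stainless Steel Frying Pan
```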
Use Keyword-Rich Link Text
The keyword-in-link strategy needs to be used beyond your navigation and breadcrumb structure. It must also be used throughout your site whenever you link to other content or pages.
Consider a common example of how many people link to other pages and websites: the link text reads "click here" or "read more." Or, in some cases, both! The problem with this type of linking is the words in the link don't provide any indication as to what information the visitor will find. They have to read the content around the link to know for sure.
When the link text itself describes the destination, however, both search engines and shoppers get a clear indication of what they'll find when they click, and that description passes a relevance signal to Google's algorithms. So instead of writing (linked text shown in italics):
*Click here* to learn more about preparing personal tax returns.
You can simply say:
*Learn more about preparing personal tax returns.*
Many people prefer to have a “click here” call to action. That leaves you with two alternatives:
*Click here* to learn more about *preparing personal tax returns*.
*Click here to learn more about preparing personal tax returns.*
Neither of these is a great option, though I would opt for the second over the first. Using two links to the same page does not provide keyword value, as search engines often assume the first link is the best text to use and ignore the rest. The second option is still quite cumbersome, but workable.
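In HTML terms, the whole question is simply which words sit inside the anchor tag. A quick sketch; the URL is a placeholder:

```html
<!-- Weak: the anchor text carries no keywords -->
<a href="https://www.example.com/tax-preparation/">Click here</a> to learn more about preparing personal tax returns.

<!-- Better: the anchor text describes the destination -->
Learn more about <a href="https://www.example.com/tax-preparation/">preparing personal tax returns</a>.
```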
As a general rule, if your content mentions something that is covered in more detail on another page, link to that information using keywords. With that said, don't go overboard and drown your text in a sea of links. Keep a decent balance between content and links pointing to other pages.
Link to Related Topics or Products
One of the best opportunities for keyword links is to link to related topics or products. When linking to similar products, you're doing more than adding link equity to your site. You're giving visitors more of what they want.
For example, one website assumed that my interest in Battlestar Galactica might mean I'm interested in the Firefly and Stargate TV shows. Guess what? They were right! I enjoy both of those shows, and if I didn't already have Firefly on Blu-ray, I might have bought it at the same time.
The same goes for blog content: each related post you link to might be of interest to the reader and, if the post titles are keyword optimized, more keyword weight is passed on for the algorithms to factor in.
What Do You Do With Hundreds of Products?
E-commerce sites have an especially difficult challenge of managing their internal link structure while giving options for filtering products without creating duplicate content and looping pages. I’m not sure I’ll be able to produce the “perfect” solution, but I can provide some tips to help you develop your internal links and eliminate potential problems.
Filters are a great way to drill down to find the product that best meets your need. The problem with filters, from a search engine perspective, is that they can create an endless loop of options. But if you prevent these filters from being spidered by the search engines entirely, you potentially lose good landing pages and end up feeding the search engines category pages with long lists of product links. Somewhere there has to be a balance between duplicate filter pages and fewer products per page.
The place I would start is figuring out which "filters" would be worthy of having their own optimized page. For example, filtering by brand would give you a great page to optimize for brand keywords. However, filtering by color probably doesn't warrant its own optimizable landing page, since few people search by color. They find the item they want, then look for their favorite color. (Though this might depend on your industry. For some, color might actually make a good landing page.)
Once you have an idea of which filters need their own landing pages, build those filters to generate unique URLs with unique content. The rest of the filters can simply be excluded from the search engine indexes, either by using on-the-fly filtering that doesn't change the URL or by sending visitors to URLs that the search engines are prohibited from indexing. (I like the former option better.)
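As a rough sketch of that split (all URL patterns here are made up): give index-worthy filters clean, crawlable URLs, and keep the throwaway filters on parameterized URLs you can block from the engines.

```html
<!-- Brand filter: worth its own optimized landing page, so it gets a clean, unique URL -->
<a href="/cookware/brands/acme/">Acme cookware</a>

<!-- Color filter: not worth indexing — a parameterized URL the engines can be blocked from -->
<a href="/cookware/?filter=color-red">Red cookware</a>
```

A robots.txt line such as `Disallow: /*?filter=` (Google supports the `*` wildcard) would then keep the parameterized versions out of the index.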
Even still, some of your filter landing pages might have too many products. If this is the case, look at sub-level filter landing pages. Otherwise, I suggest creating a page that displays all products, rather than forcing the search engines to spider through multiple paginated pages.
Don’t Link to Shopping Cart Pages
However you implement it, the end result needs to be cart links that cannot be spidered. But don't stop there; do the same for links to write reviews, printer-friendly pages, add-to-wishlist, add comments, and other links that typically create pages the search engines don't need to get into. While you can simply keep the search engines from indexing those pages, the best solution is to ensure they can't read the links to begin with.
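One common way to make such links unspiderable, sketched here with a hypothetical addToCart() function, is to render them as script-driven buttons instead of plain anchors:

```html
<!-- Spiderable: a plain anchor with a crawlable href (URL is a placeholder) -->
<a href="/cart/add?item=123">Add to cart</a>

<!-- Unspiderable: no href for a spider to follow; addToCart() is a hypothetical script function -->
<button onclick="addToCart(123)">Add to cart</button>
```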
Make Internal Search Work
Internal search is a navigational issue, but it can also lead to some linking problems. Let's start with the navigation side of things.
To start, every site with more than 10 pages should offer an internal search. Your main navigation should do a great job of getting people to the information they want with as few clicks as possible. However, sometimes visitors just want to search in order to get what they want more quickly. A good internal search will do that, but only if it works!
An internal search should get visitors to their destination 90% of the time. If a visitor searches your site and doesn't find what they are looking for, they will leave, assuming you don't have the product. Just like that, you lost a sale. No search at all is better than a search that doesn't deliver visitors to what they want 90% of the time.
The linking problems produced by internal searches occur when sites allow their search results pages to be indexed by the search engines. Search engines have been known to "search" sites for random words and characters, which produces a potentially unlimited number of URLs for the search engine to index, most of which won't provide any value and will simply slow down or dilute the valuation of the rest of the site.
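The simplest guard, assuming your results pages live under a path like /search (a placeholder), is a robots.txt rule (covered in more detail below) that keeps crawlers out of them entirely:

```
# Hypothetical path — point this at wherever your search results actually live
User-agent: *
Disallow: /search
```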
Avoid Blocking Content Behind Forms
Some forms are used to help visitors find content faster; however, such forms can often be an inhibitor, keeping search engines away from content that you want in the index so it can rank and drive traffic. Anytime a visitor has to enter a password, or make any kind of form selection, to reach content, that content will not help your SEO efforts. If it is your intent to keep that information hidden, then you're fine. However, if you want to use that content to drive traffic, then you should rethink the way people access it.
Add an HTML Sitemap
XML sitemaps are a good way to get your content spidered by the search engines, but don't use one as a substitute for an HTML sitemap for your visitors. A good sitemap can be a great way to help visitors quickly navigate to any page on your site in just a few clicks. It bypasses the main navigation and instead lets visitors see the entire site from a bird's-eye view.
It also provides a way to ensure the search engines don’t have to spider every page to get to pages several clicks away from the home page. It can help get these pages indexed and weighted sooner, and keep them fresher in the index.
Use Absolute Links
What's the difference between an absolute link and a relative link? An absolute link hard-codes the entire URL into the HTML:
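```html
<!-- Absolute link: the full URL, domain included (example.com is a stand-in) -->
<a href="https://www.example.com/services/tax-preparation/">Tax preparation services</a>
```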
A relative link allows you to put in only what’s needed in the code for the browser to know how to get to the destination based on where the visitor currently is:
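```html
<!-- Relative link: path only, resolved against wherever the visitor currently is -->
<a href="/services/tax-preparation/">Tax preparation services</a>
```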
There is less code needed for a relative link, and many HTML editors use relative links rather than absolute links by default. Web developers benefit from using relative links when building websites on a testing URL: when the site rolls out, they don't have to worry about changing the links, because relative links rely on the site structure, rather than the domain name, to work. When the developers switch from the test domain to the live domain, all is well.
However, relative links leave room for problems and, in some cases, outright disaster. Without going into all the potential problems relative links can cause, I will say that absolute links are pretty absolute and leave very little room for error when the link is properly formed. The downside is that when transferring a site from a test domain to the real domain, you must ensure those links are changed accordingly.
Using Search Engine Friendly Links
How your links are coded can make the difference between getting the page spidered by the search engines and preventing the search engines from finding the page.
Good, search engine friendly HTML links typically look like this:
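```html
<!-- A plain, spiderable anchor: a readable href plus descriptive anchor text (URL is a placeholder) -->
<a href="https://www.example.com/products/blue-widgets/">Blue widgets</a>
```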
If this is what your links look like, then you know your links are built right. But for a variety of reasons, some links are coded differently, and it's these links that can be unspiderable to the search engines. Here are a couple of examples:
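```html
<!-- Example 1: no href at all — the destination URL exists only inside a script -->
<span onclick="window.location='/products/blue-widgets/'">Blue widgets</span>

<!-- Example 2: a javascript: pseudo-URL; openPage() is a hypothetical script function -->
<a href="javascript:openPage('blue-widgets')">Blue widgets</a>
```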
The first example above is completely useless to the search engine. The page this links to cannot be found, and unless there are other search engine friendly links pointing to the page somewhere out there in the world, that page will forever remain outside the search engine index.
Blocking Pages From Being Indexed
As I mentioned above, there are times when you don't want pages in the search engine index. There are a few options for blocking those pages, and each has pros and cons, as well as more merit in certain circumstances than others.
Robots.txt File
The robots.txt file is your "master control" for telling the search engines which pages to index and which not to. When search engines visit your site, they are supposed to download this file first to get their instructions. A basic robots.txt file looks something like this:
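```
# Allow all crawlers, but keep them out of the listed folders
# (folder names here are placeholders — use your own)
User-agent: *
Disallow: /cart/
Disallow: /search/
```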
As you can see, it's a rather simple file, but it's one you don't want to get wrong. The main key here is to ensure you're only disallowing pages or folders that you don't want in the index and not inadvertently disallowing something that you do want. There are plenty of tutorials written to help you craft a robots.txt file specific to your needs, including allowing or disallowing certain search engines. Google Webmaster Tools also has robots.txt tools that you can use.
However, I cannot stress enough the importance of getting this file right. One wrong move can wipe out all your search engine rankings almost instantly.
Nofollow Attribute
The nofollow attribute was originally designed to be appended to blog comment links to tell the engines that you don't vouch for those links and therefore they should not be considered a vote of confidence from you; providing a vote of confidence to a known spam site can hurt your link profile. However, nofollow quickly evolved into an advertising attribute. Since Google doesn't want people paying for links to gain link credit, it requires that all advertising links use the nofollow attribute. Failure to comply could hurt your backlink profile and your ability to rank in the search engines.
In simple terms, the nofollow attribute tells the search engines not to follow that link:
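```html
<!-- rel="nofollow" tells the engines this link is not an endorsement (URL is a placeholder) -->
<a href="https://www.example.com/" rel="nofollow">Our advertiser</a>
```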
In reality, the search engines can and will follow these links; however, whatever positive or negative value the link would otherwise pass is eliminated.
Robots Meta Tag
Another option for keeping pages from being indexed is to use the robots meta tag on the page itself:
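```html
<!-- Placed in the page's <head>; tells the engines to keep this page out of their index -->
<meta name="robots" content="noindex">
```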
Once the search engine spiders the page, the “noindex” portion of the tag tells them not to put it in their index. If it’s already in, it will promptly be removed. If it’s not, it won’t be added.
Using this tag has several downsides. Namely, the search engines have to spider a page before they can receive the instruction not to index it. If you have a lot of these pages, the search engines will use up resources spidering pages they won't index, rather than spending those resources on pages you want in the index. Plus, you are potentially wasting valuable link equity by linking to pages that won't do you any good.
With that said, sometimes you need such pages to be spidered because the links in them cannot be found easily on any other page. A good example of this is a tag page on your blog. Tag pages aren't good landing pages by most counts, so you don't want them in the search index, but you do want the search engines to follow their links to the blog posts. Using "noindex,follow" tells the engines not to index the page but to go ahead and follow its links.
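On a tag page, that looks like this:

```html
<!-- Keep the tag page out of the index, but still follow its links to the posts -->
<meta name="robots" content="noindex,follow">
```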
If you don't want the page indexed or its links followed, go with "noindex,nofollow." On the other hand, if you want the page indexed, you can use the "index" value, but since search engines index by default, this adds nothing.
Keep Fixing Broken Links
Broken links happen for a variety of reasons. Sometimes you've removed pages, or you've linked to a page on another site that has since been removed. For this reason, it's good practice to regularly check for and fix any broken links you find. This can become a chore for blogs, which tend to do a lot of external linking while the web itself has a huge amount of turnover. But once you're caught up, a monthly check or so should do the trick to keep broken links under control.
301 Redirect Moved or Removed Pages
Even if you remove or update every link to pages that no longer exist on your site, it's good practice to 301 redirect those old URLs to the current relevant content. Without the 301 redirects, you lose all the link equity built into those old pages. Even if just one site links to a removed page, you should implement the redirect, unless you can request that the link on the external site be changed.
With 301 redirects in place, you not only maintain the link equity of each removed page, you also keep visitors on your site who arrive at those pages through their browser bookmarks. You never know how many people have bookmarked that content, and without the redirect, there is a good bet the visitor will move on to another site, costing you a potential link or even a sale.
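How you implement the redirect depends on your server. As one sketch, on Apache a single .htaccess line does the job; both URLs below are placeholders:

```
# Permanently redirect a removed page to its closest current equivalent
Redirect 301 /old-widgets.html https://www.example.com/products/blue-widgets/
```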
Who said internal linking and navigational structure were easy? There are certainly a lot of nuances to maintaining them! With no single right way to do it, there are a lot of ways of doing it wrong that can be detrimental to your online success.
I should also note that there are other issues regarding linking and navigation that were covered in The Complete Guide to Mastering Duplicate Content Issues. But for the sake of brevity (too late) and avoiding duplication (pun intended), I did not include that information in this post. Follow the link above to read all about those!