Keyword research is a make-or-break stage in any SEO campaign. If you do it wrong, all your efforts (and money) are in vain, which is why you need a reliable keyword research tool. For many years THE tool for keyword volume research was the free Google Keyword Tool, and it was all we needed. Unfortunately, that great tool is history now.

The good news is that there are alternatives for keyword volume research. For instance, the new tool from Google with similar functionality, Google Keyword Planner, does many of the things the old tool did, although it is more suitable for PPC keyword research than for organic SEO research. There are also other tools, such as the Bing Keyword Tool, the WordTracker Keyword Tool, and numerous paid tools, and each of them has its advantages and disadvantages, as we'll see next.

Google Keyword Planner – Still the Best Keyword Volume Tool

The new Google Keyword Planner offers much of the functionality of the old free Google Keyword Tool and is a much better option than most of the other tools, including the paid ones. Here are some of the advantages and disadvantages of Google Keyword Planner:

Advantages of Google Keyword Planner

The advantages of Google Keyword Planner can be summarized as follows:

  • Data comes directly from Google itself. Obviously, data is more reliable when it comes from the source than from a third party.

  • Great possibilities for local keywords. If you are doing local SEO, then you will certainly appreciate the enhanced possibilities for local search ‐ you can target your search to your small town audience.

  • Suggested keywords are grouped. This isn't a new feature but it's convenient when you can see groups of keywords instead of single keywords only.

These aren't the only advantages of Google Keyword Planner, but even they are reason enough to use it.

Disadvantages of Google Keyword Planner

The disadvantages of Google Keyword Planner are not that numerous and here are some of them:

  • You need an AdWords account. Since the tool is not available separately, if you want to use it, you need to create an AdWords account.

  • No Broad or Phrase search. The new tool lacks a key feature ‐ the ability to search for broad and phrase match. Your search is now limited to exact matches only.

  • No device targeting. You can no longer search for volumes on mobile and desktop separately ‐ everything is mixed into one.

Nevertheless, despite its disadvantages, Google Keyword Planner is still a better tool than its competitors.

Bing Keyword Tool – the Second Best Alternative

If you are not pleased with Google Keyword Planner, or simply want an alternative, you might want to check out the Bing Keyword Tool.

Advantages of Bing Keyword Tool

The advantages of Bing Keyword Tool are these:

  • The best tool if you optimize for Bing. If you have decided to concentrate your efforts on Bing, obviously this is the tool to use.

  • Date range search. Unlike Google Keyword Planner, here you can search for volumes during a particular date range. This is very useful if you are interested in seasonal volumes or holiday volumes, for example.

  • You can see search trends as well. Unlike Google, where you need to go to Google Trends if you need data over time, with the Bing Keyword Tool this data is visible right away.

Disadvantages of Bing Keyword Tool

The disadvantages of Bing Keyword Tool are these:

  • The data is not quite applicable to Google searches. The first and major disadvantage is that the data you get is not always useful when you optimize for Google. Of course, you could use it to spot trends, but very often you need more precise data in order to do well on Google.

  • Requires signing up for Bing Webmaster Tools. Similarly to Google Keyword Planner, this isn't a publicly accessible tool and you need an account. The tool is free, but you might have privacy concerns about what happens with the data from your searches.

  • No good local data. If country-level data is enough for you, this won't be a problem, but if you need more detailed local data, such as on a state level, not to mention a city or town level, you are out of luck.

WordTracker Keyword Tool

WordTracker was one of the first keyword tools and even though today its popularity (and usefulness) is quite low, it's still an alternative to consider.

Advantages of WordTracker Keyword Tool

The main advantages of WordTracker Keyword Tool are:

  • It gives an idea how competitive a keyword is. In other words, you get not only the volume of searches but also the volume of competitors. This is good to know because if a keyword is way too competitive, it might be better to skip it.

  • Multiple search filters. You can narrow down your search to exact matches only, include related terms, or search for keywords in any order.

  • Paid subscribers get data from SEMRush as well. If you subscribe to the tool, you will also get data from the SEMRush search tool.

Disadvantages of WordTracker Keyword Tool

The disadvantages of WordTracker Keyword Tool are these:

  • The tool is not free. While it does offer a free version with somewhat limited functionality, if you want to use the tool to the fullest, you need to pay.

  • The tool requires registration. This isn't much of an issue but still it is a disadvantage.

  • Reliability of the data. Since the data doesn't come from Google or another major search engine, its reliability is not high. You can still use it for direction, but you'd better double-check with other tools if you plan to build your whole campaign on it.

These three keyword volume research tools are the best of the bunch. There are many other tools, paid and free, but their reliability is seriously in question, so they aren't much of an alternative. Imperfect as they are, the three tools above will get the job done when you need to research keyword volumes.

Any update to the algorithms of search engines, such as the famous Panda update, is a major source of stress for any webmaster, because these updates are often like earthquakes ‐ they turn your rankings upside down and all your efforts to pump up your site's rankings go down the drain. While these changes are inevitable and there is nothing you can do to prevent them, there are some measures you can take to soften the blow. It's commonly believed that if you follow white hat best practices you are safe from updates, but in reality this isn't so. Here are some suggestions (in addition to the general white hat best practices) on how to make your site less vulnerable to search engine algorithm changes.

However, before we continue with the steps themselves, let's clarify that not all traffic fluctuations are the result of a search engine algorithm update. Very often fluctuations are normal ‐ seasonal fluctuations, for instance, are caused not by an algorithm change but by the time of year. Such changes are harder to control because, similarly to search engine algorithms, they are also outside your reach, which is why we are not discussing them here.

1. Stay Focused on Your Target Keywords

When you want to increase the traffic from search engines, your first idea might be that if you manage to rank for more keywords, you will get more traffic. This can be true (especially if the keywords are not closely related), but it isn't necessarily so.

You might think that expanding into new keywords will increase traffic, but often the opposite happens ‐ the new keywords dilute the relevancy of your present keywords, and as a result you might lose some of the traffic you already have. So when you target new keywords, always be ready to back off if the results turn out to be worse than expected.

2. Optimize for Long Tail Keywords

Long tail keywords are frequently neglected because they don't bring as much traffic as their more lucrative counterparts. However, long tail keywords are more resistant to changes in algorithms. The traffic from long tail keywords tends to fluctuate less because there is less competition. If you have been skipping long tail keywords up to now, start optimizing for them asap.

3. Optimize for Less Competitive Keywords

Less competitive keywords might not be long tail, but they also tend to suffer less from search engine updates. The explanation is easy ‐ if, for example, there are 10 sites competing for a keyword, even if Google updates its algorithm and shuffles the results, the worst that can happen to you is to rank 10th, which is much better than ranking in the second hundred, as is quite possible with keywords that have hundreds of sites competing for them.

4. Post New Content Regularly

All else being equal, huge intervals between new posts can literally bury your site, even for keywords you have always ranked well for and even if there are no search engine algorithm changes. You can't beat algorithm changes solely by posting new content all the time, simply because it takes time for new posts to rank well, but new content is fresh blood and it does bring traffic. As we've mentioned multiple times, it's better to post one new content piece a week, or even a month, than to post nothing for months and then pour out 20 new pieces at once.

5. Get Backlinks from Superb Sites Only

The days when any backlink was good are history. Now links from bad or simply irrelevant sites can hurt you badly. This is why you need to get backlinks from high-ranking, relevant sites only. In addition to the juice they pass, these sites tend not to lose their own rankings that frequently, so your rankings will fluctuate less as a result. Also, once an A-list site links to you, it will hardly remove the link to protect its own rankings, as some lesser sites will do.

To avoid links from bad sites, you can use various tools to discover who is linking to you. When you encounter a link you don't want, contact the webmaster of the site and ask him or her to remove it.

6. Use PPC

If you haven't figured this out so far, let's say it directly - free search engine traffic is great, but it's unreliable. No matter how great your site is and how masterfully you try to protect yourself against search engine algorithm changes, you are never immune. If you want constant traffic to your site, start using PPC. We've put together a detailed tutorial on Google AdWords; if you are not familiar with PPC at all, you can start with that tutorial.

7. Promote on Social Media

It's the same here ‐ don't put all your eggs in one basket, i.e. don't get all your traffic from search engines. When you start diversifying your traffic sources, the first option is PPC (which costs money but generally brings targeted traffic) and the second is social media. Sites such as Facebook, Twitter, Tumblr, etc. are free to use and can bring you lots of traffic if your posts become popular. The traffic from social sites usually isn't targeted and conversions might be low, but these sites are still a viable alternative.

While you can sometimes benefit from search engine algorithm changes because they bury your competitors and push you up in the search results without any effort on your part, this is pure luck and you can't rely on it. If you don't want to experience the negative effects of algorithm changes to the max, you need to be proactive and take the steps listed in this article. Even if you do, you still can't say you are fully covered, but at least you will have the consolation that you did what could be done.

The URLs and the navigation of a site do matter for SEO, but the elements of site structure that affect SEO don't end there. Take sub domains and sub directories, for example. These are another example of structural elements that influence how Google ranks results. While the difference between a page in a sub directory (also called a sub folder) and a page on a sub domain might not always be huge, there are cases when it really matters which of the two you use.
The Difference Between a Sub Domain and a Sub Directory
Before we discuss the advantages and disadvantages of sub domains and sub directories for SEO, let's clarify the difference between them.
Basically, when you use sub domains, your URLs will look like this: subdomain.yoursite.com.
Notice the 'subdomain' part before your main domain. This part tells Google and your visitors that the content on the sub domain is separate from the content on the rest of your site. A typical use for sub domains is a business site where the blog is separate from the rest and resides on a sub domain of its own ‐ i.e. blog.yoursite.com. Sites with forums (and other content, for instance articles) also frequently separate the forum part onto a sub domain, such as forum.yoursite.com.
Unlike sub domains, which precede the domain name, sub directories follow it, like this: yoursite.com/subdirectory.
The difference is not purely syntactic. It goes beyond that and relates to the type of content you have. There is no hard rule for when to use sub domains and when to use sub directories, but basically, if the content is a good candidate for a separate site, you go with a sub domain; if the content isn't that different from the main content, you go with a sub directory. For instance, if you have a blog about web design, you can have separate sub directories for tutorials, free stuff, your artwork, etc., but put the shop where you sell templates and designs on a separate sub domain.
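The structural difference is easy to see programmatically. As a minimal sketch (the example.com URLs are just placeholders), Python's standard `urlparse` shows where each part of the URL lives:

```python
from urllib.parse import urlparse

# Hypothetical URLs illustrating the two structures
on_subdomain = urlparse("https://blog.example.com/post-1")
in_subdirectory = urlparse("https://example.com/blog/post-1")

# A sub domain is part of the host name...
print(on_subdomain.netloc)     # blog.example.com
# ...while a sub directory is part of the path on the main host
print(in_subdirectory.netloc)  # example.com
print(in_subdirectory.path)    # /blog/post-1
```

The host name differs in the first case, which is roughly why a sub domain can behave like a separate site.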
SEO Benefits of Sub Domains
It's hard to say whether sub domains are better for SEO or not, because it depends on many other factors. However, one case where you would benefit from a separate sub domain is when you have multiple pages for a single keyword.
Google usually limits the number of search results per domain unless they are very relevant, but this doesn't apply to sub domains. In other words, if you had a domain with sub directories and 10 relevant results, most likely Google would show only 2 or 3 of them, while if you had 5 sub domains, chances are 2 or 3 results per sub domain (or 10-15 altogether) would be shown. However, don't take this for granted and don't rush into sub domains just to trick Google into showing more results from your site ‐ this might not work and you will have wasted your time and effort.
Another advantage of sub domains for SEO is that you can use your keywords as sub domain names. This is especially good if your main domain name lacks them ‐ i.e. if your main domain name doesn't contain the word 'addiction' but you have lots of content about addiction, it makes sense to create a separate sub domain with your keyword in it. Of course, you can do this with sub directories as well, but somehow a separate sub domain with your keywords carries more weight.
SEO Drawbacks of Sub Domains
The advantages of sub domains for SEO are tangible; however, they also have drawbacks. For instance, sub domains are harder to set up and manage.
Another disadvantage, and it's a major one, is that sub domains don't always inherit metrics from the main domain (i.e. if your main domain is PR5, your sub domains could be PR0, because to Google the two are not closely related). In many cases this alone is enough to make you give up on sub domains altogether ‐ since sub domains don't inherit metrics, you practically have to optimize them from scratch.
SEO Benefits of Sub Folders
If you expect a longish list of SEO benefits of sub folders, there is no such list because basically sub folders have only two advantages.
First, similarly to sub domains, with sub folders, you can have the keyword in your URL. Second, sub folders inherit the metrics of your root domain, which means that if your site is doing well on a whole, any content in a sub folder automatically benefits from this, while with sub domains you might have to start your SEO efforts from the very beginning.
SEO Drawbacks of Sub Folders
The main SEO disadvantage of sub folders is that they can limit your exposure if search results are already saturated with pages from your site. As already mentioned, Google generally limits the number of search results per domain to 2 or 3, so if you have more pages that are relevant, they might not show in the main search results.
When to Use a Sub Domain and When to Use a Sub Directory?
Use sub domains for larger topics ‐ i.e. if you have a health site, it makes sense to create separate sub domains for each major group of diseases (i.e. cardiovascular, respiratory, etc.) and then create a sub folder for each separate disease in that group (i.e. heart attack, flu, etc.)
Also, as already mentioned, if your site has a forum and/or a blog in addition to your corporate pages, you should use sub domains for the forum and the blog ‐ i.e. forum.yoursite.com and blog.yoursite.com.

Of course, these are just general recommendations and you don't have to follow them blindly. As usual, the ultimate test is to try and see what works for you and what doesn't.

If you don't sell Android apps on the Google Play Store, you might not have heard about App Store Optimization (ASO) at all. However, if this marketplace is of interest to you, then you really can't go without knowledge of the main factors that influence your popularity there. In a sense, ASO knowledge is even more vital than general SEO knowledge, because good rankings in the Google Play Store can bring more revenue with much less investment than good rankings in a search engine such as Google itself.

What Is ASO and Why It Matters

As already deciphered, ASO is an abbreviation for App Store Optimization. As the name implies, ASO involves steps to make your app's store listing better and more popular, get better rankings in the marketplace, and presumably make more money. You will see that ASO is very similar to search engine optimization, so if you are familiar with SEO, you are not starting from scratch, but ASO still has its specifics.

With 80 per cent of high-quality organic downloads (whatever this means) happening as a result of a search, you simply can't afford to leave your listing unoptimized. In other words, while direct ads might work for apps, the lion's share of downloads comes from search.

Major ASO Ranking Factors

Maybe you've guessed by now that, similarly to SEO, the exact ASO ranking algorithm isn't publicly known, and there is a lot of speculation about what matters for good ASO rankings. However, some factors are more or less universally agreed to carry huge, or at least some, importance. These factors are discussed next.

1. Keywords

It's hardly a surprise that keywords are a key factor for good rankings in the Google Play Store. For truth's sake, it needs to be mentioned that there are ASO experts who dispute the huge role of keywords and place more weight on other factors, such as reviews and downloads. However, it's hard to believe that properly placed keywords don't matter, so even if you are not absolutely convinced of their ranking role, don't neglect them entirely.

As with SEO, keywords matter especially in the title, the headings, and the description. Even if it were true that keywords aren't that important for ASO, they are important for your human visitors ‐ when you see a title with a keyword you are looking for, you are more likely to open that link than a link with no keywords in it. For even better results, use variations of your keyword(s).

2. App Reviews and Ratings

All else being equal, apps with more favorable reviews and better ratings rank higher for the same keyword. The reason reviews and ratings carry such weight is simple ‐ they show how your users view your app. Therefore, it makes sense to try to get better reviews and ratings. However, don't go down the slippery road of buying favorable ratings and reviews, because if you are caught, this can get you into real trouble.

3. Encourage App Downloads and Installs

Similarly to app reviews and ratings, app downloads and installs, as well as frequent app use, are other major ranking factors. This is for a reason ‐ they all measure user involvement and satisfaction. The rule here is simple - the more, the better.

To ensure more app downloads and installs, you need to actively promote your app off the store. Active promotion on social networks such as G+, Facebook, and Twitter, as well as on forums for mobile apps, can drive lots of traffic to your app and favorably affect the number of downloads and installs.

4. Go Local

If you are targeting a particular country as a market, it's a must to offer a localized version of your app. For starters, translate your app into the languages of the target countries. This might look like a lot of work, but unless you target a very small country with a really limited user base, the payoff could be good.

As a rule, most apps have an English version and very few have non-English ones. As a result, the competition for non-English apps is typically much lower, which gives you better chances to get noticed, especially in a saturated market.

5. Quality Screenshots

Quality screenshots might not be directly related to better rankings, but once your app is discovered, you don't want to chase users away with poor-looking screenshots. Because of this, take the time and effort to make crisp screenshots. When users can clearly see the screens of your app and these screens contain the functionality they need, they are more likely to download and install the app. As already explained, more downloads and installs lead to better rankings.

6. Get Backlinks to Your App

As with conventional search, getting backlinks to your app page helps a lot. Of course, you need quality links from reputable sources, not spam from link farms. It's better to get one single link from an A-list site where your app is featured than dozens of backlinks from Z-list sites.

If your app is good, chances are it will be included in articles all over the Net and you will be getting backlinks naturally. However, if this doesn't happen, you might have to build backlinks manually one by one.

ASO isn't hard to do and it can help you a lot. Most likely you are already doing most of these steps, but if you have neglected them until now, you might want to change that and pay more attention to the ASO ranking factors. After all, even if this doesn't make you a star on the Google Play Store, it doesn't take much time and effort, and it most certainly won't do you any harm.

Every website owner wants as many high rankings on Google as possible. You can lead Google and other search engine bots to the most important pages of your website. Make sure that irrelevant pages of your site do not clutter the search results and that your important pages get the attention they deserve.

Googlebot and your website

Why should you guide search engine robots?

If you guide the bot, your important and valuable pages will be crawled every time Google visits your website. Other pages, such as the 'terms and conditions' page on your website, are not important for search engine rankings.
Search engine robots can ignore these pages that do not need to be ranked. Indexing time that is spent on these pages should be spent on the most important pages of your site.
There are several different ways to guide Googlebot to the right pages:

1. Use a good robots.txt file and the meta noindex tag

Robots.txt is the name of a text file located at the root of your website. It tells search engines which parts of your website should be accessible.
You should set disallow rules for all file types, folders and pages that you do not want to be indexed. Search engine robots check the robots.txt file before indexing the pages on a website.
A good robots.txt file makes sure that unimportant pages aren't indexed.
The meta noindex tag is another way to exclude individual pages from the index. In general, you do not need this tag if you have a good robots.txt file.
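As a quick sketch of how disallow rules behave (the file contents below are hypothetical), Python's standard robotparser module can show which URLs a given robots.txt blocks:

```python
from urllib import robotparser

# A hypothetical robots.txt that hides an unimportant page and a folder
robots_txt = """\
User-agent: *
Disallow: /terms-and-conditions.html
Disallow: /cgi-bin/
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Regular pages stay accessible to crawlers; the disallowed ones do not
print(parser.can_fetch("*", "/index.html"))                 # True
print(parser.can_fetch("*", "/terms-and-conditions.html"))  # False
```

Checking your own file this way before uploading it helps avoid accidentally blocking pages you do want indexed.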

2. Make sure that all of your links work

Search engine bots follow the links they find on your web pages. Broken links are a sign of low quality sites. You don't want Googlebot to waste time on crawling links to non-existing pages on your site. You can check the links on your site with the website audit tool in SEOprofiler.
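If you want a feel for how such a check works under the hood, the first step is simply collecting every link on a page. Here is a minimal sketch using only Python's standard library (the HTML fragment is made up); a full audit would then request each collected URL and flag the ones that return errors:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href targets of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A hypothetical page fragment
collector = LinkCollector()
collector.feed('<p><a href="/about">About</a> <a href="https://example.com/">Home</a></p>')
print(collector.links)  # ['/about', 'https://example.com/']
```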

3. Improve your website structure

A good website structure with clearly categorized page content increases the likelihood that Googlebot and other search engine bots will find the important pages of your site.
The most important pages of your site should be available with a few mouse clicks. The website audit tool in SEOprofiler shows you how many clicks are needed to reach the pages on your site.

If you do the things above, Google will index the right pages of your site. The number of indexed pages is not as important as the quality of the indexed pages. If the right web pages can be found for the right keywords, you will get many more sales.

Google Webmaster Tools (GWT) is one of the most powerful resources in an online marketer’s toolbox -- especially when it comes to SEO. Most of us know how to get around in GWT, and are familiar with some of its basic functions. But there’s so much you can do in GWT that will enhance your site’s ranking in the search results pages.

Whether you’re a beginner in GWT or an experienced pro, I think you’re going to discover a few things in this article that will increase your knowledge of Webmaster Tools.

1. View author stats

Author stats is a not-so-obvious feature of Google Webmaster Tools. The main reason for this is because it’s still in beta. You can access it through the “Labs” section of Google Webmaster Tools through your personal login.

There’s another reason why it’s often overlooked. It’s not a website feature, as much as it is a feature to track your personal stats and visibility as a content producer.

If you have a Google+ account, and are a verified author on any websites, here’s how to view your author stats.

  • Log in to the Gmail account that is connected to your Google+ account.

  • Go to Google Webmaster Tools.

  • Click on Labs

  • Click on “Author Stats”

In “Author Stats,” you can view the number of all the articles you’ve written, impressions, click-through rates, and average position on Google.

This data is crucial if you’re interested in improving as a recognized author and enhancing your personal brand and content marketing efforts.

8 Advanced Webmaster Tools You Should Be Using for Better SEO

2. Change crawl rate

Google lets you set the rate at which you want them to crawl your site. You can only change this feature in Google Webmaster Tools. In most cases, you want your site to be crawled as often as possible. If, however, Google’s frequent crawl rate seems to be slowing down your site, here’s how to change it.

Click on the gear icon in the upper right corner of GWT.

Click on “Site Settings.”

Click on “Limit Google’s maximum crawl rate.”

Set the crawl rate to “Low” or “High” depending on your needs.

Click “Save.”

3. Use the structured data highlighter

Sites with structured data (e.g., schema markup) rank four positions higher in search results. If you’re not using structured data on your website, you need to start. According to Searchmetrics, only 0.3 percent of websites use schema markup, but a whopping 36 percent of Google’s search results include snippets derived from it.

There is a huge opportunity to gain rank and improve your listings in search results pages by using schema.

The structured data highlighter in Google Webmaster Tools makes it easy for you to do so. I’ve written a tutorial on schema markup that will help you get started.

Here’s how to access the structured data highlighter in Google Webmaster Tools.

Click on Search Appearance.

Click on Data Highlighter.

Click “Start.”

Prepare to spend a few minutes acquainting yourself with the tool. It’s not difficult to get started, but it might take a few minutes to completely go through the process of adding schema to any of your pages.
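For context, the highlighter ultimately corresponds to structured markup like the JSON-LD snippet sketched below; the property values are placeholders, and schema.org defines many more optional properties:

```python
import json

# Hypothetical values for a blog post marked up as a schema.org Article
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "A Sample Blog Post",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2015-01-01",
}

# The tag you would place in the page's <head>
snippet = '<script type="application/ld+json">%s</script>' % json.dumps(article)
print(snippet)
```

The advantage of the highlighter is that it lets you teach Google this same structure by pointing and clicking, without editing your pages.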

4. Identify HTML improvements

Google tells you exactly what SEO features you should focus on as you optimize your site for search. It’s called “HTML Improvements.”

Go to “Search Appearance.”

Click on “HTML Improvements.”

Here are the issues that Google focuses on:

  • Meta descriptions - Search-optimized descriptions are probably not a direct factor in search rankings. This is probably why the tool explains that “Addressing the following may help your site's user experience.” Meta descriptions are, however, important for users and click-through rate, which have a subsequent impact on ranking. Descriptions should be present on each page, and written in such a way that they are neither too long nor too short.

  • Missing title tags - Most SEOs agree that the title tag is the single most important feature of a site’s technical SEO. This is probably why the GWT HTML Improvement tool focuses on every aspect of a title tag.

  • Duplicate title tags - Lots of SEOs are concerned about duplicate content. It’s a concern that’s probably well-warranted. The fact is, duplicate content is going to happen. It has to happen, especially if a site needs to include, for example, the same legal disclaimer on multiple pages, or other standardized verbiage. Though no one knows quite how, duplicate content might negatively impact search rankings. You can identify any occurrence of duplicate title tags directly in GWT.

  • Long title tags - Title tags that are too long will be truncated or replaced by Google.

  • Short title tags - Title tags that are too short are not taking advantage of full SEO potential, and are probably not helpful to users, either.

  • Non-informative title tags - Google is all about providing information. If, in their view, your title tags are not providing sufficient information, they will let you know.

  • Non-indexable content - If your site has non-indexable content, such as rich media files, Google will alert you to this.

To examine any of these issues in detail, you can click the hyperlinked topic and identify where exactly on your site the problem is located.
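These checks are easy to approximate on your own crawl data. The sketch below flags the title-tag issues GWT reports; the 60- and 15-character thresholds are rough rules of thumb, not Google's published limits:

```python
def audit_titles(pages):
    """Flag missing, too long, too short, and duplicate title tags.

    pages maps URL -> title string (or None if the title is missing).
    """
    issues = {}
    by_title = {}
    for url, title in pages.items():
        if not title:
            issues.setdefault(url, []).append("missing")
            continue
        if len(title) > 60:        # likely truncated in search results
            issues.setdefault(url, []).append("too long")
        elif len(title) < 15:      # probably not descriptive enough
            issues.setdefault(url, []).append("too short")
        by_title.setdefault(title, []).append(url)
    for title, urls in by_title.items():
        if len(urls) > 1:          # same title on several pages
            for url in urls:
                issues.setdefault(url, []).append("duplicate")
    return issues

# Hypothetical crawl data
report = audit_titles({
    "/": "Example Site - Fast Keyword Research Tools",
    "/old": None,
    "/a": "Contact",
    "/b": "Contact",
})
print(report)
```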

5. Change sitelink ordering

Sitelinks are the additional entries that Google lists underneath your main site in the SERPs. They appear when a user performs a directional or branded search.

You don’t get to control whether or not Google shows sitelinks for your site; it’s an algorithmically generated feature. You can, however, ensure that you have a clear site structure and a sitemap, which will probably generate sitelinks.

Sometimes, however, Google’s algorithm doesn’t quite get the sitelinks right. That’s where the “Sitelinks” feature of GWT comes in.

Click on Search Appearance.

Click on “Sitelinks.”

Your only option is to “demote” a certain sitelink. If, for example, one of your sitelinks is appearing, but you don’t want it to, you can ask Google to keep it from appearing as a sitelink.

6. Identify internally linked pages

Internal linking is a very important part of a site’s SEO. I’ve explained before how exactly you should conduct an internal linking strategy.

The “Internal Links” feature in GWT helps you see which of your internal pages are most frequently linked within your site. The more internal links a page receives, the stronger that page is.

If you don’t see your most important content pages on the first page of the “Internal Links” report, you should probably address this by adding more internal links to them.


Click on “Search Traffic.”

Click “Internal Links.”
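The idea behind the report can be sketched in a few lines of Python: tally how many inbound internal links each page receives and sort by that count. The page paths here are hypothetical, standing in for a real crawl of your site.

```python
from collections import Counter

# Hypothetical crawl data: (source_page, target_page) internal links.
internal_links = [
    ("/blog/post-1", "/services"),
    ("/blog/post-2", "/services"),
    ("/blog/post-2", "/contact"),
    ("/", "/services"),
    ("/", "/contact"),
    ("/", "/blog/post-1"),
]

# Tally inbound links per target page, like GWT's "Internal Links" report.
inbound = Counter(target for _, target in internal_links)

# Most-linked pages first: /services 3, /contact 2, /blog/post-1 1
for page, count in inbound.most_common():
    print(page, count)
```

If an important page sits at the bottom of a list like this, that is your cue to link to it more often from other pages.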

7. Indexation status

I strongly encourage every SEO and webmaster to keep a close eye on their indexation status. Over time, the number of indexed pages can change. These changes may signal an algorithmic shift, which could affect rankings. At other times, indexation drops may signal a negative SEO attack on your site.

As a general rule, your number of indexed pages should rise in line with your content-marketing output. As long as you’re creating great content consistently, your indexed-page count should be rising.

Check “Index Status” to make sure.

Go to “Google Index.”

Click on “Index Status.”

8. Fetch as Google

If you want to analyze how a page appears to Google’s crawler, the “Fetch as Google” tool is a great resource.

Click on “Crawl.”

Click “Fetch as Google.”

Type in the URL of the page you want to analyze.

Click “Fetch and Render.”


Using Fetch as Google allows you to see how -- or if -- a page appears to Google. You’ll be able to identify the HTTP server response, the time of the crawl request, the HTML code, the visible indexable content, and a screenshot of the page as seen by Google. If the crawler was unable to access any of the content, you’ll see a list of the inaccessible resources as well.

If you’ve configured robots.txt to block certain elements, this will be listed as the reason for the lack of access. Other times, however, legitimate crawl problems may surface during this examination.
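For example, a robots.txt like the following (the paths are hypothetical) would cause Fetch and Render to report the scripts and stylesheets under those directories as blocked resources:

```text
User-agent: *
Disallow: /assets/js/
Disallow: /assets/css/
```

Blocking CSS and JavaScript this way can prevent Google from rendering the page as users see it, which is exactly the kind of problem this tool surfaces.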



I use Google Webmaster Tools every day, both on my own site and on others’ sites. The list of eight resources above is only a sampling of the many features available in GWT.

The better you become at using these tools, the better optimized your site will be for Google.

The better your site structure, the better your chance of higher ranking in the search engines. Every website has some “structure.” It might be a rigorous and streamlined structure, or it may be a disorganized jumble of pages. If you are intentional and careful with your site structure, you will create a site that achieves search excellence.

In this article, I share some of the best advice on creating a powerful site structure. The tips below will help you create a site that appeals to users, gets crawled and indexed by spiders, and delivers the best SERP listings and rankings possible.

Why Structure Matters

As I’ve worked with hundreds of clients over the years, I’ve been surprised at how often site structure is overlooked. On the one hand, it’s one of the most crucial aspects of a site’s SEO performance, but on the other hand, few webmasters and owners understand what it means to have a site structure that enhances SEO.

I’m going to share a few of the reasons why site structure is so crucial, and then get into the how-to of developing your own SEO-friendly site structure.

A good site structure means a great user experience.

When you take away the colors, the fonts, the kerning, the graphics, the images, and the white space, good site design is really about a great structure.

The human mind craves cognitive equilibrium -- being able to put pieces logically together, finding things where they’re expected, and locating what they are seeking. Thus, a strong and logical site structure is cognitively satisfying to users.

As you know, the more appealing your site is to users, the more appealing it is to search engines, too. Google’s algorithm uses information from searchers to rank your site. If your site has poor CTRs and low dwell time, it will not perform well in the SERPs. By contrast, when users find a site that they like -- i.e., a site with great structure -- they don’t bounce and they stay longer. A sound site structure can reduce bounce rate and improve dwell time, both of which lead to improved rankings.

A good site structure provides your site with sitelinks.

Sitelinks are a listing format in the SERPs that show your site’s main page along with several internal links indented below. You’ve seen them before.


Sitelinks are a huge SEO advantage. They increase the navigability of your site, point users to the most relevant information, increase your brand’s reputation, improve user trust, help you dominate SERPs, increase clickthrough rate, and shorten the conversion funnel. Basically, sitelinks are awesome.

But how do you get sitelinks? You don’t simply go to Google Webmaster Tools and fill in a few fields on a form. You can’t issue a sitelink request. Instead, Google’s algorithm automatically awards sitelinks to websites, and it does so based on great site structure.

If you have a poor site structure, it’s very likely that your site will never receive sitelinks. Their absence could be costing your site targeted traffic, higher CTRs, and increased conversions.

A good structure means better crawling.

Web crawlers like Googlebot crawl a website’s structure. Their goal is to index the content in order to return it in search results. The better your site structure, the more easily crawlers can access and index your content.

Crawlers don’t automatically discover everything on your website. Google itself admits that Sitemaps can point it to “URLs that may not be discoverable by Google’s normal crawling process.” (That’s one of the reasons why Sitemaps are necessary.) However, crawlers will have a far easier time accessing, crawling, indexing, and returning the pages of a site with strong structure.

A good site structure is at the very core of good SEO -- optimizing for the crawlers.

To sum up, your site’s organization paves the way for SEO success. In fact, it could be argued that without a good site structure, you will never have SEO success. Strong site structure gives your site an unbreakable SEO foundation that will provide you with vast amounts of organic search traffic.

Six Steps to Creating Site Structure

Now, I’ll tell you how to create this kind of site structure.

1. Plan out a hierarchy before you develop your website.

If you’re starting a website from scratch, you’re in a great position to plan out site structure for the best SEO possible. Even before you start creating pages in a CMS, plan out your structure. You can do it on a whiteboard, a spreadsheet program (Excel, Google Drive Spreadsheets), most word processors, or something like Visio or OmniGraffle.

A “hierarchy” is nothing more than a way to organize your information -- something that is simple and makes sense. Your hierarchy will also become your navigation and your URL structure, so everything important begins here.

Generally, a site hierarchy has the homepage at the top, your main categories beneath it, and subcategories beneath each main category.
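Sketched as a plain-text outline (all of the category names here are placeholders), such a hierarchy might look like:

```text
Homepage
├── Category A
│   ├── Subcategory A1
│   └── Subcategory A2
├── Category B
│   ├── Subcategory B1
│   └── Subcategory B2
└── Category C
    └── Subcategory C1
```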

There are a few features of hierarchy that you should keep in mind.

  • Make your hierarchy logical. Don’t overthink or overcomplicate this process. You want simplicity, both for your own sake and for the ease of crawlers and users. Each main category should be unique and distinct. Each subcategory should somehow relate to the main category under which it is located.
  • Keep the number of main categories between two and seven. Unless your site is very large, you don’t want to have too many main categories. There should be only a few main things. If you have more than seven, you may want to rethink the organization, and pare it down a bit.
  • Try to balance the number of subcategories within each category. Basically, try to keep it approximately even. If one main category has fourteen subcategories while another main category has only three, the structure could become a little unbalanced.

A site hierarchy is the beginning point for a great site structure.

2. Create a URL structure that follows your navigation hierarchy.

The second main element in developing strong site structure is your URL structure. If you’ve logically thought through your hierarchy, this shouldn’t be too difficult. Your URL structure follows your hierarchy.

So, let’s say your hierarchy includes a page for a Chinatown location nested under one of your main categories. The URL for that page would mirror its path through the hierarchy, with each level of the hierarchy becoming a directory in the URL.
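For instance, assuming the Chinatown page sits under a top-level “Locations” category (the names and domain here are placeholders), the URLs would nest like this:

```text
https://example.com/
https://example.com/locations/
https://example.com/locations/chinatown/
```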

Your URL structure will be organized according to your site hierarchy. This means, obviously, that your URLs will have real words (not symbols) and appropriate keyword coverage.

3. Create your site navigation in HTML or CSS.

When you create your navigation, keep the coding simple. HTML and CSS are your safest approach. Coding navigation in JavaScript, Flash, or Ajax will limit the crawler’s ability to follow your site’s well-thought-out navigation and hierarchy.
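A crawler-friendly navigation is just a list of plain links; the category names and paths below are placeholders. A minimal sketch:

```html
<!-- A plain HTML navigation list: crawlers can follow these links
     without executing any JavaScript. Style it with CSS as needed. -->
<nav>
  <ul>
    <li><a href="/services/">Services</a></li>
    <li><a href="/locations/">Locations</a></li>
    <li><a href="/blog/">Blog</a></li>
  </ul>
</nav>
```

Because the links are ordinary anchor tags in the HTML, they also give you the text anchors that carry SEO value.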

4. Use a shallow depth navigation structure.

Your navigation structure will obviously follow your site hierarchy. Make sure that pages, especially important ones, aren’t buried too deep within the site. Shallow sites work better, both from a usability and crawler perspective, as noted in this Search Engine Journal article:

“A shallow website (that is, one that requires three or fewer clicks to reach every page) is far more preferable than a deep website (which requires lengthy strings of clicks to see every page on your site).”
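“Clicks to reach a page” is just the page’s depth in your internal-link graph, which you can measure with a breadth-first search from the homepage. The site map below is hypothetical, standing in for your own crawl data.

```python
from collections import deque

# Hypothetical internal-link graph: page -> pages it links to.
site = {
    "/": ["/services/", "/locations/", "/blog/"],
    "/services/": ["/services/design/"],
    "/locations/": ["/locations/chinatown/"],
    "/blog/": ["/blog/post-1/"],
    "/services/design/": [],
    "/locations/chinatown/": [],
    "/blog/post-1/": [],
}

def click_depths(graph, start="/"):
    """Breadth-first search: minimum number of clicks from the homepage."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

depths = click_depths(site)
print(max(depths.values()))  # deepest page is 2 clicks from the homepage
```

If the maximum depth comes out above three, some pages are buried too deep and should be linked from somewhere closer to the homepage.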

5. Create a header that lists your main navigation pages.

Your top header should list out your main pages. That’s it. My website uses a very simple top navigational header with three subcategories. This accomplishes everything I need.


Adding any other menu elements apart from your main categories can become distracting and unnecessary. If you’ve designed a parallax site, be sure to provide a persistent header menu that displays through each scrolling phase.

While dropdown menus using CSS effects or disappearing menus may provide a unique or intriguing user experience, they do not enhance SEO. I advise against them. I also advise against using an image-based navigational structure. Text links with appropriate anchors provide the strongest form of SEO.

If you have a footer with menu links, be sure to duplicate the main links of your top navigational menu in your footer navigation menu. Changing the order of links or adding additional category listings will complicate the user experience.

6. Develop a comprehensive internal linking structure.

Internal linking puts meat on the bones of a logical site hierarchy. Moz’s article on internal links lists three reasons why they are important:

  • They allow users to navigate a website.
  • They help establish information hierarchy for the given website.
  • They help spread link juice (ranking power) around websites.

Each of these is directly tied to creating a tight-knit and well-integrated site structure.

There’s no need to get complicated with internal linking. The basic idea is that every page on your website should link to, and be linked from, at least one other page on the site. Your navigation should accomplish internal linking to the main category and subcategory pages, but you should also make sure that leaf-level pages are internally linked as well.
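That rule is easy to check mechanically: given a list of internal links, find pages nothing links to (orphans) and pages that link to nothing (dead ends). The pages and links below are hypothetical.

```python
# Hypothetical internal-link pairs: (source_page, target_page).
links = [
    ("/", "/services/"),
    ("/", "/blog/post-1/"),
    ("/services/", "/"),
    ("/blog/post-1/", "/services/"),
]
pages = {"/", "/services/", "/blog/post-1/", "/blog/post-2/"}

sources = {src for src, _ in links}
targets = {dst for _, dst in links}

orphans = pages - targets    # pages nothing links to
dead_ends = pages - sources  # pages that link to nothing

print(sorted(orphans))    # ['/blog/post-2/']
print(sorted(dead_ends))  # ['/blog/post-2/']
```

Any page that shows up in either set needs an internal link added to or from it.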

Internal linking tells the search engines what pages are important, and how to get there. The more internal linking you have across all pages, the better.


Site structure is a product of careful thinking, intentional design, and deliberate organization. The best time to develop a strong site structure is before you create your site. However, if you’re redesigning your site, you can rework the design and reorganize some navigational elements to improve structural SEO.

There are a lot of things to keep in mind when optimizing your site for search engines. Site structure is one of the most important, but one of the most-overlooked optimization methods. If you have a great site structure, then great SEO will follow.

What advice do you have for improving a site’s structure?