
The Balanced Digital Scorecard: A Simpler Way to Evaluate Prospects

Posted by EmilySmith

[Estimated read time: 10 minutes]

As anyone who’s contributed to business development at an agency knows, it can be challenging to establish exactly what a given prospect needs. What projects, services, or campaigns would actually move the needle for this organization? While some clients come to an agency with specific requests, others are looking for guidance — help establishing where to focus resources. This can be especially difficult, as answering these questions often requires analyzing large amounts of information in a short period of time.

To address the challenge of evaluating prospective clients and prioritizing proposed work, we’ve developed the Balanced Digital Scorecard framework. This post is the first in a two-part series. Today, we’ll look at:

  • Why we developed this framework,
  • Where the concept came from, and
  • Specific areas to review when evaluating prospects

Part two will cover how to use the inputs from the evaluation process to prioritize proposed work — stay tuned!

Evaluating potential clients

Working with new clients, establishing what strategies will be most impactful to their goals… this is what makes working at an agency awesome. But it can also be some of the most challenging work. Contributing to business development and pitching prospects tends to amplify this with time constraints and limited access to internal data. While some clients have a clear idea of the work they want help with, this doesn’t always equal the most impactful work from a consultant’s standpoint. Balancing these needs and wants takes experience and skill, but can be made easier with the right framework.

The use of a framework in this setting helps narrow down the questions you need to answer and the areas to investigate. This is crucial to working smarter, not harder — words which we at Distilled take very seriously. Often when putting together proposals and pitches, consultants must quickly establish the past and present status of a site from many different perspectives.

  • What type of business is this and what are their overall goals?
  • What purpose does the site serve and how does it align with these goals?
  • What campaigns have they run and were they successful?
  • What does the internal team look like and how efficiently can they get things done?
  • What is the experience of the user when they arrive on the site?

The list goes on and on, often becoming a vast amount of information that, if not digested and organized, can make putting the right pitch together burdensome.

To help our consultants understand both what questions to ask and how they fit together, we’ve adapted the Balanced Scorecard framework to meet our needs. But before I talk more about our version, I want to briefly touch on the original framework to make sure we’re all on the same page.

[Image: Kaplan and Norton airplane quote]

The Balanced Scorecard

For anyone not familiar with this concept, the Balanced Scorecard was created by Robert Kaplan and David Norton in 1992. In their original Harvard Business Review article, Kaplan and Norton set out to create a management system, as opposed to a measurement system (which was more common at that time).

Kaplan and Norton argued that “the traditional financial performance measures worked well for the industrial era, but they are out of step with the skills and competencies companies are trying to master today.” They felt the information age would require a different approach, one that guided and evaluated the journey companies undertook. This would allow them to better create “future value through investment in customers, suppliers, employees, processes, technology, and innovation.”

The concept suggests that businesses be viewed through four distinct perspectives:

  • Innovation and learning – Can we continue to improve and create value?
  • Internal business – What must we excel at?
  • Customer – How do customers see us?
  • Financial – How do we look to shareholders?

Narrowing the focus to these four perspectives reduces information overload. “Companies rarely suffer from having too few measures,” wrote Kaplan and Norton. “More commonly, they keep adding new measures whenever an employee or a consultant makes a worthwhile suggestion.” By limiting the perspectives and associated measurements, management is forced to focus on only the most critical areas of the business.

The image below shows how the four perspectives relate to one another:

[Image: Balanced Scorecard diagram showing the four perspectives]

And now, with it filled out as an example:

[Image: example Balanced Scorecard filled out with goals and measures]

As you can see, this gives the company clear goals and corresponding measurements.

Kaplan and Norton found that companies solely driven by financial goals and departments were unable to implement the scorecard, because it required all teams and departments to work toward central visions — which often weren’t financial goals.

“The balanced scorecard, on the other hand, is well suited to the kind of organization many companies are trying to become… put[ting] strategy and vision, not control, at the center,” wrote Kaplan and Norton. This would inevitably bring teams together, helping management understand the connectivity within the organization. Ultimately, they felt that “this understanding can help managers transcend traditional notions about functional barriers and ultimately lead to improved decision-making and problem-solving.”

At this point, you’re probably wondering why this framework matters to a digital marketing consultant. While it’s more directly suited for evaluating companies from the inside, so much of this approach is really about breaking down the evaluation process into meaningful metrics with forward-looking goals. And this happens to be very similar to evaluating prospects.

Our digital version

As I mentioned before, evaluating prospective clients can be a very challenging task. It’s crucial to limit the areas of investigation during this process to avoid getting lost in the weeds, instead focusing only on the most critical data points.

Since our framework is built for evaluating clients in the digital world, we have appropriately named it the Balanced Digital Scorecard. Our scorecard likewise has five perspectives through which to view the client:

  1. Platform – Does their platform support publishing, discovery, and discoverability from a technical standpoint?
  2. Content – Are they publishing content that is an effective blend of the informative, the entertaining, and the compelling?
  3. Audience – Are they building visibility through owned, earned, and paid media?
  4. Conversions – Do they have a deep understanding of the needs of the market, and are they creating assets, resources, and journeys that drive profitable customer action?
  5. Measurement – Are they measuring all relevant aspects of their approach and their prospects’ activities to enable testing, improvement, and appropriate investment?

These perspectives make up the five areas of analysis to work through when evaluating most prospective clients.

1. Platform

Most consultants or SEO experts have a good understanding of the technical elements to review in a standard site audit. A great list of these can be found on our Technical Audit Checklist, created by my fellow Distiller, Ben Estes. The goal of reviewing these factors is, of course, to “ensure site implementation won’t hurt rankings,” says Ben. While you should definitely evaluate these elements (at a high level), there is more to look into when using this framework.

Evaluating a prospect’s platform includes standard technical SEO factors, but also raises more internal questions, like:

  • How effective and/or differentiated is their CMS?
  • How easy is it for them to publish content?
  • How differentiated are their template levels?
  • What elements are under the control of each team?

Additionally, you should look into areas like social sharing, overall mobile-friendliness, and site speed.

If you’re thinking this seems like quite the undertaking, because technical audits take time and some prospects won’t be open about platform constraints, you’re right (to an extent). Take a high-level approach and look for massive weaknesses instead of every single limitation. This will give you enough information to understand where to prioritize this perspective in the pitch.

2. Content

As with the technical section, evaluating content looks like a lightweight version of a full content audit. What content do they have? Which pieces are awesome, and what’s missing? Also look to competitors to understand who is creating content in the space and where the bar is set.

Beyond looking at these elements through a search lens, aim to understand what content is being shared and why. Is this taking place largely on social channels, or are publications picking these pieces up? Evaluating content on multiple levels helps to understand what they’ve created in the past and their audience’s response to it.

3. Audience

Looking into a prospect’s audience can be challenging, depending on how much access they grant you during the pitch process. If you’re able to get access to analytics, this task is much easier; without it, there are many tools you can leverage to get some of the same insights.

In this section, you’re looking at the traffic the site is receiving and from where. Are they building visibility through owned, earned, and paid media outlets? How effective are those efforts? Look at metrics like Search Visibility from SearchMetrics, social reach, and email stats.

A large amount of this research will depend on what information is available or accessible to you. As with previous perspectives, you’re just aiming to spot large weaknesses.

4. Conversions

Increased conversions are often a main goal stated by prospects, but without transparency from them, this can be very difficult to evaluate during a pitch. This means you’re often left to speculate or use basic approaches. How difficult or simple is it to buy something, contact them, or complete a conversion in general? Are there good calls to action for micro-conversions, such as joining an email list? How different is the mobile experience of this process?

Look at the path to these conversions. Is there a clear funnel, and does it make sense from a user’s perspective? Understanding the journey a user takes (which you can generally experience first-hand) can tell you a lot about expected conversion metrics.

Lastly, many companies’ financials are available to the public and offer a general idea of how the company is doing. If you can establish how much of their business takes place online, you can start to speculate about the success of their web presence.

5. Measurement

Evaluating a prospect’s measurement capabilities is (not surprisingly) vastly more accurate with analytics access. If you’re granted access, evaluate each platform not just for validity but also accessibility. Are there useful dashboards, management data, or other data sources that teams can use to monitor and make decisions?

Without access, you’re left to simply check for the presence of analytics and a data layer. While this doesn’t tell you much, you can often deduce from conversations how much data figures into the internal team’s thought process. If people are monitoring, engaging with, and interested in analytics data, changes and prioritization might be an easier undertaking.

[Image: “what you measure” quote]

Final thoughts

Working with prospective clients is something all agency consultants will have to do at some point in their career. This process is incredibly interesting — it challenges you to leverage a variety of skills and a range of knowledge to evaluate new clients and industries. It’s also a daunting task. Often your position outside the organization or unfamiliarity with a given industry can make it difficult to know where to start.

Frameworks like the original Balanced Scorecard created by Kaplan and Norton were designed to help a business evaluate itself from a more modern and holistic perspective. This approach turns the focus to future goals and action, not just evaluation of the past.

This notion is crucial at an agency needing to establish the best path forward for prospective clients. We developed our own framework, the Balanced Digital Scorecard, to help our consultants do just that. By limiting the questions you’re looking to answer, you can work smarter and focus your attention on five perspectives to evaluate a given client. Once you’ve reviewed these, you’re able to identify which ones are lagging behind and prioritize proposed work accordingly.

Next time, we’ll cover the second part: how to use the Balanced Digital Scorecard to prioritize your work.

If you use a framework to evaluate prospects or have thoughts on the Balanced Digital Scorecard, I’d love to hear from you. I welcome any feedback and/or questions!



10 Illustrations of How Fresh Content May Influence Google Rankings (Updated)

Posted by Cyrus-Shepard

[Estimated read time: 11 minutes]

How fresh is this article?

Through patent filings over the years, Google has explored many ways that it might use “freshness” as a ranking signal. Back in 2011, we published a popular Moz Blog post about these “Freshness Factors” for SEO. Following our own advice, this is a brand new update of that article.

In 2003, Google engineers filed a patent named Information retrieval based on historical data that shook the SEO world. The patent not only offered insight into the mind of Google engineers at the time, but also seemingly provided a roadmap for Google’s algorithm for years to come.

Bill Slawski’s excellent series on the “10 most important search patents of all time” shows how this patent spawned an entire family of Google child patents, the latest from October 2011.

This post doesn’t attempt to describe all the ways that Google may determine freshness to rank web pages; instead, it focuses on the areas we’re most likely to influence through SEO.

Giant, great big caveat: Keep in mind that while multiple Google patent filings describe these techniques — often in great detail — we have no guarantee how Google uses them in its algorithm. While we can’t be 100% certain, evidence suggests that they use at least some, and possibly many, of these techniques to rank search results.

For another take on these factors, I highly recommend reading Justin Briggs’ excellent article Methods for Evaluating Freshness.

When “Queries Deserve Freshness”

Former Google Fellow Amit Singhal once explained how “Different searches have different freshness needs.”

The implication is that Google measures all of your documents for freshness, then scores each page according to the type of search query.

Singhal describes the types of keyword searches most likely to require fresh content:

  • Recent events or hot topics: “occupy oakland protest” “nba lockout”
  • Regularly recurring events: “NFL scores” “dancing with the stars” “exxon earnings”
  • Frequent updates: “best slr cameras” “subaru impreza reviews”

Google may determine exactly which queries require fresh content by monitoring the web and their own huge warehouse of data, including:

  1. Search volume: Are queries for a particular term spiking (i.e. “Earthquake Los Angeles”)?
  2. News and blog coverage: If a number of news organizations start writing about the same subject, it’s likely a hot topic.
  3. Social media: A spike in mentions of a particular topic may indicate the topic is “trending.”

While some queries need fresh content, other search queries may be better served by older content.

Fresh is often better, but not always. (More on this later.)

Below are ten ways Google may determine the freshness of your content. Images courtesy of my favorite graphic designer, Dawn Shepard.

1. Freshness by inception date

Initially, a web page can be given a “freshness” score based on its inception date, which decays over time. This freshness score may boost a piece of content for certain search queries, but degrades as the content becomes older.

The inception date is often when Google first becomes aware of the document, such as when Googlebot first indexes a document or discovers a link to it.

“For some queries, older documents may be more favorable than newer ones. As a result, it may be beneficial to adjust the score of a document based on the difference (in age) from the average age of the result set.”
– All quotes from the US patent Document Scoring Based on Document Content Update
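
To make the decay idea concrete, here’s a minimal sketch in Python. This is not Google’s published formula (the patent doesn’t specify one); the exponential shape and the half-life value are illustrative assumptions only:

```python
import math
from datetime import date

def freshness_score(inception: date, today: date, half_life_days: float = 180.0) -> float:
    """Toy freshness score: 1.0 on the inception date, decaying exponentially.

    The half-life is an arbitrary illustrative choice, not a known Google value.
    """
    age_days = (today - inception).days
    return math.exp(-math.log(2) * age_days / half_life_days)

# A page first indexed 180 days ago scores half of a brand-new page.
print(freshness_score(date(2016, 1, 1), date(2016, 6, 29)))  # ~0.5
```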

2. Amount of change influences freshness: How much

The age of a webpage or domain isn’t the only freshness factor. Search engines can score regularly updated content for freshness differently from content that doesn’t change. In this case, the amount of change on your webpage plays a role.

For example, changing a single sentence won’t have as big a freshness impact as a large change to the main body text.

“Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time.”

In fact, Google may choose to ignore small changes completely. That’s one reason why when I update a link on a page, I typically also update the text surrounding it. This way, Google may be less likely to ignore the change. Consider the following:

“In order to not update every link’s freshness from a minor edit of a tiny unrelated part of a document, each updated document may be tested for significant changes (e.g., changes to a large portion of the document or changes to many different portions of the document) and a link’s freshness may be updated (or not updated) accordingly.”
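
As a rough illustration of such a “significant change” test, the sketch below measures what fraction of a document’s text changed and ignores edits under a threshold. The 20% cutoff is invented for the example; the patent specifies no number:

```python
import difflib

def change_fraction(old_text: str, new_text: str) -> float:
    """Fraction of the document that changed: 0.0 (identical) to 1.0 (rewritten)."""
    return 1.0 - difflib.SequenceMatcher(None, old_text, new_text).ratio()

def is_significant_change(old_text: str, new_text: str, threshold: float = 0.2) -> bool:
    # Swapping a single link or sentence falls under the threshold and is
    # ignored; updating the surrounding text as well pushes it over.
    return change_fraction(old_text, new_text) >= threshold
```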

3. Changes to core content matter more: How important

Changes made in “important” areas of a document will signal freshness differently than changes made in less important content.

Less important content includes:

  • JavaScript
  • Comments
  • Advertisements
  • Navigation
  • Boilerplate material
  • Date/time tags

Conversely, “important” content often means the main body text.

So simply changing out the links in your sidebar, or updating your footer copy, likely won’t be considered a signal of freshness.

“…content deemed to be unimportant if updated/changed, such as Javascript, comments, advertisements, navigational elements, boilerplate material, or date/time tags, may be given relatively little weight or even ignored altogether when determining UA.”

This brings up the issue of timestamps on a page. Some webmasters like to update timestamps regularly — sometimes in an attempt to fake freshness — but there is conflicting evidence on how well this works. Suffice it to say, the freshness signals are likely much stronger when you keep the actual page content itself fresh and updated.

4. The rate of document change: How often

Content that changes more often is scored differently than content that only changes every few years.

For example, consider the homepage of the New York Times, which updates every day and has a high degree of change.

“For example, a document whose content is edited often may be scored differently than a document whose content remains static over time. Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time.”

Google may treat links from these pages differently as well (more on this below). For example, a fresh “link of the day” from the Yahoo homepage may be assigned less significance than a link that remains more permanently.

5. New page creation

Instead of revising individual pages, fresh websites often add completely new pages over time. (This is the case with most blogs.) Websites that add new pages at a higher rate may earn a higher freshness score than sites that add content less frequently.

“UA may also be determined as a function of one or more factors, such as the number of ‘new’ or unique pages associated with a document over a period of time. Another factor might include the ratio of the number of new or unique pages associated with a document over a period of time versus the total number of pages associated with that document.”

Some webmasters advocate adding 20–30% new pages to your site every year. Personally, I don’t believe this is necessary as long as you send other freshness signals, including keeping your content up-to-date and regularly earning new links.

6. Rate of new link growth signals freshness

Not all freshness signals are restricted to the page itself. Many external signals can indicate freshness as well, oftentimes with powerful results.

If a webpage sees an increase in its link growth rate, this can signal relevance to search engines. For example, if folks start linking to your personal website because you’re about to get married, your site could be deemed more relevant and fresh (as far as this current event goes).

“…a downward trend in the number or rate of new links (e.g., based on a comparison of the number or rate of new links in a recent time period versus an older time period) over time could signal to search engine 125 that a document is stale, in which case search engine 125 may decrease the document’s score.”

Be warned: an unusual increase in linking activity can also indicate spam or manipulative link building techniques. Search engines are likely to devalue such behavior. Natural link growth over time is usually the best bet.
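
Here’s a hedged sketch of the trend comparison the patent describes: new links in a recent window versus the window before it. The 90-day windows are assumptions, and the link dates would come from your own backlink data:

```python
from datetime import date, timedelta

def link_growth_trend(link_dates: list[date], today: date, window_days: int = 90) -> float:
    """Ratio of new links in the recent window vs. the prior window.

    Values above 1.0 suggest accelerating (fresh) growth; values below 1.0
    suggest the downward trend the patent associates with staleness.
    """
    recent_start = today - timedelta(days=window_days)
    prior_start = today - timedelta(days=2 * window_days)
    recent = sum(1 for d in link_dates if d >= recent_start)
    prior = sum(1 for d in link_dates if prior_start <= d < recent_start)
    return recent / prior if prior else float(recent > 0)
```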

7. Links from fresh sites pass fresh value

Links from sites that have a high freshness score themselves can raise the freshness score of the sites they link to.

For example, a link from an old, static site that hasn’t been updated in years may not pass the same level of freshness value as a link from a fresh page, e.g., the homepage of Wired. Justin Briggs coined the term FreshRank for this.

“Document S may be considered fresh if n% of the links to S are fresh or if the documents containing forward links to S are considered fresh.”

8. Traffic and engagement metrics may signal freshness

When Google presents a list of search results to users, the results the users choose and how much time they spend on each one can be used as an indicator of freshness and relevance.

For example, if users consistently click a search result further down the list, and they spend much more time engaged with that page than the other results, this may mean the result is more fresh and relevant.

“If a document is returned for a certain query and over time, or within a given time window, users spend either more or less time on average on the document given the same or similar query, then this may be used as an indication that the document is fresh or stale, respectively.”

You might interpret this to mean that click-through rate is a ranking factor, but that’s not necessarily the case. A more nuanced interpretation might say that the increased clicks tell Google there is a hot interest in the topic, and this page — and others like it — happen to match user intent.

For a more detailed explanation of this CTR phenomenon, I highly recommend reading Eric Enge’s excellent article about CTR as a ranking factor.

9. Changes in anchor text may devalue links

If the subject of a web page changes dramatically over time, it makes sense that any new anchor text pointing to the page will change as well.

For example, if you buy a domain about racing cars, then change the format to content about baking, over time your new incoming anchor text will shift from cars to cookies.

In this instance, Google might determine that your site has changed so much that the old anchor text is now stale (the opposite of fresh) and devalue those older links entirely.

“The date of appearance/change of the document pointed to by the link may be a good indicator of the freshness of the anchor text based on the theory that good anchor text may go unchanged when a document gets updated if it is still relevant and good.”

The lesson here is that if you update a page, don’t deviate too much from the original context or you may risk losing equity from your pre-existing links.

10. Older is often better

Google understands the newest result isn’t always the best. Consider a search query for “Magna Carta.” An older, authoritative result may be best here.

In this case, having a well-aged document may actually help you.

Google’s patent suggests they determine the freshness requirement for a query based on the average age of documents returned for the query.

“For some queries, documents with content that has not recently changed may be more favorable than documents with content that has recently changed. As a result, it may be beneficial to adjust the score of a document based on the difference from the average date-of-change of the result set.”

A good way to determine this is to simply Google your search term, and gauge the average inception age of the pages returned in the results. If they all appear more than a few years old, a brand-new fresh page may have a hard time competing.
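
If you’d rather script that gut-check than do it by hand, here’s a tiny sketch. The result dates are hypothetical and would have to come from your own SERP research:

```python
from datetime import date

def average_age_years(result_dates: list[date], today: date) -> float:
    """Mean age, in years, of the pages ranking for a query."""
    total_days = sum((today - d).days for d in result_dates)
    return total_days / len(result_dates) / 365.0

# Hypothetical inception dates for the top results of an "aged" query:
serp_dates = [date(2009, 3, 1), date(2011, 7, 15), date(2013, 1, 20)]
print(f"{average_age_years(serp_dates, date(2016, 6, 1)):.1f} years")  # ~5.2
# Several years old on average: a brand-new page may struggle to compete here.
```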

Freshness best practices

The goal here shouldn’t be to update your site simply for the sake of updating it and hoping for better ranking. If this is your practice, you’ll likely be frustrated with a lack of results.

Instead, your goal should be to update your site in a timely manner that benefits users, with an aim of increasing clicks, user engagement, and fresh links. These are the clearest signals you can pass to Google to show that your site is fresh and deserving of high rankings.

Aside from updating older content, other best practices include:

  1. Create new content regularly.
  2. When updating, focus on core content, and not unimportant boilerplate material.
  3. Keep in mind that small changes may be ignored. If you’re going to update a link, you may consider updating all the text around the link.
  4. Steady link growth is almost always better than spiky, inconsistent link growth.
  5. All other things being equal, links from fresher pages likely pass more value than links from stale pages.
  6. Engagement metrics are your friend. Work to increase clicks and user satisfaction.
  7. If you change the topic of a page too much, older links to the page may lose value.

Updating older content works amazingly well when you also earn fresh links to the content. A perfect example of this is when Geoff Kenyon updated his Technical Site Audit Checklist post on Moz. You can see the before and after results below:

[Image: before and after ranking results]

Be fresh.

Be relevant.

Most important, be useful.



Predicting Intent: What Unnatural Outbound Link Penalties Could Mean for the Future of SEO

Posted by Angular

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of Moz, Inc.

[Estimated read time: 8 minutes]

As SEOs, we often find ourselves facing new changes implemented by search engines that impact how our clients’ websites perform in the SERPs. With each change, it’s important that we look beyond its immediate impact and think about its future implications so that we can try to answer this question: “If I were Google, why would I do that?”

Recently, Google implemented a series of manual penalties that affected sites deemed to have unnatural outbound links. Webmasters of affected sites received messages like this in Google Search Console:

[Image: Google Search Console message about unnatural outbound links]

Webmasters were notified in an email that Google had detected a pattern of “unnatural artificial, deceptive, or manipulative outbound links.” The manual action itself described the link as being either “unnatural or irrelevant.”

The responses from webmasters varied in their usual extreme fashion, with recommendations ranging from “do nothing” to “nofollow every outbound link on your site.”

Google’s John Mueller posted in product forums that you don’t need to nofollow every link on your site, but you should focus on nofollowing links that point to a product, sales, or social media page as the result of an exchange.

Now, on to the fun part of being an SEO: looking at a problem and trying to reverse-engineer Google’s intentions to decipher the implications this could have on our industry, clients, and strategy.

The intent of this post is not to dismiss the opinion that this action specifically targeted bloggers who placed dofollow links in product/business reviews, but to present a few ideas to spark discussion about the potential big-picture strategy that could be at play here.

A few concepts that influenced my thought process are as follows:

  • Penguin has repeatedly missed its “launch date,” which indicates that Google engineers don’t feel it’s accurate enough to release into the wild.
  • The growth of negative SEO makes it even more difficult for Google to identify/penalize sites for tactics that are not implemented on their own websites.
  • Penguin temporarily impacted link-building markets in a way Google would want. The decline reached its plateau in July 2015, as shown in this graph from Google Trends:
    [Image: Google Trends graph showing declining interest in “link building”]

If I were Google, I would expect webmasters impacted by the unnatural outbound links penalty to respond in one of these ways:

  1. Do nothing. The penalty is specifically stated to “discount the trust in links on your site.” As a webmaster, do you really care if Google trusts the outbound links on your site or not? What if the penalty does not impact your traffic, rankings, visibility, etc.? What incentive do you have to take any action? Even if you sell links, if the information is not publicly displayed, this does nothing to harm your link-selling business.
  2. Innocent site cleanup effort. A legitimate site that has not exchanged goods for links (or wants to pretend they haven’t) would simply go through their site and remove any links that they feel may have triggered the issue and then maybe file a bland reconsideration request stating as much.
  3. Guilty site cleanup effort. A site that has participated in link schemes would know exactly which links are the offenders and remove them. Now, depending on the business owner, some might then file a reconsideration request saying, “I’m sorry, so-and-so paid me to do it, and I’ll never do it again.” Others may simply state, “Yes, we have identified the problem and corrected it.”

In scenario No. 1, Google wins because this helps further the fear, uncertainty, and doubt (FUD) campaigns around link development. It is suddenly impossible to know if a site’s outbound links have value, because they may have a penalty preventing them from passing value. So link building not only continues to carry the risk of creating a penalty on your site, but it suddenly becomes more obvious that you could exchange goods/money/services for a link that has no value despite its MozRank or any other external “ranking” metric.

In scenarios No. 2 and No. 3, Google wins because they can monitor the links that have been nofollowed/removed and add potential link scheme violators to training data.

In scenario No. 3, they may be able to procure evidence of sites participating in link schemes through admissions by webmasters who sold the links.

If I were Google, I would really love to have a control group of known sites participating in link schemes to further develop my machine-learned algorithm for detecting link profile manipulation. I would take the unnatural outbound link data from scenario No. 3 above and run those sites as a data set against Penguin to attempt 100% confidence, knowing that all those sites definitely participated in link schemes. Then I would tweak Penguin with this training dataset and issue manual actions against the linked sites.

This wouldn’t be the first time SEOs have predicted a Google subtext of leveraging webmasters and their data to help further develop its algorithms for link penalties. In 2012, the SEO industry was skeptical about the disavow tool, questioning whether Google was crowdsourcing webmasters for its spam team.


“Clearly there are link schemes that cannot be caught through the standard algorithm. That’s one of the reasons why there are manual actions. It’s within the realm of possibilities that disavow data can be used to confirm how well they’re catching spam, as well as identifying spam they couldn’t catch automatically. For example, when web publishers disavow sites that were not caught by the algorithm, this can suggest a new area for quality control to look into.” — Roger Montti, Martinibuster.com


What objectives could the unnatural outbound links penalties accomplish?

  1. Legit webmasters could become more afraid to sell/place links because they get “penalized.”
  2. Spammy webmasters could continue selling links from their penalized sites, which would add to the confusion and devaluation of link markets.
  3. Webmasters could become afraid to buy/exchange links because they could get scammed by penalized sites and be more likely to be outed by the legitimate sites.
  4. The Penguin algorithm could have increased confidence scoring and become ready for real-time implementation.

“There was a time when Google would devalue the PR of a site that was caught selling links. With that signal gone, and Google going after outbound links, it is now more difficult than ever to know whether a link acquired is really of value.” — Russ Jones, Principal Search Scientist at Moz


Again, if I were Google, the next generation of Penguin would likely heavily weight irrelevantly placed links, and not just commercial keyword-specific anchor text. Testing this first on the sites I think are guilty of providing the links, and simply devaluing those links, seems much smarter. Of course, at this point, there is no specific evidence to indicate that the unnatural outbound links penalties were intended as a final testing phase for Penguin, or as a way to further devalue the manipulated link market. But if I were Google, that’s exactly what I would be doing.



“Gone are the days of easily repeatable link building strategies. Acquiring links shouldn’t be easy, and Penguin will continue to change the search marketing landscape whether we like it or not. I, for one, welcome our artificially intelligent overlords. Future iterations of the Penguin algorithm will further solidify the “difficulty level” of link acquisition, making spam less popular and forcing businesses toward legitimate marketing strategies.” — Tripp Hamilton, Product Manager at Removeem.com

Google’s webmaster guidelines show that link schemes are interpreted by intent. I wonder what happens if I start nofollowing links from my site with the intent of devaluing a site’s rankings? The intent is manipulation. Am I at risk of being considered a participant in link schemes? If I do link building as part of an SEO campaign, am I inherently conducting a link scheme?

[Image: Google Webmaster Guidelines entry on link schemes]

So, since I’m an SEO, not Google, I have to ask myself and my colleagues, “What does this do to change or reinforce my SEO efforts?” I immediately think back to a Whiteboard Friday from a few years ago that discusses the Rules of Link Building.


“At its best, good link building is indistinguishable from good marketing.” — Cyrus Shepard, former Content Astronaut at Moz



When asked what type of impact SEOs should expect from this, Garret French from Citation Labs shared:



“Clearly this new effort by Google will start to dry up the dofollow sponsored post, sponsored review marketplace. Watch for prices to drop over the next few months and then go back and test reviews with nofollowed links to see which ones actually drive converting traffic! If you can’t stomach paying for nofollowed links then it’s time to get creative and return to old-fashioned, story-driven blog PR. It doesn’t scale well, but it works well for natural links.”

In conclusion, as SEOs, we are responsible for predicting the future of our industry. We do not simply act in the present. Google does not wish for its results to be gamed, and it has departments full of data scientists dedicated to building algorithms that identify and devalue manipulative practices. If you are incapable of legitimately building links, then you must mimic legitimate links in all aspects (or consider a new career).

Takeaways

Most importantly, any links that we try to build must provide value. If a URL links to a landing page that is not contextually relevant to its source page, then this irrelevant link is likely to be flagged and devalued. Remember, Google can do topical analysis, too.

In link cleanup mode or Penguin recovery, we’ve typically approached unnatural links as being obvious when they have a commercial keyword (e.g. “insurance quotes”) because links more naturally occur with the URL, brand, or navigational labels as anchor text. It would also be safe to assume that natural links tend to occur in content about the destination the link offers and that link relevance should be considered.

Finally, we should continue to identify and present clients with methods for naturally building authority by providing value in what they offer and working to build real relationships and brand advocates.

What are your thoughts? Do you agree? Disagree?



Long Tail SEO: When & How to Target Low-Volume Keywords – Whiteboard Friday

Posted by randfish

The long tail of search can be a mysterious place to explore, often lacking the volume data that we usually rely on to guide us. But the keyword phrases you can uncover there are worth their weight in gold, often driving highly valuable traffic to your site. In this edition of Whiteboard Friday, Rand delves into core strategies you can use to make long tail keywords work in your favor, from niche-specific SEO to a bigger content strategy that catches many long tail searches in its net.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about long tail SEO.

Now, for those of you who might not be familiar, there’s basically a demand curve in the search engine world. Lots and lots of searchers are searching for very popular keywords in the NBA world like “NBA finals.” Then we have a smaller number of folks who are searching for “basketball hoops,” but it’s still pretty substantial, right? Probably hundreds to thousands per month. Then maybe there are only a few dozen searches a month for something like “Miami Heat box ticket prices.”

Then we get into the very long tail, where there are one, two, maybe three searches a month, or maybe not even. Maybe it’s only a few searches per year for something like “retro Super Sonics customizable jersey Seattle.”

Now, it’s pretty tough to do keyword research anywhere in this long tail region. The long tail region is almost a mystery to us, because the search engines themselves don’t get enough volume to show these terms in a tool like AdWords or in Bing’s research. Even Search Suggest or related searches will often not surface these kinds of terms and phrases. They just don’t get enough volume. But for many businesses, and yours may be one of them, these keywords are actually quite valuable.

2 ways to think about long tail keyword targeting

#1: I think that there’s this small set of hyper-targeted, specific keyword terms and phrases that are very high value to my business. I know they’re not searched for very much, maybe only a couple of times a month, maybe not even that. But when they are, if I can drive the search traffic to my website, it’s hugely valuable to me, and therefore it’s worth pursuing a handful of these. A handful could be half a dozen, or it could be in the small hundreds of terms that you decide are worth going after even though they have a very small number of keyword searches. Remember, if we were to build 50 landing pages targeting terms that only get one or two searches a month, we still might get a hundred or a couple hundred searches every year coming to our site that are super valuable to the business. So in general, when we’re doing this hyper-specific targeting, these terms need to be…

  • Conversion-likely, meaning that we know we’re going to convert those searchers into buyers, or into whatever else we need them to do, if we can get them to our site.
  • They should be very low competition, because not a lot of people know about these keywords. There’s not a bunch of sites targeting them already. There are no keyword research tools out there that are showing this data.
  • It should be a relatively small number of terms that we’re targeting. Like I said, maybe a few dozen, maybe a couple hundred, generally not more than that.
  • We’re going to try and build specifically optimized pages to turn those searchers into customers or to serve them in whatever way we need.

#2: The second way is to have a large-scale sort of blast approach, where we’re less targeted with our content, but we’re covering a very wide range of keyword targets. This is what a lot of user-generated content sites, large blogs, and large content sites are doing with their work. Maybe they’re doing some specific keyword targeting, but they’re also kind of trying to reach this broad group of long tail keywords that might be in their niche. It tends to be the case that there’s…

  • A ton of content being produced.
  • It’s less conversion-focused in general, because we don’t know the intent of all these searchers, particularly on the long tail terms.
  • We are going to be targeting a large number of terms here.
  • There are no specific keyword targets available. So, in general, we’re focused more on the content itself and less on the specificity of that keyword targeting.

Niche + specific long tail SEO

Now, let’s start with the niche and specific. The way I’m going to think about this is I might want to build these pages — my retro Super Sonics jerseys that are customizable — with my:

  • Standard on-page SEO best practices.
  • I’m going to do my smart internal linking.
  • I really don’t need very many external links. One or two will probably do it. In fact, a lot of times, when it comes to long tail, you can rank with no external links at all, internal links only.
  • Quality content investment is still essential. I need to make sure that this page gets indexed by Google, and it has to do a great job of converting visitors. So it’s got to serve the searcher intent. It can’t look like automated content, it can’t look low quality, and it certainly can’t dissuade visitors from coming, because then I’ve wasted all the investment that I’ve made getting that searcher to my page. Especially since there are so few of them, I better make sure this page does a great job.

A) PPC is a great way to go. You can do a broad-term PPC buy in AdWords or in Bing, and then discover these hyper-specific opportunities. So if I’m buying keywords like “customizable jerseys,” I might see that, sure, most of them are for teams and sports that I’ve heard of, but there might be some that come to me that are very, very long tail. This is actually a reason why you might want to do those broad PPC buys for discovery purposes, even if the ROI isn’t paying off inside your AdWords campaign. You look and you go, “Hey, it doesn’t pay to do this broad buy, but every week we’re discovering new keywords for our long tail targeting that does make it worthwhile.” That can be something to pay attention to.

B) You can use some keyword research tools, just not AdWords itself, because AdWords is biased toward showing you more commercial terms, and toward terms and phrases that actually have search volume. What you want to do is find keyword research tools that can show you keywords with zero searches, no search volume at all. So you could use something like Moz’s Keyword Explorer. You could use KeywordTool.io. You could use Übersuggest. You could use some of the keyword research tools from the other providers out there, like a Searchmetrics or what have you. With all of these kinds of tools, what you want to find are those 0–10 searches keywords, because those are going to be the ones that have very, very little volume but potentially are super high-value for your specific website or business.

C) Be aware that keyword difficulty scores may not actually be that useful in these cases. Keyword difficulty scores — this is true for Moz’s keyword difficulty score and for all the other tools that do keyword difficulty — tend to look at a search result and then say, “How many links, or how high is the domain authority and page authority, or all the link metrics that point to these 10 pages?” The problem is that in a set where very few people are doing very specific keyword targeting, you could have powerful pages that are not actually optimized for these keywords and aren’t really relevant, and therefore it might be much easier to rank than the keyword difficulty score suggests. So my advice is to look at the keyword targeting to spot the opportunity. If you see that none of the 10 pages actually includes all the keywords, or only one of them seems to actually serve the searcher intent for these long tail keywords, you’ve probably found yourself a great long tail SEO opportunity.
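
One rough way to script that check: count how many of the top result titles contain every word of the phrase. The titles below are hypothetical stand-ins; in practice you’d pull them from your own rank-tracking or SERP exports:

```python
def titles_targeting_keyword(keyword: str, result_titles: list[str]) -> int:
    """Count result titles containing every word of the keyword.

    A low count hints that the difficulty score may overstate the real
    competition for a long tail term.
    """
    words = keyword.lower().split()
    return sum(1 for title in result_titles
               if all(word in title.lower() for word in words))

titles = [  # hypothetical SERP titles
    "Seattle SuperSonics | Official Team Store",
    "Custom Basketball Jerseys - Design Your Own",
    "Retro NBA Jerseys for Sale",
]
print(titles_targeting_keyword("retro supersonics customizable jersey", titles))  # 0
```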

Large-scale, untargeted long tail SEO

This is very, very different in approach. It’s going to be for a different kind of website, different application. We are not targeting specific terms and phrases that we’ve identified. We’re instead saying, “You know what? We want to have a big content strategy to own all types of long tail searches in a particular niche.” That could be educational content. It could be discussion content. It could be product content, where you’re supporting user-generated content, those kinds of things.

  • I want a bias to the uniqueness of the content itself and real searcher value, which means I do need content that is useful to searchers, useful to real people. It can’t be completely auto-generated.
  • I’m worrying less about the particular keyword targeting. I know that I don’t know which terms and phrases I’m going to be going after. So instead, I’m biasing to other things, like usefulness, amount of uniqueness of content, the quality of it, the value that it provides, the engagement metrics that I can look at in my analytics, all that kind of stuff.
  • You want to be careful here. Anytime you’re doing broad-scale content creation or enabling content creation on a platform, you’ve got to keep low-value, low-uniqueness content pages out of Google’s index. That can be done in two ways. One, you limit the system to only allow in certain amounts of content before a page can even be published. Or you look at the quantity of content that’s being created, or the engagement metrics from your analytics, and you essentially block — via robots.txt or via meta robots tag — any of the pages that look like they’re low-value, low-uniqueness content. (A rough sketch of this gating logic follows this list.)
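
Here’s a minimal sketch of that gating logic, choosing a meta robots value from content length and engagement. Both thresholds are invented for illustration and would need tuning against your own analytics:

```python
def robots_directive(word_count: int, monthly_engaged_visits: int,
                     min_words: int = 150, min_visits: int = 5) -> str:
    """Return a meta robots value for a user-generated page.

    Thin pages with no engagement are kept out of the index but still
    crawled, so their links continue to be followed.
    """
    if word_count < min_words and monthly_engaged_visits < min_visits:
        return "noindex, follow"
    return "index, follow"

# A two-line forum post with no engaged visits gets noindexed:
print(robots_directive(word_count=24, monthly_engaged_visits=0))  # noindex, follow
```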

A) This approach requires a lot of scalability, and so you need something like a:

  • Discussion forum
  • Q&A-style content
  • User-posted product or service or business listings. Think something like an Etsy or a GitHub or a Moz Q&A, discussion forums like Reddit. These all support user-generated content.
  • You can also go with non-UGC if it’s editorially created. Something like a frequently updated blog or news content, particularly if you have enough of a staff that can create that content on a regular basis so that you’re pumping out good stuff on a regular basis, that can also work. It’s generally not as scalable, but you have to worry less about the uniqueness and quality of the content.

B) You don’t want to fully automate this system. The worst thing you can possibly do is to take a site that has been doing well, pump out hundreds, thousands, tens of thousands of pages, throw them up on the site, they’re low-quality content, low uniqueness of content, and Google can hit you with something like the Panda penalty, which has happened to a lot of sites that we’ve seen over the years. They continue to iterate and refine that, so be very cautious. You need some human curation in order to make sure the uniqueness of content and value remain above the level you need.

C) If you’re going to be doing this large-scale content creation, I highly advise you to make the content management system or the UGC submission system work in your favor. Make it do some of that hard SEO legwork for you (a sketch of a few such checks follows this list), things like…

  • Nudging users to give more descriptive, more useful content when they’re creating it for you.
  • Require some minimum level of content in order to even be able to post it.
  • Use spam software to be able to catch and evaluate stuff before it goes into your system. If it has lots of links, if it contains poison keywords, spam keywords, kick it out.
  • Encourage and reward the high-quality contributions. If you see users or content that is consistently doing well through your engagement metrics, go find out who those users were, go reward them. Go promote that content. Push that to higher visibility. You want to make this a system that rewards the best stuff and keeps the bad stuff out. A great UGC content management system can do this for you if you build it right.
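
Pulling a few of those checks together, here’s a hedged sketch of a submission gate: a length floor, a link cap, and a poison-keyword filter. The limits and word list are placeholders, not recommendations:

```python
import re

POISON_KEYWORDS = {"viagra", "casino", "payday loans"}  # placeholder list

def accept_submission(text: str, min_words: int = 50, max_links: int = 2) -> bool:
    """Reject posts that are too short, too link-heavy, or contain spam terms."""
    if len(text.split()) < min_words:
        return False
    if len(re.findall(r"https?://", text)) > max_links:
        return False
    lowered = text.lower()
    return not any(term in lowered for term in POISON_KEYWORDS)
```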

All right, everyone, look forward to your thoughts on long tail SEO, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.


Video transcription by Speechpad.com



Diving for Pearls: A Guide to Long Tail Keywords – Next Level

Posted by jocameron

[Estimated read time: 15 minutes]

Welcome to the fifth installment of our educational Next Level series! Last time, we led you on a journey to perform a heroic site audit. This time around we’re diving into the subject of long tail keywords, equipping you with all the tools you’ll need to uncover buried treasure.


One of the biggest obstacles to driving forward your business online is being able to rank well for keywords that people are searching for. Getting your lovely URLs to show up in those precious top positions — and gaining a good portion of the visitors behind the searches — can feel like an impossible dream.

Particularly if you’re working on a newish site on a modest budget within a competitive niche.

Well, strap yourself in, because today we’re going to live that dream. I’ll take you through the bronze, silver, and gold levels of finding, assessing, and targeting long tail keywords so you can start getting visitors to your site that are primed and ready to convert.

So what the bloomin’ heck are long tail keywords?

The ‘long tail of search’ refers to the many weird and wonderful ways the diverse people of the world search for what they’re after in any given niche.

People (yes, people! Shiny, happy, everyday, run-of-the-mill, muesli-eating, bogie-picking, credit-card-toting people!) don’t just stop at searching broad and generic ‘head’ keywords, like “web design” or “camera” or “sailor moon.”

They clarify their search with emotional triggers and technical terms they’ve learned from reading forums, and they compare features and prices before mustering up the courage to commit and convert on your site.

The long tail is packed with searches like “best web designer in Nottingham” or “mirrorless camera 4k video 2016” or “sailor moon cat costume.”

This lovely chart visualizes the long tail of search by using the tried and tested “Internet loves cats + animated gifs are the coolest = SUCCESS” formula.

All along that tail are searches being constantly generated by people seeking answers from the Internet hive mind. There’s no end to what you’ll find if you have a good old rummage about, including questions, styles, colors, brands, concerns, peeves, desires, hopes, dreams… and everything in between.

Fresh, new, outrageous, often bizarre keywords. If you’ve done any keyword research you’ll know what I mean by bizarre. Things a person wouldn’t admit to their therapist, priest, or doctor they’ll happily pump into Google and hit search. And we’re going to go diving for pearls: keywords with searcher intent, high demand, low competition, and a spot on the SERPs just for you.

Bronze medal: Build your own keyword

It’s really easy to come up with a long tail keyword. You can use your brain, gather some thoughts, take a stab in the dark, and throw a few keyword modifiers around your head keyword.

Have you ever played with that magnetic fridge poetry game? It’s a bit like that. You can play online if (like me) you have an aversion to physical things.

I’m no poet, but I think I deserve a medal for this attempt, and now I really want some “hot seasonal berry water.”

Magnetic poetry not doing it for you? Don’t worry — that’s only the beginning.

Use your industry knowledge

Time to draw on that valuable industry knowledge you’ve been storing up, jot down some ideas, and think about intent and common misconceptions. I’m going to use “pearls” or “freshwater pearls” as the head term in this post because that’s something I’m interested in.

Let’s go!

How do I clean freshwater pearls

Ok, cool, adding to my list.

Search your keyword

Now you can get some more ideas by manually entering your keyword into Google and prompting it to give you popular suggestions, like I’ve done below:

Awesome, I’m adding “freshwater pearls price” to my list.

Explore the language of social media

Get amongst the over-sharers and have a look at what people are chatting about on social media by searching your keyword in Twitter, Instagram, and YouTube. These are topics in your niche that people are talking about right now.

Twitter and Instagram are proving tricky to explore for my head term because it’s jam-packed with people selling pearl jewelry.

Shout out to a cheeky Moz tool, Followerwonk, for helping with this stage. I’m searching Twitter bios to find Twitter accounts with “freshwater pearls.”

Click these handy little graph icons for a more in-depth profile analysis

I can now explore what they’re tweeting, I can follow them and find out who is engaging with them, and I can find their most important tweets. Pretty groovy!

YouTube is also pulling up some interesting ideas around my keyword. This is simultaneously helping me gather keyword ideas and giving me a good sense about what content is already out there. Don’t worry, we’ll touch on content later on in this post. 🙂

I’m adding “understanding types of pearls” and “difference between saltwater and freshwater pearls” to my list.

Ask keyword questions?

You’ll probably notice that I’ve added a question mark to a phrase that is not a question, just to mess with you all. Apologies for the confusing internal-reading-voice-upwards-inflection.

Questions are my favorite types of keywords. What!? You don’t have a fav keyword type? Well, you do now — trust me.

Answer the Public is packed with questions, and it has the added bonus of having this tooth-picking (not bogie-picking, thank goodness!) dude waiting for you to impress him.

So let’s give him something to munch on and pop freshwater pearls in there, too, then grab some questions for our growing list.

To leave no rock unturned (or no mollusk unshucked), let's pop over to Google Search Console to find keywords that are already sending you traffic (and discover any mismatches between your content and user intent).

Pile these into a list, like I’ve done in this spreadsheet.
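If you'd rather pull those queries out of Search Console programmatically and pipe them straight into that spreadsheet, the Search Console API exposes the same data. Here's a rough sketch using the google-api-python-client library; the property URL and key file below are placeholders, and it assumes you've already set up API access for your site:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials: a service account key with read access to the
# Search Console property. Swap in your own key file and property URL.
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("webmasters", "v3", credentials=credentials)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",  # placeholder property
    body={
        "startDate": "2016-01-01",
        "endDate": "2016-03-31",
        "dimensions": ["query"],
        "rowLimit": 100,
    },
).execute()

# Each row carries the query plus its clicks and impressions for the period.
for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])
```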

Now this is starting to look interesting: we’ve got some keyword modifiers, some clear buying signals, and a better idea of what people might be looking for around “freshwater pearls.”

Should you stop there? I’m flabbergasted — how can you even suggest that?! This is only the beginning. 🙂

Silver medal: Assess demand and explore topics

So far, so rosy. But we’ve been focusing on finding keywords, picking them up, and stashing them in our collection like colored glass at the seaside.

To really dig into the endless tail of your niche, you'll need a keyword tool like our very own Keyword Explorer (KWE for short). It's invaluable for finding topics within your niche that present a real opportunity for your site.

If you’re trying out KWE for the first time, you get 2 searches free per day without having to log in, but you get a few more with your Community account and even more with a Moz Pro subscription.

Find search volume for your head keyword

Let’s put “pearls” into KWE. Now you can see how many times it’s being searched per month in Google:

Now try “freshwater pearls.” As expected, the search volume goes down, but we’re getting more specific.

We could keep going like this, but we’re going to burn up all our free searches. Just take it as read that, as you get more specific and enter all the phrases we found earlier, the search volume will decrease even more. There may not be any data at all. That’s why you need to explore the searches around this main keyword.

Find even more long tail searches

Below the search volume, click on “Keyword Suggestions.”

Well, hi there, ever-expanding long tail! We’ve gone from a handful of keywords pulled together manually from different sources to 1,000 suggestions right there on your screen. Positioned right next to that we have search volume to give us an idea of demand.

The diversity of searches within your niche is just as important as that big number we saw at the beginning, because it shows you how much demand there is for this niche as a whole. We’re also learning more about searcher intent.

I'm scanning through those 1,000 suggestions, looking for other terms that pop up again and again. I'm also looking for intent signals and the different ways words are being used, so I can pick out terms to expand my list.

I like to toggle between sorting by relevancy and search volume, and then scroll through all the results to cherry-pick those that catch my eye.

Now reverse the volume filter so that it's showing lower-volume search terms, and scroll through to the end of the tail to explore the lower-volume chatter.
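By the way, if you export those suggestions to CSV, you can do this same sorting and flipping offline. A small sketch, assuming columns named "Keyword" and "Volume"; adjust the names to match whatever your export actually contains:

```python
import csv

with open("kwe_suggestions.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Treat missing or blank volume cells as zero so they don't crash the sort.
def volume(row):
    try:
        return int(row["Volume"])
    except (KeyError, ValueError):
        return 0

# The high-volume head of the tail first...
for row in sorted(rows, key=volume, reverse=True)[:10]:
    print(row["Keyword"], volume(row))

# ...then flip the sort to explore the low-volume chatter.
for row in sorted(rows, key=volume)[:10]:
    print(row["Keyword"], volume(row))
```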

This is where your industry knowledge comes into play again. Bots, formulas, spreadsheets, and algorithms are all well and good, but don’t discount your own instincts and knowledge.

Use the suggestions filters to your advantage and play around with broader or more specific suggestion types. Keyword Explorer pulls together suggestions from AdWords, autosuggest, related searches, Wikipedia titles, topic modeling extractions, and SERPscape.

Looking through the suggestions, I’ve noticed that the word “cultured” has popped up a few times.

To see these all bundled together, I want to look at the grouping options in KWE. I like grouping by high lexical similarity so I can see how much discussion is going on within my topics.

Scroll down and expand that group to get an idea of demand and assess intent.

I’m also interested in the words around “price” and “value,” so I’m doing the same and saving those to my sheet, along with the search volume. A few attempts at researching the “cleaning” of pearls wasn’t very fruitful, so I’ve adjusted my search to “clean freshwater pearls.”

Because I’m a keyword questions fanatic, I’m also going to filter by questions (the bottom option from the drop-down menu):

OK! How is our list looking? Pretty darn hot, I reckon! We’ve gathered together a list of keywords and dug into the long tail of these sub-niches, and right alongside we’ve got search volume.

You’ll notice that some of the keywords I discovered in the bronze stage don’t have any data showing up in KWE (indicated by the hyphen in the screenshot above). That’s ok — they’re still topics I can research further. This is exactly why we have assessed demand; no wild goose chase for us!

Ok, we’re drawing some conclusions, we’re building our list, and we’re making educated decisions. Congrats on your silver-level keyword wizardry! 😀

Gold medal: Find out who you’re competing with

We're not operating in a vacuum. There's always someone out there trying to elbow their way onto the first page. Don't fall into the trap of thinking that just because it's a long tail term with a nice chunk of search volume, all those clicks will rain down on you. If the terms you're looking to target already have big names headlining, this could very well alter your roadmap.

To reap the rewards of targeting the long tail, you’ll have to make sure you can outperform your competition.

Manually check the SERPs

Check out who's showing up in the search engine results pages (SERPs) by running a search for your head term. Make sure you're signed out of Google and in an incognito tab.

We’re focusing on the organic results to find out if there are any weaker URLs you can pick off.

I’ll start with “freshwater pearls” for illustrative purposes.

Whoooaaa, this is a noisy page. I’ve had to scroll a whole 2.5cm on my magic mouse (that’s very nearly a whole inch for the imperialists among us) just to see any organic results.

Let's install the MozBar to discover some metrics on the fly, like domain authority and backlink data.

Now, if seeing those big players in the SERPs doesn't make it clear, looking at the MozBar metrics certainly does. This is exclusive real estate. It's dominated by retailers, although Wikipedia gets a place in the middle of the page.

Let’s get into the mind of Google for a second. It — or should I say “they” (I can’t decide if it’s more creepy for Google to be referred to as a singular or plural pronoun. Let’s go with “they”) — anyway, I digress. “They” are guessing that we’re looking to buy pearls, but they’re also offering results on what they are.

This sort of information is offered up by big retailers who have created content that targets the intention of searchers. Mikimoto drives us to their blog post all about where freshwater pearls come from.

As you get deeper into the long tail of your niche, you’ll begin to come across sites you might not be so familiar with. So go and have a peek at their content.

With a little bit of snooping you can easily find out:

  • how relevant the article is
  • whether it looks appealing, up to date, and shareable (go on, be really judge-y; why not?)

Now let’s find some more:

  • when the article was published
  • when their site was created
  • how often their blog is updated
  • how many other sites are linking to the page with Open Site Explorer
  • how many tweets, likes, and shares it has picked up

You can also pop your topic into Moz Content to see how other articles are performing in your niche. I talk about competitor analysis a bit more in my Bonnie Tyler Site Audit Manifesto, so check it out.

Document all of your findings in our spreadsheet from earlier to keep track of the data. This information will tell you how good your chances are of ranking for that term.
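If you're checking more than a handful of competitors, a tiny script can keep those records consistent. This sketch appends rows to a CSV; the field names are just my hypothetical suggestions, so shape them to match your own spreadsheet:

```python
import csv

# Hypothetical findings from manually eyeballing a competitor's page.
findings = [
    {
        "keyword": "freshwater pearls",
        "url": "https://www.example.com/freshwater-pearls",
        "published": "2015-06-12",
        "relevance": "high",
        "linking_sites": 34,
        "notes": "Big retailer, fresh content, strong social shares",
    },
]

with open("competitor_research.csv", "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(findings[0].keys()))
    if f.tell() == 0:  # only write the header row on the first run
        writer.writeheader()
    writer.writerows(findings)
```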

Manually checking out your competition is something I'd strongly recommend. But we don't have all the time in the world to check every results page for every keyword we're interested in.

Keyword Explorer leaps to our rescue again

Run your search and click on “SERP Analysis” to see what the first page looks like, along with authority metrics and social activity.


All the metrics for the organic results, like Page Authority, go into calculating the Difficulty score above (lower is better).

And all those other factors — the ads and suggestions taking up space on the SERPs — that’s what’s used to calculate Opportunity (higher is better).

Potential tallies the other metrics (volume, Difficulty, and Opportunity) up into a single score. You definitely want this one to be higher.

So now we have 3 important numerical values we can use to gauge our competition. We can use these values to compare keywords.
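To make that concrete, here's a little sketch that ranks a handful of keywords using those three values. The numbers are invented for illustration, and the tie-breaking is my own choice, not how KWE itself blends these metrics:

```python
# Hypothetical scores on 0-100 scales; lower Difficulty is better,
# higher Opportunity and Potential are better.
keywords = [
    {"keyword": "freshwater pearls",
     "difficulty": 61, "opportunity": 52, "potential": 70},
    {"keyword": "how much are freshwater pearls worth",
     "difficulty": 45, "opportunity": 65, "potential": 58},
    {"keyword": "difference between freshwater and cultured pearls",
     "difficulty": 38, "opportunity": 57, "potential": 55},
]

# Sort by Potential (descending), breaking ties with lower Difficulty.
ranked = sorted(keywords, key=lambda k: (-k["potential"], k["difficulty"]))

for k in ranked:
    print(f'{k["keyword"]}: potential {k["potential"]}, '
          f'difficulty {k["difficulty"]}, opportunity {k["opportunity"]}')
```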

After a few searches in KWE, you’re going to start hankering for a keyword list or two. For this you’ll need a paid subscription, or a Moz Pro 30-day free trial.

It's well worth the sign-up: not only do you get 5,000 keyword reports per month and 30 lists (on the Medium plan), but you also get to check out the super-magical-KWE-mega-list-funky-cool metric page. That's what I call it; just rolls off the tongue, you know?

Ok, fellow list buddies, let’s go and add those terms we’re interested in to our lovely new list.

Then head up to your lists on the top right and open up the one you just created.

Now we can see the spread of demand, competition, and SERP features for our whole list.

You can compare Volume, SERP features, Difficulty, Opportunity, and Potential across multiple lists, topics, and niches.

How to compare apples with apples

Comparing keywords is something we get asked about quite a bit on the Moz Help Team.

Should I target this word or that word?

For the long tail keyword, the Volume is a lot lower, Difficulty is also down, the Opportunity is a bit up, and overall the Potential is down because of the drop in search volume.

But don’t discount it! By targeting these sorts of terms, you’re focusing more on the intent of the searcher. You’re also making your content relevant for all the other neighboring search terms.

Let's compare "difference between freshwater and cultured pearls" with "how much are freshwater pearls worth."

Search volume is the same, but for "how much are freshwater pearls worth," Difficulty is up. So is the overall Potential, though, because the Opportunity is higher.

But just because you’re picking between two long tail keywords doesn’t mean you’ve fully understood the long tail of search.

You know all those keywords I grabbed for my list earlier in this post? Well, here they are sorted into topics.

Look at all the different ways people search for what is essentially the same thing. This is what drives the long tail of search: searcher diversity. If you tally up all the volume for the "cultured" topic, we've got a bigger group of keywords and more search volume overall. This is where you can use Keyword Explorer and the long tail to make informed decisions.
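Tallying up a topic's combined volume is easy to automate once your list is sorted into groups. A quick sketch with placeholder volumes:

```python
from collections import defaultdict

# (topic, keyword, monthly volume); the volumes here are placeholders.
keywords = [
    ("cultured", "cultured freshwater pearls", 480),
    ("cultured", "what are cultured pearls", 320),
    ("cultured", "cultured vs freshwater pearls", 170),
    ("value", "how much are freshwater pearls worth", 210),
    ("value", "freshwater pearls price", 390),
]

# Sum each topic's volume across all of its keyword variations.
topic_volume = defaultdict(int)
for topic, keyword, volume in keywords:
    topic_volume[topic] += volume

# The topic with the most combined demand across its group comes out on top.
for topic, total in sorted(topic_volume.items(), key=lambda t: -t[1]):
    print(f"{topic}: {total} searches/month across the group")
```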

You’re laying out your virtual welcome mat for all the potential traffic these terms send.

Platinum level: I lied — there’s one more level!

For all you lovely overachievers out there who have reached the end of this post, I’m going to reward you with one final tip.

You’ve done all the snooping around on your competitors, so you know who you’re up against. You’ve done the research, so you know what keywords to target to begin driving intent-rich traffic.

Now you need to create strong, consistent, and outstanding content. For the best explanation on how and why you must do this, you can’t go past Rand’s 10x Whiteboard Friday.

Here’s where you really have to tip your hat to long tail keywords, because by targeting the long tail you can start to build enough authority in the industry to beat stronger competition and rank higher for more competitive keywords in your niche.

Wrapping up…

The pool of keyword phrases that makes up the long tail of your industry is vast. These phrases are often easier to rank for and indicate stronger intent from the searcher. By targeting them, you'll find you can start to rank for relevant phrases sooner than if you just targeted the head. And over time, if you get the right signals, you'll be able to rank for keywords with tougher competition. Pretty sweet, huh? Give our Keyword Explorer tool a whirl and let me know how you get on 🙂


