Google’s War on Data and the Clickstream Revolution

Posted by rjonesx.

Existential threats to SEO

Rand called “Not Provided” the First Existential Threat to SEO in 2013. While 100% Not Provided was certainly one of the largest and most egregious data grabs by Google, it was part of a long and continued history of Google pulling data sources which benefit search engine optimizers.

A brief history

  1. Nov 2010 – Deprecate search API
  2. Oct 2011 – Google begins Not Provided
  3. Feb 2012 – Sampled data in Google Analytics
  4. Aug 2013 – Google Keyword Tool closed
  5. Sep 2013 – Not Provided ramped up
  6. Feb 2015 – Link Operator degraded
  7. Jan 2016 – Search API killed
  8. Mar 2016 – Google ends Toolbar PageRank
  9. Aug 2016 – Keyword Planner restricted to paid

I don’t intend to say that Google made any of these decisions specifically to harm SEOs, but that the decisions did harm SEO is inarguable. In our industry, like many others, data is power. Without access to SERP, keyword, and analytics data, our industry’s collective judgment is clouded. A recent survey of SEOs showed that data is more important to them than ever, despite these data retractions.

So how do we proceed in a world in which we need data more and more but our access is steadily restricted by the powers that be? Perhaps we have an answer — clickstream data.

What is clickstream data?

First, let’s give a quick definition of clickstream data for those who aren’t yet familiar with it. The most straightforward definition I’ve seen is:

“The process of collecting, analyzing, and reporting aggregate data about which pages users visit in what order.”
– (TechTarget: What is Clickstream Analysis)

If you’ve spent any time analyzing your funnel or looking at how users move through your site, you have utilized clickstream data in performing clickstream analysis. However, traditionally, clickstream data is restricted to sites you own. But what if we could see how users behave across the web — not just our own sites? What keywords they search, what pages they visit, and how they navigate the web? With that data, we could begin to fill in the data gaps previously lost to Google.

I think it’s worthwhile to point out the concerns presented by clickstream data. As a webmaster, you must be thoughtful about what you do with user data. You have access to the referrers which brought visitors to your site, you know what they click on, you might even have usernames, emails, and passwords. In the same manner, being vigilant about anonymizing data and excluding personally identifiable information (PII) has to be the first priority in using clickstream data. Moz and our partners take this seriously, including our latest partner Jumpshot, whose algorithms for removing PII are industry-leading.

What can we do?

So let’s have some fun, shall we? Let’s start to talk about all the great things we can do with clickstream data. Below, I’ll outline a half dozen or so insights we’ve gleaned from clickstream data that are relevant to search marketers and Internet users in general. First, let me give credit where credit is due — the data for these insights come from two excellent partners: Clickstre.am and Jumpshot.

Popping the filter bubble

It isn’t very often that the interests of search engine marketers and social scientists intersect, so this is a rare opportunity for me to blend my career with my formal education. Search engines like Google personalize results in a number of ways. We regularly see personalization of search results in the form of geolocation, previous sites visited, or even SERP features tailored to things Google knows about us as users. One question posed by social scientists is whether this personalization creates a filter bubble, where users only see information relevant to their interests. Of particular concern is whether this filter bubble could influence important informational queries like those related to political candidates. Does Google show uniform results for political candidate queries, or do they show you the results you want to see based on their personalization models?

Well, with clickstream data we can answer this question quite clearly by looking at the number of unique URLs which users click on from a SERP. Personalized keywords should result in a higher number of unique URLs clicked, as users see different URLs from one another. We randomly selected 50 search-click pairs (a searched keyword and the URL the user clicked on) for the following keywords to get an idea of how personalized the SERPs were.

  1. Dropbox – 10
  2. Google – 12
  3. Donald Trump – 14
  4. Hillary Clinton – 14
  5. Facebook – 15
  6. Note 7 – 16
  7. Heart Disease – 16
  8. Banks Near Me – 107
  9. Landscaping Company – 260

As you can see, highly personalized keywords like “banks near me” or “landscaping company” — which depend on location — receive a large number of unique URLs clicked. This is to be expected and validates the model to a degree. However, candidate names like “Hillary Clinton” and “Donald Trump” are personalized no more than major brands like Dropbox, Google, or Facebook, or products like the Samsung Note 7. It appears that the hypothetical filter bubble has burst — most users see the exact same results as one another.
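If you have access to raw search-click pairs, the measurement itself is only a few lines of Python. This is a minimal sketch under my own assumptions: the pairs argument (an iterable of (keyword, clicked URL) tuples) is a stand-in for whatever format your clickstream provider actually returns.

import random

def personalization_score(pairs, keyword, sample_size=50, seed=1):
    """Count unique clicked URLs in a random sample of search-click pairs.

    pairs: iterable of (searched_keyword, clicked_url) tuples from
    clickstream data (hypothetical format). More unique URLs suggests
    a more personalized or localized SERP for the keyword.
    """
    clicks = [url for kw, url in pairs if kw == keyword]
    random.seed(seed)
    sample = random.sample(clicks, min(sample_size, len(clicks)))
    return len(set(sample))

# e.g., personalization_score(pairs, "banks near me") should come back
# far higher than personalization_score(pairs, "dropbox").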

Biased search behavior

But is that all we need to ask? Can we learn more about the political behavior of users online? It turns out we can. One of the truly interesting features of clickstream data is the ability to do “also-searched” analysis. We can look at clickstream data and determine whether or not a person or group of people are more likely to search for one phrase or another after first searching for a particular phrase. We dove into the clickstream data to see if there were any material differences between subsequent searches of individuals who looked for “donald trump” and “hillary clinton,” respectively. While the majority of the subsequent searches were much the same as you would expect (things like “youtube” or “facebook”), there were some very interesting differences.

For example, individuals who searched for “donald trump” were 2x as likely to then go on to search for “Omar Mateen” than individuals who previously searched for “hillary clinton.” Omar Mateen was the Orlando shooter. Individuals who searched for “Hillary Clinton” were about 60% more likely to search for “Philando Castile,” the victim of one of the more egregious police shootings. So it seems — at least from this early evidence — that people carry their biases to the search engines, rather than search engines pushing bias back upon them.
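Here’s a minimal sketch of that “also-searched” comparison, assuming you can reduce the clickstream to ordered per-user query lists (the sessions structure is hypothetical):

def also_searched_lift(sessions, seed_a, seed_b, target):
    """Compare how often `target` is searched after seed_a vs. after seed_b.

    sessions: dict of user_id -> ordered list of that user's searches.
    Returns the ratio P(target after seed_a) / P(target after seed_b).
    """
    def follow_rate(seed):
        searched, followed = 0, 0
        for queries in sessions.values():
            if seed in queries:
                searched += 1
                if target in queries[queries.index(seed) + 1:]:
                    followed += 1
        return followed / searched if searched else 0.0

    rate_b = follow_rate(seed_b)
    return follow_rate(seed_a) / rate_b if rate_b else float("inf")

# A lift of ~2.0 for ("donald trump", "hillary clinton", "omar mateen")
# would match the finding described above.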

Getting a real click-through rate model

Search marketers have been looking at click-through rate (CTR) models since the beginning of our craft, trying to predict traffic and earnings under a set of assumptions that have all but disappeared since the days of 10 blue links. With the advent of SERP features like answer boxes, the knowledge graph, and Twitter feeds in the search results, it has been hard to garner exactly what level of traffic we would derive from any given position.

With clickstream data, we have a path to uncovering those mysteries. For starters, the click-through rate curve is dead. Sorry folks, but it has been for quite some time and any allegiance to it should be categorized as willful neglect.

We have to begin building somewhere, so at Moz we start with opportunity metrics (like the one introduced by Dr. Pete, which can be found in Keyword Explorer) which depreciate the potential search traffic available from a keyword based on the presence of SERP features. We can use clickstream data to learn the non-linear relationship between SERP features and CTR, which is often counter-intuitive.
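To make the shape of such a metric concrete, here’s a toy version. To be clear, this is not Dr. Pete’s actual formula; the per-feature discounts are invented for illustration, and the real model is learned from clickstream data rather than hand-set.

# Invented organic-click discounts per SERP feature (illustration only).
FEATURE_DISCOUNTS = {
    "answer_box": 0.30,
    "knowledge_panel": 0.15,
    "top_ads": 0.10,
    "news": 0.10,
}

def opportunity(search_volume, serp_features):
    """Depreciate a keyword's volume by the SERP features crowding it."""
    remaining_ctr_share = 1.0
    for feature in serp_features:
        remaining_ctr_share *= 1.0 - FEATURE_DISCOUNTS.get(feature, 0.0)
    return round(search_volume * remaining_ctr_share)

print(opportunity(10000, ["answer_box", "top_ads"]))  # 10000 * 0.7 * 0.9 = 6300

As the quiz below shows, the true relationship is non-linear, which is exactly why those discounts have to be learned rather than guessed.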

Let’s take a quick quiz.

Which SERP has the highest organic click-through rate?

  • A SERP with just news
  • A SERP with just top ads
  • A SERP with sitelinks, knowledge panel, tweets, and ads at the top

Strangely enough, it’s the last that has the highest click-through rate to organic. Why? It turns out that the only queries that get that bizarre combination of SERP features are for important brands, like Louis Vuitton or BMW. Subsequently, nearly 100% of the click traffic goes to the #1 sitelink, which is the brand website.

Perhaps even more strangely, pages with top ads deliver more organic clicks than those with just news. News tends to entice users more than advertisements.

It would be nearly impossible to come to these revelations without clickstream data, but now we can use the data to find the unique relationships between SERP features and click-through rates.

In production: Better volume data

Perhaps Moz’s most well-known usage of clickstream data is our volume metric in Keyword Explorer. There has been a long history of search marketers using Google’s keyword volume as a metric to predict traffic and prioritize keywords. While (not provided) hit SEOs the hardest, it seems like the recent Google Keyword Planner ranges are taking a toll as well.

So how do we address this with clickstream data? Unfortunately, it isn’t as cut-and-dried as simply replacing Google’s data with Jumpshot or a third-party provider. There are several steps involved — here are just a few.

  1. Data ingestion and clean-up
  2. Bias removal
  3. Modeling against Google Volume
  4. Disambiguation corrections

I can’t stress enough how much attention to detail needs to go into these steps in order to make sure you’re adding value with clickstream data rather than simply muddling things further. But I can say with confidence that our complex solutions have had a profoundly positive impact on the data we provide. Let me give you some disambiguation examples that were recently uncovered by our model.

Keyword                          Google Value    Disambiguated
cars part                        135,000         2,900
chopsuey                         74,000          4,400
treatment for mononucleosis      4,400           720
lorton va                        9,900           8,100
definition of customer service   2,400           1,300
marion county detention center   5,400           4,400
smoke again lyrics               1,900           880
should i get a phd               480             320
oakley crosshair 2.0             1,000           480
barter 6 download                4,400           590
how to build a shoe rack         880             720

Look at the huge discrepancies here for the keyword “cars part.” Most people search for “car parts” or “car part,” but Google groups together the keyword “cars part,” giving it a ridiculously high search value. We were able to use clickstream data to dramatically lower that number.

The same is true for “chopsuey.” Most people search for it, correctly, as two separate words: “chop suey.”

These corrections to Google search volume data are essential to make accurate, informed decisions about what content to create and how to properly optimize it. Without clickstream data on our side, we would be grossly misled, especially in aggregate data.

How much does this actually impact Google search volume? Roughly 25% of all keywords we process from Google data are corrected by clickstream data. This means tens of millions of keywords monthly.
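As a rough illustration of the disambiguation step, here’s a sketch that reallocates a grouped Google volume using the relative frequencies observed in clickstream data. The function and the counts are simplifications of mine, not Moz’s production pipeline:

def disambiguate(google_volume, clickstream_counts, variant):
    """Split a grouped Google volume across variants by observed share.

    clickstream_counts: dict of keyword variant -> observed search count
    from clickstream data (hypothetical numbers below).
    """
    share = clickstream_counts[variant] / sum(clickstream_counts.values())
    return round(google_volume * share)

# Google lumps the rare "cars part" in with its far more common variants:
counts = {"car parts": 120000, "car part": 12000, "cars part": 2800}
print(disambiguate(135000, counts, "cars part"))  # ~2,800, not 135,000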

Moving forward

The big question for marketers is now not only how do we respond to losses in data, but how do we prepare for future losses? A quick survey of SEOs revealed some of their future concerns…

Luckily, a blended model of crawled and clickstream data allows Moz to uniquely manage these types of losses. SERP and suggest data are all available through clickstream sources, piggybacking on real user searches rather than performing automated ones. Link data is already available through third-party indexes like MozScape, but can be improved even further with clickstream data that reveals the true popularity of individual links. All that being said, the future looks bright for this new blended data model, and we look forward to delivering upon its promises in the months and years to come.

And finally, a question for you…

As Moz continues to improve upon Keyword Explorer, we want to make that data more easily accessible to you. We hope to soon offer you an API, which will bring this data directly to you and your apps so that you can do more research than ever before. But we need your help in tailoring this API to your needs. If you have a moment, please answer this survey so we can piece together something that provides just what you need.



I’ve Optimized My Site, But I’m Still Not Ranking—Help! – Next Level

Posted by jocameron

Welcome to the sixth installment of our educational Next Level series! In our last episode, Jo took you on an adventure diving for treasure in the long tail of search. This time around we’re answering the call for help when you feel like you’ve done all you can, but you’re still not ranking. Read on and level up!

You’ve optimized your pages, written delightful title tags, concocted a gorgeous description to entice clicks, used your target keyword (and similar words) in your copy, and your content is good, like really good. As far as you’re concerned, you’re doing everything you can on that page to say to Google “This is relevant content!” But, lo and behold, you’re not ranking.

Frustrating, right? Well, no more. I’m going to show you how you can discover what’s holding you back, and how to make sure your site is a lovely big target for visitors, just like this happy fellow:

You’ll learn some tricks you can do in your browser and then we’ll speed things up with some cat magic and pixie dust to sprinkle all over your site.

To start, pop open these tools in another tab so you’re ready to go:

Dreamy!

Step 1: Put in a quick call to Google

Well, you could try to call Big G (that’s what I like to call Google sometimes, just for kicks), but you may have better luck phoning yourself from 1995 with the idea for Google — then you could fix the rankings in your favor. Totally worth it.

Hello, operator?

Instead of messy and possibly future-altering time travel, you can put a call in by running a search operator like this:

site:yourfabsite.com
site:yourfabsite.com/blog
site:yourfabsite.com/blog/my-site-rocks

It’s like saying, “Hey, Big G, show me all the results you have in your index for yourfabsite.com.” This is what you don’t want to see:

If you’re seeing the above, you won’t be able to rank because your site isn’t indexed. It’s got to be indexed before it can rank, and it’s got to be crawled before it can be indexed. Trying to rank without being indexed is like applying for a job and forgetting to attach your CV.

Search Console is here to console you

In the results page above, Google is directing you straight to the Google Search Console.

Not quite as fun as Xbox or as comforting as a hug from a loved one, Google’s Search Console is still pretty sweet all the same.

Go — right now, right, right now, don’t read any more, you should have already gone — go and set up your Search Console. Once you’re all set up and your site is verified, you can go to the page that I like to think of as the Fires of Mount Doom and throw in your precious.

https://www.google.com/webmasters/tools/submit-url

Don’t worry, that analogy doesn’t hold up. It won’t destroy your site. 🙂

Head to “Google Index” and then “Index Status” to see the data similar to what we looked at above, but in graph form! Definitely handy for tracking how your pages have been indexed over time.

If your site is not being indexed, you’re going to want to take a closer look at your robots.txt file. Check your Search Console messages to see if there’s a reason Google couldn’t index your site. If Google can’t access your robots.txt file, it will stop crawling your site rather than risk fetching pages you’ve disallowed.

Step 2: Find out where you’re ranking

Now that you know your pages are being crawled and indexed, you want to get them to the top of the results where they gosh-darn-well should be, right?

Find your rankings with your bear hands

Yes, I DO mean bear hands. This is a manual job and your soft, tender, indoor keyboard hands just won’t do. So attach your bear hands and start digging. Search Google for your brand name, primary keywords, secondary keywords, words, and phrases you used on your page (one at a time, of course). Feel the ache in your chest as you scan the page: “Where is my jazzy title? My tantalizing description? My adorable URL?”

Turn up the volume

Not finding your site on the first page? Instead of clicking through to the many ooooos of Google, we’re going to change the settings in your browser to show 50 or 100 results per search so we can view more results with every search. I’m going to want to see A LOT more pet costume results, so I’ll click on the gear icon in Chrome and hit “Search Settings,” then toggle up the “Results per page”:

Now we’ve got a whole page of 50 or 100 results to search through. Use CMD + F on Mac (or CTRL + F on Windows) to search for your domain.

This process is great for doing a quick check to see if you’re in the top 50 or top 100. Remember that your browser can return personalized results when you’re logged into Google, so log out and enter incognito mode.

Like any good detective, make sure you record the keyword, position, and URL in a spreadsheet for Future You to discover and applaud Present-Day You on your fabulousness.

Start cooking with gas

Manual searches aren’t for everyone. I mean come on, we work in technology — we don’t want to be lugging keywords around the hot, dry Google search page, plugging them in one after another. I hear ya buddy, loud and clear. Let’s detach those bear hands, grab your list of keywords, and plug them straight into Keyword Explorer.

Check if you’re on the first page

Remember, you’ll need a Medium or higher Moz Pro subscription or a standalone Keyword Explorer subscription so you can create your keyword list.

Hit “Create or upload a new list” and choose “Enter Keywords” to pop those straight in there, bish-bash-bosh.

Open up a list you’ve created and pop in your URL to see your rank from 1–10.

Check Rankings

Want to see if you’re in the top 50?

Heck yeah! Take that same list and paste them into a new campaign in Moz Pro.

If you already have a campaign running, you can also transfer these straight over from Keyword Explorer. Just check the box next to the keywords you want to track, then choose a campaign from the drop-down.

Add keywords to campaign

You know before (about 30 seconds ago), when we talked about manual searches returning personalized results? Checking rankings in Moz Pro avoids all that nonsense by anonymizing the data and, in my experience, provides the most accurate results, showing what most users see. Pretty snazzy, right?

A new campaign will build in about 30 minutes, which is just enough time to catch up on “Stranger Things” and reminisce about Winona Ryder circa 1990…

On the other hand, adding to an existing campaign will take a bit longer: you’ll see data as soon as your campaign next updates. So you can binge watch the whole series, because why not, right?

…and we’re back! Check out where you’re ranking for your target keywords, which URL is ranking, and over time, whether you’ve moved up or down.

We also pull in search volume from Moz’s Keyword Explorer to give you an idea of demand. When looking at search volume, don’t forget that the higher the demand, the more competition you’ll likely face. Don’t be disheartened by ranking well for keywords with lower search volume, especially if they convert better.

Tracking your rankings is crucial to understanding why you’re not performing as well as you expected. If you’re seeing a lot of down arrows, you need to investigate who is jumping ahead of you and why.

Dig into keywords with falling rankings

Let’s find some keywords that have that sad little down arrow, meaning we’ve dropped down in rankings since our last update.

Here’s a little bundle of keywords that I can investigate. I’ll click on the keyword to open up the Analysis report and scroll down to “Your Performance.” Now we can see a historical graph of your rankings and track those other sites who want to push us to one side. And what do we have here?

They’ve gone and nipped in front of us! This will not stand! It’s likely that your competitor’s result has been getting stronger engagement for this keyword: more clicks, and more of the people who do click staying on the page. So let’s find out what you can do to set things right.

Toolkit:

Keyword Explorer Lists – Check your rankings on the fly

Moz Pro – Track your rankings (and your competitors’) over time

Step 3: Make sure you and your content are best friends

There are 2 parts to this step, just like those ‘Best Friend’ heart necklaces that were so popular in the ’90s. Separately they look like BE FRIE and ST NDS, but together… awww, the secret code is unlocked.

  1. Get your basic on-page optimization in order.
  2. Check that your content is tip-top quality.

Don’t go changing (too often)

I don’t recommend jumping in and making changes to content too often. Even Google needs time to register your updates. However, if your content is a bit dusty and you’re losing out to competitors, then it’s time to check that everything you think is in place is actually in place.

View your page like a bot

I like to think of this as a “bot’s-eye-view.” When a little bot comes along, it doesn’t go, “Oooh, look at that lovely header image! Oooh, I love that font, the white space is really working for me! Oh, how the Internet has changed since my days as a junior bot trawling through gifs of dancing babies!” It reads the code and moves on. We can do this too, with a little bit of knowhow.

Using Firefox or Chrome, you can right-click and view the page source.

If you’re unfamiliar with reading code, it’ll look pretty intimidating.

We’re going to use CMD + F (or CTRL + F for Windows) to hunt for the bits and pieces we’re after.

Pro tip: If you’re seeing og:title, this is a Facebook tag.

Likewise, if you’re using the meta property="og:description" tag, this is also a Facebook tag. These help format posts when the URL is shared on Facebook. You’ll want to make sure you also have title and description tags like these:

<title>The best title for this page</title>

<meta name="description" content="The best description for this page" />

Basic page optimization

This is relatively straightforward because you control your pages. However, maybe for that very same reason, it’s still a bit of a stumbling block for beginners. I’ve been there. I once spent a whole morning trying to write a single title tag.

If you’re confused and locked in a mind-melt of madness because you can’t figure out if you should use the primary keyword and/or the secondary keyword in the title tag, chill your boots.

Here is a brisk and fairly brief run-through on how to get into a productive page optimization mindset.

Title tag basics

This is the bit you click on in the SERPs. Should be about 55 characters of punchy goodness that is relevant to your content. Because it’s relevant to your content, it includes the words you want to rank for and accurately describes what you’re talking about. You better believe Google is paying attention to click signals, so draw that click with your awesome headline. Think about the titles you click on when you’re searching for lovely things. Do your own searches to see what title tags are out there; it’s not like they’re hard to find, they’re literally a click away.

Description tag basics

This is the bit of text under the title tag in the SERPs. It should be about 155 characters of tender lovin’ poetry that talks to the user like they’re a real human being, because they are, and so are you (unless you’re part of the cat colony I suspect controls large portions of the web). This is not a direct ranking factor, but it can heavily influence clicks. Clicks from humans. And what do clicks do? They signal to Google that you’re hot stuff!
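If you’d like to spot-check those lengths programmatically, here’s a minimal sketch using the requests and BeautifulSoup libraries. The 55- and 155-character figures are the rough guidelines above, not hard limits:

import requests
from bs4 import BeautifulSoup

def check_tags(url):
    """Print a page's title and meta description with their lengths."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    desc = (desc_tag.get("content") or "").strip() if desc_tag else ""
    print(f"Title ({len(title)} chars, ~55 is the sweet spot): {title}")
    print(f"Description ({len(desc)} chars, ~155 is the sweet spot): {desc}")

check_tags("http://yourfabsite.com/blog/my-site-rocks")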

On-page copy

Yep, you’re going to want to pop your keywords here, too. But really, let’s not get too hung up on this. If you’re writing something super-duper about your topic, this will flow naturally. Make it as long as it needs to be to make your point. Don’t rattle off the same words over and over; use language to the best of your ability to describe your topic. Remember all those clicks you worked so hard to get with your title and description tags? Well, if they all bounce back to search, you just know Google is paying attention to this. Your content has to be worth the click.

Go and look at what type of content is already ranking. This is not an exercise in scraping content, but a way to make sure that your content isn’t just as good, it’s much better.

This task can be done manually for a small site or for a few pages you’ve cherry-picked, no problem.

Check your whole site regularly

Maybe you’ve been creating content like a content-creating super machine and you might have skipped a few description tags. Or maybe you copied and pasted a title tag or two. In this case, you’ll want to check that it’s all hunky-dory on a larger scale and on a regular basis.

We’re going back to our Moz Pro campaign to take the heavy lifting out of this job.

Head to the Rankings tab and hit that little “Optimize” button.

Once you hit that little button, you’ve set off a chain of events where our bot looks at the keyword you’re targeting, then has a good old dig-around on your page and gives you a score out of 100.

We’re hoping for that wheel of destiny to roll around to 100.

If we make it part-way around, it’s time to look at the suggestions to see how you can improve your on-page optimization.

Focus on top-level pages, pages that convert, and high-authority pages first.

Toolkit:

Moz Pro Page Optimization – Check that your whole site is optimized correctly

Further reading:

8 Old School SEO Practices That Are No Longer Effective – Whiteboard Friday

Step 4: Become a keyword connoisseur

It’s easy to become fixated on a keyword beyond what is reasonable or healthy. Are you carrying a torch for a golden keyword? Stalking it in the SERPs even though it’s completely entranced with the likes of Wikipedia, eBay, AdWords, and Image Packs?

Ranking in the high-click zone for your keywords is all about beating other sites. This special, golden ticket to traffic wonderland might be a good long-term goal, but you’re not going to get to the top of the results in the near future.

On the other hand, maybe you’re afraid of competition, so you only target keywords with very low difficulty.

This can be a winning strategy if the keywords have strong intent and you’re targeting the long tail of search, but you don’t want to put in all that work creating content and find that no one is searching for it. No searches means no traffic, and no traffic means no humans to click a thing that makes a person somewhere in the world look at their analytics data and smile.

A little bit of competition is a good thing — it indicates a healthy, profitable industry.

So we’re looking for a sweet spot: keywords with some demand and less competition. I’m going to break down what organic competition is, and how you know what level of keyword difficulty you can target.

What’s the meaning of this so-called ‘competition?’

If you want to rank organically, your competition is the other sites that are currently on the first page for the keywords. It’s not the total number of sites that are using your keywords in their content, and it’s not the AdWords competition.

If someone on your team, an agency, or a client sends you competition data that’s defined as low, medium, or high, this is very likely to be AdWords competition, which relates to cost-per-click.

Moz’s Keyword Difficulty score is calculated from the top 10 organic results. It’s a score out of 100, where a higher number means the competition is stronger and it may take you longer to see results from your efforts. Every search you bash into Keyword Explorer shows you the Difficulty score, and you can build these into lists so you can compare related keywords.
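Moz doesn’t publish the exact formula, but a toy score in the same spirit, averaging the Page Authority of the current top 10, shows the idea. This is purely illustrative, not the real calculation:

def toy_difficulty(top10_page_authorities):
    """Crude stand-in for a difficulty score out of 100.

    top10_page_authorities: PA values (0-100) for the ten ranking URLs.
    The real Keyword Difficulty metric is smarter than a plain average.
    """
    return round(sum(top10_page_authorities) / len(top10_page_authorities))

print(toy_difficulty([78, 72, 70, 65, 64, 60, 58, 55, 52, 50]))  # 62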

Keyword Explorer metrics

Benchmark your site’s Difficulty rating

We know that Difficulty is out of 100, but a question we get all the time is: How do I know what level of Difficulty is too high?

Well, first off, testing is a sure way to find out. But if you want a little pointer before you head down that road, here’s how you can quickly benchmark your site’s Difficulty rating.

I learnt this tip from Russ Jones at MozCon, so I apologize for the blatant rip-off here, but it’s too handy not to share.

Time for another consoling hug from Google Search Console. Grab the keywords that are already sending you traffic from Search Traffic > Search Analytics and download them to CSV.

Save these to a list in KWE.

I usually copy those darlins out of the CSV and plonk them right into a new list.

Hit “Save,” and now you have a benchmark to use when looking at other keywords you could potentially rank for.

When you’re looking at keywords to target in the future you’ll have a good idea whether it’s a short-term or long-term goal.

You can also capitalize on keywords you’re already getting traffic for by looking for opportunities in the SERP Features. Can you steal a Featured Snippet?

I also want to track these keywords over time to see if I’m losing or gaining ground, so I’ll add them from my list straight to my Moz Pro campaign.

Next time my campaign updates, and forevermore into the future, I’ll be keeping the sharpest of eyes on these keywords.

Toolkit:

Google Search Console – Grab keywords already sending you traffic

KWE – Find the real organic competition and benchmark Difficulty

Step 5: Build your site’s authority

Now step 5 is a real doozy, and it’s a common stumbling block for new sites. Just like networking in the real world, online authority is built up over time by your connection to sites that search engines already trust.

I like to think of authority as the pixie dust from the J.M. Barrie novel Peter Pan. It’s almost mentioned as an afterthought, but without it Wendy and the gang were just kids jumping up and down on their beds. They’re thinking happy thoughts. They might even get a bit of temporary lift, you know, just like when you might get a bit of traffic here and there — enough to keep you jumping. But there’s a very big difference between jumping up and down on a spring-loaded mattress and flying off to a world of perpetual youth.

Track your authority

To figure out how much dust you have in your tank, you’ll need to take a look at the Moz metric Domain Authority. This is our best prediction of how well a site will rank for any given search. It’s on a scale of 1–100, and higher DA means more authority.

You can get your paws on DA free through Open Site Explorer or the MozBar Chrome extension. I like to keep MozBar on DA mode so I can check this metric out as I scoot about the web.

You’ll want to check your DA monthly to see how you’re progressing and save this to a sheet, as incoming fresh data will replace the current data in OSE and MozBar. Once you’ve got your data, think about plotting a tasty graph to show how you’re performing versus your competitors.
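Since fresh data overwrites the history, a few lines of Python can keep that monthly log for you. How you obtain the DA value is up to you (reading it off MozBar or OSE by hand is fine); everything here, including the filename, is a placeholder:

import csv
from datetime import date

def log_domain_authority(path, site, da):
    """Append this month's DA reading to a CSV so the history isn't lost."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(), site, da])

# After checking MozBar or OSE manually this month:
log_domain_authority("da_history.csv", "yourfabsite.com", 34)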

To make this a whole lot easier, head to the Moz Pro “Links” tab. Here you’ll find your historical link metrics, alongside those of your direct competitors.

Pixie dust isn’t just powering your rankings, but everyone else’s as well. These metrics are relative to other sites similar to your own, including your competitors.

Gather a pocket full of pixie dust

The first thing we always recommend when people reach out to us to find out how they can improve their Domain Authority is to improve the overall SEO of their site. The good news for you is we’ve already done that in steps 1-4 — highest of high fives to you!

The second thing you have to do is get backlinks. This is commonly known as link building. When I started doing SEO for an ecommerce site back about what feels like a thousand years ago now, I had no idea what I was doing; this term irked me, and still kind of does. It sounds like you need to build links yourself, right? Nope! It’s like you’re playing Minecraft, but instead of building the structures, you’re actually trying to encourage other people to build them for you. In fact, you’re not allowed to build anything yourself, because that’s cheating. Game changer!

Don’t forget you don’t want just anyone building these structures. You need good people who themselves have authority; otherwise, your lovely gothic mansion might turn into a pile of rubble. (This is my analogy for having spammy links that could get your site penalized by search engines.)

A lot of link building today is PR and outreach. I’m not going to go into that in this post, but I’ll include some links in the toolkit below to help you in that department.

We’re going to look at what actions you can take to track and build your authority.

Check for any leaks

There’s no point grabbing up pixie dust if you have a whopping great hole in your pocket.

Find and plug any holes quick-smart. Open Site Explorer has a handy tab just for this job. Pop in your domain and hit “Link Opportunities.”

Now here’s a list of broken pages on your site that have inbound links. Any page on your site that’s down isn’t passing on its value to the other pages on your site — not to mention it’s a shoddy user experience. Look out for any pages serving a 404 status error. I can prioritize the pages with the highest DA and the most linking domains.

Internal links

I said before that you can’t build any of your links yourself. However, as with everything in SEO there’s a caveat: in this case, links from within your own site are not only key to your site’s usability, but they also pass equity. Internal linking is primarily for user experience, but it also helps bots navigate your site for the purposes of lovely indexing.

Don’t stuff too many links on your page

Your homepage and other top pages will probably have the strongest authority, as other sites will link to your homepage in many cases.

You want that high-equity page to link out to other pages in a natural way that resembles a pyramid structure. Don’t forget the user in your rush to dish out equity; do visitors want to go from your homepage straight to some random deep page on your site? Does this help them on their journey?

Use the Crawl Test research tool in Moz Pro to find out if any pages of your site are flagged for having too many on-page links.

You also shouldn’t go overboard with keyword-rich anchor text. Once again, think about the user, not about gaming search engines. Overdoing it can get you penalized, so keep it natural.

If in doubt, just watch Season 2 Episode 4 of the IT Crowd for this delightful moment:

If you’re scooping up big swaths of copy to get keyword-rich anchor text but it doesn’t really help the person reading the article, then maybe you’ve got yourself an awkward link at your dinner party.

To follow or nofollow?

Links come in two flavors: follow and nofollow. Generally speaking, you do want your internal links to be “follow.” Bots will follow them on the journey of your choosing and equity will be passed on, which is just what you want.

You can use the MozBar to check your pages for follow and nofollow links.

Nofollow links can be marked on a link-by-link basis, or a whole page on your site can be allocated as nofollow. Let’s find the “Meta-robots Nofollow” column in your crawl CSV and filter by TRUE to check if you intended to mark these pages as nofollow.
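Here’s that filter as a short pandas sketch, assuming your crawl CSV export really does label the columns “Meta-robots Nofollow” and “URL” (check your own export’s headers):

import pandas as pd

crawl = pd.read_csv("crawl_export.csv")  # placeholder filename
# CSV exports often store booleans as the strings "TRUE"/"FALSE",
# so normalize before filtering.
mask = crawl["Meta-robots Nofollow"].astype(str).str.upper() == "TRUE"
print(crawl.loc[mask, "URL"])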

Convert mentions to links

If people or sites are already talking about your brand, then you’re not a million miles away from converting that to a link.

What you’re searching for are pages that mention your brand term but don’t link to you yet. This takes a bit of digging to do manually, but thankfully this is automated in your Moz Pro campaign.
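If you do want to try the manual version, here’s a hedged sketch: fetch a page that mentions your brand and check whether it actually links to your domain. It assumes the requests and BeautifulSoup libraries, and the example URL is made up:

import requests
from bs4 import BeautifulSoup

def unlinked_mention(page_url, brand_term, your_domain):
    """True if the page mentions the brand but never links to your domain."""
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    mentioned = brand_term.lower() in soup.get_text().lower()
    linked = any(your_domain in (a.get("href") or "")
                 for a in soup.find_all("a"))
    return mentioned and not linked

print(unlinked_mention("http://example.com/costume-roundup",
                       "fantasy costumes", "fantasycostumes.com"))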

Head to the “Links” tab in Moz Pro and hit “Opportunities.”

If you’re not seeing suggestions, you’ll want to modify your Brand Rules (Rankings > Add & Manage Keywords > Manage Brand Rules) and add a few more options. I already had my brand term, “fantasycostumes,” but you can probably guess this won’t be mentioned that often. So I added broader mentions like “fantasy costumes” as well as more specific mentions of my domain, “fantasycostumes.com.”

Back in my campaign’s Link Opportunities tab, I can see the site that mentioned the broader term “fantasy costumes” and their authority. Now we can start to use mentions and DA to judge other sites:

Having looked at these examples, maybe they’re not talking about me exactly, but that’s OK. They’re still discussing my niche, so let’s go and see who’s linking to them by popping their URL into OSE.

This will give me an idea of what sort of content is valued and linked to, and I can use this to figure out my next step forward.

Toolkit:

MozBar – In-browser link analysis

Moz Pro Crawl Test – Find those nofollow pages and pages with too many links

OSE – Explore backlink analysis

Fresh Web Explorer – Track mentions of your brand and closely related terms

Further reading:

Moz’s guide to link building

Google’s guide to follow and nofollow links

Wrapping up

I hope this helps you begin to uncover why your content isn’t ranking for your target keywords, and sets the wheels in motion for climbing up the SERPs.

Try Moz Pro + KWE, free for 30 days

Don’t forget that Moz Pro is available free for the first 30 days and it includes Keyword Explorer, so you can start to understand your site’s authority, check your on-page optimization, track your rankings over time, and figure out how to improve them.



The 4-Step Plan to Construct Your Own Keyword-to-URL Map

Posted by Carson-Ward

Knowing how to find and effectively use keywords is probably the most important skill for an effective search marketer. Smart keyword planning and tracking should also heavily inform content planning and strategy. Unfortunately, most keyword research is done on the fly as a new page is created. Rather than helping marketers find new opportunities and plan strategically, keywords are usually found and applied to existing posts and in-flight projects.

If you’re an SEO or content creator and don’t have a living, regularly referenced keyword map, this post is for you. We won’t discuss how to optimize existing pages. There are lots of well-done technical SEO posts around if the optimization process is new to you. But if the concept of a keyword plan is new to you, this post should walk you through the process completely. If you’re experienced, you’ll probably pick up at least one new trick or application for keywords.

If you’d like to follow along with a keyword research template I’ve created, feel free to make a copy of this Google doc. You’ll see images of it throughout the post that might make more sense if you open it up.

Finding and selecting keywords

Obviously the first step to using keywords is finding what people search for. This first section is thorough and hopefully helpful, but there’s nothing shocking or ground-breaking in it. The real magic is in how you use your keywords.


Step 1: Build the “Big List”

Your goal in this first phase of keyword research is to gather every keyword that your business would want to appear for. You won’t achieve that goal, but set your sights high. Think outside the structure of your current site. Look beyond keywords you currently rank for and knowingly compete for.

Moz Keyword Explorer

Moz’s Keyword Explorer is a great tool, and I’m not just saying that because of Moz’s resident hypnotist. I must have missed its launch somehow, yet it’s quickly become my first stop for collecting lots of keywords quickly. The grouping function is great for finding head terms, and the sub-terms will be useful later on in either optimizing terms on existing pages or finding related pages worth creating.

Here I’m using the Moz keyword tool and excluding very low-volume keyword terms that I know I’ll be ignoring. Throughout this post I’m using our site, HighSpeedInternet.com, as an example.

Put in your known head terms and export them all using the “Export CSV” function. I’m impressed by the speed of the tool, and often use volume filters to avoid exporting terms I won’t actually use. That might sound small, but many tools force large exports prior to any estimation of search volume. Once you’re done gathering and exporting, you can remove duplicates and sort using Excel or a (slightly clumsier) Google Sheets script.

SearchMetrics

SearchMetrics is good for those who aren’t sure which keywords they want to rank for. We’ll need to input competitors’ sites to find keywords. For those who don’t know who competitors are, there’s a handy tool that shows likely candidates under “SEO research > Competitors.”

SimilarWeb (not shown) is also helpful in checking for competitors. If your site is new, simply plug in some of the queries you’d like to rank for and look those sites up. Once you’ve discovered some competitors, throw them into SearchMetrics and head over to the “Rankings” section under “SEO Research” and click “Long Tail.”

If this were a competitor’s site, I’d see a list of keywords they rank for and the potential traffic.

Other tools

  • SEMrush has a tool that can find keywords with search volume by site or related terms. One of the better all-in-one tools for keyword research.
  • UberSuggest spits out tons of related terms. It’s no longer a favorite, as many have found suggestions to be irrelevant or low-volume terms.
  • KeywordTool.io is a good complement to a more full-featured tool. It’s reliably better than most tools at finding mid-tail terms that others don’t find.
  • Google Keyword Planner offers free suggestions. One major downside is that your competitors will probably be using the tool the same way you do, resulting in lots of competition for the narrower set of terms that Google suggests. Still, it would be fine to use this tool and nothing else if your tools budget is low.

There’s an almost unlimited number of keyword tools, but you really only need one or two. The more thorough your Big List process is, the more work you’ll save yourself later on. It’s usually worth it to spend a day or two gathering lots of keywords for a site you’ll be working on regularly.


Step 2: Get keyword volume

Use Excel’s handy Remove Duplicates function or a Google Sheets script to remove duplicate keywords. For most of us the next step is to import/paste sets of keywords into the Google Keyword Planner, export the volume, and repeat. There’s a limit to how many keywords Google will allow you to run at one time, so pre-filtering bad keywords might be a good idea. For example, I often pull out competitors’ branded terms.
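If you’d rather dedupe outside of Excel or Sheets, a few lines of Python handle it. This sketch assumes one keyword per line in each export; the filenames are placeholders:

def dedupe_keywords(in_paths, out_path):
    """Merge keyword lists from several exports, dropping duplicates."""
    seen = set()
    with open(out_path, "w") as out:
        for path in in_paths:
            with open(path) as f:
                for line in f:
                    kw = line.strip().lower()
                    if kw and kw not in seen:
                        seen.add(kw)
                        out.write(kw + "\n")

dedupe_keywords(["moz_export.txt", "searchmetrics_export.txt"], "big_list.txt")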

Work-around for “low-volume” accounts (+extra precision)

Google recently continued its creeping war against those who use Google products for free by returning ranges in the keyword planner for low-spending accounts. These ranges (as in the image below) are so broad they’re essentially useless for anything but pre-filtering.

To get around this limit, you can just click the nice “Add to plan” button on any one of your terms.

If you’re only curious about volume for a few keywords, you can just click the “Add to plan” button for multiple terms. It’s easier to paste them in the next step for larger lists. Once you’ve added at least one keyword, click the “Review plan” button.

Now you’re on a new page where you’ll need to be careful about avoiding the “Save to account” buttons unless you actually want to start bidding. Click “Add keywords” to paste your terms in, then save it to a new ad group.

Now click the ad group. You’ll see a large table that’s mostly blank. Fill in a $999 bid and set the range to monthly. I also like to try different match types, but I typically use exact-match.

So why is this cool?

  • Impression count is more accurate, and not rounded like in the regular tool.
  • You can set custom date ranges if you want a more accurate figure for forecasting purposes.
  • You can play with match type again (which is something Google took away from the standard planner interface).
  • It works for free accounts.

At the end of Step 2, you should have a simple two-column list.


Step 3: Filter keywords

Notice I said we should filter keywords — not delete them. You’ll generally want to break keywords into three groups:

1.) Priority terms: Keywords you want to rank for immediately. A good priority term has the following attributes:

  • Related to current and near-future business
  • Implies a question you can answer well about a product you sell, OR implies a need you can fulfill
  • High-enough volume to be worth the investment

2.) Secondary terms: We’ll want to go after these some day, but not before we have our priority keywords locked in with query-responsive, well-optimized pages. Secondary terms usually have the following traits:

  • Doesn’t have buying intent, but has healthy volume and relates to what your site does
  • Implies a question you don’t have the expertise to answer
  • Low-volume terms that might convert

3.) Other terms: You might lay out some tertiary keywords (i.e., those where you plan to expand the business), but you can generally stop there and label any others as keywords to “ignore for now.”

You’ll usually want to note why you are or are not pursuing a term so you don’t have to re-evaluate it every time you look for new keywords. Step three’s endpoint just adds a few columns:
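As a rough sketch, the bucketing above might look like this in code. The volume threshold and the judgment flags are stand-ins; the human evaluation is the part no script replaces:

def bucket_keyword(volume, relevant, can_answer_well, might_convert,
                   min_volume=100):
    """Assign a keyword to priority / secondary / ignore-for-now."""
    if relevant and can_answer_well and volume >= min_volume:
        return "priority"
    if relevant and (volume >= min_volume or might_convert):
        return "secondary"
    return "ignore for now"

print(bucket_keyword(720, relevant=True, can_answer_well=True,
                     might_convert=True))  # "priority"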


Using keywords effectively

Now that you’ve gathered keywords it’s time to figure out how to use them. Your ultimate goals are to 1) find new opportunities on existing pages, and 2) find keywords for which you don’t have a good landing page so that you can create or suggest a useful new piece of content. Before we can do either, we’ll need to map the keywords to pages on your site.


Step 4: Map priority keywords

Just like you needed human judgment to determine priority keywords, you’ll need to use good judgment to map them to pages. You can skip the judgment steps and still come out with a final product, but it will ultimately be far less useful. Besides, this is why we have jobs that machines won’t be taking over for a while.

Scrape Google

First, scrape Google for your keywords and current ranking. Google frowns on rank tracking and SERP scraping, but consider it fair game for all the content they scrape and save. If you don’t want to scrape SERPs you can manually map each page, but it’s nice for larger sites to check yes/no rather than thinking through a list of potential pages every time.

There are tons of tools and services for this. AWR is probably the most common choice, as this is a one-time deal. You could also write a simple script with proxies or find a freelancer on one of a dozen sites. Moz Pro’s campaigns work up to your keyword limit, but the Moz tool is far better at helping after you’ve mapped keywords.

Mapping new & existing URLs

Once you have each keyword’s page and current rank, you’ll want to quickly check that the page matches the query.

  • How does this page help the user? (Don’t confuse this with what the user does next.)
  • Would the ideal version of that page do what you’d want if you typed this keyword into Google?
  • Would a page about only this keyword (or set of keywords) serve the query better?

You don’t want to create new pages for every tiny keyword variation, but we do want to make sure the page feels tailored to the user question. You’re trying to close the gap between what people want from Google and what your site does, so it shouldn’t be surprising if the questions you ask yourself feel UX-heavy.

After asking these questions a few hundred times it’ll become second nature. You won’t rank at all for some terms, so you’ll have to either manually select a page or create a new one. For some pages (especially those ranking 50+), Google will just be plain wrong and you’ll have to re-map them.

The hardest choice is often whether an existing page could be optimized to be a better fit, or if a new URL is more appropriate. As a general rule, anything that would augment an existing page’s core purpose can be added, but anything that would detract or confuse the core purpose should be placed elsewhere. Don’t worry if it’s not immediately clear what the core purpose of the page is. Part of the value in this process is refining page purpose with keywords.

If you have a page in the top 5 or 10, it’s usually best to assume optimizing the page is a better path than creating a new page. If you have sets of conflicting keywords (meaning optimizing for both confuses the page) ranking on the same URL, you can generally choose the higher-value terms and then link to a new page about the second set.

For example, if we had a page appearing for “internet providers by zip code” and “satellite internet providers,” these would be considered conflicting. Trying to talk about satellite Internet (which is available almost everywhere) and zip code-specific Internet at the same time would be confusing. We’d create a new page for satellite Internet, delete the existing satellite Internet content, and link to the new page from the ranking URL.

Building new pages

If you’ve identified new pages that have opportunity, well done! Ensure that the amount of effort is worth the reward, and utilize the opportunities in your production process. Keyword research done well comes with a built-in business case. If you can show keyword volume and argue for keyword intent, you only have to make some assumptions on click, call, or purchase rates to put a potential dollar figure on the project.

Once you’ve mapped keywords to a new page, you should also have scope settled at a high level. Knowing what questions you’re trying to answer and what the page should do gives everyone the information they need to contribute and determine the best way to build it.

Optimizing existing pages

Improving existing pages is usually easier and less time-intensive, but don’t simply optimize page titles and call it a day. Actually look at the page and determine whether it’s a good fit for what you’d want to see if you were the one Googling. Also consider the competition and aim to be better.

There will be a larger list of existing terms ranking below the top spot where optimization and improvement need to be prioritized. Here are a couple examples of prioritization helpers:

  • Keyword opportunity: Find a click-through study, estimate the traffic you’re getting in your current position, and estimate how much traffic you’d get from the top spot. Consider both keywords and pages. (A rough sketch follows this list.)
  • Competitive opportunity: Combine the opportunity above with competition metrics (e.g. PA/DA in the SERPs).
  • Keyword coverage: Crawl your pages to get titles and content, break the keyword into its individual words, and see how many of the words appear.
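Here’s the keyword-opportunity helper as a minimal sketch. The CTR curve is an invented placeholder; swap in whichever click-through study you trust:

# Placeholder CTRs by organic position, from no particular study.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def monthly_click_opportunity(volume, current_position):
    """Estimate the extra monthly clicks from moving to the top spot."""
    current_ctr = CTR_BY_POSITION.get(current_position, 0.02)
    return round(volume * (CTR_BY_POSITION[1] - current_ctr))

print(monthly_click_opportunity(4400, 4))  # (0.30 - 0.07) * 4400 = 1012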

Use these figures as guides, and be smart about competition. It’s easy for analytical people to get too deep into a spreadsheet. Make sure you’re looking at your website and that of your competition, rather than making decisions on a pet formula alone.


A word to skeptical content strategists/marketers

I understand if you think this looks like a post for SEOs. Content that comes from highly searched keywords tends to be evergreen, but the result of writing keyword-targeting content is rarely something your visitors will rush to share. It’s very rarely inspiring, timely, fun, or otherwise sexy. Keep some things in mind, though:

  1. Depending on who you believe, organic traffic on average is 2–4x average referral traffic across the web. Don’t sell yourself short with a content strategy that only reaches half of your potential audience.
  2. You don’t have to create content the way everyone else has. In fact, please don’t! See a bunch of dull articles ranking for the term? Maybe make it an interactive tool. Write something that’s not dull. Answer the question better than anyone else has.
  3. You’ll drive more sales creating good content for boring searches than you will creating viral posts that get shared and linked to. Combine keyword hunting with shareable content for a truly business-changing organic/inbound strategy.

You don’t need an SEO’s permission to create useful content for things Google explicitly tells you your potential fans and customers are looking for. Incorporating a keyword strategy into a comprehensive content strategy almost feels like cheating.


Getting started: A spreadsheet template

If all of this sounds a bit overwhelming, I’ve created a template in Google docs that you can begin using. Just choose “File > Make a copy,” read through the comments, and start entering your own data once you feel comfortable.

Get your keyword mapping template

The Google doc does a lot of the boring stuff for you, like calculating keyword opportunity, title optimization (if you put in page titles), and organizing your keyword map by page and keyword with opportunity, volume, and more.


When it’s time to automate

There are tools for doing much of what this spreadsheet does. The right tool will be worth the money as long as you keep some things in mind before you dive in and start paying:

  1. It’s wise to know what you want a tool to do before buying it. Use the keyword mapping template, experiment with what you actually want and use regularly, and then you can start looking for tools to help you map keywords and optimize pages. Avoid tool clutter by using them deliberately.
  2. Most tools will try to map keywords to pages, but none can reliably tell you when or how you should create new content. If you’re never actually looking at keywords with human judgment and asking, “Am I answering that query?”, then you’re probably over-relying on the tool.

For Moz Pro members, plugging in some keywords and playing around is a great place to start. Play around until you’re comfortable with rank tracking and page mapping, then look at some optimization suggestions. It’s now even better when combined with the keyword tool.



Game of Featured Snippets: How to Rank in Position 0

Posted by larry.kim

Google’s Featured Snippets are amazingly powerful. We’re seeing more snippets than ever before for more search queries. You need them.

We know this thanks to some brilliant articles and presentations from some super smart people in the industry, including Glenn Gabe (see: The Power of Google Featured Snippets in 2016 and a Google Featured Snippet Case Study – also, an extra big shout out to Glenn for helping me answer some important questions I had when writing this article!), Peter Meyers (see: Ranking #0: SEO for Answers), and Rob Bucci (see: How to Earn More Featured Snippets).

But even after reading everything I could find about Featured Snippets, one huge question remained unanswered: How the heck do you get these damn things?

[Image: “You know nothing about Featured Snippets” meme]

All of this leads us to today’s experiment: How exactly does Google’s algorithm pick which snippet to feature?

Obviously, Google isn’t manually picking them. It’s an algorithm.

So what makes Google’s Featured Snippet algorithm tick?

For example, if two competing domains both have great, snipp-able results, how does Google decide to pick one over the other? Take this one, for example:

[Image: SERP for “what is link building”]

Why does WordStream (in Position 4) get the Featured Snippet instead of Moz (Positions 1 and 2) or Search Engine Watch (Position 3) on a search for [what is link building]?

What we know about Featured Snippets

Before we dive into the unknown, let’s briefly review what we know.

[Image: “Knowledge is power” meme]

We know snippets, like unicorns, come in all shapes and sizes. Your content must provide the answer in the “right” format, which will vary depending on the specifications in Google’s algorithm. Snippets can be:

  • Text.
  • Lists (ordered or unordered).
  • Images.
  • Charts.
  • Tables.
  • Knowledge Graph.

We also know that any website can earn a Featured Snippet. Large brands and sites have no advantage over smaller brands and sites.

Finally, we also know that winning a Featured Snippet lets you enjoy some spoils of war, including:

  • More website traffic.
  • Greater visibility in Google’s SERPs.
  • More trust and credibility.

So that’s what you need to know about Featured Snippets. Now let’s dive into the unknown.

Important disclaimers

Featured Snippets pose a few problems that really complicate the analysis.

For one, snippets are finicky. You can do a search right now and see the snippet, then conduct the exact same search an hour later and find the snippet gone.

For another, we’re all working with limited data sets. We’re limited to analyzing just the snippets we have.

Finally, snippets impact your organic CTRs. Some snippets will increase the CTR to your site – for instance, if you’re ranked in fourth position but you have the featured snippet. But other times a snippet can actually decrease your CTR because the searcher already got their answer – no need to click through.

Google isn’t much help either. Gabe asked Google’s Gary Illyes and got this frustratingly funny reply:

@glenngabe speculation: ancient aliens? Or, we look at many signals for determining what should be featured @JohnMu
— Gary Illyes (@methode) October 23, 2015

Theory #1: Snippets aren’t featured based on organic search ranking factors alone

This one is relatively easy to prove.

According to Gabe’s data, ranking position plays some role in whether you get a Featured Snippet: every single snippet was taken from a page that was good enough to rank in the top 10 organic positions.

If you look at Bucci’s data, however, he discovered that Google will take snippets from content that ranks on Page 2 of Google.

I found something a bit more incredible when I pulled a report of snippets – 981 in total – for my own website. Take a look:

  • About 70 percent of the time, Google pulled snippets from pages in positions 1 to 3.
  • About 30 percent of the time, the snippet’s source came from position 4 down to as deep as position 71 (wow!).
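
If you want to reproduce this breakdown for your own site, here’s a minimal sketch, assuming you’ve exported your snippets to a CSV with a position column (the file and column names here are hypothetical):

    # Bucket Featured Snippet sources by the ranking position of the page
    # they were pulled from. "snippets.csv" and its "position" column are
    # hypothetical; match them to your own export.
    import csv
    from collections import Counter

    def bucket(position: float) -> str:
        return "positions 1-3" if position <= 3 else "positions 4+"

    counts = Counter()
    with open("snippets.csv", newline="") as f:
        for row in csv.DictReader(f):
            counts[bucket(float(row["position"]))] += 1

    total = sum(counts.values())
    for label, n in counts.most_common():
        print(f"{label}: {n} snippets ({n / total:.0%})")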

If Google’s algorithm were relying on traditional search ranking factors alone (e.g., keywords and links), then Google would simply pick the first “snipp-able” content fragment from the highest-ranking piece of content every time. Google would never have to go to Page 2 or further (Page 8!) for snippets when there are other perfectly well-formatted snippets to choose from that rank higher.

Clearly, this isn’t happening. Something else is at play. But what?

Theory #2: Having your content in a snipp-able format matters (but isn’t the whole picture)

Is it all about being the most clear, concise, and thorough answer? We know Google is looking for something “snipp-able.”

For the best shot at getting a Featured Snippet, your content should be between 40 and 50 words, according to SEMrush’s analysis.
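
As a quick gut check, here’s a tiny sketch that tests whether a candidate answer paragraph falls in that window (the 40-50 word range is SEMrush’s finding, not a hard rule):

    def is_snippable_length(paragraph: str, low: int = 40, high: int = 50) -> bool:
        """Check whether an answer paragraph falls in the 40-50 word range."""
        return low <= len(paragraph.split()) <= high

    answer = ("Link building is the process of acquiring hyperlinks from other "
              "websites to your own, helping users navigate between pages and "
              "helping search engines crawl the web and judge authority.")
    print(is_snippable_length(answer))  # False: only 29 words, shy of the window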

Without a doubt, format matters to Google’s algorithm. Your content needs to have the right format if you’re ever going to be eligible to be snipped.

But again, we’re back to the same question. How does Google pick between different pages with eligible stuff to snip?

Theory #3: Engagement metrics seem to play a role in snippet selection

To figure out what was happening, I looked at the outliers. (Usually, the best way to crack an algorithm is to look at the unusual edge cases.)

Let’s look at one example: [how to get more Bing Rewards Points].

This page shows up as a snippet for all sorts of queries related to “getting bing rewards points,” yet the source of the snip is from position 10. What’s crazy is that our page ranks behind Bing’s official site and all sorts of other video tutorials and community forums discussing the topic.

Why the heck is this happening?

Well, when I look at this page in Search Console, I notice it gets an unusually high CTR of 21.43 percent, despite a ridiculously low average position of 10.

This CTR is 10x higher than what you’d expect to see at this position.

The other thing I noticed was that the page had remarkably great engagement metrics. The time on site (which is proportional to dwell time) was an amazing 14 minutes and 30 seconds.

[Image: analytics screenshot showing 14 minutes 30 seconds time on site for the page]

This time on site is considerably higher than the site average – by nearly 3x!

Note: This is just one simple example. I did this for more than 50 pages (unfortunately I was limited by data here because I was looking specifically for pages that rank poorly, yet generate snippets).

What I found was that pages snipped from low positions on the SERP had far higher time on page than the site average.

[Image: relative time on site for pages snipped from low SERP positions]
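
To make “unusually high” concrete, here’s a rough sketch of the relative-engagement math. The expected-CTR curve is an assumption, and the five-minute site average is back-derived from the “nearly 3x” figure rather than taken from my analytics:

    # Assumed average organic CTR by position - illustrative values only.
    EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                    6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

    def ctr_multiple(observed_ctr: float, position: int) -> float:
        """How many times higher the observed CTR is vs. the positional norm."""
        return observed_ctr / EXPECTED_CTR.get(position, 0.01)

    def dwell_multiple(page_seconds: float, site_avg_seconds: float) -> float:
        """Time on page relative to the site average."""
        return page_seconds / site_avg_seconds

    # The Bing Rewards page: 21.43% CTR at position 10; 14m30s time on page
    # vs. an assumed ~5-minute site average (consistent with "nearly 3x").
    print(round(ctr_multiple(0.2143, 10), 1))              # ~10.7x expected CTR
    print(round(dwell_multiple(14 * 60 + 30, 5 * 60), 1))  # ~2.9x site average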

Basically, what I think might be going on is something like this:

[Image: diagram of how Featured Snippets get picked]

Supporting fact #1: Marissa Mayer said it worked this way

In addition to this data, there are a couple more reasons why I think engagement metrics may be playing a key role in Google’s Featured Snippet algorithm. These examples indicate that Google has long believed that good engagement metrics reflect quality content.

Does the past hold some important secrets to our current plot? Let’s see.


First, we’ll head back to 2007 for an interview with Marissa Mayer discussing the OneBox and how features like news, maps, and products would get promoted above the organic results into the OneBox, based on click-through rate:

“We hold them to a very high click-through rate expectation and if they don’t meet that click-through rate, the OneBox gets turned off on that particular query. We have an automated system that looks at click-through rates per OneBox presentation per query. So it might be that news is performing really well on Bush today but it’s not performing very well on another term, it ultimately gets turned off due to lack of click-through rates. We are authorizing it in a way that’s scalable and does a pretty good job enforcing relevance.”

Supporting fact #2: Google used the same algo in paid search a few years back

OK, now let’s go back to 2008 – back when Google still had AdWords ads on the right rail. (Unfortunately, with the death of the right-side ad rail, all ads appear above the organic search results now – a moment of silence for the right-side rail).

Google would promote three ads to appear above the organic search results. How did Google decide which paid search ads to feature above the organic search results?

Here’s what Google revealed in an AdWords blog post, “Improvements to Ads Quality”:

“To appear above the search results, ads must meet a certain quality threshold. In the past, if the ad with the highest Ad Rank did not meet the quality threshold, we may not have shown any ads above the search results. With this update, we’ll allow an ad that meets the quality threshold to appear above the search results even if it has to jump over other ads to do so. For instance, suppose the ad in position 1 on the right side of the page doesn’t have a high enough Quality Score to appear above the search results, but the ad in position 2 does. It’s now possible for the number 2 ad to jump over the number 1 ad and appear above the search results. This change ensures that quality plays an even more important role in determining the ads that show in those prominent positions.”

What’s important to know here is how incredibly important CTR is in the Quality Score formula. By far, CTR has the biggest impact on Quality Score.
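
Mechanically, the rule in that quote amounts to a quality-threshold filter applied in Ad Rank order. Here’s a toy sketch; the threshold, scores, and slot count are all made up for illustration:

    THRESHOLD = 7.0   # hypothetical minimum quality for top-of-page placement
    MAX_SLOTS = 3     # Google promoted up to three ads above the results

    def ads_above_results(ads_by_rank):
        """ads_by_rank: list of (name, quality_score) sorted by Ad Rank.
        Any ad clearing the threshold may 'jump over' higher-ranked ads
        that don't."""
        return [ad for ad in ads_by_rank if ad[1] >= THRESHOLD][:MAX_SLOTS]

    ads = [("ad_in_position_1", 6.5), ("ad_in_position_2", 8.2), ("ad_in_position_3", 7.4)]
    print(ads_above_results(ads))  # positions 2 and 3 jump over position 1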

So here we have spokespeople from both the organic search side and Google’s own ad system telling us that CTR can play a vital role in helping Google ensure that a piece of content or an ad meets a high enough quality threshold to qualify to appear in the very prominent and valuable space above the organic search results.

That’s why I strongly believe that Featured Snippets work very much the same way – with CTR and engagement metrics being the key element.

What does it all mean?


Featured Snippets give us yet another reason to focus on engagement rates. We’ve talked about the importance of engagement rates several times this year, and any one of those reasons alone is good reason to focus on improving your CTR. But wait, there’s more: I believe engagement rates also impact the selection of Featured Snippets.

So in addition to formatting your on-page copy to meet the snipping requirement, follow the guides on improving CTR and time on site.

A call to arms


One thing that’s hard about doing research and analysis on Featured Snippets is that we’re limited to the data we have. You need to have lots of snippets and access to all the CTR data (only the individual webmasters have this). You can’t just crawl a site to discover its engagement metrics.

Why don’t we team up here and try to crack this nut together?

Have you won Featured Snippets? What are your engagement rates like for your featured snippets – from the Search Console for CTR and Google Analytics for time on site? Do you see any patterns? Please share your insights with us in the comments!
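
If you’d like to pitch in, here’s a minimal sketch for joining the two exports by URL. The file and column names are hypothetical; adjust them to match your actual Search Console and Analytics exports:

    import csv

    def load_by_page(path: str, key: str = "page") -> dict:
        """Index a CSV export by its page URL column."""
        with open(path, newline="") as f:
            return {row[key]: row for row in csv.DictReader(f)}

    gsc = load_by_page("search_console.csv")  # assumes "page" and "ctr" columns
    ga = load_by_page("analytics.csv")        # assumes "page" and "avg_time_on_page"

    # Print CTR and time on page side by side for URLs present in both exports
    for page in sorted(gsc.keys() & ga.keys()):
        print(page, gsc[page]["ctr"], ga[page]["avg_time_on_page"])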

