The Complete Guide to Creating On-Site Reviews+Testimonials Pages

Posted by MiriamEllis

“Show your site’s credibility by using original research, citations, links, reviews and testimonials. An author biography or testimonials from real customers can help boost your site’s trustworthiness and reputation.” (Google Search Console Course)

2017 may well be the year of testimonials and reviews in local SEO. As our industry continues to grow, we have studied surveys indicating that some 92% of consumers now read online reviews and that 68% of these cite positive reviews as a significant trust factor. We’ve gone through a meaningful overhaul of Google’s schema review/testimonial guidelines while finding that major players like Yelp will publicly shame guideline-breakers. We’ve seen a major publication post a controversial piece suggesting that website testimonials pages are useless, drawing thoughtful industry rebuttals illustrating why well-crafted testimonials pages are, in fact, vitally useful in a variety of ways.

Reviews can impact your local pack rankings, testimonials can win you in-SERP stars, and if that isn’t convincing enough, the above quote states unequivocally that both reviews and testimonials on your website can boost Google’s perception of a local business’ trustworthiness and reputation. That sounds awfully good! Yet, seldom a day goes by that I don’t encounter websites that are neither encouraging reviews nor showcasing testimonials.

If you are marketing local enterprises that play to win, chances are you’ve been studying third-party review management for some years now. Not much has been written about on-site consumer feedback, though. What belongs on a company’s own testimonials/reviews page? How should you structure one? What are the benefits you might expect from the effort? Today, we’re going to get serious about the central role of consumer sentiment and learn to maximize its potential to influence and convert customers.

Up next to help you in the work ahead: technical specifics, expert tips, and a consumer feedback page mockup.

Definitions and differentiations

Traditional reviews: Direct from customers on third-party sites

In the local SEO industry, when you hear someone talking about “reviews,” they typically mean sentiment left directly by customers on third-party platforms, like this review on TripAdvisor:


Traditional testimonials: Moderated by owners on company site

By contrast, testimonials have traditionally meant user sentiment gathered by a business and posted on the company website on behalf of customers, like this snippet from a bed-and-breakfast site:


Review content has historically been outside of owners’ control, while testimonial content has been subject to the editorial control of the business owner. Reviews have historically featured ratings, user profiles, images, owner responses, and other features while testimonials might just be a snippet of text with little verifiable information identifying the author. Reviews have typically been cited as more trustworthy because they are supposedly unmoderated, while testimonials have sometimes been criticized as creating a positive-only picture of the business managing them.

Hybrid sentiment: Review+testimonial functionality on company site

Things are changing! More sophisticated local businesses are now employing technologies that blur the lines between reviews and testimonials. Website-based applications can enable users to leave reviews directly on-site, complete with star ratings, avatars, and even owner responses.

In other words, you have many options when it comes to managing user sentiment, but to make sure the effort you put in yields maximum benefits, you’ve got to:

  1. Know the guidelines and technology
  2. Have a clear goal and a clear plan for achieving it
  3. Commit to making a sustained effort

There is a ton of great content out there about managing your reviews on third-party platforms like Yelp, Google, Facebook, etc., but today we’re focusing specifically on your on-site reviews/testimonials page. What belongs on that page? How should you populate and organize its content? What benefits might you expect from the investment? To answer those questions, let’s create a goal-driven plan, with help from some world-class Local SEOs.

Guidelines & technology

There are two types of guidelines you need to know in the consumer sentiment space:

1) Platform policies

Because your website’s consumer feedback page may feature a combination of unique reviews and testimonials you directly source, widgets featuring third-party review streams, and links or badges either showcasing third-party reviews or asking for them, you need to know the policies of each platform you plan to feature.

Why does this matter? Since different platforms have policies that range from lax to strict, you want to be sure you’re making the most of each one’s permissions without raising any red flags. Google, for example, has historically been fine with companies asking consumers for reviews, while Yelp’s policy is more stringent and complex.

Here are some quick links to the policies of a few of the major review platforms, to which you’ll want to add your own research for sites that are specific to your industry and/or geography:

2) Google’s review schema guidelines

Google has been a dominant player in local for so long that their policies often tend to set general industry standards. In addition to the Google review policy I’ve linked to above, Google has a completely separate set of review schema guidelines, which recently underwent a significant update. The update included clarifications about critic reviews and review snippets, but most germane to today’s topic, Google offered the following guidelines surrounding testimonial/review content you may wish to publish and mark up with schema on your website:

Google may display information from aggregate ratings markup in the Google Knowledge Cards. The following guidelines apply to review snippets in knowledge cards for local businesses:

– Ratings must be sourced directly from users.
– Don’t rely on human editors to create, curate or compile ratings information for local businesses. These types of reviews are critic reviews.
– Sites must collect ratings information directly from users and not from other sites.

In sum, if you want to mark up consumer feedback with schema on your website, it should be unique to your website — not drawn from any other source. But to enjoy the rewards of winning eye-catching in-SERP star ratings or of becoming a “reviews from the web” source in Google’s knowledge panels, you’ve got to know how to implement schema correctly. Let’s do this right and call on a schema expert to steer our course.

Get friendly with review schema technology.


The local SEO industry has come to know David Deering and his company TouchPoint Digital Marketing as go-to resources for the implementation of complex schema and JSON-LD markup. I’m very grateful to him for his willingness to share some of the basics with us.

Here on the Moz blog, I always strive to highlight high quality, free resources, but in this case, free may not get the job done. I asked David if he could recommend any really good free review schema plugins, and learned a lot from his answer:

Boy, that’s a tough one because I don’t use any plugins or tools to do the markup work. I find that none of them do a good job at adding markup to a page. Some come close, but the plugin files still need to be edited in order for everything to be correct and properly nested. So I tend to hard-code the templates that would control the insertion of reviews onto a page. But I can tell you that GetFiveStars does a pretty good job at marking up reviews and ratings and adding them to a site. There might be others, too, but I just don’t have any personal experience using them, unfortunately.

It sounds like, at present, your best bets are going to be a paid service or rolling up your sleeves to dig into hard-coding the schema yourself. If anyone in our community has discovered a plugin or widget that meets the standards David has cited, please definitely share it in the comments, but in the meantime, let’s take a look at the example David kindly provided of perfect markup. He notes,

“The following example is rather simple and straightforward but it contains everything that a review markup should. (The example also assumes that the review markup is nested within the markup of the business that’s being reviewed):”
"review": {
    "@type": "Review",
    "author": {
        "@type": "Person",
        "name": "Reviewer's Name",
        "sameAs": "<a href="http://link-to-persons-profile-page.com">http://link-to-persons-profile-page.com</a>"
    }
    "datePublished": "2016-09-23",
    "reviewBody": "Reviewer's comments here...",
    "reviewRating": {
        "@type": "Rating"
        "worstRating": "1",
        "bestRating": "5",
        "ratingValue": "5"
    }
},

This is a good day to check to see if your schema is as clean and thorough as David’s, and also to consider the benefits of JSON-LD markup, which he describes this way:

“JSON-LD is simply another syntax or method that can be used to insert structured data markup onto a page. Once the markup is created, you can simply insert it into the head section of the page. So it’s easy to use in that sense. And Google has stated their preference for JSON-LD, so it’s a good idea to make the switch from microdata if a person hasn’t already.”
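To make that concrete, here is a minimal sketch of what a JSON-LD block might look like once it’s dropped into the head of a reviews page. The business name, URL, rating figures, and reviewer details below are placeholders, and your own markup should only reflect reviews and ratings that are actually visible on the page:

<script type="application/ld+json">
{
    "@context": "http://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Business Name",
    "url": "http://www.example.com/reviews/",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.8",
        "reviewCount": "27",
        "bestRating": "5",
        "worstRating": "1"
    },
    "review": {
        "@type": "Review",
        "author": {
            "@type": "Person",
            "name": "Reviewer's Name"
        },
        "datePublished": "2016-09-23",
        "reviewBody": "Reviewer's comments here...",
        "reviewRating": {
            "@type": "Rating",
            "worstRating": "1",
            "bestRating": "5",
            "ratingValue": "5"
        }
    }
}
</script>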

There are some do’s and don’ts when it comes to schema + reviews

I asked David if he could share some expert review-oriented tips and he replied,

“Well, in typical fashion, Google has been fickle with their rich snippet guidelines. They didn’t allow the marking up of third-party reviews, then they did, now they don’t again. So, I think it would be a good idea for businesses to begin collecting reviews directly from their customers through their site or through email. Of course, I would not suggest neglecting the other online review sources, because those are important, too. But when it comes to Google and rich snippets, don’t put all of your eggs (and hopes) in one basket.

As a rule, the reviews should be directly about the main entity on the page. So keep reviews about the business, products, services, etc. separate — don’t combine them, because that goes against Google’s rich snippet guidelines.”

And any warnings about things we should never do with schema? David says,

“Never mark up anything that is not visible on the page, including reviews, ratings and aggregate ratings. Only use review markup for the entities that Google allows it to be used for. For example, the review and rating markup should not be used for articles or on-page content. That goes against Google’s guidelines. And as of this writing, it’s also against their guidelines to mark up third-party reviews and ratings such as those found on Google+ or Yelp.”

Ready to dig deeper into the engrossing world of schema markup with David Deering? I highly recommend this recent LocalU video. If the work involved makes you dizzy, hiring an expert or purchasing a paid service are likely to be worthwhile investments. Now that we’ve considered our technical options, let’s consider what we’d like to achieve.

Define your consumer feedback page goals.


If I could pick just one consultant to get advice from concerning the potential benefits of local consumer feedback, it would be GetFiveStars’ co-founder and renowned local SEO, Mike Blumenthal.

Before we dive in with Mike, I want to offer one important clarification:

If you’re marketing a single-location business, you’ll typically be creating just one consumer feedback page on your website to represent it, but if yours is a multi-location business, you’ll want to take the advice in this article and apply it to each city landing page on your website, including unique user sentiment for each location. For more on this concept, see Joy Hawkins’ article How to Solve Duplicate Content Local SEO Issues for Multi-Location Businesses.

Now let’s set some goals for what a consumer feedback page can achieve. Mike breaks this down into two sections:

1. Customer-focused

  • Create an effective page that ranks highly for your brand so that it becomes a doorway page from Google.
  • Make sure that the page is easily accessible from your selling pages with appropriately embedded reviews and links so that it can help sell sitewide.

2. Google-focused

  • Get the page ranking well on brand and brand+review searches
  • Ideally, get designated with review stars
  • Optimally, have it show in the knowledge panel as a source for reviews from the web

This screenshot illustrates these last three points perfectly:


Time on page may make you a believer!

Getting excited about consumer feedback pages yet? There’s more! Check out this screenshot from one of Mike’s showcase clients, the lovely Barbara Oliver Jewelry in Williamsville, NY, and pay special attention to the average time spent on http://barbaraoliverandco.com/reviews-testimonials/:


When customers are spending 3+ minutes on any page of a local business website, you can feel quite confident that they are really engaging with the business. Mike says,

“For Barbara, this is an incredibly important page. It reflects almost 9% of her overall page visits and represents almost 5% of the landing pages from the search engines. Time on the page for new visitors is 4 minutes with an average of over 3 minutes. This page had review snippets until she recently updated her site — hopefully they will return. It’s an incredibly important page for her.”

Transparency helps much more than it hurts.

The jewelry store utilizes GetFiveStars technology, and represents a perfect chance to ask Mike about a few of the finer details of what belongs on consumer feedback pages. I had noticed that GetFiveStars gives editorial control to owners over which reviews go live, and wanted to get Mike’s personal take on transparency and authenticity. He says,

“I strongly encourage business owners to show all feedback. I think transparency in reviews is critical for customer trust and we find that showing all legitimate feedback results in less than a half-point decline in star ratings on average.

That being said, I also recommend that 1) the negative feedback be held back for 7 to 10 days to allow for complaint resolution before publishing and 2) that the content meet basic terms of service and appropriateness that should be defined by each business. Obviously you don’t want your own review site to become a mosh pit, so some standards are appropriate.

I am more concerned about users than bots. I think that a clear statement of your terms of service and your standards for handling these comments should be visible to all visitors. Trust is the critical factor. Barbara Oliver doesn’t yet have that, but only because she has recently updated her site. It’s something that will be added shortly.”

Respond to on-page reviews just as you would on third-party platforms.

I’d also noticed something that struck me as uncommon on Barbara Oliver Jewelry’s consumer feedback page: she responds to her on-page reviews, just as she would on third-party review platforms. Mike explains:

“In the ‘old’ days of reviews, I always thought that owner responses to positive reviews were a sort of glad handing … I mean how many times can you say ‘thank you’? But as I researched the issue it became clear that a very large minority of users (40%) noted that if they took the time to leave feedback or a review, then the owner should acknowledge it. That research convinced me to push for the feature in GetFiveStars. With GetFiveStars, the owner is actually prompted to provide either a private or public response. The reviewer receives an email with the response as well. This works great for both happy and unhappy outcomes and serves double-duty as a basis for complaint management on the unhappy side.

You can see the evolution of my thinking in these two articles:

What I used to think: Should A Business Respond to Every Positive Review?
What I think after asking consumers their thoughts: Should A Business Respond to Every Positive Review? Here’s The Consumer View.

Reviews on your mind, all the time

So, basically, consumers have taught Mike (and now all of us!) that reasonable goals for reviews/testimonials pages include earning stars, becoming a knowledge panel review source, and winning a great average time on page, in addition to the fact that transparency and responsiveness are rewarded. Before he zooms off to his next local SEO rescue, I wanted to ask Mike if anything new is exciting him in this area of marketing. Waving goodbye, he shouts:

“Sheesh … I spend all day, every day thinking about these sorts of things. I mean my motto used to be ‘All Local, All the Time’… now it’s just ‘All Reviews, All the Time.’

I think that this content that is generated by the business owner, from known clients, has incredible import in all aspects of their marketing. It is great for social proof, great user-generated content, customer relations, and much more. We are currently ‘plotting’ new and valuable ways for businesses to use this content effectively and easily.

I’m experimenting right now with another client, Kaplan Insurance, to see exactly what it takes to get rich snippets these days.”

I know I’ll be on the lookout for a new case study from Mike on that topic!

Plan out the components of your consumer feedback page


Phil Rozek of Local Visibility System is one of the most sophisticated, generous bloggers I know in the local SEO industry. You’ll become an instant fan of his, too, once you’ve saved yourself oodles of time using his Ultimate List of Review Widgets and Badges for Your Local Business Website. And speaking of ‘ultimate,’ here is the list Phil and I brainstormed together, each adding our recommended components, for the elements we’d want to see on a consumer feedback page:

  • Full integration into the site (navigation, internal linking, etc.); not an island page.
  • Welcoming text intro with a link to review content policy/TOS
  • Unique sentiment with schema markup (not drawn from third parties)
  • Specification of the reviewers’ names and cities
  • Owner responses
  • Paginate the reviews if page length starts getting out of hand
  • Provide an at-a-glance average star rating for easy scanning
  • Badges/widgets that take users to the best place to leave a traditional third-party review. Make sure these links open in a new browser tab! (See the markup sketch after this list.)
  • Video reviews
  • Scanned hand-written testimonial images
  • Links to critic-type reviews (professional reviews at Zagat, Michelin, etc.)
  • A link to a SERP showing more of the users’ reviews, signalling authenticity rather than editorial control
  • Tasteful final call-to-action
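To make the average-rating and badge items above a bit more concrete, here is a stripped-down HTML sketch. The profile URLs and class name are placeholders for your own listings:

<!-- At-a-glance average rating, kept visible on the page itself -->
<p class="average-rating">Rated 4.8 out of 5 stars, based on 27 reviews</p>

<!-- Badges/links to third-party review profiles; open them in a new tab so visitors keep your page -->
<a href="https://www.google.com/maps/place/your-business" target="_blank" rel="noopener">Review us on Google</a>
<a href="https://www.yelp.com/biz/your-business" target="_blank" rel="noopener">Find us on Yelp</a>
<a href="https://www.facebook.com/your-business/reviews" target="_blank" rel="noopener">Review us on Facebook</a>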

And what might such a page look like in real life (or at least, on the Internet)? Here is my mockup for a fictitious restaurant in Denver, Colorado, followed by a key:


Key to the mockup:

  1. Page is an integral part of the top level navigation
  2. Welcoming text with nod to honesty and appreciation
  3. Link to review content policy
  4. Paginated on-page reviews
  5. Call-to-action button to leave a review
  6. Easy-to-read average star rating
  7. Schema marked-up on-page reviews
  8. Sample owner response
  9. Links and badges to third party reviews
  10. Link to SERP URL featuring all available review sources
  11. Links to professional reviews
  12. Handwritten and video testimonials
  13. Tasteful final call-to-action to leave a review

Your live consumer feedback page will be more beautifully and thoughtfully planned than my example, but hopefully the mockup has given you some ideas for a refresh or overhaul of what you’re currently publishing.

Scanning the wild for a little sentiment management inspiration

I asked Phil if he’d seen any local businesses recently making a good effort at promoting consumer feedback. He pointed to a few, with the proviso that none of them are 100% perfect but that they should offer some good inspiration. Don’t you just totally love real-world examples?

Lightning round advice for adept feedback acquisition

Before we let Phil get back to his work as “the last local SEO guy you’ll ever need,” I wanted to take a minute to ask him for some tips on encouraging meaningful customer feedback.

“Don’t ask just once. In-person plus an email follow-up (or two) is usually best. Give customers choices and always provide instructions. Ask in a personal, conversational way. Rotate the sites you ask for reviews on. Try snail-mail or the phone. Have different people in your organization ask so that you can find ‘The Champ’,” says Phil. “Encourage detail, on-site and off-site. Saying things like ‘It will only take you 60 seconds’ may be great for getting big numbers of on-site testimonials, but the testimonials will be unhelpfully short or, worse, appear forced or fake. Dashed-off feedback helps no one. By the way, this can help you even if a given customer had a bad experience; if you’re encouraging specifics, at least he/she is a little more likely to leave the kind of in-depth feedback that can help you improve.”

Sustain your effort & facilitate your story

Every time Google sharpens focus on a particular element of search, as they are clearly doing right now with consumer and professional sentiment, it’s like a gift. It’s a clanging bell, an intercom announcement, a handwritten letter letting all of us know that we should consider shifting new effort toward a particular facet of marketing and see where it gets us with Google.

In this specific case, we can draw extra inspiration for sustaining ourselves in the work ahead from the fact that Google’s interest in reviews and testimonials intersects with the desires of consumers who make transactional decisions based, in part, on what Internet sentiment indicates about a local business. In other words, the effort you put into acquiring and amplifying this form of UGC makes Google, consumers, and your company happy, all in one fell swoop.

If you took all of the sentiment customers express about any given vibrant business and put it into a book, it would end up reading something like War and Peace. The good news about this is that you don’t have to write it — you have thousands of potential volunteer Tolstoys out there to do the job for you, because reviewing businesses has become a phenomenal modern hobby.

Your job is simply to provide a service experience (hopefully a good one) that moves customers to start typing, back that up with a variety of ongoing feedback requests, and facilitate the publication of sentiment in the clearest, most user-friendly way.

Some more good news? You don’t have to do all of this tomorrow. I recently saw a Google review profile on which a business had “earned” over 100 reviews in a week — a glaring authenticity fail, for sure. A better approach is simply to keep the sentiment conversation going at a human pace, engaging with your customers in a human way, and ensuring that your consumer feedback page is as good as you can possibly make it. This is manageable — you can do this!

Are you experimenting with any page elements or techniques that have resulted in improved user feedback? Please inspire our community by sharing your tips!



Penguin 4.0: How the Real-Time Penguin-in-the-Core-Alg Model Changes SEO – Whiteboard Friday

Posted by randfish

The dust is finally beginning to settle after the long-awaited rollout of Penguin 4.0. Now that our aquatic avian friend is a real-time part of the core Google algorithm, we’ve got some changes to get used to. In today’s Whiteboard Friday, Rand explains Penguin’s past, present, and future, offers his analysis of the rollout so far, and gives advice for going forward (hint: never link spam).


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week, it is all about Google Penguin. So Google Penguin is an algorithm that’s been with us for a few years now, designed to combat link spam specifically. After many, many years of saying this was coming, Penguin 4.0 rolled out on Friday, September 23rd. It is now real-time in Google’s algorithm, Google’s core algorithm, which means that it’s constantly updating.

So there are a bunch of changes. What we’re going to talk about today is what Penguin 1.0 to 3.x looked like and how that’s changed as we’ve moved to the Penguin 4.0 model. Then we’ll cover a little bit of what the rollout has looked like and how it’s affecting folks’ sites and specifically some recommendations. Thankfully, we don’t have a ton.

Penguin 1.0–3.x

But it’s important to understand, if people ask you about Penguin or about the penalties that used to come from Penguin, that back in the day…

  • Penguin 1.0 to 3.x, it used to run intermittently. So every few months, Google would collect a bunch of information, they’d run the algorithm, and then they’d release it out in the wild. It would now be in the search results. When that rollout happened, that was the only time, pretty much the only time that penalties from Penguin specifically would be given to websites or removed.

    This meant that a lot of the time, you had this slow process, where if you got penalized by Penguin, you did something bad, you did some sketchy link building, you went through all the processes of getting that penalty lifted, Google said, “Fine, you’re in good shape. The next time Penguin comes out, your penalty is lifted.” You could wait months. You could wait six months or more before that penalty got lifted. So a lot of fear here and a lot of slowness on Google’s side.

  • Penguin also penalized, much like Panda, where it looks at a portion of the site, these pages maybe are the only ones on this whole domain that got bad links to them, but old Penguin did not care. Penguin would hit the entire website.

    It would basically say, “No, you’re spamming to those pages, I’m burying your whole domain. Every page on your site is penalized and will not be able to rank well.” Those sorts of penalties are very, very tough for a lot of websites. That, in fact, might be changing a little bit with the new Penguin algorithm.
  • Old Penguin also required a reconsideration request process, often in conjunction with disavowing old links, proving to Google that you had gone through the process of trying to get those links removed.

    It wasn’t often enough to just say, “I’ve disavowed them.” You had to tell Google, “Hey, I tried to contact the site where I bought the links or I tried to contact the private blog network, but I couldn’t get them to take it down or I did get them to take it down or they blackmailed me and forced me to pay them to take it down.” Sometimes people did pay and Google said that was bad, but then sometimes would lift the penalties and sometimes they told them, “Okay, you don’t have to pay the extortionist and we’ll lift the penalty anyway.” Very manual process here.

  • Penguin 1.0 to 3.x was really designed to remove the impact of link spam on search results, but doing it in a somewhat weird way. They were doing it basically through penalties that affected entire websites that had tried to manipulate the results and by creating this fear that if I got bad links, I would be potentially subject to Penguin for a long period.

I have a theory here. It’s a personal theory. I don’t want you to hold me to it. I believe that Google specifically went through this process in order to collect a tremendous amount of information on sketchy links and bad links through the disavow file process. Once they had a ginormous database of what sketchy and spammy bad links looked like, that they knew webmasters had manually reviewed and had submitted through the disavowal file and thought could harm their sites and were paid for or just links that were not editorially acquired, they could then machine learn against that giant database. Once they’ve acquired enough disavowals, great. Everything else is gravy. But they needed to get that huge sample set. They needed it not to just be things that they, Google, could identify but things that all of us distributed across the hundreds of millions of websites on the planet could identify. Using those disavowal files, Google can now make Penguin more real-time.

Penguin 4.0+

So challenges here, this is too slow. It hurt too much to have that long process. So in the new Penguin 4.0 and going forward, this runs as part of the core algorithm, meaning…

  • As soon as Google crawls and indexes a site and is able to update that in their databases, that site’s penalty is either lifted or incurred. So this means that if you get sketchy links, you don’t have to wait for Penguin to come out. You could get hurt tomorrow.
  • Penguin no longer necessarily penalizes an entire domain. It still might. It could be the case that if lots of pages on a domain are getting sketchy links, or some substantive portion, or Google thinks you’re just too sketchy, they could penalize you.

    Remember, Penguin is not the only algorithm that can penalize websites for getting bad links. There are manual spam penalties, and there are other forms of spam penalties too. Penguin is not alone here. But it may be simply taking the pages that earn those bad links and discounting those links or using different signals, weighting different signals to rank those pages or search results that have lots of pages with sketchy links in them.
  • It is also the case — and this is not 100% confirmed yet — but some early discussion between Google’s representatives and folks in the webmaster and SEO community has revealed to us that it may not be the case that Penguin 4.0 and moving forward still requires the full disavow and whole reconsideration request process.

That’s not to say that if you incur a penalty, you should not go through this. But it may not be the case that’s the only way to get a penalty lifted, especially in two cases — no fault cases, meaning you did not get those links, they just happened to come to you, or specifically negative SEO cases.

I want to bring up Marie Haynes, who does phenomenally good work around spam penalties, along with folks like Sha Menz and Alan Bleiweiss; all three of them have been concentrating on Google penalties along with many, many other SEOs and webmasters. But Marie wrote an excellent blog post detailing a number of case studies, including a negative SEO case study where the link penalty had been lifted on the domain. You can see her results there; she’s got some nice visual graphs showing the keyword rankings changing after Penguin’s rollout. I urge you to read it, and we’ll make sure to link to it in the transcript of this video.

  • Penguin 4.0 is a little bit different from Penguin 1.0 to 3 in that it’s still designed to remove the impact of spam links on search results, but it’s doing it by not counting those links in the core algo and/or by less strongly weighting links in search results where many folks are earning spammy links.

So, for example, your PPC searches (porn, pills, casino), those types of queries may be places where Google says, “You know what? We don’t want to interpret, because all these folks have nasty links pointing to them, we are going to weight links less. We’re going to weight other signals higher.” Maybe it’s engagement and content and query interpretation models and non-link signals that are offsite, all those kinds of things, clickstream data, whatever they’ve got. “We’re going to push down the value of either these specific links or all links in the algo as we weight them on these types of results.”

Penguin 4.0 rollout

So this is what we know so far. We definitely will keep learning more about Penguin as we have more experience with it. We also have some information on the rollout.

  • Started on Friday, September 23rd; few people noticed any changes at first.

In fact, the first few days were pretty slow, which makes sense. It fits with what Google said about the rollout being real-time and them needing time to crawl and index and then refresh all this data. So until it rolls out across the full web and Google’s crawled and indexed all the pages, gone through processing, we’re not going to get there. So little effect that same day, but…

  • More SERP flux started three to five days after, that next Monday, Tuesday, Wednesday. We saw very hot temperatures starting that next week in MozCast, and Dr. Pete has been detailing those on Twitter.
  • As far as SEOs noticing, yes, a little bit.

So I asked the same poll on Twitter twice, once on September 27th and once on October 3rd, so about a week apart. Here is the data we got: the “Nope, nothing yet” response went from 76% to 72%, so a little more than a quarter of SEOs have noticed some changes.

A lot of folks noticing rankings went up. Moz itself, in fact, benefitted from this. Why is that the case? Well, any time a penalty rolls out to a lot of other websites, bad stuff gets pushed down and those of us who have not been spamming move up in the rankings. Of course, in the SEO world, which is where Moz operates, there are plenty of folks getting sketchy links and trying things out. So they were higher in the rankings, they moved down, and Moz moved up. We saw a very nice traffic boost. Thank you, Google, for rolling out Penguin. That makes our Audience Development team’s metrics look real good.

Four percent and then six percent said they saw a site or page under their control get penalized, and two percent and then one percent said they saw a penalty lifted. So a penalty lifted is still pretty light, but there are some penalties coming in. There are a few of those. Then there’s the nice benefit: if you don’t link spam, you do not get penalized. Every time Google improves on the Penguin algorithm, every time they improve on any link spam algorithm, those of us who don’t spam benefit.

It’s an awesome thing, right? Instead of cheering against Google, which you do if you’re a link spammer and you’re very nervous, you get to cheer for Google. Certainly Penguin 4.0 is a good time to cheer for Google. It’s brought a lot of traffic to a lot of good websites and pushed a lot of sketchy links down. We will see what happens as far as disavows and reconsideration requests for the future.

All right, everyone, thanks for joining. Look forward to hearing about your experiences with Penguin. We’ll see you next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



How Your Brand Can Create an Enviable Customer Experience for Mobile Web Searchers

Posted by ronell-smith


Not very edible corned beef hash

Here I am, seated in a Manhattan, New York restaurant, staring at corned beef hash that looks and tastes like what I imagine dog food to look and taste like.

I’m pissed for two reasons:

  • It cost nearly $25 and was entirely inedible
  • I should have known better given the visuals depicted after doing a Google image search to find the dish, which was offered at a nearby restaurant

In retrospect, I should have checked both of those things on my phone before ordering the $25 plate of Alpo. And though I didn’t do that, other would-be customers will, which means the business owner or SEO had better follow the steps below if they wish to stay in business.

The bad news is I no longer relish the thought of eating at high-end NY restaurants; the good news is this experience totally reshaped the way I view mobile, opening my eyes to simple but very effective tactics businesses of all types can immediately put to use for their brands.

My mobile education

We’ve all heard how mobile is transforming the web experience, reshaping the landscape for marketers, brands and consumers.


As marketers, we now have to account for how our content will be accessed and consumed on mobile devices, whether that’s a phone, tablet or phablet. As brands, we realize our efforts will be judged not only on how well or how high we show up in the SERPs, but also on how much we can delight the on-the-go prospect who needs information that’s (a) fast, (b) accurate and (c) available from any device.

As prospects and consumers, we’ve come to know and value customer experience, in large part because brands that use mobile to deliver what we need, when we need it, in a way that’s easily consumed have earned our attention — and maybe even our dollars.

But that’s where the similarities seemingly end. Marketers and brands seem to get so wrapped up in the technology (responsive design, anyone?) that they forget that, at the end of the day, prospects want what they want right now — in the easiest-to-access way possible.

I’ve come to believe that, while marketers appreciate the overall value of mobile, they have yet to realize how, for customers, it’s all about what it allows them to accomplish.

At the customer/end-user level it’s not about mobile-friendly or responsive design; it’s about creating an enviable customer experience, one web searchers will reward you for with traffic, brand mentions and conversions.

I was alerted to the prominence of mobile phone use by noticing how many people sit staring at their phones while out at dinner, even as family members and friends are seated all around them. “How rude,” I thought. Then I realized it wasn’t only the people at restaurants; it’s people everywhere: walking down the street, driving (sadly and dangerously), sitting in movie theaters, at work, even texting while they talk on the phone.

One of my favorite comments with regard to mobile’s dominance comes from the Wizard of Moz himself, who shared this tweet and accompanying image last year:

Mobile isn’t killing desktop. It’s killing all our free time. pic.twitter.com/pXb7F7aWsP
— Rand Fishkin (@randfish) December 20, 2015

But my “aha!” moment happened last year, in Manhattan, during the corned beef hash episode.

After working until brunch, I…

  1. Opened iPhone to Google
  2. Typed “Best corned beef hash near me”
  3. Scanned the list of restaurants by distance and reviews
  4. Selected the closest restaurant having > 4-star review ratings
  5. Ended up disappointed

That’s when it hit me that I’d made errors of omission at every step, in large part by leaving one very important element out of the process, but also by not thinking like a smart web user.

Normally my process is as follows, when I wish to enjoy a specific meal while traveling:

  1. Open iPhone to Google Search box
  2. Type “Best _________ near me”
  3. Scan list of restaurants by distance and reviews
  4. Select a restaurant that not only has a > 4-star overall rating but also has excellent reviews (> 4.5) of the dish I want and great images of the dish online
  5. Delight ensues

That’s when three things occurred to me like a brickbat to the noggin’:

  • This is a process I use quite often and is one that has proved quite foolproof
  • It’s undoubtedly a process many other would-be customers are using to identify desirable products and services
  • Marketers can reverse-engineer the process to bring the customers they’re hoping for to their doors or websites.

(Eds. note: This post was created with small business owners (single or multiple location), or those doing Local SEO for SMBs, in mind, as I hope to inform them of how many individuals think about and use mobile, and how marketers can get in front of them with relevant content. Also, I’d like to thank Cindy Krum of Mobile Moxie for encouraging me to write this post, and Local SEO savant Phil Rozek of Local Visibility System for making sure I colored within the lines.)

Five ways to create an enviable customer experience on mobile

#1 — Optimize your images

Image optimization is the quintessential low-hanging fruit of online marketing: easy to accomplish but typically overlooked.

For our purposes, we aren’t so much making them “mobile-friendly” as we are making them search-friendly, increasing the likelihood that Google’s crawlers can better decipher what they contain and deliver them for the optimal search query.

First and foremost, do not use a stock image if your goal is for searchers to find, read and enjoy your content. Just don’t. Also, given how much of a factor website speed is, minify your images to ensure they don’t hamper page speed load times.

But the three main areas I want us to focus on are file name, alt text and title text, and captions. My standard for each is summed up very well in a blog post from Ian Lurie, who proposes an ingenious idea:

The Blank Sheet of Paper Test: If you wrote this text on a piece of paper and showed it to a stranger, would they understand the meaning? Is this text fully descriptive?

With this thinking in mind, image optimization becomes far simpler:

  • File name: We’re all adults here — don’t be thickheaded and choose something like “DSC9671.png” when “cornedbeefhash.jpg” clearly works better.
  • Alt text and title text: Given that, in Google’s eyes, these two are the priorities, you must make certain they’re as descriptive as possible. Clearly list what the image is and/or contains without weighing it down with unneeded text. Using the corned beef hash from above as an example, “corned beef hash with minced meat” would be great, but “corned beef hash with minced meat and diced potatoes” would work better, alerting me that the dish isn’t what I’m looking for. (I prefer shredded beef and shredded potatoes.)
  • Caption: Yes, I know these aren’t necessary for every post, but why leave your visitors hanging, especially if an optimal customer experience is the goal? Were I to caption the corned beef, it’d be something along the lines of “Corned beef hash with minced meat and diced potatoes is one of the most popular dishes at XX.” It says just enough without trying to say everything, which is the goal, says Lurie.

“’Fully descriptive’ means ‘describes the thing to which it’s attached,’ not ‘describe the entire universe,'” he adds.

Also, invite customers to take and share pictures online (e.g., websites, Instagram, Yelp, Google) and include as much rich detail as possible.
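Putting the file name, alt text, title text, and caption advice together, a marked-up image on your own site might look something like this. The details are illustrative (borrowed from the corned beef hash example above), not prescriptive:

<figure>
    <img src="cornedbeefhash.jpg"
         alt="Corned beef hash with minced meat and diced potatoes"
         title="Corned beef hash with minced meat and diced potatoes">
    <figcaption>Corned beef hash with minced meat and diced potatoes is one of the most popular dishes at XX.</figcaption>
</figure>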

What’s more, it might behoove you to have a Google Business View photo shoot, says Rozek. “Those show up most prominently (in the Knowledge Panel) for brand-name mobile searches in Google.”

#2 — Make reviews a priority

Many prospects and customers use reviews as a make-or-break tactic when making purchases. Brands, realizing this, have taken note, making it their charge to get positive reviews.

But not all reviews are created equal.

Instead of making certain your brand gets positive reviews on the entirety of its products and services, redouble your efforts at getting positive reviews on your bread-and-butter services.

In many instances, what people have to say about your individual services and/or products matters more than your brand’s overall review ratings.

I learned this from talking to several uber-picky foodie friends who shared that the main thing they look for is a brand having an overall rating (e.g., on Yelp, Google, Angie’s List, Amazon, etc.) higher than 3.5 that also has customer comments glorifying the specific product they’re hoping to enjoy.

“These days, everyone is gaming the system, doing what they can to get their customers to leave favorable reviews,” said one friend, who lives in Dallas. “But discerning [prospects] are only looking at the overall rating as a beginning point. From there, they’re digging into the comments, looking to see what people have to say about the very specific thing they want. [Smart brands] would focus more on getting people to leave comments about the particular service they used, how happy they were with the result and how it compares to other [such services they’ve used]. We may be on our phones, but we’re still willing to dig into those comments.”

To take advantage of this behavior,

  • In addition to asking for a favorable review, ask customers to comment on the specific services they used, providing as much detail as possible
  • Redouble your efforts at over-delivering on quality service when it comes to your core offerings
  • Ask a few of your regulars, who have left comments on review sites, what they think meets the minimum expectation for provoking folks to leave a review (e.g., optimizing for the desired behavior)
  • Encourage reviewers to upload photos with their reviews (or even just photos, if they don’t want to review you). They’re great “local content,” they’re useful as social-proof elements, and your customers may take better pictures than you do, in which case you can showcase them on your site.


#3 — Shorten your content

I serve as a horrible spokesperson for content brevity, but it matters a great deal to mobile searchers. What works fine on desktop is a clutter-fest on mobile, even for sites using responsive design.

As a general rule, simplicity wins.

For example, Whataburger’s mobile experience is uncluttered, appealing to the eye and makes it clear what they want me to do: learn about their specials or make a purchase:


On the other hand, McDonald’s isn’t so sure what I’m looking for, apparently:


Are they trying to sell me potatoes, convince me of how committed they are to freshness or looking to learn as much as they can about me? Or all of the above?

Web searchers have specific needs and are typically short on time and patience, so you have to get in front of them with the right message to have a chance.

When it comes to the content you deliver, think tight (shorter), punchy (attention-grabbing) and valuable (on-message for the query).

#4 — Optimize for local content

Like all of you, I’ve been using “near me” searches for years, especially when I travel. But over the last year, these searches have gotten more thorough and more accurate, in large part as a result of Google’s Mobile Update and because the search giant is making customer intent a priority.

In 2015, Google reported that “near me” searches had increased 34-fold since 2011.

And though most of these “near me” searches are for durable goods/appliances and their associated retailers, searches for services that typically sit in a high-consideration set (“surgeons near me,” “plumbers near me,” “jobs near me,” etc.) are growing considerably, according to Google via its website, thinkwithgoogle.com.

A recent case study of 82 websites (41 in the control group; 41 in the test group) shows just how dramatic the impact of optimizing a site for local intent can be. By tweaking the hours and directions page titles, descriptions and H1s to utilize the phrases “franchise dealer near me” and “nearest franchise dealer,” the brand saw mobile impressions for “near me” searches more than double, to 8,833 impressions and 46 clicks. (The control group’s “near me” impression share only rose 11%.)

[Chart: “near me” mobile impressions for the test group vs. control group. Image courtesy of CDK Global]
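If you want to try something similar on your own hours and directions page, the tweak can be as simple as working that phrasing into the title, meta description, and H1. A hypothetical sketch (the dealer name and copy are placeholders):

<title>Nearest Franchise Dealer in Denver | Hours and Directions | Example Motors</title>
<meta name="description" content="Searching for a franchise dealer near me? Get hours and directions for Example Motors, your nearest franchise dealer in Denver.">

<!-- further down, on the page itself -->
<h1>Your Nearest Franchise Dealer: Hours and Directions</h1>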

Additional steps for optimizing your site for “near me” searches

  • Prominently display your business name, address and phone number (aka, NAP) on your site
  • Use schema markup in your NAP (a minimal sketch follows this list)
  • In addition to proper setup and optimization of your Google My Business listing, provide each location with its own listing and, just as important, ensure that the business name, address and phone number of each location matches what’s listed on the site
  • Consider embedding a Google Map prominently on your website. “It’s good for user experience,” says Rozek. “But it may also influence rankings.”
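Here’s the NAP schema sketch referenced above, in JSON-LD. The business details and coordinates are placeholders; whatever you publish should match the name, address and phone number shown on the page and in your Google My Business listing:

<script type="application/ld+json">
{
    "@context": "http://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Business Name",
    "telephone": "+1-303-555-0123",
    "url": "http://www.example.com/",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example Street",
        "addressLocality": "Denver",
        "addressRegion": "CO",
        "postalCode": "80202"
    },
    "geo": {
        "@type": "GeoCoordinates",
        "latitude": "39.7392",
        "longitude": "-104.9903"
    }
}
</script>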

#5 — Use Google App Deep Linking

We’ve all heard the statistics: The vast majority — in some circles the figure is 95% — of apps downloaded to mobile devices are never used. Don’t be deceived, however, into believing apps are irrelevant.

Nearly half of all time spent on the web is in apps.

This means that the mobile searchers looking for products or services in your area are likely using an app or, at the very least, prompted to enter/use an app.

For example, when I type “thai restaurant near me,” the first organic result is TripAdvisor.


Upon entering the site, the first (and preferred) action the brand would like me to take is downloading the TripAdvisor app.


Many times, a “near me” search will take us to content within an app, and we won’t even realize it until we see the “continue in XX app or visit the mobile site” banner.

And if a searcher doesn’t have the app installed, “Google can show an app install button. So, enabling your app for Google indexing could actually increase the installed base of the app,” writes Eric Enge of Stone Temple Consulting.

For brands, App Deep Linking (ADL), which he defines as “the ability for Google to index content from within an app and then display it as mobile search results,” has huge implications if utilized properly.

“Think about it,” he writes. “If your app is not one of the fortunate few that get most of the attention, but your app content ranks high in searches, then you could end up with a lot more users in your app than you might have had otherwise.”

(To access details on how to set up Google App Deep Linking, read Enge’s Search Engine Land article: SMX Advanced recap: Advanced Google App Deep Linking)
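As a rough idea of what that web-to-app connection looks like, Google’s App Indexing has historically relied on rel="alternate" link elements (or equivalent sitemap entries) pointing from a web page to the matching screen inside the app. The package name, iTunes ID, scheme, and paths below are placeholders:

<link rel="alternate" href="android-app://com.example.restaurantguide/http/example.com/thai-restaurants-near-me" />
<link rel="alternate" href="ios-app://123456789/exampleapp/thai-restaurants-near-me" />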

If your brand has an app, this is information you shouldn’t sleep on.

Typically, when I conduct a “near me” search, I click on/look through the images until I find one that fits what I’m looking for. Nine times out of ten (depending upon what I’m looking for), I’m either taken to content within an app or taken to a mobile site and prompted to download the app.

Seems to me that ADL would be a no-brainer.

Optimizing for mobile is simply putting web searchers first

For all the gnashing of teeth Google’s many actions/inactions provoke, the search giant deserves credit for making the needs of web searchers a priority.

Too often, we, as marketers, think first and foremost in this fashion:

  1. What do we have to sell?
  2. Who needs it?
  3. What’s the cheapest, easiest way to deliver the product or service?

I think Google is saying to us that the reverse needs to occur:

  1. Make it as fast and as easy as possible for people to find what they want
  2. Understand who is likely to be looking for it by better understanding our customers and their intent
  3. Begin the sales process by asking “what specific needs do web searchers have that my brand is uniquely qualified to fulfill?”

In this way, we’re placing the needs of web searchers ahead of the needs of the brand, which will be the winning combination for successful companies in the days ahead.

Brands will either follow suit or fall by the wayside.



SEO Trek: The Search for Google RankBrain* [New Data]

Posted by larry.kim

Rand Fishkin posted another brilliant Whiteboard Friday last week on the topic of optimizing for RankBrain. In it, he explained how RankBrain helps Google select and prioritize signals it uses for ranking.

One of the most important signals Google takes into account is user engagement. As Rand noted, engagement is a “very, very important signal.”

Engagement is a huge but often ignored opportunity. That’s why I’ve been a bit obsessed with improving engagement metrics.

My theory has been that RankBrain* and/or other machine learning elements within Google’s core algorithm are increasingly rewarding pages with high user engagement. Not always, but it’s happening often enough that it’s kind of a huge deal.

Google is looking for unicorns – and I think that machine learning is Google’s ultimate Unicorn Detector.

Now, when I say unicorns, I mean those pages that have magical engagement rates that elevate them above the other donkey pages Google could show for a given query. Like if your page has a 5 percent click-through rate (CTR) when everyone else has a 1 percent CTR.

What is Google’s mission? To provide the best results to searchers. One way Google does this is by looking at engagement data.

If most people are clicking on a particular search result – and then also engaging with that page – these are clear signals to Google that people think this page is fascinating. That it’s a unicorn.


RankBrain: Into Darkness

RankBrain, much like Google’s algorithm, is a great mystery. Since Google revealed (in a Bloomberg article just under a year ago) the important role of machine learning and artificial intelligence in its algorithm, RankBrain has been a surprisingly controversial topic, generating speculation and debate within the search industry.

Then, we found out in June that Google RankBrain was no longer just for long-tail queries. It was “involved in every query.”

We learned quite a few things about RankBrain. We were told by Google that you can’t optimize for it. Yet we also learned that Google’s engineers don’t really understand what RankBrain does or how it works.

Some people have even argued that there is absolutely nothing you can do to see Google’s machine learning systems at work.

Give me a break! It’s an algorithm. Granted, a more complex algorithm thanks to machine learning, but an algorithm nonetheless. All algorithms have rules and patterns.

When Google tweaked Panda and Penguin, we saw it. When Google tweaked its exact-match domain algorithm, we saw it. When Google tweaked its mobile algorithm, we saw it.

If you carefully set up an experiment, you should be able to isolate some aspect of what Google is proclaiming as the third most important ranking factor. You should be able to find evidence – a digital fingerprint.

Well, I say it’s time to boldly go where no SEO has gone before. That’s what I’ve attempted to do in this post. Let’s look at some new data.

The search for RankBrain [New Data]

What you’re about to look at is organic search click-through rate vs. the average organic search position for three separate 30-day periods ending April 30, July 12, and September 19 of this year. This data, obtained from the Google Search Console, tracked the same keywords in the Internet marketing niche.

I see some of the most compelling evidence of RankBrain (and/or other machine learning search algorithms!) at work.

The shape of the CTR vs. ranking curve is changing every month – for the 30 days ending:

  • April 30, 2016, the average CTR for top position was about 22 percent.
  • July 12, 2016, the average CTR rose to about 24 percent.
  • By September 19, 2016, the average CTR increased to about 27 percent.

The top, most prominent positions are getting even more clicks. Obviously, they were already getting a lot of clicks. But now they’re getting more clicks than they have in recent history.

This is the winner-take-all nature of Google’s organic SERPs today. It’s coming at the expense of Positions 4–10, which are being clicked on much less over time.

Results that are more likely to attract engagement are pushed further up the SERP, while results with lower engagement get pushed further down. That’s what we believe RankBrain is doing.

Going beyond the data

This data is showing us something very interesting. A couple thoughts:

  • This is exactly the fingerprint you would expect to see for a machine learning-based algorithm doing query interpretation that impacts rank based on user engagement metrics, such as CTR.
  • Essentially, machine learning systems move away from serving up 10 blue links and asking a user to choose one of them and toward providing the actual correct answers, further eliminating the need for lower positions.

Could anything else be causing this shift in the click curve? Could it have been the elimination of right rail ads?

No, that happened in February. I was careful to use date ranges that were after the right rail apocalypse.

Could it be more Knowledge Graph elements creeping into the SERPs? If that were the case, it would look like everything got pushed down by one position (e.g., Position 1 becomes Position 2, Position 2 becomes Position 3, and so on).

The data didn’t show that happening. We see a bending of the click curve, not a shifting of the curve.

Behold the awesome power of CTR optimization!

OK, so we’ve looked at the big picture. Now let’s look at the little picture to illustrate the remarkable power of CTR optimization.

Let’s talk about guerrilla marketing. Here are two headlines. Which headline do you think has the higher CTR?

  • Guerrilla Marketing: 20+ Examples and Strategies to Stand Out

This was the original headline for an article published on the WordStream blog in 2014.

  • 20+ Jaw-Dropping Guerrilla Marketing Examples

This is the updated headline, which we changed just a few months ago, in the hopes of increasing the CTR. And yep, we sure did!

Before we updated the headline, the article had a CTR of 1 percent and was ranking in position 8. Nothing awesome.

Since we updated the headline, the article has had a CTR of 4.19 percent and is ranking in position 5. Pretty awesome, no?

Increasingly, we’ve been trying to move away from “SEO titles” that look like the original headline, where you have the primary keyword followed by a colon and the rest of your headline. They aren’t catchy enough.

Yes, you still need to include keywords in your headline. But you don’t have to use this tired format, which will deliver (at best) solid but unspectacular results.

To be clear: we only changed the title tag. No other optimization tactics were used.

We didn’t point any links (internal or external) at it. We didn’t add any images or anything else to the post. Nothing.

Changing the title tag changed the CTR, which gave the page “magical points” that resulted in 97 percent more organic traffic.

What does it all mean?

This example illustrates that if you increase your CTR, you’ll see a nice boost in traffic. A higher CTR appears to earn a better position, a better position brings more impressions, and more impressions at a higher CTR mean even more traffic.
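
To make the mechanics concrete, here’s a toy calculation (clicks are just impressions times CTR). The impression numbers are made up, and the 97 percent figure above reflects the page’s total organic traffic, so these toy numbers aren’t meant to reproduce it:

```python
# Toy illustration of how a CTR lift and a rank lift compound.
# The impression counts are assumptions, not WordStream's data.
impressions = {"position 8": 2000, "position 5": 3500}  # assumed monthly impressions

clicks_before = impressions["position 8"] * 0.01    # 1% CTR at position 8
clicks_after = impressions["position 5"] * 0.0419   # 4.19% CTR at position 5

print(f"Estimated clicks before: {clicks_before:.0f}")
print(f"Estimated clicks after:  {clicks_after:.0f}")
print(f"Relative lift: {clicks_after / clicks_before - 1:.0%}")
```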

What’s so remarkable is that this is on-page SEO. No link building was required! Besides, pointing new links at a page wouldn’t raise its click-through rate – a catchier headline would.

What’s also interesting about this is that RankBrain isn’t like other algorithms, say Panda or Penguin, where it was obvious when you got hit. You lost half your traffic!

If RankBrain or a machine learning algorithm impacts your site due to engagement metrics (positive or negative), it’s a much more subtle shift. All your best pages do better. All your “upper class donkey” pages do slightly worse. Ultimately, the two forces cancel each other out, to some extent, so that the SEO alarms don’t go off.

The final frontier

When it comes to SEO, your mission is to seek out every advantage. It’s my belief that organic CTR and website engagement rates impact organic rankings.

So boldly go where many SEOs are failing to go now. Hop aboard the USS Unicorn, make the jump to warp speed, and discover the wonders of those magical creatures.

Oh, and…

Are you optimizing your click-through rates? If not, why not? If so, what have you been seeing in your analytics?



Most SEOs Are No Better than a Coin-Flip at Predicting Which Page Will Rank Better. Can You?

Posted by willcritchlow

We want to be able to answer questions about why one page outranks another.

“What would we have to do to outrank that site?”
“Why is our competitor outranking us on this search?”

These kinds of questions — from bosses, from clients, and from prospective clients — are a standard part of day-to-day life for many SEOs. I know I’ve been asked both in the last week.

It’s relatively easy to figure out ways that a page can be made more relevant and compelling for a given search, and it’s straightforward to think of ways the page or site could be more authoritative (even if it’s less straightforward to get it done). But will those changes or that extra link cause an actual reordering of a specific ranking? That’s a very hard question to answer with a high degree of certainty.

When we asked a few hundred people to pick which of two pages would rank better for a range of keywords, the average accuracy on UK SERPs was 46%. That’s worse than you’d get if you just flipped a coin! This chart shows the performance by keyword. It’s pretty abysmal:
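
As an aside, whether 46% is meaningfully worse than chance depends on how many answers were collected. Here’s a quick sketch of that check; the answer count below is an assumption, not the survey’s real total:

```python
# Quick check: is an observed accuracy of 46% distinguishable from a coin flip?
# The total answer count is assumed for illustration only.
from scipy.stats import binomtest

n_answers = 1000                      # assumed number of UK answers
n_correct = round(n_answers * 0.46)   # 46% accuracy

result = binomtest(n_correct, n_answers, p=0.5, alternative="two-sided")
print(f"p-value vs. a fair coin: {result.pvalue:.3f}")
```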

It’s getting harder to unpick all the ranking factors

I’ve participated in each iteration of Moz’s ranking factors survey since its inception in 2009. At one of our recent conferences (the last time I was in San Diego for SearchLove) I talked about how I used to enjoy it and feel like I could add real value by taking the survey, but how that’s changed over the years as the complexity has increased.

While I remain confident when building strategies to increase overall organic visibility, traffic, and revenue, I’m less sure than ever which individual ranking factors will outweigh which others in a specific case.

The strategic approach looks at whole sites and groups of keywords

My approach is generally to zoom out and build business cases on assumptions about portfolios of rankings, but it’s been on my mind recently as I think about the ways machine learning should make Google rankings ever more of a black box, and cause the ranking factors to vary more and more between niches.

In general, “why does this page rank?” is the same as “which of these two pages will rank better?”

I’ve been teaching myself about deep neural networks using TensorFlow and Keras — an area I’m pretty sure I’d have ended up studying and working in if I’d gone to college 5 years later. As I did so, I started thinking about how you would model a SERP (which is a set of high-dimensional non-linear relationships). I realized that the litmus test of understanding ranking factors — and thus being able to answer “why does that page outrank us?” — boils down to being able to answer a simpler question:

Given two pages, can you figure out which one will outrank the other for a given query?

If you can answer that in the general case, then you know why one page outranks another, and vice-versa.
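
To make that concrete, here’s a minimal sketch of how one might frame the pairwise question as a Keras model: a RankNet-style setup where a shared network scores each page and a sigmoid of the score difference gives the probability that page A outranks page B. The feature vector (and its size) is an assumption; you’d have to build it from whatever signals you can measure.

```python
# Minimal sketch of a pairwise "which page ranks better?" model in Keras.
# Assumes each page is represented by a hand-built numeric feature vector
# (link metrics, relevance scores, etc.); purely illustrative.
from tensorflow import keras
from tensorflow.keras import layers

N_FEATURES = 20  # assumed size of each page's feature vector

# One shared scoring network, applied to both pages (RankNet-style).
scorer = keras.Sequential([
    keras.Input(shape=(N_FEATURES,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(32, activation="relu"),
    layers.Dense(1),  # unbounded "rankability" score
])

page_a = keras.Input(shape=(N_FEATURES,))
page_b = keras.Input(shape=(N_FEATURES,))

# P(page A outranks page B) from the difference of the two scores.
prob_a_wins = layers.Activation("sigmoid")(
    layers.Subtract()([scorer(page_a), scorer(page_b)])
)

model = keras.Model(inputs=[page_a, page_b], outputs=prob_a_wins)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# model.fit([features_a, features_b], labels, ...)  # labels: 1 if A outranked B
```

Training data for a sketch like this would be pairs of pages that rank for the same query, labeled by which one ranked higher.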

It turns out that people are terrible at answering this question.

I thought that answering this with greater accuracy than a coin flip was going to be a pretty low bar. As you saw from the sneak peek of my results above, that turned out not to be the case. Reckon you can do better? Skip ahead to take the test and find out.

(In fact, if you could find a way to test this effectively, I wonder if it would make a good qualifying question for the next Moz ranking factors survey. Should you listen only to the opinions of those experts who are capable of answering with reasonable accuracy? Note that my test that follows isn’t at all rigorous because you can cheat by Googling the keywords — it’s just for entertainment purposes.)

Take the test and see how well you can answer

With my curiosity piqued, I put together a simple test, thinking it would be interesting to see how good expert SEOs actually are at this, as well as to see how well laypeople do.

I’ve included a bit more about the methodology and some early results below, but if you’d like to skip ahead and test yourself you can go ahead here.

Note that to simplify the adversarial side, I’m going to let you rely on all of Google’s spam filtering — you can trust that every URL ranks in the top 10 for its example keyword — so you’re choosing an ordering of two pages that do rank for the query rather than two pages from potentially any domain on the Internet.

I haven’t designed this to be uncheatable — you can obviously cheat by Googling the keywords — but as my old teachers used to say: “If you do, you’ll only be cheating yourself.”

Unfortunately, Google Forms seems to have removed the option to be emailed your own answers outside of an apps domain, so if you want to know how you did, note down your answers as you go along and compare them to the correct answers (which are linked from the final page of the test).

You can try your hand with just one keyword or keep going, trying anywhere up to 10 keywords (each with a pair of pages to put in order). Note that you don’t need to do all of them; you can submit after any number.

You can take the survey either for the US (google.com) or the UK (google.co.uk). All results consider only the “blue links” results — i.e. links to web pages — rather than universal search results, one-boxes, etc.

Take the test!

What do the early responses show?

Before publishing this post, we sent it out to the @distilled and @moz networks. At the time of writing, almost 300 people have taken the test, and there are already some interesting results:

It seems as though the US questions are slightly easier

The UK test appears to be a little harder (judging both by the accuracy of laypeople, and with a subjective eye). And while accuracy generally increases with experience in both the UK and the US, the vast majority of UK respondents performed worse than a coin flip:

Some easy questions might skew the data in the US

Digging into the data, there are a few of the US questions that are absolute no-brainers (e.g. there’s a question about the keyword [mortgage calculator] in the US that 84% of respondents get right regardless of their experience). In comparison, the easiest one in the UK was also a mortgage-related query ([mortgage comparisons]) but only 2/3 of people got that right (67%).

Compare the UK results by keyword…

…To the same chart for the US keywords:

So, even though the overall accuracy was a little above 50% in the US (around 56% or roughly 5/9), I’m not actually convinced that US SERPs are generally easier to understand. I think there are a lot of US SERPs where human accuracy is in the 40% range.

The Dunning-Kruger effect is on display

The Dunning-Kruger effect is a well-studied psychological phenomenon whereby people “fail to adequately assess their level of competence,” typically feeling unsure in areas where they are actually strong (impostor syndrome) and overconfident in areas where they are weak. Alongside the raw predictions, I asked respondents to give their confidence in their rankings for each URL pair on a scale from 1 (“Essentially a guess, but I’ve picked the one I think”) to 5 (“I’m sure my chosen page should rank better”).

The effect was most pronounced on the UK SERPs — where respondents answering that they were sure or fairly sure (4–5) were almost as likely to be wrong as those guessing (1) — and almost four percentage points worse than those who said they were unsure (2–3):
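
If you have the raw responses, this breakdown is simple to reproduce. A minimal sketch, assuming a hypothetical export with a 0/1 “correct” column and a 1–5 “confidence” column:

```python
# Sketch: accuracy broken down by self-reported confidence (1-5).
# File and column names are assumptions about the survey export.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # hypothetical export

bands = pd.cut(
    responses["confidence"],
    bins=[0, 1, 3, 5],
    labels=["guess (1)", "unsure (2-3)", "sure (4-5)"],
)

print(responses.groupby(bands)["correct"].mean())  # accuracy per band
```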

Is Google getting some of these wrong?

The question I asked SEOs was “which page do you think ranks better?”, not “which page is a better result?”, so in general, most of the results say very little about whether Google is picking the right result in terms of user satisfaction. I did, however, ask people to share the survey with their non-SEO friends and ask them to answer the latter question.

If I had a large enough sample size, you might expect to see some correlation here — but remember that these were a diverse array of queries and the average respondent might well not be in the target market, so it’s perfectly possible that Google knows what a good result looks like better than they do.

Having said that, in my own opinion, there are one or two of these results that are clearly wrong in UX terms, and it might be interesting to analyze why the “wrong” page is ranking better. Maybe that’ll be a topic for a follow-up post. If you want to dig into it, there’s enough data in both the post above and the answers given at the end of the survey to find the ones I mean (I don’t want to spoil it for those who haven’t tried it out yet). Let me know if you dive into the ranking factors and come up with any theories.

There is hope for our ability to fight machine learning with machine learning

One of the disappointments of putting together this test was that by the time I’d made the Google Form I knew too many of the answers to be able to test myself fairly. But I was comforted by the fact that I could do the next best thing — I could test my neural network (well, my model, refactored by our R&D team and trained on data they gathered, which we flippantly called Deeprank).

I think this is fair; the instructions did say “use whatever tools you like to assess the sites, but please don’t skew the results by performing the queries on Google yourself.” The neural network wasn’t trained on these results, so I think that’s within the rules. I ran it on the UK questions because it was trained on google.co.uk SERPs, and it did better than a coin flip:

So maybe there is hope that smarter tools could help us continue to answer questions like “why is our competitor outranking us on this search?”, even as Google’s black box gets ever more complex and impenetrable.

If you want to hear more about these results as I gather more data and get updates on Deeprank when it’s ready for prime-time, be sure to add your email address when you:

Take the test (or just drop me your email here)

