Google Keyword Planner’s Dirty Secrets

Posted by rjonesx.

Sometimes our best data sources aren’t exactly up to par. While nearly every search marketer relies on Google Keyword Planner data at one point or another, especially for keyword research, the reality is that the data is often untrustworthy and should be viewed with great skepticism. Whether you plan to use it to build a paid search campaign or to decide which content to write, there are huge caveats to the numbers presented as Average Monthly Search Volume. Today, I want to walk through a number of the “gotchas” in Google Keyword Planner data so you can do better keyword research and make smarter decisions for your own or your clients’ sites.

Dirty secret #1: Rounded averages

By far, the most-used piece of data from Google Keyword Planner is the “Average Monthly Search Volume” metric. This key data point is used in everything from basic decisions on what keywords to use in an ad campaign to complex traffic prediction curves. But can we trust it?

Suppose you run a sports website and two keywords pop up in the recommendations: baseball scores and basketball games. Google Keyword Planner lets us know that each of these keywords has an Average Monthly Search Volume of 201,000. At first glance, you should be able to choose either of these keywords and expect similar traffic results, right?

Wrong. The “Average Monthly Search Volume” is more than just an average; it’s rounded to the nearest volume bucket (which I’ll describe later). We know this is the case because Google Keyword Planner also exposes the last 12 months of traffic data. If we average that data, we see that baseball scores actually averages 217,275 searches per month, while basketball games averages only 205,750! That is a difference of over 10,000 searches per month, which is obscured by Google KWP’s rounding algorithm.
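
To see past the rounding, average the 12 months of history yourself. Here’s a minimal sketch (the monthly numbers are invented; they’re chosen only so they average out to the 217,275 figure above):

```python
# Hypothetical 12 months of Keyword Planner history for "baseball scores".
# The individual values are invented, but they average to 217,275.
monthly_searches = [198000, 225000, 240000, 204000, 216000, 219000,
                    230000, 211000, 208000, 222000, 217000, 217300]

actual_average = sum(monthly_searches) / len(monthly_searches)
print(actual_average)  # 217275.0 -- vs. the 201,000 that GKP reports
```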

When we took a sample of keywords at the 201,000 Average Monthly Search Volume level, the standard deviation of the “actual average” was 14,621. In some cases, the rounded figure was off by over 40,000 searches per month! If you don’t look at the last 12 months of data, your annual traffic estimates will likely be off by tens of thousands of visits. What causes this anomaly?

Dirty secret #2: Traffic buckets

Google Keyword Planner uses “buckets” to group keywords by traffic volume. When a keyword returns a traffic volume of 201,000, it isn’t because the keyword was actually searched that many times, or even that it was particularly close to the number 201,000, but just that it was closer to 201,000 than to the next biggest bucket of 246,000. The next lower bucket is 165,000, which leaves roughly 80,000 searches per month of wiggle room — within which a keyword might actually fall and still be reported as 201,000 by Keyword Planner.
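
Here’s a minimal sketch of that snapping behavior. Only the 165,000, 201,000, and 246,000 buckets are confirmed by the data above; the other values are placeholders, since Google doesn’t publish the full list:

```python
# Illustrative volume buckets; only 165000, 201000, and 246000 come from
# the article -- the rest are placeholders.
BUCKETS = [135000, 165000, 201000, 246000, 301000]

def reported_volume(actual_monthly_searches):
    # Keyword Planner appears to snap the true count to the nearest bucket.
    return min(BUCKETS, key=lambda b: abs(b - actual_monthly_searches))

print(reported_volume(217275))  # 201000 -- "baseball scores"
print(reported_volume(205750))  # 201000 -- "basketball games"
```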

After analyzing a massive data set, we found that Google has around 85 different “buckets” for traffic, which are logarithmically spaced. This means that long tail keywords might fall into buckets which differ by only 10–20 searches, while head keywords might see gaps of hundreds of thousands of searches per month. The bigger the search volume, the less certain you can be about the accuracy of the Average Monthly Searches, especially relative to other terms that fall in the same bucket. In fact, the largest buckets have variances of nearly a quarter million searches per month!

Google uses this rounding procedure for convenience and, likely, to take into account the real month-to-month variance which can be huge for these very popular terms.

Dirty secret #3: Hidden keywords

Rand had an excellent write-up on this issue a while back if you want the full details or a more in-depth look at the problem. However, I thought I’d just throw out some stats here to show you just how ridiculous the recommendation system can be relative to the reality of related words and phrases. Let’s start with the phrase “football.” In this example, we begin by using GrepWords data to find the most valuable words that contain “football” in them. Then, we simply ask Google what they recommend. How closely do they match? What is missed?

The top 3 most-trafficked football-based keywords weren’t recommended to us, and only 4 of Google’s recommendations made it into the top 10. In fact, when we analyzed dozens of Google keyword recommendation reports, we found that only 35% of the keywords were among the most trafficked terms.

It appears that Google Keyword Planner is simply trying to provide a diverse cross-section of terms, but for marketers it means you potentially miss out on huge opportunities unless you dig much deeper. You can battle back against this “feature” by seeding your searches with more specific terms and setting volume and CPC limits, as the recommendations get stronger and stronger the more specific you get. In the end, though, you’re going to miss out on some great terms if you restrict your research to Google Keyword Planner alone.

Dirty secret #4: Combination inconsistencies

If you’re like me and spelling isn’t your forte, you have certainly seen Google give you the “showing results for {correct spelling}” treatment. This is very useful for the searcher, but it throws a pretty big wrench into keyword volume metrics. What does Google do in these situations? Does it count all the traffic toward the correctly spelled keyword (which is what actually shows in the search results), or does it count the traffic toward the misspelling or variation? Well, it turns out it’s a mixed bag. Let’s take a look at a fairly popular term: Texas A&M Football.

Below are several variations of how one might search for the concept Texas A&M Football, whether Google corrects them in the search results, and whether each reports its own distinct volume.

Keyword                | Corrected? | Distinct volume?
Texas A&M Football     | No         | Yes
Texas A and M Football | No         | Yes
Texas AM Football      | Yes        | Yes
Texas A & M Football   | No         | Yes
Texas A& M Football    | Yes        | Yes

Notice that whether or not a variant gets corrected to the canonical spelling makes no difference here: each keyword reports its own distinct volume. Even though many of these variants will show you Texas A&M results, the volume Google reports for the correctly spelled term counts only searches for that exact spelling; the corrected variants aren’t rolled in.

Now here’s where it starts to matter. Let’s say that you run a site that sells football attire and you’re deciding which schools to include. You look up Google’s Keyword Planner data and see that “Texas A&M Football” and “FSU Football” are both searched 201,000 times a month. These keywords seem equal in terms of volume but, in reality, there are many more keywords that are mapped organically to the phrase “Texas A&M Football,” which makes its combined search volume much higher. In this particular case, there are several thousand visitors a year that you might miss out on by choosing “FSU Football” over “Texas A&M Football” simply because Google doesn’t combine the keywords in Keyword Planner despite doing so in organic search.

This might seem like a reasonable compromise. The Keyword Planner is giving you back the search counts for the keywords, regardless of whether those searches are redirected to a different phrase. This would be appropriate if it were consistent, but with certain punctuation in terms we see Google treat the case completely differently. Take the search terms facebook.com and facebook com. Google reports that both of these terms are searched 7.8 million times a month. Clearly these two variants are not searched an identical number of times; Google has simply mapped the keywords together BOTH in organic search results AND in volume. This forces keyword researchers to build huge keyword lists and go line-by-line removing the edge cases.

Here’s a quick tip for you Excel experts out there: look into using Jaro–Winkler distance to find very similar terms that have identical search volume. Often these terms are mapped together both in organic results and in volume, and you can find those exclusions easily.
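
For those who’d rather script it than wrangle Excel, here’s a minimal sketch using the jellyfish library (my choice of library, not something from the original workflow; any Jaro–Winkler implementation will do):

```python
# Flag keyword pairs that are near-identical strings AND report the exact
# same search volume -- a strong hint that Google has merged them.
import itertools

import jellyfish  # pip install jellyfish

keyword_volumes = {
    "facebook.com": 7800000,
    "facebook com": 7800000,
    "texas a&m football": 201000,
}

for (kw1, v1), (kw2, v2) in itertools.combinations(keyword_volumes.items(), 2):
    # Older jellyfish releases name this function jaro_winkler() instead.
    score = jellyfish.jaro_winkler_similarity(kw1, kw2)
    if score > 0.9 and v1 == v2:
        print(f"Probable merge: {kw1!r} / {kw2!r} (similarity {score:.2f})")
```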

Dirty secret #5: Strange recommendations

Sometimes Google Keyword Planner gets the keyword recommendations completely wrong. Here are a couple of the examples that I was able to pull in just a few minutes of brainstorming:

Starting Keyword | Recommended Keyword
baseball glove   | boxing glove
pigeon           | cabins
calamari         | pork chops
rap              | country music

Because Google Keyword Planner uses more than just phrase matching to build its recommended keywords, you will regularly find some truly strange entries in your recommended keyword list, or connections that a computer might make but a human never would. Unfortunately, this means you have to be very careful about what you get back, going keyword by keyword if you want to start a paid search campaign based on what’s returned. You simply can’t be confident in the relevancy of the results. Can you imagine how many webmasters have blindly added Google’s recommendations to their advertising campaigns?

All is not lost

Luckily, there is more than one way to get at and improve the Keyword Planner data using clickstream data sources. For example, we know of two keyword data sources — ClickStre.am and SimilarWeb — which correlate nicely with Google Keyword Planner volumes.

While this data from SimilarWeb is very useful, building a more accurate prediction of search volume for a term requires building a regression model comparing the user data to Google’s estimates. Moreover, demographic differences between the whole Google user base and the user panels of SimilarWeb and ClickStre.am mean that a single, universal regression model across all the keyword data might not be the best approach, as the users tracked by SimilarWeb and ClickStre.am might be biased towards different topics. The solution is to build models around topically related keywords.

For example, instead of modeling all the keywords against one another, if Google Keyword Planner gave you 2 keywords on the same topic in the same volume bucket (like 201,000 searches per month), you could build a regression model on the fly from a sample of topically related keywords, then use it to predict with greater granularity the performance of the two seemingly identical keywords.
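
Here’s a rough sketch of what that on-the-fly model could look like with scikit-learn (every number below is invented for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Invented sample of topically related keywords: clickstream panel counts
# paired with Google Keyword Planner's bucketed estimates.
panel_counts = np.array([[1200], [3400], [8100], [15200], [22000]])
gkp_estimates = np.array([14800, 40500, 90500, 165000, 246000])

model = LinearRegression().fit(panel_counts, gkp_estimates)

# Two keywords that GKP lumps into the same 201,000 bucket can now be
# separated by their (hypothetical) clickstream counts.
print(model.predict(np.array([[17500], [19900]])))
```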

While this user data helps you defeat issues of granularity, getting better (both more thorough and more accurate) recommendations for keywords can be a little more difficult. Your best bet here is to use keyword data aggregators like GrepWords, KeywordTool.io, or the upcoming Moz Keyword Explorer.

Keyword Planner is dead. Long live Keyword Planner

Unfortunately, despite all of the strange quirks and outright deceptions of Google Keyword Planner, it’s the best thing we really have going for us in terms of getting search volume data out of Google. We can potentially refine some of the data with clickstream sources, get estimates by running Google AdWords campaigns and watching impression counts, or even look in Google Search Console. But none of these are strong replacements for Google Keyword Planner.

Instead of letting Google Keyword Planner’s problems get in the way of your keyword research, use it to your advantage. Look for the edge cases where a keyword has a ton of misspellings mapped to the correct version, but not combined into the volume score. This could be a great win that your competitors are overlooking because the head term looks smaller than it really is. Wherever there’s bad data, there’s also money to be made in sweating the details. So, put your gloves on and get to scrubbing your Keyword Planner data. Somewhere beneath the rough is a diamond.



It’s Here! The MozCon Local 2016 Agenda

Posted by EricaMcGillivray

*drumroll* The MozCon Local 2016 agenda is here! For all your local marketing and SEO needs, we’re pleased to present a fabulous lineup of speakers and topics for your enjoyment. MozCon Local is Thursday and Friday, February 18–19, 2016 in Seattle. On Thursday, our friends at LocalU will present a half-day of intensive workshops, and on Friday we’ll have an entire day of keynote-style conference fun. (You do need to purchase the workshop ticket separately from the conference ticket.)

If you’ve just remembered that you need to purchase your ticket, do so now:

Buy your MozCon Local 2016 ticket!

Otherwise, let’s dig into that agenda!

MozCon Local 2016


Thursday workshops

12:00–12:30pm
Registration


12:30–12:35pm
Introduction and Housekeeping


12:35–12:55pm
The State of Local Search with David Mihm

Already one of the most complex areas in all of search marketing, local has never been more fragmented than it is today. Following a brief summary of the Local Search Ranking Factors, David will give you his perspective on which strategies and tactics are worth paying attention to, and which ones are simply “nice to have.”

David Mihm is one of the world’s leading practitioners of local search engine marketing. He has created and promoted search-friendly websites for clients of all sizes since the early 2000s. David co-founded GetListed.org, which he sold to Moz in November 2012.


12:55–1:35pm
Local Search Processes with Aaron Weiche, Darren Shaw, Mike Ramsey, and Paula Keller


Panel discussion and Q&A on the best processes to use in marketing local businesses online.


1:35–2:35pm
How to do Competitive Analysis for Local Search with Aaron Weiche, Darren Shaw, David Mihm, Ed Reese, Mary Bowling, and Mike Ramsey

Each panelist will demonstrate their methods and the tools they use to audit a specific area of the online presence of a single local business. The end result will be a complete picture of how a thorough competitive analysis for a local business can be done.


2:35–2:50pm
Break


During this time period, each attendee will choose any three 30-minute workshops to attend. Some workshops are offered in all time slots, while others are only offered at specific times. Present your challenges, discuss solutions, and get your burning questions answered in these small groups.

LocalU Workshops

2:50–3:20pm
  • Tracking and Conversions with Ed Reese
  • Solving Problems at Google My Business with Willys DeVoll and Mary Bowling
  • Ask Me Anything About Local Search with David Mihm
  • Local Targeting of Paid Advertising with Paula Keller
  • Using Reviews to Build Your Business with Aaron Weiche
  • Local Links with Mike Ramsey
  • Citations: Everything You Need to Know with Darren Shaw
3:20–3:50pm
  • Tracking and Conversions with Ed Reese
  • Solving Problems at Google My Business with Willys DeVoll and Mary Bowling
  • Ask Me Anything About Local Search with David Mihm
  • Local Targeting of Paid Advertising with Paula Keller
  • Using Reviews to Build Your Business with Aaron Weiche
  • Agency Issues with Mike Ramsey
  • Local Links with Darren Shaw
3:50–4:20pm
  • Tracking and Conversions with Ed Reese
  • Solving Problems at Google My Business with Willys DeVoll and Mary Bowling
  • Ask Me Anything About Local Search with David Mihm
  • Local Targeting of Paid Advertising with Paula Keller
  • Using Reviews to Build Your Business with Aaron Weiche
  • Local Links with Mike Ramsey
  • Citations: Everything You Need to Know with Darren Shaw

4:20–5:00pm
Live Site Reviews

The group will come back together for live site reviews!


5:00–6:00pm
Happy Hour!


Friday conference


8:00–9:00am

Breakfast


9:00–9:05am
Welcome to MozCon Local 2016! with David Mihm

David Mihm is one of the world’s leading practitioners of local search engine marketing. He has created and promoted search-friendly websites for clients of all sizes since the early 2000s. David co-founded GetListed.org, which he sold to Moz in November 2012.


9:05–9:35am
Feeding the Beast: Local Content for RankBrain with Mary Bowling

We now know searcher behavior and continual testing via machine learning indeed affect Google rankings and algorithm refinements. Learn how to create local content that satisfies both Google and our human visitors.

Mary Bowling‘s been in SEO since 2003 and has specialized in local SEO since 2006. When she’s not writing about, teaching, consulting, and doing internet marketing, you’ll find her rafting, biking, and skiing/snowboarding in the mountains and deserts of Colorado and Utah.


9:35–10:05am
Local Links: Tests, Tools, and Tactics with Mike Ramsey

Going beyond the map pack, links can bring you qualified traffic, organic rankings, penalties, or filters. Mike will walk through lessons, examples, and ideas for you to utilize to your heart’s content.

Mike Ramsey is the president of Nifty Marketing and a founding faculty member of Local University. He is a lover of search and social with a heavy focus in local marketing and enjoys the chess game of entrepreneurship and business management. Mike loves to travel and loves his home state of Idaho.


10:05–10:35am
Citation Investigation! with Darren Shaw

Darren investigates how citations travel across the web and shares new insights into how to better utilize the local search ecosystem for your brands.

Darren Shaw is the president and founder of Whitespark, a company that builds software and provides services to help businesses with local search. He’s widely regarded in the local SEO community as an innovator, one whose years of experience working with massive local data sets have given him uncommon insights into the inner workings of the world of citation-building and local search marketing. Darren has been working on the web for over 16 years and loves everything about local SEO.


10:35–10:55am
AM Break


10:55–11:20am
Technical Site Audits for Local SEO with Lindsay Wassell

Onsite SEO success lies in the technical details, but extensive SEO audits can be too expensive and impractical. Lindsay shows you the most important onsite elements for local search optimization and outlines an efficient path for improved performance.

Lindsay Wassell’s been herding bots and wrangling SERPs since 2001. She has a zeal for helping small businesses grow with improved digital presence. Lindsay is the CEO and founder of Keyphraseology.


11:20–11:45am
Optimizing and Hacking Email for Mobile with Justine Jordan

Email may be an old dog, but it has learned some new mobile tricks. From device-a-palooza and preview text to tables and triggers, Justine will break down the subscriber experience so you (and your audience) get the most from your next campaign.

In addition to being an email critic, cat lover, and explain-a-holic, Justine Jordan also heads up marketing for Litmus, an email testing and analytics platform. She’s strangely passionate about email, hates being called a spammer, and still codes like it’s 1999.


11:45am–12:10pm
Understanding App-Web Convergence and the Impending App Tsunami with Emily Grossman

People no longer distinguish between app and web content; both compete for the same space in local search results. Learn how to keep your local brand presence afloat as apps and deep links flood into the top of search results.

Emily Grossman is a Mobile Marketing Specialist at MobileMoxie, and she has been working with mobile apps since the early days of the app stores in 2010. She specializes in app search marketing, with a focus on strategic deep linking, app indexing, app launch strategy, and app store optimization (ASO).


12:10–12:35pm
Building Customer Love and Loyalty in a Mobile World with Robi Ganguly

How the best companies in the world relate to customers, create a personal touch, and foster customer loyalty at scale.

Robi Ganguly is the co-founder and CEO of Apptentive, the easiest way for every company to communicate with their mobile app customers. A native Seattleite, Robi enjoys building relationships, running, reading, and cooking.


12:35–1:35pm
Lunch



1:35–2:05pm
The Past, Present, and Future of Local Listings with Luther Lowe and Willys DeVoll

Two of the biggest kids on the local search block, Google and Yelp, share their views on the changing world of local listings, their place in the broader world of local search, and what you can do to keep up, in this Q&A moderated by David Mihm.

Luther Lowe is VP of Public Policy at Yelp.

Willys DeVoll is the content strategist for Google My Business, and he spends his time designing and writing online content to help business owners enhance their presence online. He’s also a major proponent of broccoli and gorillas.


2:05–2:35pm
Fake It Til You Make It: Brand Building for Local Businesses with Paula Keller

Explore real-world examples of how your local business can establish a brand that both customers and Google will recognize and reward.

As Director of Account Management at Search Influence, Paula Keller strategizes with businesses on improving their search, social, and online ads results, and she works to scale those tactics for her team’s 800+ local business clients. Paula views online marketing the same way she views cooking (her favorite way to spend her free time): trends come and go, but classic tactics are always the foundation of success!


2:35–3:05pm
Your Marketing Team is Larger Than You Think with Dana DiTomaso

Imagine doing such a great job with your branding that you become a part of your customer’s life. They trust your brand as part of their community. This magic doesn’t happen by dictating the corporate voice from a head office, but from empowering your locations to build customer community.

Whether at a conference, on the radio, or in a meeting, Dana DiTomaso likes to impart wisdom to help you turn a lot of marketing bullshit into real strategies to grow your business. After 10+ years, she’s (almost) seen it all. It’s true, Dana will meet with you and teach you the ways of the digital world, but she is also a fan of the random fact. Kick Point often celebrates “Watershed Wednesday” because of Dana’s diverse work and education background. In her spare time, Dana drinks tea and yells at the Hamilton Tiger-Cats.


3:05–3:25pm
PM Break


3:25–3:55pm
Mo’ Listings, Mo’ Problems: Managing Enterprise-Level Local Search with Cori Shirk

Listings are everyone’s favorite local search task…not. Cori takes you through how to tackle them at large scale, keep up, and not burn out.

Cori Shirk is a member of the SEO team at Seer Interactive, where she specializes in managing enterprise local search accounts and guiding strategy across all of Seer’s local search clients. When she’s not sitting in front of a computer, you can usually find her out at a concert enjoying a local craft beer.


3:55–4:10pm
The Enterprise Perspective on Local Search with Matthew Moore

Learn how the person responsible for local visibility across a portfolio of nearly 1,000 locations tackles this space on a daily basis. Matthew from Sears Home Services shares his experiences and advice in this Q&A moderated by David Mihm.

Matthew Moore is Senior Director, Marketing Analytics at Sears Holdings Corporation.


4:10–4:40pm
How to Approach Social Media Like Big Brands with Adria Saracino

Facebook, Twitter, LinkedIn, Instagram, Pinterest, YouTube, Snapchat, Periscope…the seemingly never-ending world of social media can leave even the most seasoned marketer flailing among too many tasks and not enough results. Adria will help you cut through the noise and share actionable secrets that big brands use to succeed with social media.

Adria Saracino is a digital strategist whose marketing experience spans mid-stage startups, agency life, and speaking engagements at conferences like SearchLove and Lavacon. When not marketing things, you can see her cooking elaborate meals and posting them on her Instagram, @emeraldpalate.


4:40–5:10pm
Analytics for Local Marketers: The Big Picture and the Right Details with Rand Fishkin

Are your marketing efforts taking your organization where it needs to go, or are they just boosting your vanity metrics? Rand explains how to avoid being misled by the wrong metrics and how to focus on the ones that will keep you moving forward. Learn how to determine what to measure, as well as how to tie it to objectives with clear, concise, and useful data points.

Rand Fishkin uses the ludicrous title “Wizard of Moz.” He’s the founder and former CEO of Moz, co-author of a pair of books on SEO, and co-founder of Inbound.org.


6:00–10:00pm
MozCon Local Networking Afterparty, location TBA

Join your fellow attendees and Moz and LocalU staff for a networking party after the conference. Light appetizers and drinks included. See you there!

Buy your MozCon Local 2016 ticket!



Persona Research in Under 5 Minutes

Posted by CraigBradford

Well-researched personas can be a useful tool for marketers, but doing the research correctly takes time. What if you don’t have extra time? Using a mix of Followerwonk, Twitter, and the Alchemy language API, it’s possible to do top-level persona research very quickly. I’ve built a Python script that can help you answer two important questions about your target audience:

  1. What are the most common domains that my audience visits and spends time on? (Where should I be trying to get mentions, links, and PR?)
  2. What topics are they interested in or reading on those sites? (What content should I potentially create for these people?)

You can get the script on Github: Twitter persona research

Once the script runs, the output is two CSV files. One is a list of the domains most commonly shared by the group; the other is a list of the topics that the audience is interested in.

A quick introduction to Watson and the Alchemy API

The Alchemy API has been around a while, and they were recently acquired by the IBM Watson group. The language tool has 15 functions. I’ve used it in the past for language detection, sentiment analysis, and topic analysis. For this personas tool, I’ve used the Concepts feature. You can upload a block of text or ask it to fetch a URL for analysis. The output is then a list of concepts that are relevant to the page. For example, when I put the Distilled homepage into the tool, the top concepts include search engine optimization, Internet marketing, and, strangely, Arianna Huffington.

Odd entries like Arianna Huffington will crop up, but running this tool over thousands of URLs and counting the occurrences takes care of any strange results. It also highlights one of the interesting features of the tool: Alchemy isn’t just doing a keyword extraction task. Arianna Huffington isn’t mentioned anywhere on the Distilled homepage.

Alchemy has found the mention of Huffington Post and expanded on that concept. Likewise, neither search engine optimization nor Internet marketing is mentioned on the homepage, but they’ve been listed as the two most relevant concepts. Pretty clever. The Alchemy site sums it up nicely:

“AlchemyAPI employs sophisticated text analysis techniques to concept tag documents in a manner similar to how humans would identify concepts. The concept tagging API is capable of making high-level abstractions by understanding how concepts relate, and can identify concepts that aren’t necessarily directly referenced in the text.”

My thinking for this script is simple: If I get a list of all the links that certain people share and pass the URLs through the Alchemy tool, I should be able to extract the main concepts that the audience is interested in.

To use an example, let’s assume I want to know what topics the SEO community is interested in and what sites are most important in that community. My process is this:

  1. Find people that mention “SEO” in their Twitter bio using Followerwonk
  2. Get a sample of their most recent tweets using the Twitter API
  3. Pull out the most common domains that those people share
  4. Use the Alchemy Concepts API to summarize what the pages they share are about
  5. Output all of the above to a spreadsheet
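
To give you a sense of what steps 2–4 look like in code, here’s a minimal sketch. It is not the actual Github script, and the tweepy calls and the Alchemy endpoint are written from memory, so treat them as assumptions to verify against the current docs:

```python
from collections import Counter
from urllib.parse import urlparse

import requests
import tweepy  # pip install tweepy requests

from api_keys import (watson_api_key, twitter_ckey, twitter_csecret,
                      twitter_atoken, twitter_asecret)

auth = tweepy.OAuthHandler(twitter_ckey, twitter_csecret)
auth.set_access_token(twitter_atoken, twitter_asecret)
api = tweepy.API(auth)

domains = Counter()
concepts = Counter()

with open("usernames.txt") as f:
    handles = [line.strip() for line in f if line.strip()]

for handle in handles:
    # Step 2: a sample of each user's recent tweets.
    for tweet in api.user_timeline(screen_name=handle, count=100):
        for url_entity in tweet.entities.get("urls", []):
            link = url_entity["expanded_url"]
            # Step 3: count the shared domains.
            domains[urlparse(link).netloc] += 1
            # Step 4: ask Alchemy for the page's ranked concepts.
            data = requests.get(
                "http://access.alchemyapi.com/calls/url/URLGetRankedConcepts",
                params={"apikey": watson_api_key, "url": link,
                        "outputMode": "json"},
            ).json()
            for concept in data.get("concepts", []):
                concepts[concept["text"]] += 1

# Step 5: the real script writes CSVs; printing keeps this sketch short.
print(domains.most_common(30))
print(concepts.most_common(40))
```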

Follow the steps below. Sorry, but the instructions are for Mac only; the script will work on PCs, but I’m not sure of the terminal setup.

How to use the script

Step 1 – Finding people interested in SEO

Searching Followerwonk is the only manual part of the process. I might build it into the script in the future but, honestly, it’s too easy to just download the usernames from the interface.

Go into the “Search Bios” tab and enter the job title in quotes. In this case, that’s “SEO.” More common jobs will return a lot of results; I recommend setting some filters to avoid bots. For example, you might want to only include accounts with a certain number of followers, or exclude accounts with an unreasonably high number of tweets. You can download these users as a CSV from the bottom-right of the Followerwonk interface.

Everything else can be done automatically using the script.

Step 2 – Downloading the script from GitHub

Download the script from Github here: Twitter API using Python. Use the Download Zip link on the right-hand side.

Step 3 – Sign up for Twitter and Alchemy API keys

It’s easy to sign up for both: you’ll need API credentials from Twitter’s developer site and an API key from AlchemyAPI.

Once you have the API keys, you need to install a couple of extra requirements for the script to work.

The easiest way to do that is to download Pip here: https://bootstrap.pypa.io/get-pip.py — save the page as “get-pip.py”. Create a folder on your desktop and save the Git download and the “get-pip.py” file in it. You then need to open your terminal and navigate into that folder. You can read my previous post on how to use the command line here: The Beginner’s Guide to the Command Line.

The steps below should get you there:

Open up the terminal and type:

cd Desktop/

cd [foldername]

You should now be in the folder with the get-pip.py file and the folder you downloaded from Github. Go back to the terminal and type:

sudo python get-pip.py

sudo pip install -r requirements.txt

Create two more files:

  1. usernames.txt – This is where you will add all of the Twitter handles you want to research
  2. api_keys.py – The file with your API keys for Alchemy and Twitter

In the api_keys file, paste the following and add the respective details:

watson_api_key = "[INSERT ALCHEMY KEY]"

twitter_ckey = "[INSERT TWITTER CKEY]"

twitter_csecret = "[INSERT CSECRET]"

twitter_atoken = "[INSERT TOKEN]"

twitter_asecret = "[INSERT ASECRET]"

Save and close the file.

Step 4 – Run the script

At this stage you should:

  1. Have a usernames.txt file with the Twitter handles you want to research
  2. Have downloaded the script from Github
  3. Have a file named api_keys.py with your details for Alchemy and Twitter
  4. Have installed Pip and the requirements file

The main code of the script can be found in the “get_tweets.py” file.

To run the script, go into your terminal and navigate to the folder that you saved the script to (you should still be in the correct directory if you followed the steps above; use pwd to print the directory you’re in). Once you are in the folder, run the script by typing python get_tweets.py in the terminal. Depending on the number of usernames you entered, it should take a couple of minutes to run. I recommend starting with one or two to check that everything is working.

Once the script finishes running, it will have created two CSV files in the folder you created:

  1. “domain + timestamp” – This includes all the domains that people tweeted and the count of each
  2. “concepts + timestamp” – This includes all the concepts that were extracted from the links that were shared

I did this process using “SEO” as the search term in Followerwonk. I used 50 or so profiles, which created the following results:

Top 30 domains shared:

Top 40 concepts:

For the most part, I think the domains and topics are representative of the SEO community. The output above seems obvious to us, but try it for a topic that you’re not familiar with and it’s really helpful. The bigger the sample size, the better the results should be, but this is restricted by the API limitations.

Although it looks like a lot of steps, once you have this set up, it’s very easy to repeat — all you need to change is the usernames file. Using this tool can get you some top-level persona information in a very short amount of time.

Give it a try and let me know what you think.



How to Use Hosted Blog Platforms for SEO & Content Distribution

Posted by randfish

Where do you host your content? Is it on your own site, or on third-party platforms like Medium and LinkedIn? If you’re not yet thinking about the ramifications of using hosted blog platforms for your content versus your own site, now’s your chance to start. In this week’s Whiteboard Friday, Rand explores the boons and pitfalls of using outside websites to distribute and share your content.


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat a little bit about blog platforms, places like Medium, Svbtle — that’s Svbtle with a V instead of a U — Tumblr, LinkedIn, places where essentially you’ve got a hosted blog platform, a hosted content platform. It’s someone else’s network. You don’t have to set up your own website, but at the same time you are contributing content to their site.

This has become really popular, I think. Look, Medium and LinkedIn are really the two big ones where a lot of folks are contributing these days: LinkedIn is very B2B-focused, while Medium skews toward startups, new media, and creative work.

So I think, because of the rise of these things, we’re seeing a lot of people ask themselves, “Should I create my own content platform? Do I need to build a WordPress hosted subfolder on my website? Or can I just use Medium because it has all these advantages, right?” Well, let me try and answer those questions for you today.

So, what do hosted platforms enable?

Well, it’s really simple to sign up and start creating on them. You plug in your name, email, a password. You don’t have to set up DNS. You don’t have to set up hosting. You can start publishing right away. That’s really easy and convenient.

It also means that, for a lot of marketers, they don’t have to involve their engineering or their web development teams. That’s pretty awesome, too.

There are also built in networks on a lot of these places, Medium in particular, but Svbtle as well. Tumblr quite obviously has a very, very big network. So as a result, you’ve got this ability to gain followers or subscribers to your content, someone that can say like, “Oh, I want to follow @randfish on Medium.” I haven’t published on Medium, but for some reason I seem to have thousands of followers there.

So I think this creates this idea like, “Hey, I could reach a lot more people that I wouldn’t necessarily be able to reach on my own platform, because it’s not like these people are all subscribed to my blog already, but they are signed up for Medium or LinkedIn, which has hundreds of millions of worldwide users.”

There’s also an SEO benefit here. You inherit domain authority. On Medium and on LinkedIn in particular, these can be really powerful. Medium has a domain authority of 80. LinkedIn has a domain authority of 99, which is no surprise. Pretty much every website on the planet links to their LinkedIn page. So you can imagine that these pages have the potential to do really well in Google’s rankings, and you don’t necessarily have to point a lot of links at them in order for them to rank very well. We’ve seen this. Medium has been doing quite well in the rankings. LinkedIn articles are doing quite well in their niches.

This is a little different, a subtle but important difference for Svbtle itself, for Tumblr, and for WordPress. These are on subdomains. Yes, there are lots of people who are using WordPress, although that’s very customizable. But you could imagine that if I got randstshirts.wordpress.com or randstshirts.tumblr.com or randstshirts.svbtle.com, that doesn’t have the same ranking ability. That subdomain means that Google considers it separately from the main domain. So you’re not going to inherit the ranking benefit on those. It’s really Medium and LinkedIn where that happens. To be honest, Google+ as well, we’ve seen them ranking like a Medium or a LinkedIn too.

You also have this benefit of email digests and subscriptions, which can help grow your content’s reach. For those of you who aren’t subscribed to Medium, they send out a daily digest to all of the folks who are signed up. So if you are someone who is contributing Medium content, you can often expect that your subscribers through Medium may be getting your stuff through an email digest. It may even get broadcast to a much broader group, to people who aren’t following you but are following them. If they’ve “hearted” your content on Medium, they’ll see it. So you get all these network effects through email digests and email subscriptions too.

So what’s the downside?

This is pretty awesome. To me, these are compelling reasons to potentially consider using these platforms. But before we get too far ahead of ourselves, let’s talk about the downside as well. To my mind, these downsides prevent me from wanting to encourage certain types of use. I’ll talk about my best advice and my tactical advice for using these in a sec.

First, there’s the question of where links, authority, and ranking signals accrue. We recognize that when you put a post on Medium, a lot of times posts there do very well. They get a lot of traction, a lot of attention. They make it into news feeds. Other sites link to them. Other pages around the web link to them. It’s great. Lots of social shares, lots of engagement. That is terrific.

Guess what? Those benefits accrue only to Medium.com. So every time you publish something there and it gets lots of links and ranking signals and engagement and social and all these wonderful things, that helps Medium.com rank better in the future. It doesn’t help yoursite.com rank better in the future.

You might say, “But Rand, I’ve got a link here, and that link points right back to my site.” Yes, wonderful. You now have the equivalent of one link from Medium. Good for you. It’s not a bad thing. But this is nowhere near the kind of help that you would get if this piece of content had been hosted on your site to begin with. If this is hosted over here, all these links point in there, and all those ranking benefits accrue to your site and page.

In some ways, from an SEO perspective, especially if you’re trying to build up that SEO flywheel of growing domain authority and growing links and being able to rank for more competitive stuff, if you’re trying to build that flywheel, you’d almost say, “Hey, you know what, I’d take half the links and ranking signals if it were on my own site. That would still be worth more to me than more on Medium.”

Okay. But that being said, there are all the distribution advantages, so maybe we’re still at a wash here.

Also on these blogging platforms, these hosted platforms, there’s no ownership of or ability to influence the UI and UX. That is a tough one too. So one of the wonderful things about blogging is — and we’ve seen this over the years many times at Moz. People come to Moz to read the content, they remember Moz, and they have a positive association and they say, “Yeah, you know, Moz made me feel like they were authorities, like they knew what they were talking about. So now I want to go check out Moz Local, their product, or Moz Analytics, or Open Site Explorer, or whatever it is.”

That’s great. But if you are on Medium or if you are on Svbtle or if you are on WordPress — well, WordPress is more customizable — but if you’re on Google+, the experience is, “Oh, I had a really good experience with Medium.” That’s very, very different. They will not remember who you are and how you made them feel, at least certainly not to the extent that they would if you owned and controlled that UI and UX.

So you’re really reducing brandability and any messaging opportunities that you might have had there. That’s dramatically, dramatically reduced. I think that’s very, very tough for a lot of folks.

Next up — and this speaks to the UI and UX elements — but it’s impossible to add or to customize calls to action, which really inhibits using your blog as part of your funnel. Essentially, I can’t say, “Hey, you know what I’d like to do? I’d like to add a button right below here, below all my blog posts that says, ‘Hey, sign up to try our product for free,’ or, ‘Get on our new mailing list,’ or, ‘Subscribe to this particular piece of content.’ Or I want to put something in the sidebar, or I’d like to have it in the header. Or I want to have it as a drop over when someone scrolls halfway down the page.” You can’t do any of those things. That sort of messaging is controlled by the platform. You’re not allowed to add custom code here, and thus your ability to impact your funnel with your blog or with your content platform on these sites is severely limited. You can add a link, and yes, people can still follow you on these networks, but that is definitely not the same.

There’s also, frustratingly, for a lot of paid marketers and a lot of marketers who know that they can do this, you can’t put a retargeting pixel on Svbtle or on Medium. Actually, you may be able to on Svbtle now. I’m not sure if you can. But Medium for sure, LinkedIn for sure, Google+, you can’t say, “Hey, all the people who come to my posts on Medium, I’d like to retarget them and remarket to them as they go around the web later, and I’ll follow them around the Internet like a lost puppy dog.” Well, too bad, not possible. You can’t place that pixel. No custom code, that’s out.

The last thing, and I think one of the most salient points, is there have been many, many platforms like this over the years. Many people use the example of GeoCities where a lot of people hosted their content and then it went away. In the early days of the web, it was very big, and a few years ago it fell apart.

It’s not just that, though. The uncertain future could mean that in some time frame, in the months or years to come, Medium, or Svbtle, or LinkedIn, or Google+ could become more like Facebook, where instead of 100% of the people seeing the content that they subscribe to, maybe they only see 10% or the Facebook averages today, which are under 1%. So this means that you don’t really know what might happen to your content in the future in terms of its potential visibility to the audience there. If that’s the sole place you’re building up your audience, that is a high amount of risk depending on what happens as the platform evolves.

This is true for all social platforms. It’s not just true for these hosted blog content platforms. Many folks have talked about how Twitter in the future may not show 100% of the content there. I don’t know how real that is or whether it’s just a rumor, but it’s one of those things to consider and keep in mind.

My best advice:

So my best advice here is, use platforms like these for reaching their audiences. I think it can be great to say, “Hey, 1 out of every 10 or 20 posts I want to put something up on Medium, or I want to test it on Google+, or I want to test it on LinkedIn because I think that those audiences have a lot of affinity with what I’m doing. I want to be able to reach out to them. I want to see how those perform. Maybe I want to contribute there once a month or once a quarter.” Great. Wonderful. That can be a fine way to draw distribution there.

I think it’s great for building connections. If you know that there are people on those networks who have big, powerful followings and they’re very engaged there, I think using those networks like you would use a Twitter or a Facebook or like you already use LinkedIn to try and build up those connections makes total sense.

Amplifying the reach of existing content or messages. If you have a great piece of content or a really exciting message, something exciting you want to share and you’ve already put some content around that on your own site and now you’re trying to find other channels to amplify, well, you might want to think about treating Medium just like you would treat a post on Twitter or a post on Facebook or a post on LinkedIn. You could instead create a whole piece of content around that, sort of like you would with a guest post, and use it to amplify that reach.

I think guest post-style contribution, in general, is a great way to think about these networks. So you might imagine saying, “Hey, I’d love to contribute to YouMoz,” which is Moz’s own guest blogging platform. That could be wonderful, but you would never make that your home. You wouldn’t host all your content there. Likewise you might contribute to Forbes or Business Insider or to The Next Web or any of these sites. But you wouldn’t say that’s where all my content is going to be placed. It’s one chance to get in front of that audience.

Last one, I think it’s great to try and use these for SERP domination. So if you say, “Hey, I own one or two of the top listings of the first page of results in Google for this particular keyword, term, or phrase. I want to use Medium and LinkedIn, and I’m going to write two separate pieces targeting similar keywords or those same keywords and see if I can’t own 4 slots or 5 slots out of the top 10.” That’s a great use of these types of platforms, just like it is with guest posting.

Don’t try to use these for…

Don’t try to use these as your content’s primary or, God forbid, only home on the web. Like I said, uncertain future, inability to target, inability of using the funnel, just too many limitations for what I think modern marketers need to do.

I don’t think it is wise, either, to put content on there that’s what I’d call your money keywords, essentially stuff that is very close to the conversion funnel, where you know people are going to search for these things, and then when they find this content, they’re very likely to make their next step a sign-up, a conversion. I would urge you to keep that on your site, because you can’t own the experience. I think it’s much wiser if you say, “Hey, let’s look way up in the funnel when people are just getting associated with us, or when we’re trying to bring in press and PR, or we’re trying to bring in broad awareness.” I think those are better uses.

I think it’s also very unwise to make these types of platforms the home of your big content pieces, big content pieces meaning like unique research or giant visuals or interactive content. You probably won’t even be able to host interactive content at most of these.

If you have content that you know is very likely to drive known, high-quality links, you’ve already got your outreach list, you’re pretty sure that those people are going to link to you, please put that content on your own site because you’ll get the maximum ranking benefits in that fashion. Then you could potentially put another piece of content, repurpose a little bit of the information or whatever it is that you’ve put together that’s wonderful in terms of big content as another piece that you separately broadcast and amplify to these audiences.

What I’m really saying is treat these guys — Medium, Svbtle, LinkedIn, Tumblr, and Google+ — treat them like these guys, like you use Facebook, Twitter, Instagram, YouTube, and guest hosts in general. It’s a place to put a little bit of content to reach a new audience. It’s a way to amplify a message you already have. It’s not the home of content. I think that’s really what I urge for modern marketers today.

All right, everyone. Look forward to the comments, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



30+ Important Takeaways from Google’s Search Quality Rater’s Guidelines

Posted by jenstar

For many SEOs, a glimpse at Google’s Search Quality Rater’s Guidelines is akin to looking into Google’s ranking algorithm. While they don’t give the secret sauce to rank number one on Google, they do offer some incredible insight into what Google views as quality – and not-so-quality – and the types of pages they want to serve at the top of their search results.

Last week, Google made the unprecedented move of releasing the entire Search Quality Rater’s Guidelines, following an analysis of a leaked copy obtained by The SEM Post. While Google released a condensed version of the guidelines in 2013, until last week, Google had never released the full guidelines that the search quality raters receive in their entirety.

First, it’s worth noting that quality raters themselves have no bearing on the rankings of the sites they rate. So quality raters could assign a low score to a website, but that low rating would not be reflected at all in the actual live Google search results.

Instead, Google uses the quality raters for experiments, assessing the quality of the search results when they run these experiments. The guidelines themselves describe what Google feels searchers are looking for and want to find when they do a Google search. The types of sites that rate highest are the sites and pages Google wants to rank well. So while the guidelines aren’t directly search algorithm-related, they show what Google wants its algorithms to rank best.

The document itself weighs in at 160 pages, with hundreds of examples of search results and pages with detailed explanations of why each specific example is either good, bad, or somewhere in between. Here’s what’s most important for SEOs and webmasters to know in these newly-released guidelines.

Your Money or Your Life Pages (aka YMYL)

SEOs were first introduced to the concept of Your Money or Your Life pages last year in a leaked copy of the guidelines. These are the types of pages that Google holds to the highest standards because they’re the types of pages that can greatly impact a person’s life.

While anyone can make a webpage about a medical condition or offer advice about things such as retirement planning or child support, Google wants to ensure that these types of pages that impact a searcher’s money or life are as high-quality as possible.

In other words, if low-quality pages in these areas could “potentially negatively impact users’ happiness, health, or wealth,” Google does not want those pages to rank well.

If you have any web pages or websites that deal in these market areas, Google will hold your site to a higher standard than it would a hockey team fan page or a page of rice cooker recipes.

It is also worth noting that Google does consider any website that has a shopping component, such as an online store, as a type of site that also falls under YMYL for ratings. Therefore, ensuring the sales process is secure would be another thing raters would consider.

If a rater wouldn’t feel comfortable ordering from the site or submitting personal information to it, then it wouldn’t rate well. And if a rater feels this way, it’s very likely visitors would feel the same too — meaning you should take steps to fix it.

Market areas for YMYL

Google details five areas that fall into this YMYL category. If your website falls within one of these areas, or you have web pages within a site that do, you’ll want to take extra care that you’re supporting this content with things like references, expert opinions, and helpful supplementary or additional content.

  • Shopping or financial transaction pages
    This doesn’t apply merely to sites where you might pay bills online, do online banking, or transfer money. Any online store that accepts orders and payment information will fall under this as well.
  • Financial information pages
    There are a ton of low-quality websites that fall under this umbrella of financial information pages. Google considers these types of pages to be in the areas of “investments, taxes, retirement planning, home purchase, paying for college, buying insurance, etc.”
  • Medical information pages
    Google considers these types of pages to go well beyond standard medical conditions and pharmaceuticals; the category also covers things such as nutrition and very niche health sites for sufferers of specific diseases or conditions — the types of sites that are often set up by those suffering from the medical condition themselves.
  • Legal pages
    We’ve seen a ton of legal-related sites pop up by webmasters who are looking to cash in on AdSense or affiliate revenue. But Google considers all types of legal information pages as falling under YMYL, including things such as immigration, child custody, divorce, and even creating a will.
  • All-encompassing “Other”
    Then, of course, there are a ton of other types of pages and sites that can fall under YMYL that aren’t necessarily in any of the above categories. These are still things where having the wrong information can negatively impact the searcher’s happiness, health, or wealth. For example, Google considers topics such as child adoption and car safety information as falling under this as well.

Google makes frequent reference to YMYL pages within the quality guidelines and repeatedly stresses the importance of holding these types of sites to a higher bar than others.

Expertise / Authoritativeness / Trustworthiness, aka E-A-T

Expertise / Authoritativeness / Trustworthiness — shortened to E-A-T — refers to what many think of as a website’s overall value. Is the site lacking in expertise? Does it lack authoritativeness? Does it lack trustworthiness? These are all things that readers are asked to consider when it comes to the overall quality of the website or web page, particularly for ones that fall into the YMYL category.

This is also a good rule of thumb for SEO in general. You want to make sure that your website has a great amount of expertise, whether it’s coming from you or contributors. You also want to show people why you have that expertise. Is it the experience, relevant education, or other qualifications that give the writer of each page that stamp of expertise? Be sure to show and include it.

Authoritativeness is similar, but from the website perspective. Google wants websites that have high authority on the topic. This can come from the expertise of the writers, or even from the quality of the community if it’s something like a forum.

When it comes to trustworthiness, again Google wants raters to decide: Is this a site you feel you can trust? Or is it somewhat sketchy, leaving you with trouble believing what the website is trying to tell you?

Why you need E-A-T

This also comes down to something that goes well beyond just the quality raters and how they view E-A-T. It’s something that you should consider for your site even if these quality raters didn’t exist.

Every website should make a point of either showing how their site has a high E-A-T value or figure out what it is they can do to increase it. Does it mean bringing contributors on board? Or do you merely need to update things like author bios and “About Me” pages? What can you do to show that you have the E-A-T that not only quality raters are looking for, but also just the general visitors to your site?

If you run a forum, can your posters show their credentials on publicly visible profile pages, with additional profile fields for anything specific to the market area? This can really help to show expertise, and your forum contributors will appreciate being showcased as experts, too.

This comes back to the whole concept of quality content. When a searcher lands on your page and they can easily tell that it’s created by someone (or a company) with high E-A-T, this not only tells that searcher that this is great authoritative content, but they’re also that much more likely to recommend or share it with others. It gives them the confidence that they’re sharing trustworthy and accurate information in their social circles.

Fortunately for webmasters, Google does discuss how someone can be an authority with less formal expertise; they’re not looking for degrees or other formal education for someone to be considered an expert. Things like great, detailed reviews, experiences shared on forums or blogs, and even life experience are all things that Google takes into account when considering whether someone’s an authority.

Supplementary content

Supplementary content is where many webmasters are still struggling. For those who are not tech-savvy, it isn’t always easy to add supplementary content, like sidebar tips, to something like a standard WordPress blog.

However, supplementary content doesn’t have to require technical know-how. It can comprise things such as similar articles. There are plenty of plug-ins that allow users to add suggested content and can be used to provide helpful supplementary content. Just remember: the key word here is helpful. Things like those suggested-article ad networks, particularly when they lead to Zergnet-style landing pages, are not usually considered helpful.
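If you’d rather not lean on a plugin, the underlying logic is simple. Here’s a minimal sketch, in Python with invented post data, of the shared-tag scoring that most related-article plugins boil down to:

```python
def related_posts(current, all_posts, limit=3):
    """Rank other posts by how many tags they share with the
    current one -- the basic idea behind most 'related articles'
    plugins."""
    current_tags = set(current["tags"])
    scored = [
        (len(current_tags & set(post["tags"])), post)
        for post in all_posts
        if post["slug"] != current["slug"]
    ]
    # Strongest overlap first; drop posts with nothing in common.
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [post for score, post in scored if score > 0][:limit]

posts = [
    {"slug": "keyword-research", "tags": ["seo", "keywords"]},
    {"slug": "on-page-basics", "tags": ["seo", "content"]},
    {"slug": "email-subject-lines", "tags": ["email"]},
]
print(related_posts(posts[0], posts))  # only "on-page-basics" is related
```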

Think about the additional supporting content that can be added to each page. Images, related articles, sidebar content, or anything else that could be seen as helpful to the visitor of the page is all considered supplementary content.

If you are questioning whether something on the page counts as supplementary content or not, look at the page: anything that isn’t either the main article or advertising can be considered supplementary content. Yes, this includes a strong navigation, too.

Page design

By now you’d think this is a no-brainer, but there are still some atrocious page designs out there with horrible user experiences. But this goes much further than how easy the website is to use.

Google wants raters to consider the focus of the pages. Ideally, the main content of the page, such as the main article, should be “front and center” and the highlight of the page. Don’t make your user scroll down to see the article. Don’t have a ton of ads above the fold that push the content lower. And don’t try to disguise your ad content. These are all things that will affect the rating.

They do include a caveat: Ugly does not equal bad. There are some ugly websites out there that are still user-friendly and meet visitors’ needs; Google even includes some of them as examples of pages with positive ratings.

More on advertising & E-A-T

Google isn’t just looking for ads that are placed above the fold and in a position where one would expect the article to begin. They examine some other aspects as well that can impact the user experience.

Are you somehow trying to blend your advertising too much with the content of the page? This can be an issue. In Google’s words, ads can be present for visitors who may want to interact with them, but they should also be easy to ignore for those who aren’t interested.

They also want a clear separation between advertising and content. This doesn’t mean you must slap a big “ads” label on them, or anything along those lines. But there should be enough distinction to differentiate the ads from the main content. Most websites do this, but many try to blur the line between ads and content to draw accidental clicks from visitors who don’t realize they’re clicking an ad.

All about the website

There are still a ton of websites out there that lack basic information about the site itself. Do you have an “About” page? Do you have a “Contact Us” page so that visitors can contact you? If you are selling a service or a product, do you have a customer service page?

If your site falls into the YMYL category, Google considers this information imperative. But if your site isn’t YMYL, Google suggests that just a simple email address, or something like a contact form, is fine.

Always make sure there’s a way for a visitor to find a little bit more about you or your site, if they’re so inclined. But be sure to go above and beyond this if it’s a YMYL site.
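If you manage more than a handful of sites, checking that these pages exist is easy to do programmatically. A minimal sketch in Python; the paths checked are common conventions I’ve assumed here, not anything Google specifies:

```python
import urllib.request
import urllib.error

# Hypothetical paths -- adjust for your own site's URL structure.
PATHS = ["/about/", "/contact/", "/customer-service/"]

def audit_site_info(base_url):
    """Quick check that basic 'about the site' pages exist and
    return a successful status code."""
    for path in PATHS:
        url = base_url.rstrip("/") + path
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                status = resp.status
        except urllib.error.HTTPError as err:
            status = err.code
        except urllib.error.URLError:
            status = None  # network failure or bad hostname
        print(f"{url}: {'OK' if status == 200 else status}")

audit_site_info("https://example.com")
```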

Reputation

For websites to get the highest possible rating, Google is looking at reputation as well. They ask the raters to consider the reputation of the site or author, and also ask them to do reputation research.

They direct the raters to look at Wikipedia and “other informational sources” as places to start doing reputation research when it comes to more formal topics. So if you’re giving medical advice or financial advice, for example, make sure that you have your online reputation listed in places that would be easy to find. If you don’t have a Wikipedia page, consider professional membership sites or similar sites to showcase your background and professional reputation.

Google also acknowledges that there are some topics where this kind of professional reputation isn’t available. In these cases, they say raters can look at things such as “popularity, user engagement, and user reviews” to gauge reputation within the community or market area. This can often be represented simply by a site that is highly popular, with plenty of comments or online references.

What makes a page low-quality?

On the other end of the spectrum, we have pages that Google considers low-quality. And as you can imagine, a lot of what makes a page low-quality should be obvious to many in the SEO industry. But as we know, webmasters aren’t necessarily thinking from the perspective of a user when gauging the quality of their sites, or they’re looking to take advantage of shortcuts.

5 clues

Google does give us insight into exactly what they consider low-quality, in the form of five things raters should look for. Any one of these will usually result in the lowest ratings.

  1. The quality of the main content is low.
    This shouldn’t be too surprising. Whether it’s spun content or just poorly-written content, low-quality content means a low rating. Useless content is useless.
  2. There is an unsatisfying amount of main content for the purpose of the page.
    This doesn’t mean that short content cannot be considered great-quality content. But if your three-sentence article needs a few more paragraphs to fully explain what the title of that article implies or promises, then you need to rethink that content and perhaps expand it. Thin content is not your SEO friend.
  3. The author of the page or website doesn’t have enough expertise for the topic of the page, and/or the website is not trustworthy or authoritative enough for the topic. In other words, the page/website is lacking E-A-T.
    Again, Google wants to know that the person has authority on the subject. If the site isn’t displaying the characteristics of E-A-T, it can be considered low-quality.
  4. The website has a negative reputation.
    This is where reputation research comes back into play. Ensure you have a great online reputation for your website (or your personal name, if you’re writing under your own name). That said, don’t be overly concerned about it if you have a couple of negative reviews; almost every business does. But if you have overwhelmingly negative reviews, it will be an issue when it comes to how the quality raters see and rate your site.
  5. The supplementary content is distracting or unhelpful for the purpose of the page.
    Again, don’t hit your visitors over the head with ads, especially things like autoplay video ads or super flashy animated ads. Google wants the raters to be able to ignore ads on the page if they don’t need them. And again, don’t disguise your ads as content.

Sneaky redirects

If you include links to affiliate programs on your site, be aware that Google does consider these to be “sneaky redirects” in the Quality Rater’s Guidelines. While there isn’t necessarily anything bad about one affiliate link on the page, bombarding visitors with those affiliate links can impact the perceived quality of the page.

The raters are also looking for other types of redirects. These include the ones we usually see used as doorway pages, where you’re redirected through multiple URLs before you end up at the final landing page — a page which usually has absolutely nothing to do with the original link you clicked.

Spammy main content

There’s a wide variety of things that Google is asking the raters to look at when it comes to quality of the main content of the page. Some are flags for what Google considers to be the lowest quality — things that are typically associated with spam. A lot of things are unsurprising, such as auto-generated main content and gibberish. But Google wants their raters to consider other things that signal low quality, in their eyes.

Keyword stuffing

While we generally associate keyword stuffing with content so heavy with keywords that it comes across as almost unreadable, Google also considers it keyword stuffing when the overuse of those keywords seems only a little bit annoying. So for those of you that think you’re being very clever about inserting a few extra keywords in your content, definitely consider it from an outsider’s point of view.
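There’s no published threshold for what counts as stuffing, but a quick density check can flag pages where a phrase has crept in too often. A small self-audit sketch in Python; the sample text and target phrase are invented:

```python
import re

def keyword_density(text, phrase):
    """Return the percentage of the page's words taken up by exact
    repetitions of a target phrase. There is no official cutoff;
    this just makes overuse visible."""
    words = re.findall(r"[a-z']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    hits = sum(words[i:i + n] == target
               for i in range(len(words) - n + 1))
    return 100.0 * hits * n / max(len(words), 1)

sample = ("Best running shoes for best running comfort. Our best "
          "running shoes guide reviews the best running shoes.")
print(round(keyword_density(sample, "best running shoes"), 1))  # over 50%!
```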

Copied content

This shouldn’t come as a surprise, but many people feel that unless someone is doing a direct comparison, they can get away with stealing or “borrowing” content. Whether you’re copying or scraping the content, Google asks the raters to look specifically at whether the content adds value or not. They also instruct them on how to find stolen content using Google searches and the Wayback Machine.
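If you want to approximate the check a rater might run, the Internet Archive exposes a public availability endpoint for the Wayback Machine. A hedged sketch of that lookup; it assumes the third-party `requests` library, and the example URL and date are arbitrary:

```python
import requests  # third-party: pip install requests

def wayback_snapshot(url, timestamp=None):
    """Query the Internet Archive's public availability API for the
    snapshot closest to `timestamp` (YYYYMMDD) -- the same kind of
    lookup a rater can do by hand at archive.org."""
    params = {"url": url}
    if timestamp:
        params["timestamp"] = timestamp
    resp = requests.get("https://archive.org/wayback/available",
                        params=params, timeout=10)
    resp.raise_for_status()
    # `closest` is None if the URL was never archived; otherwise it
    # includes the snapshot URL and its capture timestamp.
    return resp.json().get("archived_snapshots", {}).get("closest")

print(wayback_snapshot("moz.com", "20120101"))
```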

Abandoned

We still come across sites where the forum is filled with spam, where there’s no moderation on blog comments (so they’re brimming with auto-approved pharmaceutical spam), or where they’ve been hacked. Even if the content seems great, this still signals an untrustworthy site. If the site owner doesn’t care enough to prevent it, why should a visitor care enough to consider it worthy?

Scam sites

Whether a site is trying to solicit extensive personal information, is running a known scam, or is a phishing page, these are all signs of a lowest-quality page. Also included are pages with suspicious download links. If you’re offering a download, make sure it looks as legitimate as possible, or use a third-party verified service to offer downloads.

Mobile-friendly

If you haven’t taken one of the many hints from Google to make your site mobile friendly, know that this will hurt the perceived quality of your site. In fact, Google tells their raters to rate any page that is not mobile-friendly (a page that becomes unusable on a mobile device) at the lowest rating.

In this latest version of the quality guidelines, all ratings are now being done on a mobile device. Google has been telling us over and over for the last couple of years that mobile is where it’s at, and many countries have more mobile traffic than desktop. So, if you still haven’t made your site mobile-friendly, this should tell you emphatically that it needs to be a priority.
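Before a rater ever loads your page, you can run a crude smoke test yourself. To be clear, this is not Google’s mobile-friendly test, just a rough heuristic: responsive pages almost always declare a viewport meta tag, so its absence is a strong hint of trouble. A minimal Python sketch:

```python
import re
import urllib.request

def has_viewport_meta(url):
    """Rough heuristic only: responsive designs nearly always include
    <meta name="viewport" ...>. Missing it usually means the page
    renders as a shrunken desktop layout on phones."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read(200_000).decode("utf-8", errors="replace")
    return bool(re.search(r'<meta[^>]+name=["\']viewport["\']',
                          html, re.IGNORECASE))

print(has_viewport_meta("https://example.com/"))  # substitute your own URL
```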

If you have an app, raters are also looking at things like app installs and in-app content in the search results.

Know & Know Simple Queries

Google added a new concept to their quality guidelines this year. It comes down to what they consider “Know Queries” and “Know Simple Queries.” Why is this important? Because Know Simple Queries are the driving force behind featured snippets, something many webmasters are coveting right now.

Know Simple

Know Simple Queries are the types of searches that could be answered in either one to two sentences or in a short list. These are the types of answers that can be featured quite easily in a featured snippet and contain most of the necessary information.

These are also queries where there’s usually a single accepted answer that most people would agree on. These are not controversial questions or types of questions where there are two very different opinions on the answer. These include things such as how tall or how old a particular person is – questions with a clear answer.

These also include implied queries. These are the types of searches where, even though it’s not in the form of a question, there’s clearly a question being asked. For example, someone searching for “Daniel Radcliffe’s height” is really asking “How tall is Daniel Radcliffe?”

If you’re looking for featured snippets, these are the types of questions you want to answer with your webpages and content. And while the first paragraph may only be 1–2 sentences long as a quick answer, you can definitely expand on it in subsequent paragraphs, particularly for those who are concerned about the length of content on the page.

Know Queries

The Know Queries are all the rest of the queries that would be too complex or have too many possible answers. For example, searches related to stock recommendations or a politician wouldn’t have a featured snippet because it’s not clear exactly what the searchers are looking for. “Barack Obama” would be a Know Query, while “Barack Obama’s age” would be a Know Simple Query.

Many controversial topics are considered to be Know Queries, because there are two or more very different opinions on the topic that usually can’t be answered in those 1–2 sentences.

The number of keywords in a search doesn’t necessarily determine whether it is a Know Query or a Know Simple Query. Many long-tail searches would still be considered Know Queries.

Needs Met

Needs Met is another new section in the Quality Rater’s Guidelines. It looks at how well a search result meets the searcher’s query. This is where sites trying to rank for queries they don’t have supporting content for will have a hard time, since those landing pages won’t give searchers what they’re actually looking for.

Ratings for this range from “Fully Meets” to “Fails to Meet.”

The most important thing to know is that any site that is not mobile-friendly will get “Fails to Meet.” Again, if your site is not mobile-friendly, you need to make this an immediate priority.

Getting “Highly Meets”

Essentially, your page needs to answer whatever the search query is. This means that searchers can find all the information they were looking for on your page, without having to visit other pages or websites for the answer. This is why it’s so crucial to make sure that your titles and keywords match your content, and that your content is high-quality enough to fully answer whatever searchers are looking for when your page surfaces in the SERPs.

Local Packs & “Fully Meets”

If your site is coming up in a local 3-pack, as long as those results in the 3-pack match what the query was, they can be awarded “Fully Meets.” The same applies when it’s a local business knowledge panel — again, provided that it matches whatever the search query is. This is where local businesses that spam Google My Business will run into problems.

Product pages

If you have a quality product page and it matches the search query, that page can earn “Highly Meets.” This applies both to more general queries, the type that might lead to a page listing all the products of a given type (such as a listing page for backpacks), and to queries for a specific product (such as one particular backpack).

Featured snippets

Raters also look at featured snippets and gauge how well those snippets answer the question. We’ve all seen instances where a featured snippet seems quite odd compared to the search query, so Google seems to be testing how well their algorithm is choosing those snippets.

“Slightly Meets” and “Fails to Meet”

Google wants the raters to look at things like whether the content is outdated, or is far too broad or too specific for what the page is primarily about. Also included is content created without any expertise, or with other signals that make it low-quality and untrustworthy.

Dated & updated content

There’s been a trend lately where webmasters change the dates on some of their content to make it appear more recent than it really is, even if nothing on the page has changed. In contrast, others add updated dates to their content when they do a refresh or check, even when the publish date remains the same. Google now takes this into account and asks raters to check the Wayback Machine if there are any questions about the content’s date.

Heavy monetization

Often, YMYL sites run with heavy monetization. This is one of the things that Google asks the raters to look for, particularly if it’s distracting from the main content. If your page is YMYL, then you’ll want to balance the monetization with usability.

Overall

First and foremost, the biggest takeaway from the guidelines is to make your site mobile-friendly (if it isn’t already). Without being mobile-friendly, you’re already missing out on the mobile-friendly ranking boost, which means your site will get pushed down further in the results when someone searches on a mobile device. Clearly, Google is also looking at mobile-friendliness as a sign of quality. You might have fabulous, high-quality content, but Google sees those non-mobile-friendly pages as low-quality.

Having confirmation about how Google looks at queries when it comes to featured snippets means that SEOs can take more advantage of getting those featured snippets. Gary Illyes from Google has said that you need to make sure that you’re answering the question if you want featured snippets. This is clearly what’s at the heart of Know Simple Queries. Make sure that you’re answering the question for any search query you hope to get a featured snippet on.

Take a look at your supplementary content on the page and how it supports your main content. Adding related articles and linking to articles found on your own site is a simple way to provide additional value for the visitor — not to mention the fact that it will often keep them on your site longer. Think usefulness for your visitors.

And while looking at that supplementary content, make sure you’re not going overboard with advertising, especially on YMYL sites. It can sometimes be hard to strike a balance between monetization and user experience, but this is where looking closely at your monetization efforts and figuring out what’s actually making money can really pay off. It’s not uncommon to find that some ad units generate pennies a month and really aren’t worth cluttering up the page for fifty cents of monthly revenue.
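One way to make that call concrete is to pull per-unit earnings out of your ad reports and sort them. A toy sketch with invented numbers; substitute your own reporting data and your own cutoff:

```python
# Hypothetical monthly earnings per ad unit, in dollars --
# substitute figures from your own ad network's reporting.
ad_units = {
    "sidebar_300x250": 41.20,
    "in_content_native": 18.75,
    "footer_728x90": 0.52,
    "sticky_anchor": 0.31,
}

CUTOFF = 1.00  # under a dollar a month probably isn't worth the clutter

for unit, revenue in sorted(ad_units.items(), key=lambda kv: kv[1]):
    verdict = "consider removing" if revenue < CUTOFF else "keep"
    print(f"{unit:<20} ${revenue:>6.2f}/mo  {verdict}")
```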

Make sure you provide sufficient information so that a visitor, or a quality rater, can answer simple questions about your site. Is the author reputable? Does the site have authority? Should people consider the site trustworthy? And don’t forget to include things like a simple contact form. Your site should reflect E-A-T: Expertise, Authoritativeness, and Trustworthiness.

Bottom line: Make sure you present the highest-quality content from highly reputable sources. The higher the perceived value of your site, the higher the quality ratings will be. While this doesn’t translate directly into higher rankings, doing well with regard to these guidelines can translate into the type of content Google wants to serve higher in the search results.


