Link Echoes (a.k.a. Link Ghosts): Why Rankings Remain Even After Links Disappear – Whiteboard Friday

Posted by Isla_McKetta
Image of Tantalus courtesy of Clayton Cusak
What if I told you I could teach you to write the perfect headline? One that is so irresistible every person who sees it will click on it. You’d sign up immediately and maybe even promise me your firstborn.
But what if I then told you not one single person out of all the millions who will click on that headline will convert? And that you might lose all your credibility in the process. Would all the traffic generated by that “perfect” headline be worth it?
It isn’t really that bad, but with all the emphasis lately on headline science and the curiosity gap, Trevor (your faithful editor) and I (a recovering copywriter) started talking about the importance of headlines and what their role should be in regard to content. I’m for clickability (as long as there is strong content to back the headline) and, if he has to choose, Trevor is for credibility (with an equal emphasis on the quality of the eventual content).
Back in the good ol’ days, headlines were created to sell newspapers. Newsboys stood on street corners shouting the headlines in an attempt to hawk those newspapers. Headlines had to be enough of a tease to get readers interested but they had to be trustworthy enough to get a reader to buy again tomorrow. Competition for eyeballs was less fierce because a town only had so many newspapers, but paper cost money and editors were always happy to get a repeat customer.
Nowadays the competition for eyeballs feels even stiffer because it’s hard to get noticed in the vast sea of the internet. It’s easy to feel a little desperate. And it seems like the opportunity cost of turning away a customer is much lower than it was before. But aren’t we doing content as a product? Does the quality of that product matter?
There’s no arguing that headlines are important. In fact, at MozCon this year, Nathalie Nahai reminded us that many copywriters recommend an 80:20 ratio of energy spent on headline to copy. That might be taking things a bit far, but a bad (or even just boring) headline will tank your traffic. Here is some expert advice on writing headlines that convert:
Many readers still (consciously or not) consider headlines a promise. So remember, as you fill the headline with hyperbole and only write eleven of the twelve tips you set out to write, there is a reader on the other end hoping butter really is good for them.
This is where headline science can get ugly. Because a lot of “perfect” titles simply do not have the quality or depth of content to back them.
Those types of headlines remind me of the Greek myth of Tantalus. For sharing the secrets of the gods with the common folk, Tantalus was condemned to spend eternity surrounded by food and drink that were forever out of his reach. Now, content is hardly the secrets of the gods, but are we tantalizing our customers with teasing headlines that will never satisfy?
For me, reading headlines on BuzzFeed and Upworthy and their ilk is like talking to the guy at the party with all those super wild anecdotes. He’s entertaining, but I don’t believe a word he says, soon wish he would shut up, and can’t remember his name five seconds later. Maybe I don’t believe in clickability as much as I thought…
So I turn to credible news sources for credible headlines.
I’m having trouble deciding at this point if I’m more bothered by the headline at The Washington Post, the fact that they’re covering that topic at all, or that they didn’t really go for true clickbait with something like “You Won’t Believe the Bizarre Reasons Girls Scream at Boy Band Concerts.” But one (or all) of those things makes me very sad.
Even Upworthy is shifting their headline creation tactics a little. But that doesn’t mean they’re switching away from clickbait; it just means they’ve seen their audience get tired of the same old tactics. So they’re looking for new and better tactics to keep you engaged and clicking.
I think many of us would sell a little of our soul if it would increase our traffic, and of course those clickbaity curiosity gap headlines are designed to do that (and are mostly working, for now).
But we also want good traffic. The kind of people who are going to engage with our brand and build relationships with us over the long haul, right? Back to what we were discussing in the intro, we want the kind of traffic that’s likely to convert. Don’t we?
As much as I advocate for clickable headlines, the riskier the headline I write, the more closely I compare overall traffic (especially returning visitors) to click-throughs, time on page, and bounce rate to see if I’ve pushed it too far and am alienating our most loyal fans. Because new visitors are awesome, but loyal customers are priceless.
At Moz, we’re trying to find the delicate balance between attracting all the customers and attracting the right customers. In my first week here when Trevor and Cyrus were polling readers on what headline they’d prefer to read, I advocated for a more clickable version. See if you can pick out which is mine…
Yep, you guessed it. I suggested “Your Google Algorithm Cheat Sheet: Panda, Penguin, and Hummingbird” because it contained a trigger word and a keyword, plus it was punchy. I actually liked “A Layman’s Explanation of the Panda Algorithm, the Penguin Algorithm, and Hummingbird,” but I was pretty sure no one would click on it.
Last time I checked, that has more traffic than any other post for the month of June. I won’t say that’s all because of the headline—it’s a really strong and useful post—but I think the headline helped a lot.
But that’s just one data point. I’ve also been spicing up the subject lines on the Moz Top 10 newsletter to see what gets the most traffic.
And the results here are more mixed. Titles I felt were much more clickbaity, like “Did Google Kill Spam?…” and “Are You Using Robots.txt the Right Way?…”, underperformed compared to the straightforward “Moz Top 10.”
Meanwhile, the most clickbaity subject line, “Groupon Did What?…”, and the two about Google selling domains (accurate, but they suggested Google was selling its own domains, which worried me a bit) have the most opens overall.
As you can tell, I have some unresolved feelings about this whole clickbait versus credibility thing. While Trevor and I have strong opinions, we also have a lot of questions that we hope you can help us with. Blow my mind with your headline logic in the comments by sharing your opinion on any of the following:
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
Posted by Everett
This is Inflow’s process for doing content audits. It may not be the “best” way to do them every time, but we’ve managed to keep it fairly agile in terms of how you choose to analyze, interpret and make recommendations on the data. The fundamental parts of the process remain about the same across numerous types of websites no matter what their business goals are: Collect all of the content URLs on the site and fetch the data you need about each URL. Then analyze the data and provide recommendations for each content URL. Theoretically it’s simple. In practice, however, it can be a daunting exercise if you don’t have a plan or process in place. By the end of this post we hope you’ll have a good start on both.
A content audit can help in a variety of different ways, and the approach can be customized for any given scenario. I’ll write more about potential “scenarios” and how to approach them below. For now, here are some things a content audit can help you accomplish…
Inflow’s technical SEO specialist Rick Ramos performed an earlier version of our content audit last year for Phases Design Studio, who graciously permitted us to share their case study. After taking an inventory of all content URLs on the domain, Rick outlined a plan to noindex/follow and remove from their sitemap many of the older blog posts that were no longer relevant and weren’t candidates for a content refresh. The site also had a series of campaign-based landing pages dating back to 2006. These pages typically had a life cycle of a few months, but were never removed from the site or Google’s index. Rick recommended that these pages be 301 redirected to a few evergreen landing pages that would be updated whenever a new campaign was launched—a tactic that works particularly well on seasonal pages for eCommerce sites (e.g. 2014 New Year’s Resolution Deals). Still more pages were candidates to be updated/refreshed, or improved in other ways.
Shortly after the recommendations were implemented, the client called to ask if we knew why they were suddenly seeing eight times the number of leads they were used to seeing month over month.
There are several probable reasons why this approach worked for our client. Here are a few of them…
This improved the overall customer experience on the site, as well as organic search rankings for important topic areas that were consolidated.
Since then we have refined and improved the process, and have been performing these audits on a variety of sites with great success. The approach works particularly well for Panda recoveries on large-scale content websites, and for prioritizing which eCommerce product copy needs to be rewritten first.
Inflow’s content auditing process changes depending on the client’s goals, needs and budget. Generally speaking, however, here is how we approach it…
Each piece of the process can be customized for the needs of a particular website.
For example, when auditing a very large content site with lots of duplicate/thin/overlapping content issues we may skip the entire keyword research and content gap analysis part of the process and focus on pruning the site of these types of pages and improving the rest. Alternatively, a site without much content may need to focus on keyword research and content gaps. Other sites may be looking specifically for content assets that they can improve, repeat in new ways or leverage for newer content. One example of a very specific goal would be to identify interlinking opportunities from strong, older pages to promising, newer pages. For now it is sufficient to know that the framework can be changed as needed in a way that could dramatically affect where you spend your time in the process, or even which steps you may want to skip altogether.
There are several major steps in the content auditing process that require various documents. While I’m not providing links to our internal SOP documentation (mainly because it’s still evolving), I will describe each document and provide screenshots and links to examples / templates so you can have a foundation around which to customize one for your own needs.
We keep a list of recommendations for common scenarios to guide our approach to content audits. While every situation is unique in its own ways, we find this helps us get 90% of the way to the appropriate strategy for each client much faster. I discuss this in more detail later, but if you’d like to take a peek click here.
We were originally working within Google Docs, but as we started pulling in data from more sources and performing more VLOOKUPs, the spreadsheet would load so slowly on big sites as to make it nearly impossible to complete an audit. For this reason we have recently moved the entire process over to Excel, though the template we’re providing is in Google Docs format. Below are some of the tabs you may want in this spreadsheet…
This tab within the dashboard is where most of the work is done. Other tabs pull data from this one by VLOOKUP. Whether the data is fetched by API and compiled by one tool (e.g. URL Profiler) or exported manually from many tools and compiled manually (by VLOOKUP), the end result should be that you have all of the metrics needed for each URL in one place so you can begin sorting by various metrics to discern patterns, spot opportunities and make educated decisions on how to handle each piece of content, and the content strategy of the site as a whole.
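If you’d rather compile the data outside of a spreadsheet, a short script can do the same job as the VLOOKUPs. Below is a rough sketch in Python/pandas; the file names and column headers are hypothetical stand-ins for whatever your own exports contain, so treat it as a starting point rather than part of our actual tooling.

import pandas as pd

# Hypothetical exports from the crawl, analytics and link tools.
crawl = pd.read_csv("screaming_frog_internal_html.csv")   # needs a URL column, e.g. "Address"
analytics = pd.read_csv("ga_landing_pages.csv")           # e.g. "Landing Page", "Sessions", "Bounce Rate"
links = pd.read_csv("moz_link_metrics.csv")               # e.g. "URL", "Page Authority", "Linking Root Domains"

# Normalize the join key so every source shares the same column name.
crawl = crawl.rename(columns={"Address": "url"})
analytics = analytics.rename(columns={"Landing Page": "url"})
links = links.rename(columns={"URL": "url"})

# Left-join everything onto the crawl, which acts as the master URL list
# (the same role the VLOOKUPs play in the spreadsheet version).
audit = crawl.merge(analytics, on="url", how="left").merge(links, on="url", how="left")

# Empty decision columns to be filled in during analysis.
audit["Action"] = ""
audit["Strategy"] = ""

audit.to_csv("content_audit_tab.csv", index=False)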
You can customize the process to include whatever metrics you’d like to use. Here are the ones we’ve ended up with after some experimentation, as well as the source of the data:
Our recommendations typically fall into one of four “Action” categories: “Keep As-Is”, “Remove”, “Improve”, or “Consolidate”. Further details (e.g. remove and 404, or remove and 301? If 301, to where?) are provided in a column called “Strategy”. Some URLs (the important ones) will have highly customized strategies, while others may have been bulk processed, meaning thousands could share the same strategy (e.g. rewriting duplicate product description copy). The “Action” column is limited in choices so we can sort the data effectively (e.g. see all pages marked as “removed”) while the “Strategy” column can be more free-form and customized to the URL (e.g. consolidate /buy-blue-widgets/ content into /buying-blue-widgets/ and 301 redirect the former to the latter to avoid duplicating the same topic).
This tab includes keywords gathered from a variety of sources, including brainstorming for seed keywords, mining Google Webmaster Tools, PPC campaigns, the AdWords Keyword Planner and several other tools. Search Volume and Ad Competition (not shown in this screenshot) are pulled from Google’s Keyword Planner. The average ranking position comes from GWT, as does the top ranking page. The relevancy score is something we typically ask the client to provide once we’ve cleaned out most of the obvious junk keywords.
This tab includes URLs for important pages, and those that are ranking for – or are most qualified to rank for – important topics. It essentially matches up keywords with the best possible page to guide our copywriting and on-page optimization efforts.
Sometimes the KWM tab plays an important role in the process, like when the site is relatively new or unoptimized. Most of the time it takes a back-seat to other tabs in terms of strategic importance.
This is where we put content ideas for high-volume, highly relevant keywords for which we could not find an appropriate page. Often it involves keywords that represent stages in the buying cycle or awareness ladder that have been overlooked by the company. Sometimes it plays an important role, such as with new and/or small sites. Most of the time this also takes a back-seat to more important issues, like pruning.
If it was marked for “Remove” or “Consolidate” it should be on this tab. Whether it is supposed to be removed and 301 redirected, canonicalized elsewhere, consolidated into another page, allowed to stay up but with a robots “noindex” meta tag, removed and allowed to 404/410… or any number of “strategies” you might come up with, these are the pages that will no longer exist once your recommendations have been implemented. I find this to be a very useful tab. For example, one could export this tab, send it to a developer (or a company like WP Curve), and have someone get started on most or all of the implementation. Our mantra for low-quality, under-performing content on sites that may have a Panda-related traffic drop is to improve it or remove it.
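Because this tab doubles as an implementation list, it can be turned straight into a redirect map for the developer. Here’s a minimal sketch, assuming hypothetical column names (“url” holding the path, “Redirect Target” holding the destination) and Nginx-style rewrite rules; adapt both to your own sheet and server.

import pandas as pd

# Hypothetical export of the Removed Content tab.
removed = pd.read_csv("removed_content_tab.csv")

# Rows with a destination get a 301; rows without one are simply deleted and left to 404/410.
to_redirect = removed[removed["Redirect Target"].notna()]
to_delete = removed[removed["Redirect Target"].isna()]

with open("redirects.conf", "w") as f:
    for _, row in to_redirect.iterrows():
        # Assumes "url" holds the path portion (e.g. /buy-blue-widgets/); swap in .htaccess
        # syntax here if that's what your developer needs.
        f.write(f"rewrite ^{row['url']}$ {row['Redirect Target']} permanent;\n")

to_delete.to_csv("pages_to_delete.csv", index=False)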
In addition to the tabs above, we also have data tabs that are in the spreadsheet to house exported data from the various sources so we can perform VLOOKUPs based on the URL to populate data in other tabs. These data tabs include:
The more data that can be compiled by a tool like URL Profiler, the fewer data tabs you’ll need and the faster this entire process will go. Before we built the internal tool to automate parts of the process, we also had tabs for GA data, Moz data, and the initial Screaming Frog export.
If you don’t know how to do a VLOOKUP, there are plenty of online tutorials for Excel and Google Docs spreadsheets. Here’s one I found useful for Excel. Alternatively, you could import all of the data into the tabs and ask someone more spreadsheet-savvy on your team to do the lookups. Our resident spreadsheet guru is Caesar Barba, and he has great hair. Below is an example of a simple VLOOKUP used to bring the “Action” over from the Content Audit tab for a URL in the Keyword Matrix tab…
=VLOOKUP(A2,'Content Audit'!A:C,3,FALSE)
The Content Audit Dashboard is just what we need internally: A spreadsheet crammed with data that can be sliced and diced in so many useful ways that we can always go back to it for more insight and ideas. Some clients appreciate it as well, but most are going to find the greater benefit in our final content strategy, which includes a high-level overview of our recommendations from the audit.
There are many options for getting the data you need into one place so you can simultaneously see a broad view of the entire content situation, as well as detailed metrics for each URL. For URL gathering we use Screaming Frog and Google Analytics. For data we use Google Webmaster Tools (GWT), Google Analytics (GA), Social Count (SC), Copyscape (CS), Moz, CMS exports, and a few other data sources as needed.
However, we’ve been experimenting with using URL Profiler instead of our internal tool to pull all of these data sources together much faster. URL Profiler is a few hundred bucks and is very powerful. It’s also somewhat of a pain to set up the first time, so be prepared for several hours of tracking down API keys before getting all of the data you need.
No matter how you end up pulling it all together in the end, doing it yourself in Excel is always an option for the first few times.
Below is the step-by-step process for an “average” client – whatever that means. Let’s say it is a medium-sized eCommerce client with about 800-900 pages indexed by Google, including category, product, blog posts and other pages. They don’t have an existing penalty that we know of, but could certainly be at risk of being affected by Panda due to some thin, overlapping, duplicate, outdated and irrelevant content on the site.
Every situation is different, but we have found common similarities based on two primary factors: the size of the site and its content-based penalty risk. Below is a screenshot from our list of recommended strategies for common content auditing scenarios, which can be found here on GoInflow.com.
Each of the colored boxes drops down to reveal the strategy for that scenario in more detail.
Hat tip to Ian Lurie’s Marketing Stack for design inspiration.
The site described above would fall into the second box within the purple column (Focus: Content Audit with an eye to Improve and/or Prune, followed by KWM for key pages). Here is the reasoning behind that…
The site is in danger of a penalty (though it does not appear to have one “yet”), so we follow the Panda mantra:
Improve it or Remove it. The size of the site determines which of those two (improve or remove) gets the most attention. Smaller sites need less pruning (scalpel), while larger sites need much more (hatchet). Smaller sites often need some keyword research to determine if they are covering all of the topic areas for various stages in the customer’s buying cycle, while larger sites typically have the opposite problem: too many pages covering overlapping topic areas with low-quality (thin, duplicate, irrelevant, outdated, poorly written, automated…) content. Such a site would not require the keyword research, and would therefore not be getting a keyword matrix or content gap analysis, as the focus would be primarily on pruning the site.
Our focus in this example will be to audit the content with an eye to improve and/or remove low-performing pages, followed by keyword research and a keyword matrix for the primary pages, including the home page, categories, blog home and key product pages, as well as certain other topical landing pages.
As it turns out, this hypothetical website has lots of manufacturer-supplied product descriptions. We’re going to need to prioritize which ones get rewritten first because the client does not have the cash flow to do them all at once. When budget and time are a concern, we typically shoot for the 80/20 rule: write great content for the top 20% of pages right away, and do the other 80% over the course of 6-12 months as time/budget permit.
Because this site doesn’t have an existing penalty, we will recommend that all pages stay indexed. If they had a penalty already, we would recommend they noindex,follow the bottom 80% of pages, gradually releasing them back into the index as they are rewritten. This may not be the way you choose to handle the same situation, which is fine, but the point is you can easily sort the pages by any number of metrics to determine a relative “priority”. The bigger the site and tighter the budget, the more important it is to prioritize what gets worked on first.
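To make that 80/20 split concrete, here’s a minimal sketch of how the prioritization might be scripted: rank the product pages marked “Improve” by a traffic metric and flag the top 20% for immediate rewriting. The column names (“Action”, “Page Type”, “Sessions”) and file names are assumptions; use whatever priority metric makes sense for the client.

import pandas as pd

# Hypothetical export of the Content Audit tab.
audit = pd.read_csv("content_audit_tab.csv")

products = audit[(audit["Action"] == "Improve") & (audit["Page Type"] == "Product")].copy()
products = products.sort_values("Sessions", ascending=False)

# Top 20% get rewritten right away; the rest are scheduled over the following 6-12 months.
cutoff = max(1, int(len(products) * 0.2))
products["Priority"] = "Rewrite over 6-12 months"
products.iloc[:cutoff, products.columns.get_loc("Priority")] = "Rewrite now (top 20%)"

products.to_csv("product_rewrite_priorities.csv", index=False)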
Causes of Content-Related Penalties
For the purpose of a content audit we are only concerned with content-related penalties (as opposed to links and other off-page issues), which typically fall under three major categories: Quality, Duplication, and Relevancy. These can be further broken down into other issues, which include – but are not limited to:
If you are unsure about the scale of the site’s content problems, feel free to do step 2 before deciding on a scenario…
We use Screaming Frog for this step, but you can adapt this process to whatever crawler you want. This is how we configure the spider’s “Basic” and “Advanced” tabs…
And the advanced tab…
Notice that “crawl all subdomains” is checked. This is optional, depending on what you’re auditing. We are respecting “meta robots noindex”, “rel=canonical” and robots.txt. Also notice that we are not crawling images, CSS, JS, flash, external links, and so on. This type of stuff is what we look at in a Technical SEO Audit, but it would needlessly complicate a “Content” Audit. What we’re looking for here are all of the indexable HTML pages that might lead a visitor to the site from the SERPs, though the crawl may certainly lead to the discovery of technical issues.
Export the complete list of URLs and related data from Screaming Frog into a CSV file.
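If you want to pre-filter that export before pulling in the other data sources, a quick sketch like the one below keeps only the 200-status, indexable HTML pages. The column names reflect typical Screaming Frog exports but vary by version (older exports also put a summary row above the headers), so treat them as assumptions.

import pandas as pd

# Hypothetical file name for the "Internal HTML" export; add skiprows=1 if your version
# writes a summary line above the column headers.
crawl = pd.read_csv("internal_html.csv")

indexable = crawl[
    (crawl["Status Code"] == 200)
    & (crawl["Content"].str.contains("text/html", na=False))
    & (~crawl["Meta Robots 1"].str.contains("noindex", case=False, na=False))
]

indexable[["Address", "Title 1", "Word Count"]].to_csv("content_audit_urls.csv", index=False)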
We have our own internal “Content Auditing Tool”, which takes URLs and data from Screaming Frog and Google Analytics, de-dupes them, and pulls in data from Google Webmaster Tools, Moz, Social Count and Copyscape for each URL. The tool is a bit buggy at times, however, so I’ve been experimenting with URL Profiler, which can essentially accomplish the same goal with fewer steps and less upkeep. We need the “Agency” version, which is about $400 per year, plus tax. That’s not too bad, considering we’d already spent several thousand on our internal tool by the time Gareth Brown released URL Profiler publicly. :-/
Below is a screenshot of what you’ll see after downloading the tool. I’ve highlighted the boxes we currently check, though it depends on the tools/APIs to which you already subscribe and will differ by user. We’ve only just started playing with uClassify for the purpose of semi-automating our topic bucketing of pages, but I don’t have a process to share yet (feel free to comment with advice)…
Right-click on the URL List box and choose “Import From File”, then choose the Screaming Frog export or any other list of URLs. There are also options to import from the clipboard or an XML sitemap. Full documentation for URL Profiler can be found here. Below are two output screenshots to give you an idea of what you’re going to end up with…
The output changes depending on which boxes you check and what API access you have.
As described in the 50,000-foot overview above, we have a spreadsheet template with multiple tabs, one of which is the “Content Audit” tab. The tool output gets brought into the Content Audit tab of the dashboard. Our internal tool automatically adds columns for Action, Strategy, Page Type and Source (of the URL). You can also add these to the tab after importing the URL Profiler output. Page Type and URL Source are optional, but Action and Strategy are key elements of the process.
Our hypothetical client requires a Keyword Matrix. However, if your “scenario” does not involve keyword research (e.g. a big site with content penalty risks) you can skip steps 5-7 and move straight to “Step 8 – Time to Analyze and Make Some Decisions”.
Match existing URLs from the content audit to keywords for which they already rank in Google Webmaster Tools
There may be a way to do this with URL Profiler. If so, I haven’t found it yet. Here is what we do to grab the landing page and associated keyword/query data from Google Webmaster Tools, which we then import into two tabs (GWT Top Queries and GWT Top Pages). These tabs are helpful when filling out the Keyword Matrix because they tell you which pages Google is already associating with each ranking keyword. This step can actually be skipped altogether for huge sites with major content problems because the “Focus” is going to be on pruning the site of low quality content, rather than doing any keyword research or content gap analysis.
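As a rough sketch of what that matching can look like outside the spreadsheet, the page-level query export can be grouped by page and joined back onto the audit URLs, so each URL carries the queries Google already associates with it. The export file and column names below are assumptions; adapt them to whatever your GWT download contains.

import pandas as pd

# Hypothetical export of page-level query data; assumed columns: "Page", "Query", "Impressions".
gwt = pd.read_csv("gwt_top_pages_queries.csv")

# One row per page, carrying its top queries ranked by impressions.
queries_by_page = (gwt.sort_values("Impressions", ascending=False)
                      .groupby("Page")["Query"]
                      .apply(lambda q: ", ".join(q.head(10)))
                      .rename("Ranking Queries"))

audit = pd.read_csv("content_audit_tab.csv")
audit = audit.merge(queries_by_page, left_on="url", right_index=True, how="left")
audit.to_csv("content_audit_with_queries.csv", index=False)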
Instructions for Importing Top Pages from GWT
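Run the packed console script below from your browser’s JavaScript console while viewing the Top Pages report in GWT. It expands each page’s keyword listing and downloads the page, keyword, impressions and clicks data as a GWT_data.tsv file.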
(function(){eval(function(p,a,c,k,e,r){e=function(c){return(c<a?'':e(parseInt(c/a)))+((c=c%a)>35?String.fromCharCode(c+29):c.toString(36))};if(!''.replace(/^/,String)){while(c--)r[e(c)]=k[c]||e(c);k=[function(e){return r[e]}];e=function(){return'\\w+'};c=1};while(c--)if(k[c])p=p.replace(new RegExp('\\b'+e(c)+'\\b','g'),k[c]);return p}('C=M;k=0;v=e.q(\'1g-1a-18 o-y\');z=16(m(){H(v[k]);k++;f(k>=v.c){15(z);A()}},C);m H(a){a.h(\'D\',\'#\');a.h(\'11\',\'\');a.F()}m A(){d=e.10(\'Z\').4[1].4;2=X B();u=B.W.R.Q(d);7=e.q(\'o-G-O\');p(i=0;i<7.c;i++){d=u.J(7[i]);2.K([d,7[i].4[0].4[0].j])}7=e.q(\'o-G-14\');p(i=0;i<7.c;i++){d=u.J(7[i]);2.K([d,7[i].4[0].4[0].j])}2.N(m(a,b){P a[0]-b[0]});p(i=2.c-1;i>0;i--){r=2[i][0]-2[i-1][0];f(r===1){2[i-1][1]=2[i][1];2[i][0]++}}5="S\\T\\U\\V\\n";9=e.q("o-y-Y");6=0;I:p(i=0;i<9.c;i++){f(2[6][0]===i){E=2[6][1];12{6++;f(6>=2.c){13 I}r=2[6][0]-2[6-1][0]}L(r===1);2[6][0]-=(6)}5+=E+"\\t";l=9[i].4[0].4.c;f(l>0)5+=9[i].4[0].4[0].j+"\\t";17 5+=9[i].4[0].w+"\\t";5+=9[i].4[1].4[0].w+"\\t";5+=9[i].4[3].4[0].w+"\\n";5=5.19(/"|\'/g,\'\')}x="1b:j/1c;1d=1e-8,"+1f(5);s=e.1h("a");s.h("D",x);s.h("1i","1j.1k");s.F()}',62,83,'||indices||children|thisCSV|count|pageTds||queries|||length|temp|document|if||setAttribute||text|||function||url|for|getElementsByClassName|test|link||tableEntries|pages|innerHTML|encodedUri|detail|currInterval|downloadReport|Array|timeout1|href|thisPage|click|expand|expandPageListing|buildCSV|indexOf|push|while|25|sort|open|return|call|slice|page|tkeyword|timpressions|tclicks|prototype|new|row|grid|getElementById|target|do|break|closed|clearInterval|setInterval|else|block|replace|inline|data|csv|charset|utf|encodeURI|goog|createElement|download|GWT_data|tsv'.split('|'),0,{}))})();
Ignore any dialog windows that pop up.
You can check “Prevent this page from creating additional dialogs” to disable them.
Instructions for Importing Top Queries from GWT
This is another optional step, depending on the focus/objective of the audit. It is also highly customizable to your own KWR process. Use whatever methods you like for gathering the list of keywords (e.g. brainstorming, SEMrush, Google Trends, Ubersuggest, GWT, GA…). Ensure all “junk” and irrelevant keywords are removed from the list, and run the rest through a single tool that collects search volume and competition metrics. We use the Google AdWords Keyword Planner, which is outlined below.
Use the settings below when downloading the plan:
Again, you don’t need to do this step if you’re working on a large site and the focus is on pruning out low-quality content. The GWT Queries and KWR steps provide data needed to develop a “Keyword Matrix” (KWM), which isn’t necessary unless part of your focus is on-page optimization and copywriting of key pages. Sometimes you just need to get a client out of a penalty, or remove the danger of one. The KWM comes in handy for the important pages marked as “Improve” within the Content Audit tab, just so the person writing the copy understands which keywords are important for that page. It’s SEO 101 and you can do it any way you like using whatever tools you like.
Google AdWords has given you the keyword, search volume and competition. Google Webmaster Tools has given you the ranking page, average position, impressions, clicks and CTR for each keyword. Pull these together into a tab called “Keyword Research” using VLOOKUPs. You should end up with something like this:
The purpose of these last few steps was to help with the KWM, an example of which is shown below:
All of the data is right in front of you, and your path has been laid out using the Content Audit Scenarios tool. From here on the actual step-by-step process becomes much more open to interpretation and your own experience / intuition. Therefore, do not consider this a linear set of instructions meant to be carried out one after another. You may do some of them and not others. You may do them a little differently. That is all fine as long as you are working toward the goal of determining what to do, if anything, for each piece of content on the website.
Another Way of Thinking About It…
For big sites, it is best to use a hatchet approach as much as possible and finish up with a scalpel at the end. Otherwise you’ll spend way too much time on the project, which eats into the ROI.
This is not a process that can be documented step-by-step. For the purpose of illustration, however, here are a few different examples of hatchet approaches and when to consider using them.
Although most of these could be treated as optional items during the keyword research process, I prefer to save them until last because I never know how much time I’ll have after taking care of more pressing issues.
Content gaps
If you’ve gone through the trouble of identifying keywords and the pages already ranking for them, it isn’t much of a step further to figure out which keywords could lead to ideas about how to fill content gaps.
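In spreadsheet terms this is an anti-join: any researched keyword that never got matched to a page in the keyword matrix becomes a content gap candidate. Here’s a minimal sketch, again with assumed file and column names (“Keyword”, “Search Volume”, “Best Page”).

import pandas as pd

keywords = pd.read_csv("keyword_research_tab.csv")
matrix = pd.read_csv("keyword_matrix_tab.csv")

# Keywords with no "best page" assigned are the gaps, highest search volume first.
gaps = keywords.merge(matrix[["Keyword", "Best Page"]], on="Keyword", how="left")
gaps = gaps[gaps["Best Page"].isna()].sort_values("Search Volume", ascending=False)

gaps.to_csv("content_gap_candidates.csv", index=False)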
At Inflow we like to use the “Awareness Ladder” developed by Ben Hunt, as featured in his book Convert!. You can learn more about it here.
Content levels
If time permits, or the situation dictates, we may also add a column to the Keyword Matrix or Content Audit which identifies which level of content the page would need to compete in its keyword space. We typically choose from Basic, Standard and Premium. This goes a long way in helping the client allocate copywriting resources to work where they’re needed the most (i.e. best writers do the Premium content).
Landing page or keyword topic buckets
If time permits, or the situation dictates, we may provide topic bucketing for landing pages and/or keywords. More than once this has resulted in recommendations for adding to or changing existing taxonomy with great results. The most frequent example is in the “How To” or “Resources” space for any given niche.
Keyword relevancy scores
This is a good place to enlist the help of a client, especially in complicated niches with a lot of jargon. Sometimes the client can be working on this while the strategist is doing the content audit.
The Content Strategy, or whatever you decide to call it, should be delivered at the same time as the audit, and summarizes the findings, recommendations and next steps from the audit. It should start with an Executive Summary and then drill deeper into each section outlined therein.
Here is a real example of an executive summary from one of Inflow’s Content Audit Strategies:
As a result of our comprehensive content audit, we are recommending the following, which will be covered in more detail below:
- Removal of about 624 pages from the Google index by deletion or consolidation:
  - 203 pages marked for removal with a 404 error (no redirect needed)
  - 110 pages marked for removal with a 301 redirect to another page
  - 311 pages marked for consolidation of their content into other pages
    - followed by a redirect to the page into which they were consolidated
- Rewriting or improving of 668 pages:
  - 605 product pages to be rewritten due to use of manufacturer product descriptions (duplicate content), prioritized from first to last within the Content Audit
  - 63 “other” pages to be rewritten due to low-quality or duplicate content
- Keeping 26 pages as-is, with no rewriting or improvements needed unless the page exists in the Keyword Matrix, in which case on-page optimization best practices should be reviewed/applied
- On-page optimization focus for 25 pages, with keywords outlined in the Keyword Matrix tab
These changes reflect an immediate need to “improve or remove” content in order to avoid an obvious content-based penalty from Google (e.g. Panda) due to thin, low-quality and duplicate content, especially concerning Representative and Dealers pages with some added risk from Style pages.
The Content Strategy should end with recommended next steps, including action items for the consultant and the client. Here is a real example from one of our documents:
We recommend the following actions in order of their urgency and/or potential ROI for the site:
Example Content Auditing Dashboard
Make a copy of this Google Docs spreadsheet, which is a basic version of how we format ours at Inflow.
Content Audit Strategies for Common Scenarios
This page/tool will help you determine where to start and what to focus on for the majority of situations you’ll encounter while doing comprehensive content audits.
How to Conduct a Content Audit on Your Site by Neil Patel of QuickSprout
Oh wait, I can’t in good conscience send everyone to a page that makes them navigate a gauntlet of pop-ups to see the content, and another one to leave. So never mind…
How to Perform a Content Audit by Kristina Kledzik of Distilled
This one focuses mostly on categorizing pages by buying cycle stage.
Expanding the Horizons of eCommerce Content Strategy by Dan Kern of Inflow
Dan wrote an epic post recently about content strategies for eCommerce businesses, which includes several good examples of content on different types of pages targeted toward various stages in the buying cycle.
Distilled’s Epic Content Guide
See the section on Content Inventory and Audit.
The Content Inventory is Your Friend by Kristina Halvorson on BrainTraffic
Praise for the life-changing powers of a good content audit inventory.
How to Perform a Content Marketing Audit by Temple Stark on Vertical Measures
Temple did a good job of spelling out the “how to” in terms of a high-level overview of his process to inventory content, assess its performance and make decisions on what to do next.
Why Traditional Content Audits Aren’t Enough by Ahava Leibtag on Content Marketing Institute’s blog
While not a step-by-step “How To” like this post, Ahava’s call for marketing analysts to approach these projects from both a quantitative (content inventory) and a qualitative (content quality audit) angle resonated with me the first time I read it, and is partly responsible for how I’ve approached the process outlined above.
Posted by Cyrus-Shepard
The folks at Groupon surprised us earlier this summer when they reported the results of an experiment that showed that up to 60% of direct traffic is organic.
In order to accomplish this, Groupon de-indexed their site, effectively removing themselves from Google search results. That’s crazy talk!
Of course, we knew we had to try this ourselves.
We rolled up our sleeves and chose to de-index Followerwonk, both for its consistent Google traffic and its good analytics setup—that way we could properly measure everything. We were also confident we could quickly bring the site back into Google’s results, which minimized the business risks.
(We discussed de-indexing our main site moz.com, but… no soup for you!)
We wanted to measure and test several things:
Here’s what happened.
The fastest, simplest, and most direct method to completely remove an entire site from Google search results is by using the URL removal tool.
We also understood, via statements by Google engineers, that using this method gave us the biggest chance of bringing the site back, with little risk. Other methods of de-indexing, such as using meta robots NOINDEX, might have taken weeks and caused recovery to take months.
CAUTION: Removing any URLs from a search index is potentially very dangerous, and should be taken very seriously. Do not try this at home; you will not pass go, and will not collect $200!
After submitting the request, Followerwonk URLs started disappearing from Google search results in 2-3 hours.
The information needs to propagate across different data centers across the globe, so the effect can be delayed in some areas. In fact, for the entire duration of the test, organic Google traffic continued to trickle in and never dropped to zero.
In the Groupon experiment, they found that when they lost organic traffic, they actually lost a bunch of direct traffic as well. The Groupon conclusion was that a large amount of their direct traffic was actually organic—up to 60% on “long URLs”.
At first glance, the overall amount of direct traffic to Followerwonk didn’t change significantly, even when organic traffic dropped.
In fact, we could find no discrepancy in direct traffic outside the expected range.
I ran this by our contacts at Groupon, who said this wasn’t totally unexpected. You see, in their experiment they saw the biggest drop in direct traffic on long URLs, defined as URLs long enough to sit in a subfolder, like https://followerwonk.com/bio/?q=content+marketer.
For Followerwonk, the vast majority of traffic goes to the homepage and a handful of other URLs. This means we didn’t have a statistically significant sample size of long URLs to judge the effect. For the long URLs we were able to measure, the results were nebulous.
Conclusion: While we can’t confirm the Groupon results with our outcome, we can’t discount them either.
It’s quite likely that a portion of your organic traffic is attributed as direct. This is because different browsers, operating systems and user privacy settings can block referral information from reaching your website.
After waiting 2 hours, we deleted the request. Within a few hours all traffic returned to normal. Whew!
If the time period is short enough, and you used the URL removal tool, apparently not.
In the case of Followerwonk, Google removed over 300,000 URLs from its search results, and made them all reappear in mere hours. This suggests that the domain wasn’t completely removed from Google’s index, but only “masked” from appearing for a short period of time.
In both the Groupon and Followerwonk experiments, the sites were only de-indexed for a short period of time, and bounced back quickly.
We wanted to find out what would happen if you de-indexed a site for a longer period, say two and a half days.
I couldn’t convince the team to remove any of our sites from Google search results for a few days, so I chose a smaller personal site that I often subject to merciless SEO experiments.
In this case, I de-indexed the site and didn’t remove the request until three days later. Even with this longer period, all URLs returned within just a few hours of cancelling the URL removal request.
In the chart below, we revoked the URL removal request on Friday the 25th. The next two days were Saturday and Sunday, both lower traffic days.
Likely, the URLs were still in Google’s index, so we didn’t have to wait for them to be recrawled.
Here’s another shot of organic traffic before and after the second experiment.
For longer removal periods, a few weeks for example, I speculate Google might drop these URLs semi-permanently from the index, and re-inclusion would take much longer.
Moz community member Adina Toma wrote an excellent YouMoz post on the re-inclusion process using the same technique, with some excellent tips for other, more extreme situations.
Big thanks to Peter Bray for volunteering Followerwonk for testing. You are a brave man!
Posted by anthonycoraggio
PPC and SEO go better together. By playing both sides of the coin, it’s possible to make more connections and achieve greater success in your online marketing than with either alone.
It is well known that the data found in search query reporting within AdWords can be a valuable source of information for keyword research. Managing the interaction effects of sharing the SERPs and capturing reinforcing real estate on the page is of course important. Smart marketers will use paid search to test landing pages and drive traffic to support experiments on the site itself. Harmony between paid and organic search is a defining feature of well-executed search engine marketing.
Unfortunately, that’s where the game all too often stops, leaving a world of possibilities for research and synergy waiting beyond the SERPs on the Google Display Network. Today I want to give you a couple techniques to kick your paid/organic collaboration back into gear and get more mileage from combining efforts across the disciplines.
If you’re not familiar with it already, the GDN is essentially the other side of AdSense, offering the ability to run banner, rich media, and even video ads across the network from AdWords or Doubleclick. There are two overarching methods of targeting these ads: by context/content, and by using remarketing lists. Regardless of your chosen method, ads here are about as cheap as you can find (often under a $1 CPC), making them a prime tool for exploratory research and supporting actions.
Contextual and content-based targeting offers some simple and intuitive ways to extend existing methods of PPC and SEO interaction. By selecting relevant topics, key phrases, or even particular sites, you can place ads in the wild to test the real world resonance of taglines and imagery with people consuming content relevant to yours.
You can also take a more coordinated approach during a content marketing campaign using the same type of targeting. Enter a unique phrase from any placements you earn on pages using AdSense as a keyword target, and you can back up any article or blog post with a powerful piece of screen real estate and a call to action that is fully under your control. This approach mirrors the tactic of using paid search ads to better control organic results, and offers a direct route to conversion that usually would not otherwise exist in this environment.
Remarketing on AdWords is a powerful tool to drive conversions, but it also produces some very interesting and frequently neglected data in the process: your reports will tell you which other sites and pages your targeted audience visits once your ads display there. You will, of course, be restricted here to sites running AdSense or DoubleClick inventory, but this still adds up to over 2 million potential pages!
If your firm is already running remarketing, you’ll be able to draw some insights from your existing data, but if you have a specific audience in mind, you may want to create a new list anyway. While it is possible to create basic remarketing lists natively in AdWords, I recommend using Google Analytics to take advantage of the advanced segmentation capabilities of the platform. Before beginning, you’ll need to ensure that your AdWords account is linked and your tracking code is updated.
First, define who exactly the users you’re interested in are. You’re going to have to operationalize this definition based on the information available in GA/UA, so be concrete about it. We might, for example, want to look at users who have made multiple visits within the past two weeks to peruse our resources without completing any transactions. Where else are they bouncing off to instead of closing the deal with us?
If you’ve never built a remarketing list before, pop into the creation interface in GA through Admin > Remarketing > Audiences. Hit the big red ‘+ Audience’ button to get started. You’re first presented with a selection of list types:
The first three options are the simplest and least customizable, so they won’t be able to parse out our theoretical non-transactors, but they can be handy for this application nonetheless. Smart Lists are a relatively new and interesting option: essentially, this will create a list based on Google’s best algorithmic guess at which of your users are most likely to convert upon returning to your site. The ‘black box’ element of Smart Lists makes them less precise as a tool here, but they’re simple to test to see what they turn up.
The next three are relatively self-explanatory: you can gather all users, all users to a given page, or all users who have completed a conversion goal. Where it gets truly interesting is when you create your own list using segments. All the might of GA opens up here for you to apply criteria for demographics, technology/source, behavior, and even advanced conditions and sequences. Very handily, you can also import any existing segments you’ve created for other purposes.
In this figure, we’re simply translating the example from above into some criteria that should fairly accurately pick out the individuals in which we are interested.
When you’ve put your list together, simply save it and hop back over to AdWords. Once it counts at least 100 users in its target audience, Google will let you show ads using it as targeting criteria. To set up the ad group, there are a few key considerations to bear in mind:
To check on the list size and status, you can find it in Shared Library > Audiences or back in GA. Once everything is in place, set your ads live and start pulling in some data!
You won’t get your numbers back overnight, but over time you will collect a list of the websites your remarketed ads show on: all the pages across the vast Google Display Network that your users visit. To find it, enter AdWords and select the ad group you set up. Click the “Display Network” and “Placements” tabs:
You’ll see a grid showing the domain level placements your remarketing lists have shown on, with the opportunity to customize the columns of data included. You can sift through the data on a more granular level by clicking “see details;” this will provide you with page level data for the listed domains. You’re likely to see a chunk of anonymized visits; there is a workaround to track down the pages in here, but be advised it will take a fair amount of extra effort.
Tada! There you are—a lovely cross section of your target segment’s online activities. Bear in mind you can use this approach with contextual, topic, or interest targeting that produces automatic placements as well.
Depending on your needs, there are of course myriad ways to make use of display advertising tools in sync with organic marketing. Have you come up with any creative methods or intriguing results? Let us know in the comments!