Measuring and Increasing the ROI of Your Content Resources

Posted by Mike Pantoliano

Let me cut right to the chase. Do you want to know the value of your content marketing efforts? Want this report?

the resulting assisted conversion report

Read on and I’ll tell ya!

Calculating the real ROI

With so much emphasis often put on the traffic generation potential of a good content marketing strategy, I want to focus this post on measuring and increasing the return on the (sometimes sneakily large) investment. Some common goals you’ll hear surrounding a content marketing strategy include generating traffic for generic terms, increasing social shares, and developing the brand’s authority (measured by increases in branded traffic, or some other indicator). In the right circumstances, all of these are nice metrics for the relevant stakeholders in the organization, but they’re all just proxies for measuring the growth of a business. They’re measurements of the means, not the end.

The impetus for a lot of what I’ll be talking about in this post comes from Josh Braaten’s post on the Google Analytics Blog a few months ago titled “How to Prove the Value of Content Marketing with Multi-Channel Funnels”. Josh talks practically about how to measure the business impact of traffic that first experiences your site via a page that isn’t directly selling a product or service to a consumer. Think: the “How to get into fly-fishing” article written by the outdoors retailer that sells fly-fishing poles, or even the “How to measure the effectiveness of content marketing” article written by the guy working for a company that’s putting on a two-day kick-ass web marketing conference in Boston on May 20th & 21st :). Indeed, these content pages aren’t selling a product or service, but they are selling the brand; the “purchase” the consumer makes is lasting trust, and it has a really low conversion rate.

The necessary analysis for this gets difficult because it is so rare for a user to make the jump from discovery/informational stage to transactional stage in one sitting. Hence the need for multi-channel analysis: we need to take a conversion, look back at all of the interactions that have taken place leading up to that conversion, and assign some amount of credit to those channels that often show up toward the beginning of the conversion path. Social networks and the content that usually ranks for generic keywords are most often found in these early interactions. They are inherently ‘openers’ or ‘exposers’.

So, now that we’ve covered the theory, let’s look at measuring that ROI.

Expanding upon Josh Braaten’s multi-content funnels

Everyone interested in what I’ve covered above should absolutely read Josh’s post. In it, Josh walks you through creating a report within Google Analytics’ Multi-Channel Funnels that classifies users by the type of page with which they first interacted (based on landing page).

creating a channel grouping in GA's MCFs

Custom channel creation is a lot like creating an advanced segment in GA

A long conversion path in MCF

The top conversions path report – seen here displaying a pretty convoluted conversion path for one particular conversion.

I’m going to offer a slightly different direction, but both approaches accomplish the goal of getting value out of our visit data. Instead of comparing content sections against each other, let’s compare our content as a whole against our other channels like direct, referral, organic, and paid.

Let’s do a step-by-step walkthrough

Head on down to the multi-channel funnels reports.

location for multi-channel funnels in google analytics

Make a copy of the basic channel grouping template.

make a copy of the basic channel grouping

Include traffic based on landing page URL. Hopefully you’ve got your resource center, blog, or content home on a neatly identifiable path in the URL. If you don’t, you may have to go the route of declaring page-level custom variables.

create your channel grouping
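If you do end up needing custom variables, here’s a minimal sketch using the classic (ga.js) asynchronous syntax; the slot number, the “PageType” name, and the “Resource Center” value are all hypothetical choices you’d make yourself:

<script type="text/javascript">
  var _gaq = _gaq || [];
  // Page-level custom variable (scope 3) in slot 1, set on every
  // resource center template before the pageview is tracked.
  _gaq.push(['_setCustomVar', 1, 'PageType', 'Resource Center', 3]);
  _gaq.push(['_trackPageview']);
</script>

You can then build your channel rule on that variable instead of the URL path.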

Drag it to the top. The order in which you put these channels is important because GA will go down the line until a match is found, then stop. If we leave our Resource Center channel at the bottom, the channels above will claim a ton of its visitors first because our rules aren’t mutually exclusive.

channel ordering is important
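Conceptually, a channel grouping behaves like a first-match-wins rule list. Here’s an illustrative sketch (not GA’s actual code, and the /resources/ path is made up):

// Rules are evaluated top to bottom; the first match wins.
var rules = [
  { channel: 'Resource Center', test: function (visit) { return (/^\/resources\//).test(visit.landingPage); } },
  { channel: 'Organic Search', test: function (visit) { return visit.medium === 'organic'; } }
];

function classifyVisit(visit) {
  for (var i = 0; i < rules.length; i++) {
    if (rules[i].test(visit)) { return rules[i].channel; }
  }
  return 'Other';
}

// classifyVisit({ landingPage: '/resources/some-post/', medium: 'organic' })
// returns 'Resource Center' only because that rule sits first; swap the
// order and 'Organic Search' would claim the visit before the landing
// page is ever checked.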

Though not completely related to this topic, I’d also suggest separating your organic channel into branded, unbranded, and (not provided).

break out your branded, not provided, and unbranded organic search

Because ordering is so important, if you put (not provided) first and branded second, the final organic group will necessarily consist of unbranded traffic.

use regex to create your branded channel segment

You can create this segment with a neatly crafted regex of your brand name and other branded terms.
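For instance, sticking with the kitten mittens example from later in this post, a branded pattern might look like this (the variants are hypothetical; build yours from your own keyword report):

kitten ?mitt(e|o)ns?|kittenmittens\.com

Sanity-check the pattern against your keyword data before saving; an overly greedy regex will pull unbranded queries into the branded channel.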

Finally, let GA calculate things out, and voila!

the resulting assisted conversion report

What can we learn from the above?

Well, it should be pretty clear that under the traditional model of last click analysis, our resource center is under-valued. This much is obvious by the disparity in last click conversions and conversion value compared with assisted conversions and conversion value. Not only that, but the “Assisted/Last Click or Direct Conversions” ratio (6.62 in the screenshot) tells us that this content is acting in an assist role more than any other channel we have (the higher the number, the more likely it’s an ‘opener’, not a ‘closer’ – those trend toward zero).

When we look at assisted conversion numbers, we CANNOT say that our resource center content is now directly responsible for $26k in revenue; that would not be quite fair using this model. But our content did have its hand in a lot more conversions than we may have originally assumed.

Now, as for this channel’s relative contribution to the bottom line compared with other channels, well, yes, it’s still a lot smaller. But consider that this particular website’s resource center is actually quite small, especially compared with the size of the rest of the site. Knowing how many pages are in a resource center makes it pretty easy to apply simple math to determine what each new page is roughly worth. Or you could choose to do deeper analysis into specific pages or sections within. Again, I point to Josh Braaten’s post for more on that.
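To make that concrete with made-up numbers: if a 50-page resource center assisted in $26,000 worth of conversions over the year, each page carries roughly $26,000 ÷ 50 = $520 in assisted conversion value per year, and that per-page baseline is what you take into the budget conversation.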

But at the end of the day if you know that each new page added to the resource center has an assist in $X worth of conversions per year, justifying expansion becomes a lot easier.

A bonus tip for content marketers

So that was measuring the ROI of a content marketing strategy. But I’ve actually got a tip for increasing ROI that I’d like to share.

Our content strategies target the generic keywords that, more often than not, align with the user’s information-seeking intent. If we had our way, the path would go like this:

Kitten mittens purchasing decision

A user searches “my cat’s too noisy” and lands on your site’s blog post “10 ways to deal with a noisy cat.”

The user reads and is very happy with your content. In that content, you suggest “kitten mittens,” a product that you sell.

The seed is planted in the user’s mind, and upon deciding that they’re ready to buy, the user either searches for your brand name, that post again, or the “kitten mittens” product, all of which lead back to your site.

always sunny's kitten mittons

Nightmare scenario time: what if they searched for “kitten mittens” and you don’t rank for that term? Well, your content has done all the hard work, but your high-ranking competitor swoops in and gets the purchase. This must be corrected. But how?

Remarketing

It doesn’t matter which remarketing tool you use (this would be super easy with GA’s remarketing tool – I wrote a post on it here!): put the user above in a “noisy cat owner” list, and target them with “kitten mittens” ads around the web.

creating the kitten mittens segment
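One setup note if you go the GA route: at the time of writing, GA remarketing requires the Display Advertiser version of the tracking code, which loads dc.js in place of ga.js; the list definition itself (e.g. “landing page contains the noisy cat post”) is built in the GA admin interface, not in code. The snippet change looks like this:

// Standard ga.js loader line:
ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';

// Remarketing-enabled loader line (dc.js):
ga.src = ('https:' == document.location.protocol ? 'https://' : 'http://') + 'stats.g.doubleclick.net/dc.js';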

Thanks for reading – I hope you learned something!

Let me know what you think in the comments or on Twitter, @MikeCP. Don’t forget that Distilled is running our search marketing conference, SearchLove, in Boston on May 20th and 21st!



Keyword-Driven Personas – Whiteboard Friday

Posted by RuthBurr

As inbound marketing is gaining traction, marketers in all inbound disciplines are realizing the importance of taking on keywords with a more holistic approach. It’s time to start building your keywords into the bones of your site, rather than adding them once your site is already completely mapped out. 

In today’s Whiteboard Friday, Ruth Burr discusses how you can use your keywords to drive personas, and ultimately affect your site mapping process for the better. Leave your thoughts and questions in the comments below! 

 

For your viewing pleasure, here’s a still image of the whiteboard used in this week’s video!

Still image of Whiteboard Friday - Ruth Burr - Keyword-Driven Personas

Update: Ruth refers to some code that Mike King of iAcquire put together that may help your site if integrated into your analytics. Give it a look!

Video Transcription

“Howdy, SEOmoz fans. My name’s Ruth Burr. Welcome to another Whiteboard Friday. I’m the Lead SEO here at SEOmoz, and today I want to talk about using keywords to drive personas and ultimately your site mapping process.

One thing that we’re really thinking a lot about as we move more and more toward an inbound marketing model, where there are multiple different people with multiple different functions all working together to have the best inbound marketing possible, is what we’re doing with keywords and sort of when we’re adding keywords into the site. I know that we’ve all had the experience in years past where we would get a site or get a piece of copy that was completely written and then just kind of have to plug our keywords into that existing content wherever they would fit. You might have an entire site that’s already completely mapped out, it’s got a sitemap, it’s got information architecture, and then you’re supposed to go in and put in your keywords. I’ve found that that is not always the best user experience for the keyword, and also isn’t as effective as taking a more holistic approach.

So what I’m really hoping you guys will get out of this is take it back to your UX and your IA teams and really think about how you can build keywords more into the bones of the site.

One thing that Google is thinking a lot about that is really important for us to be thinking about as marketers as well is searcher intent. Search engines are spending tons of money and tons of time and tons of effort trying to figure out what people are searching for when they use a keyword. It behooves us as marketers to do the same thing because that way we can give people what they want when they tell us they want it, and that’s the beauty of search engine marketing.

My example here is chocolate cookies, because I like to think about cookies. You might have somebody that’s searching for the keyword “chocolate cookies,” and maybe you own ChocolateCookies.com, a great domain. If that’s the case, you don’t really know what they want when they want chocolate cookies. They could be looking to buy chocolate cookies. They could want to learn how to make chocolate cookies. They could want recipes. You might also have ingredients. Maybe in addition to cookies you sell ingredients for cookies. Maybe you have recipe content and sales content, and you want to know how to serve up each of those pieces of content in a way that’s really going to serve the user. What you can start doing is really thinking about the search intent of each one of these keywords and building that in to a traditional persona-based marketing model.

This is my example model. All of these examples are made up. The data is not real. You cannot use this data and take it out and just go build ChocolateCookies.com. You could, but results are not guaranteed. To reiterate, this data, made up.

In my ChocolateCookies.com example, we’ve got three different personas. We’ve mapped out who they are and what they want. Now we can actually assign keywords to them. Say you’re trying to target people who want to make cookies. What they’re looking for, they’re looking for recipes, they’re looking for ingredients. They are not looking to buy cookies. If somebody googles “chocolate cookie recipes” and they click through to your site and it’s a page about how you can buy cookies from you, that is a bad user experience. Those people are not going to buy cookies, and they’re also going to bounce right back to the search results.

That is the kind of thing that search engines are tracking. How quickly did somebody return to the search results page from your site? Did they do it without taking an action? If so, that can be a signal that you’re not serving up quality content. It’s bad from a ranking factor’s perspective, and it’s also bad because that person did not give you money and that’s what we’re trying to do, trying to sell cookie recipes.

So you really want to make sure that this person when they’re searching for these keywords, which you’ve mapped back to their persona, you’re serving up chocolate cookie recipes. And if they’re looking for ingredients, you’re serving up ingredients. Then you’re creating an entire experience. You’re not just paying lip service saying, “Oh, here’s a recipe and then buy a bunch of stuff.” You really are serving them up that high quality content that users love, that brings them back to the site again and again. If the recipe content is good enough, this baker might even share your content and share it with their friends, and maybe even link to it from their blog that’s all about making cookies. Wouldn’t that be nice?

Then you might also have somebody who does not want to make cookies because they don’t have that kind of time. They want to buy cookies. They just want to buy them and then eat them. It’s a model that I practiced for years. So they’re going to be looking to buy cookies online. They’re not going to care about recipes at all. They’re not going to care about ingredients at all. They’re going to be much more purchase-driven and be looking at keywords around their favorite brands and looking for sales. These are the people that you can really incentivize with calls to action and trust signals, like free shipping, delivery, sales, coupons, join our mailing list, and things like that. You’ve now mapped these back, so again you’re creating this entire experience and all of this content based around the fact that this person does not care about recipes at all, they just want to buy.

Then our third persona is somebody who’s buying at the corporate level. Maybe they’re an office manager, or at SEOmoz, Team Happy is constantly buying us goodies and snacks, and we love that. But this person is in charge of the cookie supply at their office. What, does your office not have cookies? I’m so sorry. Get some cookies.

So this guy, he doesn’t care about recipes at all. He’s not going to make cookies every day for 100 people. He wants to buy them, and he’s not spending his own money. He’s spending the company’s money. So he’s looking for things like a corporate discount or a bulk discount. Maybe he’s catering a party. He needs same-day delivery. These are the things that are really going to be important to this person. Since you know that, you can create content that is solely targeted toward this one person, this one buyer. Especially if you have things like a corporate discount, this is the place to really show it off.

So you’ve got these three different personas, and they’re taking three very different paths through the site and they’re consuming the site in different ways, whether it’s buying a bunch of stuff, buying one thing, consuming your content and buying ingredients, coming back. Each of these personas is experiencing your content in very different ways. Rather than just creating one site and popping in keywords all willy-nilly so that all of these people are having the same experience, you can start crafting unique user experiences for each of these people based on their paths through the site.

Great, except that that takes a lot of time and money. Both in the fact that at most businesses time in some ways is money, and you may actually have to spend some money on it. One of the things that I actually really recommend doing during this part of the process is running some PPC campaigns around the keywords where you’re trying to define user intent. If somebody’s just searching on chocolate cookies, you might not know if they want to buy them, or if they want to make them or what they want to do. So use PPC, run a little test, and see whether people respond better if you’ve got recipes, or free shipping, or what the different calls to action are for those more generic terms. Over time you can start to see what the majority of users’ intent is and what they really respond to and craft experiences for those more generic terms based around that. That’s a really great way to use PPC as a little guinea pig test.

Now here comes my favorite part because it involves metrics. What you can do is go into your Google Analytics or whatever, use your analytics tools and start looking at these behaviors based on keywords. Once you’ve got your persona and you’ve got your keywords assigned to your persona, first of all make sure that all of these keywords really are the same persona. Make sure that users who enter on those keywords are taking similar paths through the site and executing similar actions. That’s a great secondary indicator that all of these keywords do belong to this same persona.

Start looking at what they do. Maybe you get the most traffic from the baker, but you get the most revenue per order from the corporate guy. Maybe the shopper doesn’t return as much, but she does convert at 2.4%. The baker spends the longest time on site, but maybe she doesn’t buy as much. These are the things that you can start to look at and say, “Okay, so we know that the baker spends a lot of time on site, that’s great. What can we do to encourage her to turn that into a purchase? How can we brand message to her in ways that make her feel more comfortable buying ingredients, or what can we do to incentivize her sharing this content which clearly she’s consuming or loving?”

The same thing with the corporate guy. If he’s got the highest revenue per order, obviously we want more of this guy. We want to figure out what does he want, what is he doing, and what are the triggers that we can use that get him to buy more or get him to return to the site more. You can start really testing, and that’s great because it allows you, even just before you’ve done any of that amazing tweaking and testing, to say, “Okay where is the biggest mover of the needle among these two personas? What are the activities that we could be doing that could encourage them to do more of the activities they want to do fastest?” Then that’ll help you prioritize and it’ll help you target your efforts and your budget.

Then if you want to go above and beyond and really get in there and be a little bit creepy, what you can do is actually link up your site to Facebook Open Graph so that people are opting in to a Facebook app when they’re registering on your site. They’re connecting with Facebook. So there is that opt-in. You don’t just want to take people’s information. Once you’ve done that, you can actually, in your Google Analytics code, link it up to your Facebook Open Graph data, and you can start getting real demographic data on the actual people who are using these keywords and coming to your site. Now in addition to knowing that the baker is 40% of searches, you know that she’s 35 to 40, you know she’s female, and you know she’s a mom. The corporate guy you know that he works at a company of more than 100 people most of the time. So you can really start targeting these people based on their demographic information.

What you also learn then is who these people are that like you so much. They’re coming to your site over and over. They’re buying things from you, which is really what we’re trying to do here. And you can start targeting more of those people in your own SEO efforts, in your own customer acquisition efforts. You’re targeting them on social. You’re reaching out to them for links. You’re buying ads to put in front of them, and you have more confidence that you’ll have a return on those ads because you already know these are the kind of people who like you.

So you have all of this information about keywords and about personas. Now you can take that back to your user experience team, to your information architects and say, “Hey, let’s redo the sitemap and have it be based on these personas, based on these proven user behaviors that start with a keyword and end with a purchase, and let’s build experiences for those keywords.” Now instead of just saying, “Well, here’s what I think. We’ve got like About Us, Contact Us, Products.” You can really say, “These are three main personas, so in the header we should probably have cookie recipes, shop cookies, corporate discount,” and know that even from page one on the site whenever one of your target people comes to the site, it’s really easy for them to find the experience they’re looking for, make their way through the site, and then buy something.

Mike King of iAcquire, who blogs at ipullrank.com, put together some code using Stack Overflow, which may or may not work on your site. Take it to your devs and see if they can make it work with your analytics. Every site is different. Your mileage may vary, but there is a link to it here at the bottom of the screen. There should be. It’s invisible to me, but you can see it.

Now that you have this data, go to your UX people and show them the power of keyword-driven site mapping. Show them how SEO has so much to do with what they do, and not only will this project work for you, but in the future they’ll be more likely to come back to you and say, “Hey, we’re going to change the whole site, and we thought you should know before we do it.” That’s what you want.

That’s it for Whiteboard Friday this week. Thanks for coming by you guys. See you next time.”

Video transcription by Speechpad.com


The Difference Between Penguin and an Unnatural Links Penalty (and some info on Panda too)

Posted by Marie Haynes

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of SEOmoz, Inc.

Are you confused about the difference between Penguin and an Unnatural Links penalty? Not sure whether you should be disavowing your links? Wondering whether you should file for reconsideration? Well…you’re not alone! I have spent a good amount of time answering questions and learning from others in the SEOmoz Q&A and I see a lot of site owners and even SEOs who are unsure about the answers to these questions.

Recently, a YouMoz article (that was promoted to the main blog) was written in which the author showed an image of the unnatural links warning that his site received and then stated:

“We straight away knew that we had been hit by Google’s Panda 3.9.1 update!”

Oh dear. An unnatural links warning is NOT an indication that you have been affected by Panda! Now, this article and the comments below it have some great information on unnatural links recovery, so I don’t want to be too harsh on the author. My point in mentioning it, though, is that even SEOs who know a thing or two about Google penalties and algorithm changes can be confused on these matters.

A confession – I messed up too.

I am insanely obsessed with understanding Penguin, Unnatural Links Penalties and Panda. I really don’t know why. But it all started because I made a mistake. I was part of an SEO forum discussion in which a site owner felt they had been affected by the Penguin algorithm. I told him to clean up his bad links and then file for reconsideration. A senior member of the forum corrected me and said that I was giving incorrect advice. And he was right! As I will discuss further on in this article, filing for reconsideration is not going to help a Penguin-hit site. I gave some bad advice and I am grateful that I was corrected. That correction made me realize that Penguin and Unnatural Links Penalties are confusing. A lot of SEOs, myself included at the time, had a lot to learn about these issues. I made a decision that day that I would learn everything I could about algorithm changes and Google penalties.

A Brief Description of Penguin, Unnatural Links and Panda

Before we start answering questions, here is some fundamental information about Penguin, Unnatural Links Penalties and Panda:

The Penguin Algorithm

On April 24, 2012, Google announced “Another Step to Reward High Quality Sites”, an algorithm change aimed at fighting webspam. The algorithm change was first called “the webspam algorithm” but eventually came to go by the name “Penguin”. This algorithm severely affected sites that had widespread keyword stuffing and participation in link schemes. Matt Cutts, head of webspam at Google, eventually admitted on Twitter that links are “a primary area to monitor” when you have been affected by Penguin.

Matt Cutts Tweet

Most SEOs believe that one of the primary causes of Penguin is the creation of easily-made links with keywords as anchor text from low-quality places such as article marketing sites, bookmarks, and do-follow comments.

Unnatural Links Penalties

Unnatural links warning

These penalties are manual penalties that Google can place on sites when it determines that a site is making a widespread attempt to manipulate the search engine results by creating links. They are manual, as opposed to Penguin, which is algorithmic. So, what causes a site to be hit with an unnatural links warning?

Most webmasters believe that if someone files a spam report against you, then this will open up your site for a manual review. Some have speculated that Google monitors some of the more competitive niches such as “payday loans”, “car insurance”, casino sites, etc. and manually checks for unnatural links. No one knows for sure.

The Panda Algorithm

The Panda Algorithm was created by Google in an attempt to cause low quality sites to be displayed much lower in the search results. When Panda first hit, it was an unnamed algorithm. Many named it the “Farmer update” as it seemed to be aimed at content farms that ranked well as a result of scraping content from other sites. Most SEOs believe that sites affected by Panda have issues with on page quality as opposed to the quality of their links as in Penguin and Unnatural Links penalties. Sites that have been affected by Panda often have significant amounts of duplicated content (either on their own site or more commonly, from other sites) and also thin content. Thin content is usually a page that consists of very few words. If a site contains a lot of duplicate and thin content then Google sees little reason to show this site prominently in its search results. An entire site can be severely demoted because of Panda even if only parts of the site have duplicate and thin content.

Now let’s cover some of the points where people are the most confused about these issues.

What is the difference between Penguin and an Unnatural Links Penalty?

Both of these issues have to do with unnatural links. In both cases, the use of keywords as anchor text can be a factor. However, the main difference between the two is that Penguin is an algorithmic issue while Unnatural Links penalties are manual. A manual penalty is one that is levied by a human being, one site at a time. For example, a competitor could file a spam report on you, which could result in a Google webspam employee looking at your site. The employee could look at your backlinks and see that you have been engaging in practices that are considered link schemes. As such, they may decide to levy a manual penalty on your site.

Penguin is not levied one site at a time. Google has created an algorithm which is designed to programmatically find sites that have been engaging in unnatural link building tactics. When Penguin updates, if your site has been flagged as a site that is engaging in webspam, then your site will be affected on the date of the update. No human being is directly involved in determining whether your site is affected. As a point of interest, I have heard from some SEOs who have done testing and believe that Penguin can affect a site on any day and not just Penguin refresh days. So far, on the sites I have seen, it seems that Penguin only affects a site on a Penguin refresh day. The reality is that, at this point, no one knows for certain whether or not a site can be affected by Penguin on a date other than a Penguin refresh date.

Do Penguin, Unnatural Links and Panda affect the whole site or just part of the site?

Penguin: Penguin usually affects a site on a page and keyword level. Let’s say that you have a page called example.com/greenwidgets/ and you have been building links to this page all containing the anchor text, “green widgets”. If Penguin affected you, then it would mean that this particular page would no longer rank well for “green widgets”. Penguin generally does not affect an entire site. However, quite often when sites have been affected by Penguin, they have built many anchor texted links, possibly for many different keywords all to the homepage. This can mean that the homepage will not rank for a number of terms.

Unnatural Links: A manual unnatural links penalty can affect the entire site, or just a page, or even just one keyword. Sometimes a site can be penalized and be totally removed from the Google index. Other times, the site can still be in the index but not be shown in the first 10 pages for any of its keywords. Or, sometimes the penalty will not be as severe and may only affect one or two keywords. Here is a quote from Matt Cutts regarding a site that was penalized on a keyword level:

Matt Cutts on Widgets

The site in this example would not be able to rank for the keywords that they had used as anchors for sites that embedded their widgets.

Panda: Panda can affect an entire site, or sometimes one section such as a news blog on the site. Panda does not tend to affect just single pages of a website. If you have a site that has some good content, but a lot of thin and duplicate content, then the Panda filter can cause the entire site to have trouble ranking, not just the thin and duplicate pages.

Should you file for reconsideration if you have been affected by Penguin, Unnatural Links or Panda?

Penguin: No. A reconsideration request is only meant for sites that have a manual warning. If you have a manual warning, you will have a message in Google Webmaster Tools (WMT). (See the image next to the section above on Unnatural Links.) If you have been affected by Penguin, then, because this is an algorithmic issue, having a Google employee review the site will not help.

Unnatural Links: Yes. If you have a manual warning in your WMT then once you have done the work required to clean up the site (see below) then you will need to file for reconsideration.

Panda: No. See Penguin. Panda is also an algorithmic change and a reconsideration request will not help you recover.

Should you be using the disavow tool if you have been affected by Penguin, Unnatural Links or Panda?

On October 16, 2012, Google released the disavow tool which allowed webmasters to essentially have Google add an invisible “nofollow” to certain links that are pointing to their site. Since the release of this tool, there have been so many webmasters asking questions in Q&A as well as other SEO forums wondering if they should be disavowing their links. Many have become paranoid about their links and want to disavow everything that looks suspicious. I’ve seen people who wanted to disavow a great link because it was site-wide. I’ve seen others who wanted to disavow a pile of links even though they are already nofollowed links. There is a lot of confusion around the use of the disavow tool. This is probably why the disavow tool comes with this disclaimer:

Disavow warning.

Penguin: Google vaguely suggests that the disavow tool could be useful for a Penguin hit site. In their blog post about the disavow tool, they say the following:

“Q: Should I create a links file as a preventative measure even if I haven’t gotten a notification about unnatural links to my site?

A: If your site was affected by the Penguin algorithm update and you believe it might be because you built spammy or low-quality links to your site, you may want to look at your site’s backlinks and disavow links that are the result of link schemes that violate Google’s guidelines.”

Most SEOs believe that if you have been affected by Penguin then you should use the disavow tool to discount the unnatural links to your site. At the time of writing this, Penguin has not refreshed since the disavow tool was released. (The tool was released October 16th and the last Penguin refresh was October 5th.) What this means is that we do not have any proof yet as to whether or not disavowing links will help a site to recover from Penguin. Hopefully it will, but there may be other factors that need to be addressed as well such as on page issues like keyword stuffing.

Unnatural Links: Yes. This is what the disavow tool was made for. Google says, in regards to a manual unnatural links penalty, “If you’ve done as much as you can to remove the problematic links, and there are still some links you just can’t seem to get down, that’s a good time to visit our new Disavow links page.”
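For reference, the disavow file itself is just a plain text upload: one URL per line, an optional domain: prefix to disavow every link from an entire site, and lines starting with # treated as comments. A made-up example (placeholder domains):

# Contacted the site owner twice to request removal; no response.
domain:spammydirectory.example
http://www.linkfarm.example/widgets/links.html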

Panda: No. As Panda generally does not have anything to do with backlinks, disavowing links to your site is not likely to help.

Do you need to manually remove links?

Penguin: While removing links is probably a good idea, it is likely not necessary. Because Penguin is an algorithm, to recover you don’t need to show a human being evidence that you have worked hard to remove links. Most SEOs who are experienced with Penguin issues believe that disavowing your problematic links will help and that physically removing the links is not necessary. With that being said, if the bad links are under your control and easy to remove, then it is a good idea to do so.

Unnatural Links: When trying to recover from a manual unnatural links penalty, it is not enough to just disavow the bad links. Google wants to see evidence that you have tried to get as many of the unnatural links removed as possible. When you file for reconsideration, one of the first things the webspam team member does is check a number of the links they have flagged as unnatural and see how many of them you have gotten physically removed. For the unnatural links that you are unable to get removed (because the webmaster didn’t reply, wanted a large sum of money, or for whatever other reason), you can disavow those links.

Removing an unnatural links penalty from a site can take a lot of work. If you are struggling to remove a penalty from your site, or if you are an SEO who would like to get involved in doing penalty removal work, I have documented everything that I do in order to get penalties removed in my book (see bio section for link).

Panda: No, it is not believed that any links need to be removed for sites affected by Panda.

When will you recover?

Penguin: Most SEOs believe that you will not be able to recover a Penguin hit site until Penguin refreshes again. Google announced at SMX West that in 2013 there would be a major Penguin update but did not say when this would happen. There are some people who believe that they have seen Penguin hit sites recover on a day other than a refresh day. There are ways to recover a Penguin hit site without waiting for a refresh. For example, if you had a “green widgets” page that had been affected by Penguin because you built anchor text using the phrase “green widgets”, you could build a new page called “buying-green-widgets” and get new, good quality links to that page and possibly rank again for this term. The original page would not rank, but the new one could. The problem with this is that getting new good quality links is difficult. Google wants you to earn links and not make them yourself.

I asked John Mueller, a Google employee, whether or not it is possible to recover a Penguin-hit site outside of a Penguin refresh, and here is what he said:

“+Marie Haynes theoretically, in an artificial situation where there’s only one algorithm (which is, in practice, never the case), if a site is affected by a specific algorithm, then the data for that algorithm needs to be updated before it would see changes. In practice, while some elements might be very strong depending on what was done in the past, there are always a lot of factors involved, so significantly improving the site will result in noticeable changes over time, as we recrawl & reindex the site and it’s dependencies, as well as reprocess the associated signals. So yes, you’d need to wait for the algorithm to update if it were the only thing involved, but in practice it’s never the only thing involved so you’re not limited to waiting.

Also keep in mind that for long-running processes (be it algorithm updates like this, or other higher-level elements in our algorithms), it’s never a good idea to limit yourself to small, incremental improvements; waiting to see if “it’s enough” can take a while, so I’d recommend working to take a very good look at the issues you’ve run across, and working to make very significant improvements that will be more than enough (which users will appreciate as well, so there’s that win too).”
 
A full discussion on ways to recover from Penguin is outside of the scope of this article.

manual spam action revoked

Unnatural Links: Once you file for reconsideration, it will take anywhere from 3 to 14 days to hear back from Google. I have had it take as long as six weeks, but that was just after the disavow tool was released, when Google probably had a large backlog of sites to review. If you get the wonderful “manual spam action revoked” message, recovery can happen within a couple of days for some sites. Depending on how severe the penalty was, it can also take significantly longer, sometimes several months.

There are some sites that can have a penalty revoked but not see any increase in rankings at all. This generally happens when sites have no good links to prop the site up. If your site’s backlink profile consisted of 99% self made links and you have removed or disavowed almost all of those links then you will need to get good, quality links to your site in order to rank again. Gone are the days of being able to rank well on poor quality links.

Some sites can still appear to be penalized after their manual penalty is lifted if they are also under the effects of Penguin. In most cases, it is believed that the work that is done to recover from an unnatural links penalty will also get you out of Penguin trouble. However, you’ll need to see a Penguin refresh in order to start ranking well again.

Panda: Again, a full discussion of Panda recovery is outside the scope of this article. Once you have done what is necessary to fix Panda issues such as duplication and thin content, many sites will recover with the next Panda refresh. However, I have seen some sites take several Panda refreshes to recover. In March 2013, Matt Cutts stated that Panda will no longer have the large periodic refreshes we have been used to; instead, it will be rolled into the regular algorithm. I expect this means that Panda-hit sites can now recover much sooner once the work is done.

Conclusion

The purpose of this article was to answer some of the regularly asked questions when it comes to differences between Penguin, Unnatural Links and Panda issues. I don’t claim to have all of the answers though. I hope this article generates some good discussion and questions!



Rel=Confused? Answers to Your Rel=Canonical Questions

Posted by Dr. Pete

It’s been over four years (February 2009) since Google and Yahoo announced support for the rel=canonical tag, and yet this single line of HTML is still causing a lot of confusion for SEOs and webmasters. Recently, Google posted 5 common mistakes with rel=canonical – it’s a good post and a welcome bit of transparency, but it doesn’t address a lot of the questions we see daily here in Q&A.  So, I thought it was a good time to tackle some of your most common questions (and please forgive the following nonsense)….

Canonical Cannonsicles (Don't Ask)

What Is Rel=Canonical?

Put simply, the rel=canonical tag is a way to tell Google that one URL is equivalent to another URL, for search purposes. Typically, a URL (B) is a duplicate of URL (A), and the canonical tag points to (A). The following tag would appear on the page that generates URL (B), in the <head></head>:

<link rel="canonical" href="http://www.example.com/url-a.html" />

Google’s support document on rel=canonical is actually pretty good. The subject of duplicate content is complex, and I’ve addressed it previously in detail. For this post, I’m going to skip ahead and assume that you have a working knowledge of technical SEO and have attempted to use rel=canonical on your site.

Note: Rel=canonical is also referred to as “Rel-canonical” and “The Canonical Tag”. For this article, I will try to consistently refer to it as “Rel=canonical”.

(1) Should I Use Rel=Canonical for Pagination?

I’m not going to repeat all of Google’s answers, but this one is so frequently asked that it deserves more detail.  Let’s say you have a series of paginated search results (1, 2, 3… n). These can be considered “thin”, from a search standpoint, so should you rel=canonical page n back to page 1?

Officially, the answer is “no” – Google does not recommend this. They recommend that you either rel=canonical to a “View All” page (if having all results on one page is viable) or that you use rel=prev/next. Rel=canonical can be used in conjunction with rel=prev/next to handle search sorts, filters, etc., but that gets complicated fast.
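
For illustration, rel=prev/next markup on page 2 of a hypothetical series might look like this in the <head> (the URL and parameter name here are made up):

<link rel="prev" href="http://www.example.com/results?page=1" />
<link rel="next" href="http://www.example.com/results?page=3" />

If a “View All” page is viable instead, each paginated page would carry a single rel=canonical pointing at it:

<link rel="canonical" href="http://www.example.com/results-view-all" />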

Pagination for SEO is a very tricky subject, and I recommend you check out these two resources:

(2) Can I Use Rel=Canonical Cross-domain?

Yes – in late 2009, Google announced support for cross-domain use of rel=canonical. This is typically for syndicated content, when you’re concerned about duplication and only want one version of the content to be eligible for ranking.
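
As a sketch, if a hypothetical syndication partner republishes an article from example.com, the partner’s copy of the page would carry a tag pointing back at the original (both domains are made up):

<!-- On http://www.syndication-partner.com/republished-article.html -->
<link rel="canonical" href="http://www.example.com/original-article.html" />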

(3) Should I Use Rel=Canonical Cross-Domain?

That’s a tougher question. First off, Google may choose to ignore cross-domain use of rel=canonical if the pages seem too different or it appears manipulative. The ideal use of cross-domain rel=canonical would be a situation where multiple sites owned by the same entity share content, and that content is useful to the users of each individual site. In that case, you probably wouldn’t want to use 301-redirects (it could confuse users and harm the individual brands), but you may want to avoid duplicate content issues and control which property Google displays in search results. I would not typically use rel=canonical cross-domain just to consolidate PageRank.

(4) Should I Use Rel=Canonical on Near Duplicates?

As my catastrophic canonicalization experiment and follow-up experiments showed, Google does honor rel=canonical even on very different pages, in some cases. That doesn’t mean that it’s a good idea. Generally speaking, I think it’s best to reserve rel=canonical for duplicates or very-near duplicates. For example, if a product page spins off into five URLs for five different colors, and each color’s page only differs by a sentence or two (or an image), then yes, I think it’s fine to rel=canonical to the “parent” product page.
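
A minimal sketch of that color example, with made-up URLs:

<!-- On http://www.example.com/widget?color=red (and the other four color URLs) -->
<link rel="canonical" href="http://www.example.com/widget" />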

Do not use rel=canonical as a substitute for appropriate 301-redirects and/or 404s. While it probably won’t cause large-scale catastrophes, I strongly suspect that Google will start to ignore your canonical tags, and this may impact how you control legitimate duplicates.

(5) Can I Put Rel=Canonical on the Canonical Page?

In other words, is it alright to put a rel=canonical tag on the canonical version of the URL, pointing back to itself? Practically speaking – yes, it is, but you don’t have to. Early on, there were hints that both Google and Bing preferred that you not overuse rel=canonical. Over time, though, their stances seemed to soften, and I’ve seen no evidence in recent history of a properly used, self-referencing canonical causing any harm.
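
A self-referencing tag is just the standard tag served on the canonical URL itself – a minimal sketch, with a made-up URL:

<!-- On http://www.example.com/url-a.html, pointing to itself -->
<link rel="canonical" href="http://www.example.com/url-a.html" />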

This is often just a practical issue – many URLs share common templates, and the code needed to display a rel=canonical tag on just the duplicates and not the canonical version of a page can get messy and increase your chance of mistakes. Personally, I believe that the search engines recognized the reality most webmasters face and adjusted their initial, conservative stance.

(6) Is It OK to Put Rel=Canonical on My Entire Site?

Should you pre-emptively rel=canonical your entire site – even if many of the pages aren’t subject to duplicate content issues? I think this gets very speculative. We have recommended this approach at SEOmoz in the past, and I think it’s generally safe. I do worry that excessive use of rel=canonical could cause search engines to devalue and even ignore those tags, but I can’t point to any clear evidence of this happening. I also worry that people often implement site-wide rel=canonical tags badly, and end up pointing them to the wrong pages.

I do think that a pre-emptive rel=canonical on your home page is generally a good idea, as home pages are prone to URL variations. In a perfect world, I’d say to use rel=canonical on the home page, known duplicates, and any pages with parameters that could drive duplicate content, and leave the rest alone. However, that’s often very difficult to implement. In some cases, site-wide rel=canonical implementation is better than no index control.
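
To illustrate the home-page case, a single self-referencing tag can collapse the usual variants (these example URLs are hypothetical):

<!-- Served on /, /index.html, /default.aspx, /?sessionid=123, etc. -->
<link rel="canonical" href="http://www.example.com/" />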

(7) Should I Use Rel=Canonical or 301 Redirects?

Please understand that while these two approaches can behave similarly, from an SEO standpoint, they are not interchangeable. Here’s the critical difference – a 301-redirect takes the visitor to the canonical URL, while a rel=canonical tag does not. Usually, only one of these approaches is the right one for your visitors. If you really want to permanently consolidate two pages and remove the duplicates, then use a 301-redirect. If you want to keep both pages available to visitors, but only have one appear in search results, then use rel=canonical.
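
To make the contrast concrete, here’s a sketch of each approach, assuming an Apache server (mod_alias) and made-up URLs:

# 301 in .htaccess – visitors and search engines both end up on url-a.html
Redirect 301 /url-b.html http://www.example.com/url-a.html

<!-- rel=canonical in the <head> of url-b.html – visitors stay on url-b.html; only search results consolidate to url-a.html -->
<link rel="canonical" href="http://www.example.com/url-a.html" />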

(8) Does Rel=Canonical Pass Authority/PageRank?

This is very difficult to measure, but if you use rel=canonical appropriately, and if Google honors it, then it appears to act similarly to a 301-redirect. We suspect it passes authority/PageRank for links to the non-canonical URL, with some small amount of loss (similar to a 301).

(9) Can I Chain Rel=Canonicals (+301s, 302s, etc.)?

What happens if you rel=canonical to a URL with rel=canonical to another URL, or you rel=canonical to a URL that 301-redirects to another URL? It gets complicated. In some cases, it might work and it might even pass PageRank. Generally speaking, though, it’s a bad idea. At best, it’s sloppy. At worst, it might not function at all, or you might lose significant PageRank across the chain. Wherever possible, avoid chains and implement rel=canonical in a single hop.
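
A hypothetical illustration of the single-hop rule, assuming /page-b.html 301-redirects to /page-a.html:

<!-- Bad: the tag on /page-c.html points into a chain (canonical to /page-b.html, which 301s to /page-a.html) -->
<link rel="canonical" href="http://www.example.com/page-b.html" />

<!-- Better: point /page-c.html straight at the final URL in one hop -->
<link rel="canonical" href="http://www.example.com/page-a.html" />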

(10) Are Non-Canonical Pages Indexed?

For all practical purposes – no. If Google honors a rel=canonical tag, then the non-canonical page is not eligible for ranking. It will not have a unique cached copy, and it will not appear in the public index via a “site:” search. Now, does Google maintain a record of the non-canonical URL? I assume they do. As an SEO, though, the non-canonical URL ceases to exist in any meaningful way.

(11) Can Someone Else Rel=Canonical My Pages?

I’ve seen occasional worries about someone using rel=canonical, especially cross-domain, to harm a site or steal its authority. Keep in mind that you can only grant canonical status from pages you control. So, you could rel=canonical all of your pages to someone else’s site, but why would anyone do that? To wreak any real havoc, someone would have to hack into your site. If that happens, then rel=canonical abuse is the least of your problems. The vast, vast majority of damage done by rel=canonical is self-inflicted.

(12) Can I Have My Cake and Eat It, Too?

No. Yeah, I know – you don’t want to hear it. At least a third of the questions we get about rel=canonical boil down to “I want all of these pages to rank, and they’re the same, but I don’t want to get in any trouble for duplicate content!” I don’t have any secret sauce to pour on that.

You don’t have to use rel=canonical, but, in my experience, controlling your own duplicate content is better than having Google do it for you – and eventually they’ll do it for you. In the old days, that might just mean that the wrong page got filtered out. After 25+ Panda updates, though, it could mean that your entire site suffers. You can’t have it both ways – if you have duplicate content, then remove it, control it, or improve it.

What Questions Do You Have?

If you have any general questions about the canonical tag or how to use it, feel free to leave a comment, and I’ll try to address them. Please understand that I can’t dig into your site and provide consulting-level services, but if you can ask the question in a general way that will be helpful to others, I’ll do my best to leave a reply.

