A Dozen Digestible Takeaways from 2016’s E-Commerce Benchmarks Study

Posted by Alan_Coleman

Hey Moz Blog readers.

I’m delighted to share with you a big body of work the Wolfgang team has just completed. It’s our E-commerce Benchmarks 2016 study. We dove into Google Analytics insights from over 80 million website sessions and over one-quarter of a billion dollars in online revenue for travel and retail websites, calculating average e-commerce website key performance indicators (KPIs) for you to use as benchmarks.

I hope these findings help you benchmark your KPIs and gain deeper insights into what you can do to boost conversion.

There are a number of unique features to this study:

  • We’ve divvied the results up into overall, travel, and retail. Within the retail cohort, we’ve broken out results for our “online only” retailers and “multichannel” retailers. The KPIs are distinctly different for the two sets of retailers.
  • We’ve conducted a correlation study in which we correlate all the factors of the study with conversion rate and with average order value.
  • We’ve expanded the scope of the study since last time and based on your comments, we’ve included site speed analysis, as well as more info around paths to conversion and assisted conversions.

In this post I’m going to give you an overview of 12 key takeaways. You can read the full report here. Or grab some quick insights from our infographic here.

1/ The average e-commerce conversion rate is 1.48%.

  • Retail websites averaged 1.36%.
  • Online-only retailers converted at 2%, almost twice as well as their multichannel counterparts at 1.12%.
  • The travel websites in the study averaged a 2.04% conversion rate.
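
For benchmarking your own numbers, conversion rate is simply transactions divided by sessions, compared against the averages above. Here's a minimal sketch; the benchmark figures are the study's reported averages, while the session and transaction counts are placeholders, not real data:

```python
# Conversion rate = transactions / sessions.
# Benchmark figures are the study's reported averages; the example
# session/transaction counts below are made up for illustration.

BENCHMARKS = {
    "overall": 0.0148,
    "retail": 0.0136,
    "online_only_retail": 0.0200,
    "multichannel_retail": 0.0112,
    "travel": 0.0204,
}

def conversion_rate(transactions: int, sessions: int) -> float:
    """Fraction of sessions that ended in a transaction."""
    return transactions / sessions

def vs_benchmark(rate: float, segment: str) -> float:
    """Ratio of your rate to the segment average (1.0 = on par)."""
    return rate / BENCHMARKS[segment]

rate = conversion_rate(transactions=450, sessions=30_000)
print(f"{rate:.2%}")                           # 1.50%
print(f"{vs_benchmark(rate, 'retail'):.2f}x")  # 1.10x
```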

It was notable that the travel websites enjoyed higher conversion rates but lower engagement rates than the average retailer. This piqued my curiosity, as it just seemed too darn easy for the travel retailers. After deep-diving into the data, I found that the committed retail customer visits the same retail website multiple times on their journey to purchase. The travel shopper, on the other hand, does a lot of research — but on other websites: review sites, online travel agents, travel bloggers, etc. — before arriving at the e-commerce website merely to check price and availability before booking. This finding illuminates the fact that the retailer has more influence on its customers' journey to purchase than the travel website, which is more dependent on an ecosystem of travel websites to warm up the prospect.

[Image: Average conversion rate]

2/ The death of SEO?

The data states it emphatically: “Hell no!”

Google organic is the largest source of both traffic (43%) and revenue (42%). SEO traffic from Google organic has actually increased by 5% since our last study.

There was also a strong correlation between websites with a high percentage of traffic from Google organic and higher-than-average Average Order Values (AOVs).

From this finding, we can infer that broad organic coverage will be rewarded by transactions from research-heavy, high-value customers.

3/ AdWords is the king of conversion

The strongest correlation we saw with higher conversion rates was higher-than-average traffic and revenue from AdWords.

In my experience, Google AdWords is the best-converting traffic source. So my take is that, when a website increases its spend on AdWords, it adds more high-converting traffic to its profile and thus increases its average conversion rate.

AdWords accounts for 26% of traffic and 25% of revenue on average.

4/ Google makes the World Wide Web go ’round

When you combine Google organic and PPC, you see that Google accounts for 69% of traffic and 67% of revenue. More than two-thirds! Witness the absolute dominance of “The Big G” as our window to the web.

[Image: Revenue sources]

5/ Facebook traffic quadruples!

In our last study, Facebook accounted for a meagre 1.3% of traffic. This time around, it’s leapt up to 5%, with Facebook CPC emerging from nowhere to 2%. When better cross-device measurement becomes available in Google Analytics, I expect Facebook to be seen as an assisted conversion power player.

6/ Don’t discount email

Email delivers 6% of traffic — which is actually as much as all the social channels combined — and treble their revenue. In fact, Google is the only medium that delivers more revenue than email's 6% share. Digital marketers often lust after shiny new toys (hello, Snapchat!), but the advice here is to look after the old reliables first. And this 40-year-old technology we all use every day is about as old and reliable as it gets.

7/ Site speed matters most

This section was added to the study after comments from you, the Moz Blog readers, last time around, so thanks for your input. The server response time correlation with conversion rate (-0.31) was one of the strongest we saw. It was dramatically stronger than engagement metrics, such as time on site (0.11) or pages viewed (0.10). We also found that for every two-tenths of a second you shave off your server response time, you’ll increase conversion rate by 8%. Don’t forget that site speed is a Google ranking factor, so by optimizing for it you’ll benefit from a “multiplier effect” of more traffic and a higher conversion rate on all your traffic. Google’s page speed tool is a great place to start your speed optimization journey.
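
As a back-of-the-envelope illustration of that speed figure, here's a sketch projecting the claimed lift. Note one assumption the study doesn't state: I'm compounding the ~8% relative lift once per 0.2-second step.

```python
# Rule of thumb from the study: each 0.2s shaved off server response
# time lifts conversion rate by ~8% (relative). Compounding the lift
# per 0.2s step is my assumption, not the study's.

def projected_conversion_rate(current_rate: float,
                              seconds_saved: float,
                              lift_per_step: float = 0.08,
                              step: float = 0.2) -> float:
    """Apply the relative lift once per `step` seconds saved."""
    return current_rate * (1 + lift_per_step) ** (seconds_saved / step)

# A site at the 1.48% overall average that trims 0.4s off response time:
print(f"{projected_conversion_rate(0.0148, 0.4):.2%}")  # 1.73%
```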

Check out our conversion rate correlation chart below to get more insights on which metrics can move conversion rate.

[Image: Conversion rate correlations]

8/ Mobile is our “decision device”

2015 was finally “the year of mobile.” Mobile became the largest traffic source of the devices, but seriously underperforms for revenue. Its 42% share of traffic becomes a miserly 21% share of revenue, and it suffers the lowest average conversion rate and AOV. Despite these lowly conversion metrics, our correlation study found that websites with a larger-than-average portion of mobile traffic benefitted from larger-than-average conversion rates. This indicates that the “PA in your pocket” is the device upon which decisions are arrived at before being completed on desktop. We can deduce that while desktop remains our “transaction device,” mobile has become our “decision device,” where research is carried out and purchase decisions arrived at.

9/ Digital marketers are over-indexing on display advertising

Despite accounting for 38% of digital marketers' budgets (IAB Europe), display failed to register as a top-ten traffic source, meaning it contributed less than 1% of e-commerce website traffic.

10/ Bounce rate don’t mean diddly squat

Bounce rate actually has zero correlation with conversion rate! Digital marketers feel a deep sense of rejection when they see a high bounce rate. However, as an overall website metric, it’s a dud. While admittedly there are bad bounces, there are many good bounces accounted for in the number.
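
For the curious, correlation figures like these are (presumably) Pearson coefficients. Here's a stdlib-only sketch of the calculation, using invented per-site numbers in which conversion rate barely tracks bounce rate:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Invented figures: bounce rate swings widely while conversion rate
# stays in a narrow band, so the coefficient lands near zero.
bounce = [0.35, 0.62, 0.48, 0.71, 0.40, 0.55]
conv = [0.015, 0.013, 0.016, 0.014, 0.013, 0.016]
print(round(pearson(bounce, conv), 2))  # near zero
```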

11/ Digital marketing “economies of scale”

Interestingly, websites that enjoyed higher-than-average traffic levels also enjoyed higher-than-average conversion rates.

This illustrates a digital marketing version of “economies of scale”; more traffic equals better conversion rates.

The corollary of this is a lower cost per acquisition (CPA).
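
That CPA relationship is simple arithmetic: with spend and traffic held fixed, CPA falls in direct proportion as conversion rate rises. A quick sketch, with all figures hypothetical:

```python
# CPA = spend / conversions, where conversions = sessions * conversion rate.
# All figures below are hypothetical placeholders.

def cpa(spend: float, sessions: int, conv_rate: float) -> float:
    """Cost per acquisition for a given ad spend, traffic, and conversion rate."""
    return spend / (sessions * conv_rate)

# Same $10k spend and 100k sessions at two conversion rates:
print(round(cpa(10_000, 100_000, 0.0148), 2))  # 6.76
print(round(cpa(10_000, 100_000, 0.0204), 2))  # 4.9
```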

12/ People are buying more frequently and spending more per order online.

Average conversion rates have increased 10% since the last study. Retail average order value has shot up a whopping 25%! This demonstrates people are migrating more and more of their shopping behavior off the high street and onto the Internet. There’s never been a better time to be an e-commerce digital marketer.

You can deep-dive the above digestibles by reading the full study here.

How do these benchmarks compare to your personal experience? Anything you’re surprised by, or that confirms your long-held suspicions?

I’d love to hear your thoughts in the comments below.

Optimize hard,

Alan


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Weird, Crazy Myths About Link Building in SEO You Should Probably Ignore – Whiteboard Friday

Posted by randfish

The rules of link building aren’t always black and white, and getting it wrong can sometimes result in frustrating consequences. But where’s the benefit in following rules that don’t actually exist? In today’s Whiteboard Friday, Rand addresses eight of the big link building myths making their rounds across the web.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about some of the weird and crazy myths that have popped up around link building. We’ve actually been seeing them in the comments of some of our blog posts and Whiteboard Fridays and Q&A. So I figured, hey, let’s try and set the record straight here.

1. Never get links from sites with a lower domain authority than your own

What? No, that is a terrible idea. Domain authority, just to be totally clear, it’s a machine learning system that we built here at Moz. It takes and looks at all the metrics. It builds the best correlation it can against Google’s rankings across a broad set of keywords, similar to the MozCast 10K. Then it’s trying to represent, all other things being equal and just based on raw link authority, how well would this site perform against other sites in Google’s rankings for a random keyword? That does not in any way suggest whether it is a quality website that gives good editorial links, that Google is likely to count, that are going to give you great ranking ability, that are going to send good traffic to you. None of those things are taken into account with domain authority.

So when you’re doing link building, I think DA can be a decent sorting function, just like Spam Score can. But those two metrics don’t mean that something is necessarily a terrible place or a great place to get a link from. Yes, it tends to be the case that links from 80- or 90-plus DA sites tend to be very good, because those sites tend to give a lot of authority. It tends to be the case that links from sub-10 or 20 tend to not add that much value and maybe fail to have a high Spam Score. You might want to look more closely at them before deciding whether you should get a link.

But new websites that have just popped up or sites that have very few links or local links, that is just fine. If they are high-quality sites that give out links editorially and they link to other good places, you shouldn’t fret or worry that just because their DA is low, they’re going to provide no value or low value or hurt you. None of those things are the case.

2. Never get links from any directories

I know where this one comes from. We have talked a bunch about how low-quality directories, SEO-focused directories, paid link directories tend to be very bad places to get links from. Google has penalized not just a lot of those directories, but many of the sites whose link profiles come heavily from those types of domains.

However, lots and lots of resource lists, link lists, and directories are also of great quality. For example, I searched for a list of Portland bars — Portland, Oregon, of course known for their amazing watering holes. I found PDX Monthly’s list of Portland’s best bars and taverns. What do you know? It’s a directory. It’s a total directory of bars and taverns in Portland. Would you not want to be on there if you were a bar in Portland? Of course, you would want to be on there. You definitely want those. There’s no question. Give me that link, man. That is a great freaking link. I totally want it.

This is really about using your good judgment and about saying there’s a difference between SEO and paid link directories and a directory that lists good, authentic sites because it’s a resource. You should definitely get links from the latter, not so much from the former.

3. Don’t get links too fast or you’ll get penalized

Let’s try and think about this. Like Google has some sort of penalty line where they look at, “Oh, well, look at that. We see in August, Rand got 17 links. He was under at 15 in July, but then he got 17 links in August. That is too fast. We’re going to penalize him.”

No, this is definitely not the case. I think what is the case, and Google has filed some patent applications around this in the past with spam, is that a pattern of low-quality links or spammy-looking links that are coming at a certain pace may trigger Google to take a more close look at a site’s link profile or at their link practices and could trigger a penalty.

Yes. If you are doing sketchy, grey hat/black hat link building with your private networks, your link buys, and your swapping schemes, and all these kinds of things, yeah, it’s probably the case that if you get them too fast, you’ll trip over some sort of filter that Google has got. But if you’re doing the kind of link building that we generally recommend here on Whiteboard Friday and at Moz more broadly, you don’t have risk here. I would not stress about this at all. So long as your links are coming from good places, don’t worry about the pace of them. There’s no such thing as too fast.

4. Don’t link out to other sites, or you’ll leak link equity, or link juice, or PageRank

…or whatever it is. I really like this illustration of the guys who are like, “My link juice. No!” This is just crap.

All right, again, it’s a myth rooted in some fact. Historically, a long time ago, PageRank used to flow in a certain way, and it was the case that if a page had lots of links pointing out from it, that if I had four links, that a quarter each of the PageRank that this page could pass would go to each of them. So if I added one more, oh, now that’s one-fifth, then that becomes one-fifth, and that becomes one-fifth. This is old, old, old-school SEO. This is not the way things are anymore.

PageRank is not the only piece of ranking algorithmic goodness that Google is using in their systems. You should not be afraid of linking out. You should not be afraid of linking out without a "nofollow" link. You, in fact, should link out. Linking out is not only correlated with higher rankings. There have also been a bunch of studies and research suggesting that there's something causal going on, because when followed links were added to pages, those pages actually outranked their non-link-carrying brethren in a bunch of tests. I'll try and link to that test in the Whiteboard Friday. Regardless, don't stress about this.

5. Variations in anchor text should be kept to precise proportions

So this idea that there's some magic formula for how many of your anchor phrases should be branded, partially branded, exact-match for the keywords you're trying to rank for, or random assorted anchor text — and that you need to hit precise numbers like these — is also a crazy idea.

Again, rooted in some fact, the fact being if you are doing sketchy forms of link building of any kind, it’s probably the case that Google will take a look at the anchor text. If they see that lots of things are kind of keyword-matchy and very few things contain your brand, that might be a trigger for them to look more closely. Or it might be a trigger for them to say, “Hey, there’s some kind of problem. We need to do a manual review on this site.”

So yes, if you are in the grey/black hat world of link acquisition, sure, maybe you should pay some attention to how the anchor text looks. But again, if you’re following the advice that you get here on Whiteboard Friday and at Moz, this is not a concern.

6. Never ask for a link directly or you risk penalties

This one I understand, because there have been a bunch of cases where folks or organizations have sent out emails, for example, to their customers saying, “Hey, if you link to us from your website, we’ll give you a discount,” or, “Hey, we’d like you to link to this resource, and in exchange this thing will happen,” something or other. I get that those penalties and that press around those types of activities has made certain people sketched out. I also get that a lot of folks use it as kind of blackmail against someone. That sucks.

Google may take action against people who engage in manipulative link practices. But for example, let’s say the press writes about you, but they don’t link to you. Is asking for a link from that piece a bad practice? Absolutely not. Let’s say there’s a directory like the PDX Monthly, and they have a list of bars and you’ve just opened a new one. Is asking them for a link directly against the rules? No, certainly not. So there are a lot of good ways that you can directly ask for links and it is just fine. When it’s appropriate and when you think there’s a match, and when there’s no sort of bribery or paid involvement, you’re good. You’re fine. Don’t stress about it.

7. More than one link from the same website is useless

This one is rooted in the idea that, essentially, diversity of linking domains is an important metric. It tends to be the case that sites that have more unique domains linking to them tend to outrank their peers who have only a few sites linking to them, even if lots of pages on those individual sites are providing those links.

But again, I’m delighted with my animation here of the guys like, “No, don’t link to me a second time. Oh, my god, Smashing Magazine.” If Smashing Magazine is going to link to you from 10 pages or 50 pages or 100 pages, you should be thrilled about that. Moz has several links from Smashing Magazine, because folks have written nice articles there and pointed to our tools and resources. That is great. I love it, and I also want more of those.

You should definitely not be saying “no.” You shouldn’t be stopping your link efforts around a site, especially if it’s providing great traffic and high-quality visits from those links pointing to you. It’s not just the case that links are there for SEO. They’re also there for the direct traffic that they pass, and so you should definitely be investing in those.

8. Links from non-relevant sites, pages, or content outside your niche won't help you rank better

This one, I think, is rooted in that idea that Google is essentially looking and saying like, “Hey, we want to see that there’s relevance and a real reason for Site A to link to Site B.” But if a link is editorial, if it’s coming from a high-quality place, if there’s a reason for it to exist beyond just, “Hey, this looks like some sort of sketchy SEO ploy to boost rankings,” Googlebot is probably going to count that link and count it well.

I would not be worried about the fact that if I’m coffeekin.com and I’m selling coffee online or have a bunch of coffee resources and corvettecollectors.com wants to link to me or they happen to link to me, I’m not going to be scared about that. In fact, I would say that, the vast majority of the time, off-topic links from places that have nothing to do with your website are actually very, very helpful. They tend to be hard for your competitors to get. They’re almost always editorially given, especially when they’re earned links rather than sort of cajoled or bought links or manipulative links. So I like them a lot, and I would not urge you to avoid those.

So with that in mind, if you have other link ideas, link myths, or link facts that you think you’ve heard and you want to verify them, please, I invite you to leave them in the comments below. I’ll jump in there, a bunch of our associates will jump in there, folks from the community will jump in, and we’ll try and sort out what’s myth versus reality in the link building world.

Take care. We’ll see you again next week for another edition of Whiteboard Friday.

Video transcription by Speechpad.com

Feeling inspired by reality? Start building quality links with OSE.




Why Content Marketing’s Future Depends on Shorter Content and Less Content

Posted by ronell-smith

Steve Rayson’s latest BuzzSumo article is provocative, interesting and well-written. But I do hope he’s wrong when he says the future will be about more content, not less. He shares why he thinks content marketing brands will begin producing more content in the days ahead, and how they’ll likely be successful by doing so.

Upon reading the piece, I did a facepalm. I was reminded of a conversation I had a few years back, when I walked into the break room of the agency I was working for, and almost bumped into the content specialist on my team.

After we exchanged pleasantries, she informed me of an unwise decision she was about to make.

Her: “Guess what? I’m going to run a marathon.”

Me: “Why?”

Her: “I think it’ll be fun.”

Me: “OK. How many marathons have you run? And have you been training for this one?”

Her: “I’ve never ran one, but there are a lot of training guides online; they say it only takes 17 weeks to train for it.”

Me: “…”


The philosophy of doing a lot of what we don't yet do well is ruining content marketing — and the knees, joints, and backs of wannabe marathoners.

If you doubt that, please explain why 90% of what’s published online barely rises to the level of crap.

Anyone who disagrees with that statement is either (a) fooling themselves or (b) never had to conduct a content audit.

Even for big brands, producing quality content with frequency is a seemingly near-impossible task

Therefore, when someone says “create more content,” I hear “brands will continue to waste resources that would be better spent elsewhere,” for now. Worse still, it means they’ll see the failure as not one of execution, but born of content marketing itself.


Rayson is a solid content marketer working for a brand with a strong product. I admire them both. And while I don’t mean to attack him, I would like to tackle the logic of the post, which I’ll excerpt below.

[Eds. note: The primary reason I chose to tackle this topic is because content frequency and content length remain two of the biggest albatrosses impacting our industry. Despite this fact, many fail to see how related they are. That is, many brands are failing fast by chasing the long-form posts and frequent posting unicorn. Also, I’m very clear in understanding that Rayson is not advocating for quantity at the expense of quality. My contention is simply that quantity is typically the wrong goal, at least for the vast majority of brands.]

You’re a brand who publishes content, not a brand publisher

The Washington Post now publishes around 1,200 posts a day. That is an incredible amount of content. My initial reaction when I read the statistic was ‘surely that is too much, the quality will suffer, why produce so much content?’ The answer seems to be that it works. The Post’s web visitors have grown 28% over the last year and they passed the New York Times for a few months at the end of 2015.

As a former journalist who spent four years in a newsroom, I’ve always been against the brands as publisher mantra, in large part because, well, as a brand you ARE NOT a publisher. Publishing content no more makes you a publisher than running 26 miles makes someone a marathoner. Newsrooms are built to produce lots of content.

There are often dozens of editors, copy editors, line editors, and writers on staff, so quality control is baked in and a priority. Additionally, a newspaper writer can easily write several stories a day and not break a sweat, owing to an environment that places a premium on speed.

By contrast, many content marketers use junior writers or, worse still, content mills that deliver low-quality posts for $20.

It’s very unlikely that attempting to follow the path of newspapers would prove fruitful.

Better idea: Determine the cadence with which your brand can create uniquely valuable content, which Rand defined and described in a 2015 Whiteboard Friday. The key is to focus the lion’s share of your attention on creating content that’s exclusive and recognized as best-by-far in its class.


Will WaPo’s strategy work for your brand?

I think whilst it is true that content will take a wider range of forms, including interactive content, the future is not less content but the opposite.

My reasoning is based on a number of factors including the effectiveness of the strategy adopted by the Post and others. … As we noted above the number of pages Google has indexed over 7 years from 2008 to 2014 has increased from 1 trillion to 30 trillion.

That is an increase of 29 trillion pages in 7 years. The number of net additional pages indexed by Google each year appears to be increasing, it was 3 trillion in 2010, 5 trillion in 2012 and 8 trillion in 2014.

I’m of the opinion that seeing WaPo’s strategy as anything but “effective for them” is a mistake. As anyone who’s been around the marketing space for any amount of time can attest, chasing what another brand has been successful at is a bad idea. Yes, you should be aware of what the competition is doing, but seeing their success as anything more than unique to them, or their vertical, is a recipe for pain.

Remember, too, that WaPo isn't selling products; it's selling ad space. So the more real estate, the better for them and businesses like them.

Also, the rapid rise in number of pages indexed by Google would seem to highlight one thing: A lot of brands are investing in content; it doesn’t mean a lot of brands are being successful with it.

Better idea: After finding your cadence and nailing quality consistently, test frequency along with elements such as length and content type to find the right balance for your brand.


Quality and quantity typically go in opposite directions

As the costs of production, storage and distribution fell, particularly with online and digital products, it became economically attractive to provide products for the long tail niche audience, in fact revenue from the long tail became greater than the hits because the tail was very long indeed. Companies like Amazon and Netflix were arguably some of the first long tail companies.

Unlike WaPo, which buys ink by the proverbial barrel and has a stout staff, most brands have razor-thin content teams. Producing more and more content thus means increased expenditure, as new team members must be hired and vetted or contractors brought on.

As I experienced while working for an agency, brands expect that as the cost rises, so too do their rankings and traffic, which is not typically the case. And when those two don’t move in lockstep, the spigot is shut off, often for good.

Better idea: Develop a goal for your content that’s in line with your brand’s goals, then let your marketing team test and refine the publishing schedule. You’re likely to find that the right cadence to nail quality is fewer but bigger content pieces.

Don’t conflate strategy with the goal

By creating over 1,000 pieces of content a day you are more likely to cater for demand in the long tail for specific niche content or simply to produce content that engages a wider audience. … Sites such as BuzzFeed have also increased their content production, the Atlantic recently reported the following figures:
  • April 2012: BuzzFeed published 914 posts and 10 videos
  • April 2016: BuzzFeed published 6,365 posts and 319 videos

Again, these are — even in the case of BuzzFeed — media companies we’re talking about, so it’s not surprising that traffic, frequency and quality can continue in the same direction. For most brands, two out of three is the gold standard and one out of three is the norm.

Better idea: Stop thinking you’re a media company. It’s OK to adopt a strategy that includes more frequent publishing, but that strategy must fit inside your brand’s overall goals, not vice-versa.

Shares are the cotton candy of content marketing

When I looked recently at the most shared content published by marketing and IT sites, the data confirmed that on average long form posts achieved more shares. But when I looked in more detail at the 50 most shared posts, 45 of them were short form and under 1,000 words. Thus people are very happy to share short form content and given the pressures on everyone’s time may prefer short form content. …

I personally think there is a big opportunity for short form content and I aim to adapt my strategy to focus more on repurposing and republishing short form versions of my research that focus on specific issues. These could be focused around just a single image or chart.

On this point, I largely agree with Rayson insofar as shorter content, with rare exception, should be a part of your brand's content strategy (this post notwithstanding). I know, I know, many of you do very well with posts of varying lengths. I get that. What I'm saying is that your content's length should be determined not by your whims or the needs of the brand, but by the needs of the audience.

And certainly not based on shares, which, as we know from a recent Moz and BuzzSumo post, do not correlate with the all-important links.

In many cases and for many brands, shares are a distraction serving to keep our attention away from the important elements of content marketing. I liken them to the cotton candy at the county fair: a lot of puff, but not nearly as filling as that smoked turkey leg.


When creating content, we should begin with empathy being top-of-mind. That’s when you can allow your inner journalist to soar:

  • Who benefits most from this information (i.e., who, specifically, am I talking to?)
  • What are their specific needs?
  • Why is my brand uniquely qualified to satisfy those needs?
  • How can I best depict and share the information?
  • When is the optimal time to create, share and promote it?

Notice I never mentioned length. That was intentional.

The length of your content should be determined by your audience, not your brand.

A recent study by Chartbeat, which looked at user behavior across 2 billion visits over the web during the course of a month, found that 55% of visitors spent fewer than 15 seconds actively on a page. 15 seconds!

Better idea: If readers aren't spending a great deal of time on our sites, we should reward them, not punish them: create short but meaty posts; share graphics with a few lines of commentary to invite comments; share videos or podcasts you've enjoyed, as curated content; or ask a question, then be the first answer, nudging others to dive into the fray.

Whatever direction you decide to go in, do so with guidance from your audience and/or would-be audience.

Imagine a world filled with web searcher advocates

Again, this post is not meant as an attack on Rayson's post. If anything, I wanted to take the opportunity to reiterate to folks that content marketing isn't an either/or game; it's a long-haul game, a "this and that" game, an iterative game.

As someone who’s been made sick from doing deep dives into clients’ content, I feel strongly that we often need to protect brands from themselves. Big budgets and large teams don’t prevent marketers from making bad decisions.

I’ve made it clear to prospects and clients that I’m there as an advocate for them, but first and foremost I’m an advocate for web searchers. The more and the better I can help brands be the chosen result (not merely the top result), consistently, the happier we will all be.

Who’s willing to join me on the web searcher advocate crusade?


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


The Future of the Moz Community

Posted by Dr-Pete

As many of you know, Moz recently went through a major reorganization, which included the loss of 28% of our staff. Our Community team was heavily impacted, which has understandably led to speculation about the future of the Moz Community. I want to specifically address those concerns, project by project. The Moz Community is an essential part of our past and future, and while we can’t ignore the reality and difficulty of our recent losses, we believe strongly in our Community and are doing our best to chart a path forward.

A personal note

I asked to write this post, knowing it wouldn’t be easy. I’ve been a member of the Moz Community for almost 10 years. When my first YouMoz post was promoted in April 2007, I didn’t realize it would be the start of a decade-long journey. The Moz Community made my career in SEO possible, and I’ll always be grateful for that.

The people affected by the past few weeks are my peers and friends, and I take that loss personally. It’s ok to take that personally. At the same time, there are 160 peers and friends still at Moz trying to figure out how to do more with less, and they believe in our Community, too. We will make mistakes along the way, and we will need your help.

I and the entire Moz team would also like to thank the departing members of the Community team – Jen, Erica, Charlene, and Matt – for all of their contributions over the years building and maintaining a thriving community. This has been a month of difficult decisions driven by both unpleasant financial realities and shifts in Moz strategy, but we do not take their contributions for granted.

A few clarifications

Before diving in project-by-project, I’d like to clarify a few points. First, we did not lose the entire Community team – Megan and Danielle are still at Moz, and they’ve been with the team since 2011 and 2013, respectively. They know our Community well.

The day-to-day of our main blog is (and has been) run by members of our Audience Development team (Trevor and Felicia, with the help of the Marketing team), which is separate from the Community team. Moz Q&A and Social are a joint effort between Community, Customer Support, and a group of dedicated industry experts known as Moz Associates. It takes a lot of hard-working, dedicated people to create a world-class community.

The Big List

Here is a list of all of our major Moz Community-focused projects, and the status of those projects as best we know today. I will try to be as transparent as possible.

(1) MozCon 2016

Please be assured that MozCon 2016 is full speed ahead. Erica and Charlene graciously agreed to stay with Moz through the conference, and everything will proceed as originally planned. We look forward to seeing many of you in Seattle next week. When you see them, please thank Erica and Charlene for everything they’ve done to make MozCon a great event.

(2) MozCon 2017

We’ve had many conversations about MozCon 2017 in the past two weeks, and have committed to moving forward with our flagship event. As originally planned, MozCon 2017 will take place in Seattle, from July 17–19. We realize that an event of the size and quality of MozCon is not an easy thing to pull off, but we have many team members who have been actively involved in past events and we will collectively work hard to maintain the MozCon tradition.

(3) Moz Blog

Before Moz was a company and long before it was a product, there was a blog. The Moz Blog has our full support, will remain a core part of our Community, and we will continue to support and update big content projects, including The Beginner’s Guide to SEO. We are 100% committed to maintaining strong educational resources for the SEO community.

(4) YouMoz

Prior to the reorganization, we had started some difficult conversations about YouMoz. As our Community and the entire world of content marketing has evolved, the quantity of submissions has increased while the quality has suffered. This left our team spending a large amount of time on managing the queue and editing posts. It also meant that good posts had to wait longer to be published, frustrating our best contributors.

In the near future, YouMoz will be phased out in favor of a better guest contributor process and system for the Moz Blog. Our hope is to offer guest authors higher-profile opportunities on the main blog. We will also be exploring ways to allow our community to pitch blog topic ideas without submitting an entire post, to save everyone time and frustration.

I have a long, personal history with YouMoz, and this is a difficult decision, but as a content marketer I also know that our world has changed dramatically in the past couple of years. We will do our best to adapt to those changes and give our Community the chance to contribute in meaningful ways.

(5) Moz Q&A

A few months back, we started looking for a new technology platform for the Q&A forum, one that could better serve our evolving community. Those plans will continue forward. Our Community team is working hard to launch a better Q&A engine that can support both our Moz Pro and Moz Local customer communities, as well as the broader SEO community. We are fully committed to a new and improved Q&A in the coming months that supports a wide range of SEO conversations and helps the next generation of marketers grow in their careers.

(6) Social media

Obviously, the Community team had a huge hand in growing and managing our various social channels. Many of those channels have also changed, with Facebook pushing hard toward paid inclusion and Google+ facing an uncertain future. Megan, Danielle, and our Customer Support team are committed to actively supporting Twitter and our other existing channels, even as we look for the best ways to engage our Community in the broader social world.

We will continue to explore new channels, such as Instagram, as well as better ways to engage with old channels, including LinkedIn. Thanks to support teams in the UK and Australia, we will soon have 24/7 social coverage on major channels.

The coming months

Ultimately, our commitment to the Moz Community will be judged by our actions in the coming weeks and months and not just by our words. Our resources are more constrained now, but our dedication is as strong as ever. I’d like to thank all of you for supporting us over the years, and I hope that you will continue this journey with us as we explore the future of the industry and the Community.

If you have specific questions or concerns, please feel free to ask in the comments, and I will do my best to address them (or find someone on our team who can). We look forward to seeing many of you at MozCon next week!




The Most Effective Way to Improve Sitewide Quality and Rankings (Most of the Time)

Posted by Everett

Quality and relevance are different things, yet we often discuss them as if they were the same. SEOs have been optimizing for relevance all this time, but are now being asked to optimize for quality. This post will discuss what that means and how to do it, with a focus on what I believe to be the single most effective and scalable tactic for improving your site’s overall level of quality in most situations.

You need BOTH quality AND relevance to compete these days.

First, let’s establish what we’re talking about here, which is quality. You can have relevancy (the right topic and keywords) without quality, as shown here:

Highly relevant, but very low quality: this MFA site is highly relevant for the phrase “Chairs for Baby.” But it also sucks.

“…babies are sensitive, delicate individuals who need cautious. So choosing a right Chairs For Baby is your gentle care.” WTF?

It doesn’t matter how relevant the page is. The only way to get that page to rank these days would be to buy a lot of links, but then you’re dealing with the added risk. After a certain point, it’s just EASIER to build a better site. Yes, Google has won that battle in all but the most hyper-competitive niches, where seasoned experts still find the reward:risk ratio in their favor.

Quality + Relevance = Ranking

OK, now that we’ve established that quality and relevance are different things, but that you need both to rank, how do you optimize for quality?

Quality indicators and tactics (how to optimize for quality):

  • Grammar, spelling, depth: Hire a copy editor. Learn to write better.
  • Expertise, Authority, Trust (EAT): Make content deep and useful. Call out your awards, certifications, and news coverage, and use trust symbols. Make it easy to contact you, and easy to find policies like terms of service, returns, and privacy.
  • PageRank (PR) from links: Build high-quality links. There are thousands of great articles out there about it.
  • Reviews: Ask your best clients/customers.
  • Short-clicks: Improve out-of-stock pages with in-stock alerts and related products. Be less aggressive with your pop-up strategy. Also see query refinements, pages per visit, and dwell time.
  • Query refinements: Only attract the “right” audience and deliver what you promised in the search results. Choose keywords by relevance, not just volume. Think about query intent.
  • Dwell time on site: Make your pages stickier by improving the overall design. Add images and video. Run GA reports on the stickiest traffic sources and develop a strategy to increase traffic from those sources.
  • Pages per visit: Improve internal linking, suggest related content, customize 404 pages. Split up long posts into a series.
  • Conversion rates: Do A/B testing, make your messaging very clear, follow CRO best practices.
  • Ad placements: No ads above the main content on the page, and no annoying amount of ad blocks or pop-ups.
  • CTR from SERPs: Craft better title tags, URLs, and descriptions. Grab attention. Gain trust. Make them want to click. Add schema markup when appropriate.
  • Page speed: Visit https://developers.google.com/speed/pagespeed/insights/.
  • Mobile experience: Responsive and/or adaptive design, less text, easy navigation, loads lightning fast, shorter forms.

There are many ways to improve the quality of your site. Some are obvious. Others are more abstract. All of these quality indicators together make up your site’s overall level of quality. For more, check out the keyword-agnostic ranking factors and the SimilarWeb engagement metrics sections of the Moz Search Engine Ranking Factors report.

We’ve already established that Google knows the relative quality of a page. Let’s assume — because it is very likely — that Google also knows the relative quality of your entire site. And let’s call that your sitewide QualityRank (QR) (h/t Ian Lurie, 2011).

What’s the most effective, scalable, proven tactic for improving a website’s QR?

In a word: Pruning.

Learn more about it here. Pruning requires doing a content audit first, which you can learn more about here. It’s nothing groundbreaking or new, but few clients come in the door who can’t substantially benefit from this process.

Sometimes pruning is as easy as applying a meta robots noindex tag to all pages that have had zero organic search traffic over the course of a year. You may be surprised how many enterprise-level sites have huge chunks of pages that meet that criterion. Other times it requires more analysis and tougher decisions. It really all depends on the site.
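As a sketch of that first pass, here’s how you might flag zero-traffic pruning candidates from a page-level analytics export. The CSV format and column names (`page`, `organic_sessions`) are assumptions for illustration, not a real GA API:

```python
import csv

def pruning_candidates(ga_export_path, min_sessions=1):
    """Flag pages whose organic sessions fall below a threshold.

    Expects a page-level analytics export (hypothetical columns
    'page' and 'organic_sessions') covering a full year.
    """
    candidates = []
    with open(ga_export_path, newline="") as f:
        for row in csv.DictReader(f):
            # Pages that attracted effectively no organic traffic
            # all year are the easiest pruning candidates.
            if int(row["organic_sessions"]) < min_sessions:
                candidates.append(row["page"])
    return candidates
```

The list this returns is a starting point for review, not an automatic noindex list; pages with links, conversions, or assisted value still deserve a manual look.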

So let’s look at some pruning case studies.

Three things to remember:

1. Significant portions of the sites were removed from Google’s index.

2. Pruning was not the only thing that was done. Rarely do these things happen in a vacuum, but evidence points to pruning as a major contributor to the growth examples you’re about to see.

3. You can read more about Inflow’s research in our case studies section.

1800doorbell had technical issues that made it possible to identify cruft and prune big chunks of the site quickly. This contributed to a 96% increase in revenue from organic search within six months.

The dip at the end has to do with how the timeline was generated in GA (i.e., an incomplete month). Growth was sustained.

We’re not the only ones finding success with this. Go check out the Ahrefs case study for another example. Here’s a compelling image from their case study:

Ahrefs saw amazing results after performing a content audit and pruning their blog.

If you weren’t already convinced, I hope by now it’s clear that performing a content audit to determine which pages should be improved and which should be pruned from Google’s index is an effective and scalable SEO tactic. That being established, let’s talk about why this might be.

We don’t know exactly how Google’s ranking algorithms work. But it seems likely that there is a score for a site’s overall level of quality.

Does QualityRank actually exist as a calculation in Google’s organic algorithm? Probably not under that name, and not in this simplistic form. But it DOES seem likely that something similar exists, especially since it exists for PPC. The problem I have with the PPC equivalent is that it includes relevance factors, like keyword use, in its metric for “quality.”

Google needs a way to measure the overall quality level of each site in order to rank them properly. It’s just probably much more mathematically complicated than what we’re talking about here.

The point of discussing QualityRank as a framework for pruning is to help explain why pruning works. And to do that, we don’t need to understand the complex formulas behind Google’s ranking algorithms. I doubt half of the engineers there know what’s going on these days, anyway.

Let’s imagine a site divided into thirds, with each third being assigned a QualityRank (QR) score based on the average QR of the pages within that section.

The triangle below represents all indexable content on a domain with a QR of 30. That sitewide QR score of 30 comes from adding all three of these sections together and dividing by three. In the real world, this would not be so simple.

I hope the mathematicians out there will grant me some leeway for the sake of illustrating the concept.

This is the same site after removing the bottom 50 percent from the index:

After Pruning Pyramid

Notice the instant lift from QR 30 to QR 40 just by removing all LOW QR pages. That is why I say pruning is the most effective way to raise your site’s overall quality level for better rankings, IF you have a lot of low-quality pages indexed, which most sites do.
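The pyramid arithmetic can be sketched in a few lines. The section scores of 50, 30, and 10 are hypothetical values chosen so they average to the sitewide QR of 30 described above:

```python
def sitewide_qr(section_scores):
    """Naive sitewide QualityRank: the mean of the section scores."""
    return sum(section_scores) / len(section_scores)

before = sitewide_qr([50, 30, 10])  # all three sections indexed -> 30.0
after = sitewide_qr([50, 30])       # bottom section pruned -> 40.0
```

The lift from 30 to 40 comes purely from removing the lowest-scoring section; no individual page got any better. The real-world calculation is surely weighted and far more complex, but the direction of the effect is the point.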

Time to switch analogies

Pruning works because it frees up the rest of your content from being weighed down by the cruft.

“Cruft” includes everything from a 6-year-old blog post about the company holiday party to 20 different variants with their own landing pages for every product. It also includes pages that are inadvertently indexed for technical reasons, like faceted navigation URLs.


Remove the bottom half of this iceberg and the rest of it will “rise up,” making more of it visible above the surface (i.e. on the first 2–3 pages of Google).


The idea of one page being weighed down by another has been around at least since the first Panda release. I’m not writing about anything new here, as evidenced by the many resources below. But I’m constantly surprised by the amount of dead weight most websites continue to carry around, and I hope this post motivates folks to finally get rid of some of it. Your choices are many: 404, 301, rel="canonical", noindex, disallow… Some of the resources below will help you decide which solution to use for your unique situation.
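As a rough illustration of those choices, here is one way an audit spreadsheet might map verdicts to mechanisms. The verdict names and the mapping are my own simplified assumptions, not a universal rule:

```python
# Hypothetical decision table: content-audit verdict -> typical implementation.
PRUNE_ACTIONS = {
    "gone_for_good":       "Return a 404 (or 410) status code",
    "replaced_by_new_url": "301 redirect to the closest relevant page",
    "duplicate_variant":   'rel="canonical" pointing at the primary page',
    "keep_but_deindex":    '<meta name="robots" content="noindex">',
    "block_crawling":      "Disallow rule in robots.txt",
}

def recommend(verdict):
    """Look up the usual fix for an audit verdict; flag anything unmapped."""
    return PRUNE_ACTIONS.get(verdict, "Needs manual review")
```

Note that these mechanisms are not interchangeable: a robots.txt Disallow stops crawling but does not reliably remove a page from the index, while noindex requires the page to remain crawlable so the directive can be seen.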

Pruning for SEO resources:


