Penguin 2.0/4 – Were You Jarred and/or Jolted?

Posted by Dr. Pete

The long-awaited Penguin 2.0 (also called “Penguin 4”) rolled out on Wednesday, May 22nd. Rumors had been brewing for a while that the next Penguin update would be big and include significant algorithm changes, and Matt Cutts suggested more than once that major changes were in the works. We wanted to give the dust a day to settle, so this post reviews data from our MozCast Google weather stations to see whether Penguin 2.0 lives up to the hype.

Short-Term MozCast Data

First things first – the recorded temperature (algorithm “flux”) for May 22nd was 80.7°F. For reference, MozCast is tuned to an average temperature of about 70°, though that average has slipped into the high 60s over the past few months. Here’s a 7-day history, along with a couple of significant events (including Penguin 1.0):

MozCast Temperatures (for 7 days around Penguin 2.0)

By our numbers, Penguin 2.0 was about on par with the 20th Panda update. Google claimed that Penguin 2.0 impacted about 2.3% of US/English queries, while it clocked Panda #20 at about 2.4% of queries (see my post on how to interpret “X% of queries”). Penguin 1.0 was measured at 3.1% of queries, the highest query impact Google has publicly reported. For these three updates, temperature and reported impact line up fairly well, but we’ve seen big mismatches for other updates, so take that with a grain of salt.

Overall, the picture of Penguin 2.0 in our data confirms an update, but it doesn’t seem to be as big as many people expected. Please note that we had a data-collection issue on May 20th, so the temperatures for May 20-21 are unreliable. It’s possible that Penguin 2.0 rolled out over two days, but we can’t confirm that.
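For readers wondering what a “temperature” actually measures: MozCast’s exact formula isn’t public, so the Python below is only a toy sketch of the general idea – score the day-over-day churn in page-1 results for each tracked keyword, average across keywords, and scale so that a typical day reads about 70° (the one detail anchored above). The function names and the churn metric itself are illustrative assumptions, not our production code.

```python
# Toy model of an "algorithm flux" temperature. MozCast's real metric is
# not public; this only illustrates scaling day-over-day SERP churn so
# that an average day reads roughly 70 degrees.

def serp_churn(today, yesterday):
    """Rough churn between two ranked lists of page-1 URLs for one keyword."""
    churn = 0.0
    for pos, url in enumerate(today):
        if url in yesterday:
            churn += abs(pos - yesterday.index(url))  # how far the URL moved
        else:
            churn += len(today)  # brand-new entrant: maximum penalty
    return churn / len(today)

def temperature(serps_today, serps_yesterday, baseline_churn, baseline_temp=70.0):
    """Average churn across all tracked keywords, scaled to a temperature."""
    shared = [kw for kw in serps_today if kw in serps_yesterday]
    avg = sum(serp_churn(serps_today[kw], serps_yesterday[kw])
              for kw in shared) / len(shared)
    return baseline_temp * (avg / baseline_churn)
```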

Temperatures by Category

In addition to the core MozCast data, we have a beta system running 10K keywords distributed across 20 industry categories (based on Google AdWords categories). The average temperature for any given category can vary quite a bit, so I looked at the difference between the Penguin 2.0 temperature and the previous 7 days’ average for each category. Here they are, ordered from most to least impacted (1-day/7-day temperatures in parentheses):

  • 33.0% (80°/60°) – Retailers & General Merchandise
  • 31.2% (81°/62°) – Real Estate
  • 30.8% (90°/69°) – Dining & Nightlife
  • 29.1% (89°/69°) – Internet & Telecom
  • 26.0% (82°/65°) – Law & Government
  • 24.4% (79°/64°) – Finance
  • 23.5% (81°/65°) – Occasions & Gifts
  • 20.8% (88°/73°) – Beauty & Personal Care
  • 17.3% (70°/60°) – Travel & Tourism
  • 15.7% (87°/75°) – Vehicles
  • 15.5% (84°/73°) – Arts & Entertainment
  • 15.4% (72°/62°) – Health
  • 15.0% (83°/72°) – Home & Garden
  • 14.2% (78°/69°) – Family & Community
  • 13.4% (79°/70°) – Apparel
  • 13.1% (78°/69°) – Hobbies & Leisure
  • 12.0% (74°/66°) – Jobs & Education
  • 11.5% (88°/79°) – Sports & Fitness
  • 7.8% (75°/70°) – Food & Groceries
  • -3.7% (70°/73°) – Computers & Consumer Electronics

Retailers and Real Estate came in at the top, with temperatures just over 30% higher than their recent averages. Oddly, Consumer Electronics rounded out the bottom with slightly lower-than-average flux. Of course, split 20 ways, this represents a relatively small number of data points per category. It’s useful for reference, but I wouldn’t read too much into these breakdowns.
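For clarity, the arithmetic behind those percentages is just a relative difference, sketched below in a few lines of Python. Note that the temperatures in parentheses above are rounded for display, which is why recomputing from them gives slightly different values than the reported ones.

```python
def category_impact(day_temp, prior_week_avg):
    """Percent difference between the Penguin-day temperature and the
    category's average temperature over the previous 7 days."""
    return (day_temp - prior_week_avg) / prior_week_avg * 100.0

# Recomputing from the rounded display temperatures:
print(round(category_impact(80, 60), 1))  # 33.3 vs. the reported 33.0
print(round(category_impact(70, 73), 1))  # -4.1 vs. the reported -3.7
```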

“Big 20” Sub-domains

Across the beta 10K data set, we track the top sub-domains by overall share of SERP real estate. Essentially, we count how many page-1 positions each sub-domain holds and divide that by the total number of page-1 positions in the data set. These were the Big 20 sub-domains for the day after Penguin 2.0 hit, along with their SERP share and 1-day change:

  1. 5.66% (+0.29%) – en.wikipedia.org
  2. 2.35% (-0.75%) – www.amazon.com
  3. 2.22% (+3.11%) – www.youtube.com
  4. 1.49% (+6.05%) – www.facebook.com
  5. 1.35% (-8.11%) – www.yelp.com
  6. 0.84% (+4.77%) – twitter.com
  7. 0.58% (+0.37%) – www.webmd.com
  8. 0.58% (+1.87%) – pinterest.com
  9. 0.52% (+1.24%) – www.walmart.com
  10. 0.49% (+4.54%) – www.tripadvisor.com
  11. 0.47% (+0.45%) – www.foodnetwork.com
  12. 0.47% (-0.44%) – allrecipes.com
  13. 0.44% (+1.98%) – www.ebay.com
  14. 0.41% (-0.76%) – www.mayoclinic.com
  15. 0.38% (+1.72%) – www.target.com
  16. 0.37% (-4.37%) – www.yellowpages.com
  17. 0.37% (+0.58%) – popular.ebay.com
  18. 0.36% (+2.12%) – www.huffingtonpost.com
  19. 0.33% (+3.27%) – www.overstock.com
  20. 0.32% (-0.32%) – www.indeed.com

By percentage change, Yelp was the big day-over-day loser, at -8.11%, and Facebook was the biggest gainer, at +6.05%. In absolute positions, YouTube picked up the most page-1 rankings, and Yelp was still the biggest loser. Overall, the Big 20 occupied 20.00% of the page-1 real estate the day after Penguin 2.0, up from 19.88% the previous day – a modest net gain in ranking positions.
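For the curious, here’s a minimal sketch of how a share-of-SERP metric like this can be computed, assuming a simple mapping of each keyword to that day’s ordered list of page-1 sub-domains (the data structure and function names here are illustrative, not our production pipeline):

```python
from collections import Counter

def serp_share(serps):
    """serps: dict mapping each keyword to its ordered list of page-1
    sub-domains. Returns each sub-domain's share (%) of all positions."""
    counts = Counter(sub for results in serps.values() for sub in results)
    total = sum(counts.values())
    return {sub: 100.0 * n / total for sub, n in counts.items()}

def day_over_day_change(today, yesterday):
    """Relative 1-day change in share, matching the +/- figures above."""
    return {sub: 100.0 * (today[sub] - yesterday[sub]) / yesterday[sub]
            for sub in today if sub in yesterday}
```

So a sub-domain holding 566 of 10,000 tracked page-1 positions would show a 5.66% share, and the change figures in parentheses are relative to the previous day’s share.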

3rd-Party Analyses

I’d like to call out a few analyses posted yesterday that were based on unique data, since there are bound to be a lot of speculative posts in the next few weeks. SearchMetrics posted its Penguin 2.0 biggest-losers list, with porn and gaming sites taking the heaviest losses (Search Engine Land provided additional analysis). GetStat.com showed a jump in Top 100 rankings for big brands, but relatively small changes for most sites, with most of those changes on pages 3+ of the SERPs.

Most reports yesterday showed relatively modest day-over-day changes (solid evidence of an algorithm update, but not a particularly big one). One exception was Dejan SEO’s Australian flux tracker, Algoroo, which showed massive day-over-day flux. We believe at least two other major algorithm updates rolled out in the US in May, so it’s possible that multiple updates were combined and hit other countries simultaneously. This is purely speculative, though, and no other reports suggest changes on the scale of the Australian data.

The May 9th Update

I’d also like to call out an unconfirmed algorithm update in early May. There was a period of heavy flux for a few days at the beginning of the month, backed up by webmaster chatter and other 3rd-party reports. Temperatures on May 9th reached 83.3°F. The MozCast 7-day graph appears below:

May 9th Algo Update

The temperature spike on May 5th is unconfirmed, and may have been a test across a small number of data centers (unfortunately, our 10K system was running a separate test that day, so we can’t compare the two data sets). Reports of updates popped up across this time period, but our best guess for the main event is May 9th. Interestingly, traffic to MozCast tends to reveal when people suspect an update and are looking for confirmation, and the traffic pattern shows a similar trend:

MozCast May Traffic

Traffic data also suggest that May 5th was probably an anomaly. Private data from multiple SEOs show sites gradually losing traffic over a couple of days in this period. Unfortunately, we have no clear explanation at this time, and I don’t believe this was directly related to Penguin 2.0. Google did roll out a domain-crowding update at some point in the past couple of weeks, which may be connected to the early-May data, but we don’t have solid evidence either way. At this point, though, I strongly believe the data indicates a significant algorithm update around May 9th.

Were You Hit by Penguin 2.0?

It’s important to keep in mind that all of this is aggregate data. Algorithm updates are like unemployment rates: if the unemployment rate is 10%, the reality for any individual is still binary – you either have a job or you don’t. You can weather 20% unemployment if you have a job (although you may worry more), and 5% unemployment is little comfort if you’re jobless. In calling this update relatively small, I don’t mean any lack of empathy for those hit by Penguin 2.0; overall, though, the impact seems less jarring and jolting than many people feared. If you were hit, please share your story in the comments.



Conducting Market Research Before Investing in Tactical Execution – Whiteboard Friday

Posted by randfish

The phrase “look before you leap” has never been more true! Before you start investing in tactics, it’s important to do your market research. Many businesses are tempted to dive into the details before answering the bigger questions, like who their customers are, how those customers make purchase decisions, where their potential users are on the web, and how customers may choose between similar companies and offerings. 

In today’s Whiteboard Friday, Rand discusses why building out a research-based roadmap before you start building your tactics (like SEO, content, and social campaigns) will help boost your chance of success. Leave your thoughts in the comments below!

Conducting Market Research Before Investing in Tactical Execution – Whiteboard Friday

For your viewing pleasure, here’s a screenshot of the whiteboard used in today’s video:

Video Transcription

“Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I want to talk a little bit about doing your market research before you start jumping in and investing in tactics. Shout out to @Andrew_Isidoro on Twitter for suggesting this topic. I really appreciate it, Andrew.

The reason this is so important, and why I was so passionate and excited when Andrew suggested it, is that I’ve seen us here at Moz and many, many other companies – back when we used to do consulting, and even the folks I try to help today, lots of people I talk to all over the industry – making this mistake of wanting to dive right into the details: sending their tweets, building their content, tweaking their website, setting up their conversion tests, optimizing their pages for search engines, all that stuff, before they have answers to the big questions. Who’s our customer target? Where on the web are they? How do they make their purchase decisions? Who are their influencers? What are the things that influence them to make a purchase or not, and how do they choose between different companies and different offerings?

If we answer these questions, we can build something really beautiful: a research-based roadmap. We know things like the personas of who we’re targeting. What types of customers are we trying to reach? For example, when we launched SEOmoz Pro years ago, we thought we were primarily targeting in-house marketers, people who worked in-house at companies, not consultants and agencies. So we hadn’t built things like white labeling, custom reports, the ability to add your logo, all that kind of branding stuff. Those personas were critical to getting the product right. About 40% of our customers, in fact, are agencies and consultants.

Channels, what are the channels that we’re going to reach people at? Is it social networks? Is it things like YouTube, where there’s a lot of video going on and obviously a lot of search activity? Is it Google and Bing, where the searches are taking place? Is it content? Are they only at events? Is there a very, very small set of these folks and we need to reach them initially through events or direct outreach? Do we need to build a sales pipeline and then have introductions being made? Are we going to use LinkedIn? Those channels are critical to knowing what marketing things we’re going to do.

The tactics to pursue on a per-channel basis. It could be the case that the same tactic I’m using again and again on a certain channel is going to work very well. You could see, for example, that content marketing, for Moz at least, works pretty well across all of our social channels. But it’s not exactly what we do in person. We try to have a very educational bent to a lot of our content, and that might change up a little bit depending on which forum we’re in and what kind of folks we’re trying to reach or who we’re talking to at the time. So those are different tactics per channel.

We want the information. We want to know how they make purchase decisions so that we can provide the information that potential customers need to make a decision. Are they deciding based on features, on price, or on what experts have said? Is it based on feedback? Is it based on brand? A lot of times marketing decisions are made on brand. Is it based on design and UX?

This roadmap can then tell us things like what goes on the website and where and how we’re going to spend the money. Is it going to be on people and resources to build up a long-term marketing funnel through content, search, and social, the organic or inbound channels? Or is it going to be on a lot of one-off purchases: an email list that we’re going to blast, a homepage takeover, a lot of display ads, PPC ads, those kinds of things?

How are we going to measure success? How do we know whether we’re actually winning? Is it based on a percentage of the market? Is it based on market share against another company? Is it pure adoption? Is it something else? Is it brand awareness?

What marketing tactics do we need to be good at? What are the ones where it’s a very competitive sphere versus the ones where it isn’t? What are things where we need to invest a lot of time and energy to build up skills and tactics versus maybe throwing dollars at it, hiring an agency to do it? All those kinds of things.

This research based roadmap can answer all of those questions for you, but you can’t do it unless you’re doing market research first. I do want to talk a little bit about some types of market research and how to specifically conduct those.

So a very obvious one, one that folks in the SEO and web marketing fields are very familiar with, is competitive research. It’s very obvious to most of us because we investigate what our competitors are doing to be successful in search results, on Twitter and Facebook, or in their content efforts.

We can look at lots of attributes in competitive research. Who are the evangelists? Who are the people pushing this company, speaking on its behalf? What are the marketing channels they’re using? What are their traffic sources? Where are they getting visits and traffic from? This can be challenging to get at, and I won’t dive into all of these. Press and mentions: where are they getting mentioned? By whom? What are people saying about them? Who do they compare them to? Hopefully it’s us.

Design and UX, what are they doing successfully or not so successfully on their website? Unique value propositions, what’s the angle they take that says, “Oh, this is what’s really unique about our company. This is the particular reason why you would buy, I don’t know, Columbia Sportswear instead of Nike or Reebok or Mountain Gear or whatever it is.” And who’s their target market? Oftentimes these two are very tied together. The UVP or USP ties in with the target market because they’re trying to reach a particular person, and they think those specific attributes that are unique to their company are what’s going to successfully reach them.

There’s also customer research, and you can do customer research of all kinds. You can do profiling, which can be demographic or psychographic. You can do targeted surveys where, essentially, I have a list of customers. For example, here at Moz we obviously have a list of the 21,000 people who pay to use Moz, and we can send a targeted survey to them. We actually have a customer advisory board of about 300 folks that Jackie runs here on our product team, and she talks to those folks very directly and will send them questions to answer.

There’s also a quite interesting, relatively recent phenomenon from just the last few years: sizing and perception surveys. The two big providers for those are Survey Monkey’s Audience product and Google’s Consumer Surveys product. Essentially what they’ve got is lots of people that they advertise to, sort of random citizens of the web, denizens of the web, who will take surveys based on profile data that you request. So you can get a sense of: how big is my brand in this space? Have people heard of this thing I’m trying to offer? How many people are even interested in this thing? You can ask those broad, broad questions of a random group of users with specific sets of interests or profile features.

You can do in-person interviews. A lot of startups especially do in-person interviews. They talk to a customer, bring them into the office: What are you doing? How are you doing it now? What could you see making that process easier or better? What is something you would pay for?

Usability studies are similar, but they’re done with a finished product or a near-finished product. Wireframe reviews involve a little bit less of a finished product; they’re more of a “hey, let’s walk through these wireframes and see: if this product were built, would it solve your problems? Would it be something you’d be passionate about, something you would buy?”

Then there are two more. There’s expert data that you can gather in terms of market research, and expert data is a little bit different from customer data. This is not saying, “Hey, I want to reach out to anyone who would potentially be a customer,” but rather, “I want to reach out to the experts in the field.” This is something, again, that we do a lot of at Moz. We have kind of a core group of people inside and outside the company who have been marketing experts, web marketing experts, for many, many years and have a lot of depth of knowledge in SEO and all those kinds of areas. Finding those folks is really cool because a lot of times they turn out to be the evangelists and influencers of much of the rest of the field. So by bringing them into your process, you can do those interviews, surveys, profiling, usability studies, and wireframe reviews, the same as you can with customers, but potentially get very different, and oftentimes very interesting, data. I would be careful, though: I’m personally biased, oftentimes, toward listening to the experts at the expense of customers. Not a good idea. You should very much consider both of these groups. Experts sometimes are so deep that they can’t see the forest for the trees, which is a problem I have myself a lot of the time too.

Then the last one is published or professional data, and these are often collected by large firms, Forrester Research, for example. They put together these large scale studies on different industries. This form of data is also fine, but it’s usually a leading indicator that you then want to verify and validate with some of these other forms.

So by doing this, by doing these forms of market research, you can get the answers to these questions, build that research based roadmap, and then when you go and execute, you’ll know that you’re on the right path. This is really powerful because a lot of the time when you take off and you start diving into the details without it, it’s bad biscuits. Bad biscuits make the baker broke, bro.

All right everyone. Hope you’ve enjoyed this edition of Whiteboard Friday. We’ll see you again next week. Take care.”

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
