
The 2017 Local SEO Forecast: 10 Predictions According to Mozzers

Posted by MiriamEllis

Maybe it takes a bit of daring to forecast local search developments in quarters 2, 3, and 4 from the fresh heights of Q1, but the Moz team thrives on challenges. In this post, Rand Fishkin, Dr. Pete Meyers, George Freitag, Britney Muller, and I peer into the future in hopes of helping your local business or local search marketing agency be mentally and tactically prepared for an exciting ride in the year ahead.


1. There will be a major shakeup in local SEO ranking factors.

Rand Fishkin, Founder & Wizard of Moz

My prediction is that the local SEO ranking factors will have a major shakeup, possibly devaluing some of the long-held elements around listing consistency from hard-to-control third parties. I think Google might make this move because, while they perceive the quality and trustworthiness of those third-party local data aggregators to be decent, they don’t want to force small business owners into maintaining contentious relationships or requiring them to learn about these services that control so much of their ranking fate. I’ll be the first to say this is a bold prediction, and I don’t give it super-high odds, but I think even if it doesn’t happen in 2017, it’s likely in the next few years.


2. Feature diversification will continue to mature.

Dr. Peter J. Meyers, Marketing Scientist at Moz

I predict that local SEO will finally see the kind of full-on feature diversification (organic and paid) that has been going on with organic for a few years now. We’ve already seen many changes to local packs and the introduction of local knowledge panels, including sponsored hotel panels. Now Google is testing paid home services, ads in local packs, destination carousels, trip planning guides and, most recently, “Discover More Places” map results. By the end of 2017, “local SEO” will represent a wide variety of organic and paid opportunities, each with their own unique costs and benefits. This will present both new opportunities and new complications.


3. Voice search will influence features in Google and Amazon results.

George Freitag, Local Search Evangelist at Moz

I also think we’ll see a new wave of features appear in the local pack over the next year. I believe that voice search will play a large part in this as it will determine the most important features that Google (and Amazon) will incorporate into their results. As both companies start to gather more and more data about the types of complex searches — like “How long will it take me to get there?” or something more ambitious like “Do they have any more of those in my size?” — Google and Amazon will start to help businesses answer those questions by allowing more opportunities to directly submit information. This satisfies both Google’s desire to have even more data submitted directly to them and the searcher’s desire to have access to more information about the businesses, which means it’s something that is definitely worth their time.


4. Google will begin to provide incredibly specific details about local businesses.

Britney Muller, SEO & Content Architect at Moz

I predict that we will see Google acquiring more intimate details about local businesses. They will obtain details from your customers (via different incentives) for unbiased feedback about your business. This will help Google provide searchers with a better user experience. We’ve already started seeing this with “Popular Times” and the “Live” features, showing you if current traffic is under or over the typical amount for the specific location. Your location’s level of noise, coziness, bedside manner (for doctors and clinics), and even how clean the bathroom is will all become accessible to searchers in the near future.


5–10. Six predictions for the price of one!

Miriam Ellis, Moz Associate & Local SEO

I have a half-dozen predictions for the coming year:

Diminishing free packs

Google paid packs will have replaced many free packs by 2017’s end, prompting local business owners to pay to play, particularly in the service industries, which will find themselves giving Google a piece of the pie in exchange for leads.

Voice search will rise

Local marketers will need to stress voice search optimization to business owners. Basically, much of this will boil down to including more natural language in the site’s contents and tags. This is a positive, in that our industry has stressed natural language over robotic-sounding over-optimization for many years. Voice search is the latest incentive to really perfect the voice of your content so that it matches the voice your customers are using when they search. Near-me searches and micro-moment events tie in nicely to the rise of voice search.

Expansion of attributes

Expect much discussion of attributes this year as Google rolls out further attribute refinements in the Google My Business dashboard, and as more Google-based reviewers find themselves prompted to assign attributes to their sentiments about local businesses.

Ethical businesses will thrive

Ongoing study of the millennial market will cement the understanding that serving this consumer base means devoting resources to aspirational and ethical business practices. The Internet has created a segment of the population that can see the good and bad of brands at the click of a link, and who base purchasing decisions on that data. Smart brands will implement sustainable practices that guard the environment and the well-being of workers if they want millennial market share.

Google will remain dominant

What won’t happen this year is a major transfer of power from the current structure. Google will remain dominant, but Facebook will continue to give them the best run for their money. Apple Maps will become more familiar to the industry. Yelp will keep building beyond the 115 million reviews they’ve achieved and more retail business owners will realize Yelp is even bigger for their model than it is for restaurants. You’ve pretty much got to be on Yelp in 2017 if you are in the retail, restaurant, or home service industries.

Amazon’s local impact will increase

Amazon’s ingress into local commerce will almost certainly put many local business owners on notice that the giant is coming to town, especially in metropolitan communities. I’m withholding judgment on how successful some of their programs (like Amazon Go) will be, but local business owners need to familiarize themselves with these developments and see what’s applicable to them. David Mihm recently mentioned that he wouldn’t be surprised to see Amazon buying a few bankrupt malls this year — that wouldn’t surprise me, either.


Taken in sum, it’s a safe bet that local SEO is going to continue to be a significant force in the world of search in the coming year. Local business owners and the agencies that serve them will be wise to stay apprised of developments, diversifying tactics as needs arise.

Now it’s your turn! Do you agree/disagree with our predictions? And how about your forecast? When you look to the future in local, what do you foresee? Please help us round out this post with predictions from our incredibly smart community.



How to Uncover Hidden Keyword-Level Data Using Google Sheets

Posted by SarahLively

TL;DR

Keyword-level data isn’t gone, it’s just harder to get to. By using Google Sheets to marry the data from Search Console and Google Analytics into a sheet, you’ll have your top keywords and landing page engagement metrics together (for free!). It’s not perfect keyword-level data, but in 7 steps you can see the keywords that drove clicks to a page and the organic engagement metrics for that page, all together in one place. The Google Analytics Add-on for Google Sheets will pull organic landing page engagement metrics, and the Search Analytics for Sheets Add-on will pull the top queries by landing page from Search Console. Then, use VLOOKUP and an Array Formula to combine the data into a new tab that has your specified landing pages, the keywords that drove clicks there, and the specified engagement metrics.

What do you mean you don’t know which keyword drove that conversion?

Since the disappearance of keyword-level data in Google Analytics, SEOs have been struggling to tie keyword strategies to legitimate, measurable metrics. We put much of our time, resources, and research efforts into picking the perfect keyword theme, full of topically relevant terms that leverage new semantic strategies. We make sure to craft the perfect metadata, positioning our top keywords in the right place in the title tag and integrating them seamlessly into the meta description, but then what? We monitor rankings and look to landing page metrics, but all of our data is disjointed and we’re left to extrapolate insights based on a limited understanding of how our themes are truly performing.

There is good news, though! Keyword-level data is still there — it’s just much harder to get to given the structure of existing platforms. If you’re like me, you have your landing page metrics in Google Analytics, your keyword click data in Search Console, and your keyword themes in a manual program (probably Excel). Given the way Google Analytics exports data, the way Search Console separates keywords and landing pages, and the nuances you’ve applied to your own keyword theme documents, it’s difficult to marry all of the data in a way that gives you actionable insights and real-time data monitoring capabilities.

Difficult… but not impossible. Enter: Google Sheets. In 7 easy steps you can pull all of this data into one sheet so you can see your keyword theme, the keywords you’re getting clicks for, the page ranking, and any organic metric for that page (think engagement metrics, conversion metrics, revenue metrics, etc.), all in one place! You can monitor keyword opportunities within striking distance, whether the keywords you want to rank for are actually ranking, and what terms and themes are driving the majority of your revenue or conversions. At the end of the day all of this works to give you actionable metrics you can monitor and change through keyword strategies. It’s much easier than you may think, and the steps below will get you started.

Follow this guide to build out a basic Google Sheet that ties Search Console, Google Analytics, and your keyword theme into one place for a few pages, and then you’ll be well on your way to building out automated sheets that give you greater insight into keyword-level data!

Step 1: Get the Google Analytics and Search Analytics for Sheets Add-ons

The Google Analytics Add-on will allow you to pull any metric from Google Analytics into your spreadsheet and Search Analytics for Sheets will pull data from Search Console. Pulling from these two sources will be the key to combining the data from Google Analytics and the Search Analytics report in a meaningful way. Once you have a new sheet open and you’re in the add-on feature, finding and installing Google Analytics and Search Analytics for Sheets should be pretty straightforward. Also, both add-ons are free.

Step 2: Create Google Analytics reports

Once you’ve installed the Google Analytics add-on, you’ll find “Google Analytics” in your menu. Hover over Google Analytics and select Create new report to get started. After the sidebar menu pops in, select the Account, Property, and View that you want to pull data from. You will also be able to name your report (see note below) and then select Create Report. You do not have to worry about the metrics and dimensions at this point, but that will come later.

Note: At the end of this article I have a template you can use to combine the data from Google Analytics and Search Analytics. If you want to use the template, make sure you name this first report Organic Landing Pages Last Year. I will also walk through the formulas and functions used in this article, so you don’t have to rely on the template, but the nomenclature of each tab must be consistent to use my exact formulas. There are plenty of opportunities to rename the report and tabs, so don’t stress if you miss this part and name your report something different; just know that if at the end the template isn’t working, you should double-check the tab names.

Step 3: Configure your Google Analytics reports

The Report Configuration tab you now see as the first tab in your sheet is where you can configure the data you want to pull. I highly recommend familiarizing yourself with this functionality by watching this quick, five-minute video from Google as an overview on how to generate reports from Google Analytics in Google Sheets. Listed below are the fields being used for this report, and you can find an extensive overview of what all of these fields mean and the metrics you can use within them here: https://developers.google.com/analytics/solutions/google-analytics-spreadsheet-add-on.

Note: If you prefer to simply fill in your sheet and read the details on each field configuration later, you can paste the cells below into your table at cell B5 (just double-check it looks like the screenshot above) and skip down to the last paragraph in this section, right after Segments.

395daysAgo
365daysAgo
ga:sessions, ga:bounces, ga:goalCompletionsAll
ga:landingPagePath
-ga:sessions
sessions::condition::ga:medium==organic

Report Name:

The name you set when you created the report. This can be changed, but note that when you run your report, the tab with your report will use this report name.

Type:

This will automatically fill in “core” for you, meaning we are pulling from the Core Reporting API.

View:

This will also automatically fill in your Profile ID, which you set when you created the report.

Start Date:

To compare the last 30 days to the same 30 days the previous year, we will set the Start Date as 395daysAgo

End Date:

To compare a full 30 days last year to a full 30 days this year, we will set the End Date as 365daysAgo

Metrics:

This refers to the metrics you want to pull and will dictate the columns you see in your report. For this report we want to look at sessions, bounces, and goal completions, so we are using the metrics ga:sessions, ga:bounces, ga:goalCompletionsAll. Google has an excellent tool for searching possible metrics here (https://developers.google.com/analytics/devguides/reporting/core/dimsmets) if you want to eventually test and pull anything other than sessions, bounce rate, and goal completions.

Dimensions:

Dimensions refers to the dimensions you want to see specific metrics for; in this case, landing pages. We’re using landing pages as the dimension because this will allow us to match the Search Analytics landing page query data with the landing page data in Google Analytics. To pull the metrics you selected above by landing page, use ga:landingPagePath

Sort:

The Google Analytics API will default to sort your metrics in ascending order. For me, it’s more valuable to see the top landing pages in descending order so I can get a quick look at the pages driving the most traffic to my site. To do this, you simply place a minus (-) sign before the metric you want to sort your data by: -ga:sessions. You can learn more about sorting metrics through the Google Analytics API here: https://developers.google.com/analytics/devguides/reporting/core/v3/reference#sort.

Segments:

The last field we’re going to fill in is Segments, so we can look at just organic traffic. This is where you could put in new organic users, returning organic users, or any special segment you’ve created in Google Analytics. However, for this report we’re going to use the primary organic traffic segment that’s standard in Google Analytics: sessions::condition::ga:medium==organic.

As mentioned, we want to see organic traffic to each page during the last 30 days compared to the same period the previous year. To do this, we need to generate two reports: one with our session data for the last 30 days, and one with the session data for the same span of time one year ago. We already have the previous year’s configuration ready to go, so simply copy it into column C, rename the Report Name to Organic Landing Pages This Year, and change Start Date to 30daysAgo and End Date to yesterday. Double-check the screenshot above matches your configurations before moving on.
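
In other words, column C should end up identical to column B except for three cells (a quick reference only; nothing here needs to be pasted anywhere):

Report Name: Organic Landing Pages This Year
Start Date: 30daysAgo
End Date: yesterday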

Step 4: Run your Google Analytics report

You will run the report you just created by selecting Run reports under the Google Analytics add-on. We won’t be reviewing scheduling reports in this article, but it can be useful to time these to run on a specific day to align with any ongoing reporting you have. You can learn more about scheduling reports here: https://developers.google.com/analytics/solutions/google-analytics-spreadsheet-add-on#scheduling-reports.

If everything has been completed correctly so far, you should see this popup:

If, for some reason, you see a popup noting that you have an error, Google Analytics is great at letting you know exactly which field has been implemented incorrectly. Double-check your segments here (https://developers.google.com/analytics/devguides/reporting/core/v3/reference) and as long as you’re using valid formatting, you should be able to fix any issues.

Assuming everything went according to plan, you’ll see a spreadsheet that looks like this:

Step 5: Run your Search Analytics for Sheets report

Running a Search Analytics for Sheets report is really simple. Click to your empty sheet (Sheet1), and in the same place you were able to launch Google Analytics, launch the sidebar for Search Analytics for Sheets. From there, you’ll authorize the app and set the parameters of your report. Any metrics that I updated are highlighted in the screenshot below, but you want to group by query and page, aggregate by page, and have the results display on the active sheet. The default for Search Analytics for Sheets is to pull from the previous 90 days, but you can adjust this to display whatever makes sense for your website.

As long as everything runs correctly, you’ll see your top search queries, landing pages, clicks, impressions, CTR, and average position in descending order by clicks. Rename Sheet1 to Search Console Data, and your sheet should look like this:

Step 6: Remove the domain name from Search Analytics landing pages

Hopefully you can see where this is going now. We have one tab with all of our Google Analytics data by landing page, and one with our Search Analytics data by landing page, so all that’s left is to marry the data.

First, we just need to strip the domain name from the Search Console data. You’ll notice the data from Google Analytics pulls the top landing pages excluding the https://domain-name.com, while Search Console pulls the entire domain. Therefore, we have to format them identically in order to combine the data. To do this, you’ll need to execute a “find and replace” on your Page column in the Search Console tab in Google Sheets and replace https://domain-name.com with no replacement (eliminating the domain name from the URL).
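
If you’d rather not run a manual find and replace, a helper-column formula can do the same stripping. This is a minimal sketch, assuming your pages sit in column B of the Search Console Data tab and your site lives at https://domain-name.com:

=SUBSTITUTE(B2, "https://domain-name.com", "")

Drag that down the helper column, then paste the results back over the Page column as values.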

Step 7: Combine the data

Download the Keyword Level Data template here. This template has the proper formulas in place to pull landing page sessions year over year, bounce rate, and total goal conversions. I’ve also set Column C up as “Target Keywords” to type in the terms you’re actively targeting on each page. This way, you can see if what you’re targeting is similar to what you’re ranking for in Google. Once the template is up, copy the Keyword Data tab to your worksheet.

After you copy the sheet over, you should see a new sheet with a tab called Keyword Data. From here, select the Keyword Data tab and click Copy to…

Select the sheet you have built with your data, and a copy of the Keyword Data tab will populate at the end of your sheet.

If you’ve done everything correctly so far, you will be able to update your URLs and the data will automatically appear within the template for your specific pages. When adding your page URL, be sure not to include the domain name. For example, if you wanted to see data for https://www.domain-name.com/products/, you would type /products/ in cell B6 and see the data populate. Also make sure everything is matching up with trailing slashes between your Google Analytics data and your Search Console data. If you have issues with duplicate URL structures, you may need to work with the data a bit to make the URL structure formatting consistent (and also you should fix that on the server side!). Your results should look something like this:

How is the template working?

If you’re interested in looking at more than two pages and really building this out into a more robust report, you probably want to understand what formulas are controlling the results so you can expand the data.

The majority of this template utilizes VLOOKUP to pull the Google Analytics data into the sheet. If you’re not sure how VLOOKUP works, you can read more on that here.
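
If you want to peek under the hood, the lookups follow this general pattern (a sketch rather than the template’s exact formula; it assumes the report tab lists landing pages in column A with sessions in column B):

=VLOOKUP($B6, 'Organic Landing Pages This Year'!$A:$B, 2, FALSE)

This would return the sessions for the landing page entered in B6, with FALSE forcing an exact match on the URL.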

The year-over-year percent change column and bounce rate column are simple calculations. For example, the percent change in cell G6 is calculated using =(E6-F6)/F6 and the bounce rate in cell I6 uses =(H6/E6). You’re probably familiar with these common Excel functions already.

The more complicated formula is the array formula that’s being used to pull the keyword data from Search Analytics. Due to the fact that a VLOOKUP will stop after the first match, and we want to see up to five matches for queries, we’re utilizing an array formula instead to pull the matches into up to 5 cells. There are other functions that will do this as well (pull all possible matches in a sheet, that is); however, the array formula is unique in that it lets us limit the results to five rows (otherwise, if you have 10 matches for one term but 4 for another, you wouldn’t be able to structure your sheet in a way that displays multiple pages within one tab).

Here is the array formula that’s used in cell D6:

=ArrayFormula(IF(ISERROR(INDEX('Search Console Data'!$A$1:$B$5000,SMALL(IF('Search Console Data'!$B$1:$B$5000=$B$6,ROW('Search Console Data'!$A$1:$B$5000)),ROW(2:2)),1)),"",INDEX('Search Console Data'!$A$1:$B$5000,SMALL(IF('Search Console Data'!$B$1:$B$5000=$B$6,ROW('Search Console Data'!$B$1:$B$5000)),ROW(2:2)),1)))

This formula is allowing multiple values to pull for the value in B6, but also allows the formula to drag down and expand through cell D11. The array formula in cell D11 is:

=ArrayFormula(IF(ISERROR(INDEX('Search Console Data'!$A$1:$B$5000,SMALL(IF('Search Console Data'!$B$1:$B$5000=$B$6,ROW('Search Console Data'!$A$1:$B$5000)),ROW(7:7)),1)),"",INDEX('Search Console Data'!$A$1:$B$5000,SMALL(IF('Search Console Data'!$B$1:$B$5000=$B$6,ROW('Search Console Data'!$B$1:$B$5000)),ROW(7:7)),1)))

You can learn more about array formulas here, but the way they are executed in Google Sheets is a bit different than Excel. From my research, this formula gave the results I wanted (multiple matches controlled in a specific set of cells), but if you know of a function in Google Sheets that does something similar, feel free to share in the comments!
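
One such alternative is FILTER wrapped in INDEX. This is a sketch under the same assumptions as the template (queries in column A and pages in column B of the Search Console Data tab, with the first result row in D6):

=IFERROR(INDEX(FILTER('Search Console Data'!$A:$A, 'Search Console Data'!$B:$B=$B$6), ROW()-ROW($D$5)), "")

FILTER returns every query whose page matches B6, INDEX picks the first match in D6, the second in D7, and so on as you drag down, and IFERROR blanks out the cells once the matches run out.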

Conclusion

Keyword-level data isn’t gone! Google is giving us valuable insights into what terms are leading users to our sites — we just need to combine the data in a meaningful way. Google Sheets is a powerful way to connect to various APIs and pull loads of data from multiple sources. There are some limitations to the Search Analytics report (see this great post from Russ Jones on some inaccuracies he found in Search Console Search Analytics data), so hopefully this small sheet will inspire you to expand the data and include more engagement metrics from Google Analytics, additional click data from Search Console, rankings data, data for traffic outside of organic, and more. Not to mention all of this can be scheduled, so you can have your Search Analytics and Google Analytics data ready when you open your sheets and automate almost this entire process.

We don’t have to use tools like Search Console and Google Analytics in a vacuum simply because they exist that way. Experiment with ways to combine the data on your own to gain more valuable insights into your campaigns!

Also, if you loved this, if any of this doesn’t work for you, if you know paid tools that do this, you’re doing this a different way, you’re doing this in a bigger way, or this just didn’t make sense to you — comment! I would love to hear how other SEOs are gleaning insights into keyword data in the new days of (not provided) and improve on this process with your help!

Shout outs

A special shout out goes to @mihaiaperghis for publishing this blog post on How to Use Search Analytics in Google Sheets for Better SEO Insights as I was finishing up this post. Thanks to your post, I was able to find a free, easy way to pull from the Search Analytics API into sheets. Before reading, I was utilizing and wrote about a paid add-on that was ~$30/month, so thanks to your post I can call this entire process free. Also thanks to @SWallaceSEO for reviewing this article, testing the sheet, and helping me with edits and debugging!



The Keyword + Year Content/Rankings Hack – Whiteboard Friday

Posted by randfish

What’s the secret to earning site traffic from competitive keywords with decent search volume? The answer could be as easy as 1, 2, 3 — or more precisely, 2, 0, 1, 7. In today’s Whiteboard Friday, Rand lets you in on a relatively straightforward tactic that can help you compete in a tough space using very fresh content.

Keyword + year hack


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about what is sometimes known as the keyword-plus-year hack. This is the idea that you take a keyword that has some existing search volume, and you add on a date, either a month or a month and a year, and you are able to outrank many of the other players because it’s a much less competitive space. I’ll show you what I’m talking about. It helps to have an example.

Keyword trend graph

So let’s say there are a good number of people every year who search for the best web designs. They want to get design inspiration, or they just want to see what’s out there. They’re looking for the design gallery, so they search for best web designs. Many times, though, those folks will get results from the last 5 to 10 years, and they’ll see those in there and they’ll go, “You know, I want something more modern, more updated,” and so they will revise their query to “best web designs 2016.” In fact, many people even start with these queries. So they want to see the top Android games of January 2017. They want to see the best summer dishes 2017 for planning on cooking something.

They’re looking for something that is trendy. Maybe they’re looking around fashion, or you see this a lot in searches around hairstyle or anything that is grading products or services. Who are the best real estate agents in Seattle? No, no, not the 2012 edition of “Seattle Met Magazine.” I want 2017. Who are the best real estate agents in Seattle 2017? So they’ll add this year on there. What’s great is that, because this year or month only exists once it arrives, those searches only happen as we get to that time period, so keyword research won’t expose them to you ahead of time. So if, for example, let’s make this 2017, so “best web designs 2017.”

We’re filming this Whiteboard Friday in January. There have only just started to be a few searches for “best web designs 2017.” There have only just started to be a few searches for any keyword that includes the word 2017, because 2017 has just started. Therefore, your competitors are not seeing those in their keyword research lists. They’re not targeting them. There’s not a ton of content out there yet, and so it’s easier. Even though the volume tends to be lower than the usual keyword — sometimes it’s higher, but usually lower — you will find it is vastly easier to rank for. And here’s the part that’s sort of beautiful: even though “best web designs” without the year has higher volume than “best web designs 2017,” if you’re using this tactic, it is often the case that Google will bias toward showing the more recent content, especially if there are lots of searches that get revised to include the temporal number or date.

That is awesome because it means that you can win twice. You can rank for this one, which, of course, the search volume for it will die off at the end of the year, but you might be able to rank for this one as well. If you keep that updated, and change it up, and add to it, retire the old one, move the old one over to an old URL, put the new one up at the new URL, or keep the same URL if you’re trying to build on top of the link authority that you’ve built to that URL, you can have some awesome ranking and traffic power.

The process

How do you do this?

1. Conduct keyword research using:

  • NON-date keywords – You want to conduct the keyword research without using the date. I’m going to start with non-date keywords. So if I’m in Keyword Explorer, or if I’m doing my keyword research in AdWords, or wherever I am, I would search for “best web design,” get a big list of my target keywords.
  • Last year + keywords – Then I would go look at last year numbers. For example, I would search for “2016 best web design” or “best web design 2016” or anything from my keyword export or list that includes years.
  • 2–3 years ago + keywords – I would go two and three years ago, so that I could get a sense for the volume that includes the year. I would also be looking for month at this time.

2. Use Google Trends and/or SimilarWeb/Jumpshot Trends to ID seasonality

Then I’d use Google Trends, or if you’re not a fan of Google Trends — they can be a little squirrely with some data — SimilarWeb and Jumpshot also have keyword trend data, at least at the head of the demand curve, that can be good, and try and identify some of that seasonality. If you see that there’s a high season that includes a particular month, that’s often an indication that month plus year could be there, and then you can go and look in here. I could add “May 2016 best web design” to see if there was actually search volume for just the May keyword.

3. Use Google SERPs to determine if the month/year tactic is popular or underserved in your niche

Then I’m going to use Google SERPs. I’m going to check the keyword difficulty of those SERPs, and I’d probably look to see how many different outlets are producing monthly or annual content. For annual content, it’s really going to be very January- and February-centric. That’s when it all gets produced. Then, if it is underserved, that means there’s more opportunity there, but, even still, it’s almost always a lower difficulty, easier to get in there.

4. Target and create that timely content

So, to do that, you’re going to be using:

  • Recent data. If I were creating a page to target “best web design 2017,” I would want to use designs that have come out in the last month only or maybe just at the very end of 2016.
  • Employ emerging trends and language. So maybe it’s PWAs, maybe it is language around clean design, whatever the trends in the field are right now.
  • Serve the recency of the searcher’s intent by showing right up front that my data and my information are very recent and that I’m helping them with what’s going on now, not just historically.

5. Publish as early in the period as possible

You want to publish this content as early as you can in the period without doing it earlier. So what I don’t want to do is have my launch be in December 2016. December is a very quiet period anyway. It’s tough to get traction and attention, it’s tough to build links, but it can also be the case that you won’t trigger the search engines’ recency systems. Google has an algorithm called QDF, query deserves freshness, and so if they see that you’re producing that content a month or two before it’s actually the right date, there’s going to be skepticism, both from users who might stumble upon it or find it and from the engines. So you want to publish early in the period, but not any earlier than that.

With this tactic, yeah, you can hack your way to some pretty awesome traffic. I look forward to hearing from all of you who’ve done this, who are trying it, and hear your experiences. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

p.s. For some great examples of this in action, check out KeywordKeg’s list of 2017 phrases.



A Guide to JSON-LD for Beginners

Posted by alexis-sanders

What is JSON-LD?

JSON-LD stands for JavaScript Object Notation for Linked Data, which consists of multi-dimensional arrays (think: list of attribute-value pairs).

It is an implementation format for structuring data analogous to Microdata and RDFa. Typically, in terms of SEO, JSON-LD is implemented leveraging the Schema.org vocabulary, a joint effort by Google, Bing, Yahoo!, and Yandex in 2011 to create a unified structured data vocabulary for the web. (However, Bing and other search engines have not officially stated their support of JSON-LD implementations of Schema.org.)

JSON-LD is considered to be simpler to implement, due to the ability to simply paste the markup within the HTML document, versus having to wrap the markup around HTML elements (as one would do with Microdata).

What does JSON-LD do?

JSON-LD annotates elements on a page, structuring the data, which can then be used by search engines to disambiguate elements and establish facts surrounding entities; this, in turn, contributes to a more organized, better web overall.

Figure 1 – A conceptual visualization of JSON-LD taking the unstructured content on the web, annotating, and structuring the content to create an organized, structured result.

Where in the HTML (for a webpage) does JSON-LD live?

Google recommends adding JSON-LD to the <head> section of the HTML document; however, it’s okay if the JSON-LD is within the <body> section. Google can also read JSON-LD that is dynamically generated and injected into the DOM.

JSON-LD breakdown

The immutable tags (Think: You don’t need to memorize these, just copy/paste)

<script type="application/ld+json"> {

When you see JSON-LD, the first thing you should always see is a <script> tag. The <script> tag with a type attribute says, “Hey browser, I’m calling the JavaScript that contains JSON-LD.”

Pro Tip: Close every tag you open when you open it. Think: The salt goes with the pepper, and opening braces come with a closing brace.

Note: If your JSON-LD isn’t in the curly braces, it isn’t being parsed (i.e., curl it up).

"@context": "http://schema.org",

The second element that retains a permanent place in JSON-LD markup is the @context with the value of http://schema.org. The @context says, “Hey browser, this is the vocabulary I’m referencing. You can find it at http://schema.org.” The benefit for an SEO is that we get to use any of the item types and item properties that Schema.org defines.

Additionally, you’re probably noticing that cute, eyelash-like comma at the end of the statement. Commas mean “There’s more. Don’t stop parsing the data.”

Pro Tip: Mind your commas (and always check in Google’s Structured Data Testing Tool). Commas are a traditional sore spot for many programmers and JSON-LD offers no solace here. Missed commas mean invalid markup.

"@type": "…",

The final element in the JSON-LD Schema copy/paste squad is the @type specification (from here on, the rest of the markup is your own data annotation). @type specifies the item type being marked up. You can find a comprehensive list of all item types at: https://schema.org/docs/full.html.

In the example below, the @type says, “Hey, I’m using the Person item type (You can find it at http://schema.org/Person).” Indeed, if you type the URL into the browser, the item type’s documentation and technical specifications should appear, including any item properties (and often some example use cases).
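
Here’s a minimal sketch of what such a Person markup can look like (the property values are illustrative placeholders, not required fields):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Person",
  "name": "Jason Derulo",
  "jobTitle": "Singer",
  "url": "http://www.jasonderulo.com"
}
</script>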

@type for nesting: When you use a nested item type, you’re going to need to nest another @type (this is particularly important to understanding product and breadcrumb markups).

Attribute-value pairs

The next step is to annotate information about the item type. You can find item properties within the item type’s Schema.org page.

In terms of the syntax of JSON-LD, there are two important elements for each item property:

  1. Item Property – This comes from the Schema.org vocabulary and should always be in double straight quotation marks (it may sound pedantic here, but for real: curly and straight quotation marks are different, and curly ones will interfere with validation), and must belong to the properties allowed within the item type (as specified within Schema.org).
  2. Value – You insert your value here for the property. It’s vital the value aligns with the property and is singular (i.e., each value must be annotated separately. In the situation of multiple values for an item property, use square brackets). Strings (characters) and URLs need the “double straight quotation marks.” Numbers, integers, floats, or doubles (for the programming inclined) alone don’t need quotation marks, but it’s also okay to put them into quotations (this just means they’ll be considered a string data type).

Square brackets

Square brackets exist for situations where there are multiple values for an item property. A common use is leveraging the sameAs item property, using [square brackets] to list multiple social media profiles.

The square brackets below are saying, “There are multiple values for this item property; Jason Derulo has two given names.”

Note: There is no comma after the last element in the square brackets. This indicates that there is no more information within the square brackets.
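
Putting that together, a sketch of the markup being described might look like this (the given names and profile URLs here are illustrative placeholders):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Person",
  "name": "Jason Derulo",
  "givenName": ["Jason", "Joel"],
  "sameAs": [
    "https://twitter.com/jasonderulo",
    "https://www.facebook.com/jasonderulo"
  ]
}
</script>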

Nesting

Nesting is defined as where information is organized in layers, or where objects contain other objects. The image of a nesting doll is a common analogy, where large dolls contain smaller dolls within them, as a relational data organization visual.

Figure 2 – A nesting doll, as an analogy for relational, layered data.

Nesting is a vital aspect of accurately marking up Schema.org JSON-LD, because certain item properties belong to some item types and not to others. For example, below we can see the item property “name” can refer to the event name, the name of the performer, and the name of the venue. The name of the performer and venue are both nested.

Match the correct name item properties to the appropriate item type:

Nesting in JSON-LD starts with the item property. Within the first item type (e.g., Movie), you must first use the item property (e.g., actor, director, image). Once that item property is identified, open curly braces, declare the new item type (as indicated by the “@type”:), and add its attribute/value data.
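
For instance, a nested Movie markup along those lines might look like this (a minimal sketch with placeholder names):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Movie",
  "name": "Example Movie",
  "director": {
    "@type": "Person",
    "name": "Jane Placeholder"
  },
  "actor": {
    "@type": "Person",
    "name": "John Placeholder"
  }
}
</script>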

JSON-LD nesting checklist:

  • Must use the item property (specific to the item type)
  • The value lives in curly braces
  • You MUST identify the item type of that property
  • Attribute/value properties must be included (typically there are requirements for what needs to be included)
  • No comma before the closing curly bracket
  • Comma after closing curly bracket if there are more item properties (if not, it’ll be followed by a curly brace)

Pro Tip: Indent nested elements for readability.

Common use: Within the Product item type markup, Price is nested within an Offer item type and ratings are also nested!
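
A sketch of that common use (the values are placeholders; in practice they’d come from the product page itself):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "http://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": 4.5,
    "reviewCount": 24
  }
}
</script>

Note that ratingValue and reviewCount are bare numbers (no quotation marks needed), while price is given as a string here; per the attribute-value rules above, either form is acceptable.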

Pitfalls

If your markup isn’t validating in Google’s Structured Data Testing Tool and you’re unsure of what’s going on, check this list. Below are some major pitfalls in creating JSON-LD structured data markup.

  1. Syntax
    • Curly quotation marks (“ ”) are not the same as straight quotation marks (" "); the struggle is real
    • Mind your commas
      • Especially note the Structured Data Testing Tool’s little red “x” on the left-hand rail. Oftentimes the “x” will appear below a missing or extraneous comma
  2. Vocabulary
  3. Policy Violation
    • All annotated information must be on the page; adding information that is not on the page will likely not show in search results and is against Google guidelines
    • It is also against the rules to engage in manipulative practices (not a pitfall I’m worried about for you!)
    • Check/review Google’s Structured Data Policies
  4. Microsoft (sorry Bill, I’m still a huge fan!)
    • Copy/paste from Word/Excel can create issues (added quotation marks, added style formatting)
    • Microsoft programs switch straight quotation marks (" ") to curly ones (“ ”)
    • Solution: use an HTML editor

Process of adding JSON-LD to site

The process of creating JSON-LD structured data markup is dependent on one’s comfort with the Schema.org vocabulary and the JSON-LD syntax. Below outlines a process for a person newer to JSON-LD and Schema.org to create markups, while developing a deeper understanding of the vocabulary.

  1. Mentally answer:
    • What do you want to mark up?
      • Goal: Determine that you can mark up the content with the Schema.org vocabulary. Some things may make sense conceptually, but are not available within the vocabulary.
    • Why do you want to mark it up?
      • Goal: Determine whether there is a business case, or perhaps you’re looking to experiment. You don’t want to mark up content just for the sake of marking it up; you want to mark up content that will help search engines understand the most vital information on your page and maximize your ability to demonstrate that you are the best resource for users.
      • Look for resources on markups Google is supporting, how they are using them, and examples.
  2. If you’re using a markup that Google is explicitly using (i.e., resources on Google), open the specific documentation page and any relevant examples
    • Don’t feel like you have to create JSON-LD markup from scratch. Use Google’s examples to reverse-engineer your markups. (This isn’t to take away from your understanding of JSON-LD and the Schema.org vocabulary; however, no need to reinvent the wheel! #efficiency ☺).
  3. Open up the Schema.org item type page
    • Especially when you’re starting off with Schema.org, skimming the Schema.org technical documentation page to get a gist of what the item type entails, how many sites are using this markup, and its various properties can facilitate a better understanding as you continue along your structured data journey. After a while, this step might become necessary only when attempting a new markup or looking for a corner case.
  4. Copy/paste the immutable elements (i.e., from <script to “@type”:)
    • Save yourself time and mental energy. Another possibility here is to reverse-engineer an existing example, in which case these elements should already be present.
    • Occasionally in Google’s examples they’ll leave out the <script> tags, but please note that they are vital when the markup lives within the HTML document; the JSON-LD can’t be parsed without them.
  5. Add the item type you’re interested in marking up as the value of “@type”:
  6. List item properties and values
    • This step doesn’t require syntax and is more of a mental organization exercise. Concentrate on what you want to mark up — don’t worry about the nitty-gritty yet. Basically, you want to get your thoughts out before you start diving into the “how.”
    • Oftentimes you may have ideas about what you want to mark up, but may not necessarily know whether it’s possible within the vocabulary or how it’s nested.
  7. Add JSON-LD syntax, nesting where required/appropriate
    • The nitty-gritty step where you put everything into the syntax, nest it, and put markup together.
  8. Test with the Structured Data Testing Tool
    • Confirm that the structured data is validating and that all item properties are listed and accurate.
  9. Determine strategy for adding to the webpage

What have your experiences been so far with JSON-LD? Please share your questions and thoughts in the comments!



Reputation, Rankings, and Revenue: Navigating Local for Non-Technical People

Posted by MiriamEllis

Your local SEO agency needs new clients in 2017. Your department needs to convince management to earmark robust resources for local SEM this year. What if the only thing standing in your way is presentation?


In the 10+ years I’ve been consulting with local businesses, I’ve watched our industry grow to absorb an incredibly diverse set of disparate-seeming tasks. The breadth of the lingo alone is on the verge of becoming a dialect of its own. Here, supporting our Moz Local product, some of my internal communications with team members read like a code, packed with acronyms, abbreviations, and shorthand references that encapsulate large concepts which, while perfectly understood between local SEOs, would likely mean little to many CEOs or local business owners. In other words: shoptalk. Every industry has it.

The ability to codify and convey a complex concept by distilling it down to its essence is critical to the art of the pitch. Tell your new lead or your all-hands meeting that the company’s NAP is inconsistent on FB and YP, their DA is weak, and their owner responses are painfully MIA and watch their eyes glaze over. Today, I’d like to help you get meaningful attention by translating your local SEO work into 3 terms that almost any non-technical party will not only understand, but care about tremendously: reputation, rankings and revenue.

How to explain the main components of local SEO

1. Guideline compliance

Step One: Determine that the business qualifies as local via Google’s definition in their guidelines.

Step Two: Adhere to all guidelines to ensure that the business isn’t spamming Google. The same applies to other major local business data platforms.

How does it impact the 3 Rs?

This protects reputation, in that the business conducts itself in an above-board fashion and doesn’t come across as spammy to search engines or consumers. It protects rankings in that penalties are avoided. It protects revenue in that resources are not wasted on risky practices and funds are being devoted to appropriate forms of marketing for the business model; money and time aren’t being spent on dubious work that can fall apart at any moment.


2. Website

Step One: Develop a technically clean website with good UX for all users/devices. If the site already exists, audit it for problems/penalties and resolve them.

Step Two: Develop the best possible website content in the business’ geo-industry.

Step Three: Properly optimize the site for local search + organic search.

Step Four: Optimize for conversions. All four goals should be a simultaneous effort.

How does it impact the 3 Rs?

This protects reputation in that the website delivers excellent customer service and establishes the business as an authoritative resource. It protects rankings in that penalties and filters are avoided, excellent content rises in visibility, and both local and organic results are won and held. It protects revenue in that conversions are not being lost to unsatisfactory user experiences.


3. Citations

Step One: Audit the existing citation landscape and correct inconsistent, incomplete and duplicate listings.

Step Two: Ensure listings have been developed on core local business data platforms.

Step Three: Develop geo/industry-specific citations.

Step Four: Manage citations on an on-going basis to catch emerging inconsistencies/duplicates/third party edits.

Step Five: Seek out unstructured citation opportunities (news, blogs, etc.).

How does it impact the 3 Rs?

This protects reputation in that the business is accurately listed in consumers’ preferred places, establishing identity and professionalism — citations are a form of publication, and no business wants wrong information published about it. It protects rankings in that search engines’ trust in the validity of the business’ basic data is being augmented. It protects revenue in that transactions are not being lost due to the misdirection and frustration of consumers via inaccurate basic data around the web.


4. Reviews

Step One: Perfect and reinforce customer service policies and staff training.

Step Two: Implement a review acquisition strategy for key citation platforms and for the company website.

Step Three: Respond to reviews.

How does it impact the 3 Rs?

This protects reputation in that incoming customers derive trust from previous customers and the business’ reputation is being carefully managed from in-store service to online sentiment by the owner or agency department, including the improvement/resolution of negative sentiment via owner responses. It protects rankings by dint of surpassing competitors with a larger number of positive reviews on the major platforms. It protects revenue by winning trust-based transactions from new customers who are influenced by previous customers’ sentiment, while ensuring that neglect of negative sentiment or a simple lack of reviews isn’t turning potential consumers away. Actively managed reviews are one of the very best indicators of a responsive, reliable brand.


5. Links

Step One: Audit the existing link landscape for problem links and disavow or otherwise resolve them.

Step Two: Earn voluntary links via the publication and promotion of exceptional materials.

Step Three: Carefully seek out relevant link opportunities via safe methods such as local sponsorships, editorial contributions, or other vehicles on quality geo/industry sites.

How does it impact the 3 Rs?

This protects reputation in that the business is associating with the best-of-the-best and isn’t being lumped in by search engines or consumers with shady actors or practices. It protects the website’s rankings in that links are growing the brand’s renown over time, making it an active and visible competitor and proving its relevance to search engines. It protects the website’s revenue both in fostering traffic and conversions from new sources, and in utilizing allowed practices to safeguard against sudden plunges in visibility.


6. Social

Step One: Identify the social hubs preferred by your specific geo/industry consumers.

Step Two: Based on the culture of each platform, develop a policy and strategy for participation.

Step Three: Participate on these platforms in a spirit of sharing rather than selling.

Step Four: Given that Social is an extension of customer service, monitor all social accounts for consumer needs/complaints and enact your policy for resolution.

How does it impact the 3 Rs?

This protects reputation in that you are both contributing to and managing the online discussion of your brand, providing accessibility in a modern vein. It protects rankings in that some social results (like Twitter) will appear directly within the organic results of search engines like Google, establishing a sense of both company activity and consumer sentiment. It protects revenue in that neglected consumer sentiment does not lead to lost transactions or permanent negative reviews.


7. Offline

Step One: Recognize that anything that happens offline may be published online, whether this relates to company activity driving online content development or consumer in-store experiences driving online sentiment.

Step Two: Take whatever steps necessary to create a cohesive offline-to-online experience, including branding, messaging, signage, promotions, in-store apps or kiosks, and transactional support.

Step Three: Seek out real-world opportunities for establishing your brand as a community resource via traditional methods like print, radio, and television, as well as by participation in appropriate community organizations and events.

How does it impact the 3 Rs?

This protects reputation by cementing for consumers that they will enjoy a specific type of desired experience interacting with your brand, whether on the Internet or offline — it’s all about consistency, and it carries over into reviews. It protects rankings by creating the active, real-world company culture that contributes to both your own online publication strategy and the acquisition of third-party media mentions (online news, blogs, social, etc.). It protects revenue in that the most-desired end of the funnel of all of the above is the transaction, and today, most consumers will arrive at that moment via a combination of both on- and offline influences. By being present in what Google calls its four micro-moments, revenue is safeguarded and, ideally, improved.


8. Other media

Depending on the business’ industry, other forms of media may contribute directly to reputation, rankings, and revenue. This could include email marketing, video marketing, or app, tool, or widget development. In essence, these are specialized forms of content development and social promotion that will need to be built into marketing strategies wherever appropriate.


How much do they need to know?

I’m a firm believer in full transparency and thorough documentation of all work performed so that clients, teams, or bosses can see exactly what is being done, even if the technicalities aren’t perfectly understood by them. As you undertake the various tasks of local SEM, you’ll want to both fully detail the steps you are taking and use every available means for measuring their outcomes. That’s how you keep clients and keep your department funded.

But initially, when first presenting your proposed strategic outline, paring it down to finite goals may greatly improve your communication with industry outsiders, establishing common ground where you are seeing eye-to-eye with confidence. I have yet to meet a business owner who doesn’t instinctively sense the importance of his company’s reputation, rankings, and revenue, so rather than risk losing him with complex jargon at the outset, why not signal that you are on the same wavelength with the simplest terms possible?

As a fellow local search marketer, I know that you, too, have your livelihood wrapped up in the 3 Rs, and I’m wishing you a highly converting 2017!

