Archives for: seo

Fixing the Broken Culture of SEO Metrics – Whiteboard Friday

Posted by randfish

As SEO continues to evolve, the metrics that indicate success continue to change with it. However, many of our clients’ needs don’t seem to be changing as rapidly. With clients focused on specifics like the number of links they’re getting and weekly ranking reports, it’s tough to move the needle in the right direction for true SEO success.

How do we push other inbound channels (like search, content marketing, and social) forward to offer a more holistic and strategic approach to inbound marketing that our clients can get behind? In today’s Whiteboard Friday, Rand talks about the current broken culture of SEO metrics, and offers advice on what we can do to fix it. 

 

For your viewing pleasure, here’s a still image of the whiteboard used in this week’s video.

[Still image of the whiteboard: Fixing the Broken Culture of SEO Metrics]
Video Transcription

“Howdy SEOmoz fans. Welcome to another edition of Whiteboard Friday. This week, I want to share with you an experience I had and then get to our Whiteboard Friday topic, which is all about metrics and how we change this broken culture in the SEO world that’s sort of carried over from the past.

I got to go to SMX Sydney, which was an incredible time and an amazing visit, and I spoke there with Dan Petrovic from Dejan SEO, who is a well-known SEO guy in Australia, very, very smart guy, leads an agency down there. He asked me some questions that I think are very important and resonated with me because they’re things that I’ve heard from a lot of people and seen reflected in a lot of the questions that we get all the time.

That was: “Rand, I want to do more of this broader inbound marketing. I want to get more strategic about the way I help people with SEO. I want to get less focused on things like the number of links I send you and your particular ranking report for a week. But these are things that our clients care about. When we talk specifically with clients and we pitch them on SEO, they tell us, ‘Hey, look, you’re not here for that. You’re here to get me more links. I want this many links and I want these rankings. I want my PageRank to go up. I want my DA/PA to go up.’”

Those kinds of metrics have been ingrained as what SEO is all about, and tragically that’s not the way to be successful at our jobs. The way that we really move the needle on search, on social, on content marketing, on any of these inbound channels is to have a holistic and strategic focus on them, not this little tactical, rinky-dink, “I’m going to get 50 links. That’s going to move this one ranking up.” We know this. We’ve been talking about it for a long time here on Whiteboard Friday and across the SEO world. You can find it on nearly every reputable SEO blog out there.

So Dan and I were chatting and I said, “Well, I think what we have to do is take that conversation a level higher and say, ‘What do you want those metrics to accomplish? Why do you want links? Why do you want your rankings higher?'” The answer is often, “Well, we’re trying to attract more traffic and expose people to this new branding campaign,” or, “We’re trying to get more people signed up for this webinar. We’re trying to get more people in our salespeople’s funnel. We’re trying to convert more leads to perform these types of comparison searches and then buy from one of our partners.”

Okay, good. That is getting us all the way down from what I call “leading indicator metrics” to the business KPIs. Business KPIs, the things that indicate the performance of the business, are where we should take our strategic initiative, our strategic lead, for any sort of online marketing effort, whether that’s SEO, PPC, or advertising. I don’t care what it is that you’re spending money on, it should be focused on this, centered on this, trying to achieve these things. Then, yes, we can use metrics like links and rankings, even something like PageRank or crawl depth, as leading indicators, performance indicators that things are maybe going the right way or that they’re not. We can compare them against our competition, and they’re fine metrics for that. We just can’t focus on them as where we take our strategy.

If the strategy is “go get me more links,” I’m probably going to do some gray or black hat SEO because very frankly, that’s how you move the needle on that one indicator. If you don’t care about potentially getting banned or hurting your brand impression or making a bad impression with the search engines and eventually getting into trouble that kind of way, then, yeah, you’re going to do stuff that is non-ideal for your business metrics. So let’s have this conversation first.

I’m going to start down here. Business KPIs, things that I think about as being business metrics, and these are just a sample. I don’t want you to get the idea that these are the only metrics or that these have to fit in these buckets. But in this purple bucket down here, I have things like conversions. Conversions might even be a marketing KPI for you, depending on what your true business goals are. But transaction value, lifetime customer value, retention of those customers and recidivism of customers, those are the business KPIs, typically, in most organizations. They’re trying to get people to the site, perform some type of action that will lead to revenue, lead to a goal being accomplished.

Marketing KPIs, these are one step up, but not yet at the level of the SEO leading indicators. These are things like visits and traffic, tweets, shares, and +1’s. Those are signals of engagement and success over social media, as are followers and fans, and some of these, like tweets, shares, and +1’s, could easily sit in leading indicators rather than marketing KPIs. Brand mentions and pre-conversion actions belong here too. So people, for example, visiting pages that lead to a conversion on your site and following through that funnel you’ve got set up on your site, those are the types of marketing KPIs that the marketing team might be reporting and that you, particularly if you’re doing any type of consulting work or if you’re working in-house and trying to help move the needle, want to have on a dashboard in front of you.

Then those leading indicators, those are much more of a, “Hey, I think this is a signal that we might be on the right path,” or, “This is a test. Let’s see if moving the needle on links actually moves the needle on these other things that we care about and these business metrics that we care about,” or, “Boy, you know, sometimes it seems like it doesn’t.” Sometimes it seems like other things that we might focus on, perhaps social is really moving the needle, because you’re finding that you’re having a huge brand impact that’s biasing clicks in the search results, that’s moving you up in positions through usage and user data types of algorithms, and that’s really doing a much better job for you than raw links and raw rankings.

Maybe you’re expanding your portfolio of content, and that’s what’s moving the needle for you. You could easily put things like content production in here. You could put that in a leading indicator, or you could put it in a marketing KPI. You could put content engagement, things like comments or registrations. Those could fit into marketing KPIs. It’s okay to have different things in these different buckets. Just know what they are and make sure if you’re working with someone, that you’re getting the right answers here so that you can make the right decisions here.

Don’t focus on these. If you focus on these from a strategic point of view, your tactics are probably going to lead you in the wrong direction, and, by the way, those of you who might be buying consulting services or hiring an in-house SEO or an in-house marketing team and having them focus on this stuff, you’re really going to be misleading your marketers, and they’re going to be focused on the wrong kinds of things that aren’t going to move the needle for the business. They need to be up here.

Let me show you in a more precise fashion how I love to see this visualized and illustrated, how I love to see this done. We actually do this right now at Moz. We’ve got an internal tool that does some of this stuff, and then we have a big Google docs spreadsheet that I would love to make more sophisticated, and we probably will after we release some of the big, new things we’re working on here. But basically, there are three categories up in this leading indicators column that I pay attention to, and those are things like I want to look at the leading indicators, whatever they are, and compare them versus my budget and my goals.

So I might have, okay, this was our goal, and we are +x over that goal. This is our goal and we’re -y against this goal, and this is our other goal, where we’ve got +c. Then I compare against this time last year, Q1 2012. For Q1 of 2013, January 1st to April 1st, here’s what we’ve done so far, and here’s how far ahead we are of where we were this time last year, of what we did in Q1 of last year. I like doing this because seasonality plays a big role in many, many businesses, not every one but many, many businesses. So comparing year over year is really healthy for this.

Then compare versus the competition. The wonderful thing about leading indicators, and often one of the big reasons a lot of folks use them, is that we can compare. We can see where our competitors are ranking. We can see what sort of links they’re getting. We can see their DA and PA. Maybe we can’t see their crawl rate and depth, but those other sorts of leading indicators, even things like tweets and shares and +1’s, followers and fans, those indicators we can put in here, and we can compare against our competition.

Then we get down a layer. I would encourage you to keep the top layer; we care about it and it’s interesting, but it’s not the focus. It’s just a leading indicator. When we get to the marketing KPIs, we’ve got, again, budget, year over year, and competition. Then when we go to the business KPIs, we almost never can get competition, the data on what the competition’s doing, so we just have budget and year over year. But being able to see this, being able to visualize this, and it doesn’t necessarily have to be in this funnel view, but being able to see this and compare, and then to show your clients, your managers, your team members what you’re doing and how that stacks up against what the business is trying to accomplish, this is incredibly powerful. It’s so much more powerful than saying, “I want links and rankings.”

If you’re hearing from folks, “I want links and rankings,” please have them watch this whiteboard video, have them leave comments, have them e-mail me. My goodness, I don’t think that this is going to be how successful SEO gets done in the future. This is how tactical SEO was done in the past, and, unfortunately, it’s how a lot of black and gray hat SEO became the norm – well, I don’t want to say “the norm” – but became very popular in our world. By focusing on bigger things, we can be smarter. We can accomplish a lot more.

All right everyone, look forward to your comments, and we will see you again next week for another edition of Whiteboard Friday.”

Video transcription by Speechpad.com




Dusting The Website For Spring: Optimization and SEO Cleaning

Posted by scottwyden

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of SEOmoz, Inc.

Spring is here, so why not do a little spring cleaning on your website?

With that said, I’m going to get straight to the action items. Below is a list of many things you can look for on your site to adjust, add or remove.

On-site dusting

The following are things you can do on your website to improve your presence on the web.

Authorship: This is an extremely important SEO action item. Google authorship’s role in your website’s rankings is more important than ever. In addition to being an SEO tactic, it also acts as a conversion tactic: it has been estimated that SERP listings with authorship enabled receive around 120% more clicks.

+1 button: Going along with the Google Plus topic, make sure that you have a +1 button somewhere on your pages and/or posts. Why? Well, Google’s Author Rank is using +1 data as part of ranking. So in addition to the marketing benefits of engagement, a simple +1 can help your website rank higher for keywords.

Pinterest button: Hopefully you are on Pinterest and pinning, liking and commenting with your customers. If not, hop on it. Then also add the Pin button to your website’s pages and/or posts. People love to pin things, so make it easier for them.

Metadata: If you’re using a WordPress theme similar to ProBlogger and Copyblogger (they’re running Genesis), then there are SEO meta settings for every page and post on your website. I personally don’t recommend using a theme’s SEO settings. In fact, I much prefer WordPress SEO by Yoast – mentioned in this article on WordPress plugins. My reasoning is that, in addition to providing meta functionality, it incorporates Open Graph for social media and an amazing page analysis feature.

Open Graph: I might as well move on to Open Graph since I mentioned it above. Open Graph is what lets specific titles, descriptions, and images show up on websites like Facebook and Google Plus. By using the same WordPress SEO by Yoast plugin, you can specify the image and description for each page and post.

Site speed: How fast does your website load? Could it be faster? Use the Google Speed Test tool to find what is slowing it down. Sometimes simple changes can improve your load speed; sometimes it requires development work. I like to tell people that if you’re over a score of 70 you’re doing well, but to aim for the mid-90s. Page speed helps both SEO and user experience, so don’t overlook it. Sometimes a slow site is simply due to the shared hosting that so many websites use.

Broken links: Log into your Google and Bing Webmaster Tools accounts and look at any 404 errors that show up. Fix them by creating new pages in their place or redirecting them to another page using 301 redirects. (WordPress users can use this plugin.) If the list is long, a quick script like the one below can do the first pass for you.
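
Purely as an illustration, here is a minimal Python sketch for that first pass. It assumes the requests library is installed, and the URLs are placeholders; in practice you would paste in the list exported from Webmaster Tools.

    import requests  # pip install requests

    # URLs reported as errors in Google/Bing Webmaster Tools (placeholders here).
    urls = [
        'http://example.com/old-portfolio-page',
        'http://example.com/blog/spring-cleaning-tips',
    ]

    for url in urls:
        try:
            response = requests.head(url, allow_redirects=True, timeout=10)
            status = response.status_code
        except requests.RequestException as exc:
            status = 'request failed: %s' % exc

        if status == 404:
            print('%s -> 404: create a replacement page or add a 301 redirect' % url)
        else:
            print('%s -> %s' % (url, status))

Anything the script flags gets either a replacement page or a 301 to the closest relevant page.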

Index status: Are you preventing important pages from being indexed by search engines? To find out, go to Google and do a search like “site:seomoz.org”. A search like that will bring up every page that Google has indexed on the domain listed. Dr. Pete shared a fantastic article on the SEOmoz blog that covers all the important search queries you can use to see what content is indexed. The article also has queries that can help with guest post research. [bonus!] It’s also worth mentioning that you can log in to your Google Webmaster Tools panel to view all the URLs indexed by Google. Take advantage of that tool!

User experience: To keep people on your website you need to make sure that visitors can browse it easily. As a photographer, I use a related posts plugin that shows other photographs that viewers might be interested in. That technique, used similarly on many major publications like Mashable and Engadget, is a great user experience tip to keep people browsing your website.

Off-site dusting

The following are things you can do off your website to improve your presence on the web.

Be social: Author Rank isn’t 100% confirmed yet, but it’s an inevitable ranking system that will soon be on Google, and a form of it will wind up on Bing eventually. To sum up Author Rank: it’s where Google takes a combination of its core search algorithm and PageRank and combines it with the social activity around your website: for example, your comments on and off your site, the +1s your content receives, and the content that you +1.

Guest post: Writing a guest post on websites like SEOmoz does many things for your presence. The two most common benefits are the relationships you build in the process and the ability to create a backlink to your website, which helps with SEO.

Final Note

Sometimes dusting your website requires a fresh look. Do some searching on Google for business WordPress themes and find one that suits your needs and provides that fresh look that you’ve always wanted. Maybe it’s a Genesis theme or something else. Either way, change can be good at times.

So grab your dustpan and an extra cup of coffee, sit down for a few hours, and start cleaning up for springtime, both on- and off-site. It’s spring, so start fresh.




Machine Learning and Link Spam: My Brush With Insanity

Posted by wrttnwrd

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of SEOmoz, Inc.


Know someone who thinks they’re smart? Tell them to build a machine learning tool. If they majored in, say, History in college, within 30 minutes they’ll be curled up in a ball, rocking back and forth while humming the opening bars of “Oklahoma.”

Sometimes, though, the alternative is rooting through 250,000 web pages by hand, checking them for compliance with Google’s TOS. Doing that will skip you right past the rocking-and-humming stage, and launch you right into the writing-with-crayons-between-your-toes phase.

Those were my two choices six months ago. Several companies came to Portent asking for help with Penguin/manual penalties. They all, for one reason or another, had dirty link profiles.

Link analysis, the hard way. Back when I was a kid…

I did the first link profile review by hand, like this:

  1. Download a list of all external linking pages from SEOmoz, MajesticSEO, and Google Webmaster Tools.
  2. Remove obviously bad links by analyzing URLs. Face it: if a linking page is on a domain like “FreeLinksDirectory.com” or “ArticleSuccess.com,” it’s gotta go.
  3. Analyze the domain and page TrustRank and TrustFlow. Throw out anything with a zero, unless it’s on a list of ‘whitelisted’ domains.
  4. Grab thumbnails of each remaining linking page, using Python, Selenium, and PhantomJS. You don’t have to do this step, but it helps if you’re going to get help from other folks (there’s a sketch of it right after this list).
  5. Get some poor bugger (that is, a faithful Portent team member) to review the thumbnails, quickly checking off whether they’re forums, blatant link spam, or something else.
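
For the curious, step 4 can look roughly like the sketch below. Treat it purely as an illustration: it assumes an older Selenium release (PhantomJS support was removed in Selenium 4), a phantomjs binary on your PATH, and made-up URLs and output paths.

    import os
    from selenium import webdriver  # assumes selenium < 4

    def grab_thumbnails(urls, out_dir='thumbs'):
        os.makedirs(out_dir, exist_ok=True)
        driver = webdriver.PhantomJS()       # headless browser
        driver.set_window_size(1024, 768)    # consistent thumbnail size
        try:
            for i, url in enumerate(urls):
                try:
                    driver.get(url)
                    driver.save_screenshot(os.path.join(out_dir, '%06d.png' % i))
                except Exception as exc:
                    print('skipped %s: %s' % (url, exc))
        finally:
            driver.quit()

    grab_thumbnails(['http://example.com/some-linking-page'])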

After all of that prep work, my final review still took 10+ hours of eye-rotting agony.

There had to be a better way. I knew just enough about machine learning to realize it had possibilities, so I dove in. After all, how hard can it be?

Machine learning: the basic concept

The concept of machine learning isn’t that hard to grasp:

  1. Take a large dataset you need to classify. It could be book titles, people’s names, Facebook posts, or, for me, linking web pages.
  2. Define the categories. In this case, I’m looking for ‘spam’ and ‘good.’
  3. Get a collection of those items and classify them by hand. Or, if you’re really lucky, you find a collection that someone else classified for you. The Natural Language Toolkit, for example, has a movie reviews corpus you can use for sentiment analysis. This is your training set.
  4. Pick the right machine learning tool (hah).
  5. Configure it correctly (hahahahahahaha heee heeeeee sniff haa haaa… sorry, I’m ok… ha ha haaaaaaauuuugh).
  6. Feed in your training set, with the features — the item attributes used for classification — pre-selected. The tool will find patterns, if it can (giggle).
  7. Use the tool to compare each item in your dataset to the training set.
  8. The tool returns a classification of each item, plus its confidence in the classification and, if it’s really cool, the features that were most critical in that classification.

If you ignore the hysterical laughter, the process seems pretty simple. Alas, the laughter is a dead giveaway: these eight steps are easy the same way “Fly to moon, land on moon, fly home” is three easy steps.
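
To make the steps a little more concrete, here is a minimal sketch of steps 3 through 8 using the NLTK movie reviews corpus mentioned above as a ready-made training set (‘pos’ and ‘neg’ stand in for ‘good’ and ‘spam’). The word-presence features are the standard NLTK-book approach, not anything specific to my tool.

    import random
    import nltk
    from nltk.corpus import movie_reviews

    # nltk.download('movie_reviews')  # uncomment on first run

    # Step 3: a hand-classified collection someone else built for us.
    documents = [(list(movie_reviews.words(fileid)), category)
                 for category in movie_reviews.categories()
                 for fileid in movie_reviews.fileids(category)]
    random.shuffle(documents)

    # Step 6: pre-select features -- presence of the 2,000 most common words.
    word_freq = nltk.FreqDist(w.lower() for w in movie_reviews.words())
    top_words = [w for w, _ in word_freq.most_common(2000)]

    def document_features(words):
        word_set = set(words)
        return {'contains(%s)' % w: (w in word_set) for w in top_words}

    featuresets = [(document_features(words), category)
                   for words, category in documents]
    train_set, test_set = featuresets[200:], featuresets[:200]

    # Steps 4, 5, and 7: pick a tool (Naive Bayes) and let it find patterns.
    classifier = nltk.NaiveBayesClassifier.train(train_set)

    # Step 8: classifications, accuracy, and the most decisive features.
    print(nltk.classify.accuracy(classifier, test_set))
    classifier.show_most_informative_features(10)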

Note: At this point, you could go ahead and use a pre-built toolset like BigML, Datameer, or Google’s Prediction API. Or, you could decide to build it all by hand. Which is what I did. You know, because I have so much spare time. If you’re unsure, keep reading. If this story doesn’t make you run, screaming, to the pre-built tools, start coding. You have my blessings.

The ingredients: Python, NLTK, scikit-learn

I sketched out the process for IIS (Is It Spam, not Internet Information Server) like this:

  1. Download a list of all external linking pages from SEOmoz, MajesticSEO, and Google Webmaster Tools.
  2. Use a little Python script to scrape the content of those pages.
  3. Get the SEOmoz and MajesticSEO metrics for each linking page.
  4. Build any additional features I wanted to use. I needed to calculate the reading grade level and links per word, for example. I also needed to pull out all meaningful words, and a count of those words.
  5. Finally, compare each result to my training set.
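
Step 4 is mostly grunt work. The sketch below shows the flavor of it, assuming BeautifulSoup for the HTML cleanup; the syllable counter is deliberately crude, and the function names are mine, purely for illustration.

    import re
    from bs4 import BeautifulSoup  # pip install beautifulsoup4

    def count_syllables(word):
        # Crude estimate: count runs of vowels, with a minimum of one.
        return max(1, len(re.findall(r'[aeiouy]+', word.lower())))

    def page_features(html):
        soup = BeautifulSoup(html, 'html.parser')
        text = soup.get_text(' ')
        words = re.findall(r"[A-Za-z']+", text)
        sentences = [s for s in re.split(r'[.!?]+', text) if s.strip()]

        n_words = max(1, len(words))
        n_sentences = max(1, len(sentences))
        n_syllables = sum(count_syllables(w) for w in words)

        return {
            'links_per_word': len(soup.find_all('a')) / n_words,
            # Flesch-Kincaid grade level formula
            'reading_grade': 0.39 * (n_words / n_sentences)
                             + 11.8 * (n_syllables / n_words) - 15.59,
            'words_per_page': n_words,
            'syllables_per_page': n_syllables,
            'characters_per_page': len(text),
        }

    print(page_features('<p>Short spammy page. Buy links here.</p><a href="#">links</a>'))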

To do all of this, I needed a programming language, some kind of natural language processing (to figure out meaningful words, clean up HTML, etc.) and a machine learning algorithm that I could connect to the programming language.

I’m already a bit of a Python hacker (not a programmer – my code makes programmers cry), so Python was the obvious choice of programming language.

I’d dabbled a little with the Natural Language Toolkit (NLTK). It’s built for Python, and would easily filter out stop words, clean up HTML, and do all the other stuff I needed.

For my machine learning toolset, I picked a Python library called scikit-learn, mostly because there were tutorials out there that I could actually read.

I smushed it all together using some really-not-pretty Python code, and connected it to a MongoDB database for storage.

A word about the training set

The training set makes or breaks the model. A good training set means your bouncing baby machine learning program has a good teacher. A bad training set means it’s got Edna Krabappel.

And accuracy alone isn’t enough. A training set also has to cover the full range of possible classification scenarios. One ‘good’ and one ‘spam’ page aren’t enough. You need hundreds or thousands to provide a nice range of possibilities. Otherwise, the machine learning program will stagger around, unable to classify items outside the narrow training set.

Luckily, our initial hand-review reinclusion method gave us a set of carefully-selected spam and good pages. That was our initial training set. Later on, we dug deeper and grew the training set by running Is It Spam and hand-verifying good and bad page results.

That worked great on Is It Spam 2.0. It didn’t work so well on 1.0.

First attempt: fail

For my first version of the tool, I used a Bayesian Filter as my machine learning tool. I figured, hey, it works for e-mail spam, why not SEO spam?

Apparently, I was already delirious at that point. Bayesian filtering works for e-mail spam about as well as fishing with a baseball bat. It does occasionally catch spam. It also misses a lot of it, dumps legitimate e-mail into spam folders, and generally amuses serious spammers the world over.

But, in my madness, I forgot all about these little problems. Is It Spam 1.0 seemed pretty great at first. Initial tests showed 75% accuracy. That may not sound great, but with accurate confidence data, it could really streamline link profile reviews. I was the proud papa of a baby machine learning tool.

But Bayesian filters can be ‘poisoned.’ If you feed the filter a training set where 90% of the spam pages talk about weddings, it’s possible the tool will begin seeing all wedding-related content as spam. That’s exactly what happened in my case: I fed in 10,000 or so pages of spammy wedding links (we do a lot of work in the wedding industry). On the next test run, Is It Spam decided that anything matrimonial was spam. Accuracy fell to 50%.

Since we tend to use the tool to evaluate sites in specific verticals, this would never work. Every test would likely poison the filter. We could build the training set to millions of pages, but my pointy little head couldn’t contemplate the infrastructure required to handle that.

The real problem with a pure Bayesian approach is that there’s really only one feature: The content of the page. It ignores things like links, page trust and authority.

Oops. Back to the drawing board. I sent my little AI in for counseling, and a new brain.

Note: I wouldn’t have figured this out without help from SEOmoz’s Dr. Pete and Matt Peters. A ‘hat tip’ doesn’t seem like enough, but for now, it’ll have to do.

Second attempt: a qualified success

My second test used logistic regression. This machine learning model uses numeric data, not text. So, I could feed it more features. After the first exercise, this actually wasn’t too horrific. A few hours of work got me a tool that evaluates:

  • Page TrustFlow and CitationFlow (from MajesticSEO – I’m adding SEOmoz metrics now)
  • Links per word
  • Page Flesch-Kincaid reading grade level
  • Page Flesch-Kincaid reading ease
  • Words per page
  • Syllables per page
  • Characters per page
  • A few other seemingly-random bits, like images per page, misspellings, and grammar errors

This time, the tool worked a lot better. With vertical-specific training sets, it ran with 85%+ accuracy.
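
For anyone wondering what “feeding it more features” looks like in practice, here is a minimal scikit-learn sketch. The columns mirror the list above, but every number and label is invented for illustration; a real training set has hundreds or thousands of hand-labelled rows.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Columns: trust_flow, citation_flow, links_per_word, reading_grade,
    #          reading_ease, words, syllables, characters
    X_train = np.array([
        [42, 38, 0.02, 9.1, 62.0, 850, 1300, 5200],   # hand-labelled good page
        [ 2,  5, 0.35, 4.2, 88.0, 120,  150,  700],   # hand-labelled spam page
        # ...hundreds more labelled rows in a real training set
    ])
    y_train = np.array([0, 1])  # 0 = good, 1 = spam

    model = make_pipeline(StandardScaler(), LogisticRegression())
    model.fit(X_train, y_train)

    X_new = np.array([[10, 12, 0.20, 5.5, 80.0, 300, 400, 1800]])
    print(model.predict(X_new))        # predicted class
    print(model.predict_proba(X_new))  # confidence in each class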

In case you’re wondering, this is what victory looks like:

[Image: this is what victory looks like]

When I tried to use the tool for more general tests, though, my coded kid tripped over its big, adolescent feet. Some of the funnier results:

  • It saw itself as spam.
  • It thought Rand’s blog was a swirling black hole of spammy despair.

False positives remain a big problem if we try to build a training set outside a single vertical.

Disappointing. But the tool chugs along happily within verticals, so we continue using it for that. We build a custom training set for each client, then run the training set against the remaining links. The result is a relatively clear report:

[Image: Excel report]

Results and next steps

With little IIS learning to walk, we’ve cut the brute-force portion of large link profile evaluations from 30 hours to 3 hours. Not. Too. Shabby.

I tried to launch a public version of Is It Spam, but folks started using it to do real link profile evaluations, without checking their results. That scared the crap out of me, so I took the tool down until we cure the false positives problem.

I think we can address the false positives issue by adding a few features to the classification set:

  1. Bayesian filtering: Instead of depending on a Bayesian classification as 100% of the formula, we’ll use the Bayesian score as one more feature (there’s a sketch of this after the list).
  2. Grammar scoring: Anyone know a decent grammar testing algorithm in Python? If so, let me know. I’d love to add grammar quality as a feature.
  3. Anchor text matters a lot. The next generation of the tool needs to score the relevant link based on the anchor text. Is it a name (like in a byline)? Or is it a phrase (like in a keyword-stuffed link)?
  4. Link position may matter, too. This is another great feature that could help with spam detection. It might lead to more false positives, though. If Is It Spam sees a large number of spammy links in press release body copy, it may start rating other links located in body copy as spam, too. We’ll test to see if the other features are enough to help with this.
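
Idea #1 above amounts to feature stacking. Here is a toy sketch of how it could be wired together in scikit-learn; all the data is invented, and in practice you would generate the Bayesian scores with cross-validation rather than scoring the same rows the text model was trained on.

    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.naive_bayes import MultinomialNB

    # Toy data: page text, numeric metrics, and hand labels (1 = spam).
    texts = ['cheap wedding favors links directory submit url',
             'a long editorial piece about planning an outdoor venue']
    X_numeric = np.array([[2.0, 0.35],     # e.g. [trust_flow, links_per_word]
                          [42.0, 0.02]])
    y = np.array([1, 0])

    # The text-only Bayesian model from version 1.0...
    vectorizer = CountVectorizer()
    bayes = MultinomialNB().fit(vectorizer.fit_transform(texts), y)

    # ...now contributes its spam probability as just one more column.
    bayes_score = bayes.predict_proba(vectorizer.transform(texts))[:, 1]
    X_combined = np.hstack([X_numeric, bayes_score.reshape(-1, 1)])

    # The final classifier sees the numeric metrics *and* the Bayesian score.
    final_model = LogisticRegression().fit(X_combined, y)
    print(final_model.predict_proba(X_combined))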

If I’m lucky, one or more of these changes may yield a tool that can evaluate pages across different verticals. If I’m lucky.

Insights

This is by far the most challenging development project I’ve ever tried. I probably wore another 10 years’ enamel off my teeth in just six weeks. But it’s been productive:

  1. When you start digging into automated page analysis and machine learning, you learn a lot about how computers evaluate language. That’s awfully relevant if you’re a 21st Century marketer.
  2. I uncovered an interesting pattern in Google’s Penguin implementation. This is based on my fumbling about with machine learning, so take it with a grain of salt, but have a look here.
  3. We learned that there is no such thing as a spammy page. There are only spammy links. One link from a particular page may be totally fine: For example, a brand link from a press release page. Another link from that same page may be spam: For example, a keyword-stuffed link from the same press release.
  4. We’ve reduced time required for an initial link profile evaluation by a factor of ten.

It’s also been a great humility-building exercise.




Transmedia Storytelling: Building Worlds For and With Fans

Posted by gfiorelli1

[Image: I believe in Harvey Dent Too]

John loves Batman.

He’s collected comics since he was nine years old, is proud of owning the first edition of Gotham by Gaslight, and still remembers the afternoons spent watching the TV series with Adam West at home when he was a kid.

Obviously, he has seen all the movies. The Dark Knight was a masterpiece; the I believe in Harvey Dent web campaign and the Joker counter site were pure genius, as were all the other sites created for that movie.

The passion John has for Batman is such that he could not resist, and ended up buying some action figures, Batman Arkham City, and the Batman Lego video games, too. But John is especially proud of the two short stories he wrote and published in Fan Fiction.

Seriously, John loves the universe of the Batman stories.

This universe is the consequence of the most complex and exciting expression of our culture nowadays: Transmedia Storytelling.

What is Transmedia Storytelling?

Henry Jenkins presented the idea of Transmedia Storytelling for the first time in 2003 in the MIT Technology Review magazine. The idea can be defined as a story that unfolds across multiple media outlets and platforms, and in which a portion of the end users take an active role in the process of expansion.

Transmedia Storytelling tends to be mistaken for Cross Media, which can be defined as one story – rather than different ones pertaining to the same narrative universe – narrated through different media channels. That said, nowadays the two terms are used almost as if they were synonyms.

In the same semantic galaxy, we can find concepts like:

The characteristics of Transmedia Storytelling

Spreadability vs. drillability


Spreadability refers to the expansion of a narrative through viral practices in social networks and on the Web. Drillability is the work the producer does to penetrate the audience and find the core of his work’s fan base: the people who will disseminate and expand the transmedia productions.

[Image: Duff beer as an example of inverse product placement]

Continuity vs. multiplicity

Transmedia worlds must have continuity across the different languages and platforms that are used. For example, we expect Indiana Jones to behave the same way in a video game as in a movie. This continuity is complemented by multiplicity, which is the creation of seemingly incoherent narrative experiences related to the original narrative world. The Gotham by Gaslight comic cited before is a good example of multiplicity.

Immersion vs. extractability

Transmedia stories almost always offer their consumers the opportunity to immerse themselves in their worlds (the games inspired by a series, film, or novel are one example). On the other hand, gadgets and merchandise allow us to extract elements of the story and bring them into our everyday lives. A special type of extractability is reverse product placement (for instance, the Duff beer of The Simpsons).

World building

Like any other form of storytelling, Transmedia Storytelling presents a narrative world, which requires the suspension of disbelief by the user. This suspension is more effective the more detailed the narrative world is. Details like Sherlock Holmes playing a Stradivarius or snorting a line of cocaine are what give verisimilitude to the narrative world. Inception, in a way, captured how Transmedia professionals see themselves: as world-building architects.

Seriality

Transmedia storytelling is dispersive and by its nature tends to be serial. The seriality is not linear, but becomes a hypertextual network.

Subjectivity

In Transmedia, the presence of multiple subjectivities is common. Therefore, Transmedia tends to enhance the polyphony caused by the large number of characters and stories. A classic example was Lost.

Performance

The actions of the consumers are essential, as the fans are evangelists. Some of them – like our John – take the next step and become prosumers (producer + consumer), and do not hesitate to create new texts and add them to further expand the boundaries of the original narrative world. Just think what Star Trek would be without the Trekkers.

Transmedia, which began as a new form of pop-culture storytelling that arose spontaneously in the entertainment industry, is now widespread in fields like journalism and, for some years now, branding.

This sophisticated form of marketing, which we’ll call “Inbound Square,” has a rather complex creation process, such that I decided to present it in a more “digestible” Slideshare.

The Transmedia prototype: Star Wars

The Transmedia expansion of Star Wars began immediately after its premiere on May 25, 1977. In July of the same year, Marvel published the first Star Wars comic, and even though the first six issues reflected what was seen in the movie, new situations started being presented from the seventh issue onwards.

In 1978, the first novel, Splinter of the Mind’s Eye – a spin-off based on an early version of the script – was published. But it wasn’t until 1987, with the commercialization of the Star Wars roleplaying game, that George Lucas created what is known as the Star Wars Expanded Universe. It covers thousands of years of history of that “galaxy far far away” and is not public (the Encyclopedia section of the official Star Wars site is not the same thing). From that moment, the flood of products began: video games (from X-Wing to the Old Republic MMORPG), the Director’s Cut of the first trilogy, the new trilogy, cartoon series, new comics, new novels, and even radio versions.

Star Wars is also paradigmatic of the relationship between the producer (George Lucas) and the fans, who, on one side, have taken possession of the Star Wars universe from the beginning, while, on the other side, George Lucas tried for a long time to keep the expansion of the Star Wars narrative universe under strict control with actions such as The Official Star Wars Fan Film Awards (here you can see some of those fan creations).

It is humanly impossible to monitor and classify all the variants that fans have created within the narrative universe of Star Wars, such as the stories published on Fanfiction.com, sites like Death Star PR, Twitter bots like Yoda, photos using the action figures, and all kinds of videos published on YouTube. The fans have created content ranging from staid tribute to outright parody, like this hilarious Lego video based on the Eddie Izzard gag about the Death Star canteen:

Transmedia Storytelling offers gigantic opportunities, which helps explain both Disney’s acquisition of Lucasfilm and the choice of J.J. Abrams, creator of now-classic Transmedia examples such as Lost and Fringe (and director of the revamped Star Trek movie series), as the director of the next trilogy.

Transmedia and Hollywood

The relationship between the producer and the fans is critical to the success of any Transmedia Storytelling strategy, and Hollywood has understood its implications the best so far. A good, recent example is The Hunger Games movie trilogy. 

[Image: The Hunger Games: Catching Fire transmedia campaign]

I strongly suggest you revisit the post that Bryden McGrath wrote for Portent: 44 Ways ‘The Hunger Games’ Social Media Campaign Increased the Movie’s Odds of Success. What Bryden defines as a “social media campaign” is really the “spreadability” social side of the transmedia campaign that Ignition created for Lions Gate, the film’s producer:

The Hunger Games Case Study from Ignition on Vimeo.

This campaign was relaunched just last week, with the presentation of the first trailer of The Hunger Games: Catching Fire.

This time, a new site was created in partnership with Microsoft Internet Explorer, The Hunger Games Explorer, which works as a hub for all the sites and social media profiles used in this transmedia experience (check the official Capitol.pn site, or its Google+ page).

Another great – and maybe more complete – example of Transmedia Storytelling is the one Ridley Scott and (again) Ignition created for Prometheus last year. The video below explains (better than I can) why it was created, how it was structured, what channels were used, and the results it obtained:

Prometheus Transmedia Campaign from Ignition on Vimeo.

I’m sure that many of you remember having seen some of the things Campfire created for the transmedia campaign previewing the first season of Game of Thrones on HBO.

Game of Thrones Case Study from Campfire on Vimeo.

Here’s the big question: can only major movies or TV series do transmedia? Absolutely not.

An example is the Veronica Mars Movie Project, which closed its crowdfunding campaign on Kickstarter two weeks ago, built entirely on fans and social media, after raising $5,702,153.

Transmedia journalism

Journalism – real journalism – is storytelling. There is no discussion about that, as was well explained in Telling True Stories: A Nonfiction Writer’s Guide, published by the Nieman Foundation.

Transmedia storytelling is the next logical step, and journalism has already taken it (not always perfectly). Remember that:

  1. In transmedia, a story is narrated among different media channels and platforms.
  2. Prosumers actively participate in the construction of the narrative world.

Here’s a recent example involving a world event that shook many people in many countries. Do you remember how the world discovered that Osama Bin Laden was dead? Here’s how it went down:

“Helicopter hovering above Abbottabad at 1AM (is a rare event).”

— Sohaib Athar (@ReallyVirtual) May 1, 2011

Not long after, when the entire Internet was discussing the news, Keith Urbahn sent this tweet:

“So I’m told by a reputable person they have killed Osama Bin Laden. Hot damn.”

— Keith Urbahn (@keithurbahn) May 2, 2011

Check out the post Gilad Lotan wrote on Social Flow to understand the social earthquake that tweet caused.

President Obama announced the news of Osama Bin Laden’s death about an hour later, and by the time of his announcement, many people in many countries already knew through social media.

From that moment, the journalistic storytelling, while continuing on social networks, expanded into news portals, television, radio, and, of course, print.

This event was “organic” and unplanned (obviously), but it was nonetheless transmedia.

Transmedia journalism, which is well explained by Kevin Moloney in his thesis Porting Transmedia Storytelling to Journalism (2011), essentially means that the news must create a space that allows the active participation of readers and viewers. It is not surprising, nowadays, to see how central this interaction between journalists and readers has become in the news industry.

Transmedia journalism can take many different forms, but all of them must see the reader – meant as a free citizen – as a contributor to the creation and telling of the news.

Great examples of Transmedia journalism can be found on sites like 18 Days in Egypt or Storify.

Transmedia branding

Do you remember the concept of Liquid Content promoted by Coca-Cola? Well, that is nothing but Coca-Cola’s definition of Transmedia Storytelling.

Brands have found a natural ally for their marketing efforts in transmedia (and crossmedia) for years. Think of the Art of the Heist ARG created for the launch of the Audi A3 in the USA, of the first Old Spice campaign or, more recently, of the Daybreak transmedia campaign created for AT&T, which comprises five online movies, two websites (you can discover the second almost hidden in the footer of the first), and an app.

Another example is Dove’s very recent Real Beauty Sketches campaign, created by Ogilvy Brasil, which cannot really be defined as transmedia but which showcases the transmedia spirit: narrating a story that represents the brand’s world (“Real Beauty”) to its fans (women just as they are) using channels like online video, a microsite, and social networks (Facebook, especially). The purpose was to create a public yet intimate dialogue with the fans, and the campaign surely achieved that objective. It did so well that it has already spawned a parody:

You can read more here about the story behind the Dove campaign.

I can imagine the doubts you may have in your mind: a narrative world is easier when you have a movie, a series, a book, or news, but what about my website? What about my bolts and rolled aluminum factory?

I understand your hesitation, but the answer is that you are probably searching in the wrong place. All brands have a story, and the basis of their stories is in their mission pages, which you usually forget exist.

Let’s take two examples: REI and our own SEOmoz.

REI has built its entire business, and its marketing, around a few very clear values:

(Image taken from How to Build SEO into Content Strategy, by Jonathon Colman)

Those values, REI’s “Why,” define the narrative world of REI, which is shared by more than five million members of the company (REI is a cooperative).

We have a narrative world of outdoor adventure, environmentalism, and stewardship, along with a gigantic base of brand evangelists who not only share what REI does (and sells), but create their own narratives inside REI’s world. That is the base from which Transmedia Storytelling can be built, and REI seems to be moving towards it with ideas like the REI 1440 Project and the REI Members Stories YouTube videos. They also have a presence on Facebook, Twitter, Pinterest, Instagram, and at all their IRL events (e.g., Learn at REI and Travel with REI).

What about SEOmoz? Moz could do transmedia, and actually (maybe without knowing it) it already does, because, as I said before, transmedia is the “Inbound Square.”

Moz has built its story around the six TAGFEE tenets; it has built a community of enthusiastic people (yes, you!) around its tools and Internet marketing philosophy; and it has a narrative world that evolves thanks to the fans, both internally (YouMoz, Q&A) and externally (check the Mozscape API Gallery).

The world of fans that SEOmoz nurtures with gamification (one of the experience principles of transmedia) and live events like MozCon, meet-ups, and MozCation is the most transmedia-like campaign SEOmoz has created so far. And, finally and importantly, Moz has Roger.

Conclusion

People have always needed stories to communicate and feel connected, and good stories always become part of our history and our culture.
The most recognized brands have one thing in common: they all tell a story.

But something is changing. Never before have people had so many devices and screens from which to follow a story, and now the consumers seek new experiences and a deeper engagement. The stories are formed and followed differently than before.

In order to be relevant for a hyper-connected generation and be present in its life, we need:

  • Liquid and customizable content to distribute by any means available.
  • Different levels of depth in the story, made for different levels of user involvement.
  • A consistent message, where each piece feeds the story and the conversation with the audience continuously.

What we need is Transmedia Storytelling: a process where the elements of a story are dispersed across multiple distribution channels with the purpose of creating a unified entertainment experience, and where each medium makes its own contribution to the development of the story.

Transmedia Storytelling is a very powerful and immersive persuasion tool, a fan-generating machine, because it creates a strong emotional connection with the audience. It is profitable because it redefines and increases ROI, increases media impact, and can open up exceptional sources of income.
Transmedia Storytelling, finally, is the best and most effective way to connect (especially) with the new generations of consumers and build a sustainable audience around a brand, as Red Bull demonstrated.

So… are you ready to tell your story?




April Mozscape Index Is Live

Posted by bradfriedman

Hello Mozzers, and happy Monday!

My name is Brad Friedman, Technical Lead for Mozscape, and I’m happy to announce that we’ve released a brand new Mozscape index for April. You can find fresh, new data across all of our apps. Check out Open Site Explorer, the Mozbar, your PRO campaigns, and the Mozscape API.

We’ve reduced our index crawl time to just eleven days for this release! Thanks to our Big Data wizards on the processing team, Douglas Vojir and Martin York, for improving the freshness of our metrics! You can read more details on our technical improvements in this post from February.

We started processing this index on Wednesday, April 10, so the metrics will reflect crawl data from the end of March and the first week of April.

Here are the numbers for this latest index:

  • 88,973,525,592 (88 billion) URLs
  • 9,077,621,093 (9.1 billion) Subdomains
  • 161,124,038 (161 million) Root Domains
  • 887,067,310,285 (887 billion) Links
  • Followed vs. Nofollowed
    • 2.15% of all links found were nofollowed
    • 56.0% of nofollowed links are internal
    • 44.0% are external
  • Rel Canonical – 15.08% of all pages use a rel=canonical tag
  • The average page has 76 links on it
    • 65.05 internal links
    • 11.02 external links

And these are the correlations with Google’s US search results:

  • Page Authority – 0.36
  • Domain Authority – 0.19
  • MozRank – 0.24
  • Linking Root Domains – 0.30
  • Total Links – 0.25
  • External Links – 0.29

[Image: Crawl histogram for the April Mozscape index]

All this delicious data! What a great way to start off the week, huh?

Follow our planned update schedule on our Mozscape calendar, and you can check out the metrics on our previous releases here.

We’re happy to answer your questions or read your feedback! Feel free to leave your comments here on this thread, or you can reach me on Twitter (@brad_friedman).


