Here’s How to Supercharge Your Competitive Research Using a URL Profiler and Fusion Tables

Posted by Craig_Bradshaw

[Estimated read time: 19 minutes]

As digital marketers, the amount of data that we have to collect, process, and analyze is overwhelming. This is never more true than when we’re looking into what competitors are doing from a link building perspective.

Thankfully, there are a few things we can do to make this job a little bit easier. In this post, I want to share with you the processes I use to supercharge my analysis of competitor backlinks. You’ll learn:

  • How to use URL Profiler for bulk data collection
  • How to use fusion tables to create powerful data visualizations
  • How to build an SEO profile of the competition using URL Profiler and fusion tables

Use URL Profiler for bulk data collection

Working agency-side, one of the first things I do for every new client is build a profile of their main competitors, including those who have a shared trading profile, as well as those in their top target categories.

The reason we do this is that it provides a top-level overview of the industry and how competitive it actually is. This allows us to pick our battles and prioritize the strategies that will help move the right needles. Most importantly, it’s a scalable, repeatable process for building links.

This isn’t just useful for agencies. If you work in-house, you more than likely want to watch your competitors like a hawk in order to see what they’re doing over the course of months and years.

In order to do this, you’re inevitably going to need to pull together a lot of data. You’ll probably have to use a range of different tools and data points.

As it turns out, this sort of activity is where URL Profiler becomes very handy.

For those of you unfamiliar with URL Profiler, it’s a bulk data tool that allows you to collect link and domain data from thousands of URLs all at once. As you can probably imagine, this makes it an extremely powerful tool for link prospecting and research.

URL Profiler is a brilliant tool built for SEOs, by SEOs. Since every SEO I know seems to love working with Excel, the output you get from URL Profiler is, inevitably, most handy in spreadsheet format.

Once you have all this amazing bulk data, you still need to be able to interpret it and drive actionable insights for yourself and your clients.

To paraphrase the great philosopher Ben Parker: with great data power comes great tedium. I’ll be the first to admit that data can be extremely boring at times. Don’t get me wrong: I love a good spreadsheet as much as I love good coffee (more on that later); but wherever possible, I’d much rather just have something give me the actionable insights I need.

This is where the power of data visualization comes into play.

Use fusion tables for powerful data visualization

Have you ever manually analyzed one million articles to see what impact content format and length have on shares and links? Have you ever manually checked the backlink profile of a domain that has over half a million links? Have you ever manually investigated the breakdown of clicks and impressions your site gets across devices? Didn’t think so.

Thanks to BuzzSumo and Moz, Majestic, Ahrefs, and Google Search Console, we don’t have to; we just use the information they give us to drive our strategy and decision-making.

The reason these tools are so popular is they allow you to input your data and discern actionable insights. Unfortunately, as already mentioned, we can’t easily get any actionable insights from URL Profiler. This is where fusion tables become invaluable.

If you aren’t already familiar with fusion tables, then the time has come for you to get acquainted with them.

Back in 2012, Google rolled out an “experimental” version of their fusion tables web application. They did this to help you get more from your data and tell the story of what’s going on in your niche with less effort. It’s best to think of fusion tables as Google’s answer to big data.

There are plenty of examples of how people are using fusion tables to tell their stories with data. However, for the purpose of brevity, I only want to focus on one incredibly awesome feature of fusion tables — the network graph.


If fusion tables are Google’s answer to big data, then the network graph feature is definitely Google’s answer to Cerebro from X-Men.

I won’t go into too many details about what network graphs are (you can read more about them here), as I would much rather talk about their practical applications for competitive analysis.

Note: There is a fascinating post on The Moz Blog by Kelsey Libert about effective influencer marketing that uses network graphs to illustrate relationships. You should definitely check that post out.

I’d been using URL Profiler and fusion tables in isolation from each other for quite a while — and they each worked very well — before I figured out how to combine their strengths. The result is a process that combines the pure data collection power of URL Profiler with the actionable insights that network graphs provide.

I’ve outlined my process below. Hopefully, it will allow you to do something similar yourself.

Build a competitive SEO profile with URL Profiler and fusion tables

To make this process easier to follow, we’ll pretend we’re entering the caffeinated, yet delicious space of online coffee subscriptions. (I’ve chosen to use this particular niche in our example for no reason other than the fact that I love coffee.) Let’s call our hypothetical online coffee subscription company “Grindhaus.”

Step 1: Assess your competition

We’ll start by looking at the single keyword “buy coffee online.” A Google search (UK) gives us the top 10 that we’ll need to crack if we want to see any kind of organic progress.

Step 2: Gather your data

However, we’ve already said that we want to scale up our analysis, and we want to see a large cross-section of the key competitors in our industry. Thankfully, there’s another free tool that comes in handy for this. The folks over at URL Profiler offer a number of free tools for Internet marketers, one of which is called the SERP Scraper. No prizes for guessing what it does: add in all the main categories and keywords you want to target and hit scrape.


As you can see from the image above, you can do this for a specific keyword or set of keywords. You can also select which country-specific results you want to pull, as well as the total number of results you want for each query.

It should only take a minute or so to get the results of the scrape in a spreadsheet that looks something like this:


In theory, these are the competitors we’ll need to benchmark against in order for Grindhaus to see any sort of organic progress.

From here, we’ll need to gather the backlink profiles for the companies listed in the spreadsheet one at a time. I prefer to use Majestic, but you can use any backlink crawling tool you like. You’ll also need to do the same for your own domain, which will make it easier to see the domains you already have links from when it’s time to perform your analysis.

After this is done, you will have a file for your own domain, as well as a file for each of the competitors you want to investigate. I recommend investigating a minimum of five competitors in order to obtain a data set large enough to draw useful insights from.

Next, what we need to do is clean up the data so that we have all the competitor link data in one big CSV file. I organize my data using a simple two-column format, as follows:

  • The first column contains the competitor being linked to. I’ve given this column the imaginative heading “Competitor.”
  • The second column contains the domains that are linking to your competitors. I’ve labeled this column “URL” because this is the column header the URL Profiler tool recognizes as the column to pull metrics from.

Once you have done this, you should have a huge list of the referring domains for your competitors that looks something like this:

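If you have a lot of competitor export files, this consolidation step is easy to script. Here’s a minimal sketch using Python’s standard library, assuming each backlink export is named after the competitor (e.g. `beanco.csv`) and has a column called “Referring Domain” — adjust that header to match whatever your backlink tool actually exports:

```python
import csv
import glob
import os

def build_link_table(export_dir, out_path):
    """Combine per-competitor backlink exports into one two-column CSV.

    Assumes each file in export_dir is named <competitor>.csv and contains
    a "Referring Domain" column (a hypothetical header -- rename it to
    match your own tool's export).
    """
    rows = []
    for path in glob.glob(os.path.join(export_dir, "*.csv")):
        competitor = os.path.splitext(os.path.basename(path))[0]
        with open(path, newline="", encoding="utf-8") as f:
            for record in csv.DictReader(f):
                rows.append({"Competitor": competitor,
                             "URL": record["Referring Domain"]})
    # "URL" is the header URL Profiler recognizes as the column to profile
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["Competitor", "URL"])
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```

Run it once over the folder of exports and you get the combined two-column file described above, ready for URL Profiler.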

This is where the fun begins.

Step 3: Gather even more data

Next, let’s take the full list of domains that link to one, some, or all of your competitors and run it through URL Profiler. Doing this will pull back all the metrics we want to see.

It’s worth noting that you don’t need any additional paid tools or APIs to use URL Profiler, but you will have to set up a couple of API keys. I won’t go into detail here on how to do this, as there are already plenty of resources available that explain it, including here and here.

One of the added benefits of doing this through URL Profiler is that you can use its “Import and Merge” feature to append metrics to an existing CSV. Otherwise, you would have to do this by using some real Excel wizardry or by tediously copying and pasting extreme amounts of data to and from your clipboard.

As I’ve already mentioned, URL Profiler allows me to extract both page-level and domain-level data. However, in this case, the domain metrics are what I’m really interested in, so we’ll only examine these in detail here.

Majestic, Moz, and Ahrefs metrics

Typically, SEOs will pledge allegiance to one of these three big tools of the trade: Majestic, Moz, or Ahrefs. Thankfully, with URL Profiler, you can collect data from any or all of these tools. All you need to do is tick the corresponding boxes in the Domain Level Data selection area, as shown below.

In most cases, the basic metrics for each of the tools will suffice. However, we also want to be able to assess the relevance of a potential link, so we’ll also need Topical Trust Flow data from Majestic. To turn this on, go to Settings > Link Metrics using the top navigation and tick the “Include Topical Trust Flow metrics” box under the Majestic SEO option.

Doing this will allow us to see the three main topics of the links back to a particular domain. The first topic and its corresponding score will give us the clearest indication of what type of links are pointing back to the domain we’re looking at.

In the case of our Grindhaus example, we’ll most likely be looking for sites that score highly in the “Recreation/Food” category, because relevance is a key factor in link quality. If we’re selling coffee, then links from health and fitness sites would be useful, relevant, and more likely to be natural. Links from engineering sites, on the other hand, would be pretty irrelevant, and would probably look unnatural if assessed by a Google quality rater.
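Once the Topical Trust Flow columns are in your export, a relevance filter like the one just described is a one-liner. A sketch, assuming column headers of the form “Topical Trust Flow Topic 0” and “Topical Trust Flow Value 0” (hypothetical names — check your own URL Profiler output for the exact headers):

```python
def topical_nodes(rows, topic="Recreation/Food", min_value=10):
    """Keep referring domains whose primary Topical Trust Flow topic
    matches `topic` and meets a minimum score.

    `rows` is a list of dicts, one per referring domain. The column
    names below are assumptions -- rename to match your own export.
    """
    return [r for r in rows
            if r.get("Topical Trust Flow Topic 0") == topic
            and int(r.get("Topical Trust Flow Value 0", 0)) >= min_value]
```

The `min_value` threshold is a judgment call: a domain whose top topic is “Recreation/Food” with a score of 5 is a much weaker relevance signal than one scoring 35.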

Social data

Although the importance of social signals in SEO is heavily disputed, it’s commonly agreed that social signals can give you a good idea of how popular a site is. Collecting this sort of information will help us to identify sites with a large social presence, which in theory will help to increase the reach of our brand and our content. In contrast, we can also use this information to filter out sites with a lack of social presence, as they’re likely to be of low quality.

Social Shares

Ticking “Social Shares” will bring back social share counts for the site’s homepage. Specifically, it will give you the number of Facebook likes, Facebook shares, Facebook comments, Google plus-ones, LinkedIn shares, and Pinterest pins.

Social Accounts

Selecting “Social Accounts” will return the social profile URLs of any accounts that are linked via the domain. This will return data across the following social networks: Twitter, Google Plus, Facebook, LinkedIn, Pinterest, YouTube, and Instagram.

Traffic

In the same way that sites with strong social signals give us an indication of their relative popularity, the same can also be said for sites that have strong levels of organic traffic. Unfortunately, without having direct access to a domain’s actual traffic figures, the best we can do is use estimated traffic.

This is where the “SEMrush Rank” option comes into play, as this will give us SEMrush’s estimation of organic traffic to any given domain, as well as a number of organic ranking keywords. It also gives us AdWords data, but that isn’t particularly useful for this exercise.

It’s worth mentioning once more that this is an estimate of organic traffic, not an actual figure. But it can give you a rough sense of relative traffic between the sites included in your research. Rand conducted an empirical study on traffic prediction accuracy back in June — well worth a read, in my opinion.

Indexation

One final thing we may want to look at is whether or not a domain is indexed by Google. If it isn’t, Google may well have deindexed the site, which suggests they don’t trust that particular domain. The use of proxies for this feature is recommended, as it automatically queries Google in bulk, and Google is not particularly thrilled when you do this!

After you’ve selected all the metrics you want to collect for your list of URLs, hit “Run Profiler” and go make yourself a coffee while it runs. (I’d personally go with a nice flat white or a cortado.)

For particularly large lists of URLs, it can sometimes take a while, so it would probably be best to collect the data a day or two in advance of when you plan to do the analysis. For the example in this post, it took around three hours to pull back data for over 10,000 URLs, but I could leave it running in the background while working on other things.

Step 4: Clean up your data

One of the downsides of collecting all of this delicious data is that there are invariably going to be columns we won’t need. Therefore, once you have your data, it’s best to clean it up, as there’s a limit on the number of columns you can have in a fusion table.

You’ll only need the combined results tab from your URL Profiler output, so delete the other tabs and re-save your file in CSV format.
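The column pruning can also be scripted if you run this analysis regularly. A minimal sketch using the standard library, where the `KEEP` list is purely illustrative — substitute whichever columns you actually plan to filter or weight on:

```python
import csv

# Hypothetical column names based on a typical URL Profiler export;
# rename these to match your own headers.
KEEP = ["Competitor", "URL", "Majestic CitationFlow", "Majestic TrustFlow",
        "Moz Domain Authority", "Homepage Total Shares", "SEMrush Traffic"]

def slim_csv(in_path, out_path, keep=KEEP):
    """Re-save the combined-results CSV with only the columns you need,
    to stay under the fusion table column limit."""
    with open(in_path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        # Preserve KEEP order, but only for columns actually present
        cols = [c for c in keep if c in reader.fieldnames]
        rows = [{c: r.get(c, "") for c in cols} for r in reader]
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=cols)
        writer.writeheader()
        writer.writerows(rows)
    return cols
```

The slimmed file is what you’ll upload in the next step.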

Step 5: Create your new fusion table

Head on over to Google Drive, and then click New > More > Google Fusion Tables.

If you can’t see the “Google Fusion Tables” option, you’ll have to select the “Connect More Apps” option and install Fusion Tables from there.

From here, it’s pretty straightforward. Simply upload your CSV file and you’ll then be given a preview of what your table will look like.

Click “Next” and all your data should be imported into a new table faster than you can say “caffeine.”

Step 6: Create a network graph

Once you have your massive table of data, you can create your network graph by clicking on the small red “+” sign next to the “Cards” tab at the top of your table. Choose “Add Chart” and you’ll be presented with a range of chart options. The one we’re interested in is the network graph option.

Once you’ve selected this option, you’ll then be asked to configure your network graph. We’re primarily interested in the link between our competition and their referring domains.

The relationship only goes in one direction: the referring website gives the retailer a link. To reflect this, tick the “Link is directional” option, along with “Color by columns” to make it easier to distinguish between the two.

By default, the network graph is weighted by whatever is in the third column — in this case, it’s Majestic CitationFlow, so our blue nodes are sized by how high the CitationFlow is for a referring domain. Almost instantly, you can spot the sites that are the most influential based on how many sites link to them.

This is where the real fun begins.

One interesting thing to do with this visualization that will save you a lot of time is to reduce the number of visible nodes. However, there’s no science to this, so be careful you’re not missing something.

As you increase the number of nodes shown, more and more blue links begin to appear. At around 2,000 nodes, it’ll start to become unresponsive. This is where the filter feature comes in handy, as you can filter out the sites that don’t meet your chosen quality thresholds, such as low Page Authority or a large number of outbound links.
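You can also apply these quality thresholds to the CSV before uploading, which keeps the node count manageable in the first place. A sketch of such a gate, with assumed column names (“Moz Page Authority”, “External Links”) and arbitrary thresholds — tune both to your own standards:

```python
def passes_quality(row, min_pa=25, max_outbound=100):
    """Hypothetical quality gate mirroring the fusion-table filters:
    drop referring domains with low Page Authority or an excessive
    number of external links. Column names are assumptions."""
    return (int(row.get("Moz Page Authority", 0)) >= min_pa
            and int(row.get("External Links", 0)) <= max_outbound)
```

Filtering the rows with `[r for r in rows if passes_quality(r)]` before you build the table means the network graph only ever renders domains worth looking at.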

So what does this tell us — other than there appears to be a relatively level playing field, which means there is a low barrier to entry for Grindhaus?

This visualization gives me a very clear picture of where my competition is getting their links from.

In the example above, I’ve used a filter to only show referring domains that have more than 100,000 social shares. This leaves me with 137 domains that I know have a strong social following that would definitely help me increase the reach of my content.

You can check out the complete fusion table and network graph here.

Step 7: Find your mutant characteristics

Remember how I compared network graphs to Google’s answer to Cerebro from X-Men? Well, this is where I actually explain what I meant.

For those of you that are unfamiliar with the X-Men universe, Cerebro is a device that amplifies the brainwaves of humans. Most notably, it allows telepaths to distinguish between humans and mutants by finding the presence of the X-gene in a mutant’s body.

Using network graphs, we can specify our own X-gene and use it to quickly find high-quality and relevant link opportunities. For example, we could include sites that have a Domain Authority greater than or equal to 50:


For Grindhaus, this filter finds 242 relevant nodes (from a total of 10,740). In theory, these are domains Google would potentially see as being more trustworthy and authoritative. Therefore, they should definitely be considered as potential link-building opportunities.

You should be able to see that there are some false positives in here, including Blogspot, Feedburner, and Google. However, these are outweighed by an abundance of extremely authoritative and relevant domains, including Men’s Health, GQ Magazine, and Vogue.co.uk.

Sites that have “Recreation/Food” as their primary Topical Trust Flow Topic:


This filter finds 361 relevant nodes out of a total of 10,740, all of which have “Recreation/Food” as their primary Topical Trust Flow Topic.

Looking at this example in more detail, we see that another cool feature of network graphs is that the nodes that have the most connections are always in the center of the graph. This means you can quickly identify the domains that link to more than one of your competitors, as indicated by the multiple yellow lines. This works in a similar way to Majestic’s “Click Hunter” feature and Moz’s “Link Intersect” tool.

However, you can do this on a much bigger scale, having a wider range of metrics at your fingertips.


In this case, toomuchcoffee.com, coffeegeek.com, and beanhunter.com would be three domains I would definitely investigate further in order to see how I could get a link from them for my own company.
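That “links to more than one competitor” check can be done in bulk over the whole data set, too. A minimal sketch, assuming the two-column (Competitor, URL) layout from step 2:

```python
def link_intersect(rows, min_competitors=2):
    """Return referring domains that link to at least `min_competitors`
    of your competitors -- a bulk version of the intersect check.

    `rows` is a list of dicts with "Competitor" and "URL" keys, one row
    per competitor/referring-domain pair.
    """
    seen = {}
    for r in rows:
        seen.setdefault(r["URL"], set()).add(r["Competitor"])
    return sorted(domain for domain, comps in seen.items()
                  if len(comps) >= min_competitors)
```

Raise `min_competitors` to 3 or 4 on a large data set and the survivors are usually the industry hubs sitting at the center of the graph.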

Sites that are estimated to get over 100,000 organic visits, weighted by social shares:


For our Grindhaus example, this filter finds 174 relevant nodes out of 10,740, all estimated to receive more than 100,000 organic visits per month. However, I have also weighted these nodes by “Homepage Total Shares.” This allows me to see the sites that have strong social followings and have also been estimated to receive considerable amounts of organic traffic (i.e., “estimorganic” traffic).

By quickly looking at this network graph, we can immediately see some authoritative news sites such as The Guardian, the BBC, and the Wall Street Journal near the center, as well as quite a few university sites (as denoted by the .ac.uk TLD).

Using this data, I would potentially look into reaching out to relevant editors and journalists to see if they’re planning on covering National Coffee Week and whether they’d be interested in a quote from Grindhaus on, say, coffee consumption trends.

For the university sites, I’d look at reaching out with a discount code to undergraduate students, or perhaps take it a bit more niche by offering samples to coffee societies on campus like this one.
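The traffic-plus-shares view from this last example can also be reproduced directly on the CSV. A sketch with assumed column names (“SEMrush Traffic”, “Homepage Total Shares”) — again, match these to your actual export headers:

```python
def social_heavyweights(rows, min_traffic=100_000, top_n=20):
    """Referring domains with estimated organic traffic above
    `min_traffic`, ordered by homepage social shares -- the
    "estimorganic" view. Column names are assumptions based on a
    typical URL Profiler export."""
    qualified = [r for r in rows
                 if int(r.get("SEMrush Traffic", 0)) >= min_traffic]
    qualified.sort(key=lambda r: int(r.get("Homepage Total Shares", 0)),
                   reverse=True)
    return qualified[:top_n]
```

The top of this list is effectively your outreach shortlist: popular sites that can move both links and reach.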

This is barely scratching the surface of what you can do with competitor SEO data in a fusion table. SEOs and link builders will all have their own quality and relevance thresholds, and will also place a particular emphasis on certain variables, such as Domain Authority or total referring domains. This process lets you collect, process, and analyze your data however you see fit, allowing you to quickly find your most relevant sites to target for links.

Step 8: Publish and share your amazing visualization

Now that you have an amazing network graph, you can embed it in a webpage or blog post. You can also send a link by email or IM, which is perfect for sharing with other people in your team, or even for sharing with your clients so you can communicate the story of the work you’re undertaking more easily.

Note: Typically, I recommend repeating this process every three months.

Summary and caveats

Who said that competitive backlink research can’t be fun? Aside from being able to collect huge amounts of data using URL Profiler, with network graphs you can also visualize the connections between your data in a simple, interactive map.

Hopefully, I’ve inspired you to go out and replicate this process for your own company or clients. Nothing would fill me with more joy than hearing tales of how this process has added an extra level of depth and scale to your competitive analysis, as well as given you favorable results.

However, I wouldn’t be worth my salt as a strategist if I didn’t end this post with a few caveats:

Caveat 1: Fusion tables are still classed as “experimental,” so things won’t always run smoothly. The feature could also disappear altogether overnight, although my fingers (and toes) are crossed that it doesn’t.

Caveat 2: Hundreds of factors go into Google’s ranking algorithm, and this type of link analysis alone does not tell the full story. However, links are still seen as an incredibly important signal, which means that this type of analysis can give you a great foundation to build on.

Caveat 3: To shoehorn one last X-Men analogy in… using Cerebro can be extremely dangerous, and telepaths without well-trained, disciplined minds put themselves at great risk when attempting to use it. The same is true for competitive researchers. However, poor-quality link building won’t result in insanity, coma, permanent brain damage, or even death. The side effects are actually much worse!

In this age of penguins and penalties, links are all too often still treated as a commodity. I’m not saying you should go out and try to get every single link your competitors have. My emphasis is on quality over quantity. This is why I like to thoroughly qualify every single site I may want to try and get a link from. The job of doing competitive backlink research using this method is to assess every possible option and filter out the websites you don’t want links from. Everything that’s left is considered a potential target.

I’m genuinely very interested to hear your ideas on how else network graphs could be used in SEO circles. Please share them in the comments below.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

What You Should Know About Accessibility + SEO, Part I: An Intro

Posted by Laura.Lippay

[Estimated read time: 4 minutes]

Do you know anyone who is visually impaired? Maybe they have low vision or color blindness, or are fully blind. Think about how they use the Internet. Close your eyes, or at least squint really hard, and try to find today’s news or interact with your friends on Facebook. It’s a challenge many of us don’t think about every day, but some of what we do in SEO can affect the experience that people with visual impairments have when visiting a page.

Accessibility and the Internet


Visually impaired Internet users are able to navigate and use the web using screen readers like VoiceOver or Jaws. Screen readers, much like search engine crawlers, rely on signals in the code to determine the structure and the context of what they’re crawling. The overlap in what search crawlers look for and interpret versus what screen readers look for and interpret is small, but the idea is the same: Where are the elements of this page and how do I understand them?

The SEO overlap

While it’s important to understand where SEO and accessibility (a11y) overlap in order to optimize correctly for both, it’s also important to note that optimizing for one is not necessarily akin to optimizing for the other. In other words, if you’ve optimized a page for search engines, it doesn’t mean you’ve necessarily made it accessible — and vice versa.

Recently, web accessibility expert Karl Groves wrote a post called The Accessibility & SEO Myth. Mr. Groves knows the world of accessibility inside and out, and knows that optimizing for accessibility, which goes far beyond optimizing for the visually impaired, is very different overall, and much more complex (strictly from a technical standpoint) than optimizing for search engines. He’s right: despite the ways SEO and a11y overlap, a11y is a whole different ballgame. But if you understand the overlap, you can successfully optimize for both.

Here are just some examples of where SEO and accessibility can overlap:

  • Video transcription
  • Image captioning
  • Image alt attributes
  • Title tags
  • Header tags (H1, H2, etc)
  • Link anchor text
  • On-site sitemaps, table of contents, and/or breadcrumbs
  • Content ordering
  • Size and color contrast of text
  • Semantic HTML

If you’re developing the page yourself, I would challenge you to learn more about the many things you can do for accessibility beyond where it overlaps with SEO, like getting to know ARIA attributes. Take a look at the W3C Web Content Accessibility Guidelines and you’ll see there are far more complex considerations for accessibility than what we typically consider for technical SEO. If you think technical SEO is fun, just wait until you get a load of this.

Optimizing for accessibility or SEO?

Chances are, if you’re optimizing for accessibility, you’re probably covering your bases for those technical optimizations where accessibility and SEO overlap. BUT, this doesn’t always work the other way around, depending on the SEO tactics you take.

Thankfully, the Converse site has a pretty descriptive alt attribute in place!

Consider a screen reader reaching an image of a pair of women’s black Chuck Taylor All-Star shoes and reading its alt attribute as “Women’s black Chuck Taylor All-Stars buy Chucks online women’s chuck taylors all-stars for sale.” Annoying, isn’t it? Or compare these page titles with SEO and accessibility in mind: “Calculate Your Tax Return” versus “Online Tax Calculator | Tax Return Estimator | Tax Refund/Rebate.” Imagine you just encountered this page without being able to see the content. Which one more crisply and clearly describes what you can expect of this page?
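A quick way to catch the kind of alt-text problems described above is a simple audit script. This is a rough sketch, not a real accessibility audit: the "too many words looks keyword-stuffed" rule is a heuristic I've assumed here, and the word limit is arbitrary.

```python
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    """Flag <img> tags whose alt text is missing, or long enough to look
    keyword-stuffed (a crude heuristic: more than `max_words` words)."""

    def __init__(self, max_words=16):
        super().__init__()
        self.max_words = max_words
        self.issues = []  # list of (src, problem) tuples

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attr_map = dict(attrs)
        alt = attr_map.get("alt")
        src = attr_map.get("src", "(no src)")
        if alt is None or not alt.strip():
            self.issues.append((src, "missing alt text"))
        elif len(alt.split()) > self.max_words:
            self.issues.append((src, "alt text may be keyword-stuffed"))
```

Feed it a page’s HTML with `auditor.feed(html)` and inspect `auditor.issues`; an empty list means every image at least has concise alt text, which serves both screen readers and search crawlers.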

While it’s nice to know that proper technical search engine optimization will affect how someone using a screen reader can contextualize your site, it’s also important to understand (1) that these two optimization industries are, on a bigger level, quite different, and (2) that what you do for SEO where SEO and a11y overlap will affect how some visitors can (or can’t) understand your site.


For Global Accessibility Awareness Day on May 19, I’ll be collaborating with some experts in a11y on a post that will go into more details on what aspects of SEO + a11y to be keenly aware of and how to optimize for both. I’ll be sure to find as many examples as I can — if you’ve got any good ones, please feel free to share in the comments (and thanks in advance).

Educational resources & tools

In the meantime, to learn more about accessibility, check out a couple of great resources:


The Guide to International Website Expansion: Hreflang, ccTLDs, & More!

Posted by katemorris

Growth. Revenue, visits, conversions. We all want to see growth. For many, focusing on a new set of potential customers in another market (international, for instance) is a source of growth. It can sometimes seem like an easy expansion. If your current target market is in the US, UK, or Australia, the other two look promising. Same language, same content — all you need is to set up a site for them and target it at them, right?

International expansion is more complicated than that. The ease of expansion depends highly on your business, your resources, and your customers. How you approach expansion and scale it over time takes consideration and planning. Once you’ve gone down a path of URL structure and a process for marketing and content, it’s difficult to change.

This guide is here to help you go down the international expansion path on the web, focused on ensuring your users see the right content for their query in the search engines. This guide isn’t about recommendations for translation tools or how to target a specific country. It is all about international expansion from a technical standpoint that will grow with your business over time.

At the end is a bonus! A flow chart to help you troubleshoot international listings showing up in the wrong place in the SERPs. Have you ever wondered why your Canadian page showed for a user in the US? This will help you figure that out!

Before we begin: Terminology

ccTLD – A country-specific top-level domain. These are assigned by ICANN and are geo-targeted automatically in Google Search Console.

gTLD – A generic top-level domain. These are not country-specific; if used for country-specific content, they must be geo-targeted inside Google Search Console or Bing Webmaster Tools. Examples include .com, .net, and .tv. Examples from Google found here.

Subdomain – A major section of a domain, distinguished by a change to the characters before the root domain. The most-used standard subdomain is www. Many sites start with www.domain.com as their main subdomain. Subdomains can be used for many reasons: marketing, region targeting, branded microsites, and more.

Subfolder – A section of a subdomain/domain. Subfolders are sections marked by a trailing slash. Examples include http://ift.tt/1DZqpSp, or in terms of this guide, www.domain.com/en or www.domain.ca/fr.

Parameter – A modifier of a URL that either tracks a path of a user to the content or changes the content on the page based on the parameters in the URL. These are often used to indicate the language of a page. An example is http://ift.tt/1qd0mVq, with lang being the parameter.

Country – A recognized country that has an ICANN-assigned ccTLD or an ISO code. Google uses ISO 3166-1 Alpha-2 for hreflang.

Region – Collections of countries that the general public groups together based on geography. Examples include the EU or the Middle East. These are not countries and cannot be geo-targeted at this time.

Hreflang – A tag used by Google to allow website owners to indicate that a specific page has a copy in another language. The tags indicate all other translated versions of that page along with the language. The language tags can have regional dialects to distinguish between language differences like British English and American English. These tags can reside on-page or in XML sitemaps.

Meta language – The language-distinguishing tag used by Bing. This tag merely informs Bing of the language of the current page.

Geo-targeting – Both Bing Webmaster Tools and Google Search Console allow website owners to claim a specific domain, subfolder, or subdomain, and inform the search engine that the content in that domain or section is developed for and targeted at the residents of a specific country.

Translation – Changing content from one language or regional dialect to another language or regional dialect. This should never be done with a machine, but rather always performed by someone fluent in that language or regional dialect.

Understanding country and language targeting

The first step in international expansion planning is to determine your target. There is some misunderstanding between country targeting and language targeting. Most businesses start international expansion wanting to do one of two things:

  1. Target users that speak another language.
    Example – A business in Germany: “We should translate our content to French.”
  2. Target users that live in another part of the world.
    Example – A business in Australia: “We should expand into the UK.”

False associations: Country and language

The first issue people run into is associating a country and a language. Many of the world’s top languages have root countries that share the same name; specifically, France/French, Germany/German, Portugal/Portuguese, Spain/Spanish, China/Chinese, Japan/Japanese, and Russia/Russian. Many of these languages are used in a number of other countries, however. Below is a list of the top languages used by Internet users.


Please note this is not the list of top languages in the world; that is a vastly different list. This list is based on Internet usage. Some languages are the official language of only one country, yet have users in other countries who browse the Internet with that language as their preferred language. An example might be a Japanese national working in the US setting up a new office.

Another note: the “main” country chosen above is either the country that originated the language (as with English) or the country that shares a name with, or is closest to, the language name. This is how many people associate languages and countries in most instances, but those assumptions are not correct.

Flags and languages

We must disassociate languages and countries. Too often, a country flag is used to denote a language change on a site. Flags should only be used when a country is being targeted, not a language.


Web technology and use impacts targeting

The second issue arises in the execution. The business in Germany from the first few examples might hire a translator from France and translate their content to French. From there, the targeting can get confused based on where that content is placed and how it is tagged.

Below are some implementations of posting the translated content that we might see from the business. This table looks at a variety of combinations of ccTLDs, gTLDs, subfolders, subdomains, hreflang tagging, and geo-targeting. Each combination of URL setup and tagging results in different targeting in the eyes of the search engines, which in turn impacts the base number of Internet users in each group.


Given the above, you can see that the implementation is not as straightforward as it might seem. There’s no single right answer in the above possible implementations. However, many of them change the focus of the original target market (speakers of the French language) and that has an impact on the base target market.

International search strategy tool

This is what many of us face when trying to do international expansion: there is conflicting data on what should be done. This is why I developed a tool to help businesses determine which route they should take in international expansion. It helps them determine what their real focus should be (language, country, or both) and narrows down the list of choices above while accounting for their business needs, resources, and user needs. It has evolved over the years from a flow chart, to a poorly designed tool, to the better-structured tool found by clicking the link in the image below.


Start with those questions and then come back here when you have other questions. That’s what the rest of this guide is about. It’s broken down into three types of targeting:

  1. Language
  2. Country
  3. Hybrid (multiple countries with multiple languages)

No one type is easier than another. You really need to choose the path early on and use what you know of your business, user needs, and resources.

Language targeting

Language-only targeting can seem like the easiest route to take, as it doesn’t require a major change and multiple instances of marketing plans. Country-focused targeting requires new targeted content for each targeted country. There are far fewer languages in the world than countries. In addition, if you target the major world languages, you could potentially start with a base of millions of users that speak those languages.

However, language targeting involves two very tricky components: translation and language tagging. If either of these components is not done right, it can cause major issues with user experience and indexation.

Translation

The first rule of working with languages and translation is NEVER machine translate. Machine translation is highly inaccurate. I was just at an all-inclusive resort in Mexico, and you could tell the translations were done by a machine, not a person. Using machine translations produces a very poor user experience and poor SEO targeting as well.

Translations of content should always be done by a human who is fluent both in that language and the original language of the content. If you are dealing with regional variations, it is recommended to get someone that is native to and/or living in that area to translate, as well as being fluent.

Spending the right resources on translation will ensure the best user experience and the most organic traffic.

Language tagging: Hreflang and meta language

When it comes to translation and international expansion, the first thing people think about is the hreflang tag. Relative to the Internet, the hreflang tag is new; it launched in late 2010. As of this writing, it is only used by Google. If the bulk of your traffic comes from Google and you are translating only, it is of use to you. However, do know that Bing uses a different tag format, called the meta language tag.

Tips: Ensure that there’s an hreflang tag on every translated page pointing to every other translated instance of that page. I prefer to put the tags in XML sitemaps (instructions here) to keep the tagging off the page, since any code you can remove reduces page load time, however slightly. Do what works for your team.
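For the sitemap route, Google's documented format uses `xhtml:link` entries, where every URL in the set lists itself plus all of its alternates. A minimal sketch with hypothetical URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/en/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/en/" />
    <xhtml:link rel="alternate" hreflang="fr" href="http://www.example.com/fr/" />
  </url>
  <url>
    <loc>http://www.example.com/fr/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/en/" />
    <xhtml:link rel="alternate" hreflang="fr" href="http://www.example.com/fr/" />
  </url>
</urlset>
```

Note that the full set of alternate links is repeated under each `<loc>`; a page that only points outward without being pointed back at is a common source of "no return tags" errors.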

What about x-default?

One of the most frequent tagging mistakes is misusing x-default. Many people misunderstand its use. X-default was added to the hreflang markup family to help Google serve un-targeted pages, like the country-selector pages used by IKEA and FedEx, to users for whom the site has no language-targeted content, or whom Google doesn’t know where to place. This tag is not meant to set the “original” page.
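Extending the tag style shown later in this post (the URLs here are hypothetical), x-default sits alongside the language tags and points at the un-targeted page, often a country/language selector:

```html
<link rel="alternate" hreflang="en" href="http://example.com/en/" />
<link rel="alternate" hreflang="fr" href="http://example.com/fr/" />
<!-- x-default: served to users who match none of the targeted languages -->
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```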

Checking for tagging issues

Once you have your tagging live (or on a testing server that is crawlable by Google but not indexable), you can check for issues inside of Google Search Console. This will let you know what tag issues you are having and where they’re located.

URL selections

Choosing the URL structure of your language extensions is up to you. If you are focusing on language targeting only, don’t use a ccTLD; those are meant for targeting a specific country, not a language. ccTLDs geo-target automatically, and that selection cannot be changed. Your other choices are subfolders, subdomains, and parameters, listed below in order of my professional preference, with the reasons why.

  1. Subfolders provide a structure that’s easier to build upon and develop as your site and business grows and changes. You might not want to target specific countries now or have the resources, but you may someday. Setting up a subfolder structure allows you to use the same structure for any future ccTLDs or subdomains for country sections in the future. Your developers will appreciate this choice because it’s scalable for hreflang tags, as well.
  2. Parameters allow a backup system in case your tagging fails in a site update in the future. Parameters can be defined in Google as being used to modify the language on the page. If your other tags are lost, that parameter setting is still telling Google that the content is being translated.
    Using a parameter for language is also scalable for future plans and easy for tagging, like subfolders. The downsides are that they’re ugly and might accidentally be negated by a misplaced rel canonical tag in the future.
  3. Subdomains for language targeting are my least favorite option. Only use this if it’s the only option you have, by decree of your technical team. Using subdomains for languages means that if you change plans to target countries in the future, you’ll lose many options for URLs there. To follow the same structure for each country, you would need to use ccTLDs; while those are the strongest signal for geo-targeting, they are also the option that requires the most investment.

Notice that ccTLDs are not on this list. Those are only for geo-targeting. Unless you’re changing your content to focus on a specific country, do not use ccTLDs. I say this multiple times for a reason: too many websites make this mistake.

Detecting languages

Many companies want to try to make the website experience as easy as possible for the user. They attempt to detect the user’s preferences without needing input from the user. This can cause problems with languages.

There are a few ways to try to determine a user’s language preferences. The most used are browser settings and IP address. Never use the IP address for language detection: an IP address can show an approximate user location, but not a preferred language. IP location is also highly inaccurate (just the other day my IP placed me “in” North Carolina, though I live in Austin), and Google still only crawls from a US IP address. Any automatic redirects based on IP should be avoided.

If you choose to try to guess at the user’s language preference when they enter your site, you can use the browser’s language setting or the IP address and ask the user to confirm the choice. Using JavaScript to do this will ensure that Googlebot does not get confused. Pair this with a good XML sitemap and the user can have a great interaction. Plus, the search engines will be able to crawl and index all of your translated content.
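If you do read the browser's language setting server-side, it arrives as the Accept-Language header, which carries the user's ranked preferences via quality values. A minimal sketch of parsing it (the function name is my own, not from the post), to be used only for suggesting a default that the user then confirms:

```python
def preferred_languages(accept_language):
    """Parse an Accept-Language header (e.g. "fr-CA,fr;q=0.8,en;q=0.5")
    into a list of language tags ordered by the user's stated preference."""
    langs = []
    for part in accept_language.split(","):
        piece = part.strip()
        if not piece:
            continue
        if ";q=" in piece:
            tag, q = piece.split(";q=", 1)
            try:
                quality = float(q)
            except ValueError:
                quality = 0.0  # malformed q-value: treat as lowest priority
        else:
            tag, quality = piece, 1.0  # no q-value means full preference
        langs.append((tag.strip(), quality))
    # Stable sort keeps the header's original order for equal q-values
    langs.sort(key=lambda item: item[1], reverse=True)
    return [tag for tag, _ in langs]
```

For example, `preferred_languages("fr-CA,fr;q=0.8,en;q=0.5")` yields the tags in preference order, with `fr-CA` first.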

Country targeting, AKA geo-targeting

If your business or content changes depending on the location of the user, country targeting is for you. This is the most common answer for those businesses in retail. If you offer a different set of products, if you have different shipping, pricing, grouping structure, or even different images and descriptions, this is the way to go.

Example: If a greeting card business in the US wanted to expand to Australia, not only are the prices and products different (some different holidays), the Christmas cards are VASTLY different. Think of Christmas in summer, as it is in Australia, and only being able to pick from cards with winter scenes!

Don’t go down the geo-targeting route if your content or offerings don’t change or you don’t have the resources to change the content. If you launch country-targeted content in any URL structure (ccTLD, subdomain, or subfolder) and the content is identical, you run the risk of users coming across another country’s section.

Check out the flow chart at the end to help figure out why one version of your site might be ranking over another.

Example: As a web development service in Canada, you want to expand into the US. Your domain at the moment is www.webdevexpress.ca (totally made up!). You buy www.webdevexpress.us (that’s the ccTLD for the US, by the way). Nothing really needs to change, so you just use the same content and go live. A few months down the road, US clients are still seeing www.webdevexpress.ca when they do a brand name search. The US domain is weaker (fewer links, mentions, etc.) and has the same content! Google is going to show the more relevant, stronger page when everything is the same.

Regions versus countries

Which country or countries you want to focus on in expansion is usually decided before you determine how to get there; that decision is what spawns the conversation.

There’s one misconception that can throw off the whole process of expansion, and that is that you can target a region with geo-targeting. As of right now, you can purchase a regional top-level domain like .eu, but those are treated as general top-level domains like .com or .net.

The search engines only operate geo-targeting in terms of countries right now. The Middle East and the European Union are collections of countries. If you set up a site dedicated to a region, there are no geo-targeting options for you.

One workaround is to select a primary country in that region, perhaps one in which you have offices, and geo-target that country. It’s possible to rank for terms in that primary language in surrounding countries. We see this all the time with Canada and the US. If the content is relevant to the searcher, it’s possible to rank no matter where the searcher is.

Example: If you’re anywhere other than the UK, Google “fancy dress” — you see UK sites, right? At least in the US, “fancy dress” is not a term we use, so the most relevant content is shown. I can’t think of a good Canadian/US term, but I guarantee there are some out there!

URL selections

The first thing to determine in geo-targeting, beyond the target countries, is URL structure. This is immensely important because once you choose a structure, every country expansion should follow it. Changing URL structure later is difficult and costly in terms of short-term organic traffic.

In order of my professional preference, your choices are:

  1. Subfolders. As with the language/translation option, this is my preferred setup, as it utilizes the same domain and subdomain across the board. This translates to utilizing some of the power you already built with other country-focused areas (or the initial site). This setup works well for adding different translations within one country (hybrid approach) down the line.
    Note: If you go with subfolders on both, always lead with the country, then language down the line.

    Example:
    www.domain.com/us/es (US-focused, in Spanish language) or www.domain.com/ca/fr (Canada-focused, in Canadian French).
  2. ccTLDs. This is the strongest signal that you’re focusing your content on a specific country. They geo-target automatically (one less step!), but that has a downside as well. If you started with a ccTLD and expanded later, you can’t geo-target a subfolder within a ccTLD at this point in time.
    Example: www.domain.ca/us will not work to target the US. The target will remain Canada. It might rank in the US, depending on the term competition and relevance, but you can’t technically geo-target the /us subfolder within the Canadian ccTLD.
  3. Subdomains. My last choice, because while you’re still on the same root domain, there’s that old SEO part of me that thinks a subdomain loses some equity from the main domain. BUT, if your tech team prefers this, there’s nothing wrong with using a subdomain to geo-target. You’ll need to claim each subdomain in Search Console and Bing Webmaster Tools and set the geo-target for each, just as you would with subfolders.
    Example: gb.domain.com

Content changes

The biggest question asked when someone embarks on country-targeting expansion is: “How much does my content need to change to not be duplicated?” In short — there is no magic number. No metric. There isn’t a number of sentences or a percentage. How much your content needs to change per country site or subsite is entirely up to your target market and your business.

You’ll need to do research into your new target market to determine how your content should change to meet their needs. There are a number of ways you might change your content to target a new country. The most common are:

Product differentiation

If you offer a different set of products or services to different countries, by removing those that are not in demand, are outlawed, or are otherwise unwanted, or by adding new products specifically for that country, you are changing your site content.

Example #1: Amazon sells the movie “Elf” in the US and the UK, but they are different products. DVDs in Europe are coded for Europe and might not play on US players.

Example #2: Imagine you’re a drugstore in the UK and want to expand to the US. One of your products, 2.5% Selenium Sulphide, is not approved for use in the US. This is one among hundreds or thousands of products that are different.

Naming schema

The meaning of product names can change in different countries. How a specific region terms a product or service can change as well, making it necessary to change your product or service naming schema.

Keyword usage

Like the above, the words you use to describe your products or services might change in a new country. This can look like translation, but if it’s the change of just a few terms, it’s not considered full translation. There’s a fine line between these two things. If you realize that the only thing you’re changing is the wording between US and UK English, for example, you might not need to geo-target at all, and can instead mark the different pages as translations.

Keyword use change example: “Mum” versus “Mom” or “Mother” when it comes to Happy Mother’s Day cards. You need to offer different cards in this and other categories because of the country change. This is more than a word change, so it’s a case of geo-targeting — not just translation.

Translation change example: Etsy.com. Down at the bottom of the page, you can change your language setting. I set mine to UK English, and words like “favourite” started to show up. If this sounds like what you would need to do and your content would not change otherwise (Etsy shows all content to all users regardless of their location), consider translation only.

Pricing structure

One of the most common things that changes in country-specific content is pricing. There’s the issue of different currency, but beyond that, different countries have different supply-and-demand markets that should, and will, change your pricing structure.

Imagery changes

When dealing with different cultures, sometimes you find the need to change your site imagery. If you’ve never explored psychology, I highly recommend checking out The Web Psychologist – Nathalie Nahai and some of her talks. Understanding your new target market’s culture is imperative to marketing effectively.

Example: Samsung changes the images on their UK versus China sites to change the focus from an individualistic to a collectivistic culture. See my presentation at SearchLove San Diego for more examples.

Laws, rules, and regulations

One of the most important ways to change your content is to satisfy local laws and regulations. This depends on each business: you might deal with tons of regulations, while others deal with none. Check out the biggest local competitors you can identify to see what you might need to do.

Example: If you move into the UK and set cookies on your visitor’s machine, you have to alert them to the use of cookies. This is not a law in the US and is easily missed.

User experience and IP redirects

When people start moving into other countries, one of the things they want to ensure is that users get to the right content. This is especially important when products change and the purchase of an incorrect product would cause issues for the user, or the product isn’t available to them. Your customer service, user experience, or legal team is going to ask that you redirect users to the correct country. Everyone gets to the right place and the headaches lessen.

There isn’t anything wrong with asking a user to select the country they reside in and set a cookie, but many people don’t want to bother their users. Therefore, they detect the user’s IP address and then force a redirect from there. There are two problems with this setup.

  1. IP addresses are inaccurate – I was in Seattle, WA once and my IP had me in Washington, DC. No kidding. Look at that distance on a map. Think about that distance in terms of Europe and how much might change there.
  2. Google crawls from California – For the time being, using an IP-based forced redirect will ensure your international content is not indexed. Google will only ever see the US content if you do a forced redirect.

You can deal with this by detecting the country using the IP address (or, for organic traffic, which version of Google the visitor came from) and using a JavaScript popup to ask what their preferred country is, then setting a cookie with that preference. Even if the user clicks on another country’s content in the future, they will be redirected to their own.
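As a rough sketch of that decision logic (function and parameter names are my own assumptions, not from the post): honor a stored cookie preference if one exists, and otherwise only *suggest* a country based on IP, rather than force-redirecting:

```python
def country_action(cookie_country, ip_country, site_countries, default="us"):
    """Decide how to route a visitor without a forced IP redirect.

    Returns ("serve", country) when the visitor has a saved preference,
    or ("ask", guess) to show a JavaScript popup asking the visitor to
    confirm the guessed country (after which a cookie is set).
    """
    if cookie_country in site_countries:
        return ("serve", cookie_country)  # honor the saved choice
    guess = ip_country if ip_country in site_countries else default
    return ("ask", guess)  # confirm via popup; never silently redirect
```

Because the bot never hits the popup's redirect path, Googlebot (crawling from a US IP) can still reach and index every country's content.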

No hreflang??

If you went through that tool, you noticed that my geo-targeting plan does not include hreflang. Many other people disagree with me on this point, saying that the more signals you can send, the better.

Before I get into why I don’t recommend setting up hreflang between country targeted sub-sites, let me make one thing clear. Setting up hreflang will not hurt your site if you are really focusing on country targeting and it’s not that intricate of a setup yet (more on that later). Let’s say you’re in Canada and want to open a US-targeted site. Your content changes because your products change, your prices change, your shipping info changes. You create domain.com/us and geo-target it to the US. You can add hreflang between each page that is the same between the two sub-sites — two products that exist in both locations, for example. The hreflang will not hurt.

Example: If you don’t have the resources to change your content at the moment to fully target the UK, only translate your content a bit between your US (domain.com) and UK (domain.co.uk), and have plans to change your content down the road, an hreflang tag between those two ccTLDs can help Google understand the content change and who you’re targeting.

Why I don’t recommend hreflang for geo-targeting only

Hreflang was meant to help Google understand when two pages are exactly the same, but translated. It works much like a canonical tag (which is why using another canonical can be detrimental to the hreflang working) in which you have multiple versions of one page with slight changes.

Many people get confused because there’s the ability to use country codes in the hreflang tags. This is for when you need to tell Google of a dialect change. An example would be if you have two sub-sites that are identical, but the American English has been changed to British English. It’s not meant to inform Google that content that’s targeted at a different country is targeted at that country.

When I recommend geo-targeting only, I make it very clear to clients that going down this route means you really need to change the content. International business is so much more than just translation. Translating content only might hurt your conversion rates if you miss some aspect of the new target market.

Hiring content writers in that country that understand the nuances is very important. I worked for a British company for 4 years, so I get some of the differences, but things continually surprise me still. I would never feel comfortable as an American writing content for a British audience.

I also don’t recommend hreflang in most geo-targeting cases, because the use of geo-targeting and hreflang can get really confusing. This has led to incorrect hreflang tags in the past that have wreaked havoc on Google’s understanding of the site structure.

Example: A business starts off with a Canadian domain (domain.ca) and a France domain (domain.fr). They use hreflang between the English for Canada and French for France using the code below. They then add a US site and the code is modified to add a line for the US content.

<link rel="alternate" hreflang="en" href="http://domain.ca/" />
<link rel="alternate" hreflang="fr" href="http://domain.fr/" />
<link rel="alternate" hreflang="en-us" href="http://domain.com/" />

This looks odd because there is one English-language page with no regional modifications that is on a Canadian-targeted domain. There is a US regional English dialect version on a general top-level domain (as .com is general and is not US-specific, but people use it that way).

Remember, this is a bot that’s trying to logic out a structure. For a user that prefers UK English, there is no logical choice. The general English is a Canadian site and the general TLD is in US English. This is where we get some of the inconsistencies with international targeting.

You might be saying things like “That would never happen!” and “They should have changed the first English to Canadian English (en-ca)!”, but if you’ve ever dealt with hurried developers (they really do have at least 50 requests at once sometimes) you’ll know that they, like search bots, prefer consistency.

Hreflang should not be needed in geo-targeting cases because, if you’re really going to target a new country-specific market, you should treat them as a whole new market and create content just for them. If you can’t, or don’t think it’s needed, then providing language translations is probably all you need to do at the moment. And hreflang in geo-targeting cases can cause confusion with code that might confuse the search engines. The less we can confuse them, the better the results are!

Hybrid targeting

Finally, there is the route I call “hybrid,” or utilizing both geo-targeting and translation. This is what most major retail corporations should be doing if they’re international. Due to laws, currency, market changes, and cultural changes, there is a big need for geo-targeted content. But in addition to that, there are countries that require multiple language versions. There might be anywhere from one to a few hundred used languages in a single country! Here are the top countries that use the web and how many recognized languages are used in each.


Do you need to translate into all 31 languages used in the US? Probably not. But if 50% of your target market in Canada prefers Canadian French as their primary language, the translation investment might be a good one.

In cases where a geo-targeted site (ccTLD use) or sub-site (subdomain or subfolder) needs more than one language, then there is the need to geo-target the site or sub-site and then use hreflang within that country-specific site.

This statement can be confusing, so here’s what I mean: you would geo-target domain.com/ca to Canada in Search Console, then use hreflang tags between domain.com/ca/en and domain.com/ca/fr within that Canadian section.
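For instance, assuming the subfolder/subfolder structure recommended below (with hypothetical URLs), the on-page tags inside the geo-targeted Canadian section would look like this:

```html
<!-- On www.domain.com/ca/en/ and www.domain.com/ca/fr/; the whole /ca/
     subfolder is geo-targeted to Canada in Google Search Console -->
<link rel="alternate" hreflang="en-ca" href="http://www.domain.com/ca/en/" />
<link rel="alternate" hreflang="fr-ca" href="http://www.domain.com/ca/fr/" />
```

The geo-targeting handles the country; the hreflang tags only disambiguate the languages within that one country-specific section.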

This requires a good amount of planning and resources, so if you need to embark on this path in the future, start setting up the structure now. If you need to go the hybrid route, I recommend the following URL structures for language and country targeting. As with before, these are in order of my professional preference and are all focused on content targeted to Canada in Canadian French.

(Country structure/Language structure)

  1. Subfolder/Subfolder
    Example: domain.com/ca/fr
  2. Subfolder/Parameter
    Example: http://ift.tt/1RwM5tE
  3. ccTLD/Subfolder
    Example: domain.ca/fr
  4. ccTLD/Parameter
    Example: http://ift.tt/1qd0swh
  5. Subdomain/Subfolder
    Example: ca.domain.com/fr
  6. Subdomain/Parameter
    Example: http://ift.tt/1RwM5JS
  7. ccTLD/Subdomain (not recommended, nor are the other combinations I intentionally left out)
    Example: fr.domain.ca

The hybrid option is where the hreflang setup can get the most messed up. Make sure you have mapped everything out before implementing, and ensure you’re considering future business plans as well.

I hope this helps clear up some of the confusion around international expansion. It really is specific to each individual business, so take the time to plan and happy expansion!

Troubleshooting International SEO: A flowchart



Are Keywords Really Dead? An Experiment

Posted by sam.nemzer

[Estimated read time: 6 minutes]

A quantitative analysis of the claim that topics are more important than keywords.

What’s more important: topics or keywords? This has been a major discussion point in SEO recently, nowhere more so than here on the Moz blog. Rand has given two Whiteboard Fridays in the last two months, and Moz’s new Related Topics feature in Moz Pro aims to help you to optimize your site for topics as well as keywords.

The idea under discussion is that, since the Hummingbird algorithm update in 2013, Google is getting really good at understanding natural language. So much so, in fact, that it’s now able to identify similar terms, making it less important to worry about minor changes in the wording of your content in order to target specific keyword phrases. People are arguing that it’s more important to think about the concepts that Google will interpret, regardless of word choice.

While I agree that this is the direction that we’re heading, I wanted to see how true this is now, in the present. So I designed an experiment.

The experiment

The question I wanted to answer was: “Do searches within the same topic (but with different keyword phrases) give the same result?” To this end, I put together 10 groups of 10 keywords each, with each group’s keywords signifying (as closely as possible) the same concept. These keywords were selected in order to represent a range of search volume, and across the spectrum of informational to transactional. For example, one group of keywords are all synonymous with the phrase “cheapest flight times” (not-so-subtly lifted from Rand’s Whiteboard Friday):

  • cheapest flight times
  • cheapest time for flights
  • cheapest times to fly
  • cheap times for flights
  • cheap times to fly
  • fly at cheap times
  • time of cheapest flights
  • what time of day are flights cheapest
  • what time of day to fly cheaply
  • when are flights cheapest

I put the sample of 100 keywords through a rank-tracking tool, and extracted the top ten organic results for each keyword.

Then, for each keyword group, I measured two things.

  1. The similarity of each topic’s SERPs, by position.
    • For example, if every keyword within a group has the same page ranking no. 2, that result will score 10. If 9 results are the same and one is different, nine results will get a score of 9, and the other will score 1.
    • This score is then averaged across all 100 (10 results * 10 keywords) results within each topic. The highest possible score (every SERP identical) is 10, the lowest possible (every result different) is 1.
  2. The similarity of each topic’s SERPs, by all pages that rank (irrespective of position).
    • As above, but scoring each keyword’s results by the number of other keywords that contain that result anywhere in the top 10 results. If a result appears in the top 10 for all keywords in a topic group, it scores a 10, even if the results in the other keywords’ SERPs are in different positions.
    • Again, the score is averaged across all results in each topic, with 10 being the highest possible and 1 the lowest.
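The two scoring schemes above can be sketched in a few lines of Python. This is an illustrative reimplementation of the scoring as described, not the study's actual code; `serps` is assumed to be one list of top-10 ranking URLs per keyword in a topic group.

```python
def similarity_scores(serps):
    """Score SERP similarity within one topic group.

    serps: one ordered list of top-10 ranking URLs per keyword.
    Returns (by_position, all_pages), each averaged over every
    result in the group, on a scale of 1 (all different) to
    len(serps) (all identical). Counts include the result's own
    SERP, so a URL shared by all 10 keywords scores 10.
    """
    by_position, all_pages, count = 0, 0, 0
    for serp in serps:
        for pos, url in enumerate(serp):
            # By position: keywords ranking this URL at this exact position
            by_position += sum(1 for s in serps if pos < len(s) and s[pos] == url)
            # All pages: keywords ranking this URL anywhere in their top 10
            all_pages += sum(1 for s in serps if url in s)
            count += 1
    return by_position / count, all_pages / count
```

With three identical SERPs both scores come out at the maximum (3); reordering one SERP leaves the "all pages" score at the maximum while the "by position" score drops.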

Results

The full analysis and results can be seen in this Google Sheet.

This chart shows the results of the experiment for the 10 topic groups. The blue bars represent the by position score, averaged across each topic group, and the red bars show the average all pages score.

The most striking thing about this is the wide range of results that can be seen. Topic group D’s SERPs are 100% identical if you don’t take ordering into account, whereas group J has only a 38% crossover of results between keywords.

We can see from this that targeting individual keywords is definitely not a thing of the past. For most of the topic groups, the pages that rank in the top 10 show little consistency across different wordings of the same concept. From this we can assume that the primary thing making one page rank where another does not is matching exact keywords.

Why is there such variation?

If we look into what factors might be affecting the varying similarities between the different topic groups, we could consider the following factors:

  • Searcher intent: Informational (Know) vs Transactional (Do) topics.
  • Topics with high competition levels.

Searcher intent

Although Google’s categorisation of searches into “do,” “know,” and “go” can be seen as a false trichotomy, it can still be useful as a simple model for classifying searcher intent. All of the keyword groups I used can be classed as either informational or transactional.

If we break up our topic groups in this way, we can see the following:

As you can see, there’s no clear difference between the two types. In fact the highest and lowest groups (D and J) are both transactional.

This means that we can’t say — based on this data, at least — that there’s any link between the search intent of a topic and whether you should focus on topics over keywords.

Keyword difficulty

Another factor that could be correlated with similarity of SERPs is keyword difficulty. As measured by Moz’s keyword difficulty tool, this is a proxy for how strong the sites that rank in a SERP are, based on their Page Authority and Domain Authority.

My hypothesis here is that, for searches where there are a lot of well-established, high-DA sites ranking, there will be less variation between similar keywords. If this is the case, we would expect to see a positive correlation in the data.

This is not borne out by the data; if anything, the opposite holds: the higher the keyword difficulty across the keywords in a topic group, the less similar the SERPs within that group. This correlation is fairly weak (R² = 0.28), so we can’t draw any firm conclusions from it.
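For completeness, here is one way such an R² figure can be computed from the two per-group series (average difficulty vs. similarity score). This is a generic sketch, not the study's spreadsheet formula, and the example values below are made up purely for illustration:

```python
def r_squared(xs, ys):
    """Square of Pearson's r: the proportion of variance in ys
    explained by a simple linear fit on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov * cov / (var_x * var_y)

# Hypothetical per-group values (NOT the study's data):
# average keyword difficulty vs. "all pages" similarity score
difficulty = [38, 42, 45, 51, 55, 58, 61, 64, 70, 74]
similarity = [7.8, 8.1, 6.9, 7.2, 6.0, 6.4, 5.1, 5.6, 4.2, 4.5]
```

Note that R² alone hides the direction of the relationship; you'd check the sign of the covariance (or of Pearson's r) to confirm the correlation is negative, as observed here.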

One other factor that could explain the lack of pattern in this result is that 100 keywords in 10 groups is a fairly small sample size, and is subject to variation in the selection of keywords to go into each group. It is impossible to perfectly control how “close” in definition the keywords in each group are.

Also, it may just be the case that Google simply understands some concepts better than others. This would mean it can see some synonyms as being very closely related, whereas for others it’s still perplexed by the variations, so looks for specific words within the content of each page.

Conclusion

So does this mean that we should or shouldn’t ignore Rand when he tells us to forget about keywords and focus on topics? Somewhat unsatisfyingly, the answer is a strong “maybe.”

While for some search topics there’s a lot of variation based on the exact wording of the keywords, for others we can see that Google understands what users mean when they search and sees variations as equivalent. The key takeaway from this? Both keywords and topics are important.

You should still do keyword research. Keyword research is always going to be essential. But you should also consider the bigger picture, and as more tools that allow you to use natural language processing become available, take advantage of that to understand the overall topics you should write about, too.

It may be a useful exercise to carry out this type of analysis within your own vertical, and see how well Google can tell apart the similar keywords you want to target. You can then use this to inform how exact your targeting should be.

Let me know what you think, and if you have any questions, in the comments.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

The Brand as Publisher Masterplan – Reinventing Content Marketing for the Next Decade

Posted by SimonPenson

[Estimated read time: 20 minutes]

Introduction and background

The how-to process

Setting up your team

Free downloads and help guide


Content marketing has an image problem.

Like all potentially transformational opportunities, the world sees something glistening and jumps in head first to claim a piece of the next “goldmine.”

The ensuing digital gold rush creates a stampede to be first rather than best, and strategic thinking is usually replaced with a brain-out approach to delivery.

And in the case of the content marketing revolution, the result has been an outpouring of disconnected content that adds little value and serves very few, leaving many with nothing more than a handful of “failed” content campaigns to show for the effort.

It’s something I see every day, and it is incredibly saddening.

Content marketing, you see, is not the answer in itself. It’s simply part of a much broader strategic picture that I call the “Brand as Publisher” play; a reset of the core principles behind the content marketing charge.

This piece is designed to explain precisely how you can take the “Brand as Publisher” approach, what it is, and how it can help your business succeed with content.

I’ve even created a unique Brand as Publisher Toolkit for you to download to help in that quest! Click the banner below (or at the bottom of the post) to grab a copy.

Defining the opportunity

So, what exactly is “Brand as Publisher?” Put simply, it’s changing your mindset to put content at the forefront of your business, almost before the products or services that you sell.

As controversial as that may seem, the idea is that you’re able to build an engaged, loyal audience of value for your brand… an audience you can then monetize later.

It’s a long-term play, without doubt, and one that requires consistent investment of time, resources, and cold hard cash — but it creates “cast-in-granite” value that your competitors will find impossible to steal away from you.

Those who take the time and effort to do it before you will beat you in the long run, and there will be little you can do about it!

Changing mindsets

Now, before you close your browser, let me add a dose of reality. The suggestion is NOT that you start a magazine or newspaper for a living, but instead take the value from that business model and leverage it for your gain.

In many ways, you must start to…


“Imagine yourself as THE leading consumer magazine for your market.”


The easiest way to do that is to imagine your business as a magazine, as the leading publication for your specialist market and THE place anyone with even the slightest interest in your area of expertise goes to expand their knowledge.

Think about it for a second. In the same way that newspapers and magazines create “value” by sharing quality content on their specialist area and then building an audience around it, so can you.

Where they then monetize that audience by selling ad space, you may do the same by selling related products or services, or capturing leads.

The ability to create what I call “target audiences of value” in this way is how value has always been created. And with those eyeballs now focused online more than ever before, there has never been a better time to capture that attention.

The challenge is that few understand how to make content work long-term. While many brands (and agencies, for that matter) make a song and dance about delivering amazing campaigns, there is a very real need to get back to the basics and build, not just a campaign plan, but a longer-term brand content plan.

This excellent piece for Adage does a great job of arguing why we really do now need to focus on “proper content strategy” and not just on delivering content, particularly from agencies.

Recreating it online

This post is designed to share the secrets honed by the magazine industry over the last six decades; to share the principles that will maximize your chance of success with a content-led strategy.

To make that more digestible, the approach is broken down into a series of integral “pillars,” the first of which focuses on audience insight.

Pillar One: Audience understanding

This process starts and ends with people: a clear understanding of who your audience already is and, critically, who you want to consume your content.

Traditionally, the process of gathering insight would have been carried out by running reader focus groups, an often fascinating series of meetings with existing readers and those who currently don’t purchase but are very much “in the market.”

It’s a process I ran as editor of a British specialist car magazine called Max Power, visiting six different locations across the country to meet between four and twelve existing and potential readers.

Those candidates were selected by our own subscriptions team and from the wider industry events we attended on a regular basis in order to “stay close to the audience.”

A budget then allowed us to work with a professional research agency to run structured Q&A sessions with them. In reality, however, you can run the same session at a bar, provided you prepare the right questions beforehand.

Every business will have different insight needs. One way of determining which questions to ask is to first capture the key outputs you wish to come away with:


1. Who is currently buying your product or service?

2. Why are other people not buying it?

3. What general trends are affecting these people’s lives at the moment?

4. Where would people buy your product or service from?

5. When, where, and how would they use or consume it?

6. Why would they buy it? What need do they want to satisfy?

7. Who is your real competition?

8. What image do people have of your brand vs. your competitors?

9. What do they think about the different aspects of your product or service (name, packaging, features, advertising, pricing…)?

10. What improvements could be made to your product or service to meet people’s needs even better?

11. What is the single most important benefit your brand should be seen to be offering?

12. How can you best communicate that benefit to the people you’re interested in attracting?

13. What is the right price to charge?

14. What other new products or services could your brand offer people?


Questions can then be crafted to capture that information easily, and you’ll go to those research meetings armed and ready.

Digital insight

That real-world data can be further improved with the addition of digital insight. I have written several times about my process for extracting useful customer information from Facebook and also how you can use paid-for tools such as Global Web Index to form an understanding of how your audience interacts with your brand and wider market.

Combining the qualitative information you collect in the focus group meetings with the quantitative data you can access digitally will allow you to create data-informed personas for your brand as publisher strategy.

Pillar Two: Personas

Having a clear view on who you wish to target helps steer and shape everything you do editorially. If I rewind back to those Max Power days, we went as far as painting those personas clearly on meeting room walls and in the main office so we were constantly reminded of whom we were there to work for.

How you pull personas together is the subject of a lengthy post in its own right, but there are guides like these that will help you do just that:

The point is to put a human face on the data you have bundled into audience segments. Doing so enables not just the team pulling the information together, but also the wider business, to understand who those people are and how their needs differ.

It is also a very good idea, as I’ve written previously, to try to align those personas to celebrities. This really lifts each persona into a living, breathing character that everyone can understand in much greater detail. We all know, for instance, how Beyoncé talks, holds herself, and may be portrayed attitudinally.

Pillar Three: Editorial mission statement

With the audience piece complete, the next stage is to then create your “Editorial/Content Mission Statement”: the crystallisation of your content value and objectives.

Any good content team will have this burnt into their retinas, such is the importance of having a statement that outlines what you stand for. This is your guiding light when creating content, focusing on who your audience is and how you’ll serve them. It should be the measuring stick by which you evaluate all of your content.

A great example of this done well can be found hidden within the wider brand documentation for a brand like Sports Illustrated:


“Sports Illustrated covers the people, passions and issues of numerous sports with the journalistic integrity that has made it the conscience of all sport. It is surprising, engaging, and informative, and always with a point of view that puts its readers ‘in the game.'”


It’s a good example because it succinctly captures all the key focal points for the brand. Below we can see how they have managed to cover the key pillars in their editorial strategy:

Our positioning on Max Power was also captured in a similar mission statement, succinctly defined as:

“The definitive guide to arsing [sic] around in cars.”

Editorially, we ensured we injected “attitude,” “fun,” and “entertainment” into every issue, while also maintaining our stance as “experts” and “trend setters” in what was a fast-moving youth market.

Pillar Four: Content flow

We knew that by staying close to our audience, we would continue to lead the market. But we also knew that as we covered a wider audience of car enthusiasts, we needed to ensure the publication reflected that readership.

This meant thinking very hard about the “flow” of the magazine; what mix of content we included and how it was delivered over time.

Content flow is a process I have written about previously here, but it’s worth covering again, such is its importance. Getting it right is the difference between campaign delivery and truly connected content strategy.

The basis of flow is having the right mix of content to deliver from page-to-page, or day-to-day in the case of digital.

The best way of doing this is via a process known as “flatplanning,” a print publishing technique that also lends itself well to digital planning.

Pillar Five: Flatplanning and regulars (reinventing the wheel)

So, how does flatplanning work?

The concept is a very simple one for print publications: you recreate the pages you wish to fill with lovely content schematically in a document that looks a little like the one below.

You’ll see that I have started populating this to give you an idea of how it worked in the Max Power example we’ve been using.

Above, you’ll see how each element, or content idea, has been added to the plan. Doing it this way makes it very easy to visualize how the strategy ebbs and flows in terms of the variation of pace afforded by the different types of content you include.

Take, for instance, the first couple of pages here. You can quickly see that we kick-start with some shorter-form, faster-paced news on pages 1–12, then change pace and move into a four-page longer-form piece on pages 13–16, before going back to a two-page piece.

Obviously, in the print world the ONLY variation you can play with is length of article and style of writing, but when it comes to digital the opportunities are endless.

Flow

In the online world you have a plethora of media types to play with to add extra zing to your content strategy. The key to getting the “flow” correct is to use this flatplan technique, with the pages being hours, days, weeks, or whatever other measurement of time is relevant for your plan.

We often refer to the brilliant Smart Insights content matrix as part of this content type planning process. You can see below that it includes all of the key content types and adds insight into which part of the customer purchase and intent cycle they sit.

I’ve created a new resource to help further with this process, based on the same principles. The Content Flow Matrix helps you understand which content types to use based not only on where they may sit within the purchase funnel (upright axis) but also the relative “size” of the content.

By choosing a mix of content types AND a mix of content “sizes,” you end up with the right mix of variation to ensure your content audience remains engaged and that they come back for more.

Pillar Six: Front cover insight

But while variation is a great thing, it’s also very important to make clear what the cornerstones of that strategy are, and to consistently and clearly reinforce and deliver that for your audience.

The way this works in print is to utilize the cover “sells” to deliver consistent messaging.

One of the very best exponents of this is Men’s Health magazine, a media brand that very much understands its readership and where and how it can add value.

Below you can see a randomly selected front cover highlighting what I call the “Editorial Pillars” of the brand — the cornerstones of its strategy.

Every single month, the cover will feature content that offers to help you improve your mind, body, or sexual performance:

Digital content strategy requires the same focus. Part of your overall strategic planning process should include a session to establish what those pillars are for your brand.

Below, you can see how a template front cover may look, complete with spaces for your editorial pillar planning. I have also included a copy of this in the free Brand as Publisher Toolkit download bundle created specifically for this piece to help you build your own strategy.

What might that look like for my agency, Zazzle Media? Here’s a fun example created by our designers to give you an idea of how it might be pulled together:

Getting it right will mean greater engagement, more return visitors, and more sharing of and linking to your content.

Marketing and incentivising purchase

So, with a clear proposition and great content delivered with variation and clear messaging, you’re ready to roll, right?

Almost, but not quite. Often the key difference between a magazine being successful and just being “OK” was the quality of its marketing strategy.

If anything, this is even truer in the digital sphere. Thinking about how you reach your audience is what this blog is all about.

The challenge, digitally, is that while access to your audience is faster and easier, it means that the barriers to entry that protected traditional media for so long are no longer there. And that means competition, and lots of it.

In print, the only truly effective ways of growing market share were to improve distribution (be in more stores), optimize your position on the newsstand (be more visible), or invest in gifting (giving away free stuff on the cover).

These strategies translate nicely to digital in the following ways:

  • Ensure you have a strategy for all relevant channels (social, search, influencer channels) to maximize reach.

  • Optimise all channels to maximize effectiveness (SEO is especially important here).

  • Incentivize. This will look different for all businesses; for example, in ecommerce this may be money-off codes.

One final area of investment for publisher brands is live events. This is, again, a cornerstone of a brilliant brand as publisher play.

For those dealing with specialist markets (and that is exactly what we all do online), it has always been absolutely critical to stay close to the audience. One of the very best ways of doing that is to create, and run, semi-regular live events.

What they look like is completely dependent upon what market you’re in, but if you truly understand your customers, you’ll almost always be able to add some experiential value.

For some, it may not be possible to do this in person. Where this is the case, regular Hangouts and/or webinars can fill the void.

Max Power was always famous for running regular meets throughout the UK for people to bring (and show off) their cars. It was a forum that kept us connected to the loves and hates of the market, and allowed us to establish strong relationships with the key influencers amongst them.

Events can be seen as the icing on the cake to many, but in reality they are one of the most important slices of the marketing cake.

Pillar Seven: The long tail

Another underestimated area of opportunity can be found within your regular content: the pieces you put out every day that serve to stick together your bigger-bang campaign content.

In magazines, these pieces help create variation of pace as you turn the page. In the digital world, they can do much more.

Designing your long tail strategy in a way that takes advantage of long tail search opportunity is something I have covered as a standalone subject here at Moz previously, and I’d urge you to read the post to get the most out of your idea planning.

The added bonus now is also taking advantage of Google Answer Boxes.

By designing regular content to answer key questions that your audience is asking (I use Answer the Public and a keyword tool like Keyword Studio to help me understand this), you’re not only adding value to regular visitors’ lives — you’re also creating the opportunity to jump to the top of the SERPs.

Claiming those boxes requires real focus on article structure and good use of headlines, as this amazing study by Razvan at Cognitive SEO explains. If you achieve it, in our experience you can expect to see a 15% increase in traffic from that keyword, versus even being first in the normal SERPs.

Outside of the “content-for-long-tail-search” opportunity, regular content also serves to provide interaction opportunity. Using those “regular” slots to run polls, quizzes, more brand-led pieces and so on will enable you to not just provide variation but also improve brand understanding, resonance, and reach.

Pillar Eight: “Big Bang” content

The campaign content end of the spectrum is where most content strategists concentrate. This is a mistake. While Big Bang pieces can undoubtedly provide greater reach and attract more links, they alone do not constitute a strategy.

That said, they certainly provide value; like a magazine made up only of short-form content, a strategy without them loses readers quickly.

The “features” in a magazine — those articles that span four+ pages — are the print Big Bang equivalent. They often fit within those brand as publisher pillars we discussed earlier.

For Max Power, these would often take the form of a car road test, road trip, or interview — but in digital, the world is your oyster.

For instance, to give you a taste of what that means, we’ve recently produced content campaigns ranging from a vegetable cookbook to a supermarket shopping challenge to the Classroom of the Future.

Content types that lend themselves to Big Bang campaigns include:

  • Tools

  • Games

  • Data visualizations

  • Guides

  • Surveys and reports

  • Video

The key, once again, is ensuring there’s variation, even in your Big Bang output. So many brands will find a hit with one type and then stick with it, but that’s missing the point.

As with every part of your strategy, variation will always win. That’s how you stand out from the crowd in the long run.

Pillar Nine: Team structure and resources

Creating this variation is not an easy task. It requires a greater focus than ever on available skill sets.

You may think that what’s needed now from a team perspective is much more demanding than it was before, but that view isn’t necessarily correct.

To give you a view on what it took to pull together an issue of Max Power, we employed the following. I also explain, briefly, their role within the whole:

  • Editor – Responsible for the overall positioning and editorial strategy. Takes a longer term view to issue planning and liaises with commercial and publishing teams to maximize sales opportunities and sales. Works closely with all.
  • Publisher – Commercial-focused P&L owner responsible for distribution deals, production costs, and sales (the number of magazines sold AND ad revenue from it).
  • Deputy editor – Day-to-day ownership of the flatplan. Ensures content is delivered on time and to standard. The editor’s right-hand man/woman.
  • Production editor – Responsible for ensuring everything is produced on time. Liaises with the printers to ensure production standards are upheld.
  • Art editor – Leads the design team and is responsible for upholding design rules and the adoption of brand values throughout.
  • Designers – Layout and design all pages, and will artistically direct shoots to ensure that the design vision for individual features is carried through.
  • News/Features/Section editors – Lead a mini-team of specialist writers and are responsible for their output and the quality of their sections.
  • Writers – On-the-ground journalists who are out and about more than they’re in-office working on the individual articles and features.
  • Photographer – More often has a focus on photos, but may also have video skills.
  • Web team – In the early days of the net, this team ran separately to the “main” print team and often reconstituted print content for the web, ran communities, etc.
  • Advertising team – Responsible for selling all advertising space in the magazine (a key way of monetizing the audience).
  • Production team – Produce the adverts that the advertising team sells and supply them to the design team.

As you can clearly see, the cost of a great editorial product has always been high — that will never change — but the value it creates will outweigh the cost if you get the strategy right.

The big question, of course, is what should the right digital version of this team look like?

This is something I have spent a great deal of time looking at in my current role; here’s a view on what a small, medium, or large business could base a setup on. Obviously this looks different for everyone, as different markets demand different areas of focus, but it can be a starting point for discussion:

Small business

In this scenario, we’re looking for multidiscipline people. Ideally, your journalist will be able to write and PR their own work, leaving you with the possibility of also including someone to focus on paid promotion across search, social, and native.

As with all of these example team structures, the MD/CEO of the business should own the brand as publisher plan, bringing it to the very centre of focus for the business. In larger businesses, that may ultimately be taken on by the CMO, but in any business of hundreds of people or less this needs to have priority focus.

In a small team the focus has to start with owned and earned media, hence the balance of people here. With a writer and designer you can create lots of different types of content, while the PR person focuses on building key relationships and leveraging those connected audiences.

Medium enterprise

In a slightly larger organization with more budget to play with, things start to get much more interesting as roles become more specialized.

In this model (and read each specialty as being scalable with multiple people in each of those teams) we can create more variation. Video and data start to creep in, allowing you to not only create a wider range of content, but also understand who your audience is, where they are online, and what they consume right now.

Interestingly, we find that those who have traditionally sat in SEO roles make for very good data analysts in helping to forge a data-driven strategy, while their abilities in ensuring platforms are still “fit for purpose” means they can fulfill a dual and extremely valuable role.

We then also have the ability to split out PR and blogger relations. That way, there’s focus on both the niche and the big traffic media brands within the distribution plan.

At this level, it’s also critical to have some specialist paid media focus to ensure that the content distribution plan includes a cohesive paid media element.

Large brand

For large-scale enterprises, the sky is the limit! We can go much further, bringing in additional data specialists and expanding the wider CRM play to include specialists dedicated to making the best use of the whole Inbound Marketing Suite.

We also add in multilingual capability, especially important to international brands, as well as other specialties that give more focus to the overall strategy and ensure it’s scalable.

Help is here!

We’ve covered a great deal of ground in this post on a subject matter that asks wider questions of all brands and businesses. To help you on a more practical level to work through it, we’ve created an all-encompassing Brand as Publisher Toolkit. In it, you’ll find:

  • Flatplan template
  • Magazine cover template
  • Content campaign planner
  • Editorial calendar
  • Persona template
  • A copy of our Content Flow Matrix
  • Content Style Planning Guide


How to Create Content That Earns Engagement, Trust, and Loyalty for Your Brand

Posted by ronell-smith

[Estimated read time: 17 minutes]

A couple of years back, I received a call from the CMO of a small but popular and growing startup about taking on the brand as a content strategist. While I was initially lukewarm to the idea, they were adamant about working together, feeling that I “could help them reach their goals.”

Before hanging up the phone, I asked her to email me the main priority for the onsite content:

“Engaging content (e.g., shares, likes, tweets, etc.),” she wrote.

I thought, I can do engaging.

I reasoned I’d stick with how-to information content, in-depth evergreen content, and maybe a few interviews. In the online marketing vertical, these are what I call “can’t miss elements” for brands looking to create onsite engagement.

But not long after I started working with the brand, I saw some problems that should have been red flags from the beginning:

  • The type of content they wanted for the blog didn’t garner traffic
  • The type of content that did garner traffic didn’t garner engagement
  • When I talked to the CMO, her words were equally confusing: “Conversions are up, but we need to see engagement improve to continue the relationship.”

I was confused.

Is there EVER a scenario where increased conversions are a negative?

Shortly thereafter, the relationship dissolved. The culprit wasn’t a lack of engaging content, though.

Engagement, alone, is a poor choice for a goal

This likely sounds familiar to folks reading this post. Maybe someone says, “We have a shiny new website, so now we need to blog.”

The next question is “Who’s going to blog?”

Then, typically, the question after that is “What do we blog about?”

Someone always, and I do mean always, says, “About what we do. You know… stuff that will get folks talking about our brand.”

The next question and answer dooms us: “What’s the goal?”

  • One blog/week
  • To drive people to our website
  • To increase conversions

Inevitably, the main goal for the content itself, though, is engagement.

The biggest problem brands have in the move to content marketing is creating engaging content.

Why do you think that is?

  • Because it’s hard?
  • Because they don’t have writers who can produce it?
  • Because when they do produce it, folks still don’t engage with it?
  • Because they’re marketing to the wrong audience?

Nope!

Creating engaging content is a nice-to-have, first-step goal. But as the client I talked about earlier found out, engagement alone isn’t going to move your brand forward in what is now a sea of content.

Engagement is a goal; it shouldn’t be the goal.

First, engagement simply means people noticed your content and interacted with it in some, typically small, way. That could mean a social share, leaving a comment, sharing a link, etc. And for those of us just starting on the content marketing journey, that’s nothing to sneeze at.

The problem arises when we use engagement as an all-important key performance indicator (KPI) of how our brand’s content is performing.

I think Avinash Kaushik, Google’s digital marketing evangelist, says about all there is to say about engagement with this quote, taken from his blog:

“Even as creating engaging experiences on the web is mandatory, the metric called Engagement is simply an excuse for an unwillingness to sit down and identify why a site exists. An excuse for an unwillingness to identify real metrics that measure if your web presence is productive. An excuse for taking a short cut…”

He goes further, saying the only people who use engagement as a metric are those too lazy to discern the real reason their website exists.

They refuse to ask “Why does it exist?”

So they assign value to something that is all but impossible to measure in a tangible way.

My experience mirrors those comments. Engagement is an easy, feel-good metric used by brands that lack a clear purpose for their content marketing.

My core problem with using engagement as a metric of significance is that it’s hard to measure, next to impossible to sustain, and, worst of all, easy to copy.

In five simple steps, competitors can kill your engagement strategy:

  1. They visit your website and see which content is doing well: the Facebook, Twitter, and Google Plus numbers get them thinking…
  2. They go to Google, do a site: search, and see what your top-performing content is. Then they tell their copywriting staff to take this idea and expound upon it — more details, richer graphics, etc.
  3. Then they use a tool such as Open Site Explorer to view your site’s backlinks, seeing who’s linking to you and what content is getting the most links.
  4. They’ll reach out to those same sites and say, “We see you’re linking to this content. We created a similar post that has even more details.” Those sites are likely to add the competitor’s link, and just as likely to unlink from your content.
  5. Your stellar content piece is likely to take a tumble in the SERPs, and your site will miss out on traffic.

All because you chased the wrong goal.

I’ll add a huge “however” here: If you’re just starting out, OR if all you really, truly care about is creating some potentially engaging content, you can do exactly what we outlined regarding the competition. You find a popular brand in your vertical and copy the content they’re creating, only you make it better: better written, better presented, and you commit to outreach. I can tell you that of all the companies I’ve worked with and for — from mom-and-pop cupcake shops to moving companies, fitness brands, apparel manufacturers, and software companies — this is where the content creation process begins and, sadly, sometimes ends. So copy it. Use it. At least until you get better, see better, and know better what the audience wants.

But never hang your hat singularly on engagement.

What comes easily is just as easily taken.

Brand trust is essential for content marketing success

If engagement is a blind date, trust is going steady. It has to be in place before things get too serious.

In the strictest sense, trust is about how prospects and customers view your brand, how they view the people who represent your brand, what you stand for and how you make them feel.


While asking for prospects to trust your brand this much is definitely pushing it, brand trust is an imperative in today’s online marketplace.

When trust is in place, people come to see your brand as not simply a reliable option, but the reliable option; they feel good about association with it; and, most importantly, they seek out those interactions.

To get there, people need to see your brand and brand representative in lots of places, online and offline, to develop familiarity and form a positive association with the brand. (I call this positive ubiquity.)

That’s why making too big of a deal about onsite content is a mistake. It’s important. But, let’s be honest, if there are only three people reading your blog, your impact is going to be very limited. Wouldn’t you agree?

In addition to writing posts and sharing your brand’s content, you should also be sharing valuable content from other non-competing brands; engaging in meaningful online conversations surrounding your vertical; interviewing influencers in your space; and creating a presence that moves seamlessly between online and offline, social and content, human to human.

The fact of the matter, though, is that people respond best to people. Not words or images or fancy design. And as reluctant as you might be to have public faces for your brand, you need it to make your content marketing efforts work.

People are what lead prospects to build an affinity, not simply an association, with your brand. It’s akin to going from an encounter to being noticed.

Think…

  • Apple and Steve Jobs
  • Allstate Insurance and the Mayhem man
  • Blendtec and its zany CEO (shown below)


Make this work for your brand.

Why not highlight SMEs inside the company?

Instead of simply forcing everyone to blog, find out what individual team members are good at and have a passion for, then allow them to express their creativity for the brand in their own way.

  • Maybe a team member is passionate about radio. Why not have her do a podcast for the site, but also share it via iTunes, SoundCloud, or wherever else it makes sense to share it?
  • Every office has the resident know-it-all. Why not create a Twitter handle and associated hashtag for this person, and allow them to spend 30 minutes a day online answering questions for the brand?
  • Maybe you find that someone hates writing blog posts, but is interested in theatre and would love doing vlogs for the site, as well as posting them on YouTube or Wistia.


And while you’re building that brand affinity, people who aren’t even in the market for your product or service will take note, realizing that your brand cares.

You aren’t out for simply earning a dollar. You’re really helping people, even when those people aren’t likely to buy anything from you.

I know what you’re thinking: “Ronell, who has time or resources for that?”

My answer is, “You don’t have to do any of this. Really, you don’t.”

But I’ll add that if you do at least some of this, consistently, you will be more successful than you likely assume, in large part because most of the competition is unwilling to do it.

Whenever I hear people talking about how difficult it is to find success in content marketing, it reminds me of a quote from one of my favorite strength coaches.

One of his clients said, “Squatting hurts my knees.” After witnessing a demonstration of what the client called a squat, the coach said, “Squats don’t hurt your knees. What you’re doing and calling squats hurts your knees.”

Content marketers are a lot like this, right? We throw ideas at the wall, then call what sticks a success.

We’re better than this.

The path to content marketing success leads to loyalty

Typically, when we set out on this content marketing journey, we, as a team, set these arbitrary goals: We need X number of tweets, X number of Likes and shares on Facebook, Google Plus and so on.

A better way to do it was demonstrated by BuzzFeed.

Yes, that BuzzFeed.

The site might post an inordinate amount of dumb stuff, but it has an amazing data science team. That team studied how content is shared across the web and uncovered some interesting findings.

That research led to what we now know as P.O.U.N.D.: the Process for Optimizing and Understanding Network Diffusion.

We tend to think that a Facebook Like leads to a Facebook Share, which leads to more Facebook Likes and Shares. And a tweet leads to more tweets, etc., etc., for the other social networks.

What they found is network diffusion doesn’t happen in a linear fashion.

Basically, people jump between social networks and links and back again. For example, a Facebook Like might lead to a Facebook Share that leads to a Twitter Share that bounces to a website via a link then back to Facebook as a Like or Share.

This petri dish-looking thing below is really a graphic depiction of network diffusion, where the dark blue areas are Facebook, the light blue areas are Twitter, and the white areas are links.

What BuzzFeed found is that they get links as a byproduct of network diffusion. They don’t need to optimize for links or make link building a focus. The lesson for them, as it should be for us, is that the more they optimize for network diffusion, the more links they’re going to see.

This is not just fascinating; it’s instructive.

Instead of concerning ourselves with link building and outreach and hoping we get links, if we simply optimize our efforts at creating and sharing content, links naturally occur.

Previously, the thinking was to create a piece of content, then build links to it.

But now, with what we know about network diffusion, we’re going to focus on publishing all of our content to the right streams and to the right audience. We’re optimizing for which social streams move the fastest for the specific topic.

As a content marketer, this information should excite you, especially if your team is ready to commit to the right, and best, goal, which is content loyalty.

If that’s not your goal, scrap your goal and adopt this one.

Content loyalty means you aren’t having to work so hard for your content. Your content is working for you.

  • Folks are avid fans, actively seeking out each and every piece of content you create.
  • Instead of you having to carry the load with sharing and promotion, these fans are sharing and promoting like crazy.
  • Instead of worrying about what content to create, your fans, followers, prospects and customers are actively involved helping you via comments on the blog, questions and responses on social media, interactions with the help desk, and sundry other touch points whereby they interact with the brand.

“The shortest path to break through the noise and create a sustainable content strategy is to create content loyalty,” says Moz’s Matthew J. Brown, who is chief of product strategy and design.

It’s difficult but doable.

Parse.ly, an audience insight platform for digital publishers, found that 2.6 days is the median pageview peak for any single piece of content. Pageviews basically fall off a cliff shortly thereafter.

If you get 20% of your traffic from social, things are a little bit better: 3.2 days.

But by and large your window is two to three days.

But the biggest takeaway from their research, which looked at hundreds of sites and billions of pageviews, was that the average site sees only 11% of its visitors returning at least once in a 30-day period.

You heard right: 11%.

That number might sound low, and it is. But it highlights an opportunity.

If you can get that number up to 20%, you’re doing 2X better than the competition.

So how do you get there?

A content marketing playbook

Vulture.com conducted a study with Chartbeat to find what on-page content attributes led to content loyalty. They wanted to figure out what led readers to return to their site.

They found that if they could get their readers to return to the first page of their site 5 times, the readers would be what they term “loyal visitors” of their site, returning frequently to consume information.

In other words, five visits was their core loyalty metric, and the primary starting place for the brand’s content efforts.

They looked at factors ranging from text length to images and the number of ads on the page, and what they found was surprising and illuminating: For them, the key was the amount of text above the fold.

That is, loyal readers expected to consume a certain amount of content above-the-fold. (Click the link above for the details, which are quite interesting.)

Armed with this information, Vulture.com could focus on a targeted attribute that led to their 5X, loyal, readers.

Nothing is stopping you from doing the same.

Making content loyalty work for your brand

Your first step toward content loyalty is to define your goal post (e.g., visits per an allotted amount of time), then optimize for the attributes that lead to that goal.

For your brand, it might be content length or number of ads or GIFs or videos.

The key is to dial in those attributes that are specific to your site, then continue to optimize for them.

You likely have some inkling of what topics help earn loyalty in your vertical, based on popularity and such. Same thing for content types: we know that for many industries, blogs, videos, infographics, and the like are the most shared and most linked-to types of content.

Your brand can do the same, provided you have the heart and the patience to do so.

One of the reasons brands are struggling with content marketing is they aren’t giving it enough time. Create a program, set a plan, and let it run.

It’s not a 90 day thing.

“The sheer majority of brands will continue to crash and burn with their content creation and distribution efforts. Simply put, most brands resist telling a truly differentiated story, and even those that do tell one aren’t consistent or patient enough to build loyal audiences over time,” says Content Marketing Institute founder Joe Pulizzi.

If you’re willing to put in the work, though, you can have success.

The natural starting place is a content audit.

I know many of you cringe upon seeing that word. But you have to start somewhere, and the content audit is the best somewhere.

Besides, before you get started producing content, you need to know what you have and how well it’s performing.

If, like me, you’ve done content audits, you know they can be a time-consuming chore, especially when done from scratch.

Luckily, you don’t have to start from scratch.

Using the template found in Mike King’s deck from Authority Rainmaker, you can get an excellent snapshot of the strongest-performing content on your site. Then you simply aggregate that data to see what’s resonating with your readers, what’s creating that network diffusion for your brand.

For example, you can find the most shares for various types of content, which can help you better discern what types of content you should be creating and sharing more of.

Once you have your content audit in hand, the next step to take, before executing your new content strategy, is to calculate your ROI. This Content Marketing ROI Calculator from Siege Media allows you to plug in the costs associated with creation, along with the links, shares, and loyal visitors you expect to earn, which makes it easier to make the case to your boss or your clients. This is a must-have when you’re trying not only to get buy-in but also to get the time you need to execute your plan.

If your brand is like many of those I’ve worked with in the past, meaning you don’t have a wide base of content from which to pull a great deal of data during the audit, I suggest using a tool like BuzzSumo, a newcomer that has become very popular very fast in content marketing circles.

And for good reason.

It can help you get up and running really fast, and you can learn a great deal about how your content is performing along the way.

BuzzSumo allows you to view the social landscape across myriad topics for the entirety of your competitive landscape.

So, by the time you get started, you can have a complete list of targets and categories to optimize for, even if you don’t have a strong content inventory.

One of the coolest parts about working with Moz — aside from the Roger notepads and pens — is the great people who are always designing and creating tools for us to use, then share with the audience.

For a while now, we’ve been privileged to play with something called One Metric.

Created by our audience and data teams, it allows us to weight social sharing, traffic, links, on-page attention, and reader engagement to create a more organic content score that ensures we’re looking at the entire picture.

Earlier this year, Moz released Moz Content, which is basically One Metric plus 10 and times one million.

With Moz Content, you can crawl your site, then integrate the various bits of information, including content types, your author performance, your social sharing, your links, etc. Even better, you can create, track, and save multiple content audits, making it possible to see how well your content is doing over time, and with ease.

The goal is to make that first step when performing a content audit much easier.

Even better, using the newly created Moz Context API, you’re able to extract the most relevant topics for your site. It can tell you what topics and what keywords are the most relevant for your site and across the web.

This allows you to create a topic inventory for your site.

Let’s say you learn, based on performance, which content types and topics visitors are engaging with most on your site. That way, you don’t have to guess about what content to create.

You can then focus on optimizing for creating and sharing the right content in the right places for the right audience, instead of blindly creating content with the hope that it performs optimally.

Maybe my favorite feature, and the one that I can see many brands using most to position themselves favorably against the competition, is the Content Search feature. It allows you to see topics — your topics — across the web, enabling you to harness information on what’s getting the most shares, what’s gaining social traction, what’s resonating with your audience.

With this view, you’re getting a bird’s-eye view across the web, so you can see what’s working for the competition, what they’re having success with and what, maybe, you should consider trying.

Full disclosure: Since Moz Content is new, I still rely on BuzzSumo for getting a quick, easy, and clean snapshot at the topical level, then use Moz Content to get a deeper look at the content landscape I’m hoping to track, whether for myself or for a client or prospect. And because both platforms offer a level of free service, I’d suggest using them in tandem, especially at first, to get a feel for which has the features better suited for your needs.

Take your content marketing to the next level

Hopefully, you have a better sense of how to be successful, in addition to having a more in-depth understanding of what it takes to attain long-term success in content marketing. The overall goal for this post, however, was to make it clear that, with regard to the content you create, share and promote, loyalty is THE goal, not a goal.

Remember, content is meant to support your marketing efforts; it should not define them. If the content you create can draw readers to your site consistently, your team can then set about ensuring that the various messaging needed to call attention to or sell additional products are in place, even as you further optimize the content to increase views and viewers.

By making content loyalty your goal, you make it more likely that the rest of your brand’s goals are attainable.

What are your thoughts? Do you think loyalty is the right goal for your content?

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Moz Local Industry Report: Who’s Winning Wireless Searches?

Posted by Dr-Pete

Summary: We analyzed 5 mobile phone buyer searches on Google across 5,000 cities (25,000 total markets) to find the winners and losers in both organic and local pack results. Best Buy dominated organic results and performed well in local searches. Sprint won the local pack results, but disappeared from organic entirely. Carriers Verizon, T-Mobile, and AT&T all performed well, but none covered more than 30% of local search markets.

The wireless industry in the United States is both massive and competitive. According to an IDC report, over 184 million mobile phones were shipped to US customers in 2014, with an estimated 191 million in 2015. The vast majority of consumers, even in 2015, report browsing products online but purchasing in-store (73%, according to PwC’s annual report). This trend may be even more dramatic in the wireless industry, where experts suggest that upwards of 9 out of 10 mobile phone purchases in the US still happen in a brick-and-mortar store.

In a competitive environment where most people research phones online but buy them in-store, ranking well in Google search results, especially local results, is critical. Local results can lead consumers not only to one brand over another, but to specific store locations in their area, surfacing store addresses, phone numbers, and operating hours.

For example, here’s a local 3-pack from a search for “mobile phone store” in the Seattle area:

Local packs in 2016 not only contain rich information, including directions, reviews, location, phone, and store hours, but they also appear at or near the top of organic results and occupy a large amount of screen real estate.

This report takes a Google’s-eye view of the mobile phone market in the United States. We ran thousands of searches to determine who were the big winners in both organic and local Google results, who were the losers, and where big brands had gaps.


Report methodology

For this study, we tracked 5 wireless industry phrases on page 1 of Google.com across the 5,000 largest cities in the contiguous 48 states (according to census data), measuring both organic and local pack results. The five searches used in the final study were:

  • “phone store”
  • “mobile phone store”
  • “cell phone store”
  • “wireless store”
  • “buy cell phone”

We deliberately chose keywords that were likely to return both organic and local pack results. Based on initial analyses, we discarded product-specific keywords, like “buy iPhone 6,” because those didn’t typically return local results. Interestingly, searches containing “smartphone” also generally failed to display local results.

Finally, we threw out “phone shop,” because, even searching US locations on Google.com, that phrase tended to return UK-based results. Data was combined across the five keywords, with organic and local results analyzed separately.


Top 5 organic brands (by markets)

If we treat each of these 25,000 searches (5 keywords X 5,000 cities) as a potential market, we can get a sense of how well any given company is covering the total US marketplace. For this analysis, we’ll treat multiple listings on a single page of search results as one “market.” The question is just whether any given brand is represented in that market (not where or how often).
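As a rough sketch of that presence-counting logic (the SERP records and domains here are entirely made up for illustration; the study's actual data came from rank tracking across 5,000 cities):

```python
from collections import defaultdict

# Hypothetical SERP export: one record per (keyword, city) market, listing
# every domain that appeared anywhere in that market's page-1 results.
serp_results = [
    {"market": ("phone store", "Seattle"), "domains": ["bestbuy.com", "att.com", "bestbuy.com"]},
    {"market": ("phone store", "Austin"), "domains": ["newegg.com", "bestbuy.com"]},
    {"market": ("wireless store", "Seattle"), "domains": ["t-mobile.com", "att.com"]},
]

markets_covered = defaultdict(set)
for record in serp_results:
    # A brand listed twice in one market still counts once: we measure
    # presence in the market, not where or how often it ranks.
    for domain in set(record["domains"]):
        markets_covered[domain].add(record["market"])

coverage = {domain: len(markets) for domain, markets in markets_covered.items()}
for domain, count in sorted(coverage.items(), key=lambda kv: -kv[1]):
    print(f"{domain}: {count} of {len(serp_results)} markets")
```

In the toy data above, bestbuy.com's duplicate listing in the Seattle "phone store" market is collapsed to a single appearance, exactly as the study treats multiple listings on one results page.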

Here were the top 5 brands, by total markets:

Big-box retailer Best Buy and online retailer Newegg led the organic winners, followed by mobile carriers AT&T, T-Mobile, and Verizon. The Top 10 were rounded out by (in order): Walmart, Wirefly, Cricket Wireless, and Boost Mobile.

Surprisingly, Sprint was nowhere to be found in our organic data, showing just one listing (and that one was on a sub-domain). Keep in mind that this study looked only at page-one results. Used phone resellers, including Gazelle (#11), Glyde (#12), and Swappa (#16), made a strong showing in the top 20.


Top 5 organic brands (by clicks)

The “market” analysis doesn’t account for the varying impact of different ranking positions and the populations of the 5,000 cities in this study. So, we did a second, more complex analysis. If we take a shallow click-through curve (see below), where the #1 position gets the most clicks and then click-through rate (CTR) trails off, and then we multiply each of those CTRs by the city’s population, we can get a proxy for total click volume.

Obviously, not everyone alive is running these searches, and we’re going to cheat and assume clicks total 100% (they don’t, in reality), so instead of looking at total counts, we’ll rely on percentage of total click share. Here were the top 5 by click share:

Adjusting for CTR and population, Best Buy held onto the top spot, and most of the top 5 was the same. The notable exception was AT&T, which fell to #8. Digging deeper into the data, this appears to be a function of CTR. On average, AT&T’s rankings are appearing lower on page 1 than the rest of the top 5. Cricket Wireless moved up from #8 to round out the top 5.
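A minimal sketch of that weighting scheme, using a made-up shallow CTR curve and made-up city populations (the study's actual curve and figures aren't published in this post):

```python
# Hypothetical shallow click-through curve: share of clicks for positions 1-10.
CTR_CURVE = [0.30, 0.15, 0.10, 0.07, 0.05, 0.04, 0.03, 0.02, 0.01, 0.01]

# Hypothetical rankings: (domain, city population, organic position).
rankings = [
    ("bestbuy.com", 700_000, 1),
    ("att.com", 700_000, 4),
    ("newegg.com", 950_000, 1),
    ("bestbuy.com", 950_000, 2),
]

# Weight each ranking: CTR for its position, multiplied by the city's population.
weighted = {}
for domain, population, position in rankings:
    clicks = population * CTR_CURVE[position - 1]
    weighted[domain] = weighted.get(domain, 0.0) + clicks

# Normalize to a percentage of total click share, since the absolute counts
# are only a proxy (not everyone in a city runs these searches).
total = sum(weighted.values())
click_share = {domain: clicks / total for domain, clicks in weighted.items()}
```

Note how a domain that ranks lower on page 1 (like att.com at position 4 here) can appear in many markets yet still trail badly in click share, which is exactly the effect the study observed with AT&T.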


Top 5 local brands (by markets)

Now, let’s look at just the local pack results for those same 25,000 markets. Keep in mind that local packs did not occur in all markets, and there are a maximum of 3 sites in any local pack (compared with up to 10 organic listings). Here were the top 5 local winners:

Sprint, nowhere to be seen in our organic data, led the pack in local results. Other major wireless companies rounded out the top 5. Best Buy maintained a strong position at #6, but organic leader Newegg.com fell completely out of the local results, having no physical storefronts.

Clearly, the biggest disconnect between the organic and local data here was Sprint — taking the #1 spot for local, but disappearing completely from organic rankings. Newegg flipped that around, dominating organic but having no local presence. This was a direct and obvious result of having no physical locations.

Another big difference between organic and local was Apple.com. Apple naturally has a strong presence for product-specific (i.e. iPhone) queries, but ranked #47 in our organic results for general phone-buying searches, appearing in only 95 (of 25,000) markets. Apple stores, however, ranked #8 in local markets.


Top 5 local brands (by clicks)

Like organic, we can apply our click share analysis to local pack rankings. The Top 5 local domains, weighted by CTR and population, looked like this:

Other than some position shuffling, the Top 5 were the same as the simpler local-pack analysis. T-Mobile took the top spot from Sprint when adjusted by CTR and population. It looks as if the major brands were distributed pretty well across a variety of populations and ranking positions.


Top 5 overall winners (by clicks)

What if we combine the organic and local totals, using the click share data across all markets? Here are the winners of the combined data:

Verizon and Best Buy were in close competition for the top spot, with T-Mobile just behind. Best Buy’s #6 spot in our local analysis was easily boosted by their #1 spot in organic, making the big box store a strong overall contender. AT&T squeaked into the top 5, hampered a bit by their #8 position in organic search. Cricket Wireless rounded out the top 5.


Winners, losers, and takeaways

Best Buy dominated our organic winners and took an impressive #2 overall, performing well in local searches. This matches Best Buy’s leading spot in real-world mobile phone sales, an advantage enhanced by representing multiple brands and carriers under one roof. Best Buy’s performance is even more impressive given that they have considerably fewer total locations than most of the major carriers.

Sprint was the biggest winner in local results, especially given their relatively small retail footprint compared to other major carriers. Publicly reported location data shows Sprint with half or fewer of the locations that each of Verizon, T-Mobile, and AT&T operate, which makes their local dominance even more impressive. Sprint’s recent acquisition of as many as 1,700 Radio Shack storefronts could double their retail locations and make them a force to be reckoned with in local search. Sprint does, however, need to address their complete absence from organic results for general mobile keywords.

Mega-carriers Verizon, T-Mobile, and AT&T performed well in overall results, as expected given their marketing budgets and massive retail footprints. Verizon struggled somewhat in local rankings, relative to other carriers, bolstered in the overall standings by their strong organic presence. AT&T had the opposite problem — they had a strong local presence, but trailed a bit in organic once CTR was taken into account. It appears AT&T has room for improvement in their ranking positions for general mobile phone terms.

AT&T can count a second win in their column. As of 2014, they own Cricket Wireless, who was our #4 overall winner and had a top 5 position in both of our click share analyses (organic and local). Cricket’s dominant position is undoubtedly good for revenue, although it can be argued that both their organic and local search share represent a branding challenge for AT&T.

No single major carrier dominated market coverage in local pack results. Of the 25,000 markets we studied, 21,143 displayed local packs. Sprint ranked in local packs in about 1/3 of available markets, AT&T and T-Mobile ranked in just under 30%, and Verizon ranked in roughly 20%. Given their retail footprints and marketing budgets, all of the major carriers have significant room for improvement in their local rankings.

Even as the competitive landscape in the wireless industry shifts, Google’s local search landscape will continue to evolve. Google’s current local 3-packs have only been in full effect since August of 2015, and the search giant is constantly experimenting with new formats and features. No one carrier or reseller dominates the entire picture, and all of them will have to fight hard for organic and local search share in the foreseeable future.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

How & Why to Build a Basic Gantt Chart for Almost Any Project

Posted by noahlemas

[Estimated read time: 13 minutes]

I had planned on writing about losslessness, about accurate reproduction. I’ve always found it strange that at just about the same time that true losslessness became widely available cheaply, we suddenly seemed to care less about fidelity than ever before. So I had wanted to discuss the Internet’s imminent future, almost undoubtedly VR-based and highly resolution-dependent, and how that vision is slightly at odds with its history of relegating virtually everything to simple, low-resolution, compressed formats.

With the path to writing such a post research and time-intensive, deadline-bound, and rife with potential rabbit holes that could very well result in me unintentionally plumbing the depths of the Internet, I began framing it as though it were typical proposed work — which, for me, means organizing a basic Gantt chart. It’s something I do to frame the projects included in client engagements, beginning even during the proposal stage.

Remind me again what a Gantt chart is…

A Gantt chart is a rather simple matrix of a project’s activities and its associated start dates and deadlines. You’ve seen them but perhaps not known they had a name (activities on the left, activity duration on the right):
gantt-chart-example.jpg

Given the rise of agile project management within the technology and software industries in recent years, the humble Gantt chart is often forgotten about, mainly because a Gantt chart rarely meets the highly adaptive needs of more complicated projects (like software product development). But the same simplicity that has doomed it in complex spaces is also what makes it so easy to create and share in relation to the organization of simpler projects.

A Gantt chart is an assurance that we have a plan

“By failing to prepare, you are preparing to fail.”
– Benjamin Franklin

Clients want only two things: a plan, and results. In our industry, the results (or sometimes lack thereof) get the focus, while the plan is usually just an implicit conceptual agreement. If we don’t have a tangible plan from the outset, though, results will be largely arbitrary.

During the pitch/proposal stage, the reported results are usually case studies from our past work. Supplementing such case studies with a customized Gantt chart can illustrate to the prospective client that we’ve put more than cursory thought into the work we’re proposing and planning, and the results that we’re hoping to achieve.

It took me trial and error in both conventional client services and business development roles to learn that our agency work is often incomplete without a Gantt chart. I now find myself using them increasingly. Here, for example, is a very generic example of a simple Gantt chart framing a very basic SEO site audit:
tomsplanner (6).png

Aren’t Gantt charts only needed in project management & sales?

“The sales department is not the whole company but the whole company better be the sales department.”
– Philip Kotler

Treating sales as somebody else’s duty is a common mistake that we make in search. If we are client-facing in any capacity, though, we should be considering things, both in scope and out (often we can’t help but think of the out-of-scope anyway), that could provide clients the best possible results. That is to say: since finding and presenting opportunities to clients is an important aspect of growing both client results and agency business, we all really are in sales.

In fact, we are all working not just in sales, but also in project management. Realizing that and capitalizing on it wherever possible is an additional means of “getting closer to the customer.” Embracing the humble Gantt chart helps us to better organize projects by providing the needed framework in a standardized format that translates across roles, companies, or even industries. Gantt charts help us speak the “language” of project management, organization, sales, and business in general.

Gantt charts are part of the common language of business

“It seems to me you should use their language, the language they use every day, the language in which they think.”
– David Ogilvy

As a general rule of thumb, the bigger or more sophisticated the client, the higher the chance that a Gantt chart will be an important part of planning and winning the business, and the greater the chances that our point of contact frames work like a typical project manager.

Successfully navigating the proposal process is almost always a product of communicating in the common language. Gantt charts, then, are not only another means of speaking the common language but, as Vince Lombardi once famously said, of acting “like you’ve been there before.”

As with any other industry or interest, speaking the common language can be the only way to ensure that a wide variety of people within an organization can understand exactly what it is that we’re proposing.

A Gantt chart can also meet the expectations of legal & procurement

“Any sufficiently advanced bureaucracy is indistinguishable from molasses.”
– Unknown

In business development, the unfortunate reality is that a “verbal yes” (especially with a bigger client) is nothing but permission to proceed to the legal and/or procurement departments, where many a business development director has been maddeningly frustrated, and where important initiatives, unfortunately, can go to die a slow and painful death.

I stumbled into emphasizing Gantt charts entirely by accident. In researching a promising prospective client, I found a page on their site that outlined a case study from an entirely unrelated field. In one of the page’s images was a Gantt chart with redacted identifying details. On a lark, I included a rudimentary timeline that I thought represented something close to a Gantt chart. That process made me better understand the work that I was proposing, and I’ve been using Gantt charts since.

Not only did I win the business that time by implementing a Gantt chart, I have also won other accounts simply by knowing the audience, in so doing acting as though every last member is a project manager. A Gantt chart certainly isn’t a magic key to legal and procurement, but it’s a relatively small, subtle addition that can have a disproportionately strong impact.

Internal client teams expect, want, or need Gantt charts

“Before anything else, preparation is the key to success.”
– Alexander Graham Bell

Assume for a moment that we did “get ink,” that we won a contract with a sophisticated client without a project plan or Gantt chart. In such a case, the internal team(s) will probably put together their own project plan and/or Gantt chart as a baseline reference point and as part of beginning to allocate resources.

When we hand this brand new account over to our client services team, would we prefer that they receive the Gantt chart that we carefully constructed and agreed on during the proposal process? Or would we prefer that the work be defined and framed by the people who had to hire our agency to consult on the work in the first place? Which is our most realistic path to being able to deliver the framed work and meet the goals of the campaign(s)?

Gantt charts help agency-side teams, too

“Of all the things I’ve done, the most vital is coordinating the talents of those who work for us and pointing them toward a certain goal.”
– Walt Disney

The handover and kickoff can be phenomenally easy and well organized when we prepare a Gantt chart that is easily relatable and understandable to every member of our team (and our client’s). The handover is then as simple as sharing file permissions with the teams of our agency and our new client.

The typical handover to your client services should be smooth and easy, accompanied by a well-outlined plan. Often, though, handovers to client services can be a cluster of questions to which nobody really knows the exact answer(s). A basic Gantt chart goes a long way toward an orderly, sensible, smooth handover, something that only instills further confidence in the new client’s team. The Gantt chart serves as a great means of bridging the gap between what was promised by sales and what will be delivered by client services.

Competitors use Gantt charts, too

“Even if you are on the right track, you will get run over if you just sit there.”
– Will Rogers

Experienced as we are, we all know that clients and prospects respond almost viscerally to reports on competition, especially where their competition is clearly beating them.

On that note, you know who uses project plans? Some of your competitors. If all else in a proposal is equal (and you’d be shocked at how often that happens), the planning can be the tie-breaker, both because it implies sophistication and because, as noted above, it is much more likely to be converted into a contract/SOW that breezes through legal and procurement.

How to make a Gantt chart for that article about fidelity vs. connectivity

Okay, now it’s time to dig in and actually put together a top-level outline of the article’s components, which in this case include preparation, research, writing, and editing. An outline is the best place to start building our Gantt chart.

In this case, we’ve created a 23-step outline (details below) for writing that article on “Fidelity vs. Connectivity.” Now, let’s make a Gantt chart of it…

Make a Gantt chart easily from a Trello board

To make a basic Gantt chart using Trello, start by framing a Trello board. I’ve created one called “Interesting Article Idea,” with a single list; in this case, “Fidelity vs. Connectivity”:
interesting-article-idea.jpg

Next, fill out the related cards (which in this case consist of the 23 outline steps noted above) below the Fidelity vs. Connectivity list to include specific activities:

It’s a card list of how the article progresses and in what order. It’s still not ready to be a Gantt chart, but it’s close. In order to build our Gantt chart, define start dates and due dates for each card related to its associated activity, starting with what we anticipate to be the first, in this case “Research the history of ‘high fidelity’”:
start-trello-image.jpg

When you hover over that card, you’ll see a small pencil icon. Click that edit/pencil icon to open an expanded menu of options. Then click “Change Due Date,” from which the following calendar menu will appear:
trello-start-image-3.jpg

Click on the appropriate “due date” from the calendar (I’ve chosen a March 11 due date for this card, as we can see) and save changes, at which point our edited card will look like this (minus the giant red arrow, of course):

trello-start-image-4.jpg

At that point, the card has a due date but no start date. In order to add a start date, go back to the card list and click on the March 11 due date (where the red arrow above is pointing), at which point we will have this expanded menu:
screen-cap-arrow.jpg

Adding the start date here can be a bit elusive only because there is no “button” to do so. Instead, click on “Edit the description” (red arrow above) in order to open the following window:

In order to establish the start date, add it to the description window, using the format below (in this example a start date of March 7, 2016):
screen-cap-arrow-2.jpg

Repeat this process for the remaining 22 cards and you’ll end up with a card list like this:

trello-start-image-7.jpg

What you’ll need to make a basic Gantt chart

I prefer to use Ganttify with a Trello board, largely out of habit. Ganttify also provides compatibility with Basecamp or even, yes it’s true, Google Calendar, so there are certainly other options if you’re not a Trello user. A rather impressive Gantt chart can also be built in Excel (for our spreadsheet-obsessed colleagues). The point is, there is no shortage of options for making free Gantt charts.
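For the scripting-inclined, the same what-and-when structure can even be sketched in a few lines of plain Python. This is a toy text renderer, and the activities and dates below are invented stand-ins for the article outline, not a real project plan:

```python
from datetime import date

# Hypothetical activities: (name, start date, end date)
activities = [
    ("Research 'high fidelity'", date(2016, 3, 7),  date(2016, 3, 11)),
    ("Draft outline",            date(2016, 3, 11), date(2016, 3, 14)),
    ("Write first draft",        date(2016, 3, 14), date(2016, 3, 18)),
    ("Edit and publish",         date(2016, 3, 18), date(2016, 3, 21)),
]

def text_gantt(activities):
    """Render activities as rows of a simple text Gantt chart.

    Each column is one day from the earliest start to the latest end;
    a '#' marks each day on which the activity is scheduled."""
    origin = min(start for _, start, _ in activities)
    span = (max(end for _, _, end in activities) - origin).days + 1
    rows = []
    for name, start, end in activities:
        offset = (start - origin).days           # days before this activity begins
        length = (end - start).days + 1          # duration, inclusive of both ends
        bar = " " * offset + "#" * length
        rows.append(f"{name:<28}|{bar:<{span}}|")
    return "\n".join(rows)

print(text_gantt(activities))
```

Each row lists the activity on the left and its duration bar on the right, which is exactly the “activities on the left, activity duration on the right” matrix described earlier.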

Also worth noting: the Gantt chart is NOT a complete project plan, but instead merely a part of one. The Gantt chart organizes the “what” and “when” aspects of a project plan but largely doesn’t touch on the “why” or “who” aspects. A Gantt chart, then, can exist without a project plan, but a project plan usually cannot exist without Gantt charts. For the purposes of this post, we’re concerned only with the Gantt chart… and a very basic one at that; we’re stepping into the project management world as relative novices. By design, our sample Gantt chart here will be as simple as possible.

At this point, with your Trello board complete, you are ready to head over to Ganttify:

Gantt_charts_for_Basecamp__Google_Calendar_and_Trello.jpg

We’re working from Trello here, obviously. Click the Trello button and you’ll be taken to the following screen:
trello-start-image-6.jpg

Allow Ganttify access to your Trello board by clicking the “Allow” button; you’ll be taken swiftly to the Trello/Ganttify dashboard:

number-2.jpg

And there it is waiting for you… automatically created from the existing Trello board you made earlier. Click on “Interesting Article Idea” and you’ll be served this pop-out window:
Interesting-Article-Idea.jpg

You did it; that’s a Gantt chart! It needs a little refining, of course, but you’ve created a usable Gantt chart. Perhaps the best part about the Gantt chart you’ve just created is the fact that you can simply adjust any of the “activities” on the timeline of the Gantt chart and the associated changes will be automatically reflected in the original Trello board. Let’s have a look at how this works:
trello-start-image10.jpg

Drag to increase the width of the first “activity” (red arrow above) and you’ll see this change directly on the Gantt chart:
trello-start-image-11.jpg

That change on the Gantt chart then becomes part of the parameters of the original Trello board (no manual changes to the board are required; Ganttify and Trello are essentially working together):
trello-start-image-12.jpg

Changing all or part of your Gantt chart, then, changes the underlying Trello board (and vice-versa). Using the same process, you can easily change the activities back to the original dates. This means that changes are then automatically shared with collaborators (assuming we’ve shared our Trello board with other team members).

Exporting the Gantt chart to a format of your choosing means you can insert it into any document in the appropriate file type. To export, click the print icon (indicated by the red arrow below):
Interesting-Article-Idea-2.jpg

From the pop-out window above, click the print icon in the upper left corner. That will result in the following option window:

3.jpg

I’ve added the red arrow here to remind you that the cleanest possible outcome is a result of condensing the timeline to show only the dates relevant to the project (especially important when planning longer, more complicated projects).

Export to your preferred format by clicking “print.” The resulting JPG for our simplified example project looks like this:

trello-gantt-image-8.jpg

By this point, you’ll have a Trello board built out, a working version of a Gantt chart, and the knowledge/ability to edit in one place, with those edits reflected across platforms and immediately available to collaborators. You’ll be ready to insert your newly created file wherever you need it. It really is that simple! Of course, time and practice will provide for more detailed, complex, and in-depth Gantt charts, but this is a great place to start.

We’ve started here with a very simplified Gantt chart but, as you begin to use them, you can add layers of depth and make them increasingly advanced. As you’ve seen, building basic Gantt charts is simpler (and perhaps more useful) than it at first might have seemed.

To summarize the process:

  1. Outline the project.
  2. Frame the associated Trello board.
  3. Define the start and end dates of each activity.
  4. Allow Ganttify access to the Trello board.
  5. Export from Ganttify to your preferred file format.
  6. Insert the newly created file into a proposal, business case, report, etc.

Regardless of your role, or whether you are agency-side or client-side, organizing work and communicating timelines via Gantt charts provides a necessary baseline for just about any project. When you build Gantt charts from shared resources like Trello, Basecamp, or Google Calendar, you also encourage efficient collaboration by ensuring that everyone on your internal teams, and those of your clients, start from “the same page.” Framing your work in Gantt charts improves your ability to organize, communicate, and collaborate, all of which increases efficiency and allows you to, as we say at Distilled, “work smarter, not harder.”


How to Create 10x Content – Whiteboard Friday

Posted by randfish

[Estimated read time: 8 minutes]

Have you ever actually tried to create 10x content? It’s not easy, is it? Knowing how and where to start can often be the biggest obstacle you’ll face. In today’s Whiteboard Friday, Rand talks about how good, unique content is going to die, and how you can develop your own 10x content to help it along.

http://ift.tt/1RSw8gM

http://ift.tt/1GaxkYO

How to Create 10x Content Whiteboard

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about how to create 10x content.

Now, for those of you who might need a refresher or who haven’t seen previous Whiteboard Fridays where we’ve talked about 10x content: this is the idea that, because of content saturation and overload (there’s just so much in our streams, and standing out is so hard), we can’t just say, “Hey, I want to be as good as the top 10 people in the search results for this particular keyword term or phrase.” We have to say, “How can I create something 10 times better than what any of these folks are currently doing?” That’s how we stand out.

Some criteria for 10x content.

I actually have a page, a Google doc that I keep adding to that has a list of 60-plus different pieces of 10x content. I’ll link to that.

But basically, the criteria for 10 times better than what anyone else is doing is the following.

  • It has to have great UI and UX on any device.
  • That content is generally some combination of high-quality, trustworthy, useful, interesting, and remarkable. It doesn’t have to be all of those, but some combination of them.
  • It’s got to be considerably different in scope and in detail from other works that are serving the same visitor or user intent.
  • It’s got to create an emotional response. I want to feel awe. I want to feel surprise. I want to feel joy, anticipation, or admiration for that piece of content in order for it to be considered 10x.
  • It has to solve a problem or answer a question by providing that comprehensive, accurate, exceptional information or resources.
  • It’s got to deliver content in a unique, remarkable, typically unexpectedly pleasurable style or medium.

If you hit all of these things, you probably have yourself a piece of 10x content. It’s just very hard to do. That’s what we’re talking about today. What’s a process by which we can get to checking off all these boxes?

Step 1 – Gain deep insight.

So let’s start here. First off, let’s say you’ve got a piece of content that you know you want to create, a topic you know you’re going to address. We can talk about how to get to that topic in a future Whiteboard Friday, and we’ve had some in the past certainly around keyword research and choosing topics and that sort of thing. But if I know the topic, I need to first gain a deep, deep insight into the core of why people are interested in this subject.

So for example, let’s do something simple, something we’re all familiar with.

“I wonder what the most highly-rated new movies are out there.” Essentially this is, “Well, okay, how do we get into this person’s brain and try and answer the core of their question?” They’re essentially asking, “Okay, how do I figure out . . . help me decide what to watch.”

That could have a bunch of angles to it. It could be about user ratings, or it could be maybe about awards. Maybe it’s about popularity. What are the most popular movies out there? It could be meta ratings. Maybe this person wants to see an aggregated list of all the data out there. It could be editorial or critic ratings. There’s a bunch of angles there.

Step 2 – We have to get unique.

We know that uniqueness, being exceptional, not the same as everyone else but different from everyone else out there, is really important.

So as we brainstorm different ways that we might address the core of this user’s problem, we might say, “All right, movie ratings, could we do a round-up?”

Well, that already exists at places like Metacritic. They sort of aggregate everything and then put it all together and tell us what critics versus audiences think across many, many different websites. So that’s already been done.

Awards versus popularity, again, it’s already been done in a number of places that do comparisons of here’s the ones that had the highest box office versus here’s the ones that won certain types of awards. Well, okay, so that’s not particularly unique.

What about critics versus audiences? Again, this is done basically on every different website. Everyone shows me user ratings versus critic ratings.

What about by availability? Well, there’s actually a bunch of sites that do this now where they show you this is on Netflix, this is on Hulu, this is on Amazon, this you can watch on Comcast or on demand, this you can see on YouTube. All right, so that’s not unique either.

What about which ratings can I trust? Hang on a tick. That might not exist yet. That’s a great, unique insight into this problem, because one of the challenges that I have when I want to say, “What should I decide to watch,” is who should I trust and who should I believe. Can I go to Fandango or Amazon or Metacritic or Netflix? Whose ratings are actually trustworthy?

Well, now we’ve got something unique, and now we’ve got that core insight, that unique angle on it.

Step 3 – Uncover powerful methods to provide an answer.

Now we want to uncover a powerful, hard-to-replicate, high-quality method to provide an answer to that question.

In this case, that could be, “Well, you know what? We can do a statistical analysis.” We get a sample set big enough, enough films, maybe 150 movies or so from the last year. We take a look at the ratings that each service provides, and we see if we can find patterns, patterns like: Who’s high and low? Do some have different genre preferences? Which one is trustworthy? Does one correlate with awards and critics? Which ones are outliers? All of these are actually trying to get to the “which one can I trust” question.
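To make the “who’s high and low, and who tracks the critics” part of that analysis concrete, here’s a minimal Python sketch. The service names and every score below are invented for illustration; this is not FiveThirtyEight’s data or method, just the shape of the comparison:

```python
from statistics import mean

# Hypothetical 0-100 scores for the same five films on three rating
# services (invented numbers, purely to illustrate the analysis).
scores = {
    "ServiceA": [62, 71, 55, 88, 47],
    "ServiceB": [60, 69, 52, 85, 45],
    "Inflated": [81, 84, 78, 92, 76],  # skews high, with a compressed range
}
critics = [58, 70, 50, 90, 44]  # reference critic scores, also invented

def pearson(xs, ys):
    """Pearson correlation: how closely two rating sets move together."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

for service, vals in scores.items():
    r = pearson(vals, critics)       # agreement with critics
    bias = mean(vals) - mean(critics)  # how far the service skews high or low
    print(f"{service:>8}: r={r:.2f}, mean bias={bias:+.1f}")
```

A service with high correlation and near-zero bias tracks the critics; a service with a large positive bias (like the invented “Inflated” one) is the kind of outlier this analysis is designed to expose.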

I think we can answer that if we do this statistical analysis. It’s a pain in the butt.

We have to go to all these sites. We have to collect all the data. We have to put it into a statistical model. We then have to run our model. We have to make sure that we have a big enough sample set. We’ve got to see what our correlations are. We have to check for outliers and distributions and all this kind of stuff. But once we do that and once we show our methodology, now all we have to do is…

Step 4 – Find a unique, powerful, exceptional way to present this content.

In fact, FiveThirtyEight.com did exactly this.

They took this statistical analysis. They looked at all of these different sites, Fandango and IMDB users versus critics versus Metacritic versus Rotten Tomatoes and a number of other sites. Then they had this one graph that shows essentially the star rating averages across I think it was 146 different films, which was the sample set that they determined was accurate enough.

Now they’ve created this piece of 10x content, and they’ve answered this unique take on the question, “Which rating service can I trust?” The answer is, “Don’t trust Fandango,” basically. But you can see more in there. Metacritic is pretty good. A couple of the other ones are decent.

Step 5 – Expect that you’re going to do this 5 to 10 times before you have one hit.

The only way to get good at this, the only way to get good is experimentation and practice. You do this over and over again, and you start to develop a sixth sense for how you can uncover that unique element, how you can present it in a unique fashion, and how you can make it sing on the Web.

All right, everyone, I look forward to hearing your thoughts on 10x content. If you have any examples you’d like to share with us, please feel free to do so in the comments. No problem linking out. That’s just fine. We will see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


How Google’s AMP Will Influence Your Online Marketing

Posted by Web_Perfectionist

[Estimated read time: 9 minutes]

What is Google AMP?

The Google AMP Project is a way of fast-tracking content to mobile devices. It improves upon the traditional model of serving mobile content because it relies on a specific form of HTML, called AMP HTML, to strip down the presentation of content. Here’s an example of what an AMP page looks like when rendered on an iPhone 6.

amp-example-iphone6.jpg

The net effect is that the mobile user will see articles with comparatively basic text and images, but that content will load up to 10 times faster (or more!) than traditionally formatted content.

Why is Google AMP important for SEO?

As Google often preaches to the industry, page speed and mobile-readiness are important ranking signals that help determine the placement of a site’s content in the search engine results pages (SERPs). The faster a site is (among other ranking signals), and the more it caters to mobile devices, the more likely it is to be seen and clicked on by Google search users.

Since 2013, Google has been evolving from being the company that provides links to other sites in search results to the company that provides answers to questions in search results.

For example, the “featured snippets” aspect of Google, shown below, has been a method of providing quick answers in search results to simple questions such as “Who won the 1969 World Series?”

who won the 1969 world series Google Search.png

But featured snippets don’t work well for more complex questions, like “What are the main issues in the 2016 presidential election?” Those types of questions lend themselves more to in-depth articles. Unfortunately, when a simple answer isn’t possible, the user must load another page, which may be slow on a mobile device. As a result, Google has been developing ways to make the links you click on in search results load more quickly. And now, with the AMP Project, it has been making those links appear more prominently in SERPs.

How does Google AMP work?

There are three parts to Google AMP:

  1. AMP HTML
  2. AMP JS
  3. AMP Cache

AMP HTML has a strictly defined set of custom tags, mainly limited to text formatting and media embedding, such as amp-ad, amp-embed, amp-img, amp-pixel, and amp-video.

AMP JS is a severely limited JavaScript file. It loads all external resources asynchronously (in the background), which keeps “render blocking” from interfering with how quickly the content the user came to see renders on the screen. Everything extraneous to the actual words and images in the article loads last. AMP JS also pre-renders content by predicting which DNS resources and connections will be needed, then downloading and pre-sizing images. This is all done to alleviate work for the mobile device and to economize data use.

AMP Cache, or the AMP Content Delivery Network (AMP CDN), is Google’s system of servers doing the heavy lifting of grabbing your most recent content and pre-positioning it around the globe. This ensures that a page requested from, say, Italy doesn’t need to be sent over the wire from Mountain View, California each time it’s requested. Instead, Google places a pre-rendered, optimized copy of that AMP page on a server close to or in Italy. The CDN is refreshed each time an article is updated or added.

The positive impacts of AMP on SEO & online marketing

Faster-loading articles improve the publisher/reader relationship. Speed is the most obvious benefit to publishers using AMP for improved SEO. That speed translates into more page views and fewer frustrated readers, which also translates to more ad views, sharing, and engagement with content.

1.) AMP-enabled articles will rank higher in SERPs.

AMP content will have the advantage of being shown above the fold, at the top of Google searches, unless Google changes how and whether it displays all AMP results in this way. An example of how AMP pages are displayed in search results is shown below.

amp-search-example-iphone6.jpg

Currently, AMP articles appear in a swipeable carousel. For now, there is not a paid placement option, but it may appear in the future. AMP-enabled articles do have an icon in the SERPs indicating that they are built on AMP.

2.) Paid search impressions will likely increase.

After viewing an AMP-based piece of content, the most common thing users do is click back to the SERP to see what else there might be. This will positively affect the number of paid search impressions over time.

3.) Google AMP is for every publisher.

Facebook limits participation in its Instant Articles feature to just a select set of publishers. With Google AMP, anyone with a little know-how or willingness to learn can format his or her content to be accessed quickly by a potentially enormous number of readers.

4.) AMP is open source.

This means that contributions to its evolution are not limited to the world of Google’s best and brightest developers. Anyone who has an idea for making it better can contribute to the specification. AMP’s feature set will more readily adapt to a changing publishing world.

5.) Analytics are coming for AMP.

According to Google, several analytics providers — including comScore, Adobe Analytics, Parse.ly, and Chartbeat — are gearing up their services to tell publishers how well their AMP content is doing. In fact, the AMP specification provides instructions for supporting current AMP analytics vendors, as well as how to support your own custom analytics solution for AMP.

6.) Content gets to more readers.

Even though AMP mainly benefits Google in that it helps them compete with Facebook’s Instant Articles, that improved reach benefits publishers because their content can now be more widely read when users click on it in Google’s SERPs — not just in Facebook’s walled garden.

7.) More features and formatting options are coming.

Even though AMP deals in a limited set of tags for formatting pages, all’s not lost. There are still plenty of extended components and even some experimental components to be released as they become available.

The negative impacts of AMP on SEO & online marketing

1.) There are no forms in AMP content.

That means that if a publisher’s goal is to generate leads by inviting readers to subscribe or submit their contact information, it’s going to have to wait until AMP provides an upgrade to the specification that allows publishers to include forms in their AMP-optimized content.

2.) AMP doesn’t solve the problem of page speed SEO for non-publisher sites.

It really only covers “news”-type articles and blog posts and is not intended for speeding up general e-commerce or brand sites. An e-commerce site that doesn’t focus on articles or blog posts as its main content will probably find the design constraints of AMP much too restrictive and will want to stick to traditional HTML.

3.) The number of paid search result item impressions could go down.

If the search term is broad or general (e.g., “news,” “fashion,” or “food”), AMP articles will probably appear more frequently than paid search results. Only time and analytics will confirm or correct these assumptions.

4.) There are no external style sheets or Javascript.

Because of a lack of external stylesheets and external Javascript, the design and user experience (UX) of pages is lackluster. Publishers and non-publishers alike will have to decide if it’s more important to their brand to have the design complement the content to attract return visitors (in which case they might opt out of AMP for now), or if their content stands on its own and their visitors only care about rapidly loading pages (in which case they’ll want to start implementing it now). Use of experimental components as a hedge against dull pages carries the risk that the component will have bugs or will be rejected by the next release of the AMP specification.

5.) Domain Authority may suffer.

From Moz's Learn SEO page: “Domain Authority is a score (on a 100-point scale) developed by Moz that predicts how well a website will rank on search engines.” One of the factors included in the calculation is the number of linking root domains. AMP could indirectly reduce that number: other sites linking to AMP content will not be linking to the publisher’s domain, but to google.com. For example, here’s a screenshot of an article loaded on an iPhone 6 as accessed from an AMP carousel search.

Note that the URL begins with “www.google.com/amp/” and then tacks on the article’s originating domain. When viewing the article, visitors are still on Google.com, not on the publisher’s website.
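The URL pattern described above can be sketched in a few lines of Python. This is a hypothetical helper for illustration only, not Google's actual rewriting logic (the real AMP viewer URLs also encode the protocol and other details):

```python
from urllib.parse import urlparse

def google_amp_url(canonical_url):
    """Illustrative sketch: prefix Google's AMP viewer path to the
    article's own domain and path, as described above."""
    parts = urlparse(canonical_url)
    return "https://www.google.com/amp/" + parts.netloc + parts.path

print(google_amp_url("https://example.com/news/article.html"))
# https://www.google.com/amp/example.com/news/article.html
```

The point of the sketch is simply that the visible host is google.com, so any link someone copies from the address bar points at Google's domain, not the publisher's.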

6.) The way publishers serve ads inline with content will necessarily change.

This could be a good thing, in that it will force publishers to rethink their ads so that they no longer annoy the 16% of customers who block their ads anyway. It could be a bad thing for publishers who rely on high-bandwidth, over-designed ads to capture attention, though. They’ll have to either opt out of AMP or find another advertising strategy. Of course, if a publisher is part of Google’s AMP ads partnership with Outbrain, AOL, OpenX, DoubleClick, and AdSense, the publisher’s own burden of improving its ads is greatly reduced. More ad partners are being brought into the fold as they come into compliance with the AMP spec for their ads.

7.) Budgeting for content development will need to increase.

If you don’t have a CMS that already supports AMP, you’ll need to budget for developing AMP templates, or for building AMP support into your custom or extensible CMS as an additional feature.

8.) With AMP, publishers can’t get away with poorly constructed HTML pages.

This is actually both a positive and a negative aspect of AMP. On the positive side, every page has to be free of errors before Google will even pick it up and put it in the AMP caches. This means that users will have a better experience downloading the content on a variety of devices. However, on the negative side, publishers will need to budget time (and developer hours) to further debug every page. Fortunately, Google has provided a validator with AMP.
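For the debugging step, the AMP Project publishes a command-line validator distributed via npm. A minimal sketch of the workflow (assuming Node.js is installed, with `article.html` as a hypothetical page to check):

```shell
# Sketch only: assumes Node.js/npm are available and that
# article.html is a hypothetical AMP page to validate.
npm install -g amphtml-validator   # the AMP Project's validator CLI
amphtml-validator article.html     # reports PASS or lists each error

# Alternatively, append #development=1 to the page URL in a browser
# and check the DevTools console for validation output.
```

Running the validator in a build step catches errors before Google's crawler does, which matters because invalid pages are excluded from the AMP caches entirely.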

Conclusion

Page speed is a ranking factor in Google’s algorithm. The fact that Google has come out with its own way of constructing and displaying content faster and more concisely speaks to its desire to make page speed an even more important indicator of a page’s value in SERPs.

If a site deals primarily in long-form, news-type content (as opposed to marketing or selling its products), then it’s a good candidate for an AMP overhaul. Even if publishers add AMP only to get ahead of the emerging trend toward mobile-optimized content, they’ll be doing themselves and their SEO a favor.

By now, you may be wondering what you can do to boost your page speed, given its increasing importance. We’ve got an awesome free performance report you can use to get actionable intel on how to optimize your site for speed and performance. Feel free to check it out if you’re interested in learning steps you can take to improve your website’s performance.

Thoughts or questions about Google’s Accelerated Mobile Pages? Let us know in the comments!
