Pop-Ups, Overlays, Modals, Interstitials, and How They Interact with SEO – Whiteboard Friday

Posted by randfish

Have you thought about what your pop-ups might be doing to your SEO? There are plenty of considerations, from their timing and how they affect your engagement rates, all the way to Google’s official guidelines on the matter. In this episode of Whiteboard Friday, Rand goes over all the reasons why you ought to carefully consider how your overlays and modals work and whether the gains are worth the sacrifice.


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about pop-ups, overlays, modals, interstitials, and all things like them. They have specific kinds of interactions with SEO. In addition to Google having some guidelines around them, they also can change how people interact with your website, and that can adversely or positively affect you accomplishing your goals, SEO and otherwise.

Types

So let’s walk through what these elements, these design and UX elements do, how they work, and best practices for how we should be thinking about them and how they might interfere with our SEO efforts.

Pop-ups

So, first up, let’s talk specifically about what each element is. A pop-up now, okay, there are a few kinds. There are pop-ups that happen in new windows. New window pop-ups are, basically, new window, no good. Google hates those. They are fundamentally against them. Many browsers will stop them automatically. Chrome does. Firefox does. In fact, users despise these as well. There are still some spammy and sketchy sites out there that use them, but, generally speaking, bad news.

Overlays

When we’re talking about a pop-up that happens in the same browser window, essentially it’s just a visual element, and that’s often also referred to as an overlay. So, for the purposes of this Whiteboard Friday, we’ll call that an overlay. An overlay is basically like this, where you have the page’s content and there’s some smaller element, a piece, a box, a window, a visual of some kind that comes up and that essentially says, maybe it says, “Sign up for my email newsletter,” and then there’s a place to enter your email, or, “Get my book now,” and you click that and get the book. Those types of overlays are pretty common on the web, and they do not create quite the same problems that pop-ups do, at least from Google’s perspective. However, as we’ll discuss later, there are some issues around them, especially on mobile.

Modals

Modals tend to be windows of interaction, elements you actually work inside of. So a lightbox for images is a very popular modal. A modal is something where you are doing work inside that new box rather than in the content that’s underneath it. So a sign-in form that overlays, that pops up over the rest of the content, but that doesn’t allow you to engage with the content underneath it, that would be considered a modal. Generally, most of the time, these aren’t a problem, unless they are something like spam, or advertising, or something that takes you out of the user experience.

Interstitials

Then finally, interstitials are essentially, and many of these can also be called interstitial experiences, but a classic interstitial is something like what Forbes.com does. When you visit a Forbes article for the first time, you get this, “Welcome. Our sponsor of the day is Brawndo. Brawndo, it has what plants need.” Then you can continue after a certain number of seconds. These really piss people off, myself included. I really hate the interstitial experience. I understand that it’s an advertising thing. But, yeah, Google hates them too. Not quite enough to kick Forbes out of their SERPs entirely yet, but, fingers crossed, it will happen sometime soon. They have certainly removed plenty of other folks who have gone with invasive or overly heavy interstitials over the years and made things pretty tough for them.

What are the factors that matter for SEO?

A) Timing

Well, it turns out timing is a big one. So when the element appears matters. Basically, if the element shows up initially upon page load, Google will consider it differently than if it shows up after a few minutes. So, for example, if you have a “Sign Up Now” overlay that pops up the second you visit the page, that’s going to be treated differently than something that happens when you’re 80% of the way through, or you’ve just finished scrolling through an entire blog post. That will get treated very differently. Or it may actually have no effect on how Google treats the SEO, and then it really comes down to how users respond.

Then there’s how long it lasts as well. So for interstitials, especially advertising interstitials, there are some issues governing that for sites like Forbes. There are also some issues around an overlay that can’t be closed and how long a window can stay up, especially if it shows advertising and those types of things. Generally speaking, obviously, shorter is better, but you can get into trouble even with very short ones.
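For illustration, here is a minimal client-side sketch of the kind of deferred trigger described above: the overlay only fires once the reader has scrolled most of the way through the content, rather than on page load. The scroll threshold and the showNewsletterOverlay function are assumptions for the example, not a prescription.

```typescript
// Minimal sketch (browser TypeScript): defer a newsletter overlay until the
// reader has scrolled ~80% of the way through the page instead of firing it on load.
declare function showNewsletterOverlay(): void; // hypothetical; renders a dismissible overlay

const SCROLL_THRESHOLD = 0.8;
let overlayShown = false;

function onScroll(): void {
  if (overlayShown) return;
  const scrolledTo = window.scrollY + window.innerHeight;
  const pageHeight = document.documentElement.scrollHeight;
  if (scrolledTo / pageHeight >= SCROLL_THRESHOLD) {
    overlayShown = true;
    window.removeEventListener("scroll", onScroll);
    showNewsletterOverlay();
  }
}

window.addEventListener("scroll", onScroll, { passive: true });
```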

B) Interaction

Can that element easily be closed, and does it interfere with the content or readability? So Google’s new mobile guidelines, I think as of just a few months ago, now state that if an overlay or a modal or something interferes with a visitor’s ability to read the actual content on the page, Google may penalize those or remove their mobile-friendly tags and remove any mobile-friendly benefit. That’s obviously quite concerning for SEO.

C) Content

So there’s an exception or an exclusion to a lot of Google’s rules around this, which is if you have an element that is essentially asking for the user’s age, or asking for some form of legal consent, or giving a warning about cookies, which is very popular in the EU, of course, and the UK because of the legal requirements around saying, “Hey, this website uses cookies,” and you have to agree to it, those kinds of things, that actually gets around Google’s issues. So Google will not give you a hard time if you have an overlay interstitial or modal that says, “Are you of legal drinking age in your country? Enter your birth date to continue.” They will not necessarily penalize those types of things.

Advertising, on the other hand, could get you into more trouble, as we have discussed. If it’s a call to action for the website itself, again, that could go either way. If it’s part of the user experience, generally you are just fine there. Meaning something like a modal where you get to a website and then you say, “Hey, I want to leave a comment,” and so there’s a modal that makes you log in, that type of a modal. Or you click on an image and it shows you a larger version of that image in a modal, again, no problem. That’s part of the user experience.

D) Conditions

Conditions matter as well. So if it is triggered from SERP visits versus not, meaning that if you have an exclusionary protocol in your interstitial, your overlay, your modal that says, “Hey, if someone’s visiting from Google, don’t show this to them,” or “If someone’s visiting from Bing, someone’s visiting from DuckDuckGo, don’t show this to them,” that can change how the search engines perceive it as well.
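As a rough illustration of that kind of exclusionary protocol, the sketch below skips the overlay when document.referrer looks like a search engine results page. The hostname fragments and the showOverlay function are assumptions for the example.

```typescript
// Minimal sketch: suppress the overlay for visits that arrive from a search results page.
declare function showOverlay(): void; // hypothetical overlay trigger

const SEARCH_REFERRER_FRAGMENTS = ["google.", "bing.", "duckduckgo."];

function cameFromSearch(): boolean {
  try {
    const refHost = new URL(document.referrer).hostname;
    return SEARCH_REFERRER_FRAGMENTS.some((fragment) => refHost.includes(fragment));
  } catch {
    return false; // empty or malformed referrer
  }
}

if (!cameFromSearch()) {
  showOverlay();
}
```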

It’s also the case that this can change if you only show it to cookied, logged-in, or logged-out types of users. Now, showing it to logged-out users means that everyone coming from a search engine could or will get it. But for logged-in users, for example, you can imagine that if you visit a page on a social media site and there’s a modal or an overlay that includes some notification around activity that you’ve already been performing on the site, that becomes more a part of the user experience. That’s not necessarily going to harm you.

Where it can hurt is the other way around, where you get visitors from search engines, they are logged out, and you require them to log in before seeing the content. Quora had a big issue with this for a long time, and they seem to have mostly resolved that through a variety of measures, and they’re fairly sophisticated about it. But you can see that Facebook still struggles with this, because a lot of their content, they demand that you log in before you can ever view or access it. That does keep some of their results out of Google, or certainly ranking lower.

E) Engagement impact

I think this is what Google’s ultimately trying to measure and what they’re trying to essentially say, “Hey, this is why we have these issues around this,” which is if you are hurting the click-through rate or you’re increasing pogo-sticking, meaning that more people are clicking onto your website from Google and then immediately clicking the Back button when one of these things appears, that is a sign to Google that you have provided a poor user experience, that people are not willing to jump through whatever hoop you’ve created for them to get access to your content, and that suggests they don’t want to get there. So this is sort of the ultimate thing that you should be measuring. Some of the other factors can still hurt you even if your engagement looks okay, but this is the big one.

Best practices

So some best practices around using all these types of elements on your website. I would strongly urge you to avoid elements that are significantly harming UX. If you’re willing to take a small sacrifice in user experience in exchange for a great deal of value because you capture people’s email addresses or you get more engagement of other different kinds, okay. But this would be something I’d watch.

There are four metrics that I’d urge you to check out to compare whether this is doing the right thing. Those are:

  • Bounce rate
  • Browse rate
  • Return visitor rates, meaning the percentage of people who come back to your site again and again, and
  • Time on site after the element appears

So those four will help tell you whether you are truly interfering badly with user experience.
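One hedged way to instrument this is to fire analytics events when the element appears and when it’s dismissed, so those four metrics can be segmented by visitors who saw it versus those who didn’t. The trackEvent helper below is hypothetical; wire it to whatever analytics tool you already use.

```typescript
// Hypothetical instrumentation: record when the overlay appears and when it's dismissed,
// so engagement metrics can be segmented by exposure to the element.
declare function trackEvent(name: string, data?: Record<string, unknown>): void; // assumed wrapper

let overlayShownAt: number | null = null;

function onOverlayShown(): void {
  overlayShownAt = Date.now();
  trackEvent("overlay_shown");
}

function onOverlayDismissed(): void {
  if (overlayShownAt !== null) {
    trackEvent("overlay_dismissed", {
      secondsVisible: Math.round((Date.now() - overlayShownAt) / 1000),
    });
  }
}
```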

On mobile, ensure that your crucial content is not covered up, that the reading experience, the browsing experience isn’t covered up by one of these elements. Please, whatever you do, make those elements easy and obvious to dismiss. This is part of Google’s guidelines around it, but it’s also a best practice, and it will certainly help your user experience metrics.

Only choose to keep one of these elements if you are finding that the sacrifice… and there’s almost always a sacrifice cost, like you will hurt bounce rate or browse rate or return visitor rate or time on site. You will hurt it. The question is, is it a slight enough hurt in exchange for enough gain, and that’s that trade-off that you need to decide whether it’s worth it. I think if you are hurting visitor interaction by a few seconds on average per visit, but you are getting 5% of your visitors to give you an email address, that’s probably worth it. If it’s more like 30 seconds and 1%, maybe not as good.

Consider removing the elements from triggering if the visit comes from search engines. So if you’re finding that this works fine and great, but you’re having issues around search guidelines, you could consider potentially just removing the element from any visit that comes directly from a search engine and instead placing that in the content itself or letting it happen on a second page load, assuming that your browse rate is decently high. That’s a fine way to go as well.

If you are trying to get the most effective value out of these types of elements, it tends to be the case that the less common and less well used the visual element is, the more interaction and engagement it’s going to get. But the other side of that coin is that it can create a more frustrating experience. So if people are not familiar with the overlay or modal or interstitial visual layout design that you’ve chosen, they may engage more with it. They might not dismiss it out of hand, because they’re not used to it yet, but they can also get more frustrated by it. So, again, return to looking at those metrics.

With that in mind, hopefully you will effectively, and not too harmfully to your SEO, be able to use these pop-ups, overlays, interstitials, modals, and all other forms of elements that interfere with user experience.

And we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


There’s No Such Thing as a Site Migration

Posted by jonoalderson

Websites, like the businesses who operate them, are often deceptively complicated machines.

They’re fragile systems, and changing or replacing any one of the parts can easily affect (or even break) the whole setup — often in ways not immediately obvious to stakeholders or developers.

Even seemingly simple sites are often powered by complex technology, like content management systems, databases, and templating engines. There’s much more going on behind the scenes — technically and organizationally — than you can easily observe by crawling a site or viewing the source code.

When you change a website and remove or add elements, it’s not uncommon to introduce new errors, flaws, or faults.

That’s why I get extremely nervous whenever I hear a client or business announce that they’re intending to undergo a “site migration.”

Chances are, and experience suggests, that something’s going to go wrong.


Migrations vary wildly in scope

As an SEO consultant and practitioner, I’ve been involved in more “site migrations” than I can remember or count — for charities, startups, international e-commerce sites, and even global household brands. Every one has been uniquely challenging and stressful.

In each case, the businesses involved have underestimated (and in some cases, increased) the complexity, the risk, and the details involved in successfully executing their “migration.”

As a result, many of these projects negatively impacted performance and potential in ways that could have been easily avoided.

This isn’t a case of the scope of the “migration” being too big, but rather, a misalignment of understanding, objectives, methods, and priorities, resulting in stakeholders working on entirely different scopes.

The migrations I’ve experienced have varied from simple domain transfers to complete overhauls of server infrastructure, content management frameworks, templates, and pages — sometimes even scaling up to include the consolidation (or fragmentation) of multiple websites and brands.

In the minds of each organization, however, these have all been “migration” projects despite their significantly varying (and poorly defined) scopes. In each case, the definition and understanding of the word “migration” has varied wildly.

We suck at definitions

As an industry, we’re used to struggling with labels. We’re still not sure if we’re SEOs, inbound marketers, digital marketers, or just… marketers. The problem is that, when we speak to each other (and those outside of our industry), these words can carry different meaning and expectations.

Even amongst ourselves, a conversation between two digital marketers, analysts, or SEOs about their fields of expertise is likely to reveal that they have surprisingly different definitions of their roles, responsibilities, and remits. To them, words like “content” or “platform” might mean different things.

In the same way, “site migrations” vary wildly, in form, function, and execution — and when we discuss them, we’re not necessarily talking about the same thing. If we don’t clarify our meanings and have shared definitions, we risk misunderstandings, errors, or even offense.

Ambiguity creates risk

Poorly managed migrations can have a number of consequences beyond just drops in rankings, traffic, and performance. There are secondary impacts, too. They can also inadvertently:

  • Provide a poor user experience (e.g., old URLs now 404, or error states are confusing to users, or a user reaches a page different from what they expected).
  • Break or omit tracking and/or analytics implementations, resulting in loss of business intelligence.
  • Limit the size, shape, or scalability of a site, resulting in static, stagnant, or inflexible templates and content (e.g., omitting the ability to add or edit pages, content, and/or sections in a CMS), and a site which struggles to compete as a result.
  • Miss opportunities to benefit from what SEOs do best: blending an understanding of consumer demand and behavior, the market and competitors, and the brand in question to create more effective strategies, functionality and content.
  • Create conflict between stakeholders, when we need to “hustle” at the last minute to retrofit our requirements into an already complex project (“I know it’s about to go live, but PLEASE can we add analytics conversion tracking?”) — often at the cost of our reputation.
  • Waste future resource, where mistakes mean that future resource is spent recouping equity lost to faults or omissions in the process, rather than building on and enhancing performance.

I should point out that there’s nothing wrong with hustle in this case; that, in fact, begging, borrowing, and stealing can often be a viable solution in these kinds of scenarios. There’s been more than one occasion when, late at night before a site migration, I’ve averted disaster by literally begging developers to include template review processes, to implement redirects, or to stall deployments.

But this isn’t a sensible or sustainable or reliable way of working.

Mistakes will inevitably be made. Resources, favors, and patience are finite. Too much reliance on “hustle” from individuals (or multiple individuals) may in fact further widen the gap in understanding and scope, and positions the hustler as a single point of failure.

More importantly, hustle may only fix the symptoms, not the cause of these issues. That means that we remain stuck in a role as the disruptive outsiders who constantly squeeze in extra unscoped requirements at the eleventh hour.

Where things go wrong

If we’re to begin to address some of these challenges, we need to understand when, where, and why migration projects go wrong.

The root cause of all less-than-perfect migrations can be traced to at least one of the following scenarios:

  • The migration project occurs without consultation.
  • Consultation is sought too late in the process, and/or after the migration.
  • There is insufficient planned resource/time/budget to add requirements (or processes)/make recommended changes to the brief.
  • The scope is changed mid-project, without consultation, or in a way which de-prioritizes requirements.
  • Requirements and/or recommended changes are axed at the eleventh hour (due to resource/time/budget limitations, or educational/political conflicts).

There’s a common theme in each of these cases. We’re not involved early enough in the process, or our opinions and priorities don’t carry sufficient weight to impact timelines and resources.

Chances are, these mistakes are rarely the product of spite or of intentional omission; rather, they’re born of gaps in the education and experience of the stakeholders and decision-makers involved.

We can address this, to a degree, by elevating ourselves to senior stakeholders in these kinds of projects, and by being consulted much earlier in the timeline.

Let’s be more specific

I think that it’s our responsibility to help the organizations we work for to avoid these mistakes. One of the easiest opportunities to do that is to make sure that we’re talking about the same thing, as early in the process as possible.

Otherwise, migrations will continue to go wrong, and we will continue to spend far too much of our collective time fixing broken links, recommending changes or improvements to templates, and holding together bruised-and-broken websites — all at the expense of doing meaningful, impactful work.

Perhaps we can begin to answer to some of these challenges by creating better definitions and helping to clarify exactly what’s involved in a “site migration” process.

Unfortunately, I suspect that we’re stuck with the word “migration,” at least for now. It’s a term which is already widely used, which people think is a correct and appropriate definition. It’s unrealistic to try to change everybody else’s language when we’re already too late to the conversation.

Our next best opportunity to reduce ambiguity and risk is to codify the types of migration. This gives us a chance to prompt further exploration and better definitions.

For example, if we can say “This sounds like it’s actually a domain migration paired with a template migration,” we can steer the conversation a little and rely on a much better shared frame of reference.

If we can raise a challenge that, e.g., the “translation project” a different part of the business is working on is actually a whole bunch of interwoven migration types, then we can raise our concerns earlier and pursue more appropriate resource, budget, and authority (e.g., “This project actually consists of a series of migrations involving templates, content, and domains. Therefore, it’s imperative that we also consider X and Y as part of the project scope.”).

By persisting in labelling this way, stakeholders may gradually come to understand that, e.g., changing the design typically also involves changing the templates, and so the SEO folks should really be involved earlier in the process. By challenging the language, we can challenge the thinking.

Let’s codify migration types

I’ve identified at least seven distinct types of migration. Next time you encounter a “migration” project, you can investigate the proposed changes, map them back to these types, and flag any gaps in understanding, expectations, and resource.

You could argue that some of these aren’t strictly “migrations” in a technical sense (i.e., changing something isn’t the same as moving it), but grouping them this way is intentional.

Remember, our goal here isn’t to neatly categorize all of the requirements for any possible type of migration. There are plenty of resources, guides, and lists which already try to do that.

Instead, we’re trying to provide neat, universal labels which help us (the SEO folks) and them (the business stakeholders) to have shared definitions and to remove unknown unknowns.

They’re a set of shared definitions which we can use to trigger early warning signals, and to help us better manage stakeholder expectations.

Feel free to suggest your own, to grow, shrink, combine, or bin any of these to fit your own experience and requirements!

1. Hosting migrations

A broad bundling of infrastructure, hardware, and server considerations (while these are each broad categories in their own right, it makes sense to bundle them together in this context).

If your migration project contains any of the following changes, you’re talking about a hosting migration, and you’ll need to explore the SEO implications (and development resource requirements) to make sure that changes to the underlying platform don’t impact front-end performance or visibility.

  • You’re changing hosting provider.
  • You’re changing, adding, or removing server locations.
  • You’re altering the specifications of your physical (or virtual) servers (e.g., RAM, CPU, storage, hardware types, etc).
  • You’re changing your server technology stack (e.g., moving from Apache to Nginx).*
  • You’re implementing or removing load balancing, mirroring, or extra server environments.
  • You’re implementing or altering caching systems (database, static page caches, varnish, object, memcached, etc).
  • You’re altering the physical or server security protocols and features.**
  • You’re changing, adding or removing CDNs.***

*Might overlap into a software migration if the changes affect the configuration or behavior of any front-end components (e.g., the CMS).

**Might overlap into other migrations, depending on how this manifests (e.g., template, software, domain).

***Might overlap into a domain migration if the CDN is presented as/on a distinct hostname (e.g., AWS), rather than invisibly (e.g., Cloudflare).

2. Software migrations

Unless your website consists purely of static HTML files, chances are that it’s running some kind of software to serve the right pages, behaviors, and content to users.

If your migration project contains any of the following changes, you’re talking about a software migration, and you’ll need to understand (and input into) how things like managing error codes, site functionality, and back-end behavior work.

  • You’re changing CMS.
  • You’re adding or removing plugins/modules/add-ons in your CMS.
  • You’re upgrading or downgrading the CMS, or plugins/modules/add-ons (by a significant degree/major release).
  • You’re changing the language used to render the website (e.g., adopting Angular2 or NodeJS).
  • You’re developing new functionality on the website (forms, processes, widgets, tools).
  • You’re merging platforms; e.g., a blog which operated on a separate domain and system is being integrated into a single CMS.*

*Might overlap into a domain migration if you’re absorbing software which was previously located/accessed on a different domain.

3. Domain migrations

Domain migrations can be pleasantly straightforward if executed in isolation, but this is rarely the case. Changes to domains are often paired with (or the result of) other structural and functional changes.

If your migration project alters the URL(s) by which users are able to reach your website, or contains any of the following changes, then you’re talking about a domain migration, and you need to consider how redirects, protocols (e.g., HTTP/S), hostnames (e.g., www/non-www), and branding are impacted (a minimal redirect sketch follows the list below).

  • You’re changing the main domain of your website.
  • You’re buying/adding new domains to your ecosystem.
  • You’re adding or removing subdomains (e.g., removing domain sharding following a migration to HTTP2).
  • You’re moving a website, or part of a website, between domains (e.g., moving a blog on a subdomain into a subfolder, or vice-versa).
  • You’re intentionally allowing an active domain to expire.
  • You’re purchasing an expired/dropped domain.
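As a minimal sketch of the redirect piece (assuming a Node/Express stack, which may not match yours), the idea is to 301 every request on the old hostname to the same path on the new one so link equity and bookmarks carry over. The domain names are placeholders.

```typescript
// Minimal sketch (Node + Express, an assumed stack): 301 every request for the old
// domain to the same path on the new one.
import express from "express";

const app = express();
const OLD_HOST = "old-brand.example";               // placeholder
const NEW_ORIGIN = "https://www.new-brand.example"; // placeholder

app.use((req, res, next) => {
  if (req.hostname === OLD_HOST) {
    return res.redirect(301, `${NEW_ORIGIN}${req.originalUrl}`); // path + query preserved
  }
  next();
});

app.listen(3000);
```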

4. Template migrations

Chances are that your website uses a number of HTML templates, which control the structure, layout, and peripheral content of your pages. The logic which controls how your content looks, feels, and behaves (as well as the behavior of hidden/meta elements like descriptions or canonical URLs) tends to live here.

If your migration project alters elements like your internal navigation (e.g., the header or footer), elements in your <head>, or otherwise changes the page structure around your content in the ways I’ve outlined, then you’re talking about a template migration. You’ll need to consider how users and search engines perceive and engage with your pages, how context, relevance, and authority flow through internal linking structures, and how well-structured your HTML (and JS/CSS) code is.

  • You’re making changes to internal navigation.
  • You’re changing the layout and structure of important pages/templates (e.g., homepage, product pages).
  • You’re adding or removing template components (e.g., sidebars, interstitials).
  • You’re changing elements in your <head> code, like title, canonical, or hreflang tags (see the sketch after this list).
  • You’re adding or removing specific templates (e.g., a template which shows all the blog posts by a specific author).
  • You’re changing the URL pattern used by one or more templates.
  • You’re making changes to how device-specific rendering works*

*Might involve domain, software, and/or hosting migrations, depending on implementation mechanics.
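Here is an illustrative sketch of the per-page head elements a template migration needs to carry over. The PageMeta shape and the helper name are invented for the example; the point is simply that title, canonical, and hreflang output should survive the move intact.

```typescript
// Illustrative only: the per-page <head> output that should survive a template migration.
// The PageMeta shape and helper name are invented for this example.
interface PageMeta {
  title: string;
  canonicalUrl: string;
  alternates: Record<string, string>; // e.g. { "en-gb": "https://example.com/uk/" }
}

function buildHeadTags(page: PageMeta): string {
  const hreflangTags = Object.entries(page.alternates)
    .map(([lang, href]) => `<link rel="alternate" hreflang="${lang}" href="${href}" />`)
    .join("\n");
  return [
    `<title>${page.title}</title>`,
    `<link rel="canonical" href="${page.canonicalUrl}" />`,
    hreflangTags,
  ].join("\n");
}
```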

5. Content migrations

Your content is everything which attracts, engages with, and convinces users that you’re the best brand to answer their questions and meet their needs. That includes the words you use to describe your products and services, the things you talk about on your blog, and every image and video you produce or use.

If your migration project significantly changes the tone (including language, demographic targeting, etc.), format, or quantity/quality of your content in the ways I’ve outlined, then you’re talking about a content migration. You’ll need to consider the needs of your market and audience, how the words and media on your website answer those needs, and how well they do so in comparison with your competitors.

  • You significantly increase or reduce the number of pages on your website.
  • You significantly change the tone, targeting, or focus of your content.
  • You begin to produce content on/about a new topic.
  • You translate and/or internationalize your content.*
  • You change the categorization, tagging, or other classification system on your blog or product content.**
  • You use tools like canonical tags, meta robots indexation directives, or robots.txt files to control how search engines (and other bots) access and attribute value to a content piece (individually or at scale).

*Might involve domain, software and/or hosting, and template migrations, depending on implementation mechanics.

**May overlap into a template migration if the layout and/or URL structure changes as a result.

6. Design migrations

The look and feel of your website doesn’t necessarily directly impact your performance (though user signals like engagement and trust certainly do). However, simple changes to design components can often have unintended knock-on effects and consequences.

If your migration project contains any of the following changes, you’re talking about a design migration, and you’ll need to clarify whether changes are purely cosmetic or whether they go deeper and impact other areas.

  • You’re changing the look and feel of key pages (like your homepage).*
  • You’re adding or removing interaction layers, e.g. conditionally hiding content based on device or state.*
  • You’re making design/creative changes which change the HTML (as opposed to just images or CSS files) of specific elements.*
  • You’re changing key messaging, like logos and brand slogans.
  • You’re altering the look and feel to react to changing strategies or monetization models (e.g., introducing space for ads in a sidebar, or removing ads in favor of using interstitial popups/states).
  • You’re changing images and media.**

*All template migrations.

**Don’t forget to 301 redirect these, unless you’re replacing like-for-like filenames (which isn’t always best practice if you wish to invalidate local or remote caches).

7. Strategy migrations

A change in organizational or marketing strategy might not directly impact the website, but a widening gap between a brand’s audience, objectives, and platform can have a significant impact on performance.

If your market or audience (or your understanding of it) changes significantly, or if your mission, your reputation, or the way in which you describe your products/services/purpose changes, then you’re talking about a strategy migration. You’ll need to consider how you structure your website, how you target your audiences, how you write content, and how you campaign (all of which might trigger a set of new migration projects!).

  • You change the company mission statement.
  • You change the website’s key objectives, goals, or metrics.
  • You enter a new marketplace (or leave one).
  • Your channel focus (and/or your audience’s) changes significantly.
  • A competitor disrupts the market and/or takes a significant amount of your market share.
  • Responsibility for the website/its performance/SEO/digital changes.
  • You appoint a new agency or team responsible for the website’s performance.
  • Senior/C-level stakeholders leave or join.
  • Changes in legal frameworks (e.g. privacy compliance or new/changing content restrictions in prescriptive sectors) constrain your publishing/content capabilities.

Let’s get in earlier

Armed with better definitions, we can begin to force a more considered conversation around what a “migration” project actually involves. We can use a shared language and ensure that stakeholders understand the risks and opportunities of the changes they intend to make.

Unfortunately, however, we don’t always hear about proposed changes until they’ve already been decided and signed off.

People don’t know that they need to tell us that they’re changing domain, templates, hosting, etc. So it’s often too late when — or if — we finally get involved. Decisions have already been made before they trickle down into our awareness.

That’s still a problem.

By the time you’re aware of a project, it’s usually too late to impact it.

While our new-and-improved definitions are a great starting place to catch risks as you encounter them, avoiding those risks altogether requires us to develop a much better understanding of how, where, and when migrations are planned, managed, and start to go wrong.

Let’s identify trigger points

I’ve identified four common scenarios which lead to organizations deciding to undergo a migration project.

If you can keep your ears to the ground and spot these types of events unfolding, you have an opportunity to give yourself permission to insert yourself into the conversation, and to interrogate to find out exactly which types of migrations might be looming.

It’s worth finding ways to get added to deployment lists and notifications, internal project management tools, and other systems so that you can look for early warning signs (without creating unnecessary overhead and comms processes).

1. Mergers, acquisitions, and closures

When brands are bought, sold, or merged, this almost universally triggers changes to their websites. These requirements are often dictated from on-high, and there’s limited (or no) opportunity to impact the brief.

Migration strategies in these situations are rarely comfortable, and almost always defensive by nature (focusing on minimizing impact/cost rather than capitalizing upon opportunity).

Typically, these kinds of scenarios manifest in a small number of ways:

  • The “parent” brand absorbs the website of the purchased brand into their own website; either by “bolting it on” to their existing architecture, moving it to a subdomain/folder, or by distributing salvageable content throughout their existing site and killing the old one (often triggering most, if not every type of migration).
  • The purchased brand website remains where it is, but undergoes a design migration and possibly template migrations to align it with the parent brand.
  • A brand website is retired and redirected (a domain migration).

2. Rebrands

All sorts of pressures and opportunities lead to rebranding activity. Pressures to remain relevant, to reposition within marketplaces, or change how the brand represents itself can trigger migration requirements — though these activities are often led by brand and creative teams who don’t necessarily understand the implications.

Often, the outcome of branding processes and initiatives creates a new or alternate understanding of markets and consumers, and/or creates new guidelines/collateral/creative which must be reflected on the website(s). Typically, this can result in:

  • Changes to core/target audiences, and the content or language/phrasing used to communicate with them (strategy and content migrations; more if this involves, for example, opening up to international audiences).
  • New collateral, replacing or adding to existing media, content, and messaging (content and design migrations).
  • Changes to website structure and domain names (template and domain migrations) to align to new branding requirements.

3. C-level vision

It’s not uncommon for senior stakeholders to decide that the strategy to save a struggling business, to grow into new markets, or to make their mark on an organization is to launch a brand-new, shiny website.

These kinds of decisions often involve a scorched-earth approach, tearing down the work of their predecessors or of previously under-performing strategies. And the more senior the decision-maker, the less likely they’ll understand the implications of their decisions.

In these kinds of scenarios, your best opportunity to avert disaster is to watch for warning signs and to make yourself heard before it’s too late. In particular, you can watch out for:

  • Senior stakeholders with marketing, IT, or C-level responsibilities joining, leaving, or being replaced (in particular if in relation to poor performance).
  • Boards of directors, investors, or similar pressuring web/digital teams for unrealistic performance goals (based on current performance/constraints).
  • Gradual reduction in budget and resource for day-to-day management and improvements to the website (as a likely prelude to a big strategy migration).
  • New agencies being brought on board to optimize website performance, who’re hindered by the current framework/constraints.
  • The adoption of new martech and marketing automation software.*

*Integrations of solutions like SalesForce, Marketo, and similar sometimes rely on utilizing proxied subdomains, embedded forms/content, and other mechanics which will need careful consideration as part of a template migration.

4. Technical or financial necessity

The current website is in such a poor, restrictive, or cost-ineffective condition that it makes it impossible to adopt new-and-required improvements (such as compliance with new standards, an integration of new martech stacks, changes following a brand purchase/merger, etc).

Generally, like the kinds of C-level “new website” initiatives I’ve outlined above, these result in scorched earth solutions.

Particularly frustrating, these are the kinds of migration projects which you yourself may well argue and fight for, for years on end, only to then find that they’ve been scoped (and maybe even begun or completed) without your input or awareness.

Here are some danger signs to watch out for which might mean that your migration project is imminent (or, at least, definitely required):

  • Licensing costs for parts or the whole platform become cost-prohibitive (e.g., enterprise CMS platforms, user seats, developer training, etc).
  • The software or hardware skill set required to maintain the site becomes rarer or more expensive (e.g., outdated technologies).
  • Minor-but-urgent technical changes take more than six months to implement.
  • New technical implementations/integrations are agreed upon in principle, budgeted for, but not implemented.
  • The technical backlog of tasks grows faster than it shrinks as it fills with breakages and fixes rather than new features, initiatives, and improvements.
  • The website ecosystem doesn’t support the organization’s ways of working (e.g., the organization adopts agile methodologies, but the website only supports waterfall-style codebase releases).
  • Key technology which underpins the site is being deprecated, and there’s no easy upgrade path.*

*Will likely trigger hosting or software migrations.

Let’s not count on this

While this kind of labelling undoubtedly goes some way to helping us spot and better manage migrations, it’s far from a perfect or complete system.

In fact, I suspect it may be far too ambitious, and unrealistic in its aspiration. Accessing conversations early enough — and being listened to and empowered in those conversations — relies on the goodwill and openness of companies who aren’t always completely bought into or enamored with SEO.

This will only work in an organization which is open to this kind of thinking and internal challenging — and chances are, they’re not the kinds of organizations who are routinely breaking their websites. The very people who need our help and this kind of system are fundamentally unsuited to receive it.

I suspect, then, it might be impossible in many cases to make the kinds of changes required to shift behaviors and catch these problems earlier. In most organizations, at least.

Avoiding disasters resulting from ambiguous migration projects relies heavily on broad education. Everything else aside, people tend to change companies faster than you can build deep enough tribal knowledge.

That doesn’t mean that the structure isn’t still valuable, however. The types of changes and triggers I’ve outlined can still be used as alarm bells and direction for your own use.

Let’s get real

If you can’t effectively educate stakeholders on the complexities and impact of them making changes to their website, there are more “lightweight” solutions.

At the very least, you can turn these kinds of items (and expand with your own, and in more detail) into simple lists which can be printed off, laminated, and stuck to a wall. If nothing else, perhaps you’ll remind somebody to pick up the phone to the SEO team when they recognize an issue.

In a more pragmatic world, stakeholders don’t necessarily have to understand the nuance or the detail if they at least understand that they’re meant to ask for help when they’re changing domain, for example, or adding new templates to their website.

Whilst this doesn’t solve the underlying problems, it does provide a mechanism through which the damage can be systematically avoided or limited. You can identify problems earlier and be part of the conversation.

If it’s still too late and things do go wrong, you’ll have something you can point to and say “I told you so,” or, more constructively perhaps, “Here’s the resource you need to avoid this happening next time.”

And in your moment of self-righteous vindication, having successfully made it through this post and now armed to save your company from a botched migration project, you can migrate over to the bar. Good work, you.


Thanks to…

This turned into a monster of a post, and its scope meant that it almost never made it to print. Thanks to a few folks in particular for helping me to shape, form, and ship it. In particular:

  • Hannah Thorpe, for help in exploring and structuring the initial concept.
  • Greg Mitchell, for a heavy dose of pragmatism in the conclusion.
  • Gerry White, for some insightful additions and the removal of dozens of typos.
  • Sam Simpson for putting up with me spending hours rambling and ranting at her about failed site migrations.


The State of Links: Yesterday’s Ranking Factor?

Posted by Tom.Capper

Back in September last year, I was lucky enough to see Rand speak at MozCon. His talk was about link building and the main types of strategy that he saw as still being relevant and effective today. During his introduction, he said something that really got me thinking, about how the whole purpose of links and PageRank had been to approximate traffic.


Essentially, back in the late ’90s, links were a much bigger part of how we experienced the web — think of hubs like Excite, AOL, and Yahoo. Google’s big innovation was to realize that, because people navigated the web by clicking on links, they could approximate the relative popularity of pages by looking at those links.

So many links, such little time.

Rand pointed out that, given all the information at their disposal in the present day — as an Internet Service Provider, a search engine, a browser, an operating system, and so on — Google could now far more accurately model whether a link drives traffic, so you shouldn’t aim to build links that don’t drive traffic. This is a pretty big step forward from the link-building tactics of old, but it occurred to me that it probably doesn’t go far enough.

If Google has enough data to figure out which links are genuinely driving traffic, why bother with links at all? The whole point was to figure out which sites and pages were popular, and they can now answer that question directly. (It’s worth noting that there’s a dichotomy between “popular” and “trustworthy” that I don’t want to get too stuck into, but which isn’t too big a deal here given that both can be inferred from either link-based data sources, or from non-link-based data sources — for example, SERP click-through rate might correlate well with “trustworthy,” while “search volume” might correlate well with “popular”).

However, there’s plenty of evidence out there suggesting that Google is in fact still making significant use of links as a ranking factor, so I decided to set out to challenge the data on both sides of that argument. The end result of that research is this post.

The horse’s mouth

One reasonably authoritative source on matters relating to Google is Google themselves. Google has been fairly unequivocal, even in recent times, that links are still a big deal. For example:

  • March 2016: Google Senior Search Quality Strategist Andrey Lipattsev confirms that content and links are the first and second greatest ranking factors. (The full quote is: “Yes; I can tell you what they [the number 1 and 2 ranking factors] are. It’s content, and links pointing to your site.”)
  • April 2014: Matt Cutts confirms that Google has tested search quality without links, and found it to be inferior.
  • October 2016: Gary Illyes implies that text links continue to be valuable while playing down the concept of Domain Authority.

Then, of course, there’s their continued focus on unnatural backlinks and so on — none of which would be necessary in a world where links are not a ranking factor.

However, I’d argue that this doesn’t indicate the end of our discussion before it’s even begun. Firstly, Google has a great track record of giving out dodgy SEO advice. Consider HTTPS migrations pre-2016. Will Critchlow talked at SearchLove San Diego about how Google’s algorithms are at a level of complexity and opaqueness where they’re no longer even trying to understand them themselves — and of course there are numerous stories of unintentional behaviors from machine learning algorithms out in the wild.

Third-party correlation studies

It’s not difficult to put together your own data and show a correlation between link-based metrics and rankings. Take, for example:

  • Moz’s most recent study in 2015, showing strong relationships between link-based factors and rankings across the board.
  • This more recent study by Stone Temple Consulting.

However, these studies run into significant issues with correlation vs. causation.

There are three main mechanisms which could explain the relationships that they show:

  1. Getting more links causes sites to rank higher (yay!)
  2. Ranking higher causes sites to get more links
  3. Some third factor, such as brand awareness, is related to both links and rankings, causing them to be correlated with each other despite the absence of a direct causal relationship

I’ve yet to see any correlation study that addresses these very serious shortcomings, or even particularly acknowledges them. Indeed, I’m not sure that it would even be possible to do so given the available data, but this does show that as an industry we need to apply some critical thinking to the advice that we’re consuming.

However, earlier this year I did write up some research of my own here on the Moz Blog, demonstrating that brand awareness could in fact be a more useful factor than links for predicting rankings.


The problem with this study was that it showed a relationship that was concrete (i.e. extremely statistically significant), but that was surprisingly lacking in explanatory power. Indeed, I discussed in that post how I’d ended up with a correlation that was far lower than Moz’s for Domain Authority.

Fortunately, Malcolm Slade recently discussed some of his very similar research at BrightonSEO, in which he finds broad correlations similar to mine between brand factors and rankings, but far, far stronger correlations for certain types of query, especially big, high-volume, highly competitive head terms.

So what can we conclude overall from these third-party studies? Two main things:

  1. We should take with a large pinch of salt any study that does not address the possibilities of reverse causation, or a jointly-causing third factor.
  2. Links can add very little explanatory power to a rankings prediction model based on branded search volume, at least at a domain level.

The real world: Why do rankings change?

At the end of the day, we’re interested in whether links are a ranking factor because we’re interested in whether we should be trying to use them to improve the rankings of our sites, or our clients’ sites.

Fluctuation

The first example I want to look at here is this graph, showing UK rankings for the keyword “flowers” from May to December last year:

The fact is that our traditional understanding of ranking changes — which breaks down into links, on-site, and algorithm changes — cannot explain this degree of rapid fluctuation. If you don’t believe me, the above data is available publicly through platforms like SEMRush and Searchmetrics, so try to dig into it yourself and see if there’s any external explanation.

This level and frequency of fluctuation is increasingly common for hotly contested terms, and it shows a tendency by Google to continuously iterate and optimize — just as marketers do when they’re optimizing a paid search advert, or a landing page, or an email campaign.

What is Google optimizing for?


The above slide is from Larry Kim’s presentation at SearchLove San Diego, and it shows how the highest SERP positions are gaining click-through rate over time, despite all the changes in Google Search (such as increased non-organic results) that ought to drive the opposite.

Larry’s suggestion is that this is a symptom of Google’s procedural optimization — not of the algorithm, but by the algorithm and of results. This certainly fits in with everything we’ve seen.

Successful link building

However, at the other end of the scale, we get examples like this:


The above graph (courtesy of STAT) shows rankings for the commercial keywords for Fleximize.com during a Distilled creative campaign. This is a particularly interesting example for two reasons:

  • Fleximize started off as a domain with relatively little equity, meaning that changes were measurable, and that there were fairly easy gains to be made
  • Nothing happened with the first two pieces (1, 2), even though they scored high-quality coverage and were seemingly very comparable to the third (3).

It seems that links did eventually move the needle here, and massively so, but the mechanisms at work are highly opaque.

The above two examples — “Flowers” and Fleximize — are just two real-world examples of ranking changes. I’ve picked one that seems obviously link-driven but a little strange, and one that shows how volatile things are for more competitive terms. I’m sure there are countless massive folders out there full of case studies that show links moving rankings — but the point is that it can happen, yet it isn’t always as simple as it seems.

How do we explain all of this?

A lot of the evidence I’ve gone through above is contradictory. Links are correlated with rankings, and Google says they’re important, and sometimes they clearly move the needle, but on the other hand brand awareness seems to explain away most of their statistical usefulness, and Google’s operating with more subtle methods in the data-rich top end.

My favored explanation right now for how this fits together is this:

  • There are two tiers — probably fuzzily separated.
  • At the top end, user signals — and factors that Google’s algorithms associate with user signals — are everything. For competitive queries with lots of search volume, links don’t tell Google anything it couldn’t figure out anyway, and links don’t help with the final refinement of fine-grained ordering.
  • However, links may still be a big part of how you qualify for that competition in the top end.

This is very much a work in progress, however, and I’d love to see other people’s thoughts, and especially their fresh research. Let me know what you think in the comments below.


Half of Page-1 Google Results Are Now HTTPS

Posted by Dr-Pete

Just over 9 months ago, I wrote that 30% of page-1 Google results in our 10,000-keyword tracking set were secure (HTTPS). As of earlier this week, that number topped 50%:

While there haven’t been any big jumps recently – suggesting this change is due to steady adoption of HTTPS and not a major algorithm update – the end result of a year of small changes is dramatic. More and more Google results are secure.

MozCast is, of course, just one data set, so I asked the folks at Rank Ranger, who operate a similar (but entirely different) tracking system, if they thought I was crazy…

Could we both be crazy? Absolutely. However, we operate completely independent systems with no shared data, so I think the consistency in these numbers suggests that we’re not wildly off.

What about the future?

Projecting the fairly stable trend line forward, the data suggests that HTTPS could hit about 65% of page-1 results by the end of 2017. The trend line is, of course, an educated guess at best, and many events could change the adoption rate of HTTPS pages.
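For what it’s worth, the rough arithmetic behind a projection like this is just a linear carry-forward of the observed rate of change. The figures below are approximations of the numbers discussed in this post, not the actual MozCast data.

```typescript
// Approximate, illustrative numbers only: ~30% of page-1 results nine months ago,
// ~50% now, carried forward linearly to the end of 2017.
const startShare = 30;     // % HTTPS, roughly nine months ago
const currentShare = 50;   // % HTTPS now
const monthsElapsed = 9;
const monthsRemaining = 7; // roughly until the end of 2017

const pointsPerMonth = (currentShare - startShare) / monthsElapsed;          // ~2.2 points/month
const projectedShare = currentShare + pointsPerMonth * monthsRemaining;      // ~65.6

console.log(`Projected end-of-2017 HTTPS share: ~${projectedShare.toFixed(1)}%`);
```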

I’ve speculated previously that, as the adoption rate increased, Google would have more freedom to bump up the algorithmic (i.e. ranking) boost for HTTPS pages. I asked Gary Illyes if such a plan was in the works, and he said “no”:

As with any Google statement, some of you will take this as gospel truth and some will take it as devilish lies. While he isn’t promising that Google will never boost the ranking benefits of HTTPS, I believe Gary on this one. I think Google is happy with the current adoption rate and wary of the collateral damage that an aggressive HTTPS ranking boost (or penalty) could cause. It makes sense that they would bide their time.

Who hasn’t converted?

One of the reasons Google may be proceeding with caution on another HTTPS boost (or penalty) is that not all of the big players have made the switch. Here are the Top 20 subdomains in the MozCast dataset, along with the percentage of ranking URLs that use HTTPS:

(1) en.wikipedia.org — 100.0%
(2) www.amazon.com — 99.9%
(3) www.facebook.com — 100.0%
(4) www.yelp.com — 99.7%
(5) www.youtube.com — 99.6%
(6) www.pinterest.com — 100.0%
(7) www.walmart.com — 100.0%
(8) www.tripadvisor.com — 99.7%
(9) www.webmd.com — 0.2%
(10) allrecipes.com — 0.0%
(11) www.target.com — 0.0%
(12) www.foodnetwork.com — 0.0%
(13) www.ebay.com — 0.0%
(14) play.google.com — 100.0%
(15) www.bestbuy.com — 0.0%
(16) www.mayoclinic.org — 0.0%
(17) www.homedepot.com — 0.0%
(18) www.indeed.com — 0.0%
(19) www.zillow.com — 100.0%
(20) shop.nordstrom.com — 0.0%

Of the Top 20, exactly half have switched to HTTPS, although most of the Top 10 have converted. Not surprisingly, switching is, with only minor exceptions, nearly all-or-none. Most sites naturally opt for a site-wide switch, at least after initial testing.

What should you do?

Even if Google doesn’t turn up the reward or penalty for HTTPS, other changes are in play, such as Chrome warning visitors about non-secure pages when those pages collect sensitive data. As the adoption rate increases, you can expect pressure to switch to increase.

For new sites, I’d recommend jumping in as soon as possible. Security certificates are inexpensive these days (some are free), and the risks are low. For existing sites, it’s a lot tougher. Any site-wide change carries risks, and there have certainly been a few horror stories this past year. At minimum, make sure to secure pages that collect sensitive information or process transactions, and keep your eyes open for more changes.
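If you do make the switch, the server-side piece usually boils down to a site-wide 301 from HTTP to HTTPS. Here is a minimal sketch assuming a Node/Express app behind a proxy or load balancer; your hosting setup will likely differ.

```typescript
// Minimal sketch (Node + Express behind a proxy, an assumed setup): force HTTPS
// with a site-wide 301 while preserving the requested path and query string.
import express from "express";

const app = express();
app.set("trust proxy", true); // lets req.secure respect X-Forwarded-Proto from the proxy

app.use((req, res, next) => {
  if (!req.secure) {
    return res.redirect(301, `https://${req.hostname}${req.originalUrl}`);
  }
  next();
});

app.listen(3000);
```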


Large Site SEO Basics: Faceted Navigation

Posted by sergeystefoglo

If you work on an enterprise site — particularly in e-commerce or listings (such as a job board site) — you probably use some sort of faceted navigation structure. Why wouldn’t you? It helps users filter down to their desired set of results fairly painlessly.

While helpful to users, it’s no secret that faceted navigation can be a nightmare for SEO. At Distilled, it’s not uncommon for us to get a client that has tens of millions of URLs that are live and indexable when they shouldn’t be. More often than not, this is due to their faceted nav setup.

There are a number of great posts out there that discuss what faceted navigation is and why it can be a problem for search engines, so I won’t go into much detail on this. A great place to start is this post from 2011.

What I want to focus on instead is narrowing this problem down to a simple question, and then provide the possible solutions to that question. The question we need to answer is, “What options do we have to decide what Google crawls/indexes, and what are their pros/cons?”

Brief overview of faceted navigation

As a quick refresher, we can define faceted navigation as any way to filter and/or sort results on a webpage by specific attributes that aren’t necessarily related, such as the color, processor type, and screen resolution of a laptop. Here is an example:

Because every possible combination of facets typically generates at least one unique URL, faceted navigation can create a few problems for SEO:

  1. It creates a lot of duplicate content, which is bad for various reasons.
  2. It eats up valuable crawl budget and can send Google incorrect signals.
  3. It dilutes link equity and passes equity to pages that we don’t even want indexed.

But first… some quick examples

It’s worth taking a few minutes and looking at some examples of faceted navigation that are probably hurting SEO. These are simple examples that illustrate how faceted navigation can (and usually does) become an issue.

Macy’s

First up, we have Macy’s. I’ve done a simple site:search for the domain and added “black dresses” as a keyword to see what would appear. At the time of writing this post, Macy’s has 1,991 products that fit under “black dresses” — so why are over 12,000 pages indexed for this keyword? The answer could have something to do with how their faceted navigation is set up. As SEOs, we can remedy this.

Home Depot

Let’s take Home Depot as another example. Again, doing a simple site:search we find 8,930 pages on left-hand/inswing front exterior doors. Is there a reason to have that many pages in the index targeting similar products? Probably not. The good news is this can be fixed with the proper combinations of tags (which we’ll explore below).

I’ll leave the examples at that. You can go on most large-scale e-commerce websites and find issues with their navigation. The point is, many large websites that use faceted navigation could be doing better for SEO purposes.

Faceted navigation solutions

When deciding on a faceted navigation solution, you will have to determine what you want in the index, what can go, and then how to make that happen. Let’s take a look at the options.

“Noindex, follow”

Probably the first solution that comes to mind would be using noindex tags. A noindex tag is used for the sole purpose of letting bots know to not include a specific page in the index. So, if we just wanted to remove pages from the index, this solution would make a lot of sense.

The issue here is that while you can reduce the amount of duplicate content that’s in the index, you will still be wasting crawl budget on pages. Also, these pages are receiving link equity, which is a waste (since it doesn’t benefit any indexed page).

Example: If we wanted to include our page for “black dresses” in the index, but we didn’t want to have “black dresses under $100” in the index, adding a noindex tag to the latter would exclude it. However, bots would still be coming to the page (which wastes crawl budget), and the page(s) would still be receiving link equity (which would be a waste).
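
For reference, the tag itself is a single line in the page’s <head> (a minimal sketch; the under-$100 URL is hypothetical):

  <!-- On /black-dresses-under-100/ : keep the page out of the index, but let bots follow its links -->
  <meta name="robots" content="noindex, follow">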

Canonicalization

Many sites approach this issue by using canonical tags. With a canonical tag, you can let Google know that in a collection of similar pages, you have a preferred version that should get credit. Since canonical tags were designed as a solution to duplicate content, it would seem that this is a reasonable solution. Additionally, link equity will be consolidated to the canonical page (the one you deem most important).

However, Google will still be wasting crawl budget on pages.

Example: /black-dresses?under-100/ would have the canonical URL set to /black-dresses/. In this instance, Google would give the canonical page the authority and link equity. Additionally, Google wouldn’t see the “under $100” page as a duplicate of the canonical version.
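
The markup for that example is a single link element in the <head> of the faceted page (the domain is a placeholder):

  <!-- On /black-dresses?under-100/ : point search engines at the preferred version -->
  <link rel="canonical" href="https://www.example.com/black-dresses/">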

Disallow via robots.txt

Disallowing sections of the site (such as certain parameters) could be a great solution. It’s quick, easy, and customizable. But it does come with some downsides. Namely, link equity will be trapped and unable to move anywhere on your website (even if it’s coming from an external source). Another downside is that even if you tell Google not to visit a certain page (or section) of your site, Google can still index it.

Example: We could disallow *?under-100* in our robots.txt file. This would tell Google to not visit any page with that parameter. However, if there were any “follow” links pointing to any URL with that parameter in it, Google could still index it.
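
A minimal robots.txt sketch for that example (the parameter pattern is hypothetical; adjust it to your own URL structure):

  User-agent: *
  # Block crawling of any URL containing the under-100 parameter
  Disallow: /*?under-100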

“Nofollow” internal links to undesirable facets

An option for solving the crawl budget issue is to “nofollow” all internal links to facets that aren’t important for bots to crawl. Unfortunately, “nofollow” tags don’t solve the issue entirely. Duplicate content can still be indexed, and link equity will still get trapped.

Example: If we didn’t want Google to visit any page that had two or more facets indexed, adding a “nofollow” tag to all internal links pointing to those pages would help us get there.
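
In the markup, that’s just a rel attribute on each internal facet link (hypothetical URL shown):

  <!-- Internal link to a two-facet page we don't want crawled -->
  <a href="/clothing/womens/dresses?color=black&brand=express" rel="nofollow">Black Express dresses</a>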

Avoiding the issue altogether

Obviously, if we could avoid this issue altogether, we should just do that. If you are currently in the process of building or rebuilding your navigation or website, I would highly recommend building your faceted navigation in a way that limits how many new URLs it creates (this is commonly done with JavaScript). The reason is simple: it still gives users easy browsing and filtering of products, while potentially generating only a single URL. However, this can go too far in the opposite direction — you will need to manually ensure that you have indexable landing pages for key facet combinations (e.g. black dresses).
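
Here’s a bare-bones sketch of the idea (hypothetical markup): the filtering happens entirely client-side, so no new parameterized URLs are created for search engines to discover.

  <select id="color-filter">
    <option value="">All colors</option>
    <option value="black">Black</option>
    <option value="red">Red</option>
  </select>
  <ul id="results">
    <li data-color="black">Little black dress</li>
    <li data-color="red">Red cocktail dress</li>
  </ul>
  <script>
    // Filter the product list in place; the URL never changes,
    // so no extra facet pages exist for search engines to crawl.
    document.getElementById('color-filter').addEventListener('change', function () {
      var color = this.value;
      document.querySelectorAll('#results li').forEach(function (item) {
        item.style.display = (!color || item.dataset.color === color) ? '' : 'none';
      });
    });
  </script>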

Here’s a table outlining what I wrote above in a more digestible way.

Option | Solves duplicate content? | Solves crawl budget? | Recycles link equity? | Passes equity from external links? | Allows internal link equity flow? | Other notes
“Noindex, follow” | Yes | No | No | Yes | Yes |
Canonicalization | Yes | No | Yes | Yes | Yes | Can only be used on pages that are similar.
Robots.txt | Yes | Yes | No | No | No | Technically, pages that are blocked in robots.txt can still be indexed.
Nofollow internal links to undesirable facets | No | Yes | No | Yes | No |
JavaScript setup | Yes | Yes | Yes | Yes | Yes | Requires more work to set up in most cases.

But what’s the ideal setup?

First off, it’s important to understand there is no “one-size-fits-all solution.” In order to get to your ideal setup, you will most likely need to use a combination of the above options. I’m going to highlight an example fix below that should work for most sites, but it’s important to understand that your solution might vary based on how your site is built, how your URLs are structured, etc.

Fortunately, we can break down how we get to an ideal solution by asking ourselves one question. “Do we care more about our crawl budget, or our link equity?” By answering this question, we’re able to get closer to an ideal solution.

Consider this: You have a website that has a faceted navigation that allows the indexation and discovery of every single facet and facet combination. You aren’t concerned about link equity, but clearly Google is spending valuable time crawling millions of pages that don’t need to be crawled. What we care about in this scenario is crawl budget.

In this specific scenario, I would recommend the following solution (there’s a quick markup sketch after the list).

  1. Category, subcategory, and sub-subcategory pages should remain discoverable and indexable. (e.g. /clothing/, /clothing/womens/, /clothing/womens/dresses/)
  2. For each category page, only allow versions with 1 facet selected to be indexed.
    1. On pages that have one or more facets selected, all facet links become “nofollow” links (e.g. /clothing/womens/dresses?color=black/)
    2. On pages that have two or more facets selected, a “noindex” tag is added as well (e.g. /clothing/womens/dresses?color=black?brand=express?/)
  3. Determine which facets could have an SEO benefit (for example, “color” and “brand”) and whitelist them. Essentially, throw them back in the index for SEO purposes.
  4. Ensure your canonical tags and rel=prev/next tags are set up appropriately.
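
To make the combination concrete, here’s a rough markup sketch for a page with two facets selected under this scheme (URLs are hypothetical):

  <!-- /clothing/womens/dresses?color=black&brand=express : two facets selected -->
  <head>
    <!-- Two or more facets selected, so keep the page out of the index -->
    <meta name="robots" content="noindex">
  </head>
  <body>
    <!-- Facet links on faceted pages are nofollowed -->
    <a href="/clothing/womens/dresses?color=black&brand=express&size=8" rel="nofollow">Size 8</a>
  </body>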

This solution will (in time) start to solve our issues with unnecessary pages being in the index due to the navigation of the site. Also, notice how in this scenario we used a combination of the possible solutions. We used “nofollow,” “noindex, nofollow,” and proper canonicalization to achieve a more desirable result.

Other things to consider

There are many more variables to consider on this topic — I want to address two that I believe are the most important.

Breadcrumbs (and markup) help a lot

If you don’t have breadcrumbs on each category/subcategory page on your website, you’re doing yourself a disservice. Please go implement them! Furthermore, if you have breadcrumbs on your website but aren’t marking them up with microdata, you’re missing out on a huge win.

The reason why is simple: You have a complicated site navigation, and bots that visit your site might not be reading the hierarchy correctly. By adding accurate breadcrumbs (and marking them up), we’re effectively telling Google, “Hey, I know this navigation is confusing, but please consider crawling our site in this manner.”
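
For example, schema.org’s BreadcrumbList markup (shown here as microdata, with placeholder URLs) spells out exactly where a page sits in the hierarchy:

  <ol itemscope itemtype="https://schema.org/BreadcrumbList">
    <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
      <a itemprop="item" href="https://www.example.com/clothing/">
        <span itemprop="name">Clothing</span></a>
      <meta itemprop="position" content="1">
    </li>
    <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
      <a itemprop="item" href="https://www.example.com/clothing/womens/dresses/">
        <span itemprop="name">Women's Dresses</span></a>
      <meta itemprop="position" content="2">
    </li>
  </ol>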

Enforcing a URL order for facet combinations

In extreme situations, you can come across a site that has a unique URL for every facet combination. For example, if you are on a laptop page and choose “red” and “SSD” (in that order) from the filters, the URL could be /laptops?color=red?SSD/. Now imagine if you chose the filters in the opposite order (first “SSD” then “red”) and the URL that’s generated is /laptops?SSD?color=red/.

This is really bad because it exponentially increases the number of URLs you have. Avoid this by enforcing a specific parameter order for facet URLs!
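
One simple way to enforce that order is to sort the facet parameters before the URL is ever generated. A minimal sketch, assuming the selected facets live in a key/value object:

  // Always emit facet parameters in alphabetical order, so
  // /laptops?color=red&storage=ssd is produced regardless of click order.
  function buildFacetUrl(basePath, facets) {
    var query = Object.keys(facets)
      .sort()
      .map(function (key) {
        return encodeURIComponent(key) + '=' + encodeURIComponent(facets[key]);
      })
      .join('&');
    return query ? basePath + '?' + query : basePath;
  }

  // buildFacetUrl('/laptops', { storage: 'ssd', color: 'red' })
  // returns '/laptops?color=red&storage=ssd'

However you implement it, the goal is the same: one facet combination, one URL.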

Conclusions

My hope is that you feel more equipped (and have some ideas) on how to tackle controlling your faceted navigation in a way that benefits your search presence.

To summarize, here are the main takeaways:

  1. Faceted navigation can be great for users, but is usually set up in a way that negatively impacts SEO.
  2. There are many reasons why faceted navigation can negatively impact SEO, but the top three are:
    1. Duplicate content
    2. Crawl budget being wasted
    3. Link equity not being used as effectively as it should be
  3. Boiled down further, the question we want to answer to begin approaching a solution is, “What are the ways we can control what Google crawls and indexes?”
  4. When it comes to a solution, there is no “one-size-fits-all” solution. There are numerous fixes (and combinations) that can be used. Most commonly:
    1. Noindex, follow
    2. Canonicalization
    3. Robots.txt
    4. Nofollow internal links to undesirable facets
    5. Avoiding the problem with an AJAX/JavaScript solution
  5. When trying to think of an ideal solution, the most important question you can ask yourself is, “What’s more important to our website: link equity, or crawl budget?” This can help focus your possible solutions.

I would love to hear any example setups. What have you found that’s worked well? Anything you’ve tried that has impacted your site negatively? Let’s discuss in the comments or feel free to shoot me a tweet.


[Case Study] How We Ranked #1 for a High-Volume Keyword in Under 3 Months

Posted by DmitryDragilev

This blog post was co-written with Brad Zomick, the former Director of Content Marketing at Pipedrive, where this case study took place.

It’s tough out there for SEOs and content marketers. With the sheer amount of quality content being produced, it has become nearly impossible to stand out in most industries.

Recently we were running content marketing for Pipedrive, a sales CRM. We created a content strategy that used educational sales content to educate and build trust with our target audience.

This was a great idea, in theory — we’d educate readers, establish trust, and turn some of our readers into customers.

The problem is that there are already countless others producing similar sales-focused content. We weren’t just competing against other startups for readers; we also had to contend with established companies, sales trainers, strategists, bloggers and large business sites.

The good news is that ranking a strategic keyword is still very much possible. It’s certainly not easy, but with the right process, anyone can rank for their target keyword.

Below, we’re going to show you the process we used to rank on page one for a high-volume keyword.

If you’re not sure about reading ahead, here is a quick summary:

We were able to rank #1 for a high-volume keyword: “sales management” (9,900 search volume). We outranked established sites including SalesManagement.org, Apptus, InsightSquared, Docurated, and even US News, Wikipedia, and the Bureau of Labor Statistics. We managed this through good old-fashioned content creation + outreach + guest posting, aka the “Skyscraper Technique.”

Here are the eight steps we took to reach our goal:

  1. Select the right topic
  2. Create bad-ass content for our own blog
  3. Optimize on-page SEO & engagement metrics
  4. Build internal links
  5. Find people who would link to this content
  6. Ask people to link to our content
  7. Write guest posts on leading blogs
  8. Fine-tune content with TF-IDF

Before we start, understand that this is a labor-intensive process. Winning a top SERP spot required the focus of a 3-person team for the better part of 3 months.

If you’re willing to invest a similar amount of time and effort, read on!


Step 1: Finding a good topic

We wanted three things from our target keyword:

1. Significant keyword volume

If you’re going to spend months ranking for a single keyword, you need to pick something big enough to justify the effort.

In our case, we settled on a keyword with 9,900 searches each month as per the Keyword Planner (1k–10k range after the last update).

That same keyword registered a search volume of 1.7–2.9k in Moz Keyword Explorer, so take AdWords’ estimates with a grain of salt.

One way to settle on a target volume is to see it in terms of your conversion rate and buyer’s journey:

  • Buyer’s journey: Search volume decreases as customers move further along the buyer’s journey. Fewer searches are okay if you’re targeting Decision-stage keywords.
  • Conversion rate: The stronger your conversion rate at each stage of the buyer’s journey, the more you can get away with targeting a lower search volume keyword.

Also consider the actual traffic from the keyword, not just search volume.

For instance, we knew from Moz’s research that the first result gets about 30% of all clicks.

For a keyword with 9,900 monthly searches, that works out to roughly 3,000 visitors/month for the top position (9,900 × 30% ≈ 2,970).

If we could convert even 5% of those visitors into leads (about 150/month), we’d net roughly 1,800 leads each year, which makes it worth our time.

2. Pick a winnable topic

Some SERPs are incredibly competitive. For instance, if you’re trying to rank for “content marketing,” you’ll find that the first page is dominated by CMI (DA 84):

You might be able to fight out a first-page rank, but it’s really not worth the effort in 99% of cases.

So our second requirement was to see if we could actually rank for our shortlisted keywords.

This can be done in one of two ways:

Informal method

The old-fashioned way to gauge keyword difficulty is to simply eyeball SERPs for your selected keywords.

If you see a lot of older articles, web 1.0 pages, unrecognizable brands, and generic content sites, the keyword should be solid.

On the other hand, if the first page is dominated by big niche brands with in-depth articles, you’ll have a hard time ranking well.

I also recommend using the MozBar to check metrics on the fly. If you see a ton of high DA/PA pages, move on to another keyword.

In our case, the top results mostly consisted of generic content sites or newish domains.

Moz Keyword Explorer

Moz’s Keyword Explorer gives you a more quantifiable way to gauge keyword difficulty. You’ll get actual difficulty vs. potential scores.

Aim for a competitiveness score under 50 and opportunity/potential scores above 50. If a keyword falls outside these thresholds, keep looking.

Of course, if you have an established domain, you can target more difficult keywords.

Following this step, we had a shortlist of four keywords:

  1. sales techniques (8100)
  2. sales process (8100)
  3. sales management (9900)
  4. sales forecast (4400)

We could have honestly picked anything from this list, but for added impact, we decided to add another filter.

3. Strategic relevance

If you’re going to turn visitors into leads, it’s important to focus on keywords that are strategically relevant to your conversion goals.

In our case, we chose “sales management” as the target keyword.

We did this because Pipedrive is a sales management tool, so the keyword describes us perfectly.

Additionally, a small business owner searching for “sales management” has likely moved from Awareness to Consideration and thus, is one step closer to buying.

In contrast, “sales techniques” and “sales forecast” are keywords a sales person would search for, not a sales leader or small business owner (decision-makers).


Step 2: Writing a bad-ass piece of content

Content might not be king anymore, but it is still the foundation of good SEO. We wanted to get this part absolutely right.

Here’s the process we followed to create our content:

1. Extremely thorough research

We had a simple goal from the start: create something substantially better than anything in the top SERPs.

To get there, we started by reviewing every article ranking for “sales management,” noting what we liked and what we didn’t.

For instance, we liked how InsightSquared started the article with a substantive quote. We didn’t like how Apptus went overboard with headers.

We also looked for anomalies. One thing that caught our attention was that two of the top 10 results were dedicated to the keyword “sales manager.”

We took note of this and made sure to talk about “sales managers” in our article.

We also looked at related searches at the bottom of the page:

We also scoured more than 50 sales-related books for chapters about sales management.

Finally, we also talked to some real salespeople. This step helped us add expert insight that outsourced article writers just don’t have.

At the end, we had a superior outline of what we were going to write.

2. Content creation

You don’t need to be a subject matter expert to create an excellent piece of content.

What you do need is good writing skills… and the discipline to actually finish an article.

Adopt a journalistic style where you report insight from experts. This gives you a better end product, since you’re curating expert insight and often writing it up more clearly than the subject matter experts themselves would.

Unfortunately, there is no magic bullet to speed up the writing part — you’ll just have to grind it out. Set aside a few days at least to write anything substantive.

There are a few things we learned through the content creation experience:

  1. Don’t multi-task. Go all-in on writing and don’t stop until it’s done.
  2. Work alone. Writing is a solitary endeavor. Work in a place where you won’t be bothered by coworkers.
  3. Listen to ambient music. Search “homework edit” on YouTube for some ambient tracks, or use a site like Noisli.com

Take tip #1 as non-negotiable. We tried to juggle a couple of projects, and finishing the article ended up taking two weeks. Learn from our mistake — focus on writing alone!

Before you hit publish, make sure to get some editorial feedback from someone on your team, or if possible, a professional editor.

We also added a note at the end of the article where we solicit feedback for future revisions.

If you can’t get access to editors, at the very least put your article through Grammarly.

3. Add lots of visuals and make content more readable

Getting visuals in B2B content can be surprisingly challenging. This is mostly due to the fact that there are a lot of abstract, hard-to-visualize concepts in B2B writing.

This is why we found a lot of blog posts like this with meaningless stock images:

To avoid this, we decided to use four custom images spread throughout the article.

We wanted to use visuals to:

  • Illustrate abstract concepts and ideas
  • Break up the content into more readable chunks.
  • Emphasize key takeaways in a readily digestible format

We could have done even more — prolific content creators like Neil Patel often use images every 200–300 words.

Aside from imagery, there are a few other ways to break up and highlight text to make your content more readable.

  • Section headers
  • Bullets and numbered lists
  • Small paragraphs
  • Highlighted text
  • Blockquotes
  • Use simple words

We used most of these tactics, especially blockquotes to create sub-sections.

Given our audience — sales leaders and managers — we didn’t have to bother with dumbing down our writing. But if you’re worried that your writing is too complex, try using an app like Hemingway to edit your draft.


Step 3: Optimize on-page SEO and engagement metrics

Here’s what we did to optimize on-page SEO:

1. Fix title

We wanted traffic from people searching for keywords related to “sales management,” such as:

  • “Sales management definition” (currently #2)
  • “Sales management process” (currently #1)
  • “Sales management strategies” (currently #4)
  • “Sales management resources” (currently #3)

To make sure we tapped all these keywords, we changed our main H1 header tag to include the words definition, process, strategies, and resources.

These are called “modifiers” in SEO terms.

Google is now smart enough to know that a single article can cover multiple related keywords. Adding such modifiers helped us increase our potential traffic.
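
As a purely hypothetical illustration (not Pipedrive’s exact markup), a modifier-rich header might look like this:

  <h1>Sales Management: Definition, Process, Strategies and Resources</h1>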

2. Fix section headers

Next, we used the right headers for each section:

Instead of writing “sales management definition,” we used an actual question a reader might ask.

Here’s why:

  • It makes the article easier to read
  • It’s a natural question, which makes it more likely to rank for voice searches and Google’s “answers”

We also peppered related keywords in headers throughout the article. Note how we used the keyword at the beginning of the header, not at the end:

We didn’t want to go overboard with the keywords. Our goal was to give readers something they’d actually want to read.

This is why our <h2> tag headers did not have any obvious keywords:

This helps the article read naturally while still using our target keywords.

3. Improve content engagement

Notice the colon and the line break at the very start of the article:

This is a “bucket brigade”: an old copywriting trick to grab a reader’s attention.

We used it at the beginning of the article to stop readers from hitting the back button and going back to Google (i.e. increase our dwell time).

We also added outgoing and internal links to the article.

4. Fix URL

According to research, shorter URLs tend to rank better than longer ones.

We didn’t pay a lot of attention to the URL length when we first started blogging.

Here’s one of our blog post URLs from 2013:

Not very nice, right?

For this post, we used a simple, keyword-rich URL:

Ideally, we wouldn’t have the /2016/05/ bit, but by now, it’s too late to change.

5. Improve keyword density

One common piece of on-page SEO advice is to add your keywords to the first 100 words of your content.

If you search for “sales management” on our site, this is what you’ll see:

If you were Googlebot, you’d have no confusion about what this article is about: sales management.

We also wanted to use related keywords in the article without it sounding over-optimized. Gaetano DiNardi, our SEO manager at the time, came up with a great solution to fix this:

We created a “resources” or “glossary” section to hit a number of related keywords while still being useful. Here’s an example:

It’s important to make these keyword mentions as organic as possible.

As a result of this on-page keyword optimization, traffic increased sharply.

We over-optimized keyword density in the beginning, which likely hurt rankings. Once we spotted this, we changed things around and saw an immediate improvement (more on this below).


Step 4: Build internal links to article

Building internal links to your new content can be a surprisingly effective way to promote it.

As Moz has already written before:

“Internal links are most useful for establishing site architecture and spreading link juice.”

Essentially, these links:

  • Help Googlebot discover your content
  • Tell Google that a particular page is “important” on your site since a lot of pages point to it

Our approach to internal linking was highly strategic. We picked two kinds of pages:

1. Pages that had high traffic and PA. You can find these in Google Analytics under Behavior > Site Content.

2. Pages where the keyword already existed unlinked. You can use this query to find such pages:

site:[yoursite.com] "your keyword"

In our case, searching for “sales management” showed us a number of mentions:

After making a list of these pages, we dove into our CMS and added internal links by hand.

These new links from established posts showed Google that we thought of this page as “important.”


Step 5: Finding link targets

This is where things become more fun. In this step, we used our detective SEO skills to find targets for our outreach campaign.

There are multiple ways to approach this process, but the easiest — and the one we followed — is to simply find sites that had linked to our top competitors.

We used Open Site Explorer to crawl the top ten results for backlinks.

By digging beyond the first page, we managed to build up a list of hundreds of prospects, which we exported to Excel.

This was still a very “raw” list. To maximize our outreach efficiency, we filtered out the following from our list:

  • Sites with DA under 30.
  • Sites on free blog hosts like Blogspot.com, WordPress.com, etc.

This gave us a highly targeted list of hundreds of prospects.

Here’s how we organized our Excel file:

Finding email addresses

Next step: find email addresses.

This has become much easier than it used to be thanks to a bunch of new tools. We used EmailHunter (Hunter.io) but you can also use VoilaNorbert, Email Finder, etc.

EmailHunter works by finding the pattern people use for emails on a domain name, like this:

To use this tool, you will need either the author’s name or the editor/webmaster’s name.

In some cases, the author of the article is clearly displayed.

If you can’t find the author’s name (as is often the case with guest posts), you’ll want to find the site’s editor or content manager.

LinkedIn is very helpful here.

Try a query like this:

site:linkedin.com “Editor/Blog Editor” at “[SiteName]”.

Once you have a name, plug the domain name into Hunter.io to get an email address guess of important contacts.


Step 6: Outreach like crazy

After all the data retrieval, prioritization, deduping, and cleanup, we were left with hundreds of contacts to reach out to.

To make things easier, we segmented our list into two categories:

  • Category 1: Low-quality, generic sites with poor domain authority. You can send email templates to them without any problems.
  • Category 2: Up-and-coming bloggers/authoritative sites we wanted to build relationships with. To these sites, we sent personalized emails by hand.

With the first category of sites, our goal was volume instead of accuracy.

For the second category, our objective was to get a response. It didn’t matter whether we got a backlink or not — we wanted to start a conversation which could yield a link or, better, a relationship.

You can use a number of tools to make outreach easier. Here are a few of these tools:

  1. JustReachOut
  2. MixMax
  3. LeadIQ
  4. Toutapp
  5. Prospectify

We loved using a sales tool called MixMax. Its ability to mail merge outreach templates and track open rates works wonderfully well for SEO outreach.

If you’re looking for templates, here’s one email we sent out:

Let’s break it down:

  1. Curiosity-evoking headline: Small caps in the subject line makes the email look authentic. The “something missing” part evokes curiosity.
  2. Name drop familiar brands: Name dropping your relationship to familiar brands is another good way to show your legitimacy. It’s also a good idea to include a link to their article to jog their memory.
  3. What’s missing: The meat of the email. Make sure that you’re specific here.
  4. The “why”: Your prospects need a “because” to link to you. Give actual details as to what makes it great — in-depth research, new data, or maybe a quote or two from Rand Fishkin.
  5. Never demand a link: Asking for feedback first is a good way to show that you want a genuine conversation, not just a link.

This is just one example. We tested 3 different emails initially and used the best one for the rest of the campaign. Our response rate for the whole campaign was 42%.


Step 7: Be prepared to guest post

Does guest blogging still work?

If you’re doing it for traffic and authority, I say: go ahead. You are likely putting your best work out there on industry-leading blogs. Neither your readers nor Google will mind that.

In our case, guest blogging was already a part of our long-term content marketing strategy. The only thing we changed was adding links to our sales management post within guest posts.

Your guest post links should have contextual reference, i.e. the post topic and link content should match. Otherwise, Google might discount the link, even if it is dofollow.

Keep this in mind when you start a guest blogging campaign. Getting links isn’t enough; you need contextually relevant links.

Here are some of the guest posts we published:

  • 7 Keys to Scaling a Startup Globally [INC]
  • An Introduction to Activity-Based Selling [LinkedIn]
  • 7 Tips for MBAs Entering Sales Management Careers [TopMBA]

We weren’t exclusively promoting our sales management post in any of these guest posts. The sales management post just fit naturally into the context, so we linked to it.

If you’re guest blogging in 2017, this is the approach you need to adopt.


Step 8: Fine-tuning content with TF-IDF

After the article went live, we realized that we had heavily over-optimized it for the term “sales management.” It occurred 48 times throughout the article, far too many for a 2,500-word piece.

Moreover, we hadn’t always used the term naturally in the article.

To solve this problem, we turned to TF-IDF.

Recognizing TF-IDF as a ranking factor

TF-IDF (Term Frequency-Inverse Document Frequency) is a way to figure out how important a word is in a document, based on how frequently it appears in that document relative to how often it appears across a larger collection of documents.

This is a pretty standard statistical process in information retrieval. It is also one of the oldest ranking factors in Google’s algorithms.
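
If you haven’t run into it before, one common form of the weighting looks like this, where tf(t, d) is how often term t appears in document d, N is the total number of documents, and df(t) is the number of documents containing t:

  tf-idf(t, d) = tf(t, d) × log(N / df(t))

In plain English: a term scores high when it’s frequent on your page but relatively rare everywhere else.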

Hypothesis: We hypothesized that dropping the number of “sales management” occurrences from 48 to 20 and replacing it with terms that have high lexical relevance would improve rankings.

Were we right?

See for yourself:

Our organic pageviews increased from nearly 0 to over 5,000 in just over 8 months.

Note that no new links or link acquisition initiatives were actively in progress during the time of this mini-experiment.

Experiment timeline:

  • July 18th – Over-optimized keyword recognized.
  • July 25th – Content team finished updating body copy, H2s with relevant topics/synonyms.
  • July 26th – Updated internal anchor text to include relevant terms.
  • July 27th – Flushed cache & re-submitted to Search Console.
  • August 4th – Improved from #4 to #2 for “Sales Management”
  • August 17th – Improved from #2 to #1 for "Sales Management"

The results were fast. We were able to normalize our content and see results within weeks.

We’ll show you our exact process below.

Normalization process — How did we do it?

The normalization process focused on identifying over-optimized terms, replacing them with related words and submitting the new page to search engines.

Here’s how we did it:

1. Identifying over-optimized term(s)

We started off using Moz’s on-page optimization tool to scan our page.

According to Moz, we shouldn’t have used the target term — “sales management” — more than 15 times. This means we had to drop 33 occurrences.

2. Finding synonymous terms with high lexical relevance

Next, we had to replace those extra mentions (28 or more of them) with synonyms that wouldn’t feel out of place.

We used Moz’s Keyword Explorer to get some ideas.

3. Removed “sales management” from H2 headings

Initially, we had the keyword in both H1 and H2 headings, which was just overkill.

We removed it from H2 headings and used lexically similar variants instead for better flow.

4. Diluted “sales management” from body copy

We used our list of lexically relevant words to bring the number of “sales management” occurrences down to under 20, which was about right for a 2,500+ word article.

5. Diversify internal anchors

While we were changing our body copy, we realized that we also needed more anchor text diversity for our internal links.

Our anchor cloud was mostly “sales management” links:

We diversified this list by adding links to related terms like “sales manager,” “sales process,” etc.

6. Social amplification

We ramped up our activity on LinkedIn and Facebook to get the ball rolling on social shares.

The end result of this experimentation was an over 100% increase in traffic between August ‘16 and January ‘17.

The lesson?

Don’t just build backlinks — optimize your on-page content as well!


Conclusion

There’s a lot to learn from this case study. Some findings were surprising for us as well, particularly the impact of keyword density normalization.

While there are a lot of tricks and tactics detailed here, you’ll find that the fundamentals are essentially the same as what Rand and team have been preaching here for years. Create good content, reach out to link prospects, and use strategic guest posts to get your page to rank.

This might sound like a lot of work, but the results are worth it. Big industry players like Salesforce and Oracle actually advertise on AdWords for this term. While they have to pay for every single click, Pipedrive gets its clicks for free.


The Best Types of Content for Local Businesses: Building Geo-Topical Authority

Posted by MiriamEllis


Q: What kind of content should a local business develop?

A: The kind that converts!

Okay, you could have hit on that answer yourself, but as this post aims to demonstrate:

  1. There are almost as many user paths to conversion as there are customers in your city, and
  2. Your long-term goal is to become the authority in your industry and geography that consumers and search engines turn to.

Google’s widely publicized concept of micro-moments has been questioned by some local SEOs for its possible oversimplification of consumer behavior. Nevertheless, I think it serves as a good, basic model for understanding how a variety of human needs (I want to do, know, buy something, or go somewhere) leads people onto the web. When a local business manages to become a visible solution to any of these needs, the rewards can include:

  • Online traffic
  • In-store traffic
  • Transactions
  • Reviews/testimonials
  • Clicks-for-directions
  • Clicks-to-call
  • Clicks-to-website
  • Social sharing
  • Offline word-of-mouth
  • Good user metrics like time-on-page, low bounce rate, etc.

Takeaway: Consumers have a variety of needs and can bestow a variety of rewards that directly or indirectly impact local business reputation, rankings and revenue when these needs are well-met.

No surprise: it will take publishing a variety of types of content to enjoy the full range of rewards on that list.

Proviso: There will be nuances to the best types of content for each local business based on geo-industry and average consumer. Understandably, a cupcake bakery has a more inviting topic for photographic content than does a septic services company, but the latter shouldn’t rule out the power of an image of tree roots breaking into a septic line as a scary and effective way to convert property owners into customers. Point being, you’ll be applying your own flavor to becoming a geo-topical authority as you undertake the following content development work:

Foundational local business content development

These are the basics almost every local business will need to publish.

Customer service policy

Every single staff member who interacts with your public must be given a copy of your complete customer service policy. Why? A 2016 survey by the review software company GetFiveStars demonstrated that 57% of consumer complaints revolve around customer service and employee behavior. To protect your local business’ reputation and revenue, the first content you create should be internal and should instruct all forward-facing employees in approved basic store policies, dress, cleanliness, language, company culture, and allowable behaviors. Be thorough! Yes, you may wear a t-shirt. No, you may not text your friends while waiting on tables.

Customer rights guarantee

On your website, publish a customer-focused version of your policy. The Vermont Country Store calls this a Customer Bill of Rights which clearly outlines the quality of service consumers should expect to experience, the guarantees that protect them, and the way the business expects to be treated, as well.

NAP

Don’t overlook the three most important pieces of content you need to publish on your website: your company name, address, and phone number. Make sure they are in crawlable HTML (not couched in an image or a problematic format like Flash). Put your NAP at the top of your Contact Us page and in the site-wide masthead or footer so that humans and bots can immediately and clearly identify these key features of your business. Be sure your NAP is consistent across all pages of your site (not Green Tree Consulting on one page and Green Tree Marketing on another, or wrong digits in a phone number or street address on some pages). And, ideally, mark up your NAP with Schema to further assist search engine comprehension of your data.
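
As a minimal sketch (the business details below are placeholders), JSON-LD markup for your NAP could look like this:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Green Tree Consulting",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "123 Main Street",
      "addressLocality": "Springfield",
      "addressRegion": "CA",
      "postalCode": "94000"
    },
    "telephone": "+1-555-555-0123",
    "url": "https://www.example.com/"
  }
  </script>

You can check that markup like this parses correctly with Google’s Structured Data Testing Tool.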

Reviews/testimonials page

On your website, your reviews/testimonials page can profoundly impact consumer trust, comprising a combination of unique customer sentiment you’ve gathered via a form/software (or even from handwritten customer notes) and featured reviews from third-party review platforms (Google, Yelp). Why make this effort? As many as 92% of consumers now read online reviews and Google specifically cites testimonials as a vehicle for boosting your website’s trustworthiness and reputation.

Reviews/testimonials policy

Either on your Reviews/Testimonials page or on a second page of your website, clearly outline your terms of service for reviewers. Just like Yelp, you need to protect the quality of the sentiment-oriented content you publish and should let consumers know what you permit/forbid. Here’s a real-world example of a local business review TOS page I really like, at Barbara Oliver Jewelry.

Homepage

Apart from serving up some of the most fundamental content about your business to search engines, your homepage should serve two local consumer groups: those in a rush and those in research mode.

Pro tip: Don’t think of your homepage as static. Change up your content regularly there and track how this impacts traffic/conversions.

Contact Us page

On this incredibly vital website page, your content should include:

  • Complete NAP
  • All supported contact methods (forms, email, fax, live chat, after-hours hotline, etc.),
  • Thorough driving directions from all entry points, including pointers for what to look for on the street (big blue sign, next to red church, across the street from swim center, etc.)
  • A map
  • Exterior images of your business
  • Attributes like parking availability and wheelchair accessibility
  • Hours of operation
  • Social media links
  • Payment forms accepted (cash only, BitCoin, etc.)
  • Mention of proximity to major nearby points of interest (national parks, monuments, etc.)
  • Brief summary of services with a nod to attributes (“Stop by the Starlight tonight for late-night food that satisfies!”)
  • A fresh call-to-action (like visiting the business for a Memorial Day sale)

Store locator pages

For a multi-location business (like a restaurant chain), you’ll be creating content for a set of landing pages to represent each of your physical locations, accessed via a top-level menu if you have a few locations, or via a store locator widget if you have many. These should feature the same types of content a Contact Us page would for a single-location business, and can also include:

  • Reviews/testimonials for that location
  • Location-specific special offers
  • Social media links specific to that location
  • Proofs of that location’s local community involvement
  • Highlights of staff at that location
  • Education about availability of in-store beacons or apps for that location
  • Interior photos specific to that location
  • A key call-to-action

For help formatting all of this great content sensibly, please read Overcoming Your Fear of Local Landing Pages.

City landing pages

Similar to the multi-location business, the service area business (like a plumber) can also develop a set of customer-centric landing pages. These pages will represent each of the major towns or cities the business serves, and while they won’t contain a street address if the company lacks a physical location in a given area, they can contain almost everything else a Contact Us page or Store Locator page would, plus:

  • Documentation of projects completed in that city (text, photos, video)
  • Expert advice specific to consumers in that city, based on characteristics like local laws, weather, terrain, events, or customs
  • Showcasing of services provided to recognized brands in that city (“we wash windows at the Marriott Hotel,” etc.)
  • Reviews/testimonials from customers in that city
  • Proofs of community involvement in that city (events, sponsorships, etc.)
  • A key call-to-action

Product/service descriptions

Regardless of business model, all local businesses should devote a unique page of content to each major product or service they offer. These pages can include:

  • A thorough text description
  • Images
  • Answers to documented FAQs
  • Price/time quotes
  • Technical specs
  • Reviews of the service or product
  • Videos
  • Guarantees
  • Differentiation from competitors (awards won, lowest price, environmental standards, lifetime support, etc.)

For inspiration, I recommend looking at SolarCity’s page on solar roofing. Beautiful and informative.

Images

For many industries, image content truly sells. Are you “wowed” looking at the first image you see of this B&B in Albuquerque, the view from this restaurant in San Diego, or the scope of this international architectural firm’s projects? But even if your industry doesn’t automatically lend itself to wow-factor visuals, cleaning dirty carpets can be presented with high class and even so-called “boring” industries can take a visual approach to data that yields interesting and share-worthy/link-worthy graphics.

While you’re snapping photos, don’t neglect uploading them to your Google My Business listings and other major citations. Google data suggests that listing images influence click-through rates!

FAQ

The content of your FAQ page serves multiple purposes. Obviously, it should answer the questions your local business has documented as being asked by your real customers, but it can also be a keyword-rich page if you have taken the time to reflect the documented natural language of your consumers. If you’re just starting out and aren’t sure what types of questions your customers will ask, try AnswerThePublic and Q&A crowdsourcing sites to brainstorm common queries.

Be sure your FAQ page contains a vehicle for consumers to ask a question so that you can continuously document their inquiries, determine new topics to cover on the FAQ page, and even find inspiration for additional content development on your website or blog for highly popular questions.

About page

For the local customer in research mode, your About page can seal the deal if you have a story to tell that proves you are in the best possible alignment with their specific needs and desires. Yes, the About Us page can tell the story of your business or your team, but it can also tell the story of why your consumers choose you.

Take a look at this About page for a natural foods store in California and break it down into elements:

  • Reason for founding company
  • Difference-makers (95% organic groceries, building powered by 100% renewable energy)
  • Targeted consumer alignment (support local alternative to major brand, business inspired by major figure in environmental movement)
  • Awards and recognition from government officials and organizations
  • Special offer (5-cent rebate if you bring your own bag)
  • Timeline of business history
  • Video of the business story
  • Proofs of community involvement (organic school lunch program)
  • Links to more information

If the ideal consumer for this company is an eco-conscious shopper who wants to support a local business that will, in turn, support the city in which they live, this About page is extremely persuasive. Your local business can take cues from this real-world example, determining what motivates and moves your consumer base and then demonstrating how your values and practices align.

Calls to action

CTAs are critical local business content, and any website page which lacks one represents a wasted opportunity. Entrepreneur states that the 3 effective principles of calls to action are visibility, clear/compelling messaging, and careful choice of supporting elements. For a local business, calls to action on various pages of your website might direct consumers to:

  • Come into your location
  • Call
  • Fill out a form
  • Ask a question/make a comment or complaint
  • Livechat with a rep
  • Sign up for emails/texts or access to offers
  • Follow you on social media
  • Attend an in-store event/local event
  • Leave a review
  • Fill out a survey/participate in a poll

Ideally, CTAs should assist users in doing what they want to do in alignment with the actions the business hopes the consumer will take. Audit your website and implement a targeted CTA on any page currently lacking one. Need inspiration? This Hubspot article showcases mainly virtual companies, but the magic of some of the examples should get your brain humming.

Local business listings

Some of the most vital content being published about your business won’t exist on your website — it will reside on your local business listings on the major local business data platforms. Think Google My Business, Facebook, Acxiom, Infogroup, Factual, YP, Apple Maps, and Yelp. While each platform differs in the types of data they accept from you for publication, the majority of local business listings support the following content:

  • NAP
  • Website address
  • Business categories
  • Business description
  • Hours of operation
  • Images
  • Marker on a map
  • Additional phone numbers/fax numbers
  • Links to social, video, and other forms of media
  • Attributes (payments accepted, parking, wheelchair accessibility, kid-friendly, etc.)
  • Reviews/owner responses

The most important components of your business are all contained within a thorough local business listing. These listings will commonly appear in the search engine results when users look up your brand, and they may also appear for your most important keyword searches, profoundly impacting how consumers discover and choose your business.

Your objective is to ensure that your data is accurate and complete on the major platforms and you can quickly assess this via a free tool like Moz Check Listing. By ensuring that the content of your listings is error-free, thorough, and consistent across the web, you are protecting the rankings, reputation, and revenue of your local business. This is a very big deal!

Third-party review profiles

While major local business listing platforms (Google My Business, Facebook, Yelp) are simultaneously review platforms, you may need to seek inclusion on review sites that are specific to your industry or geography. For example, doctors may want to manage a review profile on HealthGrades and ZocDoc, while lawyers may want to be sure they are included on Avvo.

Whether your consumers are reviewing you on general or specialized platforms, know that the content they are creating may be more persuasive than anything your local business can publish on its own. According to one respected survey, 84% of consumers trust online reviews as much as they trust personal recommendations, and 90% of consumers read fewer than 10 reviews before forming a distinct impression of your business.

How can local businesses manage this content which so deeply impacts their reputation, rankings, and revenue? The answer is twofold:

  1. First, refer back to the beginning of this article to the item I cited as the first document you must create for your business: your customer service policy. You can most powerfully influence the reviews you receive via the excellence of your staff education and training.
  2. Master catching verbal and social complaints before they turn into permanent negative reviews by making your business complaint-friendly. And then move onto the next section of this article.

Owner responses

Even with the most consumer-centric customer service policies and the most detailed staff training, you will not be able to fully manage all aspects of a customer’s experience with your business. A product may break, a project be delayed, or a customer may have a challenging personality. Because these realities are bound to surface in reviews, you must take advantage of the best opportunity you have to manage sentiment after it has become a written review: the owner response.

You are not a silent bystander, sitting wordless on the sidelines while the public discusses your business. The owner response function provided by many review sites gives you a voice. This form of local business content, when properly utilized, can:

  • Save you money by winning back a dissatisfied existing customer instead of having to invest a great deal more in winning an entirely new one;
  • Inspire an unhappy customer to update a negative review with improved sentiment, including a higher star rating; and
  • Prove to all other potential customers who encounter your response that you will take excellent care of them.

You’ll want to respond to both positive and negative reviews. They are free Internet real estate on highly visible websites and an ideal platform for showcasing the professionalism, transparency, accountability, empathy, and excellence of your company. For more on this topic, please read Mastering the Owner Response to the Quintet of Google My Business Reviews.

Once you have developed and are managing all of the above content, your local business has created a strong foundation on the web. Depending on the competitiveness of your geo-industry, the above work will have won you a certain amount of local and organic visibility. Need better or broader rankings and more customers? It’s time to grow with:

Structural local business content development

These are options for creating a bigger structure for your local business on the web, expanding the terms you rank for and creating multiple paths for consumer discovery. We’ll use Google’s 4 micro-moment terms as a general guide + real-world examples for inspiration.

I want to do

  1. A homeowner wants to get her house in Colorado Springs ready to sell. In her search for tips, she encounters this Ultimate Home Seller’s To-Do Checklist & Infographic. Having been helped by the graphic, she may turn to the realty firm that created it for professional assistance.
  2. A dad wants to save money by making homemade veggie chips for his children. He’s impressed with the variety of applicable root vegetables featured in this 52-second video tutorial from Whole Foods. And now he’s also been shown where he can buy that selection of produce.
  3. A youth in California wants to become a mountain climber. He discovers this website page describing guided hikes up nearby Mount Whitney, but it isn’t the text that really gets him — it’s the image gallery. He can share those exciting photos with his grandmother on Facebook to persuade her to chaperone him on an adventure together.

I want to know

  1. A tech worker anywhere in America wants to know how to deal with digital eye strain and she encounters this video from Kaiser Permanente, which gives tips and also recommends getting an eye exam every 1–2 years. The worker now knows where she could go locally for such an exam and other health care needs.
  2. A homeowner in the SF Bay Area wants to know how to make his place more energy efficient to save on his bills. He finds this solar company’s video on YouTube with a ton of easy tips. They’ve just made a very good brand impression on the homeowner, and this company serves locally. Should he decide at some point to go the whole nine yards and install solar panels, this brand’s name is now connected in his mind with that service.
  3. A gardener wants to know how to install a drip irrigation system in her yard and she encounters this major hardware store brand’s video tutorial. There’s a branch of this store in town, and now she knows where she can find all of the components that will go into this project.

I want to go

  1. While it’s true that most I-want-to-go searches will likely lead to local pack results, additional website content like this special gluten-free menu an independently owned pizza place in Houston has taken the time to publish should seal the deal for anyone in the area who wants to go out for pizza while adhering to their dietary requirements.
  2. A busy Silicon Valley professional is searching Google because they want to go to a “quiet resort in California.” The lodgings, which have been lucky enough to be included on this best-of list from TripAdvisor, didn’t have to create this content — their guests have done it for them by mentioning phrases like “quiet place” and “quiet location” repeatedly in their reviews. The business just has to provide the experience, and, perhaps promote this preferred language in their own marketing. Winning inclusion on major platforms’ best-of lists for key attributes of your business can be very persuasive for consumers who want to go somewhere specific.
  3. An ornithologist is going to speak at a conference in Medford, OR. As he always does when he goes on a trip, he looks for a bird list for the area and encounters this list of local bird walks published by a Medford nature store. He’s delighted to discover that one of the walks corresponds with his travel dates, and he’s also just found a place to do a little shopping during his stay.

I want to buy

  1. Two cousins in Atlanta want to buy their uncle dinner for his birthday, but they’re on a budget. One sees this 600+ location restaurant chain’s tweet about how dumb it is to pay for chips and salsa. “Check this out, @cousin,” he tweets, and they agree their wallets can stretch for the birthday dinner.
  2. An off-road vehicle enthusiast in Lake Geneva, WI wants to buy insurance for his ride, but who offers this kind of coverage? A local insurance agent posts his video on this topic on his Facebook page. Connection!
  3. A family in Hoboken, NJ wants to buy a very special cake for an anniversary party. A daughter finds these mouth-watering photos on Pinterest while a son finds others on Instagram, and all roads lead to the enterprising Carlo’s Bakery.

In sum, great local business content can encompass:

  • Website/blog content
  • Image content including infographics and photos
  • Social content
  • Video content
  • Inclusion in best-of type lists on prominent publications

Some of these content forms (like professional video or photography creation) represent a significant financial investment that may be most appropriate for businesses in highly competitive markets. The creation of tools and apps can also be smart (but potentially costly) undertakings. Others (like the creation of a tweet or a Facebook post) can be almost free, requiring only an investment of time that can be made by local businesses at all levels of commerce.

Becoming a geo-topical authority

Your keyword and consumer research are going to inform the particular content that would best serve the needs of your specific customers. Rand Fishkin recently highlighted here on the Moz Blog that in order to stop doing SEO like it’s 2012, you must aim to become an entity that Google associates with a particular topic.

For local business owners, the path would look something like this. Whenever anyone in my area searches for a topic that relates to our company, we want to appear in:

  • local pack rankings with our Google My Business listing
  • major local data platforms with our other listings
  • major review sites with our profiles and owner responses
  • organic results with our website’s pages and posts
  • social platforms our customers use with our contributions
  • video results with our videos
  • image search results with our images
  • content of important third-party websites that are relevant either to our industry or to our geography

Basically, every time Google or a consumer reaches for an answer to a need that relates to your topic and city, you should be there offering up the very best content you can produce. Over time, over years of publication of content that consistently applies to a given theme, you will be taking the right steps to become an authority in Google’s eyes, and a household brand in the lives of your consumers.


Launching a New Website: Your SEO Checklist – Whiteboard Friday

Posted by randfish

Hovering your finger over the big red “launch” button for your new website? Hold off for just a second (or 660 of them, rather). There may be SEO considerations you haven’t accounted for yet, from a keyword-to-URL content map to sweeping for crawl errors to setting up proper tracking. In today’s Whiteboard Friday, Rand covers five big boxes you need to check off before finally setting that site live.

http://ift.tt/2pfQ6fx


SEO checklist when launching a new website

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to talk about launching a new website and the SEO process that you’ve got to go through. Now, it’s not actually that long and cumbersome. But there are a few things that I put into broad categories, where if you do these as you’re launching a new site or before you launch that new site, your chances of having success with SEO long term and especially in those first few months is going to go way up.

1. Keyword to URL map for your content

So let’s get started with number one here. What I’m suggesting that you do is, as you look across the site that you’ve built, go and do some keyword research. There are a lot of Whiteboard Fridays and blog posts that we’ve written here at Moz about great ways to do keyword research. But do that keyword research and create a list that essentially maps all of the keywords you are initially targeting to all of the URLs, the pages that you have on your new website.

So it should look something like this. It’s got the URL, so RandsAnimals.com, targeting the keyword “amazing animals,” and here’s the page title and here’s the meta description. Then, I’ve got http://ift.tt/2nLC0oW, which is my page about lemurs, and that’s targeting “lemurs” and “lemur habits.” There’s the title.

You want to go through these and make sure that if you have an important keyword that you have not yet targeted, you do so, and likewise, that if you’ve got a URL, a page on your website that you have not yet intentionally targeted a keyword with, you make sure to do that as well. This can be a great way to go through a small site in the early stages and make sure that you’ve got some terms and phrases that you’re actually targeting. This will also be helpful when you do your rank tracking and your on-page optimization later on.
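
To make that concrete, here’s a rough sketch of what the mapped title and description for that lemurs page could look like in the page’s <head>. The exact wording below is hypothetical, just to show how one row of the map translates into tags:

<title>Lemurs: Habits, Facts, and GIFs | Rand's Animals</title>
<meta name="description" content="Learn about lemurs and their habits, and grab a lemur GIF or two for your next presentation.">

A simple spreadsheet with columns for URL, target keyword(s), page title, and meta description is usually all the tooling this step requires.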

2. Accessibility, crawl, and UX

So what I want you to do here is to ask yourself:

I. “Are the pages and the content on my website accessible to search engines?”

There are some great ways to check this. You can use something like Screaming Frog, Google Search Console, Moz Pro, or OnPage.org to run a scan of your site and make sure that crawlers can get to all the pages, that you don’t have duplicate content, thin content, or pages that are perceived to have no content at all, and that you don’t have broken links or broken pages, all that kind of good stuff.
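
One thing these crawls frequently catch on brand-new sites is a leftover directive from the staging environment that quietly blocks indexing. Before launch, it’s worth checking your templates for something like this (a hypothetical example of the kind of tag to look for, not something these tools will always flag by name):

<!-- A staging leftover like this will keep the page out of the index -->
<meta name="robots" content="noindex, nofollow">

If a tag like that is sitting in a sitewide template, removing it is the very first fix to make.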

II. “Is the content accessible to all audiences, devices, and browsers?”

Next, we’re going to ask not about search engines and their crawlers, but about the audience, the human beings and whether your content is accessible to all the audiences, devices, and browsers that it could be. So this could mean things like screen readers for blind users, mobile devices, desktop devices, laptops, browsers of all different kinds. You’re going to want to use a tool like a browser checker to make sure that Chrome, Firefox, and… What’s Internet Explorer called now? Oh, man. They changed it. Microsoft Edge. Make sure that it works in all of them.

I like to think that there’s a peanut gallery that’s going to yell it out. Like, you’re watching this at lunch and you’re thinking, “Rand, if I yell it to you now, it won’t be recorded.” I know. I know.

III. “Do those pages load fast from everywhere?”

So I could use a tool like Google Speed Test. I can also do some proxy checking to make sure that my pages load fast from all sorts of regions, which is especially important if I’m doing international targeting or if I know that I’m going to be targeting rural regions.

IV. “Are the design, UI, visuals, and experience enjoyable and easy for all users?”

You can do that with some in-house usability testing. You could do it informally with friends and family and existing customers if you have them. Or you could use something like Five Second Test or UsabilityHub to run some more formal testing online. Sometimes this can reveal things in your navigation or your content that are stopping people from having the experience that you want, and that are very easy to fix.

3. Setup of important services and tracking

So there’s a bunch of stuff that you just need to set up around a website. Those include:

  • Web analytics – Google Analytics is free and very, very popular, but you could also use something like Piwik or, if you’re bigger, Omniture. You’re also going to want to do a crawl; OnPage.org, Moz Pro, or some of these other tools will check to make sure that your analytics code is actually loaded on all of your pages (there’s an example tag just after this list).
  • Uptime tracking – If you haven’t checked them out, Pingdom has some very cheap plans for very early-stage sites. Then, if you get bigger, they can get more expensive and more sophisticated.
  • Retargeting and remarketing – Even if you don’t want to pay now and you’re not going to use any of the services, go ahead and put the retargeting pixels from at least Facebook and Google onto your website, on all of your pages, so that those audiences are accessible to you later on in the future.
  • Brand alerts – The cheapest option is Google Alerts, which is free, but it’s not very good at all. If you’re using Moz Pro, there’s Fresh Web Explorer alerts, which is great. Mention.net, Talkwalker, and Trackur are also good. There are a number of paid options that are a little bit better.
  • Google Search Console – If you haven’t set that up already, you’re going to want to do that, as well as Bing Webmaster Tools. Both of those can reveal some errors to you. So if you have accessibility issues, that’s a good free way to go.
  • Moz/Ahrefs/SEMRush/Searchmetrics/Raven/etc. – If you’re doing SEO, chances are good that you’re going to want to set up some type of SEO tool to track your rankings, do a regular crawl, show you competitive opportunities and missteps, and potentially surface link-building opportunities. There are a few other options, but these five (Moz, Ahrefs, SEMRush, Searchmetrics, and Raven) are certainly some of the best-known ones out there.
  • Social and web profiles – Again, important to set those up before you launch your new site, so that no one goes and jumps on the name of your Facebook page, or your Pinterest page, or your Instagram profile page, or your YouTube page, or your SlideShare page. I know you might be saying, “But Rand, I don’t use SlideShare.” No, not today. But you might in the future, and trust me, you’re going to want to claim Rand’s Animals on YouTube and SlideShare. You’re going to want to claim whatever your website’s name is. I’ll go claim this one later. But you’ve got to set all those up, because you don’t want someone else taking them later. I would urge you to go down the full list of all the social media sites out there, all the web profiles out there, just to make sure that you’ve got your brand secured.
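
Here’s Google’s standard analytics.js pageview snippet at the time of writing, as a reference for the web analytics bullet above; the UA-XXXXXXXX-X property ID is a placeholder for your own ID, and other packages like Piwik provide an equivalent tag of their own:

<script>
// Google Analytics async loader; typically pasted just before the closing </head> tag
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXXXXX-X', 'auto'); // replace with your property ID
ga('send', 'pageview');
</script>

Once it’s in place, a crawl with one of the tools above can confirm the tag is present on every page rather than just the homepage.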

4. Schema, rich snippets, OpenGraph, etc.

This is optimization more broadly. It’s where I’m essentially going through these URLs and making sure, “Hey, okay. I know I’ve targeted these keywords and I’ve already done my page title and meta description. But let me check if there are other opportunities.”

Are there content opportunities or image search opportunities? Do I have rich snippet opportunities? Maybe (this is probably not the case) I could have user review stars for my Rand’s Animals website; I don’t know if people particularly love this lemur GIF versus that lemur GIF. But those can be set up on your site, and both Google and Bing have resources describing how to do that. The same is true for Twitter and Facebook, which offer cards so that you show up correctly there. If you’re using OpenGraph, I believe that will also work correctly on LinkedIn and other services like that. So those are great options.
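
As a rough illustration of both ideas, here’s the kind of markup a hypothetical lemur page could carry: OpenGraph and Twitter Card tags so shares render nicely, plus schema.org aggregate-rating markup of the sort that can power review stars. Every URL and value below is a made-up placeholder, not markup from an actual Rand’s Animals page:

<!-- Social sharing tags -->
<meta property="og:title" content="Lemur GIFs | Rand's Animals">
<meta property="og:description" content="A hand-picked collection of lemur GIFs.">
<meta property="og:image" content="https://www.example.com/images/lemur.gif">
<meta property="og:url" content="https://www.example.com/lemurs/">
<meta name="twitter:card" content="summary_large_image">

<!-- Review stars via schema.org structured data -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "Lemur GIF collection",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "24"
  }
}
</script>

Check Google’s and Bing’s own structured data documentation for which content types are actually eligible for rich snippets before leaning on any particular markup.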

5. Launch amplification & link outreach plan

So one of the things that we know about SEO is that you need links and engagement and those types of signals in order to rank well. You’re going to want to have a successful launch day, launch week, and even launch month. That means asking these questions in advance:

I. “Who will help amplify your launch and why? Why are they going to do this?”

If you can identify, “These people, I know they personally want to help out,” or, “They are friends and family. I have business relationships with them. They’re customers of mine. They’re journalists who promised to cover this. They are bloggers who care a lot about this subject and need stuff to write about.” Whatever it is, if you can identify those people, create a list, and start doing that direct outreach, that is certainly something that you should do. I would plan for that in advance, and I would give folks a heads-up about when you’re going to do the launch. That way, when launch day rolls around, you have some big, exciting news to announce. If you wait two weeks after you launch to say, “Hey, I launched a new website a couple weeks ago,” you’re no longer news. You’re no longer quite as special, and therefore your chances of coverage drop pretty precipitously after the first few days.

II. “What existing relationships, profiles, and sites should I update to create buzz (and accuracy)?”

I would also ask which existing relationships, websites, and profiles you already have that you can and should update, both to create buzz and to keep things accurate. This covers everything from your email signature to all of the social profiles we’ve talked about, both the ones you’ve claimed and the ones you personally have. You should go and update your LinkedIn, your Twitter page, and your Facebook page. The same goes for About.me if you have a profile there or, if you’re a designer, maybe your Dribbble profile, whatever you’ve got.

Then you should also be thinking about content you’ve contributed across the web over the years, on all sorts of other websites. You can go to those sites and say, “Hey, I’ve got a new site. Could you point to that new site instead of my old one, or instead of my old employer’s site now that I’ve left?” That’s certainly worth doing as well.

III. “What press coverage, social coverage, or influencer outreach can I do?”

The last thing I would ask about is people who are maybe more distant from you: press coverage, social coverage, or influencer outreach, similar to the “Who will help you amplify and why?” question. You should be able to make a list of those folks and outlets, find some email addresses, send a pitch if you’ve got one, and start building those relationships.

Launch day is a great reason to do outreach. The right time to reach out is when you’re launching something new, and that can help you get some amplification as well.

All right. Hopefully, when you launch your new site, you’re going to follow this checklist, you’re going to dig into these details, and you’re going to come away with a much more successful SEO experience.

If you’ve launched a website and you see things that are missing from this list, you see other recommendations that you’ve got, please, by all means, leave them in the comments. We’d love to chat about them.

We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


The Wonderful World of SEO Meta Tags [Refreshed for 2017]

Posted by katemorris

Meta tags represent the beginning of most SEO training, for better or for worse. I contemplated exactly how to introduce this topic because we always hear about the bad side of meta tags — namely, the keywords meta tag. One of the first things dissected in any site review is the misuse of meta tags, mainly because they’re at the top of every page in the header and are therefore the first thing seen. But we don’t want to get too negative; meta tags are some of the best tools in a search marketer’s repertoire.

There are meta tags beyond just description and keywords, though those two are picked on the most. I’ve broken down the most-used (in my experience) by the good, the bad, and the indifferent. You’ll notice that the list gets longer as we get to the bad ones. I didn’t get to cover all of the meta tags possible to add, but there’s a comprehensive meta tag resource you should check out if you’re interested in everything that’s out there.

My main piece of advice: stick to the core minimum. Don’t add meta tags you don’t need — they just take up code space. The less code you have, the better. Think of your page code as a set of step-by-step directions to get somewhere, but for a browser. Extraneous meta tags are the annoying “Go straight for 200 feet” line items in driving directions that simply tell you to stay on the same road you’re already on!


The good meta tags

These are the meta tags that should be on every page, no matter what. Notice that this is a small list; these are the only ones that are required, so if you can work with just these, please do. (A combined example follows the list below.)

  • Meta content type – This tag is necessary to declare your character set for the page and should be present on every page. Leaving this out could impact how your page renders in the browser. A few options are listed below, but your web designer should know what’s best for your site.
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
  • Title – While the title tag doesn’t start with “meta,” it is in the header and contains information that’s very important to SEO. You should always have a unique title tag on every page that describes the page. Check out this post for more information on title tags.
  • Meta description – The infamous meta description tag is used for one major purpose: to describe the page to searchers as they read through the SERPs. This tag doesn’t influence ranking, but it’s very important regardless. It’s the ad copy that will determine if users click on your result. Keep it within 160 characters, and write it to catch the user’s attention. Sell the page — get them to click on the result. Here’s a great article on meta descriptions that goes into more detail.
  • Viewport – In this mobile world, you should be specifying the viewport. If you don’t, you run the risk of having a poor mobile experience — the Google PageSpeed Insights Tool will tell you more about it. The standard tag is:
<meta name=viewport content="width=device-width, initial-scale=1">
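
Put together, a bare-bones <head> using just these four tags looks something like this; the title and description text are placeholders:

<head>
  <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>Your Unique, Descriptive Page Title</title>
  <meta name="description" content="A compelling description of roughly 160 characters, written to earn the click from the SERP.">
</head>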


The indifferent meta tags

Different sites will need to use these in specific circumstances, but if you can go without, please do.

  • Social meta tags – I’m leaving these out. OpenGraph and Twitter data are important to sharing, but are not required per se.
  • Robots – One huge misconception is that you have to have a robots meta tag. Let’s make this clear: in terms of indexing and link following, if you don’t specify a meta robots tag, search engines read that as index, follow. It’s only if you want to change one of those two commands that you need to add meta robots. Therefore, if you want to noindex but follow the links on the page, you would add the following tag with only the noindex, as the follow is implied. Only change what you want to be different from the norm.
<meta name="robots" content="noindex" />
  • Specific bots (Googlebot) – These tags are used to give a specific bot instructions like noodp (forcing them not to use your DMOZ listing information, RIP) and noydir (the same, but for your Yahoo Directory listing information). Generally the search engines are really good at this kind of thing on their own, but if you think you need it, feel free. There have been some cases I’ve seen where it’s necessary, but if you must, consider using the overall robots tag listed above.
  • Language – The only reason to use this tag is if you’re moving internationally and need to declare the main language used on the page. Check out this meta languages resource for a full list of languages you can declare.
  • Geo – The last I heard, these meta tags are supported by Bing but not Google (you can target to country inside Search Console). There are three kinds: placename, position (latitude and longitude), and region.
<META NAME="geo.position" CONTENT="latitude; longitude">
<META NAME="geo.placename" CONTENT="Place Name">
<META NAME="geo.region" CONTENT="Country Subdivision Code">
  • Keywords – Yes, I put this on the “indifferent” list. While no good SEO is going to recommend spending any time on this tag, there’s some very small possibility it could help you somewhere. Please leave it out if you’re building a site, but if it’s automated, there’s no reason to remove it.
  • Refresh – This is the poor man’s redirect and should not be used, if at all possible. You should always use a server-side 301 redirect. I know that sometimes things need to happen now, but Google is NOT a fan.
  • Site verification – Your site is verified with Google and Bing, right? Who has the verification meta tags on their homepage? These are sometimes necessary because you can’t get the other forms of site verification loaded, but if at all possible, try to verify another way. Google allows you to verify by DNS, external file, or by linking your Google Analytics account. Bing still only allows verification by XML file or meta tag, so go with the file if you can (examples of both the refresh and verification tags follow this list).
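
All of the content values below are placeholders you’d swap for your own destination URL and verification tokens:

<meta http-equiv="refresh" content="0; url=https://www.example.com/new-page/">
<meta name="google-site-verification" content="your-google-token-here">
<meta name="msvalidate.01" content="your-bing-token-here">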

The bad meta tags

Nothing bad will happen to your site if you use these — let me just make that clear. They’re a waste of space, though; even Google says so (and that was 12 years ago now!). If you’re ready and willing, it might be time for some spring cleaning of your <head> area; a few example tags to hunt for follow the list below.

  • Author/web author – This tag is used to name the author of the page. It’s just not necessary on the page.
  • Revisit after – This meta tag is a command to the robots to return to a page after a specific period of time. It’s not followed by any major search engine.
  • Rating – This tag is used to denote the maturity rating of content. I wrote a post about how to tag a page with adult images using a very confusing system that has since been updated (see the post’s comments). It seems as if the best way to flag adult images is to place them in a separate directory from other images on your site and alert Google.
  • Expiration/date – “Expiration” is used to note when the page expires, and “date” is the date the page was made. Are any of your pages going to expire? Just remove them if they are (but please don’t keep updating content, even contests — make it an annual contest instead!). And for “date,” make an XML sitemap and keep it up to date. It’s much more useful.
  • Copyright – That Google article debates this with me a bit, but look at the footer of your site. I would guess it says “Copyright 20xx” in some form. Why say it twice?
  • Abstract – This tag is sometimes used to hold an abstract of the content and is used mainly by educational sites.
  • Distribution – The “distribution” value is supposedly used to control who can access the document, and is typically set to “global.” If a page is open (not password-protected, like on an intranet), it’s inherently implied that it’s meant for the world. Go with that, and leave the tag off the page.
  • Generator – This is used to note what program created the page. Like “author,” it’s useless.
  • Cache control – This tag is set in hopes of controlling when and how often a page is cached in the browser. It’s best to do this in the HTTP header.
  • Resource type – This is used to name the type of resource the page is, like “document.” Save yourself time, as the DTD declaration does it for you.
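
If you’re doing that spring cleaning, these are the sorts of lines to hunt for and delete. The values here are hypothetical, and many CMSs (the generator tag especially) insert tags like these automatically:

<meta name="author" content="Jane Developer">
<meta name="revisit-after" content="7 days">
<meta name="generator" content="YourCMS 4.7">
<meta name="copyright" content="Copyright 2017 Example, Inc.">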

There are so many meta tags out there, I’d love to hear about any you think need to be added or even removed! Shout out in the comments with suggestions or questions.


The MozCon Local 2017 Video Bundle Is Here!

Posted by Danielle_Launders

At MozCon Local we came, we learned, and now we share! We invited 16 speakers to dive into all aspects of local marketing and SEO in 13 keynote-style presentations and one Q&A panel with local search experts. Throughout the day we dove into such topics as link building, citation sources, reviews, industry trends, and more.

Ready to level up your local marketing skills? Feel free to jump ahead:

Let’s go! I’m ready for my bundle

For those who attended this year’s event, you may want to double-check your inbox: there’s a special email waiting for you with steps on how to access your videos. If you’re having any trouble, or if spam filters ate your email, feel free to reach out to the team at mozcon@moz.com.

MozCon Local 2017 was our biggest and best yet. We put a lot of heart into the program and are so excited to share all of the actionable tips and next-level knowledge with you. Harness the knowledge of industry leaders from the office or from the comfort of your sofa.

http://ift.tt/2nEl43T



The results are in…

Here’s what our attendees thought of MozCon Local 2017:

We asked our attendees for their thoughts on the sessions: 80% of surveyed attendees found the content in the presentations to be advanced enough for them, while 72% of respondents found 80% or more of the sessions to be interesting and relevant to their field.


The bundle

Included in the bundle is access to all of this year’s presentations, which include both the videos of the speakers and their slide decks.

For $99, the MozCon Local 2017 Video Bundle will give you instant access to:

  • All 14 videos — that’s over 6 hours of content from MozCon Local 2017!
  • The option to stream or download the videos to your computer, tablet, or phone (they’re iOS, Windows, and Android compatible).
  • Downloadable slide decks for all presentations.

Buy the MozCon Local 2017 Video Bundle


Want a sneak peek?

It’s important to know what you’re getting, which is why we’re sharing one of this year’s highly rated MozCon Local sessions for free. GetFiveStars’ Mike Blumenthal digs into the factors that determine the relevance of non-link-based signals and develops a model for how Google might use them to determine rank. Even if you feel the whole bundle isn’t for you, you won’t want to miss this informative session:

http://ift.tt/2ptnD5o

A huge thanks to the team members who worked hard to finish these videos. It takes a village, and we appreciate all the effort that went into designing, editing, and coding. We wish you all happy learning and hope to see you at MozCon 2017 in July!
