Google’s Walled Garden: Are We Being Pushed Out of Our Own Digital Backyards?

Posted by Dr-Pete

Early search engines were built on an unspoken transaction — a pact between search engines and website owners — you give us your data, and we’ll send you traffic. While Google changed the game of how search engines rank content, they honored the same pact in the beginning. Publishers, who owned their own content and traditionally were fueled by subscription revenue, operated differently. Over time, they built walls around their gardens to keep visitors in and, hopefully, keep them paying.

Over the past six years, Google has crossed this divide, building walls around their content and no longer linking out to the sources that content was originally built on. Is this the inevitable evolution of search, or has Google forgotten their pact with the people whose backyards their garden was built on?

I don’t think there’s an easy answer to this question, but the evolution itself is undeniable. I’m going to take you through an exhaustive (yes, you may need a sandwich) journey of the ways that Google is building in-search experiences, from answer boxes to custom portals, and rerouting paths back to their own garden.


I. The Knowledge Graph

In May of 2012, Google launched the Knowledge Graph. This was Google’s first large-scale attempt at providing direct answers in search results, using structured data from trusted sources. One incarnation of the Knowledge Graph is Knowledge Panels, which return rich information about known entities. Here’s part of one for actor Chiwetel Ejiofor (note: this image is truncated)…

The Knowledge Graph marked two very important shifts. First, Google created deep in-search experiences. As Knowledge Panels have evolved, searchers have access to rich information and answers without ever going to an external site. Second, Google started to aggressively link back to their own resources. It’s easy to overlook those faded blue links, but here’s the full Knowledge Panel with every link back to a Google property marked…

Including links to Google Images, that’s 33 different links back to Google. These two changes — self-contained in-search experiences and aggressive internal linking — represent a radical shift in the nature of search engines, and that shift has continued and expanded over the past six years.

More recently, Google added a sharing icon (on the right, directly below the top images). This provides a custom link that allows people to directly share rich Google search results as content on Facebook, Twitter, Google+, and by email. Google no longer views these pages as a path to a destination. Search results are the destination.

The Knowledge Graph also spawned Knowledge Cards, more broadly known as “answer boxes.” Take any fact in the panel above and pose it as a question, and you’re likely to get a Knowledge Card. For example, “How old is Chiwetel Ejiofor?” returns the following…

For many searchers, this will be the end of their journey. Google has answered their question and created a self-contained experience. Note that this example also contains links to additional Google searches.

In 2015, Google launched Medical Knowledge Panels. These gradually evolved into fully customized content experiences created with partners in the medical field. Here’s one for “cardiac arrest” (truncated)…

Note the fully customized design (these images were created specifically for these panels), as well as the multi-tabbed experience. It is now possible to have a complete, customized content experience without ever leaving Google.


II. Live Results

In some specialized cases, Google uses private data partnerships to create customized answer boxes. Google calls these “Live Results.” You’ve probably seen them many times now on weather, sports and stock market searches. Here’s one for “Seattle weather”…

For the casual information seeker, these are self-contained information experiences with most or all of what we care about. Live Results are unusual in that, unlike the general knowledge in the Knowledge Graph, each partnership represents a disruption to a specific industry.

These partnerships have branched out over time into even more specialized results. Consider, for example, “Snoqualmie ski conditions”…

Sports results are incredibly disruptive, and Google has expanded and enriched these results quite a bit over the past couple of years. Here’s one for “Super Bowl 2018”…

Note that clicking any portion of this Live Result leads to a customized portal on Google that can no longer be called a “search result” in any traditional sense (more on portals later). Special sporting events, such as the 2018 Winter Olympics, have even more rich features. Here are some custom carousels for “Olympic snowboarding results”…

Note that these are multi-column carousels that ultimately lead to dozens of smaller cards. All of these cards click to more Google search results. This design choice may look strange on desktop and marks another trend — Google’s shift to mobile-first design. Here’s the same set of results on a Google Pixel phone…

Here, the horizontal scrolling feels more intuitive, and the carousel spans the full width of the screen, instead of feeling like a free-floating design element. These features are not only rich experiences on mobile screens, but dominate mobile results much more than they do two-column desktop results.


III. Carousels

Speaking of carousels, Google has been experimenting with a variety of horizontal result formats, and many of them are built around driving traffic back to Google searches and properties. One of the older styles of carousels is the list format, which runs across the top of desktop searches (above other results). Here’s one for “Seattle Sounders roster”…

Each player links to a new search result with that player in a Knowledge Panel. This carousel expands to the width of the screen (which is unusual, since Google’s core desktop design is fixed-width). On my 1920×1080 screen, you can see 14 players, each linking to a new Google search, and the option to scroll for more…

This type of list carousel covers a wide range of topics, from “cat breeds” to “types of cheese.” Here’s an interesting one for “best movies of 1984.” The image is truncated, but the full result includes drop-downs to select movie genres and other years…

Once again, each result links to a new search with a Knowledge Panel dedicated to that movie. Another style of carousel is the multi-row horizontal scroller, like this one for “songs by Nirvana”…

In this case, not only does each entry click to a new search result, but many of them have prominent featured videos at the top of the left column (more on that later). My screen shows at least partial information for 24 songs, all representing in-Google links above the traditional search results…

A search for “laptops” (a very competitive, commercial term, unlike the informational searches above) has a number of interesting features. At the bottom of the search is this “Refine by brand” carousel…

Clicking on one of these results leads to a new search with the brand name prepended (e.g. “Apple laptops”). The same search shows this “Best of” carousel…

The smaller “Mentioned in:” links go to articles from the listed publishers. The main product links go to a Google search result with a product panel. Here’s what I see when I click on “Dell XPS 13 9350” (image is truncated)…

This entity lives in the right-hand column and looks like a Knowledge Panel, but it is commercial in nature (notice the “Sponsored” label in the upper right). Here, Google is driving searchers directly into a paid/advertising channel.


IV. Answers & Questions

As Google realized that the Knowledge Graph would never scale at the pace of the wider web, they started to extract answers directly from their index (i.e. all of the content in the world, or at least most of it). This led to what they call “Featured Snippets”, a special kind of answer box. Here’s one for “Can hamsters eat cheese?” (yes, I have a lot of cheese-related questions)…

Featured Snippets are an interesting hybrid. On the one hand, they’re an in-search experience (in this case, my basic question has been answered before I’ve even left Google). On the other hand, they do link out to the source site and are a form of organic search result.

Featured Snippets also power answers on Google Assistant and Google Home. If I ask Google Home the same question about hamsters, I hear the following:

On the website TheHamsterHouse.com, they say “Yes, hamsters can eat cheese! Cheese should not be a significant part of your hamster’s diet and you should not feed cheese to your hamster too often. However, feeding cheese to your hamster as a treat, perhaps once per week in small quantities, should be fine.”

You’ll see the answer is identical to the Featured Snippet shown above. Note the attribution (which I’ve bolded) — a voice search can’t link back to the source, posing unique challenges. Google does attempt to provide attribution on Google Home, but as they use answers extracted from the web more broadly, we may see the way original sources are credited change depending on the use case and device.

This broader answer engine powers another type of result, called “Related Questions” or the “People Also Ask” box. Here’s one on that same search…

These questions are at least partially machine-generated, which is why the grammar can read a little oddly — that’s a fascinating topic for another time. If you click on “What can hamsters eat list?” you get what looks a lot like a Featured Snippet (and links to an outside source)…

Notice two other things that are going on here. First, Google has included a link to search results for the question you clicked on (see the purple arrow). Second, the list has expanded. The two questions at the end are new. Let’s click “What do hamsters like to do for fun?” (because how can I resist?)…

This opens up a second answer, a second link to a new Google search, and two more answers. You can continue this to your heart’s content. What’s especially interesting is that this isn’t just some static list that expands as you click on it. The new questions are generated based on your interactions, as Google tries to understand your intent and shape your journey around it.

My colleague, Britney Muller, has done some excellent research on the subject and has taken to calling these infinite PAAs. They’re probably not quite infinite — eventually, the sun will explode and consume the Earth. Until then, they do represent a massively recursive in-Google experience.


V. Videos & Movies

One particularly interesting type of Featured Snippet is the Featured Video result. Search for “umbrella” and you should see a panel like this in the top-left column (truncated):

This is a unique hybrid — it has Knowledge Panel features (that link back to Google results), but it also has an organic-style link and large video thumbnail. While it appears organic, all of the Featured Videos we’ve seen in the wild have come from YouTube (Vevo is a YouTube partner), which essentially means this is an in-Google experience. These Featured Videos consume a lot of screen real-estate and appear even on commercial terms, like Rihanna’s “umbrella” (shown here) or Kendrick Lamar’s “swimming pools”.

Movie searches yield a rich array of features, from Live Results for local showtimes to rich Knowledge Panels. Last year, Google completely redesigned their mobile experience for movie results, creating a deep in-search experience. Here’s a mobile panel for “Black Panther”…

Notice the tabs below the title. You can navigate within this panel to a wealth of information, including cast members and photos. Clicking on any cast member goes to a new search about that actor/actress.

Although the search results eventually continue below this panel, the experience is rich, self-contained, and incredibly disruptive to high-ranking powerhouses in this space, including IMDB. You can even view trailers from the panel…

On my phone, Google displayed 10 videos (at roughly two per screen), and nine of those were links to YouTube. Given YouTube’s dominance, it’s difficult to say if Google is purposely favoring their own properties, but the end result is the same — even seemingly “external” clicks are often still Google-owned clicks.


VI. Local Results

A similar evolution has been happening in local results. Take the local 3-pack — here’s one on a search for “Seattle movie theaters”…

Originally, the individual business links went directly to each of those businesses’ websites. Over the past year or two, these have instead gone to local panels on Google Maps, like this one…

On mobile, these local panels stand out even more, with prominent photos, tabbed navigation and easy access to click-to-call and directions.

In certain industries, local packs have additional options to run a search within a search. Here’s a pack for Chicago taco restaurants, where you can filter results (from the broader set of Google Maps results) by rating, price, or hours…

Once again, we have a fully embedded search experience. I don’t usually vouch for any of the businesses in my screenshots, but I just had the pork belly al pastor at Broken English Taco Pub and it was amazing (this is my personal opinion and in no way reflects the taco preferences of Moz, its employees, or its lawyers).

The hospitality industry has been similarly affected. Search for an individual hotel, like “Kimpton Alexis Seattle” (one of my usual haunts when visiting the home office), and you’ll get a local panel like the one below. Pardon the long image, but I wanted you to have the full effect…

This is an incredible blend of local business result, informational panel, and commercial result, allowing you direct access to booking information. It’s not just organic local results that have changed, though. Recently, Google started offering ads in local packs, primarily on mobile results. Here’s one for “tax attorneys”…

Unlike traditional AdWords ads, these results don’t go directly to the advertiser’s website. Instead, like standard pack results, they go to a Google local panel. Here’s what the mobile version looks like…

In addition, Google has launched specialized ads for local service providers, such as plumbers and electricians. These appear carousel-style on desktop, such as this one for “plumbers in Seattle”…

Unlike AdWords advertisers, local service providers buy into a specialized program and these local service ads click to a fully customized Google sub-site, which brings us to the next topic — portals.


VII. Custom Portals

Some Google experiences have become so customized that they operate as stand-alone portals. If you click on a local service ad, you get a Google-owned portal that allows you to view the provider, check to see if they can handle your particular problem in your zip code, and (if not) view other, relevant providers…

You’ve completely left the search result at this point, and can continue your experience fully within this Google property. These local service ads have now expanded to more than 30 US cities.

In 2016, Google launched their own travel guides. Run a search like “things to do in Seattle” and you’ll see a carousel-style result like this one…

Click on “Seattle travel guide” and you’ll be taken to a customized travel portal for the city of Seattle. The screen below is a desktop result — note the increasing similarity to rich mobile experiences.

Once again, you’ve been taken to a complete Google experience outside of search results.

Last year, Google jumped into the job-hunting game, launching a 3-pack of job listings covering all major players in this space, like this one for “marketing jobs in Seattle”…

Click on any job listing, and you’ll be taken to a separate Google jobs portal. Let’s try Facebook…

From here, you can view other listings, refine your search, and even save jobs and set up alerts. Once again, you’ve jumped from a specialized Google result to a completely Google-controlled experience.

As with hotels, Google has dabbled in flight data and search for years. If I search for “flights to Seattle,” Google will automatically note my current location and offer me a search interface and a few choices…

Click on one of these choices and you’re taken to a completely redesigned Google Flights portal…

Once again, you can continue your journey completely within this Google-owned portal, never returning back to your original search. This is a trend we can expect to continue for the foreseeable future.


VIII. Hard Questions

If I’ve bludgeoned you with examples, then I apologize, but I want to make it perfectly clear that this is not a case of one or two isolated incidents. Google is systematically driving more clicks from search to new searches, in-search experiences, and other Google-owned properties. This leads to a few hard questions…

Why is Google doing this?

Right about now, you’re rushing to the comments section to type “For the money!” along with a bunch of other words that may include variations of my name, “sheeple,” and “dumb-ass.” Yes, Google is a for-profit company that is motivated in part by making money. Moz is a for-profit company that is motivated in part by making money. Stating the obvious isn’t insight.

In some cases, the revenue motivation is clear. Suggesting the best laptops to searchers and linking those to shopping opportunities drives direct dollars. In traditional walled gardens, publishers are trying to produce more page-views, driving more ad impressions. Is Google driving us to more searches, in-search experiences, and portals to drive more ad clicks?

The answer isn’t entirely clear. Knowledge Graph links, for example, usually go to informational searches with few or no ads. Rich experiences like Medical Knowledge Panels and movie results on mobile have no ads at all. Some portals have direct revenues (local service providers have to pay for inclusion), but others, like travel guides, have no apparent revenue model (at least for now).

Google is competing directly with Facebook for hours in our day — while Google has massive traffic and ad revenue, people on average spend much more time on Facebook. Could Google be trying to drive up their time-on-site metrics? Possibly, but it’s unclear what this accomplishes beyond being a vanity metric to make investors feel good.

Looking to the long game, keeping us on Google and within Google properties does open up the opportunity for additional advertising and new revenue streams. Maybe Google simply realizes that letting us go so easily off to other destinations is leaving future money on the table.

Is this good for users?

I think the most objective answer I can give is — it depends. As a daily search user, I’ve found many of these developments useful, especially on mobile. If I can get an answer at a glance or in an in-search entity, such as a Live Result for weather or sports, or the phone number and address of a local restaurant, it saves me time and the trouble of being familiar with the user interface of thousands of different websites. On the other hand, if I feel that I’m being run in circles through search after search or am being given fewer and fewer choices, that can feel manipulative and frustrating.

Is this fair to marketers?

Let’s be brutally honest — it doesn’t matter. Google has no obligation to us as marketers. Sites don’t deserve to rank and get traffic simply because we’ve spent time and effort or think we know all the tricks. I believe our relationship with Google can be symbiotic, but that’s a delicate balance and always in flux.

In some cases, I do think we have to take a deep breath and think about what’s good for our customers. As a marketer, local packs linking directly to in-Google properties is alarming — we measure our success based on traffic. However, these local panels are well-designed, consistent, and have easy access to vital information like business addresses, phone numbers, and hours. If these properties drive phone calls and foot traffic, should we discount their value simply because it’s harder to measure?

Is this fair to businesses?

This is a more interesting question. I believe that, like other search engines before it, Google made an unwritten pact with website owners — in exchange for our information and the privilege to monetize that information, Google would send us traffic. This is not altruism on Google’s part. The vast majority of Google’s $95B in 2017 advertising revenue came from search advertising, and that advertising would have no audience without organic search results. Those results come from the collective content of the web.

As Google replaces that content and sends more clicks back to themselves, I do believe that the fundamental pact that Google’s success was built on is gradually being broken. Google’s garden was built on our collective property, and it does feel like we’re slowly being herded out of our own backyards.

We also have to consider the deeper question of content ownership. If Google chooses to pursue private data partnerships — such as with Live Results or the original Knowledge Graph — then they own that data, or at least are leasing it fairly. It may seem unfair that they’re displacing us, but they have the right to do so.

Much of the Knowledge Graph is built on human-curated sources such as Wikidata (i.e. Wikipedia). While Google undoubtedly has an ironclad agreement with Wikipedia, what about the people who originally contributed and edited that content? Would they have done so knowing their content could ultimately displace other content creators (including possibly their own websites) in Google results? Are those contributors willing participants in this experiment? The question of ownership isn’t as easy as it seems.

If Google extracts the data we provide as part of the pact, such as with Featured Snippets and People Also Ask results, and begins to wall off those portions of the garden, then we have every right to protest. Even the concept of a partnership isn’t always black-and-white. Some job listing providers I’ve spoken with privately felt pressured to enter Google’s new jobs portal (out of fear of cutting off the paths to their own gardens), but they weren’t happy to see the new walls built.

Google is also trying to survive. Search has to evolve, and it has to answer questions and fit a rapidly changing world of device formats, from desktop to mobile to voice. I think the time has come, though, for Google to stop and think about the pact that built their nearly hundred-billion-dollar ad empire.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

MozCon 2018: Making the Case for the Conference (& All the Snacks!)

Posted by Danielle_Launders

You’ve got that conference looming on the horizon. You want to go — you’ve spent the past few years desperately following hashtags on Twitter, memorizing catchy quotes, zooming in on grainy snapshots of a deck, and furiously downloading anything and everything you can scour from Slideshare.

But there’s a problem: conferences cost money, and your boss won’t even approve a Keurig in the communal kitchen, much less a ticket to a three-day-long learning sesh complete with its own travel and lodging expenses.

What’s an education-hungry digital marketer to do?

How do you convince your boss to send you to the conference of your dreams?

First of all, you gather evidence to make your case.

There are a plethora of excellent reasons why attending conferences is good for your career (and your bottom line). In digital marketing, we exist in the ever-changing tech space, hurtling toward the future at breakneck speed and often missing the details of the scenery along the way.

A good SEO conference will keep you both on the edge of your seat and on the cutting-edge of what’s new and noteworthy in our industry, highlighting some of the most important and impactful things your work depends on.

A good SEO conference will flip a switch for you, will trigger that lightbulb moment that empowers you and levels you up as both a marketer and a critical thinker.

If that doesn’t paint a beautiful enough picture to convince the folks that hold the credit card, though, there are also some great statistics and resources available.

Specifically, we’re talking about MozCon

Yes, that MozCon!

Let’s just take a moment to address the elephant in the room here: you all know why we wrote this post. We want to see your smiling face in the audience at MozCon this July (the 9th–11th, if you were wondering). There are a few specific benefits worth mentioning:

  • Speakers and content: Our speakers bring their A-game each year. We work with them to bring the best content and latest trends to the stage to help set you up for a year of success.
  • Videos to share with your team: About a month or so after the conference, we’ll send you a link to professionally edited videos of every presentation at the conference. Your colleagues won’t get to partake in the morning Top Pot doughnuts or Starbucks coffee, but they will get a chance to learn everything you did, for free.
  • Great food onsite: We understand that conference food isn’t typically worth mentioning, but at MozCon you can expect snacks from local Seattle vendors – in the past these have included Trophy cupcakes, KuKuRuZa popcorn, and Starbucks’ Seattle Reserve cold brew, and did we mention bacon at breakfast? Let’s not forget the bacon.
  • Swag: Expect to go home with a one-of-a-kind Roger Mozbot, a super-soft t-shirt from American Apparel, and swag worth keeping. We’ve given away Roger Legos, Moleskine notebooks, phone chargers, and have even had vending machines with additional swag in case you didn’t get enough.
  • Networking: You work hard taking notes, learning new insights, and digesting all of that knowledge — that’s why we think you deserve a little fun in the evenings to chat with fellow attendees. Each night after the conference, we’ll offer a different networking event that adds to the value you’ll get from your day of education.
  • A supportive network after the fact: Our MozCon Facebook group is incredibly active, and it’s grown to have a life of its own — marketers ask one another SEO questions, post jobs, look for and offer advice and empathy, and more. It’s a great place to find TAGFEE support and camaraderie long after the conference itself has ended.
  • Discounts for subscribers and groups: Moz Pro subscribers get a whopping $500 off their ticket cost (even if you’re on a free 30-day trial!) and there are discounts for groups as well, so make sure to take advantage of savings where you can!
  • Ticket cost: At MozCon our goal is to break even, which means we invest all of your ticket price back into you. Check out the full breakdown below:

Can you tell we’re serious about the snacks?

You can check out videos from years past to get a taste for the caliber of our speakers. We’ll also be putting out a call for community speaker pitches in April, so if you’ve been thinking about breaking into the speaking circuit, it could be an amazing opportunity — keep an eye on the blog for your chance to submit a pitch.

If you’ve ever seriously considered attending an SEO conference like MozCon, now’s the time to do it. You’ll save actual hundreds of dollars by grabbing subscriber or group pricing while you can (think of all the Keurigs you could get for that communal kitchen!), and you’ll be bound for an unforgettable experience that lives and grows with you beyond just the three days you spend in Seattle.

Grab your ticket to MozCon!


How (and Whether) to Invest in and Structure Online Communities – Whiteboard Friday

Posted by randfish

Building an online community sounds like an attractive idea on paper. A group of enthusiastic, engaged users working on their own to boost your brand? What’s the hitch?

Well, building a thriving online community takes a great deal of effort, often with little return for a very long time. And there are other considerations: do you build your own platform, participate in an existing community, or a little of both? What are the benefits from a brand, SEO, and content marketing perspective? In this edition of Whiteboard Friday, Rand answers all your questions about building yourself an online community, including whether it’s an investment worth your time.

https://fast.wistia.net/embed/iframe/0gzvy51tmw?seo=false&videoFoam=true


How and whether to invest in and structure online communities

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week, we’re chatting about how and whether to invest in and structure online communities.

I want to say a thank you to @DaveCraige on Twitter. Dave, thank you very much for the question, an excellent one. I think this is something that a lot of content marketers, web marketers, and community builders think about: “Should I be making an investment in building my own community? Should I leverage someone’s existing community? How can I do that and benefit from an SEO perspective and a content marketing and a brand awareness perspective?” So we’ll try and tackle some of those questions today on Whiteboard Friday.

Strategy first!

First off, before you go and invest anywhere or build anything, I urge you to think strategy first, meaning your business has goals. You have things that you want to accomplish. Maybe those are revenue growth or conversions. Maybe they have to do with entering a new sphere of influence or pursuing a new topic. Maybe you’re trying to launch a new product. Maybe you’re trying to pivot the business or disrupt yourself, change with technology.

Whatever those business goals are, they should lead you to marketing goals, the things that marketing can help accomplish in service of those business goals. From that should fall out a bunch of tactics and initiatives. It’s only down here, in your marketing goals and tactical initiatives, that you should actually make the investment, and only if online communities match up with those and serve your broader business goals. If not, or if they fall below the line of, “Well, we think we can do three things this year and do them well, and this is thing number 4 or number 5 or number 10,” it doesn’t make the cut.

Online communities fit here if…

1. A passionate group of investment-worthy size exists in your topic.

So say, for example, you are targeting a new niche. I think Dave himself is in cryptocurrency. There’s definitely a passionate group of people in that sphere, and it is probably of investment-worthy size. More recently, that investment has been a little rocky, but it’s certainly a large group, and if you are targeting that group, a community could be worthwhile. So we have passion. We have a group. And that group is of investment-worthy size.

2. You/your brand/your platform can provide unique value via a community that’s superior to what’s available elsewhere.

Second, you or your brand or your platform can provide not just value but unique value, unique value different from what other people are offering via a community superior to what’s available elsewhere. Dave might himself say, “Well, there’s a bunch of communities around crypto, but I believe that I can create X, which will be unique in ways Y and Z and will be preferable for these types of participants in this way.” Maybe that’s because it enables sharing in certain ways. Maybe it enables transparency of certain kinds. Maybe it’s because it has no vested interest or ties to particular currencies or particular companies, whatever the case may be.

3. You’re willing to invest for years with little return to build something of great long-term value.

And last but not least, you’re willing to invest potentially for years, literally years, with very little or no return, to build something of great long-term value. I think this is the toughest one. But communities are most similar to content marketing, where you’re going to put in a ton of upfront effort and a lot of ongoing effort before you’re going to see that return. Most communities fail because the people behind them were not willing to make the investments to build them up, or they made other types of mistakes. We’ll talk about that in a second.

Two options: Build your own platform, or participate in an existing community

You have two options here. First, you can build your own platform. Second, you can participate in an existing community. My advice on this is never do number one without first spending a bunch of time in number two.

So if you are unfamiliar with the community platforms that already exist in interior decorating or in furniture design or in cryptocurrency or in machining tools or in men’s fashion, first participate in the communities that already exist in the space you’re targeting so that you are very familiar with the players, the platforms, the options, and opportunities. Otherwise, you will never know whether it’s an investment-worthy size, a passionate group. You’ll never know how or whether you can provide unique value. It’s just going to be too tough to get those things down. So always invest in the existing communities before you do the other one.

1. Build your own platform

Potential structures

Let’s talk quickly about building your own platform, and then we’ll talk about investing in others. If you’re deciding that what matches your goals best and your strategy best is to build your own platform, there are numerous opportunities. You can do it sort of halfway, where you build on someone else’s existing platform, for example creating your own subreddit or your own Facebook or LinkedIn group, which essentially uses another community’s platform, but you’re the owner and administrator of that community.

Or you can actually build your own forum or discussion board, your own blog and comments section, your own Q&A board, your own content leaderboard, like Hacker News or like Dharmesh and I did with Inbound.org, where we essentially built a Reddit or Hacker News-like clone for marketers.

Those investments are going to be much more substantial than a Facebook group or a Twitter hashtag, a Twitter chat or a LinkedIn group, or those kinds of things, but you want to compare the benefits and drawbacks; each option has some of both.

Benefits & drawbacks

So forums and discussions, those are going to have user-generated content, which is a beautiful thing because it scales non-linearly with your investment. If you build up a community of people who are creating topics on an ongoing basis, answering those topics, and talking about those things in a Q&A board, a forum discussion, or a content leaderboard, what’s great is you get that SEO benefit of a bunch of long-tail, hopefully high-quality content and discussion without having to create all of it yourself.

Mostly, what you’re going to worry about are drawbacks like the graveyard effect, where the community appears empty and so no one participates, which may drag down Google’s perception of your site because you have a bunch of low-quality or thin pages. Or people leave a bunch of spam in there, or the community fills up with hate groups; the internet can devolve very quickly, as you can see from a lot of online communities.

With a blog and comments, you get SEO benefits and thought leadership benefits, but it requires regular content investment, and you don’t get the UGC benefit quite like you would with a forum or a discussion board. With Facebook groups, LinkedIn groups, or Twitter hashtags, it’s easy to build, but there’s little to no SEO benefit.

With a Q&A board, you do get UGC and SEO. You still have those same moderation and graveyard risks.

With content leaderboards, they’re very difficult to maintain, Inbound.org being a good example, where Dharmesh and I figured, “Hey, we can get this thing up and rolling,” and then it turns out no, we need to hire people and maintain it and put in a bunch of effort and energy. But it can become a bookmarkable destination, which means you get repeat traffic over and over.

Whatever you’re choosing, make sure you list out these benefits and drawbacks and then align them with the strategy, the marketing goals, and the tactics and initiatives that flow from those. That’s going to help determine whether you should structure your own community and, if so, how.

2. Participate in existing communities

Size/reach

The other option is participating in existing communities, places like Quora, subreddits, Twitter, LinkedIn groups, and existing forums. Same thing: you’re going to evaluate these. Well, we can participate on an existing forum, and we can see that the size and reach is, on average, about nine responses per thread and about three new threads per day.

Benefits & drawbacks

The benefit is that it can build up our thought leadership and our recognition among this group of influential people in our field. The drawback is it builds our competitor’s content, because this forum is going to be ranking for all those things. They own the platform. It’s not our owned platform. Then we align that with our goals and initiatives.

Four bits of advice

1. If you build, build for SEO + owned channels. Don’t create on someone else’s platform.

So I’m not going to dive through all of these, but I do want to end on some bits of advice. I already mentioned: make sure you invest in other people’s communities before you build your own. I would also add that if you’re going to build something, if you’re going to build your own, I would generally rule these things out — LinkedIn groups, Facebook groups, Twitter hashtag groups. Why? Because those other platforms control them, they can change them at any time, and your reach on those platforms can change. I would urge you to build for SEO and for an owned media channel.

2. Start with a platform that doesn’t lose credibility when empty (e.g. blog > forum).

Second, I’d start with a platform that doesn’t lose credibility when it’s empty. That is to say if you know you want to build a forum or a content leaderboard or a Q&A board, make it something where you know that you and your existing team can do all the work to create a non-graveyard-like environment initially. That could mean limiting it to only a few sections in a forum, or all the Q&A goes in one place as opposed to there are many subforums that have zero threads and zero comments and zero replies, or every single thing that’s posted, we know that at least two of us internally will respond to them, that type of stuff.

3. Don’t use a subdomain or separate domain.

Third, if you can, avoid using a subdomain and definitely don’t use a separate domain. Subdomains inherit some of the ranking ability and benefits of the primary domain they’re on. Separate domains tend to inherit almost none.

4. Before you build, gather a willing, excited crew via an email group who will be your first active members.

Last, but not least, before you build, gather a willing, excited group of people, your crew, hopefully via an email group — this has served me extremely well — who will be those first active members.

So if you’re building something in the crypto space, as maybe Dave is considering, I might say to him, hey, find those 10 or 15 or 20 people who are in your world, who you talk to online already, create an email group, all be chatting with each other and contributing. Then start your Q&A board, or then start your blog and your comments section, or then start your forum, what have you. If you can seed it with that initial passionate group, you will get over a lot of the big hurdles around building or rolling your own community system.

All right, everyone. Hope you’ve enjoyed this edition of Whiteboard Friday, and we’ll see you again next week. Take care.

Video transcription by Speechpad.com


How to Deal with Fake Negative Reviews on Google

Posted by JoyHawkins

Fake reviews are a growing problem for those of us that own small businesses. In the online world, it’s extremely easy to create a new account and leave either a positive or negative review for any business — regardless of whether you’ve ever tried to hire them.

Google has tons of policies for users that leave reviews. But in my experience they’re terrible at automatically catching violations of these policies. At my agency, my team spends time each month carefully monitoring reviews for our clients and their competitors. The good news is that if you’re diligent at tracking them and can make a good enough case for why the reviews are against the guidelines, you can get them removed by contacting Google on Twitter, Facebook, or reporting via the forum.

Recently, my company got hit with three negative reviews, all left in the span of 5 minutes:

Two of the three reviews were ratings without reviews. These are the hardest to get rid of because Google will normally tell you that they don’t violate the guidelines — since there’s no text on them. I instantly knew they weren’t customers because I’m really selective about who I work with and keep my client base small intentionally. I would know if someone that was paying me was unhappy.

The challenge with negative reviews on Google

The challenge is that Google doesn’t know who your customers are, and they won’t accept “this wasn’t a customer” as an acceptable reason to remove a review, since they allow people to use anonymous usernames. In most cases, it’s extremely difficult to prove the identity of someone online.

The other challenge is that a person doesn’t have to be a customer to be eligible to leave a review. They have to have a “customer experience,” which could be anything from trying to call you and getting your voicemail to dropping by your office and just browsing around.

How to respond

When you work hard to build a good, ethical business, it’s always infuriating when a random person has the power to destroy what took you years to build. I’d be lying if I said I wasn’t the least bit upset when these reviews came in. Thankfully, I was able to follow the advice I’ve given many people in the last decade, which is to calm down and think about what your future prospects will see when they come across the review and the way you respond to it.

Solution: Share your dilemma

I decided to post on Twitter and Facebook about my lovely three negative reviews, and the response I got was overwhelming. People had really great and amusing things to say about my dilemma.


Whoever was behind these three reviews was seeking to harm my business. The irony is that they actually helped me, because I ended up getting three new positive reviews as a result of sharing my experience with people that I knew would rally behind me.

For most businesses, your evangelists might not be on Twitter, but you could post about it on your personal Facebook profile. Any friends that have used your service or patronized your business would likely respond in the same manner. It’s important to note that I never asked anyone to review me when posting this — it was simply the natural response from people that were a fan of my company and what we stand for. If you’re a great company, you’ll have these types of customers and they should be the people you want to share this experience with!

But what about getting the negative reviews removed?

In this case, I was able to get the three reviews removed. However, there have also been several cases where I’ve seen Google refuse to remove them for others. My plan B was to post a response to the reviews offering these “customers” a 100% refund. After all, 100% of zero is still zero — I had nothing to lose. This would also ensure that future prospects see that I’m willing to address people that have a negative experience, since even the best businesses in the world aren’t perfect. As much as I love my 5-star rating average, studies have shown that 4.2–4.5 is actually the ideal average star rating for purchase probability.

Have you had an experience with fake negative reviews on Google? If so, I’d love to hear about it, so please leave a comment.


The Google Ranking Factor You Can Influence in an Afternoon [Case Study]

Posted by sanfran

What does Google consider “quality content”? And how do you capitalize on a seemingly subjective characteristic to improve your standing in search?

We’ve been trying to figure this out since the Hummingbird algorithm was dropped in our laps in 2013, prioritizing “context” over “keyword usage/frequency.” This meant that Google’s algorithm intended to understand the meaning behind the words on the page, rather than the page’s keywords and metadata alone.

This sea change meant the algorithm was going to read between the lines in order to deliver content that matched the true intent of someone searching for a keyword.

Write longer content? Not so fast!

Watching us SEOs respond to Google updates is hilarious. We’re like a floor full of day traders getting news on the latest cryptocurrency.

One of the most prominent theories that made the rounds was that longer content was the key to organic ranking. I’m sure you’ve read plenty of articles on this. We at Brafton, a content marketing agency, latched onto that one for a while as well. We even experienced some mixed success.

However, what we didn’t realize was that when we experienced success, it was because we accidentally stumbled on the true ranking factor.

Longer content alone was not the intent behind Hummingbird.

Content depth

Let’s take a hypothetical scenario.

If you were to search the keyword “search optimization techniques,” you would see a SERP that looks similar to the following:

Nothing too surprising about these results.

However, if you were to go through each of these 10 results and take note of the major topics they discussed, theoretically you would have a list of all the topics being discussed by all of the top ranking sites.

Example:

Position 1 topics discussed: A, C, D, E, F

Position 2 topics discussed: A, B, F

Position 3 topics discussed: C, D, F

Position 4 topics discussed: A, E, F

Once you finished this exercise, you would have a comprehensive list of every topic discussed (A–F), and you would start to see patterns of priority emerge.

In the example above, note “topic F” is discussed in all four pieces of content. One would consider this a cornerstone topic that should be prioritized.

If you were then to write a piece of content that covered each of the topics discussed by every competitor on page one, and emphasized the cornerstone topics appropriately, in theory, you would have the most comprehensive piece of content on that particular topic.

By producing the most comprehensive piece of content available, you would have the highest quality result that will best satisfy the searcher’s intent. More than that, you would have essentially created the ultimate resource center for everything a person would want to know about that topic.
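To make the exercise concrete, the cornerstone logic above boils down to counting how many of the top-ranking pages cover each topic. Here’s a minimal Python sketch using the hypothetical A–F example from above (the 75% threshold is an arbitrary assumption for illustration, not a rule from Google):

```python
from collections import Counter

# Hypothetical topic coverage for the top four ranking pages (the A-F example above)
topics_by_position = {
    1: {"A", "C", "D", "E", "F"},
    2: {"A", "B", "F"},
    3: {"C", "D", "F"},
    4: {"A", "E", "F"},
}

# Count how many competing pages discuss each topic
coverage = Counter(t for topics in topics_by_position.values() for t in topics)

# Every topic discussed anywhere on page one belongs in your outline
all_topics = sorted(coverage)

# Topics covered by most competitors are the "cornerstone" topics to emphasize
threshold = 0.75 * len(topics_by_position)
cornerstones = sorted(t for t, n in coverage.items() if n >= threshold)

print("All topics to cover:", all_topics)    # ['A', 'B', 'C', 'D', 'E', 'F']
print("Cornerstone topics:", cornerstones)   # ['A', 'F'] with this data
```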

How to identify topics to discuss in a piece of content

At this point, this is only theoretical. The theory makes logical sense, but does it actually work? And how do we go about scientifically gathering information on topics to discuss in a piece of content?

Finding topics to cover:

  • Manually: As discussed previously, you can do it manually. This process is tedious and labor-intensive, but it can be done on a small scale.
  • Using SEMrush: SEMrush features an SEO content template that will provide guidance on topic selection for a given keyword.
  • Using MarketMuse: MarketMuse was originally built for the very purpose of content depth, with an algorithm that mimics Hummingbird. MM takes a largely unscientific process and makes it scientific. For the purpose of this case study, we used MarketMuse.

The process


Watch the process in action


1. Identify content worth optimizing

We went through a massive list of keywords our blog ranked for. We filtered that list down to keywords that were not ranking number one in SERPs but had strong intent. You can also do this with core landing pages.

Here’s an example: We were ranking in the third position for the keyword “financial content marketing.” While this is a low-volume keyword, we were enthusiastic to own it due to the high commercial intent it comes with.
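As an aside, if your keyword export lives in a CSV, the filtering step is easy to script. Below is a minimal sketch; the file name and the keyword, ranking, and monthly_volume columns are assumptions for illustration, and the intent judgment still has to be made by a human:

```python
import csv

# Hypothetical export of keywords the blog ranks for (file name and columns are assumptions)
with open("ranking_keywords.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Keep keywords already on page one but not in position one; these become the
# candidate list to review by hand for commercial intent.
candidates = [r for r in rows if 1 < int(r["ranking"]) <= 10]

for r in sorted(candidates, key=lambda r: int(r["ranking"])):
    print(r["keyword"], r["ranking"], r["monthly_volume"])
```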

2. Evaluate your existing piece

Take a subjective look at your piece of content that is ranking for the keyword. Does it SEEM like a comprehensive piece? Could it benefit from updated examples? Could it benefit from better/updated inline embedded media? With a cursory look at our existing content, it was clear that the examples we used were old, as was the branding.

3. Identify topics

As mentioned earlier, you can do this in a few different ways. We used MarketMuse to identify the topics we were doing a good job of covering as well as our topic gaps, topics that competitors were discussing, but we were not. The results were as follows:

Topics we did a good job of covering:

  • Content marketing impact on branding
  • Impact of using case studies
  • Importance of infographics
  • Business implications of a content marketing program
  • Creating articles for your audience

Topics we did a poor job of covering:

  • Marketing to millennials
  • How to market to existing clients
  • Crafting a content marketing strategy
  • Identifying and tracking goals
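Stripped of the tooling, this gap analysis is just a set difference: the topics competitors cover minus the topics you already cover. A minimal sketch, using the topic lists above as stand-in data:

```python
# Hypothetical topic sets; in practice these come from manual review or a tool like MarketMuse
our_topics = {
    "content marketing impact on branding",
    "impact of using case studies",
    "importance of infographics",
    "business implications of a content marketing program",
    "creating articles for your audience",
}
competitor_topics = our_topics | {
    "marketing to millennials",
    "how to market to existing clients",
    "crafting a content marketing strategy",
    "identifying and tracking goals",
}

# Topics competitors discuss that we don't: the gaps to address in the rewrite
topic_gaps = sorted(competitor_topics - our_topics)
print(topic_gaps)
```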

4. Rewrite the piece

Considering how out-of-date our examples were, and the number of topics we had neglected to discuss, we determined a full rewrite of the piece was warranted. Our writer, Mike O’Neill, was given the topic guidance, ensuring he had a firm understanding of everything that needed to be discussed in order to create a comprehensive article.

5. Update the content

To maintain our link equity, we kept the same URL and simply updated the old content with the new. Then we updated the publish date. The new article looks like this, with updated content depth, modern branding, and inline visuals.

6. Fetch as Google

Rather than wait for Google to reindex the content, I wanted to see the results immediately (and it is indeed immediate).

7. Check your results

Open an incognito window and see your updated position.

Promising results:

We have run more than a dozen experiments and have seen positive results across the board. As demonstrated in the video, these results are usually realized within 60 seconds of reindexing the updated content.

Keyword target (old ranking → new ranking):

  • “Financial content marketing”: 3 → 1
  • “What is a subdomain”: 16 → 6
  • “Best company newsletters”: 32 → 4
  • “Staffing marketing”: 7 → 3
  • “Content marketing agency”: 16 → 1
  • “Google local business cards”: 16 → 5
  • “Company blog”: 7 → 4
  • “SEO marketing tools”: 9 → 3

Of those tests, here’s another example of this process in action for the keyword “best company newsletters.”

Before:

After:

Assumptions:

From these results, we can assume that content depth and breadth of topic coverage matters — a lot. Google’s algorithm seems to have an understanding of the competitive topic landscape for a keyword. In our hypothetical example from before, it would appear the algorithm knows that topics A–F exist for a given keyword and uses that collection of topics as a benchmark for content depth across competitors.

We can also assume Google’s algorithm either a.) responds immediately to updated information, or b.) has a cached snapshot of the competitive content depth landscape for any given keyword. Either scenario is plausible given the speed at which updated content is re-ranked.


In conclusion, don’t arbitrarily write long content and call it “high quality.” Choose a keyword you want to rank for and create a comprehensive piece of content that fully supports that keyword. There is no guarantee you’ll be granted a top position — domain strength factors play a huge role in rankings — but you’ll certainly improve your odds, as we have seen.


The Biggest Mistake Digital Marketers Ever Made: Claiming to Measure Everything

Posted by willcritchlow

Digital marketing is measurable.

It’s probably the single most common claim everyone hears about digital, and I can’t count the number of times I’ve seen conference speakers talk about it (heck, I’ve even done it myself).

I mean, look at those offline dinosaurs, the argument goes. They all know that half their spend is wasted — they just don’t know which half.

Maybe the joke’s on us digital marketers though, who garnered only 41% of global ad spend even in 2017 after years of strong growth.

Unfortunately, while we were geeking out about attribution models and cross-device tracking, we were accidentally triggering a common human cognitive bias that kept us anchored on small amounts, leaving buckets of money on the table and fundamentally reducing our impact and access to the C-suite.

And what’s worse is that we have convinced ourselves that it’s a critical part of what makes digital marketing great. The simplest way to see this is to realize that, for most of us, if you removed all our measurement ability, I very much doubt we’d reduce our digital marketing investment to nothing.

In truth, of course, we’re nowhere close to measuring all the benefits of most of the things we do. We certainly track the last clicks, and we’re not bad at tracking any clicks on the path to conversion on the same device, but we generally suck at capturing:

  • Anything that happens on a different device
  • Brand awareness impacts that lead to much later improvements in conversion rate, average order value, or lifetime value
  • Benefits of visibility or impressions that aren’t clicked
  • Brand affinity generally

The cognitive bias that leads us astray

All of this means that the returns we report on tend to be just the most direct returns. This should be fine — it’s just a floor on the true value (“this activity has generated at least this much value for the brand”) — but the “anchoring” cognitive bias means that it messes with our minds and our clients’ minds. Anchoring is the process whereby we fixate on the first number we hear and subsequently estimate unknowns closer to the anchoring number than we should. Famous experiments have shown that even showing people a totally random number can drag their subsequent estimates up or down.

So even if the true value of our activity was 10x the measured value, we’d be stuck on estimating the true value as very close to the single concrete, exact number we heard along the way.

This tends to result in the measured value being seen as a ceiling on the true value. Other biases, like the availability heuristic (which results in us overstating the likelihood of things that are easy to remember), mean that we tend to factor in the obvious ways that the direct value measurement could be overstating things, while leaving to one side all the unmeasured extra value.

The mistake became a really big one because fortunately/unfortunately, the measured return in digital has often been enough to justify at least a reasonable level of the activity. If it hadn’t been (think the vanishingly small number of people who see a billboard and immediately buy a car within the next week when they weren’t otherwise going to do so) we’d have been forced to talk more about the other benefits. But we weren’t. So we lazily talked about the measured value, and about the measurability as a benefit and a differentiator.

The threats of relying on exact measurement

Not only do we leave a whole load of credit (read: cash) on the table, but it also leads to threats to measurability being seen as existential threats to digital marketing activity as a whole. We know that there are growing threats to measuring accurately, including regulatory, technological, and user-behavior shifts:

Now, imagine that the combination of these trends meant that you lost 100% of your analytics and data. Would it mean that your leads stopped? Would you immediately turn your website off? Stop marketing?

I suggest that the answer to all of that is “no.” There’s a ton of value to digital marketing beyond the ability to track specific interactions.

We’re obviously not going to see our measurable insights disappear to zero, but for all the reasons I outlined above, it’s worth thinking about all the ways that our activities add value, how that value manifests, and some ways of proving it exists even if you can’t measure it.

How should we talk about value?

There are two pieces to the brand value puzzle:

  1. Figuring out the value of increasing brand awareness or affinity
  2. Understanding how our digital activities are changing said awareness or affinity

There’s obviously a lot of research into brand valuations generally, and while it’s outside the scope of this piece to think about total brand value, it’s worth noting that some methodologies place as much as 75% of the enterprise value of even some large companies in the value of their brands:


My colleague Tom Capper has written about a variety of ways to measure changes in brand awareness, which attacks a good chunk of the second challenge. But challenge #1 remains: how do we figure out what it’s worth to carry out some marketing activity that changes brand awareness or affinity?

In a recent post, I discussed different ways of building marketing models and one of the methodologies I described might be useful for this – namely so-called “top-down” modelling which I defined as being about percentages and trends (as opposed to raw numbers and units of production).

The top-down approach

I’ve come up with two possible ways of modelling brand value in a transactional sense:

1. The Sherlock approach

“When you have eliminated the impossible, whatever remains, however improbable, must be the truth.”
– Sherlock Holmes

The outline would be to take the total new revenue acquired in a period. Subtract from this any elements that can be attributed to specific acquisition channels; whatever remains must be brand. If this is in any way stable or predictable over multiple periods, you can use it as a baseline value from which to apply the methodologies outlined above for measuring changes in brand awareness and affinity.
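To make that concrete, here’s a minimal Python sketch of the subtraction, using entirely made-up revenue figures and channel names as placeholders (they aren’t from any real account):

```python
# Sherlock approach: whatever new revenue can't be attributed to a specific
# acquisition channel is treated as the brand-driven baseline.
# All figures below are hypothetical placeholders.
periods = {
    "Q1": {"total_new_revenue": 1_000_000,
           "attributed": {"unbranded_search": 350_000,
                          "paid_social": 200_000,
                          "referral": 150_000}},
    "Q2": {"total_new_revenue": 1_100_000,
           "attributed": {"unbranded_search": 380_000,
                          "paid_social": 210_000,
                          "referral": 160_000}},
}

for period, data in periods.items():
    attributed_total = sum(data["attributed"].values())
    brand_baseline = data["total_new_revenue"] - attributed_total
    share = brand_baseline / data["total_new_revenue"]
    print(f"{period}: brand baseline = {brand_baseline:,} ({share:.0%} of new revenue)")
```

If that leftover share comes out roughly stable period over period, it can serve as the baseline described above.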

2. Aggressive attribution

If you run normal first-touch attribution reports, the limitations of measurement (clearing cookies, multiple devices etc) mean that you will show first-touch revenue that seems somewhat implausible (e.g. email; email surely can’t be a first-touch source — how did they get on your email list in the first place?):


In this screenshot we see that although first-touch dramatically reduces the influence of direct, for instance, it still accounts for more than 15% of new revenue.

The aggressive attribution model takes total revenue and splits it between the acquisition channels (unbranded search, paid social, referral). A first pass on this would simply split it in proportion to the size of each of those channels, effectively normalizing them, though you could build more sophisticated models (there’s a small sketch of this after the list below).

Note that there is no way of perfectly identifying branded vs. unbranded organic search since keyword data became (not provided), so you’ll have to use a proxy like homepage search vs. non-homepage search.

But fundamentally, the argument here would be that any revenue coming from a “first touch” of:

  • Branded search
  • Direct
  • Organic social
  • Email

…was actually acquired previously via one of the acquisition channels and so we attempt to attribute it to those channels.
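Here’s a minimal Python sketch of that reallocation, with placeholder first-touch figures rather than numbers from any real analytics account:

```python
# Aggressive attribution: revenue whose first touch was a brand-flavored
# channel is handed back to the acquisition channels, split in proportion
# to their size. All figures are hypothetical placeholders.
first_touch_revenue = {
    "unbranded_search": 300_000,
    "paid_social": 150_000,
    "referral": 100_000,
    "branded_search": 200_000,
    "direct": 180_000,
    "organic_social": 70_000,
    "email": 100_000,
}

acquisition_channels = ["unbranded_search", "paid_social", "referral"]
brand_channels = ["branded_search", "direct", "organic_social", "email"]

acquisition_total = sum(first_touch_revenue[c] for c in acquisition_channels)
brand_total = sum(first_touch_revenue[c] for c in brand_channels)

reallocated = {}
for channel in acquisition_channels:
    share = first_touch_revenue[channel] / acquisition_total
    reallocated[channel] = first_touch_revenue[channel] + share * brand_total

for channel, revenue in reallocated.items():
    print(f"{channel}: {revenue:,.0f}")
```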

Even this under-represents brand value

Both of those methodologies are pretty aggressive — but they might still under-represent brand value. Here are two additional mechanics where brand drives organic search volume in ways I haven’t figured out how to measure yet:

Trusting Amazon to rank

I like reading on the Kindle. If I hear of a book I’d like to read, I’ll often Google the name of the book on its own and trust that Amazon will rank first or second so I can get to the Kindle page to buy it. This is effectively a branded search for Amazon (and if it doesn’t rank, I’ll likely follow up with a [book name amazon] search or head on over to Amazon to search there directly).

But because all I’ve appeared to do is search [book name] on Google and then click through to Amazon, there is nothing to differentiate this from an unbranded search.

Spotting brands you trust in the SERPs

I imagine we all have anecdotal experience of doing this: you do a search and you spot a website you know and trust (or where you have an account) ranking somewhere other than #1 and click on it regardless of position.

One time that I can specifically recall noticing this tendency growing in myself was when I started doing tons more baby-related searches after my first child was born. Up until that point, I had effectively zero brand affinity with anyone in the space, but I quickly grew to rate the content put out by babycentre (babycenter in the US) and I found myself often clicking on their result in position 3 or 4 even when I hadn’t set out to look for them, e.g. in results like this one:

It was fascinating to me to observe this behavior in myself because I had no real interaction with babycentre outside of search, and yet, by consistently ranking well across tons of long-tail queries and providing consistently good content and user experience, they came to be a brand I knew and trusted and clicked on even when they were outranked. I find this to be a great example because it is entirely self-contained within organic search. They built a brand effect through organic search and reaped the reward in increased organic search.

I have essentially no ideas on how to measure either of these effects. If you have any bright ideas, do let me know in the comments.

Budgets will come under pressure

My belief is that total digital budgets will continue to grow (especially as TV continues to fragment), but I also believe that individual budgets are going to come under scrutiny and pressure, making this kind of thinking increasingly important.

We know that there is going to be pressure on referral traffic from Facebook following the recent news feed announcements, but there is also pressure on trust in Google:

While I believe that the opportunity is large and still growing (see, for example, this slide showing Google growing as a referrer of traffic even as CTR has declined in some areas), it’s clear that the narrative is going to lead to more challenging conversations and budgets under increased scrutiny.

Can you justify your SEO investment?

What do you say when your CMO asks what you’re getting for your SEO investment?

What do you say when she asks whether the organic search opportunity is tapped out?

I’ll probably explore the answers to both these questions more in another post, but suffice it to say that I do a lot of thinking about these kinds of questions.

The first is why we have built our split-testing platform to make organic SEO investments measurable, quantifiable and accountable.

The second is why I think it’s super important to remember the big picture while the media is running around with their hair on fire. Media companies saw Facebook overtake Google as a traffic channel (and are likely seeing that reverse right now), but for most of the web, Google remains the largest and still-growing source of traffic and value.

The reality (from clickstream data) is that it’s really easy to forget how long the long-tail is and how sparse search features and ads are on the extreme long-tail:

  1. Only 3–4% of all searches result in a click on an ad, for example. Google’s incredible (and still growing) business is based on a small subset of commercial searches
  2. Google’s share of all outbound referral traffic across the web is growing (and Facebook’s is shrinking as they increasingly wall off their garden)

The opportunity is for smart brands to capitalize on a growing opportunity while their competitors sink time and money into a social space that is increasingly all about Facebook, and increasingly pay-to-play.

What do you think? Are you having these hard conversations with leadership? How are you measuring your digital brand’s value?


Using the Cross Domain Rel=Canonical to Maximize the SEO Value of Cross-Posted Content – Whiteboard Friday

Posted by randfish

Same content, different domains? There’s a tag for that. Using rel=canonical to tell Google that similar or identical content exists on multiple domains has a number of clever applications. You can cross-post content across several domains that you own, you can benefit from others republishing your own content, rent or purchase content on other sites, and safely use third-party distribution networks like Medium to spread the word. Rand covers all the canonical bases in this not-to-be-missed edition of Whiteboard Friday.

https://fast.wistia.net/embed/iframe/qsmvv6edgb?seo=false&videoFoam=true



Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about the cross-domain rel=canonical tag. So we’ve talked about rel=canonical a little bit and how it can be used to take care of duplicate content issues, point Google to the right pages from potentially other pages that share similar or exactly the same content. But cross-domain rel=canonical is a unique and uniquely powerful tool that is designed to basically say, “You know what, Google? There is the same content on multiple different domains.”

So in this simplistic example, MyFriendSite.com/green-turtles contains this content that I said, “Sure, it’s totally fine for you, my friend, to republish, but I know I don’t want SEO issues. I know I don’t want duplicate content. I know I don’t want a problem where my friend’s site ends up outranking me, because maybe they have better links or other ranking signals, and I know that I would like any ranking credit, any link or authority signals that they accrue to actually come to my website.”

There’s a way that you can do this. Google introduced it back in 2009. It is the cross-domain rel=canonical. So essentially, in the header tag of the page, I can add this link, rel=canonical href — it’s a link tag, so there’s an href — to the place where I want the link or the canonical, in this case, to point to and then close the tag. Google will transfer over, this is an estimate, but roughly in the SEO world, we think it’s pretty similar to what you get in a 301 redirect. So something above 90% of the link authority and ranking signals will transfer from FriendSite.com to MySite.com.
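For reference, the tag being described is just a link element in the head of the republishing page, along the lines of <link rel="canonical" href="https://mysite.com/green-turtles" /> (a made-up URL). If several partners are republishing your content, it’s worth spot-checking that each copy really does point its canonical back at you. Here’s a minimal Python sketch of that check; the URLs are the fictional examples from this whiteboard, and it assumes the requests and beautifulsoup4 libraries are installed:

```python
# Spot-check that republished copies point rel=canonical back at the original.
# URLs are made-up examples; assumes `requests` and `beautifulsoup4` are installed.
import requests
from bs4 import BeautifulSoup

ORIGINAL = "https://mysite.com/green-turtles"
REPUBLISHED_COPIES = [
    "https://myfriendsite.com/green-turtles",
    "https://natureislit.com/turtles/p?id=679",
]

for url in REPUBLISHED_COPIES:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    canonical = None
    for tag in soup.find_all("link"):
        rel = tag.get("rel") or []
        if isinstance(rel, str):  # bs4 may return rel as a string or a list
            rel = rel.split()
        if "canonical" in rel:
            canonical = tag.get("href")
            break

    if canonical == ORIGINAL:
        print(f"{url}: OK, canonical points to the original")
    else:
        print(f"{url}: check this page (canonical found: {canonical})")
```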

So my green turtles page is going to be the one that Google will be more likely to rank. As this one accrues any links or other ranking signals, that authority, those links should transfer over to my page. That’s an ideal situation for a bunch of different things. I’ll talk about those in a sec.

Multiple domains and pages can point to any URL

Multiple domains and pages are totally cool to point to any URL. I can do this for FriendSite.com. I can also do this for TurtleDudes.com and LeatherbackFriends.net and SeaTees.com and NatureIsLit.com. All of them can contain this cross-domain rel=canonical pointing back to the site or the page that I want it to go to. This is a great way to potentially license content out there, give people republishing permissions without losing any of the SEO value.

A few things need to match:

I. The page content really does need to match

That includes things like text, images, if you’ve embedded videos, whatever you’ve got on there.

II. The headline

Ideally, this should match. It’s a little less crucial than the page content, but probably you want that headline to match.

III. Links (in content)

Those should also match. Check one, two, three: this is a good way to make sure that Google will count that rel=canonical correctly.

Things that don’t need to match:

I. The URL

No, it’s fine if the URLs are different. In this case, I’ve got NatureIsLit.com/turtles/p?id=679. That’s okay. It doesn’t need to be green-turtles. I can have a different URL structure on my site than they’ve got on theirs. Google is just fine with that.

II. The title of the piece

Many times the cross-domain rel=canonical is used with different page titles. So if, for example, CTs.com wants to publish the piece with a different title, that’s okay. I still generally recommend that the headlines stay the same, but okay to have different titles.

III. The navigation

IV. Site branding

So all the things around the content. If I’ve got my page here and I have like nav elements over here, nav elements down here, maybe a footer down here, a nice little logo up in the top left, that’s fine if those are totally different from the ones that are on these other pages cross-domain canonically. That stuff does not need to match. We’re really talking about the content inside the page that Google looks for.

Ways to use this protocol

Some great ways to use the cross-domain rel=canonical.

1. If you run multiple domains and want to cross-post content, choose which one should get the SEO benefits and rankings.

If you run multiple domains, for whatever reason, and you would like the benefit of being able to publish a single piece of content across multiples of those domains, but you know you don’t want to deal with a duplicate content issue and you know you’d prefer for one of these domains to be the one receiving the ranking signals, cross-domain rel=canonical is your friend. You can tell Google that Site A and Site C should not get credit for this content, but Site B should get all the credit.

The one caveat here: don’t try to split this across multiple canonical targets. So don’t say, “Oh, Site A, why don’t you rel=canonical to B, and Site C, why don’t you rel=canonical to D, and I’ll try and get two things ranked in the top.” Don’t do that. Make sure all of them point to one. That is the best way to make sure that Google respects the cross-domain rel=canonical properly.

2. If a publication wants to re-post your content on their domain, ask for it instead of (or in addition to) a link back.

Second, let’s say a publication reaches out to you. They’re like, “Wow. Hey, we really like this piece.” My wife, Geraldine, wrote a piece about Mario Batali’s sexual harassment apology letter and the cinnamon rolls recipe that he strangely included in this apology. She baked those and then wrote about it. It went quite viral, got a lot of shares from a ton of powerful and well-networked people and then a bunch of publications. The Guardian reached out. An Australian newspaper reached out, and they said, “Hey, we would like to republish your piece.” Geraldine talked to her agent, and they set up a price or whatever.

One of the ways that you can do this and benefit from it, not just from getting a link from The Guardian or some other newspaper, but is to say, “Hey, I will be happy to be included here. You don’t even have to give me, necessarily, if you don’t want to, author credit or link credit, but I do want that sweet, sweet rel=canonical.” This is a great way to maximize the SEO benefit of being posted on someone else’s site, because you’re not just receiving a single link. You’re receiving credit from all the links that that piece might generate.

Oops, I did that backwards. You want it to come from their site to your site. This is how you know Whiteboard Friday is done in one take.

3. Purchase/rent content from other sites without forcing them to remove the content from their domain.

Next, let’s say I am in the opposite situation. I’m the publisher. I see a piece of content that I love and I want to get that piece. So I might say, “Wow, that piece of content is terrific. It didn’t do as well as I thought it would do. I bet if we put it on our site and broadcast it with our audience, it would do incredibly well. Let’s reach out to the author of the piece and see if we can purchase or rent for a time period, say two years, for the next two years we want to put the cross-domain rel=canonical on your site and point it back to us and we want to host that content. After two years, you can have it back. You can own it again.”

Without forcing them to remove the content from their site, you’re saying: you, the publisher or author, can keep it on your site. We don’t mind. We’d just like this tag applied, and we’d like to be able to have republishing permissions on our website. Now you can get the SEO benefits of that piece of content, and they can, in exchange, get some money. So your site sends them some dollars, and their site sends you the rel=canonical and the ranking authority and the link equity and all those beautiful things.

4. Use Medium as a content distribution network without the drawback of duplicate content.

Number four, Medium. Medium is a great place to publish content. It has a wide network, people who really care about consuming content. Medium is a great distribution network with one challenge. If you post on Medium, people worry that they can’t post the same thing on their own site because you’ll be competing with Medium.com. It’s a very powerful domain. It tends to rank really well. So duplicate content is an issue, and potentially losing the rankings and the traffic that you would get from search and losing that to Medium is no fun.

But Medium has a beautiful thing. The cross-domain rel=canonical is built in to their import tool. So if you go to Medium.com/p/import and you are logged in to your Medium account, you can enter in their URL field the content that you’ve published on your own site. Medium will republish it on your account, and they will include the cross-domain rel=canonical back to you. Now, you can start thinking of Medium as essentially a distribution network without the penalties or problems of duplicate content issues. Really, really awesome tool. Really awesome that Medium is offering this. I hope it sticks around.

All right, everyone. I think you’re going to have some excellent additional ideas for the cross-domain rel=canonical and how you have used it. We would love you to share those in the comments below, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Reading Between the Lines: A 3-Step Guide to Reviewing Web Page Content

Posted by Jackie.Francis

In SEO, reviewing content is an unavoidable yet extremely important task. As the driving factor that brings people to a page, best practice dictates that we do what we can to ensure that the work we’ve invested hours and resources into creating remains impactful and relevant over time. This requires occasionally going back and re-evaluating our content to identify areas that can be improved.

That being said, if you’ve ever done a content review, you know how surprisingly challenging this is. A large variety of formats and topics alongside the challenge of defining “good” content makes it hard to pick out the core elements that matter. Without these universal focus areas, you may end up neglecting an element (e.g. tone of voice) in one instance but paying special attention to that same element in another.

Luckily there are certain characteristics — like good spelling, appealing layouts, and relevant keywords — that are universally associated with what we would consider “good” content. In this three-step guide, I’ll show you how to use these characteristics (or elements, as I like to call them) to define your target audience, measure the performance of your content using a scorecard, and assess your changes for quality assurance as part of a review process that can be applied to nearly all types of content across any industry.


Step 1: Know your audience

Arguably the most important step mentioned in this post, knowing your target reader will identify the details that should make up the foundation of your content. This includes insight into the reader’s intent, the ideal look and feel of the page, and the goals your content’s message should be trying to achieve.

To get to this point, however, you first need to answer these two questions:

  1. What does my target audience look like?
  2. Why are they reading my content?

What does my target audience look like?

The first question relies on general demographic information such as age, gender, education, and job title. This gives a face to the ideal audience member(s) and the kind of information that would best suit them. For example, if targeting a stay-at-home mother between the ages of 35 and 40 with two or more kids under the age of 5, we can guess that she has a busy daily schedule, travels frequently for errands, and constantly needs to stay vigilant over her younger children. So, a piece that is personable, quick, easy to read on the go, and includes inline imagery to reduce eye fatigue would be better received than something that is lengthy and requires a high level of focus.

Why are they reading my content?

Once you have a face to your reader, the second question must be answered to understand what that reader wants from your content and if your current product is effectively meeting those needs. For example, senior-level executives of mid- to large-sized companies may be reading to become better informed before making an important decision, to become more knowledgeable in their field, or to use the information they learn to teach others. Other questions you may want to consider asking:

  • Are they reading for leisure or work?
  • Would they want to share this with their friends on social media?
  • Where will they most likely be reading this? On the train? At home? Waiting in line at the store?
  • Are they comfortable with long blocks of text, or would inline images be best?
  • Do they prefer bite-sized information or are they comfortable with lengthy reports?

You can find the answers to these questions and collect valuable demographic and psychographic information by using a combination of internal resources, like sales scripts and surveys, and third-party audience insight tools such as Google Analytics and Facebook Audience Insights. With these results you should now have a comprehensive picture of your audience and can start identifying the parts of your content that can be improved.


Step 2: Tear apart your existing content

Now that you understand who your audience is, it’s time to get to the real work: assessing your existing content. This stage requires breaking everything apart to identify the components you should keep, change, or discard. However, this task can be extremely challenging because the performance of most components — such as tone of voice, design, and continuity — can’t simply be bucketed into binary categories like “good” or “bad.” Rather, they fall into a spectrum where the most reasonable level of improvement falls somewhere in the middle. You’ll see what I mean by this statement later on, but one of the most effective ways to evaluate and measure the degree of optimization needed for these components is to use a scorecard. Created by my colleague, Ben Estes, this straightforward, reusable, and easy to apply tool can help you objectively review the performance of your content.

Make a copy of the Content Review Grading Rubric

Note: The card sampled here, and the one I personally use for similar projects, is a slightly altered version of the original.

As you can see, the card is divided into two categories: Writing and Design. Listed under each category are the elements that are universally needed to create good content and that should be examined. Each element is assigned a grade on a scale from 1 to 5, with 1 being the worst score and 5 being the best.

To use, start by choosing a part of your page to look at first. Order doesn’t matter, so whether you choose to first check “spelling and grammar” or “continuity” is up to you. Next, assign it a score on a separate Excel sheet (or mark it directly on the rubric) based on its current performance. For example, if the copy has no spelling errors but some minor grammar issues, you would rank “spelling and grammar” as a four (4).

Finally, repeat this process until all elements are graded. Remember to stay impartial to give an honest assessment.

Once you’re done, look at each grade and see where it falls on the scale. Ideally each element should have a score of 4 or greater, although a grade of 5 should only be given out sparingly. Tying back to my spectrum comment from earlier, a 5 is exclusively reserved for top-level work: it’s something to strive for, but it will typically take more effort to achieve than it’s worth. In most instances, a grade of 4 is the highest reasonable goal to aim for.

A grade of 3 or below indicates an opportunity for improvement and that significant changes need to be made.

If working with multiple pieces of content at once, the grading system can also be used to help prioritize your workload. Just collect the average writing or design score and sort them in ascending/descending order. Pages with a lower average indicate poorer performance and should be prioritized over pages whose averages are higher.
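To illustrate that prioritization step, here’s a minimal Python sketch that averages each page’s grades and sorts the weakest pages to the top; the page names and scores are made up:

```python
# Prioritize content reviews by average scorecard grade (1-5 scale).
# Page names and grades below are made-up placeholders.
pages = {
    "/blog/post-a": {"spelling and grammar": 4, "tone of voice": 3,
                     "continuity": 2, "layout": 3},
    "/blog/post-b": {"spelling and grammar": 5, "tone of voice": 4,
                     "continuity": 4, "layout": 4},
    "/blog/post-c": {"spelling and grammar": 3, "tone of voice": 2,
                     "continuity": 3, "layout": 2},
}

averages = {page: sum(grades.values()) / len(grades)
            for page, grades in pages.items()}

# Lowest average first: the weakest pages get reviewed first.
for page, avg in sorted(averages.items(), key=lambda item: item[1]):
    print(f"{page}: average grade {avg:.1f}")
```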

Whether you choose to use this scorecard or make your own, what you review, the span of the grading scale, and the criteria for each grade should be adjusted to fit your specific needs and result in a tool that will help you honestly assess your content across multiple applications.

Don’t forget the keywords

With most areas of your content covered by the scorecard, the last element to check before moving to the editing stage is your keywords.

Before I catch any flak for this, I’m aware that the general rule of creating content is to do your keyword research first. But I’ve found that when it comes to reviews, evaluating keywords last feels more natural and makes the process a lot smoother. When first running through a page, you’re much more likely to notice spelling and design flaws before you pick up whether a keyword is used correctly — why not make note of those details first?

Depending on the outcomes stemming from the re-evaluation of your target audience and content performance review, you will notice one of two things about your currently targeted keywords:

  1. They have not been impacted by the outcomes of the prior analyses and do not need to be altered
  2. They no longer align with the goals of the page or needs of the audience and should be changed

In the first example, the keywords you originally target are still best suited for your content’s message and no additional research is needed. So, your only remaining task is to determine whether or not your keywords are effectively used throughout the page. This means assessing things like title tag, image alt attributes, URL, and copy.

In an attempt to stay on track, I won’t go into further detail on how to optimize keywords but if you want a little more insight, this post by Ken Lyons is a great resource.

If, however, your target keywords are no longer relevant to the goals of your content, before moving to the editing stage you’ll need to re-do your keyword research to identify the terms you should rank for. For insight into keyword research this chapter in Moz’s Beginner’s Guide to SEO is another invaluable resource.


Step 3: Evaluate your evaluation

At this point your initial review is complete and you should be ready to edit.

That’s right. Your initial review.

The interesting thing about assessing content is that it never really ends. As you make edits, you’ll tend to deviate more and more from your initial strategy. And while that’s not always a bad thing, you must continuously monitor these changes to ensure that you’re on the right track to creating a highly valued piece of content.

The best approach would be to reassess all your material when:

  • 50% of the edits are complete
  • 85% of the edits are complete
  • You have finished editing

At the 50% and 85% marks, keep the assessment quick and simple. Look through your revisions and ask the following questions:

  • Am I still addressing the needs of my target audience?
  • Are my target keywords properly integrated?
  • Am I using the right language and tone of voice?
  • Does it look like the information is structured correctly (hierarchically)?

If your answer is “Yes” to all four questions, then you’ve effectively made your changes and should proceed. For any question you answer “No,” go back and make the necessary corrections. The areas targeted here become more difficult to fix the closer you are to completion and ensuring they’re correct throughout this stage will save a lot of time and stress in the long run.

When you’ve finished and think you’re ready to publish, run one last comprehensive review to check the performance status of all related components. This means confirming you’ve properly addressed the needs of your audience, optimized your keywords, and improved the elements highlighted in the scorecard.


Moving forward

No two pieces of content are the same, but that doesn’t mean there aren’t important commonalities. Being able to identify these similarities and understand the role they play across all formats and topics will lead the way to creating your own review process for evaluating subjective material.

So, when you find yourself gearing up for your next project, give these steps a try and always keep the following in mind:

  1. Your audience is what makes or breaks you, so keep them happy
  2. Consistent quality is key! Ensure all components of your content are performing at their best
  3. Keep your keywords optimized and be prepared to do additional research if necessary
  4. Unplanned changes will happen. Just remember to remain observant so as to keep yourself on track


The 2018 Local SEO Forecast: 9 Predictions According to Mozzers

Posted by MiriamEllis

It’s February, and we’ve all dipped our toes into the shallow end of the 2018 pool. Today, let’s dive into the deeper waters of the year ahead, with local search marketing predictions from Moz’s Local SEO Subject Matter Expert, our Marketing Scientist, and our SEO & Content Architect. Miriam Ellis, Dr. Peter J. Meyers, and Britney Muller weigh in on what your brand should prepare for in the coming months in local.


WOMM, core SEO knowledge, and advice for brands both large and small

Miriam Ellis, Moz Associate & Local SEO SME

LSAs will highlight the value of Google-independence

Word-of-mouth marketing (WOMM) and loyalty initiatives will become increasingly critical to service area businesses (SABs) whose results are disrupted by Google’s Local Service Ads. SABs aren’t going to love having to “rent back” their customers from Google, so Google-independent lead channels will have enhanced value. That being said, the first small case study I’ve seen indicates that LSAs may be a winner over traditional AdWords in terms of cost and conversions.

Content will be the omni-channel answer

Content will grow in value, as it is the answer to everything coming our way: voice search, Google Posts, Google Questions & Answers, owner responses, and every stage of the sales funnel. Because of this, agencies which have formerly thought of themselves as strictly local SEO consultants will need to master the fundamentals of organic keyword research and link building, as well as structured data, to offer expert-level advice in the omni-channel environment. Increasingly, clients will need to become “the answer” to queries… and that answer will predominantly reside in content dev.

Retail may downsize but must remain physical

Retail is being turned on its head, with Amazon becoming the “everything store” and the triumphant return of old-school home delivery. Large brands failing to see profits in this new environment will increasingly downsize to the showroom scenario, significantly cutting costs, while also possibly growing sales as personally assisted consumers are dissuaded from store-and-cart abandonment, and upsold on tie-ins. Whether this will be an ultimate solution for shaky brands, I can’t say, but it matters to the local SEO industry because showrooms are, at least, physical locations and therefore eligible for all of the goodies of our traditional campaigns.

SMBs will hold the quality high card

For smaller local brands, emphasis on quality will be the most critical factor. Go for the customers who care about specific attributes (e.g. being truly local, made in the USA, handcrafted, luxury, green, superior value, etc.). Evaluating and perfecting every point of contact with the customer (from how phone calls are assisted, to how online local business data is managed, to who asks for and responds to reviews) matters tremendously. This past year, I’ve watched a taxi driver launch a delivery business on the side, grow to the point where he quit driving a cab, hire additional drivers, and rack up a profusion of 5-star, unbelievably positive reviews, all because his style of customer service is memorably awesome. Small local brands will have the nimbleness and hometown know-how to succeed when quality is what is being sold.


In-pack ads, in-SERP features, and direct-to-website traffic

Dr. Peter J. Meyers, Marketing Scientist at Moz

In-pack ads to increase

Google will get more aggressive about direct local advertising, and in-pack ads will expand. In 2018, I expect local pack ads will not only appear on more queries but will make the leap to desktop SERPs and possibly Google Home.

In-SERP features to grow

Targeted, local SERP features will also expand. Local Service Ads rolled out to more services and cities in 2017, and Google isn’t going to stop there. They’ve shown a clear willingness to create specialized content for both organic and local. For example, 2017 saw Google launch a custom travel portal and jobs portal on the “organic” side, and this trend is accelerating.

Direct-to-website traffic to decline

The push to keep local search traffic in Google properties (i.e. Maps) will continue. Over the past couple of years, we’ve seen local packs go from results that link directly to websites, to having a separate “Website” link, to local sites being buried 1–2 layers deep. In some cases, local sites are being almost completely supplanted by local Knowledge Panels, some of which (hotels being a good example) have incredibly rich feature sets. Google wants to deliver local data directly on Google, and direct traffic to local sites from search will continue to decline.


Real-world data and the importance of Google

Britney Muller, SEO & Content Architect at Moz

Relevance drawn from the real world

Real-world data! Google will leverage device and credit card data to get more accurate information on things like foot traffic, current gas prices, repeat customers, length of visits, gender-neutral bathrooms, type of customers, etc. As the most accurate source of business information to date, why wouldn’t they?

Google as one-stop shop

SERPs and Maps (assisted by local business listings) will continue to grow as a one-stop shop for local business information. Small business websites will still be important, but they’re more likely to serve as one data source, alongside more in-depth data like the above, than as the only place to get a business’s information.


Google as friend or foe? Looking at these expert predictions, that’s a question local businesses of all sizes will need to continue to ask in 2018. Perhaps the best answer is “neither.” Google represents opportunity for brands that know how to play the game well. Companies that put the consumer first are likely to stand strong, no matter how the nuances of digital marketing shift, and education will remain the key to mastery in the year ahead.

What do you think? Any hunches about the year ahead? Let us know in the comments.


New Research: 35% of Competitive Local Keywords Have Local Pack Ads

Posted by Dr-Pete

Over the past year, you may have spotted a new kind of Google ad on a local search. It looks something like this one (on a search for “oil change” from my Pixel phone in the Chicago suburbs):

These ads seem to appear primarily on mobile results, with some limited testing on desktop results. We’ve heard rumors about local pack ads as far back as 2016, but very few details. How prevalent are these ads, and how seriously should you be taking them?

11,000 SERPs: Quick summary

For this study, we decided to look at 110 keywords (in 11 categories) across 100 major US cities. We purposely focused on competitive keywords in large cities, assuming, based on our observations as searchers, that the prevalence rate for these ads was still pretty low. The 11 categories were as follows:

  • Apparel
  • Automotive
  • Consumer Goods
  • Finance
  • Fitness
  • Hospitality
  • Insurance
  • Legal
  • Medical
  • Services (Home)
  • Services (Other)

We purposely selected terms that were likely to have local pack results and looked for the presence of local packs and local pack ads. We collected these searches as a mobile user with a Samsung Galaxy 7 (a middle-ground choice between iOS and a “pure” Google phone).
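Mechanically, the prevalence figures below boil down to a simple ratio: of the SERPs that showed a local pack, how many also showed a pack ad? Here’s a minimal Python sketch of that tally, with a few made-up records standing in for the real crawl data (the field names are illustrative, not Moz’s actual schema):

```python
# Local pack ad prevalence: of the SERPs that show a local pack, what share
# also show a pack ad? The records below are made-up examples, not the
# study's actual crawl data.
from collections import defaultdict

serps = [
    {"keyword": "oil change", "category": "Automotive", "city": "Chicago",
     "has_local_pack": True, "has_pack_ad": True},
    {"keyword": "oil change", "category": "Automotive", "city": "Seattle",
     "has_local_pack": True, "has_pack_ad": False},
    {"keyword": "bankruptcy lawyer", "category": "Legal", "city": "Chicago",
     "has_local_pack": True, "has_pack_ad": True},
]

packs_by_category = defaultdict(int)
ads_by_category = defaultdict(int)
for serp in serps:
    if serp["has_local_pack"]:
        packs_by_category[serp["category"]] += 1
        if serp["has_pack_ad"]:
            ads_by_category[serp["category"]] += 1

for category in sorted(packs_by_category):
    prevalence = ads_by_category[category] / packs_by_category[category]
    print(f"{category}: {prevalence:.0%} of local packs had ads")
```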

Why 11 categories? Confession time – it was originally 10, and then I had the good sense to ask Darren Shaw about the list and realized I had completely left out insurance keywords. Thanks, Darren.

Finding #1: I was very wrong

I’ll be honest – I expected, from casual observations and the lack of chatter in the search community, that we’d see fewer than 5% of local packs with ads, and maybe even numbers in the 1% range.

Across our data set, roughly 35% of SERPs with local packs had ads.

Across industry categories, the prevalence of pack ads ranged wildly, from 10% to 64%:

For the 110 individual keyword phrases in our study, the presence of local ads ranged from 0% to 96%. Here are the keywords with >=90% local pack ad prevalence:

  • “car insurance” (90%)
  • “auto glass shop” (91%)
  • “bankruptcy lawyer” (91%)
  • “storage” (92%)
  • “oil change” (95%)
  • “mattress sale” (95%)
  • “personal injury attorney” (96%)

There was no discernible correlation between the presence of pack ads and city size. Since our study was limited to the top 100 US cities by population, though, this may simply be due to a restricted data range.

Finding #2: One is the magic number

Every local pack with ads in our study had one and only one ad. This ad appeared in addition to regular pack listings. In our data set, 99.7% of local packs had three regular/organic listings, and the rest had two listings (which can happen with or without ads).

Finding #3: Pack ads land on Google

Despite their appearance, local pack ads are more like regular local pack results than AdWords ads, in that they link directly to a local panel (a rich Google result). On my Pixel phone, the Jiffy Lube ad at the beginning of this post links to this result:

This is not an anomaly: 100% of the 3,768 local pack ads in our study linked back to Google. This follows a long trend of local pack results linking back to Google entities, including the gradual disappearance of the “Website” link in the local pack.

Conclusion: It’s time to get serious

If you’re in a competitive local vertical, it’s time to take local pack ads seriously. Your visitors are probably seeing them more often than you realize. Currently, local pack ads are an extension of AdWords, and require you to set up location extensions.

It’s also more important than ever to get your Google My Business listing in order and make sure that all of your information is up to date. It may be frustrating to lose the direct click to your website, but a strong local business panel can drive phone calls and foot traffic, and provide valuable information to potential customers.

Like every Google change, we ultimately have to put aside whether we like or dislike it and make the tough choices. With more than one-third of local packs across the competitive keywords in our data set showing ads, it’s time to get your head out of the sand and get serious.
