
The Guide to Building Linked Unstructured Citations for Local SEO

Posted by MiriamEllis

This article was written in partnership with Kameron Jenkins. You can enjoy her previous articles here.


When you’ve accomplished step one in your local search marketing, how do you take step two?

You already know that any local business you market has to have the table stakes of accurate structured citations on major platforms like Facebook, Yelp, Infogroup, Acxiom, and YP.

But what can local SEO practitioners do once they’ve got these formal listings created and a system in place for managing them? Our customers often come to us once they’ve gotten well underway with Moz Local and ask, “What’s next? What can I do to move the needle?” This blog post will give you the actionable strategy and a complete step-by-step tutorial to answer this important question.

A quick refresher on citations

Listings on formal directories are called “structured citations.” When other types of platforms (like online news, blogs, best-of lists, etc.) reference a local business’ complete or partial contact information, that’s called an “unstructured citation.” And the best unstructured citations of all include links, of course!

For example, the San Francisco branch of a natural foods grocery store gets a linked unstructured citation from a major medical center in their city via a blog post about stocking a pantry with the right ingredients for healthier eating. Google and consumers encounter this reference and understand that trust and authority are being conveyed and earned.

The more often websites that are relevant to your location or industry link to you within their own content, the better your chances of ranking well in Google’s organic and local search engine results.

Why linked unstructured citations are growing in importance right now

Link building is as old as organic SEO. Structured citation building is as old as local SEO. Both practices have long sought to influence Google rankings. But a close read of the local search marketing community these days points up an increasing emphasis on the value of unstructured citations. In fact, local links were one of the top three takeaways from the 2018 Local Search Ranking Factors survey. Why is this?

  1. Google has become the dominant force in local consumer experiences, keeping as many actions as possible within their own interface instead of sending searchers to company websites. Because links influence rank within that interface, most local businesses will need to move beyond traditional structured citations to impress Google with mentions on a diverse set of relevant websites. While structured citations are rightly referred to as “table stakes” for all local businesses, it’s the unstructured ones that can be competitive difference-makers in tough markets.
  2. Meanwhile, Google is increasingly monetizing local search results. A prime example of this is their Local Service Ads (LSA) program which acts as lead gen between Google and service area businesses like plumbing and housekeeping companies. Savvy local brands (including brick-and-mortar models) will see the way the wind is blowing with this and work to form non-Google-dependent sources of traffic and lead generation. A good linked unstructured citation on a highly relevant publication can drive business without having to pay Google a dime.

Your goal with linked unstructured citations is to build your community footprint and your authority simultaneously. All you need is the right tools for the research phase!

Fishing for opportunities with Link Intersect

For the sake of this tutorial, let’s choose at random a small B&B in Albuquerque — Bottger.com — as our hypothetical client. Let’s say that the innkeeper wants to know how the big Tribal resort casinos are earning publicity and links, in the hopes of finding opportunities for a smaller hospitality business, too. (Note that these aren’t absolutely direct competitors, but they share a city and an overall industry.)

We’re going to use Moz’s Link Intersect tool to do this research for Bottger Mansion. This tool could help Bottger uncover all kinds of links and unstructured linked citation opportunities, depending on how it’s used. For example, the tool could surface:

  • Links that direct or near-direct competitors have, but that Bottger doesn’t
  • Locally relevant links from domains/pages about Bottger’s locale
  • Industry-relevant links from domains/pages about the hospitality industry

Step 1: Find the “big fish”

A client may already know who the “big fish” in their community are, or you can cast a net by identifying popular local events and seeing which businesses sponsor them. Sponsorships can be pricey, depending on the event, so if a local company sponsors a big event, it’s an indication that they’re a larger enterprise with the budget to pursue a wide array of creative PR ideas. These larger enterprises can serve as models for smaller businesses to emulate.

In our case study, we know that Bottger is located in Albuquerque, so we decided to locate sponsors of the famous Albuquerque International Balloon Fiesta. Right away, we spotted two lavish Albuquerque resort-casinos — Isleta and Sandia. These are the “big fish” we want our smaller client to look to for inspiration.

Step 2: Input domains in Link Intersect

We’re going to compare Bottger’s domain to Isleta and Sandia’s domains. In Moz Pro, navigate to “Link Explorer” and then select “Link Intersect” from the left navigation. Input your domain in the top and the domains you want to mine link ideas from in the fields beneath, as depicted below.


Next to Bottger’s domain, we’ve selected “root domain,” as that will surface every competitor link that doesn’t already point somewhere on Bottger’s site. We’re also going to select “root domain” on the resort domains, so we can see all of their backlinks, rather than just links to particular pages on their sites.

Moz’s Link Intersect tool will let you compare your site with up to five competitors. It’s totally up to you how many sites you want to evaluate at once. If you’re just getting started with link building, you may want to begin with just one domain, as this should yield plenty of link opportunities. If you’ve already been doing some link building, have more time to dedicate to it, or would just generally rather have more options to work with, go ahead and put in multiple domains to compare.

Step 3: Find link opportunities

Once you’ve input your domain and your competitors’ domains, click “Find Opportunities.” That will yield a list of sites that link to your competitors, but do not link to you.

In this example, we’re comparing our client’s domain against two other domains: A (Isleta) and B (Sandia). In the “Sites that intersect” column, you will see whether Site A has the link, Site B has it, or if they both have it.

Step 4: The link selection process

Now that we have a list of link ideas from Isleta and Sandia’s backlink profiles, it’s time to decide which ones might yield good opportunities for our B&B. That’s right — just because something is in a competitor’s link profile doesn’t necessarily mean you want it!

View the referring pages

The first step is to drill down and get more detail about links the big resorts have. Select the arrow to expand this section and view the exact page the link is coming from.

In this example, both Sandia and Isleta have links from the root domain marriott.com. By using the “expand” feature, we can see the exact pages those links are located on.

Identify follow or no-follow

You can use the MozBar Chrome plugin to see whether your competitor’s link is followed or no-followed (a no-followed link includes rel="nofollow" in its HTML anchor tag). Since only followed links pass authority, you may want to prioritize those, but no-followed links also have value: they can send traffic to your site, and they can get your content noticed by others who may eventually link to you with a followed link.

Select the MozBar icon from your browser and click the pencil icon. If you want to see Followed links, select “Followed” and the MozBar will highlight these links on the page in green. To find No-Followed links, click “No-Followed” and MozBar will highlight these links on the page in pink.
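If you’d rather check all of a page’s links at once, the same follow/no-follow test can be scripted. Here’s a minimal sketch in Python using the third-party requests and BeautifulSoup libraries; the URL is just a placeholder for a referring page you found in Link Intersect:

    import requests
    from bs4 import BeautifulSoup

    # Placeholder URL: swap in the referring page you want to inspect.
    page = requests.get("https://www.example.com/things-to-do", timeout=10)
    soup = BeautifulSoup(page.text, "html.parser")

    for link in soup.find_all("a", href=True):
        rel = link.get("rel") or []
        status = "no-followed" if "nofollow" in rel else "followed"
        print(status + ": " + link["href"])

Either way, the goal is the same: get a quick read on which of a competitor’s links actually pass authority.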

Common types of links you’ll see in the profiles of local business websites

If this is your first foray into link building for local businesses, you may be unfamiliar with the types of sites you’ll see in Link Intersect. While no two link profiles are exactly the same, many local businesses use similar methods for building links, so there are some common categories to be aware of. Knowing these will help you decipher the results Link Intersect will show you.

Types of links and what you can do with them:

Press releases

Press release sites like PRWeb.com and PRNewswire.com are fairly common among local businesses that want to spread the word about their initiatives. Whether someone at the business won an award or they started a new community outreach program, local businesses often pay companies like PRWeb.com to distribute this news on their platform and to their partners. These are no-followed links (they don’t pass link authority, aka “SEO value”), but they can offer valuable traffic and could even get picked up by sites that do link with a followed link.

If your competitor is utilizing press releases, you may want to consider distributing your newsworthy information this way!

Structured citations / directories

One of the primary types of domains you’ll see in a local business’ backlink profile is directories — structured citation websites like yellowpages.com that list a business’ name, address, and phone number (NAP) with a link back to the business’ website. Like press releases, these links are often no-followed because they’re self-created rather than editorially given. However, having consistent and accurate citations across major directory websites is a key foundational step in local search performance.

If you see these types of sites in Link Intersect, it may indicate your need for a listings management solution like Moz Local that can ensure your NAP is accurate and available across major directories. Typically, you’ll want to have these table stakes before focusing on unstructured linked citations.

News coverage

Another favorite among local businesses is local media coverage (or just media coverage in general — it’s not always local). HARO (Help a Reporter Out) is a popular service for connecting journalists to subject matter experts who may be valuable sources for their articles. The journalists will typically link your quote back to your website. Aside from services like HARO, local businesses would do well to cultivate media contacts, such as relationships with local news correspondents. As news surfaces, they’ll start reaching out to you for comment!

If you see news coverage in your competitor’s backlink profile, you can get ideas of what types of publications want content and information that you can provide.

Local / industry coverage

Blogs, hobby sites, DIY sites, and other platforms can feature content that depicts city life or interest in a topic. For example, a chef might author a popular blog covering their dining experiences in San Francisco. For a local restaurant, being cited by this publication could be valuable.

If you see popular local or industry sites in your competitor’s backlink profile, it’s a good signal of opportunity for your business to build a relationship with the authors in hopes of gaining links.

Trade organizations

Most local businesses are affiliated with some type of governing/regulating body, trade organization, award organization, etc. Many of these organizations have websites themselves, and they often list the businesses they’re affiliated with.

If your competitor is involved with an organization, that means your business is likely suited to be involved as well! Use these links to get ideas of which organizations to join.

Community organizations

Community organizations are a great local validator for search engines, and many local businesses have taken notice. You’ll likely find these types of organizations’ websites, such as Chamber of Commerce sites or the local YMCA, in your competitor’s backlink profile.

As a local business, your competitors are in the same locale as you, so take note of these community organizations and consider joining them. You’ll not only get the benefit of better community involvement, but you can get a link out of it too!

Sponsorships / event participation

Local businesses can sponsor, donate to, host or participate in community events, teams, and other cherished local resources, which can lead to both online and offline publicity.

Local businesses can earn great links from online press surrounding these groups and happenings. If an event/team page highlights you, but doesn’t actually link to benefactors/participants, don’t be shy about politely requesting a link.

Scholarships / .edu sites

A popular strategy used by many local businesses and non-local businesses alike is scholarship link building. Businesses figured out that if they offered a scholarship, they could get a link back to their site on education websites, such as .edu domains. Everyone seemed to catch on — so much so that many schools stopped featuring these scholarships on their site. It’s also important to note that .edu domains don’t inherently have more value than domains on any other TLD.

If your business wants to offer a scholarship, that is a great thing! We encourage you to pursue this for the benefit it could offer students, rather than primarily for the purpose of gaining links. Scholarship link building has become very saturated, and could be a strategy with diminishing returns, so don’t put all your eggs in this basket, and do it first and foremost for students instead of links.

Other businesses

Businesses may sometimes partner with each other for mutually beneficial link opportunities. Co-marketing opportunities that are a byproduct of genuine relationships can present valuable link opportunities, but link exchanges are against Google’s quality guidelines.

Stay away from “you link to me, I’ll link to you” opportunities as Google can see it as an attempt to manipulate your site’s ranking in search, but don’t be afraid to pursue genuine connections with other businesses that can turn into linking opportunities.

Spam

Just because your competitor has that link doesn’t mean you want it too! In Link Intersect, pay attention to the domain’s Spam Score and DA. A high spam score and/or low DA can indicate that the link wouldn’t be valuable for your site, and may even harm it.

Also watch out for links generated from comments. If your competitor has links in their backlink profile coming from comments, you can safely ignore these as they do not present real opportunities for earning links that will move the needle in the right direction.

Now that you’re familiar with popular types of local backlinks and what you can do with them, let’s actually dig into Isleta and Sandia’s backlinks to see which might be good prospects for us.

Step 5: Imitation is the sincerest form of flattery

Both the Albuquerque Marriott and Hilton Garden Inn link to Isleta and Sandia on their “Local Things to Do” pages. This could be a great prospect for Bottger! In many cases, “things to do” pages will include lists of local restaurants, historic sites, attractions, shops, and more. Note how their addresses are included on the following pages, making them powerful linked unstructured citations. Bottger hosts fancy tea parties in a lovely setting, which could be a fun thing for tourists to do.

Isleta and Sandia also have links from a wedding website. If Bottger uses their property as a wedding venue, offers special wedding or engagement packages, or something similar, this could be a great prospect as well.

Link Intersect also yielded links to various travel guide websites. There are plenty of links on websites like these to local attractions. In the following example, you can see an Albuquerque travel guide that’s broken up by category, “hotels” being one of them:

Isleta and Sandia also have been featured in the Albuquerque Journal. In this example, a local reporter covered news that Isleta was opening expanded bingo and poker rooms. This seems to be a journalist who covers local businesses, so she could be a great connection to make!

Many other links in Isleta and Sandia’s backlink profiles came from sources like events websites, since these resorts are large enough to serve as the venue for major events like concerts and MMA matches. Although Bottger isn’t large enough to host an event of that magnitude, it could spark good ideas for link building opportunities in the future. Maybe Bottger could host a small community tea tasting event featuring locally sourced herbal teas and get in touch with a local reporter to promote it. Even competitor links that you can’t directly pursue can spark your creativity for related link building opportunities.

And let’s not forget how we found out about Isleta and Sandia in the first place: the Albuquerque International Balloon Fiesta! Event sponsors are featured on an “official sponsors” page with links to their websites. This is a classic, locally relevant opportunity for any Albuquerque business.

Step 6: Compile your link prospects in Link Tracking Lists

If you’re thinking, “This sounds great, but it also sounds like a lot of work. How am I ever going to keep track of all this?” — we’ve got you covered!

Moz Pro’s “Link Tracking Lists” was built for just this purpose.

In Link Intersect, you’ll see little check boxes next to all your competitors’ links. When you find one you want to target, check the box. When you’re done going through all the links and have checked the boxes next to the domains you want to pursue, click “Add to Link Tracking List” at the top right.

Since we’ve never done link building for Bottger before, we’re going to select “Create New List” from the dropdown, and label it something descriptive.

Make sure to put your client’s domain in the “target URL” field. For Step 3, since we’ve just selected the links we want to track from Link Intersect, those will already be populated in this field, so no further action is needed other than to click “Save.”

We’ll come back to Link Tracking Lists when we talk about outreach, but for now, all you need to know is that you can add the desirable competitor links (in our case, links from Isleta and Sandia) to Link Tracking lists straight from Link Intersect, making it easy to manage your data.

Step 7: Find out how to connect with your link prospects

Now it’s time to connect the dots: how do you go from knowing about your competitor’s links to getting those types of links for yourself?

There are three main ways you can get unstructured linked citations to your local business’ website, and those categories will dictate the strategy you need to take to secure each opportunity for yourself.

  1. Self-created: Self-created links are like voting for yourself, so sites that accept these types of submissions, like Yelp.com, will NoFollow the link to your business’ website. Visitors are still referred to your website through that link, but the link doesn’t pass authority from Yelp.com to your domain. You should only get authority from a website if they link to you on their own (what Google calls “editorially placed” links). Neither NoFollow nor Follow links are inherently good or bad on their own. They are just intended for different purposes, and it’s the misuse of followed links that can get you in trouble with Google. We’ll talk more about that in the later section on the risks of ignoring Google’s “link schemes” guidelines.
  2. Prompted by outreach: In many cases, people won’t know about your content until you tell them. These links are editorially placed by the site owner (not self-created), but the site owner was only made aware of your content because you reached out to them.
  3. Organically earned: Sometimes, you get links even without asking for them. If you have a popular piece of content on your site that receives lots of traffic, for example, people may link to that on their own because they find it valuable.

Since this tutorial is about proactively pursuing link opportunities, we’re going to focus on unstructured linked citation types one and two.

Articles

If your competitor has been featured in an article from, say, a local journalist or blogger, then your outreach will be focused on making a connection with that writer or publication for future link opportunities, rather than getting the exact link your competitor has. That’s because the article has already been written, so it’s unlikely that the writer will go back and edit their story just to add your link.

The one exception to this rule would be if the article links to your competitor, but your competitor’s link is now broken. In this scenario, you could reach out to the writer and say something like, “Hey! I notice in your article [article title] you link to [competitor’s link], but that link doesn’t seem to be working. I have similar content on my website [your URL]. If you find it valuable, please feel free to use it as a replacement for that broken link!”

Sometimes the contact information of the writer will be right next to the article itself.

If there’s no email address or contact form in the writer’s bio, you can usually find a link to one of their social media accounts, like Twitter, and you can connect with them there via a public or direct message. If you live in a small, tight-knit community, you may even be able to meet with the author in person.

Press releases

If you notice your competitors are issuing a lot of press releases and you want to try that out for yourself, you’ll likely need to sign up for an account, as these are primarily self-serve platforms. Most quality press release sites charge per release, and the price can differ depending on length.

Citations / directories

You’ll either want to sign up for a citation service like Moz Local that distributes your data to these types of listings programmatically, or if you do it manually, you’ll want to find the link to create your listings. Please note that your business may already be on the directory even if you haven’t set up a profile. Before creating a new listing, search for your business name and its variants, your phone number, and current and former addresses to see if there are existing listings you can claim and update.

Business websites

Most businesses will make it easy to contact them. If you’re trying to contact another business to propose a co-marketing opportunity, look in their footer (the very bottom of the website). If there’s no contact information there, search for a “Contact Us” or “About” page. You may not find an email address, but you may be able to find a contact form or phone number. Below is an example from Albuquerque Little Theater, where they have contact information on the right and advertising information in the top navigation for businesses that are interested in taking out ads in their printed show programs. That’s not an unstructured linked citation, but it’s a great way to get your business known in the community!

Organizations

Most organizations will make it easy for those who want to join, unless they are more exclusive or invitation-only. In the event that you do wish to get involved in an invitation-only organization that has no public-facing contact information, try viewing a member list and seeing if there’s anyone you know. Or maybe you know someone who can introduce you to one of the members. Genuine connections are key for this type of organization.

Step 8: Writing a good outreach email (for unstructured linked citations requiring outreach)

Outreach emails are necessary when the link opportunity you’re pursuing isn’t a link you could create yourself, or if the link source is one where you can’t make face-to-face contact with decision-makers. One of the most important questions you should be asking yourself for these opportunities is, “Why would this website link to me?”

Here’s how Bottger might go about sending an outreach email:

Greeting that matches the nature of the outreach target

“Hey Jill!” might be fine when reaching out to the author of a blog, while “Hello Ms. Smith” might be better for more professional outreach.

Introduction

Give a brief summary of who you are, what you do, and your interest in contacting them. For example: “I work with Bottger Mansion, a historic Bed & Breakfast in Old Town Albuquerque. I found your page about Albuquerque activities — you’ve really captured a lot of what Albuquerque has to offer!”

The ask, and the value add

This is where you’ll actually ask for the link. It’s a good idea here to add value. Don’t just ask for something; offer to give something back!

To continue the same example: “As long-time residents of Old Town, we’d love to provide you with a comprehensive list of activities in the city’s historic district! We feel an Old Town Activities list would be a great addition to your page. Bottger Mansion regularly hosts high tea, for example, which we’d love to let more people know about with a spot on your list!”

Close

Wish them well, thank them for their time, and sign off. Make sure that it’s easy for them to find information about you by including your full name, title, organization, and website/social links in your email signature.

Don’t be afraid to get on the phone, either! Hearing your voice can add a human element to the outreach attempt and offer a better conversion rate than a more impersonal email (we all get so many of those a day that ones from people we don’t know are easy to ignore).

And remember that local businesses have a particular advantage in accruing unstructured linked citations. Lively participation in the life of your community can continuously introduce you to decision-makers at popular local publications, paving the way towards neighborly outreach on your part. Learn to see the opportunities and think of ways your business can add value to the content that is being written about your town or city.

Step 9: Tracking your wins

Next-to-last, we’re going to jump back to Link Tracking Lists for a second, because that’s going to come in extremely handy here. Remember when we created the list with Sandia and Isleta’s links that we were interested in pursuing? Those will now show up when we go to Moz Pro > Link Explorer > Link Tracking Lists.

Every time Bottger successfully secures a link that they’ve added to their Link Tracking List, the red X in “Links to target URL?” column will turn blue, indicating that the site links to Bottger’s root domain. If we were pursuing links to individual pages, and a link prospect linked to our target page, the red X would turn green.

Another handy feature is the “Notes” dropdown. This allows you to keep track of your outreach attempts, which can be one of the trickiest parts about link building!

Avoiding the bad fish: Some words of caution before you get started

Before starting this process for yourself, familiarize yourself with these four risks so that your fishing trip doesn’t result in a basket of bad catches that could waste your resources or get your website penalized.

1. Risks of a “copy only” strategy

Link Intersect can be amazingly helpful for discovering new, relevant link opportunities for your local business, but link builders beware. If all you ever do is copy your competitors, the most you’ll ever achieve is becoming the second-best version of them. Use this method to keep tabs on strategies your competition is using, and even use it to spark your own creativity, but avoid copying everything your competitors do, and nothing else. Why be the second-best version of your competition when you can be the best version of yourself?

2. Risks of a “blindly follow” strategy

Comparing your site’s backlink profile with your direct competitors’ backlink profiles will return a list of links that they have and you don’t, but don’t use Link Intersect results as an exact checklist of links to pursue. Your competitors might have bad backlinks in their profile. For example, avoid pursuing opportunities from domains with a high Spam Score or low Domain or Page Authority (DA/PA). Learn more about how to evaluate sites by their Spam Score or DA/PA.

They might also have great backlinks that aren’t the right opportunity for your business, and those should be avoided too! Do you remember Isleta and Sandia’s links for events like MMA matches? If Bottger were to blindly take those resorts’ link profiles as directives, they might think they have to host a fight at their B&B, too!

Take what you find with a grain of salt. Evaluate every link opportunity on its own merit, rather than deeming it a good opportunity simply because your competitor has it.

3. Risks of an “apples to oranges” strategy

Choose the domains and pages you want to compare yourself against wisely. As a small local B&B, Bottger wouldn’t want to compare their backlink profile to that of Wikipedia or The New York Times, for example. Those sites are popular, but not relevant in any way to the types of unstructured linked citations Bottger would want to pursue, such as links that are locally relevant or industry-relevant.

In other words, just because a site is popular doesn’t mean it will yield relevant unstructured linked citation opportunities for you. Here in this tutorial, we’ve outlined one potential use-case for Link Intersect: finding unstructured linked citations your local business competitors have. However, this is not the only use for Link Intersect. Instead of comparing your site against competitors or near-competitors, you could compare it against other relevant sites, such as locally relevant domains about your area or industry-relevant domains about your field.

If you know what types of links you’re trying to find, choosing sites to evaluate against your own should be a lot easier, and yield more relevant opportunities.

4. Risks of ignoring Google’s “link schemes” guidelines

If you’ve never embarked on link building before, we encourage you to read through Google’s quality guidelines for webmasters, specifically its section on “Link schemes.” If you were to distill those link guidelines down into a single principle, it would be: don’t create links for the purpose of manipulating your site’s ranking in Google search. That’s right. Google doesn’t want anyone embarking on any marketing initiatives solely for the purpose of improving their ranking. Google wants links to be the natural byproduct of the quality work you’re doing for your audience. Google can penalize sites that participate in activities such as:

  • Buying links that pass PageRank (“followed” links)
  • Excessive “you link to me and I’ll link to you” exchanges
  • Self-created followed links that weren’t editorially placed by the site owner

This underscores that the activities that are just good business, like being involved in the local community, are also the ones that can produce the links Google likes. Site owners might need a little nudge, which is why we’ve included a section on outreach, but that doesn’t mean the links are unnatural. Unstructured linked citations should be a byproduct of the good work local businesses are doing in their communities.

In conclusion

At Moz, we’re strong believers in authenticity, and there is no better pond for building meaningful marketing relationships than the local one. Focusing on unstructured linked citations can be viewed as a prompt to grow your community relationships — with journalists, bloggers, event hosts, business associations, and customers. It’s a chance for a real-world fishing trip that can reel in a basket of publicity for your local brand beyond what money can buy. Your genuine desire to serve and build community will stand you in good stead for the long haul.


The Advanced Guide to Keyword Clustering

Posted by tomcasano

If your goal is to grow your organic traffic, you have to think about SEO in terms of “product/market fit.”

Keyword research is the “market” (what users are actually searching for) and content is the “product” (what users are consuming). The “fit” is optimization.

To grow your organic traffic, you need your content to mirror the reality of what users are actually searching for. Your content planning and creation, keyword mapping, and optimization should all align with the market. This is one of the best ways to grow your organic traffic.

Why bother with keyword grouping?

One web page can rank for multiple keywords. So why aren’t we hyper-focused on planning and optimizing content that targets dozens of similar and related keywords?

Why target only one keyword with one piece of content when you can target 20?

The impact of keyword clustering on acquiring more organic traffic is not only underrated, it is largely ignored. In this guide, I’ll share the proprietary process we’ve pioneered for keyword grouping so you can not only do it yourself, but also maximize the number of keywords your amazing content can rank for.

Here’s a real-world example of a handful of the top keywords that this piece of content is ranking for. The full list is over 1,000 keywords.

17 different keywords one page is ranking for

Why should you care?

It’d be foolish to focus on only one keyword, as you’d lose out on 90%+ of the opportunity.

Here’s one of my favorite examples of all of the keywords that one piece of content could potentially target:

List of ~100 keywords one page ranks for

Let’s dive in!

Part 1: Keyword collection

Before we start grouping keywords into clusters, we first need a dataset of keywords to group from.

In essence, our job in this initial phase is to find every possible keyword. In the process of doing so, we’ll also be inadvertently getting many irrelevant keywords (thank you, Keyword Planner). However, it’s better to have many relevant and long-tail keywords (and the ability to filter out the irrelevant ones) than to only have a limited pool of keywords to target.

For any client project, I typically say that we’ll collect anywhere from 1,000 to 6,000 keywords. But truth be told, we’ve sometimes found 10,000+ keywords, and sometimes (in the instance of a local, niche client), we’ve found fewer than 1,000.

I recommend collecting keywords from about 8–12 different sources. These sources include:

  1. Your competitors
  2. Third-party data tools (Moz, Ahrefs, SEMrush, AnswerThePublic, etc.)
  3. Your existing data in Google Search Console/Google Analytics
  4. Brainstorming your own ideas and checking against them
  5. Mashing up keyword combinations
  6. Autocomplete suggestions and “Searches related to” from Google

There’s no shortage of sources for keyword collection, and more keyword research tools exist now than ever did before. Our goal here is to be so extensive that we never have to go back and “find more keywords” in the future — unless, of course, there’s a new topic we are targeting.

The prequel to this guide will expand upon keyword collection in depth. For now, let’s assume that you’ve spent a few hours collecting a long list of keywords, you have removed the duplicates, and you have semi-reliable search volume data.

Part 2: Term analysis

Now that you have an unmanageable list of 1,000+ keywords, let’s turn it into something useful.

We begin with term analysis. What the heck does that mean?

We break each keyword apart into its component terms, so we can see which terms are the most frequently occurring.

For example, the keyword “best natural protein powder” is made up of four terms: “best,” “natural,” “protein,” and “powder.” Once we break all of the keywords into their component parts, we can more readily analyze which terms recur the most in our keyword dataset.

Here’s a sampling of 3 keywords:

  • best natural protein powder
  • most powerful natural anti inflammatory
  • how to make natural deodorant

Take a closer look, and you’ll notice that the term “natural” occurs in all three of these keywords. If this term is occurring very frequently throughout our long list of keywords, it’ll be highly important when we start grouping our keywords.

You will need a word frequency counter to give you this insight. The ultimate free tool for this is WriteWords’ Word Frequency Counter. It’s magical.

Paste in your list of keywords, click submit, and you’ll get something like this:

List of keywords and how frequently they occur

Copy and paste your list of recurring terms into a spreadsheet. You can safely remove prepositions and stop words like “is,” “for,” and “to.”

You don’t always get the most value by just looking at individual terms. Sometimes a two-word or three-word phrase gives you insights you wouldn’t have otherwise. In this example, you see the terms “milk” and “almond” appearing, but it turns out that this is actually part of the phrase “almond milk.”

To gather these insights, use the Phrase Frequency Counter from WriteWords and repeat the process for phrases that have two, three, four, five, and six terms in them. Paste all of this data into your spreadsheet too.
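If you’d rather script this step than paste lists into web tools, here’s a minimal sketch in Python that produces the same term and phrase counts; the input filename is hypothetical, and it assumes one keyword per line:

    from collections import Counter

    # Hypothetical input file: one keyword per line.
    with open("keywords.txt") as f:
        keywords = [line.strip().lower() for line in f if line.strip()]

    def ngrams(words, n):
        return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

    counts = Counter()
    for kw in keywords:
        words = kw.split()
        for n in range(1, 7):  # one- through six-word phrases
            counts.update(ngrams(words, n))

    for phrase, freq in counts.most_common(50):
        print(freq, phrase)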

Multi-word phrases naturally occur less often than single terms, so when a two-word phrase recurs anywhere near as frequently as a one-word term, that’s an indicator of its significance. To account for this, I use the COUNTA function in Google Sheets to show me the number of terms in a phrase:

=COUNTA(SPLIT(B2," "))

Now we can look at our keyword data with a second dimension: not only the number of times a term or phrase occurs, but also how many words are in that phrase.

Finally, to give more weighting to phrases that recur less frequently but have more terms in them, I put an exponent on the number of terms with a basic formula:

=(C4^2)*A4

In other words, take the number of terms and raise it to a power, and then multiply that by the frequency of its occurrence. All this does is give more weighting to the fact that a two-word phrase that occurs less frequently is still more important than a one-word phrase that might occur more frequently.
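For those scripting along, here’s a small sketch of the same weighting in Python, with the exponent exposed as a parameter (the sample counts are made up):

    # Made-up sample counts; in practice, use the Counter from the sketch above.
    counts = {"natural": 120, "almond milk": 40, "what is the best": 15}

    def weighted_score(phrase, freq, power=2):
        # (number of terms raised to a power) * frequency, mirroring =(C4^2)*A4
        return (len(phrase.split()) ** power) * freq

    for phrase, freq in sorted(counts.items(),
                               key=lambda kv: weighted_score(*kv),
                               reverse=True):
        print(weighted_score(phrase, freq), phrase, freq)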

As I never know just the right power to raise it to, I test several and keep re-sorting the sheet to try to find the most important terms and phrases in the sheet.

Spreadsheet of keywords and their weighted importance

When you look at this now, you can already see patterns start to emerge and you’re already beginning to understand your searchers better.

In this example dataset, we are going from a list of 10k+ keywords to an analysis of terms and phrases to understand what people are really asking. For example, “what is the best” and “where can i buy” are phrases we can absolutely understand searchers using.

I mark off the important terms or phrases. I try to keep this number to under 50 and to a maximum of around 75; otherwise, grouping will get hairy in Part 5.

Part 3: Hot words

What are hot words?

Hot words are the terms or phrases from that last section that we have deemed to be the most important. We’ve explained hot words in greater depth here.

Why are hot words important?

We explain:

This exercise provides us with a handful of the most relevant and important terms and phrases for traffic and relevancy, which can then be used to create the best content strategies — content that will rank highly and, in turn, help us reap traffic rewards for your site.

When developing your hot words list, we identify the highest frequency and most relevant terms from a large range of keywords used by several of your highest-performing competitors to generate their traffic, and these become “hot words.”

When working with a client (or doing this for yourself), there are generally 3 questions we want answered for each hot word:

  1. Which of these terms are the most important for your business? (0–10)
  2. Which of these terms are negative keywords (we want to ignore or avoid)?
  3. Any other feedback about qualified or high-intent keywords?

We narrow down the list, removing any negative keywords or keywords that are not really important for the website.

Once we have our final list of hot words, we organize them into broad topic groups like this:

Organized spreadsheet of hot words by topic

The different colors have no meaning, but just help to keep it visually organized for when we group them.

One important thing to note is that word stems play an important part here.

For example, consider that all of these words below have the same underlying relevance and meaning:

  • blog
  • blogs
  • blogger
  • bloggers
  • blogging

Therefore, when we’re grouping keywords, to consider “blog,” “blogging,” and “bloggers” as part of the same cluster, we’ll need to use the word stem “blog” for all of them. Word stems are our best friend when grouping. Synonyms can be organized in a similar way; these are two different ways of saying the same thing (with the same user intent), such as “build” and “create” or “search” and “look for.”
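To make the idea concrete, here’s a tiny Python sketch of stem matching and synonym collapsing; the stem, synonym pairs, and keywords are all illustrative:

    import re

    # Illustrative stem: matches blog, blogs, blogger, bloggers, blogging.
    stem = re.compile(r"\bblog\w*")

    # Illustrative synonym map: collapse variants to one canonical term.
    synonyms = {"create": "build", "look for": "search"}

    def normalize(keyword):
        for alt, canonical in synonyms.items():
            keyword = keyword.replace(alt, canonical)
        return keyword

    for kw in ["how to create a blog", "tips for new bloggers"]:
        kw = normalize(kw)
        print(kw, "->", "blog group" if stem.search(kw) else "no match")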

Part 4: Preparation for keyword grouping

Now we’re going to get ourselves set up for our Herculean task of clustering.

To start, copy your list of hot words and transpose them horizontally across a row.

Screenshot of menu in spreadsheet

List your keywords in the first column.

Screenshot of keyword spreadsheet

Now, the real magic begins.

After much research and noodling around, I discovered the function in Google Sheets that tells us whether a stem or term is in a keyword or not. It uses RegEx:

=IF(RegExMatch(A5,"health"),"YES","NO")

This simply tells us whether this word stem or word is in that keyword or not. You have to individually set the term for each column to get your “YES” or “NO” answer. I then drag this formula down to all of the rows to get all of the YES/NO answers. Google Sheets often takes a minute or so to process all of this data.

Next, we have to “hard code” these formulas so we can remove the NOs and be left with only a YES if that term exists in that keyword.

Copy all of the data and “Paste values only.”

Screenshot of spreadsheet menu

Now, use “Find and replace” to remove all of the NOs.

Screenshot of Find and Replace popup

What you’re left with is nothing short of a work of art. You now have the most powerful way to group your keywords. Let the grouping begin!

Screenshot of keyword spreadsheet
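As an aside, if your keyword list is huge, the whole RegExMatch / paste-values / find-and-replace sequence can be reproduced in a few lines of Python with the pandas library; the keywords and hot words below are hypothetical stand-ins:

    import pandas as pd

    keywords = ["best natural protein powder",
                "how to make natural deodorant",
                "matcha green tea benefits"]
    hot_words = ["natural", "protein", "matcha"]  # hypothetical hot words

    df = pd.DataFrame({"keyword": keywords})
    for word in hot_words:
        # "YES" if the stem appears in the keyword, blank otherwise --
        # the same output as RegExMatch plus find-and-replace.
        df[word] = df["keyword"].str.contains(word, regex=False).map(
            {True: "YES", False: ""})

    print(df)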

Part 5: Keyword grouping

At this point, you’re now set up for keyword clustering success.

This part is half art, half science. No wait, I take that back. To do this part right, you need:

  • A deep understanding of who you’re targeting, why they’re important to the business, user intent, and relevance
  • Good judgment to make tradeoffs when breaking keywords apart into groups
  • Good intuition

This is one of the hardest parts for me to train anyone to do. It comes with experience.

At the top of the sheet, I use the COUNTA function to show me how many times this word stem has been found in our keyword set:

=COUNTA(C3:C10000)

This is important because as a general rule, it’s best to start with the most niche topics that have the least overlap with other topics. If you start too broadly, your keywords will overlap with other keyword groups and you’ll have a hard time segmenting them into meaningful groups. Start with the most narrow and specific groups first.

To begin, you want to sort the sheet by word stem.

The word stems that occur only a handful of times won’t have a large amount of overlap. So I start by sorting the sheet by that column, and copying and pasting those keywords into their own new tab.

Now you have your first keyword group!
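The “most niche first” rule is also easy to express in code: assign each keyword to its rarest matching hot word. Here’s a rough sketch with made-up data:

    keywords = ["matcha latte recipe", "best matcha powder",
                "natural protein powder", "best natural deodorant"]
    hot_words = ["matcha", "natural", "best"]  # made-up hot words

    # Count how many keywords each hot word appears in.
    freq = {w: sum(w in kw for kw in keywords) for w in hot_words}

    groups = {w: [] for w in hot_words}
    for kw in keywords:
        matches = [w for w in hot_words if w in kw]
        if matches:
            # Assign to the rarest matching hot word (most niche first).
            groups[min(matches, key=freq.get)].append(kw)

    for word, kws in groups.items():
        print(word, "->", kws)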

Here’s a first group example: the “matcha” group. This could be a project in its own right: for instance, if a website were all about matcha tea and there were other tangentially related keywords.

Screenshot of list of matcha-related keywords

As we continue breaking apart one keyword group and then another, by the end we’re left with many different keyword groups. If the groups you’ve arrived at are too broad, you can subdivide them even more into narrower keyword subgroups for more focused content pieces. You can follow the same process for this broad keyword group, and make it a microcosm of the same process of dividing the keywords into smaller groups based on word stems.

We can create an overview of the groups to see the volume and topical opportunities from a high level.

Screenshot of spreadsheet with keyword group overview

We want to not only consider search volume, but ideally also intent, competitiveness, and so forth.

Voilà!

You’ve successfully taken a list of thousands of keywords and grouped them into relevant keyword groups.

Wait, why did we do all of this hard work again?

Now you can finally attain that “product/market fit” we talked about. It’s magical.

You can take each keyword group and create a piece of optimized content around it, targeting dozens of keywords, exponentially raising your potential to acquire more organic traffic. Boo yah!

All done. Now what?

Now the real fun begins. You can start planning out new content that you never knew you needed to create. Alternatively, you can map your keyword groups (and subgroups) to existing pages on your website and add in keywords and optimizations to the header tags, body text, and so forth for all those long-tail keywords you had ignored.

Keyword grouping is underrated, overlooked, and ignored at large. It creates a massive new opportunity to optimize for terms where none existed. Sometimes it’s just adding one phrase or a few sentences targeting a long-tail keyword here and there that will bring in that incremental search traffic for your site. Do this dozens of times and you will keep getting incremental increases in your organic traffic.

What do you think?

Leave a comment below and let me know your take on keyword clustering.

Need a hand? Just give me a shout, I’m happy to help.


Rewriting the Beginner’s Guide to SEO, Chapter 5: Technical Optimization

Posted by BritneyMuller

After a short break, we’re back to share our working draft of Chapter 5 of the Beginner’s Guide to SEO with you! This one was a whopper, and we’re really looking forward to your input. Giving beginner SEOs a solid grasp of just what technical optimization for SEO is and why it matters — without overwhelming them or scaring them off the subject — is a tall order indeed. We’d love to hear what you think: did we miss anything you think is important for beginners to know? Leave us your feedback in the comments!

And in case you’re curious, check back on our outline, Chapter One, Chapter Two, Chapter Three, and Chapter Four to see what we’ve covered so far.


Chapter 5: Technical Optimization

Basic technical knowledge will help you optimize your site for search engines and establish credibility with developers.

Now that you’ve crafted valuable content on the foundation of solid keyword research, it’s important to make sure it’s not only readable by humans, but by search engines too!

You don’t need to have a deep technical understanding of these concepts, but it is important to grasp what these technical assets do so that you can speak intelligently about them with developers. Speaking your developers’ language is important because you will likely need them to carry out some of your optimizations. They’re unlikely to prioritize your asks if they can’t understand your request or see its importance. When you establish credibility and trust with your devs, you can begin to tear away the red tape that often blocks crucial work from getting done.

Pro tip: SEOs need cross-team support to be effective

It’s vital to have a healthy relationship with your developers so that you can successfully tackle SEO challenges from both sides. Don’t wait until a technical issue causes negative SEO ramifications to involve a developer. Instead, join forces for the planning stage with the goal of avoiding the issues altogether. If you don’t, it can cost you in time and money later.

Beyond cross-team support, understanding technical optimization for SEO is essential if you want to ensure that your web pages are structured for both humans and crawlers. To that end, we’ve divided this chapter into three sections:

  1. How websites work
  2. How search engines understand websites
  3. How users interact with websites

Since the technical structure of a site can have a massive impact on its performance, it’s crucial for everyone to understand these principles. It might also be a good idea to share this part of the guide with your programmers, content writers, and designers so that all parties involved in a site’s construction are on the same page.

1. How websites work

If search engine optimization is the process of optimizing a website for search, SEOs need at least a basic understanding of the thing they’re optimizing!

Below, we outline the website’s journey from domain name purchase all the way to its fully rendered state in a browser. An important component of the website’s journey is the critical rendering path, which is the process of a browser turning a website’s code into a viewable page.

Knowing this about websites is important for SEOs to understand for a few reasons:

  • The steps in this webpage assembly process can affect page load times, and speed is not only important for keeping users on your site, but it’s also one of Google’s ranking factors.
  • Google renders certain resources, like JavaScript, on a “second pass.” Google will look at the page without JavaScript first, then a few days to a few weeks later, it will render JavaScript, meaning SEO-critical elements that are added to the page using JavaScript might not get indexed.

Imagine that the website loading process is your commute to work. You get ready at home, gather your things to bring to the office, and then take the fastest route from your home to your work. It would be silly to put on just one of your shoes, take a longer route to work, drop your things off at the office, then immediately return home to get your other shoe, right? That’s sort of what inefficient websites do. This chapter will teach you how to diagnose where your website might be inefficient, what you can do to streamline, and the positive ramifications on your rankings and user experience that can result from that streamlining.

Before a website can be accessed, it needs to be set up!

  1. Domain name is purchased. Domain names like moz.com are purchased from a domain name registrar such as GoDaddy or HostGator. These registrars are just organizations that manage the reservations of domain names.
  2. Domain name is linked to IP address. The Internet doesn’t understand names like “moz.com” as website addresses without the help of domain name servers (DNS). The Internet uses a series of numbers called an Internet protocol (IP) address (ex: 127.0.0.1), but we want to use names like moz.com because they’re easier for humans to remember. We need to use a DNS to link those human-readable names with machine-readable numbers. (You can try this translation yourself with the quick lookup sketch after this list.)
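As a quick illustration, here’s a one-line sketch in Python that performs the same name-to-number translation DNS handles behind the scenes:

    import socket

    # Ask DNS for the IP address behind a human-readable domain name.
    print(socket.gethostbyname("moz.com"))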

How a website gets from server to browser

  1. User requests domain. Now that the name is linked to an IP address via DNS, people can request a website by typing the domain name directly into their browser or by clicking on a link to the website.
  2. Browser makes requests. That request for a web page prompts the browser to make a DNS lookup request to convert the domain name to its IP address. The browser then makes a request to the server for the code your web page is constructed with, such as HTML, CSS, and JavaScript.
  3. Server sends resources. Once the server receives the request for the website, it sends the website files to be assembled in the searcher’s browser.
  4. Browser assembles the web page. The browser has now received the resources from the server, but it still needs to put it all together and render the web page so that the user can see it in their browser. As the browser parses and organizes all the web page’s resources, it’s creating a Document Object Model (DOM). The DOM is what you can see when you right click + “inspect element” on a web page in your Chrome browser (learn how to inspect elements in other browsers).
  5. Browser makes final requests. The browser will only show a web page after all the page’s necessary code is downloaded, parsed, and executed, so at this point, if the browser needs any additional code in order to show your website, it will make an additional request from your server.
  6. Website appears in browser. Whew! After all that, your website has now been transformed (rendered) from code to what you see in your browser.

Pro tip: Talk to your developers about async!

Something you can bring up with your developers is shortening the critical rendering path by setting scripts to “async” when they’re not needed to render content above the fold, which can make your web pages load faster. Async tells the DOM that it can continue to be assembled while the browser is fetching the scripts needed to display your web page. If the DOM has to pause assembly every time the browser fetches a script (called “render-blocking scripts”), it can substantially slow down your page load.

It would be like going out to eat with your friends and having to pause the conversation every time one of you went up to the counter to order, only resuming once they got back. With async, you and your friends can continue to chat even when one of you is ordering. You might also want to bring up other optimizations that devs can implement to shorten the critical rendering path, such as removing unnecessary scripts entirely, like old tracking scripts.

Now that you know how a website appears in a browser, we’re going to focus on what a website is made of — in other words, the code (programming languages) used to construct those web pages.

The three most common are:

  • HTML – What a website says (titles, body content, etc.)
  • CSS – How a website looks (color, fonts, etc.)
  • JavaScript – How it behaves (interactive, dynamic, etc.)

HTML: What a website says

HTML stands for hypertext markup language, and it serves as the backbone of a website. Elements like headings, paragraphs, lists, and content are all defined in the HTML.

For example, a page’s main heading is marked up with an <h1> tag, and each paragraph of body copy with a <p> tag.

HTML is important for SEOs to know because it’s what lives “under the hood” of any page they create or work on. While your CMS likely doesn’t require you to write your pages in HTML (ex: selecting “hyperlink” will allow you to create a link without you having to type in “a href=”), it is what you’re modifying every time you do something to a web page such as adding content, changing the anchor text of internal links, and so on. Google crawls these HTML elements to determine how relevant your document is to a particular query. In other words, what’s in your HTML plays a huge role in how your web page ranks in Google organic search!

CSS: How a website looks

CSS stands for cascading style sheets, and this is what causes your web pages to take on certain fonts, colors, and layouts. HTML was created to describe content, rather than to style it, so when CSS entered the scene, it was a game-changer. With CSS, web pages could be “beautified” without requiring manual coding of styles into the HTML of every page — a cumbersome process, especially for large sites.

It wasn’t until 2014 that Google’s indexing system began to render web pages more like an actual browser, as opposed to a text-only browser. A black-hat SEO practice that tried to capitalize on Google’s older indexing system was hiding text and links via CSS for the purpose of manipulating search engine rankings. This “hidden text and links” practice is a violation of Google’s quality guidelines.

Components of CSS that SEOs, in particular, should care about:

  • Since style directives can live in external stylesheet files (CSS files) instead of your page’s HTML, it makes your page less code-heavy, reducing file transfer size and making load times faster.
  • Browsers still have to download resources like your CSS file, so compressing them can make your web pages load faster, and page speed is a ranking factor.
  • Having your pages be more content-heavy than code-heavy can lead to better indexing of your site’s content.
  • Using CSS to hide links and content can get your website manually penalized and removed from Google’s index.
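For example, a page pulls in an external stylesheet with a single line in its <head> (the filename is hypothetical):

<head>
  <link rel="stylesheet" href="styles.css">
</head>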

JavaScript: How a website behaves

In the earlier days of the Internet, web pages were built with HTML. When CSS came along, webpage content had the ability to take on some style. When the programming language JavaScript entered the scene, websites could now not only have structure and style, but they could be dynamic.

JavaScript has opened up a lot of opportunities for non-static web page creation. When someone attempts to access a page that is enhanced with this programming language, that user’s browser will execute the JavaScript against the static HTML that the server returned, resulting in a web page that comes to life with some sort of interactivity.

You’ve definitely seen JavaScript in action — you just may not have known it! That’s because JavaScript can do almost anything to a page. It could create a pop up, for example, or it could request third-party resources like ads to display on your page.

JavaScript can pose some problems for SEO, though, since search engines don’t view JavaScript the same way human visitors do. That’s because of client-side versus server-side rendering. Most JavaScript is executed in a client’s browser. With server-side rendering, on the other hand, the files are executed at the server and the server sends them to the browser in their fully rendered state.

SEO-critical page elements such as text, links, and tags that are loaded on the client's side with JavaScript, rather than represented in your HTML, are invisible in your page's source code until they are rendered. This means that search engine crawlers won't see what's in your JavaScript — at least not initially.

Google says that, as long as you’re not blocking Googlebot from crawling your JavaScript files, they’re generally able to render and understand your web pages just like a browser can, which means that Googlebot should see the same things as a user viewing a site in their browser. However, due to this “second wave of indexing” for client-side JavaScript, Google can miss certain elements that are only available once JavaScript is executed.

There are also some other things that could go wrong during Googlebot’s process of rendering your web pages, which can prevent Google from understanding what’s contained in your JavaScript:

  • You’ve blocked Googlebot from JavaScript resources (ex: with robots.txt, like we learned about in Chapter 2)
  • Your server can’t handle all the requests to crawl your content
  • The JavaScript is too complex or outdated for Googlebot to understand
  • JavaScript doesn’t “lazy load” content into the page until after the crawler has finished with the page and moved on.

Needless to say, while JavaScript does open a lot of possibilities for web page creation, it can also have some serious ramifications for your SEO if you're not careful. Thankfully, there is a way to check whether Google sees the same thing as your visitors. To see a page the way Googlebot views it, use Google Search Console's "Fetch and Render" tool. From your site's Google Search Console dashboard, select "Crawl" from the left navigation, then "Fetch as Google."

From this page, enter the URL you want to check (or leave blank if you want to check your homepage) and click the “Fetch and Render” button. You also have the option to test either the desktop or mobile version.

In return, you’ll get a side-by-side view of how Googlebot saw your page versus how a visitor to your website would have seen the page. Below, Google will also show you a list of any resources they may not have been able to get for the URL you entered.

Understanding the way websites work lays a great foundation for what we’ll talk about next, which is technical optimizations to help Google understand the pages on your website better.

2. How search engines understand websites

Search engines have gotten incredibly sophisticated, but they can’t (yet) find and interpret web pages quite like a human can. The following sections outline ways you can better deliver content to search engines.

Help search engines understand your content by structuring it with Schema

Imagine being a search engine crawler scanning down a 10,000-word article about how to bake a cake. How do you identify the author, recipe, ingredients, or steps required to bake a cake? This is where schema (Schema.org) markup comes in. It allows you to spoon-feed search engines more specific classifications for what type of information is on your page.

Schema is a way to label or organize your content so that search engines have a better understanding of what certain elements on your web pages are. This code provides structure to your data, which is why schema is often referred to as “structured data.” The process of structuring your data is often referred to as “markup” because you are marking up your content with organizational code.

JSON-LD is Google’s preferred schema markup (announced in May ‘16), which Bing also supports. To view a full list of the thousands of available schema markups, visit Schema.org or view the Google Developers Introduction to Structured Data for additional information on how to implement structured data. After you implement the structured data that best suits your web pages, you can test your markup with Google’s Structured Data Testing Tool.

In addition to helping bots like Google understand what a particular piece of content is about, schema markup can also enable special features to accompany your pages in the SERPs. These special features are referred to as “rich snippets,” and you’ve probably seen them in action. They’re things like:

  • Top Stories carousel
  • Review stars
  • Sitelinks search boxes
  • Recipes

Remember, using structured data can help enable a rich snippet to be present, but does not guarantee it. Other types of rich snippets will likely be added in the future as the use of schema markup increases.

Some last words of advice for schema success:

  • You can use multiple types of schema markup on a page. However, if you mark up one element, like a product for example, and there are other products listed on the page, you must also mark up those products.
  • Don’t mark up content that is not visible to visitors and follow Google’s Quality Guidelines. For example, if you add review structured markup to a page, make sure those reviews are actually visible on that page.
  • If you have duplicate pages, Google asks that you mark up each duplicate page with your structured markup, not just the canonical version.
  • Provide original and updated (if applicable) content on your structured data pages.
  • Structured markup should be an accurate reflection of your page.
  • Try to use the most specific type of schema markup for your content.
  • Marked-up reviews should not be written by the business. They should be genuine unpaid business reviews from actual customers.

Tell search engines about your preferred pages with canonicalization

When Google crawls the same content on different web pages, it sometimes doesn't know which page to index in search results. This is why the rel=”canonical” tag was invented: to help search engines better index the preferred version of content and not all its duplicates.

The rel=”canonical” tag allows you to tell search engines where the original, master version of a piece of content is located. You’re essentially saying, “Hey search engine! Don’t index this; index this source page instead.” So, if you want to republish a piece of content, whether exactly or slightly modified, but don’t want to risk creating duplicate content, the canonical tag is here to save the day.

Proper canonicalization ensures that every unique piece of content on your website has only one URL. To prevent search engines from indexing multiple versions of a single page, Google recommends having a self-referencing canonical tag on every page on your site. Without a canonical tag telling Google which version of your web page is the preferred one, http://www.example.com could get indexed separately from http://example.com, creating duplicates.
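A self-referencing canonical tag is a single line in the page's <head>. A sketch for the example above:

<head>
  <link rel="canonical" href="http://www.example.com/" />
</head>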

“Avoid duplicate content” is an Internet truism, and for good reason! Google wants to reward sites with unique, valuable content — not content that’s taken from other sources and repeated across multiple pages. Because engines want to provide the best searcher experience, they will rarely show multiple versions of the same content, opting instead to show only the canonicalized version, or if a canonical tag does not exist, whichever version they deem most likely to be the original.

Pro tip: Distinguishing between content filtering & content penalties
There is no such thing as a duplicate content penalty. However, you should try to keep duplicate content from causing indexing issues by using the rel=”canonical” tag when possible. When duplicates of a page exist, Google will choose a canonical and filter the others out of search results. That doesn’t mean you’ve been penalized. It just means that Google only wants to show one version of your content.

It’s also very common for websites to have multiple duplicate pages due to sort and filter options. For example, on an e-commerce site, you might have what’s called a faceted navigation that allows visitors to narrow down products to find exactly what they’re looking for, such as a “sort by” feature that reorders results on the product category page from lowest to highest price. This could create a URL that looks something like this: example.com/mens-shirts?sort=price_ascending. Add in more sort/filter options like color, size, material, brand, etc. and just think about all the variations of your main product category page this would create!

To learn more about different types of duplicate content, this post by Dr. Pete helps distill the different nuances.

3. How users interact with websites

In Chapter 1, we said that despite SEO standing for search engine optimization, SEO is as much about people as it is about search engines themselves. That’s because search engines exist to serve searchers. This goal helps explain why Google’s algorithm rewards websites that provide the best possible experiences for searchers, and why some websites, despite having qualities like robust backlink profiles, might not perform well in search.

When we understand what makes searchers' web browsing experience optimal, we can create those experiences for maximum search performance.

Ensuring a positive experience for your mobile visitors

Being that well over half of all web traffic today comes from mobile, it’s safe to say that your website should be accessible and easy to navigate for mobile visitors. In April 2015, Google rolled out an update to its algorithm that would promote mobile-friendly pages over non-mobile-friendly pages. So how can you ensure that your website is mobile friendly? Although there are three main ways to configure your website for mobile, Google recommends responsive web design.

Responsive design

Responsive websites are designed to fit the screen of whatever type of device your visitors are using. You can use CSS to make the web page “respond” to the device size. This is ideal because it prevents visitors from having to double-tap or pinch-and-zoom in order to view the content on your pages. Not sure if your web pages are mobile friendly? You can use Google’s mobile-friendly test to check!
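Under the hood, it's a CSS media query that does the "responding." A minimal sketch (the class name and breakpoint are illustrative):

<style>
  .column { width: 50%; float: left; }
  /* On screens narrower than 600px, stack the columns full-width */
  @media (max-width: 600px) {
    .column { width: 100%; float: none; }
  }
</style>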

AMP

AMP stands for Accelerated Mobile Pages, and it is used to deliver content to mobile visitors at speeds much greater than with non-AMP delivery. AMP is able to deliver content so fast because it delivers content from its cache servers (not the original site) and uses a special AMP version of HTML and JavaScript. Learn more about AMP.

Mobile-first indexing

As of 2018, Google started switching websites over to mobile-first indexing. That change sparked some confusion between mobile-friendliness and mobile-first, so it's helpful to disambiguate. With mobile-first indexing, Google crawls and indexes the mobile version of your web pages. Making your website compatible with mobile screens is good for users and your performance in search, but mobile-first indexing happens independently of mobile-friendliness.

This has raised some concerns for websites that lack parity between mobile and desktop versions, such as showing different content, navigation, links, etc. on their mobile view. A mobile site with different links, for example, will alter the way in which Googlebot (mobile) crawls your site and sends link equity to your other pages.

Breaking up long content for easier digestion

When sites have very long pages, they have the option of breaking them up into multiple parts of a whole. This is called pagination and it's similar to pages in a book. In order to avoid giving the visitor too much all at once, you can break up your single page into multiple parts. This can be great for visitors, especially on e-commerce sites where there are a lot of product results in a category, but there are some steps you should take to help Google understand the relationship between your paginated pages: rel=”next” and rel=”prev” markup.

You can read more about pagination in Google’s official documentation, but the main takeaways are that:

  • The first page in a sequence should only have rel=”next” markup
  • The last page in a sequence should only have rel=”prev” markup
  • Pages that have both a preceding and following page should have both rel=”next” and rel=”prev”
  • Since each page in the sequence is unique, don’t canonicalize them to the first page in the sequence. Only use a canonical tag to point to a “view all” version of your content, if you have one.
  • When Google sees a paginated sequence, it will typically consolidate the pages’ linking properties and send searchers to the first page

Pro tip: rel=”next/prev” should still have anchor text and live within an <a> link
This helps ensure that Google picks up the rel=”next/prev” markup (see the sketch below).
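Here's a sketch of what page 2 of a paginated category might contain (URLs are hypothetical). The <link> elements go in the <head>, and, per the pro tip above, matching <a> links with anchor text go in the body:

<link rel="prev" href="http://example.com/mens-shirts?page=1" />
<link rel="next" href="http://example.com/mens-shirts?page=3" />

<a href="http://example.com/mens-shirts?page=1" rel="prev">Previous page</a>
<a href="http://example.com/mens-shirts?page=3" rel="next">Next page</a>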

Improving page speed to mitigate visitor frustration

Google wants to serve content that loads lightning-fast for searchers. We’ve come to expect fast-loading results, and when we don’t get them, we’ll quickly bounce back to the SERP in search of a better, faster page. This is why page speed is a crucial aspect of on-site SEO. We can improve the speed of our web pages by taking advantage of tools like the ones we’ve mentioned below. Click on the links to learn more about each.

Images are one of the main culprits of slow pages!

As discussed in Chapter 4, images are one of the top reasons for slow-loading web pages! In addition to image compression, optimizing image alt text, choosing the right image format, and submitting image sitemaps, there are other technical ways to optimize the speed and way in which images are shown to your users. Some primary ways to improve image delivery are as follows:

SRCSET: How to deliver the best image size for each device

The SRCSET attribute allows you to have multiple versions of your image and then specify which version should be used in different situations. This piece of code is added to the <img> tag (where your image is located in the HTML) to provide unique images for specific-sized devices.

This is like the concept of responsive design that we discussed earlier, except for images!

This doesn’t just speed up your image load time, it’s also a unique way to enhance your on-page user experience by providing different and optimal images to different device types.

Pro tip: There are more than just three image size versions!
It’s a common misconception that you just need a desktop, tablet, and mobile-sized version of your image. There are a huge variety of screen sizes and resolutions. Learn more about SRCSET.

Show visitors image loading is in progress with lazy loading

Lazy loading occurs when you go to a webpage and, instead of seeing a blank white space for where an image will be, a blurry lightweight version of the image or a colored box in its place appears while the surrounding text loads. After a few seconds, the image clearly loads in full resolution. The popular blogging platform Medium does this really well.

The low resolution version is initially loaded, and then the full high resolution version. This also helps to optimize your critical rendering path! So while all of your other page resources are being downloaded, you’re showing a low-resolution teaser image that helps tell users that things are happening/being loaded. For more information on how you should lazy load your images, check out Google’s Lazy Loading Guidance.
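One common pattern, sketched below with hypothetical filenames and class name, ships a tiny blurred teaser in the src attribute and swaps in the full image once the page has loaded:

<img src="photo-tiny-blurred.jpg" data-src="photo-full.jpg" alt="A hiker overlooking a valley" class="lazy">

<script>
  // Once the page has finished loading, swap each teaser for its full image
  window.addEventListener('load', function () {
    document.querySelectorAll('img.lazy').forEach(function (img) {
      img.src = img.dataset.src;
    });
  });
</script>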

Improve speed by condensing and bundling your files

Page speed audits will often make recommendations such as “minify resource,” but what does that actually mean? Minification condenses a code file by removing things like line breaks and spaces, as well as abbreviating code variable names wherever possible.

“Bundling” is another common term you’ll hear in reference to improving page speed. The process of bundling combines a bunch of the same coding language files into one single file. For example, a bunch of JavaScript files could be put into one larger file to reduce the amount of JavaScript files for a browser.

By both minifying and bundling the files needed to construct your web page, you’ll speed up your website and reduce the number of your HTTP (file) requests.
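As a sketch (filenames are hypothetical), bundling turns several script requests into a single, minified one:

<!-- Before: three separate HTTP requests for unminified files -->
<script src="menu.js"></script>
<script src="slider.js"></script>
<script src="forms.js"></script>

<!-- After bundling and minifying: one smaller request -->
<script src="site.bundle.min.js"></script>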

Improving the experience for international audiences

Websites that target audiences from multiple countries should familiarize themselves with international SEO best practices in order to serve up the most relevant experiences. Without these optimizations, international visitors might have difficulty finding the version of your site that caters to them.

There are two main ways a website can be internationalized:

  • Language
    Sites that target speakers of multiple languages are considered multilingual websites. These sites should add something called an hreflang tag to show Google that your page has copy for another language (see the sketch after this list). Learn more about hreflang.
  • Country
    Sites that target audiences in multiple countries are called multi-regional websites and they should choose a URL structure that makes it easy to target their domain or pages to specific countries. This can include the use of a country code top level domain (ccTLD) such as “.ca” for Canada, or a generic top-level domain (gTLD) with a country-specific subfolder such as “example.com/ca” for Canada. Learn more about locale-specific URLs.
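A minimal hreflang sketch (URLs are hypothetical) for a site with US English and Canadian French versions; these lines go in the <head> of every version of the page:

<link rel="alternate" hreflang="en-us" href="http://example.com/us/" />
<link rel="alternate" hreflang="fr-ca" href="http://example.com/fr-ca/" />
<link rel="alternate" hreflang="x-default" href="http://example.com/" />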

You’ve researched, you’ve written, and you’ve optimized your website for search engines and user experience. The next piece of the SEO puzzle is a big one: establishing authority so that your pages will rank highly in search results.


Follow the Local SEO Leaders: A Guide to Our Industry’s Best Publications

Posted by MiriamEllis

Change is the only constant in local SEO. As your local brand or local search marketing agency grows, you’ll be onboarding new hires. Whether they’re novices or adepts, they’ll need to keep up with continuous industry developments in order to make agile contributions to team strategy. Particularly if local SEO is new to someone, it saves training time if you can fast-track them on who to follow for the best news and analysis. This guide serves as a blueprint for that very purpose.

And even if you’re an old hand in the local SEM industry, you may find some sources here you’ve been overlooking that could add richness and depth to your ongoing education.

Two quick notes on what and how I’ve chosen:

  1. As the author of both of Moz's newsletters (the Moz Top 10 and the Moz Local Top 7), I read an inordinate amount of SEO and local SEO content, but I could have missed your work. The list that follows represents my own, personal slate of the resources that have taught me the most. If you publish great local SEO information but you're not on this list, my apologies, and if you write something truly awesome in future, you're welcome to tweet at me. I'm always on the lookout for fresh and enlightening voices. My personal criteria for the publications I trust are that they're typically groundbreaking, thoughtful, investigative, and respectful of readers and subjects.
  2. Following the leaders is a useful practice, but not a stopping point. Even experts aren’t infallible. Rather than take industry advice at face value, do your own testing. Some of the most interesting local SEO discussions I’ve ever participated in have stemmed from people questioning standard best practices. So, while it’s smart to absorb the wisdom of experts, it’s even smarter to do your own experiments.

The best of local SEO news

Who reports fastest on Google updates, Knowledge Panel tweaks, and industry business?

Sterling Sky’s Timeline of Local SEO Changes is the industry’s premiere log of developments that impact local businesses and is continuously updated by Joy Hawkins + team.

Search Engine Roundtable has a proven track record of being among the first to report news that affects both local and digital businesses, thanks to the ongoing dedication of Barry Schwartz.

Street Fight is the best place on the web to read about mergers, acquisitions, the release of new technology, and other major happenings on the business side of local. I’m categorizing Street Fight under news, but they also offer good commentary, particularly the joint contributions of David Mihm and Mike Blumenthal.

LocalU’s Last Week in Local video and podcast series highlights Mike Blumenthal and Mary Bowling’s top picks of industry coverage most worthy of your attention. Comes with the bonus of expert commentary as they share their list.

TechCrunch also keeps a finger on the pulse of technology and business dealings that point to the future of local.

Search Engine Land’s local category is consistently swift in getting the word out about breaking industry news, with the help of multiple authors.

Adweek is a good source for reportage on retail and brand news, but there’s a limit to the number of articles you can read without a subscription. I often find them covering quirky stories that are absent from other publications I read.

The SEMPost’s local tab is another good place to check for local developments, chiefly covered by Jennifer Slegg.

Search Engine Journal’s local column also gets my vote for speedy delivery of breaking local stories.

Google’s main blog and the ThinkWithGoogle blog are musts to keep tabs on the search engine’s own developments, bearing in mind, of course, that these publications can be highly promotional of their products and worldview.

The best of local search marketing analysis

Who can you trust most to analyze the present and predict the future?

LocalU’s Deep Dive video series features what I consider to be the our industry’s most consistently insightful analysis of a variety of local marketing topics, discussed by learned faculty and guests.

The Moz Blog’s local category hosts a slate of gifted bloggers and professional editorial standards that result in truly in-depth treatment of local topics, presented with care and attention. As a veteran contributor to this publication, I can attest to how Moz inspires authors to aim high, and one of the nicest things that happened to our team in 2018 was being voted the #2 local SEO blog by BrightLocal’s survey respondents.

The Local Search Association’s Insider blog is one I turn to again and again, particularly for their excellent studies and quotable statistics.

Mike Blumenthal’s blog has earned a place of honor over many years as a key destination for breaking local developments and one-of-a-kind analysis. When Blumenthal talks, local people listen. One of the things I’ve prized for well over a decade in Mike’s writing is his ability to see things from a small business perspective, as opposed to simply standing in awe of big business and technology.

BrightLocal’s surveys and studies are some of the industry’s most cited and I look eagerly forward to their annual publication.

Whitespark’s blog doesn’t publish as frequently as I wish it did, but their posts by Darren Shaw and crew are always on extremely relevant topics and of high quality.

Sterling Sky’s blog is a relative newcomer, but the expertise Joy Hawkins and Colan Nielsen bring to their agency’s publication is making it a go-to resource for advice on some of the toughest aspects of local SEO.

Local Visibility System’s blog continues to please, with the thoughtful voice of Phil Rozek exploring themes you likely encounter in your day-to-day work as a local SEO.

The Local Search Forum is, hands down, the best free forum on the web to take your local mysteries and musings to. Founded by Linda Buquet, the ethos of the platform is approachable, friendly, and often fun, and high-level local SEOs frequently weigh in on hot topics.

Pro tip: In addition to the above tried-and-true resources, I frequently scan the online versions of city newspapers across the country for interesting local stories that add perspective to my vision of the challenges and successes of local businesses. Sometimes, too, publications like The Atlantic, Forbes, or Business Insider will publish pieces of a high journalistic quality with relevance to our industry. Check them out!

The best for specific local marketing disciplines

Here, I’ll break this down by subject or industry for easy scanning:

Reviews

  • GatherUp (formerly GetFiveStars) can’t be beat for insight into online reputation management, with Aaron Weiche and team delivering amazing case studies and memorable statistics. I literally have a document of quotes from their work that I refer to on a regular basis in my own writing.
  • Grade.us is my other ORM favorite for bright and lively coverage from authors like Garrett Sussman and Andrew McDermott.

Email marketing

  • Tidings’ vault contains a tiny but growing treasure trove of email marketing wisdom from David Mihm, whose former glory days spent in the trenches of local SEO make him especially attuned to our industry.

SABs

  • Tom Waddington’s blog is the must-read publication for service area businesses whose livelihoods are being impacted by Google’s Local Service Ads program in an increasing number of categories and cities.

Automotive marketing

  • DealerOn’s blog is the real deal when it comes to automotive local SEO, with Greg Gifford teaching memorable lessons in an enjoyable way.

Legal marketing

  • JurisDigital brings the educated voices of Casey Meraz and team to the highly specialized field of attorney marketing.

Hospitality marketing

Independent businesses

Link building

  • Nifty Marketing’s blog has earned my trust for its nifty local link building ideas and case studies.
  • ZipSprout belongs here, too, because of their focus on local sponsorships, which are a favorite local link building methodology. Check them out for blog posts and podcasts.

Schema + other markup

  • Touchpoint Digital Marketing doesn’t publish much on their own website, but look anywhere you can for David Deering’s writings on markup. LocalU and Moz are good places to search for his expertise.

Patents

  • SEO by the Sea has proffered years of matchless analysis of Google patents that frequently impact local businesses or point to future possible developments.

Best local search industry newsletters

Get the latest news and tips delivered right to your inbox by signing up for these fine free newsletters:

Follow the local SEO leaders on Twitter

What an easy way to track what industry adepts are thinking and sharing, up-to-the-minute! Following this list of professionals (alphabetized by first name) will fill up your social calendar with juicy local tidbits. Keep in mind that many of these folks either own or work for agencies or publishers you can follow, too.

Aaron Weiche
Adam Dorfman
Andrew Shotland
Ben Fisher
Bernadette Coleman
Bill Slawski
Brian Barwig
Carrie Hill
Casey Meraz
Cindy Krum
Colan Nielsen
DJ Baxter
Dan Leibson
Dana DiTomaso
Dani Owens
Darren Shaw
Dave DiGreggorio
David Mihm
Don Campbell
Garrett Sussman
Glenn Gabe
Greg Gifford
Greg Sterling
Jennifer Slegg
Joel Headley
Joy Hawkins
Mary Bowling
Mike Blumenthal
Mike Ramsey
Miriam Ellis
Phil Rozek
Sherry Bonelli
Thibault Adda
Tim Capper
Tom Waddington

Share what you learn

How about your voice? How do you get it heard in the local SEO industry? The answer is simple: share what you learn with others. Each of the people and publications on my list has earned a place there because, at one time or another, they have taught me something they learned from their own work. Some tips:

  • Our industry has become a sizeable niche, but there is always room for new, interesting voices
  • Experiment and publish — consistent publication of your findings is the best way I know of to become a trusted source of information
  • Don’t be afraid of making mistakes, so long as you are willing to own them
  • Socialize — attend events, amplify the work of colleagues you admire, reach out in real ways to others to share your common work interest while also respecting busy schedules

Local SEO is a little bit like jazz, in which we’re all riffing off the same chord progressions created by Google, Facebook, Yelp, other major platforms, and the needs of clients. Mike Blumenthal plays a note about a jeweler whose WOMM is driving the majority of her customers. You take that note and turn it around for someone in the auto industry, yielding an unexpected insight. Someone else takes your insight and creates a print handout to bolster a loyalty program.

Everyone ends up learning in this virtuous, democratic cycle, so go ahead — start sharing! A zest for contribution is a step towards leadership and your observations could be music to the industry’s ears.


Rewriting the Beginner’s Guide to SEO, Chapter 4: On-Page Optimization

Posted by BritneyMuller

Chapter Four of the Beginner’s Guide to SEO rewrite is chock full of on-page SEO learnings. After all the great feedback you’ve provided thus far on our outline, Chapter One, Chapter Two, and Chapter Three, we’re eager to hear how you feel about Chapter Four. What really works for you? What do you think is missing? Read on, and let us know your thoughts in the comments!


Chapter 4: On-Page Optimization

Use your research to craft your message.

Now that you know how your target market is searching, it’s time to dive into on-page optimization, the practice of crafting web pages that answer searchers’ questions. On-page SEO is multifaceted, and extends beyond content into other things like schema and meta tags, which we’ll discuss more at length in the next chapter on technical optimization. For now, put on your wordsmithing hats — it’s time to create your content!

Creating your content

Applying your keyword research

In the last chapter, we learned methods for discovering how your target audience is searching for your content. Now, it’s time to put that research into practice. Here is a simple outline to follow for applying your keyword research:

  1. Survey your keywords and group those with similar topics and intent. Those groups will be your pages, rather than creating individual pages for every keyword variation.
  2. If you haven’t done so already, evaluate the SERP for each keyword or group of keywords to determine what type and format your content should be. Some characteristics of ranking pages to take note of:
    1. Are they image or video heavy?
    2. Is the content long-form or short and concise?
    3. Is the content formatted in lists, bullets, or paragraphs?
  3. Ask yourself, “What unique value could I offer to make my page better than the pages that are currently ranking for my keyword?”

On-page optimization allows you to turn your research into content your audience will love. Just make sure to avoid falling into the trap of low-value tactics that could hurt more than help!

Low-value tactics to avoid

Your web content should exist to answer searchers’ questions, to guide them through your site, and to help them understand your site’s purpose. Content should not be created for the purpose of ranking highly in search alone. Ranking is a means to an end, the end being to help searchers. If we put the cart before the horse, we risk falling into the trap of low-value content tactics.

Some of these tactics were introduced in Chapter 2, but by way of review, let’s take a deeper dive into some low-value tactics you should avoid when crafting search engine optimized content.

Thin content

While it’s common for a website to have unique pages on different topics, an older content strategy was to create a page for every single iteration of your keywords in order to rank on page 1 for those highly specific queries.

For example, if you were selling bridal dresses, you might have created individual pages for bridal gowns, bridal dresses, wedding gowns, and wedding dresses, even if each page was essentially saying the same thing. A similar tactic for local businesses was to create multiple pages of content for each city or region from which they wanted clients. These “geo pages” often had the same or very similar content, with the location name being the only unique factor.

Tactics like these clearly weren’t helpful for users, so why did publishers do it? Google wasn’t always as good as it is today at understanding the relationships between words and phrases (or semantics). So, if you wanted to rank on page 1 for “bridal gowns” but you only had a page on “wedding dresses,” that may not have cut it.

This practice created tons of thin, low-quality content across the web, which Google addressed specifically with its 2011 update known as Panda. This algorithm update penalized low-quality pages, which resulted in more quality pages taking the top spots of the SERPs. Google continues to iterate on this process of demoting low-quality content and promoting high-quality content today.

Google is clear that you should have a comprehensive page on a topic instead of multiple, weaker pages for each variation of a keyword.

Depiction of distinct pages for each keyword variation versus one page covering multiple variations

Duplicate content

Like it sounds, “duplicate content” refers to content that is shared between domains or between multiple pages of a single domain. “Scraped” content goes a step further, and entails the blatant and unauthorized use of content from other sites. This can include taking content and republishing as-is, or modifying it slightly before republishing, without adding any original content or value.

There are plenty of legitimate reasons for internal or cross-domain duplicate content, so Google encourages the use of a rel=canonical tag to point to the original version of the web content. While you don’t need to know about this tag just yet, the main thing to note for now is that your content should be unique in word and in value.

Depiction of how duplicate content looks between pages.

Cloaking

A basic tenet of search engine guidelines is to show the same content to the engine’s crawlers that you’d show to a human visitor. This means that you should never hide text in the HTML code of your website that a normal visitor can’t see.

When this guideline is broken, search engines call it “cloaking” and take action to prevent these pages from ranking in search results. Cloaking can be accomplished in any number of ways and for a variety of reasons, both positive and negative. Below is an example of an instance where Spotify showed different content to users than to Google.

Spotify shows a login page to Google.
Spotify shows a National Philharmonic Orchestra landing page to logged in visitors.

In some cases, Google may let practices that are technically cloaking pass because they contribute to a positive user experience. For more on the subject of cloaking and the levels of risk associated with various tactics, see our article on White Hat Cloaking.

Keyword stuffing

If you’ve ever been told, “You need to include {critical keyword} on this page X times,” you’ve seen the confusion over keyword usage in action. Many people mistakenly think that if you just include a keyword within your page’s content X times, you will automatically rank for it. The truth is, although Google looks for mentions of keywords and related concepts on your site’s pages, the page itself has to add value outside of pure keyword usage. If a page is going to be valuable to users, it won’t sound like it was written by a robot, so incorporate your keywords and phrases naturally in a way that is understandable to your readers.

Below is an example of a keyword-stuffed page of content that also uses another old method: bolding all your targeted keywords. Oy.

Screenshot of a site that bolds keywords in a paragraph.

Auto-generated content

Arguably one of the most offensive forms of low-quality content is the kind that is auto-generated, or created programmatically with the intent of manipulating search rankings rather than helping users. You may recognize some auto-generated content by how little sense it makes when read: the words are technically words, but they're strung together by a program rather than a human being.

Gibberish text on a webpage

It is worth noting that advancements in machine learning have contributed to more sophisticated auto-generated content that will only get better over time. This is likely why in Google’s quality guidelines on automatically generated content, Google specifically calls out the brand of auto-generated content that attempts to manipulate search rankings, rather than any-and-all auto-generated content.

What to do instead: 10x it!

There is no "secret sauce" to ranking in search results. Google ranks pages highly because it has determined they are the best answers to the searcher's questions. In today's search engine, it's not enough that your page isn't duplicative, spammy, or broken. Your page has to provide value to searchers and be better than any other page Google is currently serving as the answer to a particular query. Here's a simple formula for content creation:

  • Search the keyword(s) you want your page to rank for
  • Identify which pages are ranking highly for those keywords
  • Determine what qualities those pages possess
  • Create content that’s better than that

We like to call this 10x content. If you create a page on a keyword that is 10x better than the pages being shown in search results (for that keyword), Google will reward you for it, and better yet, you’ll naturally get people linking to it! Creating 10x content is hard work, but will pay dividends in organic traffic.

Just remember, there’s no magic number when it comes to words on a page. What we should be aiming for is whatever sufficiently satisfies user intent. Some queries can be answered thoroughly and accurately in 300 words while others might require 1,000 words!

Pro tip: Don’t reinvent the wheel!
If you already have content on your website, save yourself time by evaluating which of those pages are already bringing in good amounts of organic traffic and converting well. Refurbish that content on different platforms to help get more visibility to your site. On the other side of the coin, evaluate what existing content isn’t performing as well and adjust it, rather than starting from square one with all new content.

NAP: A note for local businesses

If you’re a business that makes in-person contact with your customers, be sure to include your business name, address, and phone number (NAP) prominently, accurately, and consistently throughout your site’s content. This information is often displayed in the footer or header of a local business website, as well as on any “contact us” pages. You’ll also want to mark up this information using local business schema. Schema and structured data are discussed more at length in the “Code” section of this chapter.

If you are a multi-location business, it’s best to build unique, optimized pages for each location. For example, a business that has locations in Seattle, Tacoma, and Bellevue should consider having a page for each:

example.com/seattle
example.com/tacoma
example.com/bellevue

Each page should be uniquely optimized for that location, so the Seattle page would have unique content discussing the Seattle location, list the Seattle NAP, and even testimonials specifically from Seattle customers. If there are dozens, hundreds, or even thousands of locations, a store locator widget could be employed to help you scale.

Hope you still have some energy left after handling the difficult-yet-rewarding task of putting together a page that is 10x better than your competitors’ pages, because there are just a few more things needed before your page is complete! In the next sections, we’ll talk about the other on-page optimizations your pages need, as well as naming and organizing your content.

Beyond content: Other optimizations your pages need

Can I just bump up the font size to create paragraph headings?

How can I control what title and description show up for my page in search results?

After reading this section, you’ll understand other important on-page elements that help search engines understand the 10x content you just created, so let’s dive in!

Header tags

Header tags are an HTML element used to designate headings on your page. The main header tag, called an H1, is typically reserved for the title of the page. It looks like this:

 <h1>Page Title</h1>

There are also sub-headings that go from H2 (<h2>) to H6 (<h6>) tags, although using all of these on a page is not required. The hierarchy of header tags goes from H1 to H6 in descending order of importance.

Each page should have a unique H1 that describes the main topic of the page; this is often automatically created from the title of a page. As the main descriptive title of the page, the H1 should contain that page's primary keyword or phrase. You should avoid using header tags to mark up non-heading elements, such as navigational buttons and phone numbers. Use header tags to introduce what the following content will discuss.

Take this page about touring Copenhagen, for example:

<h1>Copenhagen Travel Guide</h1>
<h2>Copenhagen by the Seasons</h2>
<h3>Visiting in Winter</h3>
<h3>Visiting in Spring</h3>

The main topic of the page is introduced in the main <h1> heading, and each additional heading is used to introduce a new sub-topic. In this example, the <h2> is more specific than the <h1>, and the <h3> tags are more specific than the <h2>. This is just an example of a structure you could use.

Although what you choose to put in your header tags can be used by search engines to evaluate and rank your page, it’s important to avoid inflating their importance. Header tags are one among many on-page SEO factors, and typically would not move the needle like quality backlinks and content would, so focus on your site visitors when crafting your headings.

Internal links

In Chapter 2, we discussed the importance of having a crawlable website. Part of a website’s crawlability lies in its internal linking structure. When you link to other pages on your website, you ensure that search engine crawlers can find all your site’s pages, you pass link equity (ranking power) to other pages on your site, and you help visitors navigate your site.

The importance of internal linking is well established, but there can be confusion over how this looks in practice.

Link accessibility

Links that require a click to view (like those tucked inside a navigation drop-down) are often hidden from search engine crawlers, so if the only links to internal pages on your website are through these types of links, you may have trouble getting those pages indexed. Opt instead for links that are directly accessible on the page.

Anchor text

Anchor text is the text with which you link to pages. Below, you can see an example of what a hyperlink without anchor text and a hyperlink with anchor text would look like in the HTML.

<a href="http://www.domain.com/"></a>
<a href="http://www.domain.com/" title="Keyword Text">Keyword Text</a>

On live view, that would look like this:

http://www.example.com/

Keyword Text

The anchor text sends signals to search engines regarding the content of the destination page. For example, if I link to a page on my site using the anchor text “learn SEO,” that’s a good indicator to search engines that the targeted page is one at which people can learn about SEO. Be careful not to overdo it, though. Too many internal links using the same, keyword-stuffed anchor text can appear to search engines that you’re trying to manipulate a page’s ranking. It’s best to make anchor text natural rather than formulaic.

Link volume

In Google’s General Webmaster Guidelines, they say to “limit the number of links on a page to a reasonable number (a few thousand at most).” This is part of Google’s technical guidelines, rather than the quality guideline section, so having too many internal links isn’t something that on its own is going to get you penalized, but it does affect how Google finds and evaluates your pages.

The more links on a page, the less equity each link can pass to its destination page. A page only has so much equity to go around.

Depiction of how link equity works between pages

So it’s safe to say that you should only link when you mean it! You can learn more about link equity from our SEO Learning Center.

Aside from passing authority between pages, a link is also a way to help users navigate to other pages on your site. This is a case where doing what’s best for search engines is also doing what’s best for searchers. Too many links not only dilute the authority of each link, but they can also be unhelpful and overwhelming. Consider how a searcher might feel landing on a page that looks like this:

Welcome to our gardening website! We have many articles on gardening, how to garden, and helpful tips on herbs, fruits, vegetables, perennials, and annuals. Learn more about gardening from our gardening blog.

Whew! Not only is that a lot of links to process, but it also reads pretty unnaturally and doesn’t contain much substance (which could be considered “thin content” by Google). Focus on quality and helping your users navigate your site, and you likely won’t have to worry about too many links.

Redirection

Removing and renaming pages is a common practice, but in the event that you do move a page, make sure to update the links to that old URL! At the very least, you should make sure to redirect the URL to its new location, but if possible, update all internal links to that URL at the source so that users and crawlers don’t have to pass through redirects to arrive at the destination page. If you choose to redirect only, be careful to avoid redirect chains that are too long (Google says, “Avoid chaining redirects… keep the number of redirects in the chain low, ideally no more than 3 and fewer than 5.”)

Example of a redirect chain:

(original location of content) example.com/location1 >> example.com/location2 >> (current location of content) example.com/location3

Better:

example.com/location1 >> example.com/location3

Image optimization

Images are the biggest culprits of slow web pages! The best way to solve for this is to compress your images. While there is no one-size-fits-all approach to image compression, the way to go is to test various options like "save for web," different image sizes, and compression tools like Optimizilla or ImageOptim for Mac (or Windows alternatives), then evaluate what works best.

Another way to help optimize your images (and improve your page speed) is by choosing the right image format.

How to choose which image format to use:

Flowchart for how to choose image formats. Source: Google's image optimization guide.

Choosing image formats:

  • If your image requires animation, use a GIF.
  • If you don’t need to preserve high image resolution, use JPEG (and test out different compression settings).
  • If you do need to preserve high image resolution, use PNG.
    • If your image has a lot of colors, use PNG-24.
    • If your image doesn’t have a lot of colors, use PNG-8.

There are ways to keep visitors on a slower-loading page by showing a colored box or a very blurry, low-resolution version of an image while the full version renders, helping visitors feel as if things are loading faster. We will discuss these options in more detail in Chapter 5.

Pro tip: Don’t forget about thumbnails!
Thumbnails (especially for E-Commerce sites) can be a huge page speed slow down. Optimize thumbnails properly to avoid slow pages and to help retain more qualified visitors.

Alt text

Alt text (alternative text) within images is a principle of web accessibility, and is used to describe images to the visually impaired via screen readers. It’s important to have alt text descriptions so that any visually impaired person can understand what the pictures on your website depict.

Search engine bots also crawl alt text to better understand your images, which gives you the added benefit of providing better image context to search engines. Just ensure that your alt descriptions read naturally for people, and avoid stuffing keywords for search engines.

Bad:

<img src="grumpycat.gif" alt="grumpy cat, cat is grumpy, grumpy cat gif">

Good:

<img src="grumpycat.gif" alt="A black cat looking very grumpy at a big spotted dog">

Submit an image sitemap

To ensure that Google can crawl and index your images, submit an image sitemap in your Google Search Console account. This helps Google discover images they may have otherwise missed.

Formatting for readability & featured snippets

Your page could contain the best content ever written on a subject, but if it’s formatted improperly, your audience might never read it! While we can never guarantee that visitors will read our content, there are some principles that can promote readability, including:

  • Text size and color – Avoid fonts that are too tiny. Google recommends 16+px font to minimize the need for “pinching and zooming” on mobile. The text color in relation to the page’s background color should also promote readability. Additional information on text can be found in the website accessibility guidelines. (Google’s web accessibility fundamentals).
  • Headings – Breaking up your content with helpful headings can help readers navigate the page. This is especially useful on long pages where a reader might be looking only for information from a particular section.
  • Bullet points – Great for lists, bullet points can help readers skim and more quickly find the information they need.
  • Paragraph breaks – Avoiding walls of text can help prevent page abandonment and encourage site visitors to read more of your page.
  • Supporting media – When appropriate, include images, videos, and widgets that would complement your content.
  • Bold and italics for emphasis – Putting words in bold or italics can add emphasis, so they should be the exception, not the rule. Appropriate use of these formatting options can call out important points you want to communicate.

Formatting can also affect your page’s ability to show up in featured snippets, those “position 0” results that appear above the rest of organic results.

Screenshot of a featured snippet

There is no special code that you can add to your page to show up here, nor can you pay for this placement, but taking note of the query intent can help you better structure your content for featured snippets. For example, if you’re trying to rank for “cake vs. pie,” it might make sense to include a table in your content, with the benefits of cake in one column and the benefits of pie in the other. Or if you’re trying to rank for “best restaurants to try in Portland,” that could indicate Google wants a list, so formatting your content in bullets could help.
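For the "cake vs. pie" example, a sketch of what that comparison table might look like in your HTML (the content is illustrative):

<table>
  <tr><th>Cake</th><th>Pie</th></tr>
  <tr><td>Layered, frosted, easy to decorate</td><td>Flaky crust with fruit or custard filling</td></tr>
</table>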

Title tags

A page’s title tag is a descriptive, HTML element that specifies the title of a particular web page. They are nested within the head tag of each page and look like this:

<head>
  <title>Example Title</title>
</head>

Each page on your website should have a unique, descriptive title tag. What you input into your title tag field will show up here in search results, although in some cases Google may adjust how your title tag appears in search results.

Screenshot with the page title highlighted in the SERPs

It can also show up in web browsers…

Screenshot of a page title in a browser window

Or when you share the link to your page on certain external websites…

Screenshot of a page title shared on an external website

Your title tag has a big role to play in people’s first impression of your website, and it’s an incredibly effective tool for drawing searchers to your page over any other result on the SERP. The more compelling your title tag, combined with high rankings in search results, the more visitors you’ll attract to your website. This underscores that SEO is not only about search engines, but rather the entire user experience.

What makes an effective title tag?

  • Keyword usage: Having your target keyword in the title can help both users and search engines understand what your page is about. Also, the closer to the front of the title tag your keywords are, the more likely a user will be to read them (and hopefully click) and the more helpful they can be for ranking.
  • Length: On average, search engines display the first 50–60 characters (~512 pixels) of a title tag in search results. If your title tag exceeds the characters allowed on that SERP, an ellipsis “…” will appear where the title was cut off. While sticking to 50–60 characters is safe, never sacrifice quality for strict character counts. If you can’t get your title tag down to 60 characters without harming its readability, go longer (within reason).
  • Branding: At Moz, we love to end our title tags with a brand name mention because it promotes brand awareness and creates a higher click-through rate among people who are familiar with Moz. Sometimes it makes sense to place your brand at the beginning of the title tag, such as on your homepage, but be mindful of what you are trying to rank for and place those words closer toward the beginning of your title tag.

Meta descriptions

Like title tags, meta descriptions are HTML elements that describe the contents of the page that they’re on. They are also nested in the head tag, and look like this:

<head>
  <meta name="description" content="Description of page here."/>
</head>

What you input into the description field will show up here in search results:

In many cases though, Google will choose different snippets of text to display in search results, dependent upon the searcher’s query.

For example if you search “find backlinks,” Google will provide this meta description as it deems it more relevant to the specific search:

The meta description pulls each step from the page content and lists it out.

While the actual meta description is:

How to find backlinks? Step 1: Navigate to Link Explorer, a tool used to research the backlink profile of a website. It will show you the quality of backlinks using metrics like Domain Authority, Page Authority, and Spam Score. You can do a good amount of backlink research with the free version or pay to receive unlimited backlink data.

This behavior often improves the snippet shown for unique searches. However, don't let it deter you from writing a default page meta description — they're still extremely valuable.

What makes an effective meta description?

The qualities that make an effective title tag also apply to effective meta descriptions. Although Google says that meta descriptions are not a ranking factor, like title tags, they are incredibly important for click-through rate.

  • Relevance: Meta descriptions should be highly relevant to the content of your page, so it should summarize your key concept in some form. You should give the searcher enough information to know they’ve found a page relevant enough to answer their question, without giving away so much information that it eliminates the need to click through to your web page.
  • Length: Search engines tend to truncate meta descriptions to around 300 characters. It’s best to write meta descriptions between 150–300 characters in length. On some SERPs, you’ll notice that Google gives much more real estate to the descriptions of some pages. This usually happens for web pages ranking right below a featured snippet.

URL structure: Naming and organizing your pages

URL stands for Uniform Resource Locator. URLs are the locations or addresses for individual pieces of content on the web. Like title tags and meta descriptions, search engines display URLs on the SERPs, so URL naming and format can impact click-through rates. Not only do searchers use them to make decisions about which web pages to click on, but URLs are also used by search engines in evaluating and ranking pages.

Clear page naming

Search engines require unique URLs for each page on your website so they can display your pages in search results, but clear URL structure and naming is also helpful for people who are trying to understand what a specific URL is about. For example, which URL is clearer?

example.com/desserts/chocolate-pie

OR

example.com/asdf/453?=recipe-23432-1123

Searchers are more likely to click on URLs that reinforce and clarify what information is contained on that page, and less likely to click on URLs that confuse them.

Page organization

If you discuss multiple topics on your website, you should also make sure to avoid nesting pages under irrelevant folders. For example:

example.com/commercial-litigation/alimony

It would have been better for this fictional multi-practice law firm website to nest alimony under “/family-law/” than to host it under the irrelevant “/commercial-litigation/” section of the website.

The folders in which you locate your content can also send signals about the type, not just the topic, of your content. For example, dated URLs can indicate time-sensitive content. While appropriate for news-based websites, dated URLs for evergreen content can actually turn searchers away because the information seems outdated. For example:

example.com/2015/april/what-is-seo/

vs.

example.com/what-is-seo/

Since the topic “What is SEO?” isn’t confined to a specific date, it’s best to host it on a non-dated URL structure, or else risk your information appearing stale.

As you can see, what you name your pages, and in what folders you choose to organize your pages, is an important way to clarify the topic of your page to users and search engines.

URL length

While it is not necessary to have a completely flat URL structure, many click-through rate studies indicate that, when given the choice between a longer URL and a shorter URL, searchers often prefer shorter URLs. Like title tags and meta descriptions that are too long, too-long URLs will also be cut off with an ellipsis. Just remember, having a descriptive URL is just as important, so don’t cut down on URL length if it means sacrificing the URL’s descriptiveness.

example.com/services/plumbing/plumbing-repair/toilets/leaks/

vs.

example.com/plumbing-repair/toilets/

Minimizing length, both by including fewer words in your page names and removing unnecessary subfolders, makes your URLs easier to copy and paste, as well as more clickable.

Keywords in URL

If your page is targeting a specific term or phrase, make sure to include it in the URL. However, don’t go overboard by trying to stuff in multiple keywords for purely SEO purposes. It’s also important to watch out for repeat keywords in different subfolders. For example, you may have naturally incorporated a keyword into a page name, but if located within other folders that are also optimized with that keyword, the URL could begin to appear keyword-stuffed.

Example:

example.com/seattle-dentist/dental-services/dental-crowns/

Keyword overuse in URLs can appear spammy and manipulative. If you aren’t sure whether your keyword usage is too aggressive, just read your URL through the eyes of a searcher and ask, “Does this look natural? Would I click on this?”

Static URLs

The best URLs are those that can easily be read by humans, so you should avoid the overuse of parameters, numbers, and symbols. Using technologies like mod_rewrite for Apache and ISAPI_rewrite for Microsoft, you can easily transform dynamic URLs like this:

http://moz.com/blog?id=123

into a more readable static version like this:

https://moz.com/google-algorithm-change
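
As an illustrative sketch (the rule is hypothetical, not Moz’s actual configuration), the Apache .htaccess rewrite behind a change like that might look like this:

# Enable Apache's rewrite engine
RewriteEngine On
# Serve the readable URL by internally mapping it to the dynamic one
RewriteRule ^google-algorithm-change/?$ /blog?id=123 [L]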

Hyphens for word separation

Not all web applications accurately interpret separators like underscores (_), plus signs (+), or spaces (%20). Search engines also do not understand how to separate words in URLs when they run together without a separator (example.com/optimizefeaturedsnippets/). Instead, use the hyphen character (-) to separate words in a URL (example.com/optimize-featured-snippets/).

Geographic modifiers in URLs

Some local business owners omit geographic terms that describe their physical location or service area because they believe that search engines can figure this out on their own. On the contrary, it’s vital that local business websites’ content, URLs, and other on-page assets make specific mention of city names, neighborhood names, and other regional descriptors. Let both consumers and search engines know exactly where you are and where you serve, rather than relying on your physical location alone.

Protocols: HTTP vs. HTTPS

A protocol is that “http” or “https” preceding your domain name. Google recommends that all websites have a secure protocol (the “s” in “https” stands for “secure”). To ensure that your URLs are using the https:// protocol instead of http://, you must obtain an SSL (Secure Sockets Layer) certificate. SSL certificates are used to encrypt data. They ensure that any data passed between the web server and browser of the searcher remains private. As of July 2018, Google Chrome displays “not secure” for all HTTP sites, which could cause these sites to appear untrustworthy to visitors and result in them leaving the site.
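
Once a certificate is installed, you’ll also want every http:// request forwarded to its https:// equivalent. Here’s a minimal Apache .htaccess sketch, assuming mod_rewrite is available on your server:

RewriteEngine On
# If the request did not arrive over HTTPS...
RewriteCond %{HTTPS} off
# ...issue a permanent (301) redirect to the HTTPS version of the same URL
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]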


If you’ve made it this far, congratulations on surpassing the halfway point of the Beginner’s Guide to SEO! So far, we’ve learned how search engines crawl, index, and rank content, how to find keyword opportunities to target, and now, you know the on-page optimization strategies that can help your pages get found. Next, buckle up, because we’ll be diving into the exciting world of technical SEO!


Rewriting the Beginner’s Guide to SEO, Chapter 3: Keyword Research

Posted by BritneyMuller

Welcome to the draft of Chapter Three of the new and improved Beginner’s Guide to SEO! So far you’ve been generous and energizing with your feedback for our outline, Chapter One, and Chapter Two. We’re asking for a little more of your time as we debut our third chapter on keyword research. Please let us know what you think in the comments!


Chapter 3: Keyword Research

Understand what your audience wants to find.

Now that you’ve learned how to show up in search results, let’s determine which strategic keywords to target in your website’s content, and how to craft that content to satisfy both users and search engines.

The power of keyword research lies in better understanding your target market and how they are searching for your content, services, or products.

Keyword research provides you with specific search data that can help you answer questions like:

  • What are people searching for?
  • How many people are searching for it?
  • In what format do they want that information?

In this chapter, you’ll get tools and strategies for uncovering that information, as well as learn tactics that’ll help you avoid keyword research foibles and build strong content. Once you uncover how your target audience is searching for your content, you begin to uncover a whole new world of strategic SEO!

What terms are people searching for?

You may know what you do, but how do people search for the product, service, or information you provide? Answering this question is a crucial first step in the keyword research process.

Discovering keywords

You likely have a few keywords in mind that you would like to rank for. These will be things like your products, services, or other topics your website addresses, and they are great seed keywords for your research, so start there! You can enter those keywords into a keyword research tool to discover average monthly search volume and similar keywords. We’ll get into search volume in greater depth in the next section, but during the discovery phase, it can help you determine which variations of your keywords are most popular amongst searchers.

Once you enter your seed keywords into a keyword research tool, you will begin to discover other keywords, common questions, and topics for your content that you might have otherwise missed.

Let’s use the example of a florist that specializes in weddings.

Typing “wedding” and “florist” into a keyword research tool, you may discover highly relevant, highly searched for related terms such as:

  • Wedding bouquets
  • Bridal flowers
  • Wedding flower shop

In the process of discovering relevant keywords for your content, you will likely notice that the search volume of those keywords varies greatly. While you definitely want to target terms that your audience is searching for, in some cases, it may be more advantageous to target terms with lower search volume because they’re far less competitive.

Since both high- and low-competition keywords can be advantageous for your website, learning more about search volume can help you prioritize keywords and pick the ones that will give your website the biggest strategic advantage.

Pro tip: Diversify!

It’s important to note that entire websites don’t rank for keywords, pages do. With big brands, we often see the homepage ranking for many keywords, but for most websites, this isn’t usually the case. Many websites receive more organic traffic to pages other than the homepage, which is why it’s so important to diversify your website’s pages by optimizing each for uniquely valuable keywords.

How often are those terms searched?

Uncovering search volume

The higher the search volume for a given keyword or keyword phrase, the more work is typically required to achieve higher rankings. This is often referred to as keyword difficulty and occasionally incorporates SERP features; for example, if many SERP features (like featured snippets, knowledge graph, carousels, etc.) are clogging up a keyword’s result page, difficulty will increase. Big brands often take up the top 10 results for high-volume keywords, so if you’re just starting out on the web and going after the same keywords, the uphill battle for ranking can take years of effort.

Typically, the higher the search volume, the greater the competition and effort required to achieve organic ranking success. Go too low, though, and you risk not drawing any searchers to your site. In many cases, it may be most advantageous to target highly specific, lower competition search terms. In SEO, we call those long-tail keywords.

Understanding the long tail

It would be great to rank #1 for the keyword “shoes”… or would it?

It’s wonderful to deal with keywords that have 50,000 searches a month, or even 5,000 searches a month, but in reality, these popular search terms only make up a fraction of all searches performed on the web. In fact, keywords with very high search volumes may even indicate ambiguous intent; targeting these terms could put you at risk of drawing visitors to your site whose goals don’t match the content your page provides.

Does the searcher want to know the nutritional value of pizza? Order a pizza? Find a restaurant to take their family? Google doesn’t know, so it offers SERP features to help searchers refine their query. Targeting “pizza” means that you’re likely casting too wide a net.

The remaining 75% of searches lie in the “chunky middle” and “long tail” of search.

Don’t underestimate these less popular keywords. Long-tail keywords with lower search volume often convert better, because searchers are more specific and intentional in their searches. For example, a person searching for “shoes” is probably just browsing, whereas someone searching for “best price red womens size 7 running shoe” practically has their wallet out!

Pro tip: Questions are SEO gold!

Discovering what questions people are asking in your space, and adding those questions and their answers to an FAQ page, can yield incredible organic traffic for your website.

Getting strategic with search volume

Now that you’ve discovered relevant search terms for your site and their corresponding search volumes, you can get even more strategic by looking at your competitors and figuring out how searches might differ by season or location.

Keywords by competitor

You’ll likely compile a lot of keywords. How do you know which to tackle first? It could be a good idea to prioritize high-volume keywords that your competitors are not currently ranking for. On the flip side, you could also see which keywords from your list your competitors are already ranking for and prioritize those. The former is great when you want to take advantage of your competitors’ missed opportunities, while the latter is an aggressive strategy that sets you up to compete for keywords your competitors are already performing well for.

Keywords by season

Knowing about seasonal trends can be advantageous in setting a content strategy. For example, if you know that “christmas box” starts to spike in October through December in the United Kingdom, you can prepare content months in advance and give it a big push around those months.

Keywords by region

You can more strategically target a specific location by narrowing down your keyword research to specific towns, counties, or states in the Google Keyword Planner, or evaluate “interest by subregion” in Google Trends. Geo-specific research can help make your content more relevant to your target audience. For example, you might find out that in Texas, the preferred term for a large truck is “big rig,” while in New York, “tractor trailer” is the preferred terminology.

Which format best suits the searcher’s intent?

In Chapter 2, we learned about SERP features. That background is going to help us understand how searchers want to consume information for a particular keyword. The format in which Google chooses to display search results depends on intent, and every query has a unique one. While there are thousands of possible search types, there are five major categories to be aware of:

1. Informational queries: The searcher needs information, such as the name of a band or the height of the Empire State Building.

2. Navigational queries: The searcher wants to go to a particular place on the Internet, such as Facebook or the homepage of the NFL.

3. Transactional queries: The searcher wants to do something, such as buy a plane ticket or listen to a song.

4. Commercial investigation: The searcher wants to compare products and find the best one for their specific needs.

5. Local queries: The searcher wants to find something locally, such as a nearby coffee shop, doctor, or music venue.

An important step in the keyword research process is surveying the SERP landscape for the keyword you want to target in order to get a better gauge of searcher intent. If you want to know what type of content your target audience wants, look to the SERPs!

Google has closely evaluated the behavior of trillions of searches in an attempt to provide the most desired content for each specific keyword search.

Take the search “dresses,” for example:

From the shopping carousel, you can infer that Google has determined that many people who search for “dresses” want to shop for dresses online.

There is also a Local Pack feature for this keyword, indicating Google’s desire to help searchers who may be looking for local dress retailers.

If the query is ambiguous, Google will also sometimes include a “refine by” feature to help searchers further specify what they’re looking for. By doing so, the search engine can provide results that better help the searcher accomplish their task.

Google has a wide array of result types it can serve up depending on the query, so if you’re going to target a keyword, look to the SERP to understand what type of content you need to create.

Tools for determining the value of a keyword

How much value would a keyword add to your website? These tools can help you answer that question, so they’d make great additions to your keyword research arsenal:

  • Moz Keyword Explorer – Our own Moz Keyword Explorer tool extracts accurate search volume data, keyword difficulty, and keyword opportunity metrics by using live clickstream data. To learn more about how we’re producing our keyword data, check out Announcing Keyword Explorer.
  • Google Keyword Planner – Google’s AdWords Keyword Planner has historically been the most common starting point for SEO keyword research. However, Keyword Planner does restrict search volume data by lumping keywords together into large search volume range buckets. To learn more, check out Google Keyword Planner’s Dirty Secrets.
  • Google Trends – Google’s keyword trend tool is great for finding seasonal keyword fluctuations. For example, “funny halloween costume ideas” will peak in the weeks before Halloween.
  • AnswerThePublic – This free tool populates commonly searched for questions around a specific keyword. Bonus! You can use this tool in tandem with another free tool, Keywords Everywhere, to prioritize ATP’s suggestions by search volume.
  • SpyFu Keyword Research Tool – Provides some really neat competitive keyword data.

Download our free keyword research template!

Keyword research can yield a ton of data. Stay organized by downloading our free keyword research template. You can customize the template to fit your unique needs (ex: remove the “Seasonal Trends” column), sort keywords by volume, and categorize by Priority Score. Happy keyword researching!

Now that you know how to uncover what your target audience is searching for and how often, it’s time to move onto the next step: crafting pages in a way that users will love and search engines can understand.


Rewriting the Beginner’s Guide to SEO, Chapter 2: Crawling, Indexing, and Ranking

Posted by BritneyMuller

It’s been a few months since our last share of our work-in-progress rewrite of the Beginner’s Guide to SEO, but after a brief hiatus, we’re back to share our draft of Chapter Two with you! This wouldn’t have been possible without the help of Kameron Jenkins, who has thoughtfully contributed her great talent for wordsmithing throughout this piece.

This is your resource, the guide that likely kicked off your interest in and knowledge of SEO, and we want to do right by you. You left amazingly helpful commentary on our outline and draft of Chapter One, and we’d be honored if you would take the time to let us know what you think of Chapter Two in the comments below.


Chapter 2: How Search Engines Work – Crawling, Indexing, and Ranking

First, show up.

As we mentioned in Chapter 1, search engines are answer machines. They exist to discover, understand, and organize the internet’s content in order to offer the most relevant results to the questions searchers are asking.

In order to show up in search results, your content needs to first be visible to search engines. It’s arguably the most important piece of the SEO puzzle: If your site can’t be found, there’s no way you’ll ever show up in the SERPs (Search Engine Results Pages).

How do search engines work?

Search engines have three primary functions:

  1. Crawl: Scour the Internet for content, looking over the code/content for each URL they find.
  2. Index: Store and organize the content found during the crawling process. Once a page is in the index, it’s in the running to be displayed as a result to relevant queries.
  3. Rank: Provide the pieces of content that will best answer a searcher’s query. Order the search results by the most helpful to a particular query.

What is search engine crawling?

Crawling is the discovery process in which search engines send out a team of robots (known as crawlers or spiders) to find new and updated content. Content can vary — it could be a webpage, an image, a video, a PDF, etc. — but regardless of the format, content is discovered by links.

The bot starts out by fetching a few web pages, and then follows the links on those webpages to find new URLs. By hopping along this path of links, crawlers are able to find new content and add it to their index — a massive database of discovered URLs — to later be retrieved when a searcher is seeking information that the content on that URL is a good match for.

What is a search engine index?

Search engines process and store information they find in an index, a huge database of all the content they’ve discovered and deem good enough to serve up to searchers.

Search engine ranking

When someone performs a search, search engines scour their index for highly relevant content and then order that content in the hopes of solving the searcher’s query. This ordering of search results by relevance is known as ranking. In general, you can assume that the higher a website is ranked, the more relevant the search engine believes that site is to the query.

It’s possible to block search engine crawlers from part or all of your site, or instruct search engines to avoid storing certain pages in their index. While there can be reasons for doing this, if you want your content found by searchers, you have to first make sure it’s accessible to crawlers and is indexable. Otherwise, it’s as good as invisible.

By the end of this chapter, you’ll have the context you need to work with the search engine, rather than against it!

Note: In SEO, not all search engines are equal

Many beginners wonder about the relative importance of particular search engines. Most people know that Google has the largest market share, but how important is it to optimize for Bing, Yahoo, and others? The truth is that despite the existence of more than 30 major web search engines, the SEO community really only pays attention to Google. Why? The short answer is that Google is where the vast majority of people search the web. If we include Google Images, Google Maps, and YouTube (a Google property), more than 90% of web searches happen on Google — that’s nearly 20 times Bing and Yahoo combined.

Crawling: Can search engines find your site?

As you’ve just learned, making sure your site gets crawled and indexed is a prerequisite for showing up in the SERPs. First things first: You can check to see how many and which pages of your website have been indexed by Google using “site:yourdomain.com”, an advanced search operator.

Head to Google and type “site:yourdomain.com” into the search bar. This will return results Google has in its index for the site specified:

Screenshot of a Google “site:” search showing the approximate number of indexed results for a domain

The number of results Google displays (see “About __ results” above) isn’t exact, but it does give you a solid idea of which pages are indexed on your site and how they are currently showing up in search results.

For more accurate results, monitor and use the Index Coverage report in Google Search Console. You can sign up for a free Google Search Console account if you don’t currently have one. With this tool, you can submit sitemaps for your site and monitor how many submitted pages have actually been added to Google’s index, among other things.

If you’re not showing up anywhere in the search results, there are a few possible reasons why:

  • Your site is brand new and hasn’t been crawled yet.
  • Your site isn’t linked to from any external websites.
  • Your site’s navigation makes it hard for a robot to crawl it effectively.
  • Your site contains some basic code called crawler directives that is blocking search engines.
  • Your site has been penalized by Google for spammy tactics.

If your site doesn’t have any other sites linking to it, you still might be able to get it indexed by submitting your XML sitemap in Google Search Console or manually submitting individual URLs to Google. There’s no guarantee they’ll include a submitted URL in their index, but it’s worth a try!

Can search engines see your whole site?

Sometimes a search engine will be able to find parts of your site by crawling, but other pages or sections might be obscured for one reason or another. It’s important to make sure that search engines are able to discover all the content you want indexed, and not just your homepage.

Ask yourself this: Can the bot crawl through your website, and not just to it?

Is your content hidden behind login forms?

If you require users to log in, fill out forms, or answer surveys before accessing certain content, search engines won’t see those protected pages. A crawler is definitely not going to log in.

Are you relying on search forms?

Robots cannot use search forms. Some individuals believe that if they place a search box on their site, search engines will be able to find everything that their visitors search for, but a crawler can’t type queries into a search box, so any content reachable only through on-site search stays hidden.

Is text hidden within non-text content?

Non-text media forms (images, video, GIFs, etc.) should not be used to display text that you wish to be indexed. While search engines are getting better at recognizing images, there’s no guarantee they will be able to read and understand them just yet. It’s always best to add text within the <HTML> markup of your webpage.
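
As a minimal sketch (the filename and copy are hypothetical), keep the message in crawlable HTML text and give the image descriptive alt text as a backup:

<!-- The text lives in the HTML, not baked into the banner image -->
<img src="summer-sale-banner.jpg" alt="Summer sale: 20% off all running shoes"/>
<p>Summer sale: 20% off all running shoes through August 31.</p>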

Can search engines follow your site navigation?

Just as a crawler needs to discover your site via links from other sites, it needs a path of links on your own site to guide it from page to page. If you’ve got a page you want search engines to find but it isn’t linked to from any other pages, it’s as good as invisible. Many sites make the critical mistake of structuring their navigation in ways that are inaccessible to search engines, hindering their ability to get listed in search results.

Common navigation mistakes that can keep crawlers from seeing all of your site:

  • Having a mobile navigation that shows different results than your desktop navigation
  • Any type of navigation where the menu items are not in the HTML, such as JavaScript-enabled navigations. Google has gotten much better at crawling and understanding JavaScript, but it’s still not a perfect process. The more surefire way to ensure something gets found, understood, and indexed by Google is by putting it in the HTML.
  • Personalization, or showing unique navigation to a specific type of visitor versus others, could appear to be cloaking to a search engine crawler
  • Forgetting to link to a primary page on your website through your navigation — remember, links are the paths crawlers follow to new pages!

This is why it’s essential that your website has a clear navigation and helpful URL folder structures.

Information architecture

Information architecture is the practice of organizing and labeling content on a website to improve efficiency and findability for users. The best information architecture is intuitive, meaning that users shouldn’t have to think very hard to flow through your website or to find something.

Your site should also have a useful 404 (page not found) page for when a visitor clicks on a dead link or mistypes a URL. The best 404 pages allow users to click back into your site so they don’t bounce off just because they tried to access a nonexistent link.

Tell search engines how to crawl your site

In addition to making sure crawlers can reach your most important pages, it’s also pertinent to note that you’ll have pages on your site you don’t want them to find. These might include things like old URLs that have thin content, duplicate URLs (such as sort-and-filter parameters for e-commerce), special promo code pages, staging or test pages, and so on.

Blocking pages from search engines can also help crawlers prioritize your most important pages and maximize your crawl budget (the average number of pages a search engine bot will crawl on your site).

Crawler directives allow you to control what you want Googlebot to crawl and index using a robots.txt file, meta tag, sitemap.xml file, or Google Search Console.

Robots.txt

Robots.txt files are located in the root directory of websites (ex. yourdomain.com/robots.txt) and suggest which parts of your site search engines should and shouldn’t crawl via specific robots.txt directives. This is a great solution when trying to block search engines from non-private pages on your site.

You wouldn’t want to block private/sensitive pages from being crawled here because the file is easily accessible by users and bots.
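
For illustration, a minimal robots.txt sketch (the disallowed paths are hypothetical) might look like this:

# Applies to all crawlers
User-agent: *
# Suggest that these non-private sections not be crawled
Disallow: /staging/
Disallow: /promo-codes/
# Point crawlers to the sitemap
Sitemap: https://yourdomain.com/sitemap.xml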

Pro tip:

  • If Googlebot can’t find a robots.txt file for a site (40X HTTP status code), it proceeds to crawl the site.
  • If Googlebot finds a robots.txt file for a site (20X HTTP status code), it will usually abide by the suggestions and proceed to crawl the site.
  • If Googlebot finds neither a 20X nor a 40X HTTP status code (ex. a 501 server error), it can’t determine if you have a robots.txt file and won’t crawl your site.

Meta directives

The two types of meta directives are the meta robots tag (more commonly used) and the x-robots-tag. Each provides crawlers with stronger instructions on how to crawl and index a URL’s content.

The x-robots-tag provides more flexibility and functionality if you want to block search engines at scale because you can use regular expressions, block non-HTML files, and apply sitewide noindex tags.

These are the best options for blocking more sensitive*/private URLs from search engines.

*For very sensitive URLs, it is best practice to remove them from or require a secure login to view the pages.
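
For illustration, the meta robots tag goes in a page’s head, while the x-robots-tag is sent as an HTTP response header. A minimal sketch of each (the Apache snippet is hypothetical and assumes mod_headers is enabled):

<!-- Meta robots tag: keep this page out of the index and don't follow its links -->
<meta name="robots" content="noindex, nofollow"/>

# Apache config: apply a noindex header to every PDF on the site
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>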

WordPress Tip: In Dashboard > Settings > Reading, make sure the “Search Engine Visibility” box is not checked. When checked, this setting blocks search engines from your site via your robots.txt file!

Avoid these common pitfalls, and you’ll have clean, crawlable content that will allow bots easy access to your pages.


Sitemaps

A sitemap is just what it sounds like: a list of URLs on your site that crawlers can use to discover and index your content. One of the easiest ways to ensure Google is finding your highest priority pages is to create a file that meets Google’s standards and submit it through Google Search Console. While submitting a sitemap doesn’t replace the need for good site navigation, it can certainly help crawlers follow a path to all of your important pages.
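
A bare-bones sitemap following the sitemaps.org protocol looks something like this (the URL and date are hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/plumbing-repair/</loc>
    <lastmod>2018-07-01</lastmod>
  </url>
</urlset>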

Google Search Console

Some sites (most common with e-commerce) make the same content available on multiple different URLs by appending certain parameters to URLs. If you’ve ever shopped online, you’ve likely narrowed down your search via filters. For example, you may search for “shoes” on Amazon, and then refine your search by size, color, and style. Each time you refine, the URL changes slightly. How does Google know which version of the URL to serve to searchers? Google does a pretty good job at figuring out the representative URL on its own, but you can use the URL Parameters feature in Google Search Console to tell Google exactly how you want them to treat your pages.

Indexing: How do search engines understand and remember your site?

Once you’ve ensured your site has been crawled, the next order of business is to make sure it can be indexed. That’s right — just because your site can be discovered and crawled by a search engine doesn’t necessarily mean that it will be stored in their index. In the previous section on crawling, we discussed how search engines discover your web pages. The index is where your discovered pages are stored. After a crawler finds a page, the search engine renders it just like a browser would. In the process of doing so, the search engine analyzes that page’s contents. All of that information is stored in its index.

Read on to learn about how indexing works and how you can make sure your site makes it into this all-important database.

Can I see how a Googlebot crawler sees my pages?

Yes, the cached version of your page will reflect a snapshot of the last time Googlebot crawled it.

Google crawls and caches web pages at different frequencies. More established, well-known sites that post frequently like https://www.nytimes.com will be crawled more frequently than the much-less-famous website for Roger the Mozbot’s side hustle, http://www.rogerlovescupcakes.com (if only it were real…)

You can view what your cached version of a page looks like by clicking the drop-down arrow next to the URL in the SERP and choosing “Cached”:

You can also view the text-only version of your site to determine if your important content is being crawled and cached effectively.

Are pages ever removed from the index?

Yes, pages can be removed from the index! Some of the main reasons why a URL might be removed include:

  • The URL is returning a “not found” error (4XX) or server error (5XX) – This could be accidental (the page was moved and a 301 redirect was not set up) or intentional (the page was deleted and 404ed in order to get it removed from the index)
  • The URL had a noindex meta tag added – This tag can be added by site owners to instruct the search engine to omit the page from its index.
  • The URL has been manually penalized for violating the search engine’s Webmaster Guidelines and, as a result, was removed from the index.
  • The URL has been blocked from crawling with the addition of a password required before visitors can access the page.

If you believe that a page on your website that was previously in Google’s index is no longer showing up, you can manually submit the URL to Google by navigating to the “Submit URL” tool in Search Console.

Ranking: How do search engines rank URLs?

How do search engines ensure that when someone types a query into the search bar, they get relevant results in return? That process is known as ranking, or the ordering of search results by most relevant to least relevant to a particular query.

To determine relevance, search engines use algorithms, a process or formula by which stored information is retrieved and ordered in meaningful ways. These algorithms have gone through many changes over the years in order to improve the quality of search results. Google, for example, makes algorithm adjustments every day — some of these updates are minor quality tweaks, whereas others are core/broad algorithm updates deployed to tackle a specific issue, like Penguin to tackle link spam. Check out our Google Algorithm Change History for a list of both confirmed and unconfirmed Google updates going back to the year 2000.

Why does the algorithm change so often? Is Google just trying to keep us on our toes? While Google doesn’t always reveal specifics as to why they do what they do, we do know that Google’s aim when making algorithm adjustments is to improve overall search quality. That’s why, in response to algorithm update questions, Google will answer with something along the lines of: “We’re making quality updates all the time.” If your site suffered after an algorithm adjustment, compare it against Google’s Quality Guidelines and Search Quality Rater Guidelines; both are very telling in terms of what search engines want.

What do search engines want?

Search engines have always wanted the same thing: to provide useful answers to searchers’ questions in the most helpful formats. If that’s true, then why does it appear that SEO is different now than in years past?

Think about it in terms of someone learning a new language.

At first, their understanding of the language is very rudimentary — “See Spot Run.” Over time, their understanding starts to deepen, and they learn semantics — the meaning behind language and the relationship between words and phrases. Eventually, with enough practice, the student knows the language well enough to even understand nuance, and is able to provide answers to even vague or incomplete questions.

When search engines were just beginning to learn our language, it was much easier to game the system by using tricks and tactics that actually go against quality guidelines. Take keyword stuffing, for example. If you wanted to rank for a particular keyword like “funny jokes,” you might add the words “funny jokes” a bunch of times onto your page, and make it bold, in hopes of boosting your ranking for that term:

Welcome to funny jokes! We tell the funniest jokes in the world. Funny jokes are fun and crazy. Your funny joke awaits. Sit back and read funny jokes because funny jokes can make you happy and funnier. Some funny favorite funny jokes.

This tactic made for terrible user experiences, and instead of laughing at funny jokes, people were bombarded by annoying, hard-to-read text. It may have worked in the past, but this is never what search engines wanted.

The role links play in SEO

When we talk about links, we could mean two things. Backlinks or “inbound links” are links from other websites that point to your website, while internal links are links on your own site that point to your other pages (on the same site).

Links have historically played a big role in SEO. Very early on, search engines needed help figuring out which URLs were more trustworthy than others to help them determine how to rank search results. Calculating the number of links pointing to any given site helped them do this.

Backlinks work very similarly to real life WOM (Word-Of-Mouth) referrals. Let’s take a hypothetical coffee shop, Jenny’s Coffee, as an example:

  • Referrals from others = good sign of authority
    Example: Many different people have all told you that Jenny’s Coffee is the best in town
  • Referrals from yourself = biased, so not a good sign of authority
    Example: Jenny claims that Jenny’s Coffee is the best in town
  • Referrals from irrelevant or low-quality sources = not a good sign of authority and could even get you flagged for spam
    Example: Jenny paid to have people who have never visited her coffee shop tell others how good it is.
  • No referrals = unclear authority
    Example: Jenny’s Coffee might be good, but you’ve been unable to find anyone who has an opinion so you can’t be sure.

This is why PageRank was created. PageRank (part of Google’s core algorithm) is a link analysis algorithm named after one of Google’s founders, Larry Page. PageRank estimates the importance of a web page by measuring the quality and quantity of links pointing to it. The assumption is that the more relevant, important, and trustworthy a web page is, the more links it will have earned.

The more natural backlinks you have from high-authority (trusted) websites, the better your odds are to rank higher within search results.
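
For the curious, the classic formula from Page and Brin’s original paper captures this idea, where T1…Tn are the pages linking to page A, C(T) is the number of links going out from page T, and d is a damping factor (typically set around 0.85):

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

In plain terms: each page splits its own score across its outbound links, so a link from a selective, well-linked-to page passes along more value than one from a page that links out to everything.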

The role content plays in SEO

There would be no point to links if they didn’t direct searchers to something. That something is content! Content is more than just words; it’s anything meant to be consumed by searchers — there’s video content, image content, and of course, text. If search engines are answer machines, content is the means by which the engines deliver those answers.

Any time someone performs a search, there are thousands of possible results, so how do search engines decide which pages the searcher is going to find valuable? A big part of determining where your page will rank for a given query is how well the content on your page matches the query’s intent. In other words, does this page match the words that were searched and help fulfill the task the searcher was trying to accomplish?

Because of this focus on user satisfaction and task accomplishment, there are no strict benchmarks on how long your content should be, how many times it should contain a keyword, or what you put in your header tags. All those can play a role in how well a page performs in search, but the focus should be on the users who will be reading the content.

Today, with hundreds or even thousands of ranking signals, the top three have stayed fairly consistent: links to your website (which serve as third-party credibility signals), on-page content (quality content that fulfills a searcher’s intent), and RankBrain.

What is RankBrain?

RankBrain is the machine learning component of Google’s core algorithm. Machine learning is a computer program that continues to improve its predictions over time through new observations and training data. In other words, it’s always learning, and because it’s always learning, search results should be constantly improving.

For example, if RankBrain notices a lower ranking URL providing a better result to users than the higher ranking URLs, you can bet that RankBrain will adjust those results, moving the more relevant result higher and demoting the less relevant pages as a byproduct.

Like most things with the search engine, we don’t know exactly what comprises RankBrain, but apparently, neither do the folks at Google.

What does this mean for SEOs?

Because Google will continue leveraging RankBrain to promote the most relevant, helpful content, we need to focus on fulfilling searcher intent more than ever before. Provide the best possible information and experience for searchers who might land on your page, and you’ve taken a big first step to performing well in a RankBrain world.

Engagement metrics: correlation, causation, or both?

With Google rankings, engagement metrics are most likely part correlation and part causation.

When we say engagement metrics, we mean data that represents how searchers interact with your site from search results. This includes things like:

  • Clicks (visits from search)
  • Time on page (amount of time the visitor spent on a page before leaving it)
  • Bounce rate (the percentage of all website sessions where users viewed only one page)
  • Pogo-sticking (clicking on an organic result and then quickly returning to the SERP to choose another result)

Many tests, including Moz’s own ranking factor survey, have indicated that engagement metrics correlate with higher ranking, but causation has been hotly debated. Are good engagement metrics just indicative of highly ranked sites? Or are sites ranked highly because they possess good engagement metrics?

What Google has said

While they’ve never used the term “direct ranking signal,” Google has been clear that they absolutely use click data to modify the SERP for particular queries.

According to Google’s former Chief of Search Quality, Udi Manber:

“The ranking itself is affected by the click data. If we discover that, for a particular query, 80% of people click on #2 and only 10% click on #1, after a while we figure out probably #2 is the one people want, so we’ll switch it.”

Another comment from former Google engineer Edmond Lau corroborates this:

“It’s pretty clear that any reasonable search engine would use click data on their own results to feed back into ranking to improve the quality of search results. The actual mechanics of how click data is used is often proprietary, but Google makes it obvious that it uses click data with its patents on systems like rank-adjusted content items.”

Because Google needs to maintain and improve search quality, it seems inevitable that engagement metrics are more than correlation, but it would appear that Google falls short of calling engagement metrics a “ranking signal” because those metrics are used to improve search quality, and the rank of individual URLs is just a byproduct of that.

What tests have confirmed

Various tests have confirmed that Google will adjust SERP order in response to searcher engagement:

  • Rand Fishkin’s 2014 test resulted in a #7 result moving up to the #1 spot after getting around 200 people to click on the URL from the SERP. Interestingly, ranking improvement seemed to be isolated to the location of the people who visited the link. The rank position spiked in the US, where many participants were located, whereas it remained lower on the page in Google Canada, Google Australia, etc.
  • Larry Kim’s comparison of top pages and their average dwell time pre- and post-RankBrain seemed to indicate that the machine-learning component of Google’s algorithm demotes the rank position of pages that people don’t spend as much time on.
  • Darren Shaw’s testing has shown user behavior’s impact on local search and map pack results as well.

Since user engagement metrics are clearly used to adjust the SERPs for quality, and rank position changes as a byproduct, it’s safe to say that SEOs should optimize for engagement. Engagement doesn’t change the objective quality of your web page, but rather your value to searchers relative to other results for that query. That’s why, after no changes to your page or its backlinks, it could decline in rankings if searchers’ behavior indicates they like other pages better.

In terms of ranking web pages, engagement metrics act like a fact-checker. Objective factors such as links and content first rank the page, then engagement metrics help Google adjust if they didn’t get it right.

The evolution of search results

Back when search engines lacked a lot of the sophistication they have today, the term “10 blue links” was coined to describe the flat structure of the SERP. Any time a search was performed, Google would return a page with 10 organic results, each in the same format.

In this search landscape, holding the #1 spot was the holy grail of SEO. But then something happened. Google began adding results in new formats on their search result pages, called SERP features. Some of these SERP features include:

  • Paid advertisements
  • Featured snippets
  • People Also Ask boxes
  • Local (map) pack
  • Knowledge panel
  • Sitelinks

And Google is adding new ones all the time. It even experimented with “zero-result SERPs,” a phenomenon where only one result from the Knowledge Graph was displayed on the SERP with no results below it except for an option to “view more results.”

The addition of these features caused some initial panic for two main reasons. For one, many of these features caused organic results to be pushed down further on the SERP. Another byproduct is that fewer searchers are clicking on the organic results since more queries are being answered on the SERP itself.

So why would Google do this? It all goes back to the search experience. User behavior indicates that some queries are better satisfied by different content formats. Notice how the different types of SERP features match the different types of query intents.

Query intent → possible SERP feature triggered:

  • Informational → Featured Snippet
  • Informational with one answer → Knowledge Graph / Instant Answer
  • Local → Map Pack
  • Transactional → Shopping

We’ll talk more about intent in Chapter 3, but for now, it’s important to know that answers can be delivered to searchers in a wide array of formats, and how you structure your content can impact the format in which it appears in search.

Localized search

A search engine like Google has its own proprietary index of local business listings, from which it creates local search results.

If you are performing local SEO work for a business that has a physical location customers can visit (ex: dentist) or for a business that travels to visit their customers (ex: plumber), make sure that you claim, verify, and optimize a free Google My Business Listing.

When it comes to localized search results, Google uses three main factors to determine ranking:

  1. Relevance
  2. Distance
  3. Prominence

Relevance

Relevance is how well a local business matches what the searcher is looking for. To ensure that the business is doing everything it can to be relevant to searchers, make sure the business’ information is thoroughly and accurately filled out.

Distance

Google uses your geo-location to better serve you local results. Local search results are extremely sensitive to proximity, which refers to the location of the searcher and/or the location specified in the query (if the searcher included one).

Organic search results are sensitive to a searcher’s location, though seldom as pronounced as in local pack results.

Prominence

With prominence as a factor, Google is looking to reward businesses that are well-known in the real world. In addition to a business’ offline prominence, Google also looks to some online factors to determine local ranking, such as:

Reviews

The number of Google reviews a local business receives, and the sentiment of those reviews, have a notable impact on its ability to rank in local results.

Citations

A “business citation” or “business listing” is a web-based reference to a local business’ “NAP” (name, address, phone number) on a localized platform (Yelp, Acxiom, YP, Infogroup, Localeze, etc.).

Local rankings are influenced by the number and consistency of local business citations. Google pulls data from a wide variety of sources in continuously building its local business index. When Google finds multiple consistent references to a business’s name, location, and phone number, its “trust” in the validity of that data is strengthened, which in turn lets Google display the business with a higher degree of confidence. Google also uses information from other sources on the web, such as links and articles.

Check a local business’ citation accuracy here.

Organic ranking

SEO best practices also apply to local SEO, since Google also considers a website’s position in organic search results when determining local ranking.

In the next chapter, you’ll learn on-page best practices that will help Google and users better understand your content.

[Bonus!] Local engagement

Although Google doesn’t list engagement as a local ranking determiner, its role is only going to increase as time goes on. Google continues to enrich local results by incorporating real-world data like popular times to visit and average length of visits…

Screenshot of Google SERP result for a local business showing busy times of day

…and even provides searchers with the ability to ask the business questions!

Screenshot of the Questions & Answers portion of a local Google SERP result

Undoubtedly now more than ever before, local results are being influenced by real-world data: how searchers interact with and respond to local businesses, rather than purely static (and game-able) information like links and citations.

Since Google wants to deliver the best, most relevant local businesses to searchers, it makes perfect sense for them to use real-time engagement metrics to determine quality and relevance.


You don’t have to know the ins and outs of Google’s algorithm (that remains a mystery!), but by now you should have a great baseline knowledge of how the search engine finds, interprets, stores, and ranks content. Armed with that knowledge, let’s learn about choosing the keywords your content will target!


The Local SEO’s Guide to the Buy Local Phenomenon: A Competitive Advantage for Clients

Posted by MiriamEllis

Photo credit: Michelle Shirley

What if a single conversation with one of your small local business clients could spark activity that would lead to an increase in their YOY sales of more than 7%, as opposed to only 4% if you don’t have the conversation? What if this chat could triple the amount of spending that stays in their town, reduce pollution in their community, improve their neighbors’ health, and strengthen democracy?

What if the brass ring of content dev, link opportunities, consumer sentiment and realtime local inventory is just waiting for you to grab it, on a ride we just haven’t taken yet, in a setting we’re just not talking about?

Let’s travel a different road today, one that parallels our industry’s typical conversation about citations, reviews, markup, and Google My Business. As a 15-year sailor on the Local SEO ship, I love all this stuff, but, like you, I’m experiencing a merging of online goals with offline realities, a heightened awareness of how in-store is where local business successes are born and bred, before they become mirrored on the web.

At Moz, our SaaS tools serve businesses of every kind: Digital, bricks-and-mortar, SABs, enterprises, mid-market agencies, big brands, and bootstrappers. But today, I’m going to go as small and as local as possible, speaking directly to independently-owned local businesses and their marketers about the buy local/shop local/go local movement and what I’ve learned about its potential to deliver meaningful and far-reaching successes. Frankly, I think you’ll be as amazed as I’ve been.

At the very least, I hope reading this article will inspire you to have a conversation with your local business clients about what this growing phenomenon could do for them and for their communities. Successful clients, after all, are the very best kind to have.

What is the Buy Local movement all about?

What’s the big idea?

You’re familiar with the concept of there being power in numbers. A single independent business lacks the resources and clout to determine the local decisions and policies that affect it. Should Walmart or Target be invited to set up shop in town? Should the crumbling building on Main St. be renovated or demolished? Which safety and cultural services should be supported with funding? The family running the small grocery store has little say, but if they join together with the folks running the bakery, the community credit union, the animal shelter, and the bookstore … then they begin to have a stronger voice.

Who does this?

Buy Local programs formalize the process of independently-owned businesses joining together to educate their communities about the considerable benefits to nearly everyone of living in a thriving local economy. These efforts can be initiated by merchants, Chambers of Commerce, grassroots citizen groups, or others. They can be assisted and supported by non-profit organizations like the American Independent Business Alliance (AMIBA) and the Institute for Local Self-Reliance (ILSR).

What are the goals?

Through signage, educational events, media promotions, and other forms of marketing, most Buy Local campaigns share some or all of these goals:

  • Increase local wealth that recirculates within the community
  • Preserve local character
  • Build community
  • Create good jobs
  • Have a say in policy-making
  • Decrease environmental impacts
  • Support entrepreneurship
  • Improve diversity/variety
  • Compete with big businesses

Do Buy Local campaigns actually work?

Yes – research indicates that, if managed correctly, these programs yield a variety of benefits to both merchants and residents. Consider these findings:

1) Healthy YOY sales advantages

ILSR conducted a national survey of independent businesses to gauge YOY sales patterns. Respondents to the 2016 survey reported a healthy increase in sales across the board, but with one significant difference, which AMIBA sums up:

“Businesses in communities with a sustained grassroots “buy independent/buy local” campaign reported a strong 7.4% sales increase, nearly doubling the 4.2% gain for those in areas without such an alliance.”

2) Keeping spending local

The analysts at Civic Economics conducted surveys of 10 cities to gauge the local financial impacts of independents vs. chain retailers, yielding a series of graphics that illustrate the contrast.

While statistics vary from community to community, the overall pattern is one of significantly greater local recirculation of wealth in the independent vs. chain environment. These patterns can be put to good use by Buy Local campaigns with the goal of increasing community-sustaining wealth.

3) Keeping communities employed and safe

Few communities can safely afford the loss of jobs and tax revenue documented in a second Civic Economics study, which details the impacts of Americans’ Amazon habit, state by state and across the nation.

While the recent Supreme Court ruling allowing states to tax e-commerce models could improve some of these dire numbers, towns and cities with Buy Local alliances can speak plainly: lack of tax revenue that leads to lack of funding for emergency services like fire departments is simply unsafe and unsustainable. A study done a few years back found that two-thirds of volunteer firefighters in the US report that their departments are underfunded, with 86% of these heroic workers having to dip into their own pockets to buy supplies to keep their stations going. As I jot these statistics down, there is a runaway 10,000-acre wildfire burning a couple of hours north of me…

Meanwhile, Inc.com points out:

“According to the Bureau of Labor Statistics, since the end of the Great Recession, small businesses have created 62 percent of all net new private-sector jobs. Among those jobs, 66 percent were created by existing businesses, while 34 percent were generated through new establishments (adjusted for establishment closings and job losses)”.

When communities have Go Local-style business alliances, they are capitalizing on the ability to create jobs, increase sales, and build up tax revenue that could make a serious difference not just to local unemployment rates, but to local safety.

4) Shaping policy

In terms of empowering communities to shape policy, there are many anecdotes to choose from, but one of the most celebrated involves a landmark study conducted by the Austin Independent Business Alliance, which documented the community impacts of spending at the local book and music stores vs. a proposed Borders. Their findings were compelling enough to convince the city not to give a $2.1 million subsidy to the now-defunct corporation.

5) Improving the local environment

A single statistic here is incredibly eye-opening: according to the US Department of Transportation, shopping-related driving per household more than tripled between 1969 and 2009.

All you have to do is picture the centralized locations of main street businesses vs. big boxes on the outskirts of town to see how city planning has contributed to this stunning rise in time spent on the road. When residents can walk or bike to make daily purchases, the positive environmental impacts are obvious.

6) Improving residents’ health and well-being

A recent Cigna survey of 20,000 Americans found that nearly half of them always or sometimes feel lonely, lacking significant face-to-face interactions with others. Why does this matter? Because the American Psychological Association finds that you have a 50% lower chance of dying prematurely if you have quality social interactions.

There’s a reason author Jan Karon’s “Mitford” series, about life in a small town in North Carolina, has been a string of NY Times Best Sellers; readers and reviewers consistently say that they yearn to live someplace like this fictitious community with the slogan “Mitford takes care of its own”. In the novels, the lives of residents, independent merchants, and “outsiders” interweave, in good times and bad, creating a support network many Americans envy.

This societal setup must be a winner, as well as a bestseller, because the Cambridge Journal of Regions published a paper proposing that the concentration of small businesses in a given community can be equated with its levels of public health.

Beyond the theory that eating fresh and local is good for you, it turns out that knowing your farmer, your banker, your grocer could help you live longer.

7) Realizing big-picture goals

Speaking of memorable stories, this video from ILSR does a good job of detailing one view of the ultimate impacts independent business alliances can have on shaping community futures:

I interviewed author and AMIBA co-founder Jeff Milchen about the good things that can happen when independents join hands. He summed it up:

“The results really speak for themselves when you look at what the impact of public education for local alliances has been in terms of shifting culture. It’s a great investment for independent businesses to partner with other independents, to do things they can’t do individually. Forming these partnerships can help them compete with the online giants.”

Getting going with a Go Local campaign, the right way

If sharing some of the above with clients has made them receptive to further exploration of what involvement in an independent business alliance might do for them, here are the next steps to take:

  1. First, find out if a Go Local/Shop Local/Buy Local/Stay Local campaign already exists in the business’ community. If so, the client can join up.
  2. If not, contact AMIBA. The good folks there will know if other local business owners in the client’s community have already expressed interest in creating an alliance, and they can help connect the interested parties.
  3. I highly, highly recommend reading through AMIBA’s free primer, which covers just about everything you need to know about Go Local campaigns.
  4. Encourage the client to publicize their intent to create an alliance if none exists in their community. Write an op-ed for the local print news, post on social media, and talk to neighbors. This can prompt outreach from potential allies in the effort.
  5. A given group can determine to go it alone, but it may be better to rely on the past experience of others who have already created successful campaigns. AMIBA offers a variety of paid community training modules, including expert speakers, workshops, and on-site consultations. Each community can write in to request a quote for a training plan that will work best for them. The organization also offers a wealth of free educational materials on their website.
  6. According to AMIBA’s Jeff Milchen, a typical Buy Local campaign takes about 3-4 months to get going.

It’s important to know that Go Local campaigns can fail due to poor execution. Here is a roundup of practices all alliances should focus on to avoid the most common pitfalls:

  1. Codify the definition of a “local” business as being independently-owned-and-run, or else big chain inclusion will anger some members and cause them to leave.
  2. Emphasize all forms of local patronage; campaigns that stick too closely to words like “buy” or “shop” overlook the small banks, service area businesses, and other models that are an integral part of the independent local economy.
  3. Ensure diversity in leadership; an alliance that fails to reflect the resources of age, race, gender/identity, political views, economics and other factors may wind up perishing from narrow viewpoints. On a related note, AMIBA has been particularly active in advocating for business communities to rid themselves of bigotry. Strong communities welcome everyone.
  4. Do the math on what success looks like; education grounded in projected numbers of what campaigns can yield in concrete benefits for both merchants and residents is a major contributing factor to forging a strong alliance.
  5. Differentiate inventory and offerings so that independently-owned businesses offer something of added value which patrons can’t easily replicate online; this could be specialty local products, face-to-face time with expert staff, or other benefits.
  6. Take the high road in inspiring the community to increase local spending; campaigns should not rely on vilifying big and online businesses or asking for patronage out of pity. In other words, guilt-tripping locals because they do some of their shopping at Walmart or Amazon isn’t a good strategy. Even a 10% shift towards local spending can have positive impacts for a community!
  7. Clearly assess community resources; not every town, city, or district hosts the necessary mix of independent businesses to create a strong campaign. For example, approximately 2.2% of the US population live in “food deserts”, many miles from a grocery store. These areas may lack other local businesses, as well, and their communities may need to create grassroots campaigns surrounding neighborhood gardens, mobile markets, private investors and other creative solutions.

In sum, success significantly depends on having clear definitions, clear goals, diverse participants and a proud identity as independents, devoid of shaming tactics.

Circling back to the Web — our native heath!

So, let’s say that your incoming client is now participating in a Buy Local program. Awesome! Now, where do we go from here?

In speaking with Jeff Milchen, I asked what he has seen in terms of digital marketing being used to promote the businesses involved in Buy Local campaigns. He said that, while some alliances have workshops, it’s a work in progress and something he hopes to see grow in the future.

As a local SEO, you can make that future now for your fortunate clients. Here are some ways I see this working out beautifully:

Basic data distribution and consistency

Small local businesses can sometimes be unaware of inconsistent or absent local business listings because the owners are just so busy. The quickest way I know to demo this scenario is to plug the company name and zip into the free Moz Check Listing tool to show them how they’re doing on the majors. Correct data errors and fill in the blanks, either manually or with affordable software like Moz Local. You’ll also want to be sure the client has a presence on any geo- or industry-specific directories and platforms. It’s something your agency can really help with!
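
If you want to show a client what “inconsistent” actually looks like, the core comparison is simple enough to sketch by hand. Below is a minimal, hypothetical Python illustration of the kind of NAP (name, address, phone) check that listing tools automate. The platform records, the canonical data, and the normalize rule are all invented for the example; real tools add fuzzy matching, address parsing, and live data collection on top.

    # A minimal sketch of a NAP consistency check. All data here is
    # hypothetical; in practice you'd collect each record from the
    # directory's live page or API.

    def normalize(value: str) -> str:
        """Lowercase and strip non-alphanumerics so cosmetic differences don't count."""
        return "".join(ch for ch in value.lower() if ch.isalnum())

    # Hypothetical listing data as found on each platform.
    listings = {
        "Facebook": {"name": "Maple St. Bakery", "phone": "(555) 123-4567"},
        "Yelp": {"name": "Maple Street Bakery", "phone": "555-123-4567"},
        "YP": {"name": "Maple St Bakery", "phone": "(555) 123-4568"},
    }

    # The business's canonical NAP data, as confirmed by the owner.
    canonical = {"name": "Maple St Bakery", "phone": "(555) 123-4567"}

    for platform, record in listings.items():
        for field, expected in canonical.items():
            if normalize(record[field]) != normalize(expected):
                print(f"{platform}: {field} mismatch -> {record[field]!r}")

Running this flags the Yelp name variant and the wrong digit in the YP phone number, which is exactly the kind of quiet inconsistency a busy owner never notices.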

A hyperlocalized content powerhouse

Build proud content around the company’s involvement in the Buy Local program.

  • Write about all of the economic, environmental, and societal benefits residents can support by patronizing the business.
  • Motivated independents take time to know their customers. There are stories in this. Write about the customers and their needs. I’ve even seen independent restaurants naming menu items after beloved patrons. Get personal. Build community.
  • Don’t forget that even small towns can be powerful points of interest for tourists. Create a warm welcome for travelers, and for new neighbors, too!

Link building opportunities of a lifetime

Local business alliances form strong B2B bonds.

  • Find relationships with related businesses that can sprout links. For example, the caterer knows the wedding cake baker, who knows the professional seamstress, who knows the minister, who knows the DJ, who knows the florist.
  • Dive deep into opportunities for sponsoring local organizations, teams and events, hosting and participating in workshops and conferences, offering scholarships and special deals.
  • Make fast friends with local media. Be newsworthy.

A wellspring of sentiment

Independents form strong business-to-community bonds.

  • When a business really knows its customers, asking for online reviews is so much easier. In some communities, it may be necessary to teach customers how to leave reviews, but once you get a strategy going for this, the rest is gravy.
  • It’s also a natural fit for asking for written and video testimonials to be published on the company website.
  • Don’t forget the power of Word of Mouth Marketing, while you’re at it. Loyal patrons are an incredible asset.
  • The one drawback could be if your business model is of a sensitive nature; in tight-knit communities, residents may be more protective of their privacy.

Digitize inventory easily

30% of consumers say they’d buy from a local store instead of online if they knew the store was nearby (Google). Over half of consumers prefer to shop in-store to interact with products (Local Search Association). And over 63% of consumers would rather buy from a company they consider authentic than from the competition (BrightLocal).

It all adds up to the need for highly-authentic independently-owned businesses to have an online presence that signals to Internet users that they stock desired products. For many small, local brands, going full e-commerce on their website is simply too big of an implementation and management task. It’s a problem that’s dogged this particular business sector for years. And it’s why I got excited when the folks at AMIBA told me to check out Pointy.

Pointy offers a physical device that small business owners can attach to their barcode scanner to have their products ported to a Pointy-controlled webpage. But, that’s not all. Pointy integrates with the “See What’s In Store” inventory function of Google My Business Knowledge Panels. Check out Talbot’s Toyland in San Mateo, CA for a live example.

Pointy is a startup, but one exciting enough to have received angel investment from the founder of WordPress and the co-founder of Google Maps. It looks like a real winner to me, and it could provide a genuine answer for brick-and-mortar independents who have watched their sales stagger in the wake of Amazon and other big digital brands.
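
For independents who simply want to signal “we stock this” on their own websites without new hardware or a full e-commerce build, schema.org product markup is one generic, low-lift option. The Python sketch below is purely illustrative: the product and price are invented, and this is not Pointy’s output or a Google requirement. It just prints the kind of JSON-LD a product page can embed.

    # Illustrative only: generate schema.org Product/Offer markup that
    # signals in-store availability. Example data is invented.
    import json

    product_jsonld = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Igloo 9-Quart Cooler",  # hypothetical product
        "offers": {
            "@type": "Offer",
            "price": "24.99",
            "priceCurrency": "USD",
            # One of schema.org's ItemAvailability values.
            "availability": "https://schema.org/InStoreOnly",
        },
    }

    # A page would embed this inside <script type="application/ld+json"> tags.
    print(json.dumps(product_jsonld, indent=2))

It’s always wise to validate the output with a structured data testing tool before deploying it to a client’s site.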

Local SEOs have an important part to play

Satisfaction in work is a thing to be cherished. If the independent business movement speaks to you, bringing your local search marketing skills to these alliances and small brands could make more of your work days really good days.

The scenario could be an especially good fit for agencies that have specialized in city or state marketing. For example, one of our Moz Community members confines his projects to South Carolina. Imagine him taking it on the road a bit, hosting and attending workshops for towns across the state that are ready to revitalize main street. An energetic client roster could certainly result if someone like him could show local banks, grocery stores, retail shops and restaurants how to use the power of the local web!

Reading America

Our industry is living and working in complex times.

The bad news is that a recent Bush-Biden poll finds that 8 in 10 US residents are “somewhat” or “very” concerned about the state of democracy in our nation.

The not-so-bad news is that citizen ingenuity for discovering solutions and opportunities is still going strong. We need only look as far as the runaway success of the TV show “Fixer Upper”, which drew 5.21 million viewers in its fourth season, making it the second-largest telecast of Q2 that year. The show centered on the revitalization of dilapidated homes and businesses in and around Waco, Texas, and has turned the entire town into a major tourist destination, pulling in millions of annual visitors and landing book deals, a magazine, and the Magnolia Home furnishings line for its entrepreneurial hosts.

While not every town can (or would want to) experience what is being called the “Magnolia effect”, channels like HGTV and the DIY network are heavily capitalizing on the rebirth of American communities, and private citizens are taking matters into their own hands.

There’s the family who moved from Washington D.C. to Water Valley, Mississippi, bought part of the decaying main street and began to refurbish it. I found the video story of this completely riveting, and look at the Yelp reviews of the amazing grocery store and lunch counter these folks are operating now. The market carries local products, including hoop cheese and milk from the first dairy anyone had opened in 50 years in the state.

There are the half-dozen millennials who are helping turn New Providence, Iowa into a place young families can live and work again. There’s Corning, NY, Greensburg, KS, Colorado Springs, CO, and so many more places where people are eagerly looking to strengthen community sufficiency and sustainability.

Some marketing firms are visionary forerunners in this phenomenon, like Deluxe, which has sponsored the Small Business Revolution show, doing main street makeovers that are bringing towns back to life. There could be a place out there somewhere on the map of the country, just waiting for your agency to fill it.

The best news is that change is possible. A recent study in Science magazine states that the tipping point for a minority group to change a majority viewpoint is 25% of the population. This is welcome news at a time when 80% of citizens are feeling doubtful about the state of our democracy. There are 28 million small businesses in the United States – an astonishing potential educational force – if communities can be taught what a vote with their dollar can do in terms of giving them a voice. As Jeff Milchen told me:

“One of the most inspiring things is when we see local organizations helping residents to be more engaged in the future of their community. Most communities feel somewhat powerless. When you see towns realize they have the ability to shift public policy to support their own community, that’s empowering.”

Sometimes, the extremes of our industry can make our society and our democracy hard to read. On the one hand, the largest brands developing AI, checkout-less shopping, driverless cars, same-day delivery via robotics, and the gig economy win applause at conferences.

On the other hand, the public is increasingly hearing the stories of employees at these same companies who are protesting Microsoft’s development of face recognition for ICE, Google’s development of AI drone-footage analysis for the Pentagon, working conditions at Amazon warehouses that allegedly preclude bathroom breaks and have put people in the hospital, and the various outcomes of the “Walmart Effect”.

The Buy Local movement is poised in time at this interesting moment, in which our democracy gets to choose. Gigs or unions? Know your robot or know your farmer? Convenience or compassion? Is it either/or? Can it be both?

Both big and small brands have a major role to play in answering these timely questions and shaping the ethics of our economy. Big brands, after all, have tremendous resources for raising the bar for ethical business practices. Your agency likely wants to serve both types of clients, but it’s all to the good if all business sectors remember that the real choosers are the “consumers”, the everyday folks voting with their dollars.

I know that it can be hard to find good news sometimes. But I’m hoping what you’ve read today gifts you with a feeling of optimism that you can take to the office, take to your independently-owned local business clients, and maybe even help take to their communities. Spark a conversation today and you may stumble upon a meaningful competitive advantage for your agency and its most local customers.

Every year, local SEOs are delving deeper and deeper into the offline realities of the brands they serve, large and small. We’re learning so much, together. It’s sometimes a heartbreaker, but always an honor, being part of this local journey.


Getting Real with Retail: An Agency’s Guide to Inspiring In-Store Excellence

Posted by MiriamEllis

A screenshot of a negative 1-star review citing poor customer service

No marketing agency staffer feels good when they see a retail client getting reviews like this on the web.

But we can find out why they’re happening, and if we’re going above and beyond in our work, we just might be able to catalyze a turnaround, provided we’re committed to being honest with clients and have an actionable strategy for their in-store improvements.

In this post, I’ll highlight some advice from an internal letter at Tesla that I feel is highly applicable to the retail sector. I’d also like to help your agency combat the retail blues headlining the news these days, with big brands downsizing, liquidating, and closing up shop. To that end, I’m going to share a printable infographic with statistics that are almost guaranteed to generate the client positivity so essential to making real change. And, for some further inspiration, I’d like to offer a couple of anecdotes involving an Igloo cooler, a monk, reindeer moss, and reviews.

The genuine pain of retail gone wrong: The elusive cooler, “Corporate,” and the man who could hardly stand

“Hi there,” I greeted the staffer at the customer service counter of the big department store. “Where would I find a small cooler?”

“We don’t have any,” he mumbled.

“You don’t have any coolers? Like, an Igloo cooler to take on a picnic to keep things cold?”

“Maybe over there,” he waved his hand in unconcern.

And I stood there for a minute, expecting him to actually figure this out for me, maybe even guide me to the appropriate aisle, or ask a manager to assist my transaction, if necessary. But in his silence, I walked away.

“Hi there,” I tried with more specificity at the locally owned general store the next day. “Where would I find something like a small Igloo cooler to keep things cold on a picnic?”

“I don’t know,” the staffer replied.

“Oh…” I said, uncomfortably.

“It could be upstairs somewhere,” he hazarded, and left me to quest for the second floor, which appeared to be a possibly-non-code-compliant catch-all attic for random merchandise, where I applied to a second dimly illuminated employee who told me I should probably go downstairs and escalate my question to someone else.

And apparently escalation was necessary, for on the third try, a very tall man was able to lift his gaze to some coolers on a top shelf… within clear view of the checkout counter where the whole thing began.

Why do we all have experiences like this?

“Corporate tells us what to carry” is the almost defensive-sounding refrain I’ve now received from three employees at two different Whole Foods Markets when I’ve asked whether they could special order items for me since the Amazon buyout.

Because, you know, before they were Amazon-Whole Foods, staffers would gladly offer to procure anything they didn’t have in stock. Now, if they stop carrying that Scandinavian vitamin D-3 made from the moss eaten by reindeer and I’ve got to have it because I don’t want the kind made by irradiating sheep wool, I’d have to special order an entire case of it to get my hands on a bottle. Because, you know, “Corporate.”

Why does the distance between corporate and customer make me feel like the store I’m standing in, and all of its employees, are powerless? Why am I, the customer, left feeling powerless?

So maybe my search for a cooler, my worries about access to reindeer moss, and the laughable customer service I’ve experienced don’t signal “genuine pain.” But this does:

Screenshot of a one-star review: "The pharmacy shows absolutely no concern for the sick, aged and disabled from what I see and experienced. There's 2 lines for drops and pick up, which is fine, but keep in mind those using the pharmacy are sick either acute or chronic. No one wants to be there. The lines are often long with the phone ringing off the hook, so very understaffed. There are no chairs near the line to sit even if someone is in so much pain they can hardly stand, waiting area not nearby. If you have to drop and pick you have to wait in 2 separate lines. They won't inform the other window even though they are just feet away from each other. I saw one poor man wait 4 people deep, leg bandaged, leaning on a cart to be able to stand, but he was in the wrong line and was told to go to the other line. They could have easily taken the script, asked him to wait in the waiting area, walk the script 5 feet, and call him when it was his turn, but this fella who could barely stand had to wait in another line, 4 people deep. I was in the correct line, pick up. I am a disabled senior with cancer and chronic pain. However, I had a new Rx insurance card, beginning of the year. I was told that to wait in the other line, too! I was in the correct line, but the staff was so poorly trained she couldn't enter a few new numbers. This stuff happens repeatedly there. I've written and called the home office who sound so concerned but nothing changes. I tried to talk to manager, who naturally was "unavailable" but his underling made it clear their process was more important than the customers. All they have to do to fix the problem is provide nearby sitting or ask the customer to wait in the waiting area where there are chairs and take care of the problem behind the counter, but they would rather treat the sick, injured and old like garbage than make a small change that would make a big difference to their customers. Although they are very close I am looking for a pharmacy who actually cares to transfer my scripts, because I feel they are so uncaring and disinterested although it's their job to help the sick."

This is genuine pain. When customer service is failing to the point that badly treated patrons are being further distressed by the sight of fellow shoppers meeting the same fate, the cause is likely built into company structure. And your marketing agency is looking at a bona fide reputation crisis that could presage things like lawsuits, lasting reputation damage, and even closure for your valuable clients.

When you encounter customer service disasters, it raises questions like:

  1. Could no one in my situation access a list of current store inventory, or, barring that, seek out merchandise with me instead of risking the loss of a sale?
  2. Could no one offer to let “corporate” know that I’m dissatisfied with a “customer service policy” that would require me to spend $225 to buy a whole case of vitamins? Why am I being treated like a warehouse instead of a person?
  3. Could no one at the pharmacy see a man with a leg wound about to fall over, grab a folding chair for him, and keep him safe, instead of risking a lawsuit?

I think a “no” answer to all three questions proceeds from definite causes. And I think Tesla CEO, Elon Musk, had such causes in mind when he recently penned a letter to his own employees.

“It must be okay for people to talk directly and just make the right thing happen.”

“Communication should travel via the shortest path necessary to get the job done, not through the ‘chain of command.’ Any manager who attempts to enforce chain of command communication will soon find themselves working elsewhere.

A major source of issues is poor communication between depts. The way to solve this is allow free flow of information between all levels. If, in order to get something done between depts, an individual contributor has to talk to their manager, who talks to a director, who talks to a VP, who talks to another VP, who talks to a director, who talks to a manager, who talks to someone doing the actual work, then super dumb things will happen. It must be ok for people to talk directly and just make the right thing happen.

In general, always pick common sense as your guide. If following a ‘company rule’ is obviously ridiculous in a particular situation, such that it would make for a great Dilbert cartoon, then the rule should change.”
- Elon Musk, CEO, Tesla

Let’s parlay this uncommon advice into retail. If it’s everyone’s job to access a free flow of information, use common sense, make the right thing happen, and change rules that don’t make sense, then:

  1. Inventory is known by all store staff, and my cooler can be promptly located by any employee, rather than workers appearing helpless.
  2. Employees have the power to push back and insist that, because customers still expect to be able to special order merchandise, a specific store location will maintain this service rather than disappoint consumers.
  3. Pharmacists can recognize that patrons are often quite ill and can immediately place some chairs near the pharmacy counter, rather than close their eyes to suffering.

“But wait,” retailers may say. “How can I trust that an employee’s idea of ‘common sense’ is reliable?”

Let’s ask a monk for the answer.

“He took the time…”

I recently had the pleasure of listening to a talk given by a monk who was defining what it meant to be a good leader. He hearkened back to his young days, and to the man who was then the leader of his community.

“He was a busy man, but he took the time to get to know each of us one-on-one, and to be sure that we knew him. He set an example for me, and I watched him,” the monk explained.

Most monasteries function within a set of established rules, many of which are centuries old. You can think of these guidelines as a sort of policy. In certain communities, it’s perfectly acceptable that some of the members live apart as hermits most of the year, only breaking their meditative existence by checking in with the larger group on important holidays to share what they’ve been working on solo. In others, every hour has its appointed task, from prayer, to farming, to feeding people, to engaging in social activism.

The point is that everyone within a given community knows the basic guidelines, because at some point, they’ve been well-communicated. Beyond that, it is up to the individual to see whether they can happily live out their personal expression within the policy.

It’s a lot like retail can be, when done right. And it hinges on the question:

“Has culture been well-enough communicated to every employee so that he or she can act like the CEO of the company would in a wide variety of circumstances?”

Or to put it another way, would Amazon owner Jeff Bezos be powerless to get me my vitamins?

The most accessible modern benchmark of good customer service — the online review — is what tells the public whether the CEO has “set the example.” Reviews tell whether time has been taken to acquaint every staffer with the business that employs them, preparing them to fit their own personal expression within the company’s vision of serving the public.

An employee who is able to recognize that an injured patron needs a seat while awaiting his prescription should be empowered to act immediately, knowing that the larger company supports treating people well. If poor training, burdensome chains of command, or failure to share brand culture are obstacles to common-sense personal initiative, the problem must be traced back to the CEO and corrected, starting from there.

And, of course, should a random staffer’s personal expression genuinely include an insurmountable disregard for other people, they can always be told it’s time to leave the monastery…

For marketing agencies, opportunity knocks

So your agency is auditing a valuable incoming client, and their negative reviews citing dirty premises, broken fixtures, food poisoning, slowness, rudeness, cluelessness, and lack of apparent concern make you say to yourself,

“Well, I was hoping we could clean up the bad data on the local business listings for this enterprise, but unless they clean up their customer service at 150 of their worst-rated locations, how much ROI are we really going to be able to deliver? What’s going on at these places?”

Let’s make no bones about this: Your honesty at this critical juncture could mean the difference between survival and closure for the brand.

You need to bring it home to the most senior-level person you can reach in the organization that no amount of honest marketing can cover up poor customer service in the era of online reviews. If the brand has fallen to the level of the pharmacy I’ve cited, structural change is an absolute necessity. Ask the tough questions; ask for an explanation of the bad reviews.

“But I’m just a digital marketer,” you may think. “I’m not in charge of whatever happens offline.”

Think again.

Headlines in retail land are horrid right now, full of predictions of doom.

If you were in a retail brand’s C-suite and were swallowing these predictions of doom with your daily breakfast, wouldn’t you be looking for inspiration from anyone with genuine insight? And if a marketing agency made it their business to confront the truth while also being the bearer of some better news, wouldn’t you be ready to listen?

What is the truth? That poor reviews are symptoms smart doctors can use for diagnosis of structural problems.
What is the better news? The retail scenario is not nearly as dire as it may seem.

Why let hierarchy and traditional roles hold your agency back? Tesla wouldn’t. Why not roll up your sleeves and step into the in-store fray? Organize, and then translate, the narrative that negative reviews are telling about the structural problems which have resulted in dangerously bad customer service for the brand. And then, be prepared to counter corporate inertia born of fear with some eye-opening statistics.

Print and share some good retail tidings

Local SEO infographic

Print your own copy of this infographic to share with clients.

At Moz, we’re working with enterprises to get their basic location data into shape so that they are ready to win their share of the predicted $1.4 trillion in mobile-influenced local sales by 2021, and your agency can use these same numbers to combat indecision and apathy for your retail clients. Look at that second statistic again: 90% of purchases are still happening in physical stores. At Moz, we ask our customers if their data is ready for this. Your agency can ask its clients if their reputations are ready for this, if their employees have what they need to earn the brand’s piece of that 90% action. Great online data + great in-store service = table stakes for retail success.

While I won’t play down the unease that major brand retail closures are understandably causing, I hope I’ve given you the tools to fight the “retail disaster” narrative. 85% more mobile users are searching for things like “Where do I buy that reindeer moss vitamin D3?” than they were just 3 years ago. So long as retail staff are ready to deliver, I see no “apocalypse” here.

Investing time

So, your agency has put in the time to identify a reputation problem severe enough that it appears to be founded in structural deficiencies or policies. Perhaps you’ve used some ORM software to do review sentiment analysis and discover which of your client’s locations are hurting worst, or perhaps you’ve done an initial audit manually (see the sketch after the list below). You’ve communicated the bad news to the most senior-level person you can reach at the company, and you’ve also shared the statistics that make change seem very worthwhile, begging for a new commitment to in-store excellence. What happens next?

While there are going to be nuances specific to every brand, my bet is that the steps will look like this for most businesses:

  1. C-suites need to invest time in creating a policy which a) abundantly communicates company culture, b) expresses trust in employee initiative, and c) dispenses with needless “chain of command” steps, while d) ensuring that every public-facing staffer receives full and ongoing training. A recent study says 62% of new retail hires receive less than 10 hours of training. I’d call even these worrisome numbers optimistic. I worked at 5 retail jobs in my early youth. I’d estimate that I received no more than 1 hour of training at any of them.
  2. Because a chain of command can’t realistically be completely dispensed with in a large organization, store managers must then be allowed the time to communicate the culture, encourage employees to use common sense, define what “common sense” does and doesn’t look like to the company, and, finally, offer essential training.
  3. Employees at every level must be given the time to observe how happy or unhappy customers appear to be at their location, and they must be taught that their observations are of inestimable value to the brand. If an employee suggests a solution to a common consumer complaint, this should be recognized and rewarded.
  4. Finally, customers must be given the time to air their grievances at the time of service, in-person, with accessible, responsive staff. The word “corporate” need never come into most of these conversations unless a major claim is involved. Given that it may cost as much as 7x more to replace an unhappy customer than to keep an existing one happy, employees should be empowered to do business graciously and resolve complaints, in most cases, without escalation.
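
Here’s the triage sketch promised above. It assumes only that you’ve exported reviews with a location identifier and a star rating; the field names and sample data are hypothetical, and a real ORM export will differ.

    # A minimal first pass at review triage: given exported reviews with
    # a location and star rating, surface the worst-rated locations.
    # All field names and sample data below are invented.
    from collections import defaultdict

    reviews = [
        {"location": "Store #12", "stars": 1, "text": "Rude, and no one could help."},
        {"location": "Store #12", "stars": 2, "text": "Long lines, clearly understaffed."},
        {"location": "Store #7", "stars": 5, "text": "Friendly, fast, and helpful!"},
    ]

    # Collect star ratings per location.
    ratings = defaultdict(list)
    for review in reviews:
        ratings[review["location"]].append(review["stars"])

    # Rank locations from worst to best average rating.
    for location, stars in sorted(ratings.items(), key=lambda kv: sum(kv[1]) / len(kv[1])):
        print(f"{location}: {sum(stars) / len(stars):.1f} average across {len(stars)} reviews")

Even a crude average like this is often enough to show a C-suite which locations deserve the first site visits.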

Benjamin Franklin may or may not have said that “time is money.” While the adage rings true in business, reviews have taught me the flip side — that a lack of time equals less money. Every negative review that cites helpless employees and poor service sounds to my marketing ears like a pocketful of silver dollars rolling down a drain.

The monk says good leaders make the time to communicate culture one-on-one.

Tesla says rules should change if they’re ridiculous.

Chairs should be offered to sick people… where common sense is applied.

Reviews can read like this:

Screenshot of a positive 5-star review: "Had personal attention of three Tesla employees at the same time. They let me sit in both the model cars they had for quite time time and let me freely fiddle and figure out all the gizmos of the car. Super friendly and answered all my questions. The sales staff did not pressure me to buy or anything, but only casually mentioned the price and test drive opportunities, which is the perfect touch for a car company like Tesla. "

And digital marketers have never known a time quite like this to have the ear of retail, maybe stepping beyond traditional boundaries into the fray of the real world. Maybe making a fundamental difference.


SearchCap: Yelp against Google, Guide to PPC out & Google Shopping Auctions

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

Please visit Search Engine Land for the full article.

