Tag Archive | "Ranking"

SearchCap: Google expands featured snippets, voice search ranking study & Rand Fishkin moves on

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google expands featured snippets, voice search ranking study & Rand Fishkin moves on appeared first on Search Engine Land.




The Google Ranking Factor You Can Influence in an Afternoon [Case Study]

Posted by sanfran

What does Google consider “quality content”? And how do you capitalize on a seemingly subjective characteristic to improve your standing in search?

We’ve been trying to figure this out since the Hummingbird algorithm was dropped in our laps in 2013, prioritizing “context” over “keyword usage/frequency.” This meant that Google’s algorithm intended to understand the meaning behind the words on the page, rather than the page’s keywords and metadata alone.

This sea change meant the algorithm would read between the lines in order to deliver content that matched the true intent behind a searcher’s keyword.

Write longer content? Not so fast!

Watching us SEOs respond to Google updates is hilarious. We’re like a floor full of day traders getting news on the latest cryptocurrency.

One of the most prominent theories that made the rounds was that longer content was the key to organic ranking. I’m sure you’ve read plenty of articles on this. We at Brafton, a content marketing agency, latched onto that one for a while as well. We even experienced some mixed success.

However, what we didn’t realize was that when we experienced success, it was because we accidentally stumbled on the true ranking factor.

Longer content alone was not the intent behind Hummingbird.

Content depth

Let’s take a hypothetical scenario.

If you were to search the keyword “search optimization techniques,” you would see a typical page one of 10 organic results. Nothing too surprising there.

However, if you were to go through each of these 10 results and take note of the major topics they discussed, theoretically you would have a list of all the topics being discussed by all of the top ranking sites.

Example:

Position 1 topics discussed: A, C, D, E, F

Position 2 topics discussed: A, B, F

Position 3 topics discussed: C, D, F

Position 4 topics discussed: A, E, F

Once you finished this exercise, you would have a comprehensive list of every topic discussed (A–F), and you would start to see patterns of priority emerge.

In the example above, note “topic F” is discussed in all four pieces of content. One would consider this a cornerstone topic that should be prioritized.

If you were then to write a piece of content that covered each of the topics discussed by every competitor on page one, and emphasized the cornerstone topics appropriately, in theory, you would have the most comprehensive piece of content on that particular topic.

By producing the most comprehensive piece of content available, you would have the highest quality result that will best satisfy the searcher’s intent. More than that, you would have essentially created the ultimate resource center for everything a person would want to know about that topic.
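That tallying exercise reduces to a few lines of code. Here is a minimal sketch in Python, where the topic labels per ranking position are stand-ins for notes you would take manually:

```python
from collections import Counter

# Topics observed in each top-ranking piece (noted down manually; the
# letters mirror the hypothetical example above).
topics_by_position = {
    1: {"A", "C", "D", "E", "F"},
    2: {"A", "B", "F"},
    3: {"C", "D", "F"},
    4: {"A", "E", "F"},
}

# Tally how many competing pages cover each topic.
coverage = Counter(t for topics in topics_by_position.values() for t in topics)

# Topics covered by every competitor are the cornerstone topics to prioritize.
cornerstones = sorted(t for t, n in coverage.items()
                      if n == len(topics_by_position))

print(coverage.most_common())  # [('F', 4), ('A', 3), ...]
print(cornerstones)            # ['F']
```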

How to identify topics to discuss in a piece of content

At this point, we’re only theoretical. The theory makes logical sense, but does it actually work? And how do we go about scientifically gathering information on topics to discuss in a piece of content?

Finding topics to cover:

  • Manually: As discussed previously, you can do it manually. This process is tedious and labor-intensive, but it can be done on a small scale.
  • Using SEMrush: SEMrush features an SEO content template that will provide guidance on topic selection for a given keyword.
  • Using MarketMuse: MarketMuse was originally built for the very purpose of content depth, with an algorithm that mimics Hummingbird. MM takes a largely unscientific process and makes it scientific. For the purpose of this case study, we used MarketMuse.

The process

Watch the process in action

1. Identify content worth optimizing

We went through a massive list of keywords our blog ranked for. We filtered that list down to keywords that were not ranking number one in SERPs but had strong intent. You can also do this with core landing pages.

Here’s an example: We were ranking in the third position for the keyword “financial content marketing.” While this is a low-volume keyword, we were enthusiastic to own it due to the high commercial intent it comes with.

2. Evaluate your existing piece

Take a subjective look at your piece of content that is ranking for the keyword. Does it SEEM like a comprehensive piece? Could it benefit from updated examples? Could it benefit from better/updated inline embedded media? With a cursory look at our existing content, it was clear that the examples we used were old, as was the branding.

3. Identify topics

As mentioned earlier, you can do this in a few different ways. We used MarketMuse to identify the topics we were covering well, as well as our topic gaps: topics that competitors were discussing but we were not. (A tiny sketch of this gap computation follows the lists below.) The results were as follows:

Topics we did a good job of covering:

  • Content marketing impact on branding
  • Impact of using case studies
  • Importance of infographics
  • Business implications of a content marketing program
  • Creating articles for your audience

Topics we did a poor job of covering:

  • Marketing to millennials
  • How to market to existing clients
  • Crafting a content marketing strategy
  • Identifying and tracking goals
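In code terms, that gap analysis is just a set difference: the topics page-one competitors cover, minus the topics you already cover. A tiny sketch, using shortened versions of the labels above as hypothetical inputs:

```python
# Topics we cover well vs. topics competitors discuss (in practice these
# come from manual review or a tool like MarketMuse).
ours = {"branding impact", "case studies", "infographics",
        "business implications", "audience-focused articles"}
competitors = ours | {"marketing to millennials", "marketing to existing clients",
                      "content marketing strategy", "goal identification and tracking"}

print(sorted(competitors - ours))  # the topics the rewrite needs to add
```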

4. Rewrite the piece

Considering how out-of-date our examples were, and the number of topics we had neglected to discuss, we determined a full rewrite of the piece was warranted. Our writer, Mike O’Neill, was given the topic guidance, ensuring he had a firm understanding of everything that needed to be discussed in order to create a comprehensive article.

5. Update the content

To maintain our link equity, we kept the same URL and simply replaced the old content with the new, then updated the publish date. The refreshed article features greater content depth, modern branding, and inline visuals.

6. Fetch as Google

Rather than wait for Google to reindex the content, I wanted to see the results immediately (and it is indeed immediate).

7. Check your results

Open an incognito window and see your updated position.
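Checking one keyword by hand is easy; checking dozens gets tedious. A small helper like the following finds your position in a list of organic result URLs, however you collect them (a rank tracker export, for instance). The domain and URLs here are hypothetical:

```python
from urllib.parse import urlparse

def ranking_position(result_urls, domain):
    """Return the 1-based position of the first result on `domain`, else None."""
    for pos, url in enumerate(result_urls, start=1):
        host = urlparse(url).netloc
        if host == domain or host.endswith("." + domain):
            return pos
    return None

serp = [
    "https://competitor-a.com/guide",
    "https://www.example-agency.com/blog/financial-content-marketing",
    "https://competitor-b.com/post",
]
print(ranking_position(serp, "example-agency.com"))  # -> 2
```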

Promising results:

We have run more than a dozen experiments and have seen positive results across the board. As demonstrated in the video, these results are usually realized within 60 seconds of reindexing the updated content.

Keyword target                  | Old ranking | New ranking
“Financial content marketing”   | 3           | 1
“What is a subdomain”           | 16          | 6
“Best company newsletters”      | 32          | 4
“Staffing marketing”            | 7           | 3
“Content marketing agency”      | 16          | 1
“Google local business cards”   | 16          | 5
“Company blog”                  | 7           | 4
“SEO marketing tools”           | 9           | 3

Of those tests, here’s another example of this process in action: for the keyword “best company newsletters,” the piece climbed from position 32 before the rewrite to position 4 after.

Assumptions:

From these results, we can assume that content depth and breadth of topic coverage matters — a lot. Google’s algorithm seems to have an understanding of the competitive topic landscape for a keyword. In our hypothetical example from before, it would appear the algorithm knows that topics A–F exist for a given keyword and uses that collection of topics as a benchmark for content depth across competitors.

We can also assume Google’s algorithm either a) responds immediately to updated information, or b) keeps a cached snapshot of the competitive content-depth landscape for any given keyword. Either scenario is plausible given the speed at which updated content is re-ranked.


In conclusion, don’t arbitrarily write long content and call it “high quality.” Choose a keyword you want to rank for and create a comprehensive piece of content that fully supports that keyword. There is no guarantee you’ll be granted a top position — domain strength factors play a huge role in rankings — but you’ll certainly improve your odds, as we have seen.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!



Diagnosing Why a Site’s Set of Pages May Be Ranking Poorly – Whiteboard Friday

Posted by randfish

Your rankings have dropped and you don’t know why. Maybe your traffic dropped as well, or maybe just a section of your site has lost rankings. It’s an important and often complex mystery to solve, and there are a number of boxes to check off while you investigate. In this Whiteboard Friday, Rand shares a detailed process to follow to diagnose what went wrong to cause your rankings drop, why it happened, and how to start the recovery process.

Diagnosing why a site's pages may be ranking poorly

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to talk about diagnosing a site and specifically a section of a site’s pages and why they might be performing poorly, why their traffic may have dropped, why rankings may have dropped, why both of them might have dropped. So we’ve got a fairly extensive process here, so let’s get started.

Step 1: Uncover the problem

First off, our first step is uncovering the problem, or finding whether there is actually a problem. If we’re talking about a site that’s 20 or 30 or even a couple hundred pages, this is not a big issue. But many websites that SEOs are working on these days are thousands, tens of thousands, hundreds of thousands of pages. So what I like to urge folks to do is to

A. Treat different site sections as unique segments for investigation. You should look at them individually.

A lot of times subfolders or URL structures are really helpful here. So I might say, okay, MySite.com, I’m going to look exclusively at the /news section. Did that fall in rankings? Did it fall in traffic? Or was it /posts, where my blog posts and my content is? Or was it /cities? Let’s say I have a website that’s dealing with data about the population of cities. So I rank for lots of those types of queries, and it seems like I’m ranking for fewer of them, and it’s my cities pages that are poorly performing in comparison to where they were a few months ago or last year at this time.

B. Check traffic from search over time.

So I go to my Google Analytics or whatever analytics you’re using, and you might see something like, okay, I’m going to look exclusively at the /cities section. If you can structure your URLs in this fashion, use subfolders, this is a great way to do it. Then take a look and see, oh, hang on, that’s a big traffic drop. We fell off a cliff there for these particular pages.

This data can be hiding inside your analytics because it could be that the rest of your site is performing well. It’s going sort of up and to the right, and so you see this slow plateauing or a little bit of a decline, but it’s not nearly as sharp as it is if you look at the traffic specifically for a single subsection that might be performing poorly, like this /cities section.
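One way to surface that hidden cliff is to segment an analytics export by subfolder. Here is a sketch using pandas; the filename and the date/landing_page/sessions schema are assumptions about your export, not a standard format:

```python
import pandas as pd

# Assumed export: one row per landing page per day of organic sessions.
df = pd.read_csv("organic_landing_pages.csv", parse_dates=["date"])

# Map "/cities/denver" -> "/cities", "/posts/foo" -> "/posts", etc.
df["section"] = "/" + df["landing_page"].str.strip("/").str.split("/").str[0]

# Monthly organic sessions per section; a drop confined to one column
# points at the segment that needs investigating.
monthly = (df.groupby([pd.Grouper(key="date", freq="MS"), "section"])["sessions"]
             .sum()
             .unstack(fill_value=0))
print(monthly.tail(6))
```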

From there, I’m going to next urge you to use Google Trends. Why? Why would I go to Google Trends? Because what I want you to do is I want you to look at some of your big keywords and topics in Google Trends to see if there has been a serious decline in search volume at the same time. If search demand is rising or staying stable over the course of time where you have lost traffic, it’s almost certainly something you’ve done, not something searchers are doing. But if you see that traffic has declined, for example, maybe you were ranking really well for population data from 2015. It turns out people are now looking for population data for 2016 or ’17 or ’18. Maybe that is part of the problem, that search demand has fallen and your curve matches that.
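If you would rather pull those demand curves programmatically than eyeball the Trends UI, the unofficial pytrends package can fetch them. It is a scraper, not a Google-supported API, so treat this as a sketch:

```python
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(["denver population growth"], timeframe="today 5-y")
interest = pytrends.interest_over_time()

# Flat or rising demand while your traffic fell suggests the problem is on
# your side; falling demand means your curve may simply match the market.
print(interest["denver population growth"].resample("QS").mean())
```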

C. Perform some diagnostic queries, or use your rank tracking data if you have it for these kinds of terms. (A short script for generating the full checklist of queries follows this list.)

This is one of the reasons I like to rank track for even these types of queries that don’t get a lot of traffic.

1. Target keywords. In this case, it might be “Denver population growth,” maybe that’s one of your keywords. You would see, “Do I still rank for this? How well do I rank for this? Am I ranking more poorly than I used to?”

2. Check brand name plus target keyword. So, in this case, it would be my brand name plus the keyword above: “MySite Denver population growth” or “MySite.com Denver population growth.” If you’re not ranking for that, that’s usually an indication of a more serious problem, potentially a penalty or some type of dampening that’s happening around your brand name or around your website.

3. Look for a 10 to 20-word text string from page content without quotes. It could be shorter, only six or seven words, or it could be longer, 25 words if you really need it. But essentially, I want to take a string of text that exists on the page, search for it in Google without quotes, and see how it performs.

4. Look for a 10 to 20-word text string with quotes. So those same lines of text, but searched in quotes. If I’m not ranking for the unquoted string but I am ranking for the quoted one, I might surmise this is probably not duplicate content; it’s probably something to do with my content quality, or maybe my link profile, or Google has penalized or dampened me in some way.

5. site: urlstring/ So I would search for “site:MySite.com/cities/Denver.” I would see: Wait, has Google actually indexed my page? When did they index it? Oh, it’s been a month. I wonder why they haven’t come back. Maybe there’s some sort of crawl issue, robots.txt issue, meta robots issue, something. I’m preventing Google from potentially getting there. Or maybe they can’t get there at all, and this results in zero results. That means Google hasn’t even indexed the page. Now we have another type of problem.
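Those five checks are easy to generate in bulk for every page you are worried about. A minimal sketch; every input below is a hypothetical placeholder:

```python
def diagnostic_queries(keyword, brand, url, page_text, snippet_words=15):
    """Build the five manual diagnostic queries described in steps 1-5."""
    snippet = " ".join(page_text.split()[:snippet_words])
    return [
        keyword,                 # 1. target keyword
        f"{brand} {keyword}",    # 2. brand name plus target keyword
        snippet,                 # 3. text string from the page, unquoted
        f'"{snippet}"',          # 4. the same string, in quotes
        f"site:{url}",           # 5. is the page indexed at all?
    ]

for q in diagnostic_queries(
        keyword="Denver population growth",
        brand="MySite.com",
        url="mysite.com/cities/denver",
        page_text="Denver's population grew faster than almost any other "
                  "large U.S. metro over the past decade, driven by ...",
):
    print(q)
```

Run each query by hand or through your rank tracker; the pattern of hits and misses maps onto the diagnoses above.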

D. Check your tools.

1. Google Search Console. I would start there, especially in the site issues section.

2. Check your rank tracker or whatever tool you’re using, whether that’s Moz or something else.

3. On-page and crawl monitoring. Hopefully you have something like that. It could be through Screaming Frog. Maybe you’ve run some crawls over time, or maybe you have a tracking system in place. Moz has a crawl system. OnPage.org has a really good one.

4. Site uptime. So I might check Pingdom or other things that alert me to, “Oh, wait a minute, my site was down for a few days last week. That obviously is why traffic has fallen,” those types of things.

Step 2: Offer hypothesis for falling rankings/traffic

Okay, you’ve done your diagnostics. Now it’s time to offer some hypotheses. Now that we understand which problem we might have, we want to understand what could be causing it. There are basically two situations: rankings have stayed stable or gone up but traffic has fallen, or rankings and traffic have both fallen.

A. If rankings are up, but traffic is down…

In those cases, these are the five things that are most typically to blame.

1. New SERP features. A bunch of featured snippets have entered the city-population-growth search results, and so now number one is not what number one used to be. If you don’t get that featured snippet, you’re losing out to one of your competitors.

2. Lower search demand. Like we talked about in Google Trends. I’m looking at search demand, and there are just not as many people searching as there used to be.

3. Brand or reputation issues. I’m ranking just fine, but people now for some reason hate me. People who are searching this sector think my brand is evil or bad or just not as helpful as it used to be. So I have issues, and people are not clicking on my results. They’re choosing someone else actively because of reputation issues.

4. Snippet problems. I’m ranking in the same place I used to be, but I’m no longer the sexiest, most click-drawing snippet in the search results, and other people are earning those clicks instead.

5. Shift in personalization or location biasing by Google. It used to be the case that everyone who searched for city name plus population growth got the same results, but now suddenly people are seeing different results based on maybe their device or things they’ve clicked in the past or where they’re located. Location is often a big cause for this.

So for many SEOs for many years, “SEO consultant” resulted in the same search results. Then Google introduced the Maps results and pushed down a lot of those folks, and now “SEO consultant” results in different ranked results in each city and each geography that you search in. So that can often be a cause for falling traffic even though rankings remain high.

B. If rankings and traffic are down…

If you’re seeing that rankings have fallen and traffic has fallen in conjunction, there’s a bunch of other things that are probably going on that are not necessarily these things. A few of these could be responsible still, like snippet problems could cause your rankings and your traffic to fall, or brand and reputation issues could cause your click-through rate to fall, which would cause you to get dampened. But oftentimes it’s things like this:

1. & 2. Duplicate content and low-quality or thin content. Google thinks that what you’re providing just isn’t good enough.

3. Change in searcher intent. People who were searching for population growth used to want what you had to offer, but now they want something different and other people in the SERP are providing that, but you are not, so Google is ranking you lower. Even though your content is still good, it’s just not serving the new searcher intent.

4. Loss to competitors. So maybe you have worse links than they do now or less relevance or you’re not solving the searcher’s query as well. Your user interface, your UX is not as good. Your keyword targeting isn’t as good as theirs. Your content quality and the unique value you provide isn’t as good as theirs. If you see that one or two competitors are consistently outranking you, you might diagnose that this is the problem.

5. Technical issues. So if I saw from over here that the crawl was the problem, I wasn’t getting indexed, or Google hasn’t updated my pages in a long time, I might look into accessibility things, maybe speed, maybe I’m having problems like letting Googlebot in, HTTPS problems, or indexable content, maybe Google can’t see the content on my page anymore because I made some change in the technology of how it’s displayed, or crawlability, internal link structure problems, robots.txt problems, meta robots tag issues, that kind of stuff.

Maybe at the server level, someone on the tech ops team of my website decided, “Oh, there’s this really problematic bot coming from Mountain View that’s costing us a bunch of bandwidth. Let’s block bots from Mountain View.” No, don’t do that. Bad. Those kinds of technical issues can happen.

6. Spam and penalties. We’ll talk a little bit more about how to diagnose those in a second.

7. CTR, engagement, or pogo-sticking issues. There could be click-through rate issues or engagement issues, meaning pogo sticking, like people are coming to your site, but they are clicking back because they weren’t satisfied by your results, maybe because their expectations have changed or market issues have changed.

Step 3: Make fixes and observe results

All right. Next and last in this process, what we’re going to do is make some fixes and observe the results. Hopefully, we’ve been able to correctly diagnose and form some wise hypotheses about what’s going wrong, and now we’re going to try and resolve them.

A. On-page and technical issues should solve after a new crawl + index.

So on-page and technical issues, if we’re fixing those, they should usually resolve, especially on small sections of sites, pretty fast. As soon as Google has crawled and indexed the page, you should generally see performance improve. But this can take a few weeks if we’re talking about a large section on a site, many thousands of pages, because Google has to crawl and index all of them to get the new sense that things are fixed and traffic is coming in. Since it’s long tail to many different pages, you’re not going to see that instant traffic gain and rise as fast.

B. Link issues and spam penalty problems can take months to show results.

Look, if you have crappier links or a link profile that’s not as good as your competitors’, growing that can take months or even years to fix. Penalty problems and spam problems, same thing. Google can sometimes take a long time. You’ve seen a lot of spam experts on Twitter saying, “Oh, well, all my clients who had issues over the last nine months suddenly are ranking better today,” because Google made some fix in their latest index rollout or their algorithm changed, and it’s sort of, okay, well, we’ll reward the people for all the fixes that they’ve made. Sometimes that happens in batches that take months.

C. Fixing a small number of pages in a section that’s performing poorly might not show results very quickly.

For example, let’s say you go and you fix /cities/Milwaukee. You determine from your diagnostics that the problem is a content quality issue. So you go and you update these pages. They have new content. It serves the searchers much better, doing a much better job. You’ve tested it. People really love it. You fixed two cities, Milwaukee and Denver, to test it out. But you’ve left 5,000 other cities pages untouched.

Sometimes Google will sort of be like, “No, you know what? We still think your cities pages, as a whole, don’t do a good job solving this query. So even though these two that you’ve updated do a better job, we’re not necessarily going to rank them, because we sort of think of your site as this whole section and we grade it as a section or apply some grades as a section.” That is a real thing that we’ve observed happening in Google’s results.

Because of this, one of the things that I would urge you to do is if you’re seeing good results from the people you’re testing it with and you’re pretty confident, I would roll out the changes to a significant subset, 30%, 50%, 70% of the pages rather than doing only a tiny, tiny sample.

D. Sometimes when you encounter these issues, a remove and replace strategy works better than simply upgrading old URLs.

So if Google has decided /cities, your /cities section is just awful, has all sorts of problems, not performing well on a bunch of different vectors, you might take your /cities section and actually 301 redirect them to a new URL, /location, and put the new UI and the new content that better serves the searcher and fixes a lot of these issues into that location section, such that Google now goes, “Ah, we have something new to judge. Let’s see how these location pages on MySite.com perform versus the old cities pages.”

So I know we’ve covered a ton today and there are a lot of diagnostic issues that we haven’t necessarily dug deep into, but I hope this can help you if you’re encountering rankings challenges with sections of your site or with your site as a whole. Certainly, I look forward to your comments and your feedback. If you have other tips for folks facing this, that would be great. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


How to Troubleshoot Local Ranking Failures [Updated for 2018]

Posted by MiriamEllis

I love a mystery… especially a local search ranking mystery I can solve for someone.

Now, the truth is, some ranking puzzles are so complex, they can only be solved by a formal competitive audit. But there are many others that can be cleared up by spending 15 minutes or less going through an organized 10-point checklist of the commonest problems that can cause a business to rank lower than the owner thinks it should. By zipping through the following checklist, there’s a good chance you’ll be able to find one or more obvious “whodunits” contributing to poor Google local pack visibility for a given search.

Since I wrote the original version of this post in 2014, so much has changed. Branding, tools, tactics — things are really different in 2018. Definitely time for a complete overhaul, with the goal of making you a super sleuth for your forum friends, clients, agency teammates, or executive superiors.

Let’s emulate the Stratemeyer Syndicate, which earned lasting fame by hitting on a simple formula for surfacing and solving mysteries in a most enjoyable way.

Before we break out our magnifying glass, it’s critical to stress one very important thing. The local rankings I see from an office in North Beach, San Francisco are not the rankings you see while roaming around Golden Gate park in the same city. The rankings your client in Des Moines sees for things in his town are not the same rankings you see from your apartment in Albuquerque when you look at Des Moines results. With the user having become the centroid of search for true local searches, it is no mystery at all that we see different results when we are different places, and it is no cause for concern.

And now that we’ve gotten that out of the way and are in the proper detective spirit, let’s dive into how to solve for each item on our checklist!


☑ Google updates/bugs

The first thing to ask if a business experiences a sudden change in rankings is whether Google has done something. Search Engine Land strikes me as the fastest reporter of Google updates, with MozCast offering an ongoing weather report of changes in the SERPs. Also, check out the Moz Google Algo Change history list and the Moz Blog for some of the most in-depth strategic coverage of updates, penalties, and filters.

For local-specific bugs (or even just suspected tests), check out the Local Search Forum, the Google My Business forum, and Mike Blumenthal’s blog. See if the effects being described match the weirdness you are seeing in your local packs. If so, it’s a matter of fixing a problematic practice (like iffy link building) that has been caught in an update, waiting to see how the update plays out, or waiting for Google to fix a bug or turn a dial down to normalize results.

*Pro tip: Don’t make the mistake of thinking organic updates have nothing to do with local SEO. Crack detectives know organic and local are closely connected.

☑ Eligibility to list and rank

When a business owner wants to know why he isn’t ranking well locally, always ask these four questions:

  1. Does the business have a real address? (Not a PO box, virtual office, or a string of employees’ houses!)
  2. Does the business make face-to-face contact with its customers?
  3. What city is the business in?
  4. What is the exact keyword phrase they are hoping to rank for?

If the answer is “no” to either of the first two questions, the business isn’t eligible for a Google My Business listing. And while spam does flow through Google, a lack of eligibility could well be the key to a lack of rankings.

For the third question, you need to know the city the business is in so that you can see if it’s likely to rank for the search phrase cited in the fourth question. For example, a plumber with a street address in Sugar Land, TX should not expect to rank for “plumber Dallas TX.” If a business lacks a physical location in a given city, it’s atypical for it to rank for queries that stem from or relate to that locale. It’s amazing just how often this simple fact solves local pack mysteries.

☑ Guideline spam

To be an ace local sleuth, you must commit to memory the guidelines for representing your business on Google so that you can quickly spot violations. Common acts of spam include:

  • Keyword stuffing the business name field
  • Improper wording of the business name field
  • Creating listings for ineligible locations, departments, or people
  • Category spam
  • Incorrect phone number implementation
  • Incorrect website URL implementation
  • Review guideline violations

If any of the above conundrums are new to you, definitely spend 10 minutes reading the guidelines. Make flash cards, if necessary, to test yourself on your spam awareness until you can instantly detect glaring errors. With this enhanced perception, you’ll be able to see problems that may possibly be leading to lowered rankings, or even… suspensions!

☑ Suspensions

There are two key things to look for here when a local business owner comes to you with a ranking woe:

  1. If the listing was formerly verified, but has mysteriously become unverified, you should suspect a soft suspension. Soft suspensions might occur around something like a report of keyword-stuffing the GMB business name field. Oddly, however, there is little anecdotal evidence to support the idea that soft suspensions cause ranking drops. Nevertheless, it’s important to spot the un-verification clue and tell the owner to stop breaking guidelines. It’s possible that the listing may lose reviews or images during this type of suspension, but in most cases, the owner should be able to re-verify his listing. Just remember: a soft suspension is not a likely cause of low local pack rankings.
  2. If the listing’s rankings totally disappear and you can’t even find the listing via a branded search, it’s time to suspect a hard suspension. Hard suspensions can result from a listing falling afoul of a Google guideline or new update, a Google employee, or just a member of the public who has reported the business for something like an ineligible location. If the hard suspension is deserved, as in the case of creating a listing at a fake address, then there’s nothing you can do about it. But, if a hard suspension results from a mistake, I recommend taking it to the Google My Business forum to plead for help. Be prepared to prove that you are 100% guideline-compliant and eligible in hopes of getting your listing reinstated with its authority and reviews intact.

☑ Duplicates

Notorious for their ability to divide ranking strength, duplicate listings are at their worst when there is more than one verified listing representing a single entity. If you encounter a business that seems like it should be ranking better than it is for a given search, always check for duplicates.

The quickest way to do this is to get all present and past NAP (name, address, phone) from the business and plug it into the free Moz Check Listing tool. Pay particular attention to any GMB duplicates the tool surfaces. Then:

  1. If the entity is a brick-and-mortar business or service area business, and the NAP exactly matches between the duplicates, contact Google to ask them to merge the listings. If the NAP doesn’t match and represents a typo or error on the duplicate, use the “suggest an edit” link in Google Maps to toggle the “yes/no” toggle to “yes,” and then select the radio button for “never existed.” (A tiny NAP-normalization sketch follows this list.)
  2. If the duplicates represent partners in a multi-practitioner business, Google won’t simply delete them. Things get quite complicated in this scenario, and if you discover practitioner duplicates, tread carefully. There are half a dozen nuances here, including whether you’re dealing with actual duplicates, whether they represent current or past staffers, whether they are claimed or unclaimed, and even whether a past partner is deceased. There isn’t perfect industry agreement on the handling of all of the ins-and-outs of practitioner listings. Given this, I would advise an affected business to read the available industry guidance on practitioner listings before making a move in any direction.
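For the exact-NAP-match test in item 1, a crude normalization pass helps decide whether two records really match. A hedged sketch; the listing records are made up, and production matching needs fuzzier logic than this:

```python
import re

def normalize_nap(name, address, phone):
    """Crudely normalize a name/address/phone triple for comparison."""
    def squash(s):
        return re.sub(r"[^a-z0-9]", "", s.lower())
    digits = re.sub(r"\D", "", phone)[-10:]  # keep the last 10 digits
    return (squash(name), squash(address), digits)

listings = [  # hypothetical records gathered from different platforms
    ("Acme Plumbing", "123 Main St., Sugar Land, TX", "(281) 555-0147"),
    ("Acme Plumbing LLC", "123 Main Street, Sugar Land TX", "281-555-0147"),
]

a, b = (normalize_nap(*rec) for rec in listings)
print(a == b)        # False: "St." vs. "Street" and "LLC" defeat naive matching
print(a[2] == b[2])  # True: the phone numbers match, a useful duplicate signal
```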

☑ Missing/inaccurate listings

While you’ve got Moz Check Listing fired up, pay attention to anything it tells you about missing or inaccurate listings. The tool will show you how accurate and complete your listings are on the major local business data aggregators, plus other important platforms like Google My Business, Facebook, Factual, Yelp, and more. Why does this matter?

  1. Google can pull information from anywhere on the web and plunk it into your Google My Business listing.
  2. While no one can quantify the exact degree to which citation/listing consistency directly impacts Google local rankings for every possible search query, it has been a top 5 ranking factor in the annual Local Search Ranking Factors survey as far back as I can remember. Recently, I’ve seen some industry discussion as to whether citations still matter, with some practitioners claiming they can’t see the difference they make. I believe that conclusion may stem from working mainly in ultra-competitive markets where everyone has already got their citations in near-perfect order, forcing practitioners to look for differentiation tactics beyond the basics. But without those basics, you’re missing table stakes in the game.
  3. Indirectly, listing absence or inconsistency impacts local rankings in that it undermines the quest for good local KPIs as well as organic authority. Every lost or misdirected consumer represents a failure to have someone click-for-directions, click-to-call, click-to-your website, or find your website at all. Online and offline traffic, conversions, reputation, and even organic authority all hang in the balance of active citation management.

☑ Lack of organic authority

Full website or competitive audits are not the work of a minute. They really take time, and deep delving. But, at a glance, you can access some quick metrics to let you know whether a business’ lack of achievement on the organic side of things could be holding them back in the local packs. Get yourself the free MozBar SEO toolbar and try this:

  1. Turn the MozBar on by clicking the little “M” at the top of your browser so that it is blue.
  2. Perform your search and look at the first few pages of the organic results, ignoring anything from major directory sites like Yelp (they aren’t competing with you for local pack rankings, eh?).
  3. Note down the Page Authority, Domain Authority, and link counts for each of the businesses coming up on the first 3 pages of the organic results.
  4. Finally, bring up the website of the business you’re investigating. If you see that the top competitors have Domain Authorities of 50 and links numbering in the hundreds or thousands, whereas your target site is well below these metrics, chances are good that organic authority is playing a strong role in the lack of local search visibility. How do we know this is true? Do some local searches and note just how often the businesses that make it into the 3-pack or the top of the local finder view have correspondingly high organic rankings. (A small sketch of this comparison follows the list.)
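Here is that at-a-glance comparison as a few lines of Python; the domains and numbers are hypothetical stand-ins for what you would note down from the MozBar:

```python
from statistics import median

# (Domain Authority, linking domains) per competitor, noted from the MozBar.
competitors = {
    "rival-plumber.com": (52, 340),
    "big-pipes.com": (47, 210),
    "city-plumbing.com": (55, 890),
}
target_da, target_links = 18, 12

median_da = median(da for da, _ in competitors.values())
if target_da < 0.6 * median_da:  # an arbitrary rule of thumb for "well below"
    print(f"Target DA {target_da} vs. competitor median {median_da}: "
          "organic authority is a likely suspect.")
```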

Where organic authority is poor, a business has a big job of work ahead. They need to focus on content dev + link building + social outreach to begin building up their brand in the minds of consumers and the “RankBrain” of Google.

One other element needs to be mentioned here, and that’s the concept of how time affects authority. When you’re talking to a business with a ranking problem, it’s very important to ascertain whether they just launched their website or just built their local business listings last week, or even just a few months ago. Typically, if they have, the fruits of their efforts have yet to fully materialize. That being said, it’s not a given that a new business will have little authority. Large brands have marketing departments which exist solely to build tremendous awareness of new assets before they even launch. It’s important to keep that in mind, while also realizing that if the business is smaller, building authority will likely represent a longer haul.

☑ Possum effect

Where local rankings are absent, always ask:

“Are there any other businesses in your building or even on your street that share your Google category?”

If the answer is “yes,” search for the business’ desired keyword phrase and look at the local finder view in Google Maps. Note which companies are ranking. Then begin to zoom in on the map, level by level, noting changes in the local finder as you go. If, a few levels in, the business you’re advising suddenly appears on the map and in the local finder, chances are good it’s the Possum filter that’s causing their apparent invisibility at the automatic zoom level.

Google Possum rolled out in September 2016, and its observable effects included a geographic diversification of the local results, filtering out many listings that share a category and are in close proximity to one another. Then, about one year later, Google initiated the Hawk update, which appears to have tightened the radius of Possum, with the result that while many businesses in the same building are still being filtered out, a number of nearby neighbors have reappeared at the automatic zoom level of the results.

If your sleuthing turns up a brand that is being impacted by Possum/Hawk, the only surefire way to beat the filter is to put in the necessary work to become the most authoritative answer for the desired search phrase. It’s important to remember that filters are the norm in Google’s local results, and have long been observed impacting listings that share an address, share a phone number, etc. If it’s vital for a particular listing to outrank all others that possess shared characteristics, then authority must be built around it in every possible way to make it one of the most dominant results.

☑ Local Service Ads effect

The question you ask here is:

“Is yours a service-area business?”

And if the answer is “yes,” then brace yourself for ongoing results disruption in the coming year.

Google’s Local Service Ads (formerly Home Service Ads) make Google the middleman between consumers and service providers, and in the 2+ years since early testing began, they’ve caused some pretty startling things to happen to local search results.

Suffice it to say, rollout to an ever-increasing number of cities and categories hasn’t been for the faint of heart, and I would hazard a guess that Google’s recent re-brand of this program signifies their intention to move beyond the traditional SAB market. One possible benefit of Google getting into this type of lead gen is that it could decrease spam, but I’m not sold on this, given that fake locations have ended up qualifying for LSA inclusion. While I honor Google’s need to be profitable, I share some of the qualms business owners have expressed about the potential impacts of this venture.

Since I can’t offer a solid prediction of what precise form these impacts will take in the coming months, the best I can do here is to recommend that if an SAB experiences a ranking change/loss, the first thing to look for is whether LSA has come to town. If so, alteration of the SERPs may be unavoidable, and the only strategy left for overcoming vanished visibility may be to pay for it… by qualifying for the program.

☑ GMB neglect

Sometimes, a lack of competitive rankings can simply be chalked up to a lack of effort. If a business wonders why they’re not doing better in the local packs, pull up their GMB listing and do a quick evaluation of the following (a toy scripted version of this checklist appears after the list):

  • Verification status – While you can rank without verifying, lack of verification is a hallmark of listing neglect.
  • Basic accuracy – If NAP or map markers are incorrect, it’s a sure sign of neglect.
  • Category choices – Wrong categories make right rankings impossible.
  • Image optimization – Every business needs a good set of the most professional, persuasive photos it can acquire, and should even consider periodic new photo shoots for seasonal freshness; imagery impacts KPIs, which are believed to impact rank.
  • Review count, sentiment and management – Too few reviews, low ratings, and lack of responses = utter neglect of this core rank/reputation-driver.
  • Hours of operation – If they’re blank or incorrect, conversions are being missed.
  • Main URL choice – Does the GMB listing point to a strong, authoritative website page or a weak one?
  • Additional URL choices – If menus, bookings, reservations, or placing orders is part of the business model, a variety of optional URLs are supported by Google and should be explored.
  • Google Posts – Early-days testing indicates that regular posting may impact rank.
  • Google Questions and Answers – Pre-populate with best FAQs and actively manage incoming questions.
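And here is the toy scripted version of that checklist. The listing dict is a stand-in for whatever your audit spreadsheet or data source provides, not a real GMB API response, and the thresholds are arbitrary:

```python
# Hypothetical audit record for one listing.
listing = {
    "verified": False, "nap_accurate": True, "categories_correct": True,
    "photo_count": 2, "review_count": 3, "responds_to_reviews": False,
    "hours_set": False, "uses_google_posts": False,
}

checks = {
    "unverified listing": not listing["verified"],
    "inaccurate NAP": not listing["nap_accurate"],
    "wrong categories": not listing["categories_correct"],
    "too few photos": listing["photo_count"] < 5,
    "too few reviews": listing["review_count"] < 10,
    "no review responses": not listing["responds_to_reviews"],
    "missing hours": not listing["hours_set"],
    "no Google Posts": not listing["uses_google_posts"],
}

flags = [issue for issue, failed in checks.items() if failed]
print(f"{len(flags)} neglect signals: {', '.join(flags)}")
```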

There is literally no business, large or small, with a local footprint that can afford to neglect its Google My Business listing. And while some fixes and practices move the ranking needle more than others, the increasing number of consumer actions that take place within Google is reason enough to put active GMB management at the top of your list.


Closing the case

The Hardy Boys never went anywhere without their handy kit of detection tools. Their father was so confident in their utter preparedness that he even let them chase down gangs in Hong Kong and dictators in the Guyanas (which, on second thought, doesn’t seem terribly wise.) But I have that kind of confidence in you. I hope my troubleshooting checklist is one you’ll bookmark and share to be prepared for the local ranking mysteries awaiting you and your digital marketing colleagues in 2018. Happy sleuthing!


SEO Ranking Factors & Correlation: What Does It Mean When a Metric Is Correlated with Google Rankings? – Whiteboard Friday

Posted by randfish

In an industry where knowing exactly how to get ranked on Google is murky at best, SEO ranking factors studies can be incredibly alluring. But there’s danger in believing every correlation you read, and wisdom in looking at it with a critical eye. In this Whiteboard Friday, Rand covers the myths and realities of correlations, then shares a few smart ways to use and understand the data at hand.

SEO Ranking Factors and Correlation

Click on the whiteboard image above to open a high-resolution version in a new tab!


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we are chatting about SEO ranking factors and the challenge around understanding correlation, what correlation means when it comes to SEO factors.

So you have likely seen, over the course of your career in the SEO world, lots of studies like this. They’re usually called something like ranking factors or ranking elements study or the 2017 ranking factors, and a number of companies put them out. Years ago, Moz started to do this work with correlation stuff, and now many, many companies put these out. So people from Searchmetrics and I think Ahrefs puts something out, and SEMrush puts one out, and of course Moz has one.

These usually follow a pretty similar format, which is they take a large number of search results from Google, from a specific country or sometimes from multiple countries, and they’ll say, “We analyzed 100,000 or 50,000 Google search results, and in our set of results, we looked at the following ranking factors to see how well correlated they were with higher rankings.” That is to say how much they predicted that, on average, a page with this factor would outrank a page without the factor, or a page with more of this factor would outrank a page with less of this factor.

Correlation in SEO studies like these usually means:

So, basically, in an SEO study, they usually mean something like this. They’ll show a scatter plot, or some other visualization of the results. Then they’ll say, “Okay, linking root domains correlated with higher organic rankings in the 10 blue link-style results to the degree of 0.39.” They’ll usually use either Spearman or Pearson correlation. We won’t get into that here. It doesn’t matter too much.

Across this many searches, the metric predicted higher or lower rankings with this level of consistency. 1.0, by the way, would be perfect correlation. So, for example, if you looked at the correlation between being a day of the week and having a name that ends in Y, you’d get a perfect correlation, because every day’s name ends in Y, at least in English.

So let’s walk down this path just a little bit with search visits. Saying that search visits correlated with higher rankings at 0.47, if that sounds misleading to you, it sounds misleading to me too. The problem here is that’s not necessarily a ranking factor. At least I don’t think it is. I don’t think that the more visits you get from Google search, the higher Google ranks you. I think the correlation probably runs the other way around: the higher you rank in search results, the more visits on average you get from Google search.

So these ranking factors, I’ll run through a bunch of these myths, but these ranking factors may not be factors at all. They’re just metrics or elements where the study has looked at the correlation and is trying to show you the relationship on average. But you have to understand and intuit this information properly, otherwise you can be very misled.
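To make those numbers concrete, here is a sketch of how a correlation like that 0.39 gets computed, on synthetic data. Real studies typically compute a correlation per SERP and average them; this pools everything into one Spearman calculation for brevity:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Synthetic data: 1,000 ten-result SERPs. Each result has a rank position
# (1 = top) and a metric, e.g. linking root domains, noisily tied to rank.
positions = np.tile(np.arange(1, 11), 1000)
metric = (1000 / positions) * rng.lognormal(0.0, 1.2, positions.size)

# Correlate the metric with *higher* rankings by negating position
# (position 1 is the highest rank).
rho, _ = spearmanr(metric, -positions)
print(f"Spearman correlation with higher rankings: {rho:.2f}")
```

Swap in real SERP exports and real metrics and you have the skeleton of a ranking factors study; most of the remaining work is data collection.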

Myths and realities of correlation in SEO

So let’s walk through a few of these.

1. Correlation doesn’t tell us which way the connection runs.

So it does not say whether factor X influences the rankings or whether higher rankings influence factor X. Let’s take another example: number of Facebook shares. Could it be the case that search results that rank higher in Google oftentimes get shared more on Facebook because they’ve been seen by more people who searched for them? I think that’s totally possible. I don’t know whether it’s the case. We can’t prove it right here and now, but we can certainly say, “You know what? This number does not necessarily mean that Facebook shares influence Google results.” It could be the case that Google results influence Facebook shares. It could be the case that a third factor is causing both of them. Or it could be the case that there’s, in fact, no relationship and this is merely a coincidental result, probably unlikely given that there is some relationship there, but possible.

2. Correlation does not imply causation.

This is a famous quote, but let’s continue it: correlation is not causation, but it sure is a hint. That’s exactly what we like to use correlation for: as a hint of things we might investigate further. We’ll talk about that in a second.

3. In an algorithm like Google’s, with thousands of potential ranking inputs, if you see any single metric at 0.1 or higher, I tend to think that, in general, that is an interesting result.

That doesn’t prove something, and it doesn’t mean there’s a direct causal connection; it just means it is interesting. It’s worthy of further exploration. It’s worthy of understanding. It’s worthy of forming hypotheses and then trying to prove those wrong. It is interesting.

4. Correlation does tell us what more successful pages and sites do that less successful sites and pages don’t do.

Sometimes, in my opinion, that is just as interesting as what is actually causing rankings in Google. So you might say, “Oh, this doesn’t prove anything.” What it proves to me is pages that are getting more Facebook shares tend to do a good bit better than pages that are not getting as many Facebook shares.

I don’t really care, to be honest, whether that is a direct Google ranking factor or whether that’s just something that’s happening. If it’s happening in my space, if it’s happening in the world of SERPs that I care about, that is useful information for me to know and information that I should be applying, because it suggests that my competitors are doing this and that if I don’t do it, I probably won’t be as successful, or I may not be as successful as the ones who are. Certainly, I want to understand how they’re doing it and why they’re doing it.

5. None of these studies that I have ever seen so far have looked specifically at SERP features.

So one of the things that you have to remember, when you’re looking at these, is think organic, 10 blue link-style results. We’re not talking about AdWords, the paid results. We’re not talking about Knowledge Graph or featured snippets or image results or video results or any of these other, the news boxes, the Twitter results, anything else that goes in there. So this is kind of old-school, classic organic SEO.

6. Correlation is not a best practice.

So it does not mean that because this list descends and goes down in this order that those are the things you should do in that particular order. Don’t use this as a roadmap.

7. Low correlation does not mean that a metric or a tactic doesn’t work.

Example: when a high percentage of sites use a given tactic, that tactic will show a very low correlation. So, for example, when Moz first ran one of these studies (I think it was 2005, maybe ’07), we saw that keyword use in the title element was strongly correlated. I think it was probably around 0.2, 0.15, something like that. Then over time, it’s gone way, way down. Now, it’s something like 0.03, extremely small, infinitesimally small.

What does that mean? Well, it could mean a few things. It could mean Google is using it less as a ranking factor. It could mean that it was never connected and it’s just total speculation, total coincidence. Or it could mean that a lot more people who rank in the top 20 or 30 results, which is what these studies usually look at (top 10 to top 50 sometimes), are putting the keyword in the title, and therefore there’s just no difference between result number 31 and result number 1, because they both have it in the title. So you’re seeing a much lower correlation between having the keyword in the title and higher rankings. Be careful about how you intuit that.

Oh, one final note. I did put -0.02 here. A negative correlation means that as you see less of this thing, you tend to see higher rankings. Again, unless there is a strong negative correlation, I tend to watch out for these, or I tend to not pay too much attention. For example, the keyword in the meta description, it could just be that, well, it turns out pretty much everyone has the keyword in the meta description now, so this is just not a big differentiating factor.

What is correlation good for?

All right. What’s correlation actually good for? We talked about a bunch of myths, ways not to use it.

A. IDing the elements that more successful pages tend to have

So if I look across a correlation study and I see that pages that rank highly are twice as likely to have X as the ones that don’t rank highly, well, that is a good piece of data for me.

B. Watching elements over time to see if they rise or lower in correlation.

For example, we watch links very closely over time to see if they rise or lower so that we can say: “Gosh, does it look like links are getting more or less influential in Google’s rankings? Are they more or less correlated than they were last year or two years ago?” And if we see that drop dramatically, we might intuit, “Hey, we should test the power of links again. Time for another experiment to see if links still move the needle, or if they’re becoming less powerful, or if it’s merely that the correlation is dropping.”

C. Comparing sets of search results against one another to identify unique attributes of a vertical

So, for example, in a vertical like news, we might see that domain authority is much more important than it is in fitness, where smaller sites potentially have much more opportunity or dominate. Or we might see that something like https is not a great way to stand out in news, because everybody has it, but in fitness, it is a way to stand out and, in fact, the folks who do have it tend to do much better. Maybe they’ve invested more in their sites.

D. Judging a metric’s ability to predict rankings

Essentially, when I’m looking at a metric like domain authority, how good is that at telling me on average how much better one domain will rank in Google versus another? I can see that this number is a good indication of that. If that number goes down, domain authority is less predictive, less sort of useful for me. If it goes up, it’s more useful. I did this a couple years ago with Alexa Rank and SimilarWeb, looking at traffic metrics and which ones are best correlated with actual traffic, and found Alexa Rank is awful and SimilarWeb is quite excellent. So there you go.

E. Finding elements to test

So if I see that large images embedded on pages that already rank on page 1 of the search results have a 0.61 correlation with those pages’ images ranking in the first few image results, wow, that’s really interesting. You know what? I’m going to go test that: take big images, embed them on my pages that are ranking, and see if I can get the image results that I care about. That’s great information for testing.

This is all stuff that correlation is useful for. Correlation in SEO, especially when it comes to ranking factors or ranking elements, can be very misleading. I hope that this will help you to better understand how to use and not use that data.

Thanks. We’ll see you again next week for another edition of Whiteboard Friday.

Video transcription by Speechpad.com

The image used to promote this post was adapted with gratitude from the hilarious webcomic, xkcd.


Local ranking factors study finds reviews, organic SEO best practices boost local visibility

This is the second year of the LocalSEOGuide ranking factors study, performed in conjunction with UC Irvine.

The post Local ranking factors study finds reviews, organic SEO best practices boost local visibility appeared first on Search Engine Land.




SearchCap: Local ranking factors, keyword bidding & more

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Local ranking factors, keyword bidding & more appeared first on Search Engine Land.




SearchCap: Canada courts with Google, Google Maps sharing & SEO ranking factors

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Canada courts with Google, Google Maps sharing & SEO ranking factors appeared first on Search Engine Land.




SearchCap: Google Shopping ad updates, SEO ranking factors & nofollow links

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google Shopping ad updates, SEO ranking factors & nofollow links appeared first on Search Engine Land.




SEO ranking factors: What’s important, what’s not

This week, Google celebrated its 19th birthday. A lot has changed in nearly two decades. Rather than relying primarily on PageRank to evaluate the quality of web pages, Google now uses a whole array of techniques to suggest a wide range of content in response to queries, from simple direct answers…


