Tag Archive | "Authority"

How Do I Improve My Domain Authority (DA)?

Posted by Dr-Pete

The Short Version: Don’t obsess over Domain Authority (DA) for its own sake. Domain Authority shines at comparing your overall authority (your aggregate link equity, for the most part) to other sites and determining where you can compete. Attract real links that drive traffic, and you’ll improve both your Domain Authority and your rankings.

Unless you’ve been living under a rock, over a rock, or really anywhere rock-adjacent, you may know that Moz has recently invested a lot of time, research, and money in a new-and-improved Domain Authority. People who use Domain Authority (DA) naturally want to improve their score, and this is a question that I admit we’ve avoided at times, because like any metric, DA can be abused if taken out of context or viewed in isolation.

I set out to write a how-to post, but what follows can only be described as a belligerent FAQ …

Why do you want to increase DA?

This may sound like a strange question coming from an employee of the company that created Domain Authority, but it’s the most important question I can ask you. What’s your end-goal? Domain Authority is designed to be an indicator of success (more on that in a moment), but it doesn’t drive success. DA is not used by Google and will have no direct impact on your rankings. Increasing your DA solely to increase your DA is pointless vanity.

So, I don’t want a high DA?

I understand your confusion. If I had to over-simplify Domain Authority, I would say that DA is an indicator of your aggregate link equity. Yes, all else being equal, a high DA is better than a low DA, and it’s ok to strive for a higher DA, but high DA itself should not be your end-goal.

So, DA is useless, then?

No, but like any metric, you can’t use it recklessly or out of context. Our Domain Authority resource page dives into more detail, but the short answer is that DA is very good at helping you understand your relative competitiveness. Smart SEO isn’t about throwing resources at vanity keywords, but about understanding where you realistically have a chance at competing. Knowing that your DA is 48 is useless in a vacuum. Knowing that your DA is 48 and the sites competing on a query you’re targeting have DAs from 30-45 can be extremely useful. Likewise, knowing that your would-be competitors have DAs of 80+ could save you a lot of wasted time and money.

But Google says DA isn’t real!

This topic is a blog post (or eleven) in and of itself, but I’m going to reduce it to a couple of points. First, Google’s official statements tend to define terms very narrowly. What Google has said is that they don’t use a domain-level authority metric for rankings. Ok, let’s take that at face value. Do you believe that a new page on a low-authority domain (let’s say DA = 25) has an equal chance of ranking as one on a high-authority domain (DA = 75)? Of course not, because every domain benefits from its aggregate internal link equity, which is driven by the links to individual pages. Whether you measure that aggregate effect in a single metric or not, it still exists.

Let me ask another question: how do you measure the competitiveness of a new page that has no Page Authority (or PageRank, or whatever metrics Google uses)? This question is a big part of why Domain Authority exists — to help you understand your ability to compete on terms you haven’t targeted and for content you haven’t even written yet.


Seriously, give me some tips!

I’ll assume you’ve read all of my warnings and taken them seriously. You want to improve your Domain Authority because it’s the best authority metric you have, and authority is generally a good thing. There are no magical secrets to improving the factors that drive DA, but here are the main points:

1. Get more high-authority links

Shocking, I know, but that’s the long and short of it. Links from high-authority sites and pages still carry significant ranking power, and they drive both Domain Authority and Page Authority. Even if you choose to ignore DA, you know high-authority links are a good thing to have. Getting them is the topic of thousands of posts and more than a couple of full-length novels (well, ok, books — but there’s probably a novel and feature film in the works).

2. Get fewer spammy links

Our new DA score does a much better job of discounting bad links, as Google clearly tries to do. Note that “bad” doesn’t mean low-authority links. It’s perfectly natural to have some links from low-authority domains and pages, and in many cases it’s both relevant and useful to searchers. Moz’s Spam Score is pretty complex, but as humans we intuitively know when we’re chasing low-quality, low-relevance links. Stop doing that.

3. Get more traffic-driving links

Our new DA score also factors in whether links come from legitimate sites with real traffic, because that’s a strong signal of usefulness. Whether or not you use DA regularly, you know that attracting links that drive traffic is a good thing that indicates relevance to searchers and drives bottom-line results. It’s also a good reason to stop chasing every link you can at all costs. What’s the point of a link that no one will see, that drives no traffic, and that is likely discounted by both our authority metrics and Google?


You can’t fake real authority

Like any metric based on signals outside of our control, it’s theoretically possible to manipulate Domain Authority. The question is: why? If you’re using DA to sell DA 10 links for $1, DA 20 links for $2, and DA 30 links for $3, please, for the love of all that is holy, stop (and yes, I’ve seen that almost verbatim in multiple email pitches). If you’re buying those links, please spend that money on something more useful, like sandwiches.

Do the work and build the kind of real authority that moves the needle both for Moz metrics and Google. It’s harder in the short-term, but the dividends will pay off for years. Use Domain Authority to understand where you can compete today, cost-effectively, and maximize your investments. Don’t let it become just another vanity metric.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in Latest News | Comments Off

There’s no shortcut to authority: Why you need to take E-A-T seriously

Following expertise, authoritativeness and trustworthiness guidelines should be a part of every SEO strategy no matter your niche.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


How to Use Domain Authority for SEO – Whiteboard Friday

Posted by Cyrus-Shepard

Domain Authority is an incredibly well-known metric throughout the SEO industry, but what exactly is the right way to use it? In this week’s edition of Whiteboard Friday, we’re delighted to welcome Cyrus Shepard as he explains both what’s new with the new Domain Authority 2.0 update, and how to best harness its power for your own SEO success. 


Video Transcription

Howdy, SEO fans. Welcome to a very special edition of Whiteboard Friday. I’m Cyrus Shepard. I’m honored to be here today with Moz to talk about the new Domain Authority. I want to talk about how to use Domain Authority to do actual SEO.

What is Domain Authority?

Let’s start with a definition of what Domain Authority actually is, because there’s a lot of confusion out there. Domain Authority is a metric, from 1 to 100, which predicts how well a domain will rank in Google. Now let’s break that down a little bit and talk about some of the myths of Domain Authority.

Is Domain Authority a ranking factor? No, Domain Authority is not a ranking factor. Does Google use Domain Authority in its algorithm? No, Google does not use Domain Authority in its algorithm. Now Google may use some domain-like metrics based on links similar to Domain Authority, but they do not use Domain Authority itself. In fact, it’s best if you don’t bring it up with them. They don’t tend to like that very much.

So if it’s not a ranking factor, if Google doesn’t use it, what does Domain Authority actually do? It does one thing very, very well. It predicts rankings. That’s what it was built to do. That’s what it was designed to do, and it does that job very, very well. And because of that, we can use it for SEO in a lot of different ways. So Domain Authority has been around since 2010, about 8 years now, and since then it’s become a very popular metric, used and abused in different ways.

What’s New With Domain Authority 2.0?

So what’s new about the new Domain Authority that makes it so great and less likely to be abused and gives it so many more uses? Before I go into this, a big shout-out to two of the guys who helped develop this — Russ Jones and Neil Martinsen-Burrell — and many other smart people at Moz. Some of our search scientists did a tremendous job of updating this metric for 2019.

1. Bigger Link Index

So the first thing is that the new Domain Authority is based on a new, bigger link index: Link Explorer, which was released last year. It contains 35 trillion links. There are different ways of judging index sizes, but that is one of the biggest, if not the biggest, link indexes publicly available that we know of.

Thirty-five trillion links. To give you an idea of how big that is, if you were to count one link per second, you would be counting for 1.1 million years. That’s a lot of links, and that’s how many links are in the index that the new Domain Authority is based upon. Second of all, it uses a new machine learning model. Part of Domain Authority looks at Google rankings and uses machine learning to fit a model that predicts how those rankings stack up.
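That back-of-the-envelope number checks out; here is a quick arithmetic sanity check (not part of the original post):

```python
# Counting 35 trillion links at one link per second:
SECONDS_PER_YEAR = 60 * 60 * 24 * 365  # 31,536,000 seconds in a non-leap year

links = 35_000_000_000_000  # 35 trillion
years = links / SECONDS_PER_YEAR

print(f"{years:,.0f} years")  # roughly 1.1 million years
```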

2. New Machine Learning Model

Now the new Domain Authority not only looks at what’s winning in Google search, but it’s also looking at what’s not ranking in Google search. The old model used to just look at the winners. This makes it much more accurate at determining where you might fall or where any domain or URL might fall within that prediction. 

3. Spam Score Incorporation

Next the new Domain Authority incorporates spam detection.

Spam Score is a proprietary Moz metric that looks at a bunch of on-page factors, and those have been incorporated into the new metric, which makes it much more reliable. 

4. Detects Link Manipulation

Also, and this is very important: the new Domain Authority detects link manipulation. This means people who are buying and selling links, PBNs (private blog networks), things like that.

It’s much better at this. In fact, Russ Jones said in a recent webinar that link buyers will drop an average of 11 points under the new Domain Authority. The new score is much better at rooting out link manipulation, so it much more closely resembles what Google is attempting to do.

5. Daily Updates

Lastly, the new Domain Authority is updated daily. This is a huge improvement. The old Domain Authority used to update approximately monthly.* The new Domain Authority is constantly being updated, and our search scientists are constantly adding improvements as they come along.

So it’s being updated and improved much more frequently. What does this mean? The new Domain Authority is the most accurate domain-level metric for predicting Google search results that we know of. Known ranking factors, like title tags or even backlinks generally, each predict rankings to a certain degree, but Domain Authority blows them out of the water in predictive power.

*Note: Our former link research tool, Open Site Explorer, updated on a monthly cadence, resulting in monthly updates to DA scores. With the launch of Link Explorer in April 2018, Domain Authority scores moved to a daily update cadence. This remains true with the new underlying algorithm, Domain Authority 2.0.

How to Use Domain Authority for SEO

So the question is how do we actually use this? We have this tremendous power with Domain Authority that can predict rankings to a certain degree. How do we use this for SEO? So I want to go over some general tips for success. 

The first tip, never use Domain Authority in isolation. You always want to use it with other metrics and in context, because it can only tell you so much.

It’s a powerful tool, but it’s limited. For example, when you’re evaluating rankings, you’re going to want to look at the keyword targeting. You’re going to want to look at the on-page content, the domain history, other things like that. So never use Domain Authority by itself. That’s a key tip.

Second, you want to keep in mind that the scale of Domain Authority is roughly logarithmic.

It’s not linear. Now what does this mean? It’s fairly easy to move from a zero or a one Domain Authority to a ten: you can get a handful of links, and that works pretty well. But moving from, say, a 70 to an 80 is much, much harder. It gets harder as you go higher, so a DA 40 is not twice a DA 20.

It’s actually much more than twice as strong, because each point gets harder to earn as you approach 100. Sites like Google and Facebook sit near the 100 range, and everything else funnels in below them.

Next, keep in mind that DA is a relative metric. When you’re using DA, you always want to compare between competitors or your past scores.

Having a DA 50 doesn’t really tell you much unless you’re comparing it to other DA scores. So if you’re looking in Google and a site has a DA of 50, it doesn’t make much sense unless you put it in the context of “what do the other sites have?” Are they 40? Are they 60? In that regard, when you’re looking at your own DA, you can compare against past performance or competitors.

So if I have a 50 this month and a 40 last month, that might tell me that my ability to rank in Google has increased in that time period. 

1. Evaluate Potential Value of a Link

So talking about SEO use cases, we have this. We understand how to use it. What are some practical ways to use Domain Authority? Well, a very popular one with the old DA as well is judging the potential value of a link.

For instance, you have 1,000 outreach targets that you’re thinking about asking for a link, but you only have time for 100 because you want to spend your time wisely and it’s not worth it to ask all 1,000. So you might use DA as a filter to find the most valuable link targets. A DA 90 might be more valuable than a DA 5 or a 10.

But again, you do not want to use it in isolation. You’d be looking at other metrics as well, such as Page Authority, relevance, and traffic. But still, DA might be a valuable metric to add to that evaluation.
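A minimal sketch of that filtering step, with made-up domains and scores (the 50/50 weighting is arbitrary and only illustrates combining DA with another signal rather than using it alone):

```python
# Hypothetical outreach targets: (domain, Domain Authority, topical relevance 0-1).
targets = [
    ("bigblog.example", 90, 0.4),
    ("nichesite.example", 35, 0.9),
    ("tinysite.example", 8, 0.8),
    ("randomdir.example", 55, 0.1),
]

def priority(target):
    """Blend DA with relevance so DA is never used in isolation."""
    _, da, relevance = target
    return (da / 100) * 0.5 + relevance * 0.5  # arbitrary 50/50 weighting

# Keep only the top N targets worth the outreach time.
shortlist = sorted(targets, key=priority, reverse=True)[:2]
print([domain for domain, _, _ in shortlist])
```

In practice you would fold in Page Authority, traffic, and spam signals the same way, adjusting the weights to match your goals.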

2. Judging Keyword Difficulty

Next is judging keyword difficulty: when you look at a SERP, what is your potential to rank for that particular keyword?

If you look at a SERP and everybody has a DA 95, it’s going to be pretty hard to rank in that SERP. But if everybody has a lower DA, you might have a chance. But again, you’re going to want to look at other metrics, such as Page Authority, keyword volume, on-page targeting. You can use Moz’s Keyword Difficulty Score to run these calculations as well.

3. Campaign Performance

Very popular in the agency world is link campaign performance or campaign performance in general, and this kind of makes sense. If you’re building links for a client and you want to show progress, a common way of doing this is showing Domain Authority, meaning that we built these links for you and now your potential to rank is higher.

It’s a good metric, but it’s not the only metric I would report. I would definitely report rankings for targeted keywords, along with traffic and conversions, because ranking potential is one thing, but I’d also like to show that those links actually did something. So I’d be more inclined to lead with those other metrics. But DA is perfectly fine to report for campaign performance as long as you show it in context.

4. Purchasing Existing Domains

A popular one on the marketplaces is buying existing domains. Sites like Flippa often show DA or some similar metric like that. Again, the new Domain Authority is going to be much better at rooting out link manipulation, so these scores might be a little more trustworthy in this sense. But again, never buy a domain just on Domain Authority alone.

You’re going to want to look at a lot of factors, such as the content, the traffic, the domain history, things like that. But Domain Authority might be a good first-line filter for you. 

How to Find Domain Authority Metrics

So where can you find the new Domain Authority? It is available right now. You can go to Link Explorer. It’s available through the Moz API.

You can also download the free MozBar and turn on the SERP overlay, and it will show you the DA of every result as you browse through Google.

It’s available in Moz Campaigns and also Keyword Explorer. I hope this gives you some ideas about how to use Domain Authority. Please share your ideas and thoughts in the comments below. If you like this video, please share.

Thanks a lot, everybody. Have a great day.

Video transcription by Speechpad.com





SEOs beware: Link builders are back with bogus Domain Authority pitches

Stop optimizing for Domain Authority; it has no impact on Google rankings.



Please visit Search Engine Land for the full article.




A Comprehensive Analysis of the New Domain Authority

Posted by rjonesx.

Moz’s Domain Authority is requested over 1,000,000,000 times per year, it’s referenced millions of times on the web, and it has become a veritable household name among search engine optimizers for a variety of use cases, from determining the success of a link building campaign to qualifying domains for purchase. With the launch of Moz’s entirely new, improved, and much larger link index, we recognized the opportunity to revisit Domain Authority with the same rigor as we did keyword volume years ago (which ushered in the era of clickstream-modeled keyword data).

What follows is a rigorous treatment of the new Domain Authority metric. What I will not do in this piece is rehash the debate over whether Domain Authority matters or what its proper use cases are. I have addressed those before, and will address them at length in a later post. Rather, I intend to spend the following paragraphs addressing the new Domain Authority metric from multiple directions.

Correlations between DA and SERP rankings

The most important component of Domain Authority is how well it correlates with search results. But first, let’s get the correlation-versus-causation objection out of the way: Domain Authority does not cause search rankings. It is not a ranking factor. Domain Authority predicts the likelihood that one domain will outrank another. That being said, its usefulness as a metric is tied in large part to this value. The stronger the correlation, the more valuable Domain Authority is for predicting rankings.

Methodology

Determining the “correlation” between a metric and SERP rankings has been accomplished in many different ways over the years. Should we compare against the “true first page,” top 10, top 20, top 50 or top 100? How many SERPs do we need to collect in order for our results to be statistically significant? It’s important that I outline the methodology for reproducibility and for any comments or concerns on the techniques used. For the purposes of this study, I chose to use the “true first page.” This means that the SERPs were collected using only the keyword with no additional parameters. I chose to use this particular data set for a number of reasons:

  • The true first page is what most users experience, thus the predictive power of Domain Authority will be focused on what users see.
  • By not using any special parameters, we’re likely to get Google’s typical results. 
  • By not extending beyond the true first page, we’re likely to avoid manually penalized sites (which can impact the correlations with links).
  • We did NOT report correlations on the same data we trained on. That is to say, we trained on the top 10 but are reporting correlations on the true first page. This prevents the results from being overly biased toward our model.

I randomly selected 16,000 keywords from the United States keyword corpus for Keyword Explorer. I then collected the true first page for all of these keywords (completely different from those used in the training set). I extracted the URLs, but I also chose to remove duplicate domains (i.e., if the same domain occurred one after another). For a time, Google clustered domains together in the SERPs under certain circumstances. It was easy to spot these clusters, as the second and later listings were indented. No such indentations are present any longer, but we can’t be certain that Google never groups domains. If they do group domains, it would throw off the correlation, because it would be the grouping, and not the traditional link-based algorithm, doing the work.
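The consecutive-duplicate removal described above can be sketched as follows (the function name is mine, not from the study):

```python
from itertools import groupby

def dedupe_consecutive_domains(serp_domains):
    """Collapse runs of the same domain into a single entry,
    mirroring the removal of clustered SERP listings."""
    # groupby with no key groups consecutive equal items.
    return [domain for domain, _ in groupby(serp_domains)]

serp = ["a.com", "b.com", "b.com", "c.com", "b.com"]
print(dedupe_consecutive_domains(serp))  # ['a.com', 'b.com', 'c.com', 'b.com']
```

Note that only adjacent repeats are collapsed; a domain reappearing later in the SERP is kept, matching the "one after another" criterion.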

I collected the Domain Authority (Moz), Citation Flow and Trust Flow (Majestic), and Domain Rank (Ahrefs) for each domain and calculated the mean Spearman correlation coefficient for each SERP. I then averaged the coefficients for each metric.
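In pure Python, that per-SERP calculation might look like the sketch below (it uses the no-ties Spearman formula and toy numbers; the study averaged one coefficient per SERP across all 16,000 keywords):

```python
def spearman(xs, ys):
    """Spearman rank correlation via rho = 1 - 6*sum(d^2) / (n*(n^2 - 1)),
    valid when there are no tied values."""
    n = len(xs)
    rank = lambda vs: {v: i + 1 for i, v in enumerate(sorted(vs))}
    rx, ry = rank(xs), rank(ys)
    d2 = sum((rx[x] - ry[y]) ** 2 for x, y in zip(xs, ys))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# One toy SERP: position 1 is the top result, so a useful authority metric
# should correlate negatively with position (higher DA, lower position).
positions = [1, 2, 3, 4]
da_scores = [80, 72, 55, 61]
serp_rhos = [spearman(positions, da_scores)]  # one coefficient per SERP

mean_rho = sum(serp_rhos) / len(serp_rhos)
print(round(mean_rho, 2))  # -0.8 for this single toy SERP
```

A negative mean coefficient is the expected outcome here, which is why the article notes the sign was inverted in the graph for readability.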

Outcome

Moz’s new Domain Authority has the strongest correlation with SERPs among the competing strength-of-domain, link-based metrics in the industry. The sign (-/+) has been inverted in the graph for readability, although the actual coefficients are negative (and should be).

Moz’s Domain Authority scored ~.12, or roughly 6% stronger than the next best competitor (Domain Rank by Ahrefs). Domain Authority performed 35% better than Citation Flow and 18% better than Trust Flow. This isn’t surprising, in that Domain Authority is trained to predict rankings while our competitors’ strength-of-domain metrics are not. It shouldn’t be taken as a negative that our competitors’ strength-of-domain metrics don’t correlate as strongly as Moz’s Domain Authority — rather, it’s simply exemplary of the intrinsic differences between the metrics. That being said, if you want a metric that best predicts rankings at the domain level, Domain Authority is that metric.

Note: At first blush, Domain Authority’s improvements over the competition are, frankly, underwhelming. The truth is that we could quite easily increase the correlation further, but doing so would risk over-fitting and compromising a secondary goal of Domain Authority…

Handling link manipulation

Historically, Domain Authority has focused on a single objective: maximizing the predictive capacity of the metric. All we wanted were the highest correlations. However, Domain Authority has become, for better or worse, synonymous with “domain value” in many sectors, such as among link buyers and domainers. Subsequently, as bizarre as it may sound, Domain Authority has itself been targeted for spam in order to bolster the score and sell at a higher price. While these crude link manipulation techniques didn’t work so well in Google, they were sufficient to increase Domain Authority. We decided to rein that in.

Data sets

The first thing we did was compile a series of data sets that corresponded with industries we wished to impact, knowing that Domain Authority was regularly manipulated in these circles.

  • Random domains
  • Moz customers
  • Blog comment spam
  • Low-quality auction domains
  • Mid-quality auction domains
  • High-quality auction domains
  • Known link sellers
  • Known link buyers
  • Domainer network
  • Link network

While it would be my preference to release all the data sets, I’ve chosen not to in order to not “out” any website in particular. Instead, I opted to provide these data sets to a number of search engine marketers for validation. The only data set not offered for outside validation was Moz customers, for obvious reasons.

Methodology

For each of the above data sets, I collected both the old and new Domain Authority scores. This was all conducted on February 28th in order to have parity across tests. I then calculated the relative difference between the old DA and new DA within each group. Finally, I compared the various data set results against one another to confirm that the model addresses the various methods of inflating Domain Authority.
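That comparison step might be sketched as follows (the scores here are invented; only the method of computing the relative difference mirrors the post):

```python
# Hypothetical mean DA per data set under the old and new models.
scores = {
    "random domains": {"old": 32.0, "new": 30.0},
    "link sellers":   {"old": 45.0, "new": 19.8},
}

def relative_change(old, new):
    """Percent change from the old DA to the new DA."""
    return (new - old) / old * 100

for group, s in scores.items():
    print(f"{group}: {relative_change(s['old'], s['new']):+.1f}%")
```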

Results

In the above graph, blue represents the Old Average Domain Authority for that data set and orange represents the New Average Domain Authority for that same data set. One immediately noticeable feature is that every category drops. Even random domains drop. This is a re-centering of the Domain Authority score and should cause no alarm to webmasters. There is, on average, a 6% reduction in Domain Authority for randomly selected domains from the web. Thus, if your Domain Authority drops a few points, you are well within the range of normal. Now, let’s look at the various data sets individually.



Random domains: -6.1%

Using the same methodology for finding random domains that we use for collecting comparative link statistics, I selected 1,000 domains and determined that there is, on average, a 6.1% drop in Domain Authority. It’s important that webmasters recognize this, as the shift is likely to affect most sites and is nothing to worry about.

Moz customers: -7.4%

Of immediate interest to Moz is how our own customers perform in relation to the random set of domains. On average, the Domain Authority of Moz customers dropped by 7.4%. This is very close to the random set of URLs and indicates that most Moz customers are likely not using techniques to manipulate DA to any large degree.

Link buyers: -15.9%

Surprisingly, link buyers only lost 15.9% of their Domain Authority. In retrospect, this seems reasonable. First, we looked specifically at link buyers from blog networks, which aren’t as spammy as many other techniques. Second, most of the sites paying for links are also optimizing their site’s content, which means the sites do rank, sometimes quite well, in Google. Because Domain Authority trains against actual rankings, it’s reasonable to expect that the link buyers data set would not be impacted as highly as other techniques because the neural network learns that some link buying patterns actually work. 

Comment spammers: -34%

Here’s where the fun starts. The neural network behind Domain Authority was able to drop comment spammers’ average DA by 34%. I was particularly pleased with this one because of all the types of link manipulation addressed by Domain Authority, comment spam is, in my honest opinion, no better than vandalism. Hopefully this will have a positive impact on decreasing comment spam — every little bit counts. 

Link sellers: -56%

I was actually quite surprised, at first, that link sellers on average dropped 56% in Domain Authority. I knew that link sellers often participated in link schemes (normally interlinking their own blog networks to build up DA) so that they can charge higher prices. However, it didn’t occur to me that link sellers would be easier to pick out because they explicitly do not optimize their own sites beyond links. Subsequently, link sellers tend to have inflated, bogus link profiles and flimsy content, which means they tend to not rank in Google. If they don’t rank, then the neural network behind Domain Authority is likely to pick up on the trend. It will be interesting to see how the market responds to such a dramatic change in Domain Authority.

High-quality auction domains: -61%

One of the features that I’m most proud of with regard to Domain Authority is that it effectively addressed link manipulation in order of our intuition regarding quality. I created three different data sets out of one larger data set (auction domains), using qualifiers like price, TLD, and archive.org status to label each domain as high-quality, mid-quality, or low-quality. In theory, if the neural network does its job correctly, we should see the high-quality domains impacted the least and the low-quality domains impacted the most. This is exactly the pattern rendered by the new model. High-quality auction domains dropped an average of 61% in Domain Authority. That seems really high for “high-quality” auction domains, but even a cursory glance at the backlink profiles of domains that are up for sale in the $10K+ range shows clear link manipulation. The domainer industry, especially the domainer-for-SEO industry, is rife with spam.

Link network: -79%

There is one network on the web that troubles me more than any other. I won’t name it, but it’s particularly pernicious because the sites in this network all link to the top 1,000,000 sites on the web. If your site is in the top 1,000,000 on the web, you’ll likely see hundreds of root linking domains from this network no matter which link index you look at (Moz, Majestic, or Ahrefs). You can imagine my delight to see that it drops roughly 79% in Domain Authority, and rightfully so, as the vast majority of these sites have been banned by Google.

Mid-quality auction domains: -95%

Continuing with the pattern regarding the quality of auction domains, you can see that “mid-quality” auction domains dropped nearly 95% in Domain Authority. This is huge. Bear in mind that these drastic drops are not combined with losses in correlation with SERPs; rather, the neural network is learning to distinguish between backlink profiles far more effectively, separating the wheat from the chaff. 

Domainer networks: -97%

If you spend any time looking at dropped domains, you have probably come upon a domainer network where there are a series of sites enumerated and all linking to one another. For example, the first site might be sbt001.com, then sbt002.com, and so on and so forth for thousands of domains. While it’s obvious for humans to look at this and see a pattern, Domain Authority needed to learn that these techniques do not correlate with rankings. The new Domain Authority does just that, dropping the domainer networks we analyzed on average by 97%.

Low-quality auction domains: -98%

Finally, the worst offenders — low-quality auction domains — dropped 98% on average. Domain Authority just can’t be fooled in the way it has in the past. You have to acquire good links in the right proportions (in accordance with a natural model and sites that already rank) if you wish to have a strong Domain Authority score. 

What does this mean?

For most webmasters, this means very little. Your Domain Authority might drop a little bit, but so will your competitors’. For search engine optimizers, especially consultants and agencies, it means quite a bit. The inventories of known link sellers will probably diminish dramatically overnight. High DA links will become far more rare. The same is true of those trying to construct private blog networks (PBNs). Of course, Domain Authority doesn’t cause rankings so it won’t impact your current rank, but it should give consultants and agencies a much smarter metric for assessing quality.

What are the best use cases for DA?

  • Compare changes in your Domain Authority with your competitors. If you drop significantly more, or increase significantly more, it could indicate that there are important differences in your link profile.
  • Compare changes in your Domain Authority over time. The new Domain Authority will update historically as well, so you can track your DA. If your DA is decreasing over time, especially relative to your competitors, you probably need to get started on outreach.
  • Assess link quality when looking to acquire dropped or auction domains. Those looking to acquire dropped or auction domains now have a much more powerful tool in their hands for assessing quality. Of course, DA should not be the primary metric for assessing the quality of a link or a domain, but it certainly should be in every webmaster’s toolkit.

What should we expect going forward?

We aren’t going to rest. An important philosophical shift has taken place at Moz with regards to Domain Authority. In the past, we believed it was best to keep Domain Authority static, rarely updating the model, in order to give users an apples-to-apples comparison. Over time, though, this meant that Domain Authority would become less relevant. Given the rapidity with which Google updates its results and algorithms, the new Domain Authority will be far more agile as we give it new features, retrain it more frequently, and respond to algorithmic changes from Google. We hope you like it.


Be sure to join us on Thursday, March 14th at 10am PT at our upcoming webinar discussing strategies & use cases for the new Domain Authority:

Save my spot

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in Latest News | Comments Off

SearchCap: Local SEO survey, Moz domain authority and GOOG earnings

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


A New Domain Authority Is Coming Soon: What’s Changing, When, & Why

Posted by rjonesx.

Howdy Moz readers,

I’m Russ Jones, Principal Search Scientist at Moz, and I am excited to announce a fantastic upgrade coming next month to one of the most important metrics Moz offers: Domain Authority.

Domain Authority has become the industry standard for measuring the strength of a domain relative to ranking. We recognize that stability plays an important role in making Domain Authority valuable to our customers, so we wanted to make sure that the changes behind the new Domain Authority were meaningful ones.

Learn more about the new DA

What’s changing?

What follows is an account of some of the technical changes behind the new Domain Authority and why they matter.

The training set:

Historically, we’ve relied on training Domain Authority against an unmanipulated, large set of search results. In fact, this has been the standard methodology across our industry. But we have found a way to improve upon it that fundamentally, from the ground up, makes Domain Authority more reliable.

The training algorithm:

Rather than relying on a complex linear model, we’ve made the switch to a neural network. This offers several benefits, including a much more nuanced model that can detect link manipulation.

The model factors:

We have greatly improved upon the ranking factors behind Domain Authority. In addition to looking at link counts, we’ve now been able to integrate our proprietary Spam Score and complex distributions of links based on quality and traffic, along with a bevy of other factors.

The backbone:

At the heart of Domain Authority is the industry’s leading link index, our new Moz Link Explorer. With over 35 trillion links, our exceptional data turns the brilliant statistical work by Neil Martinsen-Burrell, Chas Williams, and so many more amazing Mozzers into a true industry standard.

What does this mean?

These fundamental improvements to Domain Authority will deliver a better, more trustworthy metric than ever before. We can remove spam, improve correlations, and, most importantly, update Domain Authority relative to all the changes that Google makes.

It means that you will see some changes to Domain Authority when the launch occurs. We anchored the new model to our existing Domain Authority scores, which minimizes changes, but with all the improvements there will no doubt be some fluctuation in Domain Authority scores across the board.

What should we do?

Use DA as a relative metric, not an absolute one.

First, make sure that you use Domain Authority as a relative metric. Domain Authority is meaningless when it isn’t compared to other sites. What matters isn’t whether your site drops or increases — it’s whether it drops or increases relative to your competitors. When we roll out the new Domain Authority, make sure you check your competitors’ scores as well as your own, as they will likely fluctuate in a similar direction.

Know how to communicate changes with clients, colleagues, and stakeholders

Second, be prepared to communicate with your clients or webmasters about the changes and improvements to Domain Authority. While change is always disruptive, the new Domain Authority is better than ever and will allow them to make smarter decisions about search engine optimization strategies going forward.

Expect DA to keep pace with Google

Finally, expect that we will be continuing to improve Domain Authority. Just like Google makes hundreds of changes to their algorithm every year, we intend to make Domain Authority much more responsive to Google’s changes. Even when Google makes fundamental algorithm updates like Penguin or Panda, you can feel confident that Moz’s Domain Authority will be as relevant and useful as ever.

When is it happening?

We plan on rolling out the new Domain Authority on March 5th, 2019. We will have several more communications between now and then to help you and your clients best respond to the new Domain Authority, including a webinar on February 21st. We hope you’re as excited as we are and look forward to continuing to bring you the most reliable, cutting-edge metrics our industry has to offer.


Be sure to check out the resources we’ve prepared to help you acclimate to the change, including an educational whitepaper and a presentation you can download to share with your clients, team, and stakeholders:

Explore more resources here


Rewriting the Beginner’s Guide to SEO, Chapter 6: Link Building & Establishing Authority

Posted by BritneyMuller

In Chapter 6 of the new Beginner’s Guide to SEO, we’ll be covering the dos and don’ts of link building and ways your site can build its authority. If you missed them, we’ve got the drafts of our outline, Chapter One, Chapter Two, Chapter Three, Chapter Four, and Chapter Five for your reading pleasure. Be sure to let us know what you think of Chapter 6 in the comments!


Chapter 6: Link Building & Establishing Authority

Turn up the volume.

You’ve created content that people are searching for, that answers their questions, and that search engines can understand, but those qualities alone don’t mean it’ll rank. To outrank the rest of the sites with those qualities, you have to establish authority. That can be accomplished by earning links from authoritative websites, building your brand, and nurturing an audience who will help amplify your content.

Google has confirmed that links and quality content (which we covered back in Chapter 4) are two of the three most important ranking factors for SEO. Trustworthy sites tend to link to other trustworthy sites, and spammy sites tend to link to other spammy sites. But what is a link, exactly? How do you go about earning them from other websites? Let’s start with the basics.

What are links?

Inbound links, also known as backlinks or external links, are HTML hyperlinks that point from one website to another. They’re the currency of the Internet, as they act a lot like real-life reputation. If you went on vacation and asked three people (all completely unrelated to one another) what the best coffee shop in town was, and they all said, “Cuppa Joe on Main Street,” you would feel confident that Cuppa Joe is indeed the best coffee place in town. Links do that for search engines.

Since the late 1990s, search engines have treated links as votes for popularity and importance on the web.

Internal links, or links that connect internal pages of the same domain, work very similarly for your website. A high amount of internal links pointing to a particular page on your site will provide a signal to Google that the page is important, so long as it’s done naturally and not in a spammy way.

The engines themselves have refined the way they view links, now using algorithms to evaluate sites and pages based on the links they find. But what’s in those algorithms? How do the engines evaluate all those links? It all starts with the concept of E-A-T.

You are what you E-A-T

Google’s Search Quality Rater Guidelines put a great deal of importance on the concept of E-A-T — an acronym for expert, authoritative, and trustworthy. Sites that don’t display these characteristics tend to be seen as lower-quality in the eyes of the engines, while those that do are subsequently rewarded. E-A-T is becoming more and more important as search evolves and increases the importance of solving for user intent.

Creating a site that’s considered expert, authoritative, and trustworthy should be your guiding light as you practice SEO. Not only will it simply result in a better site, but it’s future-proof. After all, providing great value to searchers is what Google itself is trying to do.

E-A-T and links to your site

The more popular and important a site is, the more weight the links from that site carry. A site like Wikipedia, for example, has thousands of diverse sites linking to it. This indicates it provides lots of expertise, has cultivated authority, and is trusted among those other sites.

To earn trust and authority with search engines, you’ll need links from websites that display the qualities of E-A-T. These don’t have to be Wikipedia-level sites, but they should provide searchers with credible, trustworthy content.

  • Tip: Moz has proprietary metrics to help you determine how authoritative a site is: Domain Authority, Page Authority, and Spam Score. In general, you’ll want links from sites with a higher Domain Authority than your site’s.

Followed vs. nofollowed links

Remember how links act as votes? The rel=nofollow attribute (pronounced as two words, “no follow”) allows you to link to a resource while removing your “vote” for search engine purposes.

Just like it sounds, “nofollow” tells search engines not to follow the link. Some engines still follow them simply to discover new pages, but these links don’t pass link equity (the “votes of popularity” we talked about above), so they can be useful in situations where a page is either linking to an untrustworthy source or was paid for or created by the owner of the destination page (making it an unnatural link).

Say, for example, you write a post about link building practices, and want to call out an example of poor, spammy link building. You could link to the offending site without signaling to Google that you trust it.

Standard links (ones that haven’t had nofollow added) look like this:

<a href="https://moz.com">I love Moz</a>

Nofollow link markup looks like this:

<a href="https://moz.com" rel="nofollow">I love Moz</a>

If followed links pass all the link equity, shouldn’t that mean you want only followed links?

Not necessarily. Think about all the legitimate places you can create links to your own website: a Facebook profile, a Yelp page, a Twitter account, etc. These are all natural places to add links to your website, but they shouldn’t count as votes for your website. (Setting up a Twitter profile with a link to your site isn’t a vote from Twitter that they like your site.)

It’s natural for your site to have a balance between nofollowed and followed backlinks in its link profile (more on link profiles below). A nofollow link might not pass authority, but it could send valuable traffic to your site and even lead to future followed links.

  • Tip: Use the MozBar extension for Google Chrome to highlight links on any page to find out whether they’re nofollow or follow without ever having to view the source code!

Your link profile

Your link profile is an overall assessment of all the inbound links your site has earned: the total number of links, their quality (or spamminess), their diversity (is one site linking to you hundreds of times, or are hundreds of sites linking to you once?), and more. The state of your link profile helps search engines understand how your site relates to other sites on the Internet. There are various SEO tools that allow you to analyze your link profile and begin to understand its overall makeup.
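As a rough illustration of the dimensions a link-profile assessment measures, here is a minimal sketch computing the total link count, linking-domain diversity, and the share held by the single biggest source. The function name and the sample data are invented for this example:

```python
from collections import Counter

def profile_summary(backlinks):
    """backlinks: list of (source_root_domain, target_url) pairs."""
    domains = Counter(src for src, _ in backlinks)
    total = len(backlinks)
    unique = len(domains)
    # Diversity of 1.0 means every link comes from a different root domain
    diversity = unique / total if total else 0.0
    top_source, top_count = domains.most_common(1)[0] if domains else (None, 0)
    return {"total_links": total,
            "linking_domains": unique,
            "diversity": round(diversity, 2),
            "top_source_share": round(top_count / total, 2) if total else 0.0}

links = [("a.com", "/"), ("a.com", "/blog"), ("b.org", "/"), ("c.net", "/")]
print(profile_summary(links))
# {'total_links': 4, 'linking_domains': 3, 'diversity': 0.75, 'top_source_share': 0.5}
```

A low diversity score (one site linking to you hundreds of times) and a high top-source share are exactly the hundreds-of-links-from-one-site pattern described above.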

How can I see which inbound links point to my website?

Visit Moz Link Explorer and type in your site’s URL. You’ll be able to see how many and which websites are linking back to you.

What are the qualities of a healthy link profile?

When people began to learn about the power of links, they began manipulating them for their benefit. They’d find ways to gain artificial links just to increase their search engine rankings. While these dangerous tactics can sometimes work, they are against Google’s terms of service and can get a website deindexed (removal of web pages or entire domains from search results). You should always try to maintain a healthy link profile.

A healthy link profile is one that indicates to search engines that you’re earning your links and authority fairly. Just like you shouldn’t lie, cheat, or steal, you should strive to ensure your link profile is honest and earned via your hard work.

Links are earned or editorially placed

Editorial links are links added naturally by sites and pages that want to link to your website.

The foundation of acquiring earned links is almost always through creating high-quality content that people genuinely wish to reference. This is where creating 10X content (a way of describing extremely high-quality content) is essential! If you can provide the best and most interesting resource on the web, people will naturally link to it.

Naturally earned links require no specific action from you, other than the creation of worthy content and the ability to create awareness about it.

  • Tip: Earned mentions are often unlinked! When websites are referring to your brand or a specific piece of content you’ve published, they will often mention it without linking to it. To find these earned mentions, use Moz’s Fresh Web Explorer. You can then reach out to those publishers to see if they’ll update those mentions with links.

Links are relevant and from topically similar websites

Links from websites within a topic-specific community are generally better than links from websites that aren’t relevant to your site. If your website sells dog houses, a link from the Society of Dog Breeders matters much more than one from the Roller Skating Association. Additionally, links from topically irrelevant sources can send confusing signals to search engines regarding what your page is about.

  • Tip: Linking domains don’t have to match the topic of your page exactly, but they should be related. Avoid pursuing backlinks from sources that are completely off-topic; there are far better uses of your time.

Anchor text is descriptive and relevant, without being spammy

Anchor text helps tell Google what the topic of your page is about. If dozens of links point to a page with a variation of a word or phrase, the page has a higher likelihood of ranking well for those types of phrases. However, proceed with caution! Too many backlinks with the same anchor text could indicate to the search engines that you’re trying to manipulate your site’s ranking in search results.

Consider this: you ask ten separate friends, at separate times, how their day is going, and each of them responds with the same phrase:

“Great! I started my day by walking my dog, Peanut, and then had a picante beef Top Ramen for lunch.”

That’s strange, and you’d be quite suspicious of your friends. The same goes for Google. Describing the content of the target page with the anchor text helps Google understand what the page is about, but the same description over and over from multiple sources starts to look suspicious. Aim for relevance; avoid spam.

  • Tip: Use the “Anchor Text” report in Moz’s Link Explorer to see what anchor text other websites are using to link to your content.
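The over-concentration pattern described above is straightforward to check once you have an anchor text export (for example, from the report mentioned in the tip). A minimal sketch; the 30% threshold is an invented illustration, not a documented limit:

```python
from collections import Counter

def anchor_concentration(anchors, threshold=0.3):
    """Flag anchor phrases that make up more than `threshold` of all
    backlink anchors to a page -- a rough proxy for over-optimization."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = len(anchors)
    return {a: round(n / total, 2) for a, n in counts.items() if n / total > threshold}

anchors = (["best dog houses"] * 6
           + ["dogh.example", "great resource", "this guide", "homepage"])
print(anchor_concentration(anchors))  # {'best dog houses': 0.6}
```

A healthy profile would usually return an empty dict here: lots of branded, navigational, and naturally varied anchors, with no single phrase dominating.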

Links send qualified traffic to your site

Link building should never be solely about search engine rankings. Esteemed SEO and link building thought leader Eric Ward used to say that you should build your links as though Google might disappear tomorrow. In essence, you should focus on acquiring links that will bring qualified traffic to your website — another reason why it’s important to acquire links from relevant websites whose audience would find value in your site, as well.

  • Tip: Use the “Referral Traffic” report in Google Analytics to evaluate websites that are currently sending you traffic. How can you continue to build relationships with similar types of websites?

Link building don’ts & things to avoid

Spammy link profiles are just that: full of links built in unnatural, sneaky, or otherwise low-quality ways. Practices like buying links or engaging in a link exchange might seem like the easy way out, but doing so is dangerous and could put all of your hard work at risk. Google penalizes sites with spammy link profiles, so don’t give in to temptation.

A guiding principle for your link building efforts is to never try to manipulate a site’s ranking in search results. But isn’t that the entire goal of SEO? To increase a site’s ranking in search results? And herein lies the confusion. Google wants you to earn links, not build them, but the line between the two is often blurry. To avoid penalties for unnatural links (known as “link spam”), Google has made clear what should be avoided.

Purchased links

Google and Bing both seek to discount the influence of paid links in their organic search results. While a search engine can’t know which links were earned vs. paid for from viewing the link itself, there are clues it uses to detect patterns that indicate foul play. Websites caught buying or selling followed links risk penalties that will severely drop their rankings. (By the way, exchanging goods or services for a link is also a form of payment and qualifies as buying links.)

Link exchanges / reciprocal linking

If you’ve ever received a “you link to me and I’ll link to you” email from someone you have no affiliation with, you’ve been targeted for a link exchange. Google’s quality guidelines caution against “excessive” link exchange and similar partner programs conducted exclusively for the sake of cross-linking, so there is some indication that this type of exchange on a smaller scale might not trigger any link spam alarms.

It is acceptable, and even valuable, to link to people you work with, partner with, or have some other affiliation with and have them link back to you.

It’s the exchange of links at mass scale with unaffiliated sites that can warrant penalties.

Low-quality directory links

These used to be a popular source of manipulation. A large number of pay-for-placement web directories exist to serve this market and pass themselves off as legitimate, with varying degrees of success. These types of sites tend to look very similar, with large lists of websites and their descriptions (typically, the site’s critical keyword is used as the anchor text to link back to the submitter’s site).

There are many more manipulative link building tactics that search engines have identified. In most cases, they have found algorithmic methods for reducing their impact. As new spam systems emerge, engineers will continue to fight them with targeted algorithms, human reviews, and the collection of spam reports from webmasters and SEOs. By and large, it isn’t worth finding ways around them.

If your site does get a manual penalty, there are steps you can take to get it lifted.

How to build high-quality backlinks

Link building comes in many shapes and sizes, but one thing is always true: link campaigns should always match your unique goals. With that said, there are some popular methods that tend to work well for most campaigns. This is not an exhaustive list, so visit Moz’s blog posts on link building for more detail on this topic.

Find customer and partner links

If you have partners you work with regularly, or loyal customers that love your brand, there are ways to earn links from them with relative ease. You might send out partnership badges (graphic icons that signify mutual respect), or offer to write up testimonials of their products. Both of those offer things they can display on their website along with links back to you.

Publish a blog

This content and link building strategy is so popular and valuable that it’s one of the few recommended personally by the engineers at Google. Blogs have the unique ability to contribute fresh material on a consistent basis, generate conversations across the web, and earn listings and links from other blogs.

Careful, though — you should avoid low-quality guest posting just for the sake of link building. Google has advised against this and your energy is better spent elsewhere.

Create unique resources

Creating unique, high quality resources is no easy task, but it’s well worth the effort. High quality content that is promoted in the right ways can be widely shared and referenced.

Creating a resource like this is a great way to attract a lot of links with one page. You could also create a highly specific resource, without as broad an appeal, that targets a handful of websites. You might see a higher rate of success, but that approach isn’t as scalable.

Users who see this kind of unique content often want to share it with friends, and bloggers/tech-savvy webmasters who see it will often do so through links. These high quality, editorially earned votes are invaluable to building trust, authority, and rankings potential.

Build resource pages

Resource pages are a great way to build links. However, to find them you’ll want to know some advanced Google search operators to make discovering them a bit easier.

For example, if you were doing link building for a company that made pots and pans, you could search for: cooking intitle:”resources” and see which pages might be good link targets.

This can also give you great ideas for content creation — just think about which types of resources you could create that these pages would all like to reference/link to.
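A small helper can generate variations of operator queries like the cooking example above. This is just a convenience sketch; the default variant list is an invented illustration you would tune to your niche:

```python
def resource_page_queries(topic, variants=("resources", "useful links", "helpful sites")):
    """Build advanced-operator queries like `cooking intitle:"resources"`.
    Quoting keeps multi-word titles intact; intitle: restricts matches to page titles."""
    queries = []
    for v in variants:
        queries.append(f'{topic} intitle:"{v}"')
        queries.append(f'{topic} inurl:resources')
    # Deduplicate while preserving order
    return list(dict.fromkeys(queries))

for q in resource_page_queries("cooking"):
    print(q)
# cooking intitle:"resources"
# cooking inurl:resources
# cooking intitle:"useful links"
# cooking intitle:"helpful sites"
```

You would then run each query manually (or through a rank-tracking tool) and vet the resulting pages as link targets.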

Get involved in your local community

For a local business (one that meets its customers in person), community outreach can result in some of the most valuable and influential links.

  • Engage in sponsorships and scholarships.
  • Host or participate in community events, seminars, workshops, and organizations.
  • Donate to worthy local causes and join local business associations.
  • Post jobs and offer internships.
  • Promote loyalty programs.
  • Run a local competition.
  • Develop real-world relationships with related local businesses to discover how you can team up to improve the health of your local economy.

All of these smart and authentic strategies provide good local link opportunities.

Refurbish top content

You likely already know which of your site’s content earns the most traffic, converts the most customers, or retains visitors for the longest amount of time.

Take that content and refurbish it for other platforms (Slideshare, YouTube, Instagram, Quora, etc.) to expand your acquisition funnel beyond Google.

You can also dust off, update, and simply republish older content on the same platform. If you discover that a few trusted industry websites all linked to a popular resource that’s gone stale, update it and let those industry websites know — you may just earn a good link.

You can also do this with images. Reach out to websites that are using your images and not citing/linking back to you and ask if they’d mind including a link.

Be newsworthy

Earning the attention of the press, bloggers, and news media is an effective, time-honored way to earn links. Sometimes this is as simple as giving something away for free, releasing a great new product, or stating something controversial. Since so much of SEO is about creating a digital representation of your brand in the real world, to succeed in SEO, you have to be a great brand.

Be personal and genuine

The most common mistake new SEOs make when trying to build links is not taking the time to craft a custom, personal, and valuable initial outreach email. You know as well as anyone how annoying spammy emails can be, so make sure yours doesn’t make people roll their eyes.

Your goal for an initial outreach email is simply to get a response. These tips can help:

  • Make it personal by mentioning something the person is working on, where they went to school, their dog, etc.
  • Provide value. Let them know about a broken link on their website or a page that isn’t working on mobile.
  • Keep it short.
  • Ask one simple question (typically not for a link; you’ll likely want to build a rapport first).

Pro Tip:

Earning links can be very resource-intensive, so you’ll likely want to measure your success to prove the value of those efforts.

Metrics for link building should match up with the site’s overall KPIs. These might be sales, email subscriptions, page views, etc. You should also evaluate Domain and/or Page Authority scores, the ranking of desired keywords, and the amount of traffic to your content — but we’ll talk more about measuring the success of your SEO campaigns in Chapter 7.

Beyond links: How awareness, amplification, and sentiment impact authority

A lot of the methods you’d use to build links will also indirectly build your brand. In fact, you can view link building as a great way to increase awareness of your brand, the topics on which you’re an authority, and the products or services you offer.

Once your target audience knows about you and you have valuable content to share, let your audience know about it! Sharing your content on social platforms will not only make your audience aware of your content, but it can also encourage them to amplify that awareness to their own networks, thereby extending your own reach.

Are social shares the same as links? No. But shares to the right people can result in links. Social shares can also promote an increase in traffic and new visitors to your website, which can grow brand awareness, and with a growth in brand awareness can come a growth in trust and links. The connection between social signals and rankings seems indirect, but even indirect correlations can be helpful for informing strategy.

Trustworthiness goes a long way

For search engines, trust is largely determined by the quality and quantity of the links your domain has earned, but that’s not to say that there aren’t other factors at play that can influence your site’s authority. Think about all the different ways you come to trust a brand:

  • Awareness (you know they exist)
  • Helpfulness (they provide answers to your questions)
  • Integrity (they do what they say they will)
  • Quality (their product or service provides value; possibly more than others you’ve tried)
  • Continued value (they continue to provide value even after you’ve gotten what you needed)
  • Voice (they communicate in unique, memorable ways)
  • Sentiment (others have good things to say about their experience with the brand)

That last point is what we’re going to focus on here. Reviews of your brand, its products, or its services can make or break a business.

In your effort to establish authority from reviews, follow these review rules of thumb:

  • Never pay any individual or agency to create a fake positive review for your business or a fake negative review of a competitor.
  • Don’t review your own business or the businesses of your competitors. Don’t have your staff do so either.
  • Never offer incentives of any kind in exchange for reviews.
  • All reviews must be left directly by customers in their own accounts; never post reviews on behalf of a customer or employ an agency to do so.
  • Don’t set up a review station/kiosk in your place of business; many reviews stemming from the same IP can be viewed as spam.
  • Read the guidelines of each review platform where you’re hoping to earn reviews.

Be aware that review spam is a problem that’s taken on global proportions, and that violation of governmental truth-in-advertising guidelines has led to legal prosecution and heavy fines. It’s just too dangerous to be worth it. Playing by the rules and offering exceptional customer experiences is the winning combination for building both trust and authority over time.

Authority is built when brands are doing great things in the real-world, making customers happy, creating and sharing great content, and earning links from reputable sources.

In the next and final section, you’ll learn how to measure the success of all your efforts, as well as tactics for iterating and improving upon them. Onward!


Using a New Correlation Model to Predict Future Rankings with Page Authority

Posted by rjonesx.

Correlation studies have been a staple of the search engine optimization community for many years. Each time a new study is released, a chorus of naysayers seems to come magically out of the woodwork to remind us of the one thing they remember from high school statistics — that “correlation doesn’t mean causation.” They are, of course, right in their protestations and, to their credit, an unfortunate number of times it seems that those conducting the correlation studies have forgotten this simple aphorism.

A typical correlation study works like this: we collect a search result, then order the results based on different metrics, like the number of links. Finally, we compare the order of the original search results with the orders produced by the different metrics. The closer they are, the higher the correlation between the two.
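The comparison step is, in practice, a rank correlation. A minimal sketch using Spearman’s formula for rankings without ties; the SERP data here is invented for illustration:

```python
def spearman_rho(ranks_a, ranks_b):
    """Spearman rank correlation for two rankings with no ties:
    rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1))"""
    n = len(ranks_a)
    d2 = sum((a - b) ** 2 for a, b in zip(ranks_a, ranks_b))
    return 1 - (6 * d2) / (n * (n ** 2 - 1))

# SERP positions 1..5 vs. the order the same URLs would take if sorted by link count
serp_order = [1, 2, 3, 4, 5]
link_order = [1, 3, 2, 4, 5]   # positions 2 and 3 swap when sorted by links
print(round(spearman_rho(serp_order, link_order), 2))  # 0.9
```

A rho near 1.0 means the metric orders results almost exactly as the SERP does; real studies average this across thousands of keywords.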

That being said, correlation studies are not altogether fruitless simply because they don’t necessarily uncover causal relationships (ie: actual ranking factors). What correlation studies discover or confirm are correlates.

Correlates are simply measurements that share some relationship with the independent variable (in this case, the order of search results on a page). For example, we know that backlink counts are correlates of rank order. We also know that social shares are correlates of rank order.

Correlation studies also provide us with direction of the relationship. For example, ice cream sales are positive correlates with temperature and winter jackets are negative correlates with temperature — that is to say, when the temperature goes up, ice cream sales go up but winter jacket sales go down.

Finally, correlation studies can help us rule out proposed ranking factors. This is often overlooked, but it is an incredibly important part of correlation studies. Research that provides a negative result is often just as valuable as research that yields a positive result. We’ve been able to rule out many types of potential factors — like keyword density and the meta keywords tag — using correlation studies.

Unfortunately, the value of correlation studies tends to end there. In particular, we still want to know whether a correlate causes the rankings or is spurious. Spurious is just a fancy-sounding word for “false” or “fake.” A good example of a spurious relationship would be that ice cream sales cause an increase in drownings. In reality, the heat of summer increases both ice cream sales and the number of people who go for a swim, and it is the swimming that can lead to drownings. So while ice cream sales are a correlate of drownings, the relationship is *spurious* — they do not cause the drownings.

How might we go about teasing out the difference between causal and spurious relationships? One thing we know is that a cause happens before its effect, which means that a causal variable should predict a future change.

An alternative model for correlation studies

I propose an alternate methodology for conducting correlation studies. Rather than measure the correlation between a factor (like links or shares) and a SERP, we can measure the correlation between a factor and changes in the SERP over time.

The process works like this:

  1. Collect a SERP on day 1
  2. Collect the link counts for each of the URLs in that SERP
  3. Look for any URLs that are out of order with respect to links; for example, if position 2 has fewer links than position 3
  4. Record that anomaly
  5. Collect the same SERP in 14 days
  6. Record if the anomaly has been corrected (ie: position 3 now out-ranks position 2)
  7. Repeat across ten thousand keywords and test a variety of factors (backlinks, social shares, etc.)
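The core of the steps above can be sketched in a few lines of Python. This is an illustrative sketch with hypothetical URLs and link counts, not the study's actual code; a SERP is represented as a list of (url, metric_value) pairs in ranked order:

```python
# Hedged sketch of the proposed methodology (hypothetical data):
# find adjacent pairs that are out of order w.r.t. a metric on day 1,
# then check whether the pair flipped by day 14.

def find_anomalies(serp):
    """Return adjacent pairs where the lower-ranked URL has a HIGHER
    metric value than the URL directly above it."""
    anomalies = []
    for i in range(len(serp) - 1):
        (url_a, metric_a), (url_b, metric_b) = serp[i], serp[i + 1]
        if metric_b > metric_a:  # e.g. position 3 has more links than position 2
            anomalies.append((url_a, url_b))
    return anomalies

def corrected(anomaly, later_serp):
    """True if the previously lower-ranked URL now out-ranks the other."""
    url_a, url_b = anomaly
    order = [url for url, _ in later_serp]
    return (url_a in order and url_b in order
            and order.index(url_b) < order.index(url_a))

# Hypothetical day-1 SERP: position 2 has fewer links than position 3.
day1 = [("a.com", 90), ("b.com", 30), ("c.com", 50), ("d.com", 10)]
# Hypothetical day-14 SERP (metric values unused at this step).
day14 = [("a.com", 0), ("c.com", 0), ("b.com", 0), ("d.com", 0)]

flips = find_anomalies(day1)
rate = sum(corrected(f, day14) for f in flips) / len(flips)
```

In the real study, this "percent corrected" rate would be aggregated across ten thousand keywords per factor and compared against a random baseline.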

So what are the benefits of this methodology? By looking at change over time, we can see whether the ranking factor (correlate) is a leading or lagging feature. A lagging feature can automatically be ruled out as causal. A leading factor has the potential to be a causal factor.

We collect a search result. We record where the search result differs from the expected predictions of a particular variable (like links or social shares). We then collect the same search result 2 weeks later to see if the search engine has corrected the out-of-order results.

Following this methodology, we tested 3 different common correlates produced by ranking factors studies: Facebook shares, number of root linking domains, and Page Authority. The first step involved collecting 10,000 SERPs from randomly selected keywords in our Keyword Explorer corpus. We then recorded Facebook Shares, Root Linking Domains, and Page Authority for every URL. We noted every example where 2 adjacent URLs (like positions 2 and 3 or 7 and 8) were flipped with respect to the expected order predicted by the correlating factor. For example, if the #2 position had 30 shares while the #3 position had 50 shares, we noted that pair. Finally, 2 weeks later, we captured the same SERPs and identified the percent of times that Google rearranged the pair of URLs to match the expected correlation. We also randomly selected pairs of URLs to get a baseline percent likelihood that any 2 adjacent URLs would switch positions. Here were the results…

The outcome

It’s important to note that it is incredibly rare to expect a leading factor to show up strongly in an analysis like this. While the experimental method is sound, it’s not as simple as a factor predicting the future — it assumes that, in some cases, we will know about a factor before Google does. The underlying assumption is that in some cases we have seen a ranking signal (like an increase in links or social shares) before Googlebot has, and that within the 2-week period, Google will catch up and correct the incorrectly ordered results. As you might expect, this is a rare occasion. However, with a sufficient number of observations, we should be able to see a statistically significant difference between lagging and leading results. Note that the methodology only detects cases where a factor is leading and Moz Link Explorer discovered the relevant change before Google did.

Factor                              | Percent Corrected | P-Value | 95% Min  | 95% Max
Control                             | 18.93%            | 0       |          |
Facebook Shares (controlled for PA) | 18.31%            | 0.00001 | -0.6849  | -0.5551
Root Linking Domains                | 20.58%            | 0.00001 | 0.016268 | 0.016732
Page Authority                      | 20.98%            | 0.00001 | 0.026202 | 0.026398

Control:

In order to create a control, we randomly selected adjacent URL pairs from the first SERP collection and determined the likelihood that the second would outrank the first in the final SERP collection. Approximately 18.93% of the time, the worse-ranking URL overtook the better-ranking URL. By setting this control, we can determine whether any of the potential correlates are leading factors – that is to say, whether they are potential causes of improved rankings.
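One way to check whether a factor's "percent corrected" rate genuinely differs from this baseline is a two-proportion z-test. The article does not specify which significance test was used, so this is a hedged sketch; the sample size of 20,000 pairs per condition is an assumption for illustration, not a figure from the study:

```python
# Hedged sketch: is a factor's correction rate significantly different
# from the ~18.93% control baseline? Uses a two-proportion z-test.
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Assumed (hypothetical) counts: 20,000 anomalous pairs per condition.
z = two_proportion_z(int(0.2098 * 20000), 20000,   # Page Authority: 20.98%
                     int(0.1893 * 20000), 20000)   # control: 18.93%

# |z| > 1.96 -> significant at the 95% level
print(z)
```

With proportions this far apart and samples this large, the difference is comfortably significant, which is consistent with the tiny p-values reported in the table.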

Facebook Shares:

Facebook Shares performed the worst of the three tested variables. Facebook Shares actually performed worse than random (18.31% vs. 18.93%), meaning that randomly selected pairs were more likely to switch than those where the shares of the second URL were higher than those of the first. This is not altogether surprising, as the general industry consensus is that social signals are lagging factors — that is to say, traffic from higher rankings drives higher social shares, not the other way around. Consequently, we would expect to see the ranking change first, before the increase in social shares.

RLDs

Raw root linking domain counts performed substantially better than shares at ~20.6%. As I indicated before, this type of analysis is incredibly subtle because it only detects cases where a factor is leading and Moz Link Explorer discovered the relevant change before Google did. Nevertheless, this result was statistically significant, with a P-value < 0.0001 and a 95% confidence interval that RLDs will predict future ranking changes around 1.6% better than random.

Page Authority

By far, the highest-performing factor was Page Authority. At 20.98%, PA correctly predicted changes in SERPs 2.6% better than random. This is a strong indication of a leading factor, greatly outperforming social shares and beating the best predictive raw metric, root linking domains. This is not surprising: Page Authority is built to predict rankings, so we should expect it to outperform raw metrics in identifying when a shift in rankings might occur. Now, this is not to say that Google uses Moz Page Authority to rank sites, but rather that Moz Page Authority is a relatively good approximation of whatever link metrics Google is using to rank them.

Concluding thoughts

There are so many different experimental designs we can use to help improve our research industry-wide, and this is just one of the methods that can help us tease out the differences between causal ranking factors and lagging correlates. Experimental design does not need to be elaborate and the statistics to determine reliability do not need to be cutting edge. While machine learning offers much promise for improving our predictive models, simple statistics can do the trick when we’re establishing the fundamentals.

Now, get out there and do some great research!



Moz Blog


Authority Pro for WordPress: Demonstrate Your Expertise and Build Trust

Authority Pro is a fresh new design by our Lead Designer Rafal Tomal and the team at StudioPress. The big idea behind this specific design is to help you put the full extent of your expertise on display. Consistently demonstrating your likable expertise over time is what allows you to build meaningful and lasting trust
Read More…

The post Authority Pro for WordPress: Demonstrate Your Expertise and Build Trust appeared first on Copyblogger.


Copyblogger

