
The One-Hour Guide to SEO, Part 1: SEO Strategy – Whiteboard Friday

Posted by randfish

Can you learn SEO in an hour? Surprisingly, the answer is yes, at least when it comes to the fundamentals! 

With this edition of Whiteboard Friday, we’re kicking off something special: a six-part series of roughly ten-minute-long videos designed to deliver core SEO concepts efficiently and effectively. It’s our hope that this will serve as a helpful resource for a wide range of people:

  • Beginner SEOs looking to get acquainted with the field concisely & comprehensively
  • Clients, bosses, and stakeholders who would benefit from an enhanced understanding of your work
  • New team members who need quick and easy onboarding
  • Colleagues with SEO-adjacent roles, such as web developers and software engineers

Today we’ll be covering Part 1: SEO Strategy with the man who wrote the original guide on SEO, our friend Rand. Settle in, and stay tuned next Friday for our second video covering keyword research!

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to a special edition of the Whiteboard Friday series. I’m Rand Fishkin, the founder and former CEO of Moz, and I’m here with you today to deliver a one-hour guide to SEO, front to back, so that in just an hour you can learn the fundamentals of the practice and be smarter about choosing a great SEO firm to work with or hiring SEO people.

A handy SEO resource for your clients, team, and colleagues

If you are already in SEO, you might pick up some tips and tactics that you didn’t otherwise know or hadn’t previously considered. I want to ask those of you who are intermediate- and advanced-level SEOs (and I know there are many of you who have historically watched me on Whiteboard Friday; I really appreciate that) to give this video a chance even though it is at the beginner level. My hope is that it will be valuable for you to send to your clients, your potential customers, and the people who join your team, as well as to the developers, software engineers, and web devs you work with, whose help you need and who you want to understand the fundamentals of SEO.

If those are the people that you’re talking to, excellent. This series is for you. We’re going to begin with SEO strategy. That is our first part. Then we’ll get into things like keyword research and technical SEO and link building and all of that good stuff as well. 

The essentials: What is SEO, and what does it do?

So first off, SEO is search engine optimization. It is essentially the practice of influencing or being able to control some of the results that Google shows when someone types in or speaks a query to their system.

I say Google. You can influence other search engines too: Bing, DuckDuckGo, Yahoo, Seznam if you’re in the Czech Republic, or Baidu if you’re in China. But we are primarily focused on Google, because Google has more than a 90% market share in the United States and, in fact, in most of North America, South America, Europe, Asia, and the Middle East, with a few exceptions.

Start with business goals

So SEO is a tactic. It’s a way to control things. It is not a business goal. No one forms a new company or sits down with their division and says, “Okay, we need to rank for all of these keywords.” Instead what you should be saying, what hopefully is happening in your teams is, “We have these business goals.”

Example: “Grow our online soccer jersey sales to a web-savvy, custom-heavy audience.”

Let’s say we’re an online e-commerce shop and we sell customized soccer jerseys, well, football for those of you outside of the United States. So we want to grow our online soccer jersey sales. Great, that is a true business goal. We’re trying to build a bigger audience. We want to sell more of these jerseys. In order to do that, we have marketing goals that we want to achieve, things like we want to build brand awareness.

Next, marketing goals

Build brand awareness

We want more people to know who we are, to have heard of our particular brand, because people who have heard of us are going to be more likely to buy from us. The first time you hear about someone, very unlikely to buy. The seventh time you’ve heard about someone, much more likely to buy from them. So that is a good marketing goal, and SEO can help with that. We’ll talk about that in a sec.

Grow top-of-funnel traffic

You might want to grow top-of-funnel traffic. We want more people coming to the site overall so that we can do a better job of figuring out who is the right audience for us and converting some of those people, retargeting some of those people, capturing emails from some of those people, all those good things. 

Attract ready-to-buy fans

We want to attract ready-to-buy fans, people who are chomping at the bit to buy our soccer jerseys, customize them and get them shipped.

SEO, as a strategy, is essentially a set of tactics, things that you will do in the SEO world to rank for different keywords in the search engines or control and influence what already ranks in there so that you can achieve your marketing goals so that you can achieve your business goals.

Don’t get this backwards. Don’t start from a place of SEO. Especially if you are an SEO specialist or a practitioner or you’re joining a consulting firm, you should always have an excellent idea of what these are and why the SEO tactics that you are undertaking fit into them. If you don’t, you should be asking those questions before you begin any SEO work.

Otherwise you’re going to accomplish things and do things that don’t have the impact or don’t tie directly to the impact that the business owners care about, and that’s going to mean probably you won’t get picked up for another contract or you won’t accomplish the goals that mean you’re valuable to the team or you do things that people don’t necessarily need and want in the business and therefore you are seen as a less valuable part of it.

Finally, move into SEO strategy

But if you’re accomplishing things that can clearly tie to these, the opposite. People will really value what you do. 

Rank for low-demand, high-conversion keywords

So SEO can do things like rank for low-demand keywords: searches that don’t happen many times per month but are highly likely to convert, keywords like “I am looking for a customized Seattle Sounders soccer jersey in the away colors.” There’s not a lot of search demand for that exact phrase. But if you’re searching for it, you’re very likely to convert.

Earn traffic from high-demand, low-competition, less commerce-focused keywords

You could try and earn traffic from high-demand, low competition keywords that are less focused directly on e-commerce. So it could be things like “Seattle Sounders news” or “Seattle Sounders stats” or a comparison of “Portland Timbers versus Seattle Sounders.” These are two soccer or football clubs in the Pacific Northwest. 
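To make that distinction concrete, here is a minimal sketch in Python of how you might prioritize keyword targets by expected value. Every keyword, volume, conversion rate, and competition score below is invented for illustration; in practice you would pull these from your keyword research tools.

```python
# Toy keyword prioritization: expected monthly conversions per keyword,
# discounted by competition. All figures are invented for illustration.
keywords = [
    # (keyword, monthly searches, est. conversion rate, est. competition 0-1)
    ("customized seattle sounders away jersey", 90, 0.12, 0.20),
    ("seattle sounders news", 12000, 0.001, 0.40),
    ("portland timbers vs seattle sounders", 8000, 0.001, 0.30),
    ("soccer jerseys", 90000, 0.002, 0.98),
]

def score(searches, conv_rate, competition):
    """Expected conversions, discounted by how hard the keyword is to win."""
    return searches * conv_rate * (1 - competition)

for kw, searches, cr, comp in sorted(
    keywords, key=lambda k: score(*k[1:]), reverse=True
):
    print(f"{kw:45s} score={score(searches, cr, comp):6.1f}")
```

In this toy data, the tiny customized-jersey phrase outscores the 90,000-search head term once competition is factored in, which is exactly the low-demand, high-conversion trade-off described above.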

Build content that attracts links and influencer engagement

Or you might be trying to do things like building content that attracts links and influencer engagement so that in the future you can rank for more competitive keywords. We’ll talk about that in a sec. SEO can do some amazing things, but there are also things that it cannot do.

What SEO can do:

If you, as an SEO, pitch your marketing team or your business owners on things SEO can’t actually do, you’re going to be in trouble. So when we compose an SEO strategy, a set of tactics that tries to accomplish marketing goals that tie to business goals, SEO can do things like:

  • Attract searchers that are seeking your content.
  • Control how your brand is seen in search results when someone searches for your particular name. 
  • Nudge searchers toward certain queries by influencing what appears in autosuggest, related searches, and People Also Ask boxes.

Anything that shows up in the search results, nearly anything can be influenced by what we as SEOs can do.

What SEO cannot do:

Grow or create search demand on its own

But SEO cannot grow or create search demand by itself. So if someone says, “Hey, I want us to get more traffic for this specific keyword,” if you’re already ranking number one and you have some videos showing in the results and you’re also in the image results and you’ve got maybe a secondary page that links off to you from the results, you might say, “Hey, there’s just not more demand,” and SEO by itself can’t create that additional demand.

Build brand (by itself)

SEO also can’t build brand, at least not by itself. It can certainly be a helpful part of that structure. But if someone says, “Hey, I want us to be better known among this audience,” you can say, “Well, SEO can help a little, but it can’t build a brand on its own, and it certainly can’t build brand perception on its own.” People are going to go and visit your website. They’re going to have an interaction with what you’ve created on the web. That is going to be far more of a brand builder, a brand indicator, than what appears in the search results alone. So SEO can’t do that by itself.

Directly convert customers

It also can’t directly convert customers. A lot of the time what we find is that someone will do a great job of ranking, but when visitors actually reach the website, they’re unsatisfied by what they find, which, by the way, is one of the reasons why this one-hour guide includes a section on searcher satisfaction.

When Google sees over time that searchers are unsatisfied by a result, they will push that result down in the rankings and find someone who does a great job of satisfying searchers, and they will rank them instead. So the website has to do this. It is part of SEO. It’s certainly part of the equation, but SEO can’t influence it or control it on its own.

Work overnight

Finally, last but not least, SEO cannot work overnight. It just won’t happen. SEO is a long-term investment. It is very different from paid search ads (PPC, also sometimes called SEM), where you buy ads from Google Ads or Bing Ads and appear in the sponsored results. That is a tactic where you can pour money in, optimize, and get results out in 24 hours. SEO is more like a 24-month-long process.

The SEO Growth Path

I’ve tried to show that here. The fundamental concept is when you have a new website, you need to earn these things — links and engagement and historical performance in the rankings.

As you earn those things, as other people link to you from around the web, talk about you, engage with your pages and your brand, start searching for your brand specifically, click you more in the search results, and then have good experiences on your website, you will grow your historical engagement, links, and ranking factors, all the things we sort of put into the bucket of the authority and influence of a website.

3–6 months: Begin to rank for things in the long tail of search demand

As that grows, over time, perhaps three to six months in, you might first be able to rank for a few keywords in the long tail of search demand.

6–9 months: Begin to rank for more and more competitive keywords

After six to nine months, if you’re very good at this, you may be able to rank for more and more competitive keywords.

12–18 months: Compete for tougher keywords

As you truly grow a brand that is well-known and well thought of on the internet and by search engines, 12 to 18 months in, maybe longer, you may be able to compete for tougher and tougher keywords. When I started the Moz website, back in the early days of Google, it took me years, literally two or three years before I was ranking for anything in Google, anything in the search engines, and that is because I had to first earn that brand equity, that trust, that relationship with the search engines, those links and that engagement.

Today this is more true than ever because Google is so good at estimating these things. All right. I look forward to hearing all about the amazing strategies and structures that you’ve got probably in the comments down below. I’m sure it will be a great thread. We’ll move on to the second part of our one-hour guide next time — keyword research. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!



How to Use Domain Authority for SEO – Whiteboard Friday

Posted by Cyrus-Shepard

Domain Authority is an incredibly well-known metric throughout the SEO industry, but what exactly is the right way to use it? In this week’s edition of Whiteboard Friday, we’re delighted to welcome Cyrus Shepard as he explains both what’s new with the new Domain Authority 2.0 update, and how to best harness its power for your own SEO success. 

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, SEO fans. Welcome to a very special edition of Whiteboard Friday. I’m Cyrus Shepard. I’m honored to be here today with Moz to talk about the new Domain Authority. I want to talk about how to use Domain Authority to do actual SEO.

What is Domain Authority?

Let’s start with a definition of what Domain Authority actually is, because there’s a lot of confusion out there. Domain Authority is a metric, on a scale from 1 to 100, which predicts how well a domain will rank in Google. Now let’s break that down a little bit and talk about some of the myths of Domain Authority.

Is Domain Authority a ranking factor? No, Domain Authority is not a ranking factor. Does Google use Domain Authority in its algorithm? No, Google does not use Domain Authority in its algorithm. Now Google may use some domain-like metrics based on links similar to Domain Authority, but they do not use Domain Authority itself. In fact, it’s best if you don’t bring it up with them. They don’t tend to like that very much.

So if it’s not a ranking factor, if Google doesn’t use it, what does Domain Authority actually do? It does one thing very, very well. It predicts rankings. That’s what it was built to do. That’s what it was designed to do, and it does that job very, very well. And because of that, we can use it for SEO in a lot of different ways. So Domain Authority has been around since 2010, about 8 years now, and since then it’s become a very popular metric, used and abused in different ways.

What’s New With Domain Authority 2.0?

So what’s new about the new Domain Authority that makes it so great and less likely to be abused and gives it so many more uses? Before I go into this, a big shout-out to two of the guys who helped develop this — Russ Jones and Neil Martinsen-Burrell — and many other smart people at Moz. Some of our search scientists did a tremendous job of updating this metric for 2019.

1. Bigger Link Index

So the first thing is the new Domain Authority is based on a new, bigger link index, and that is Link Explorer, which was released last year. It contains 35 trillion links. There are different ways of judging index sizes, but that is one of the biggest, if not the biggest, link indexes publicly available that we know of.

Thirty-five trillion links: to give you an idea of how big that is, if you were to count one link per second, you would be counting for 1.1 million years. That’s a lot of links, and that’s how many links are in the index that the new Domain Authority is based upon.
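That 1.1-million-year figure checks out with two lines of arithmetic:

```python
# Counting 35 trillion links at one link per second:
links = 35_000_000_000_000
seconds_per_year = 60 * 60 * 24 * 365.25
print(f"{links / seconds_per_year:,.0f} years")  # ~1,109,000 years, about 1.1 million
```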

2. New Machine Learning Model

Second, the new Domain Authority uses a new machine learning model. Part of Domain Authority looks at Google rankings and uses machine learning to fit a model that predicts how those rankings stack up. The new model not only looks at what’s winning in Google search but also at what’s not ranking; the old model used to just look at the winners. This makes it much more accurate at determining where you, or any domain or URL, might fall within that prediction.

3. Spam Score Incorporation

Next the new Domain Authority incorporates spam detection.

Spam Score is a proprietary Moz metric that looks at a bunch of on-page factors, and those have been incorporated into the new metric, which makes it much more reliable. 

4. Detects Link Manipulation

Also, and this is very important, the new Domain Authority detects link manipulation: people buying and selling links, PBNs, things like that.

It’s much better at this. In fact, Russ Jones said in a recent webinar that with the new Domain Authority, link buyers will drop an average of 11 points. So the new Domain Authority is much better at rooting out this link manipulation, much more closely resembling what Google itself is attempting to do.

5. Daily Updates

Lastly, the new Domain Authority is updated daily. This is a huge improvement. The old Domain Authority used to update approximately every month.* The new Domain Authority is constantly being updated, and our search scientists are constantly adding improvements as they come along.

So it’s being updated and improved much more frequently. What does this mean? The new Domain Authority is the most accurate domain-level metric for predicting Google search results that we know of. Ranking factors that we know of, like title tags or even backlinks generally, each predict rankings to a certain degree, but Domain Authority blows them out of the water in predictive power.

*Note: Our former link research tool, Open Site Explorer, updated on a monthly cadence, resulting in monthly updates to DA scores. With the launch of Link Explorer in April 2018, Domain Authority scores moved to a daily update cadence. This remains true with the new underlying algorithm, Domain Authority 2.0.

How to Use Domain Authority for SEO

So the question is how do we actually use this? We have this tremendous power with Domain Authority that can predict rankings to a certain degree. How do we use this for SEO? So I want to go over some general tips for success. 

The first tip, never use Domain Authority in isolation. You always want to use it with other metrics and in context, because it can only tell you so much.

It’s a powerful tool, but it’s limited. For example, when you’re looking at why a page ranks, you’re also going to want to look at the keyword targeting, the on-page content, the domain history, other things like that. So never use Domain Authority by itself. That’s a key tip.

Second, you want to keep in mind that the scale of Domain Authority is roughly logarithmic.

It’s not linear. Now what does this mean? It’s fairly easy to move from a zero Domain Authority or a one Domain Authority to a ten Domain Authority. You can get a handful of links, and that works pretty well. But moving from like a 70 to an 80 is much, much harder. It gets harder as you get higher. So a DA 40 is not twice a DA 20.

It’s actually much, much bigger, because each point gets harder to earn the higher you go, all the way up to 100. Sites like Google and Facebook are near the 100 range, and everything else squeezes in below them. It’s almost like a funnel.
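The real model is proprietary, so here is a purely hypothetical illustration of what a logarithmic scale implies. Assume, only for the sake of the example, that DA grows with the log of some underlying raw authority signal:

```python
# Purely illustrative: pretend DA = 10 * log10(raw_authority), so
# raw_authority = 10 ** (DA / 10). Moz's actual model is not public.
def raw_authority(da):
    return 10 ** (da / 10)

for da in (10, 20, 40, 70, 80):
    print(f"DA {da:3d} -> relative signal {raw_authority(da):>13,.0f}")
```

In this toy model, a DA 40 carries 100 times the underlying signal of a DA 20, not twice as much, and the step from 70 to 80 costs vastly more than the step from 0 to 10. That is the funnel shape described here.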

Next, keep in mind that DA is a relative metric. When you’re using DA, you always want to compare between competitors or your past scores.

Having a DA 50 doesn’t really tell you much unless you’re comparing it to other DA scores. So if you’re looking in Google and a site has a DA of 50, it doesn’t make much sense unless you put it in the context of “what do the other sites have?” Are they 40? Are they 60? In that regard, when you’re looking at your own DA, you can compare against past performance or competitors.

So if I have a 50 this month and a 40 last month, that might tell me that my ability to rank in Google has increased in that time period. 

1. Evaluate Potential Value of a Link

So talking about SEO use cases, we have this. We understand how to use it. What are some practical ways to use Domain Authority? Well, a very popular one with the old DA as well is judging the potential value of a link.

For instance, you have 1,000 outreach targets that you’re thinking about asking for a link, but you only have time for 100 because you want to spend your time wisely and it’s not worth it to ask all 1,000. So you might use DA as a filter to find the most valuable link targets. A DA 90 might be more valuable than a DA 5 or a 10.

But again, you do not want to use it in isolation. You’d be looking at other metrics as well, such as Page Authority, relevance, and traffic. But DA can still be a valuable metric to add to that evaluation.
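As a sketch of that filtering workflow, with made-up domains and scores (real DA and Page Authority values would come from Link Explorer or the Moz API):

```python
# Hypothetical outreach list: (domain, DA, topical relevance 0-1).
targets = [
    ("bigsoccerblog.example", 72, 0.9),
    ("randomdirectory.example", 12, 0.1),
    ("localsportsnews.example", 45, 0.6),
    # ...imagine ~1,000 of these
]

# Don't use DA in isolation: require minimum relevance, then rank by DA
# and keep the top 100 targets worth your outreach time.
shortlist = sorted(
    (t for t in targets if t[2] >= 0.5),  # relevance gate first
    key=lambda t: t[1],                   # then sort by DA
    reverse=True,
)[:100]

for domain, da, relevance in shortlist:
    print(f"{domain:30s} DA={da:3d} relevance={relevance:.1f}")
```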

2. Judging Keyword Difficulty

Judging keyword difficulty means looking at a SERP and asking: what is my potential to rank in this SERP for this particular keyword?

If you look at a SERP and everybody has a DA 95, it’s going to be pretty hard to rank in that SERP. But if everybody has a lower DA, you might have a chance. But again, you’re going to want to look at other metrics, such as Page Authority, keyword volume, on-page targeting. You can use Moz’s Keyword Difficulty Score to run these calculations as well.
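A very crude version of that gut check, again with invented numbers, might look like the following; Moz’s actual Keyword Difficulty score is more sophisticated than a simple average.

```python
# DAs of the domains currently ranking for a hypothetical query.
serp_das = [95, 93, 91, 88, 90, 85, 92, 89, 94, 87]
my_da = 35

avg_da = sum(serp_das) / len(serp_das)
print(f"SERP average DA: {avg_da:.0f}, my DA: {my_da}")
if my_da < avg_da - 20:  # arbitrary gap, purely for illustration
    print("Probably too competitive for now; consider longer-tail variants.")
```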

3. Campaign Performance

Very popular in the agency world is link campaign performance or campaign performance in general, and this kind of makes sense. If you’re building links for a client and you want to show progress, a common way of doing this is showing Domain Authority, meaning that we built these links for you and now your potential to rank is higher.

It’s a good metric, but it’s not the only metric I would report. I would definitely report rankings for targeted keywords. I would report traffic and conversions, because ranking potential is one thing, but I’d actually like to show that those links actually did something. So I’d be more inclined to show the other things. But DA is perfectly fine to report for campaign performance as long as you show it in context.

4. Purchasing Existing Domains

A popular one on the marketplaces is buying existing domains. Sites like Flippa often show DA or some similar metric like that. Again, the new Domain Authority is going to be much better at rooting out link manipulation, so these scores might be a little more trustworthy in this sense. But again, never buy a domain just on Domain Authority alone.

You’re going to want to look at a lot of factors, such as the content, the traffic, the domain history, things like that. But Domain Authority might be a good first-line filter for you. 

How to Find Domain Authority Metrics

So where can you find the new Domain Authority? It is available right now. You can go to Link Explorer. It’s available through the Moz API.

There’s the free MozBar: you can download the MozBar for free and turn on the SERP overlay, and it will show you the DA of everything as you browse through Google.

It’s available in Moz Campaigns and also Keyword Explorer. I hope this gives you some ideas about how to use Domain Authority. Please share your ideas and thoughts in the comments below. If you like this video, please share.

Thanks a lot, everybody. Have a great day.

Video transcription by Speechpad.com


7 Red Flags to Watch Out For When Auditing Your Link Profile – Whiteboard Friday

Posted by KameronJenkins

From irrelevant, off-topic backlinks to cookie-cutter anchor text, there are more than a few clues hidden in your backlink profile that something spammy is going on. Alone they might not be something to worry about, but in conjunction, common red flags can spell trouble when you’re performing an audit on your backlink profile. In this week’s Whiteboard Friday, Kameron Jenkins shares her best advice from years working with clients on what to watch out for in a link profile audit.


Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hey, guys. Welcome to this week’s edition of Whiteboard Friday. My name is Kameron Jenkins, and I work here at Moz. Today we’re going to be talking about auditing your backlink profile, why you might want to do it, when you should do it, and then how to do it. So let’s just dive right in.

It might seem confusing to talk about auditing your backlink profile. When I say that, I’m specifically talking about trying to diagnose whether there’s anything funky or manipulative going on. There’s been quite a bit of debate among SEOs: in a post-Penguin 4.0 world, if Google can ignore spammy and low-quality backlinks, why would we also need to disavow, which essentially tells Google the same thing (“just ignore these links”)?

Here are three reasons why we might still want to consider this in some situations.

Why should you audit your backlink profile?

Disavow is still offered

Disavow is still an option: you could go and submit a disavow file right now if you wanted to.

You can still get manual penalties

Google still has guidelines that outline all of the link schemes and types of link manipulation. If you violate those, you could get a manual penalty. In Google Search Console, it will say something like “unnatural links to your site detected,” either total or partial. You can still get those. That’s another reason the disavow is still something you could consider doing.

Google says their stance hasn’t changed

I know there’s like a little bit of back-and-forth about this, but technically Google has said, “Our stance hasn’t changed. Still use the disavow file carefully and when it’s appropriate.” So we’ll talk about when it might be appropriate, but that’s why we consider that this is still a legitimate activity that you could do.

When should you audit your backlink profile?

Look for signs of a link scheme or link manipulation

I would say that, in today’s climate, it’s probably best to do this only when you see overt signs of a link scheme or link manipulation, something that looks very wrong or very concerning. Because Google is so much better at uncovering manipulative links and just ignoring them rather than penalizing a whole site for them, it’s not as important, I think, to be as aggressive as we used to be. But if, say, you inherit a client and you look at their link profile for the first time and notice something sketchy in there, you might want to consider doing it. You’re an SEO. You can detect the signs of a link scheme.

How do you audit your backlink profile?

Check for red flags in Moz Link Explorer

But if you’re not quite sure how to diagnose that, check for red flags in Moz Link Explorer, and that’s the second part of this. We’re going to go through some red flags that I have noticed. But a huge disclaimer: these are seven possible red flags. Please don’t just take one of these and say, “Oh, I found this,” and immediately disavow.

These are just things that I have noticed over time. I started in SEO in 2012, right around the time of Penguin, and so I did a lot of cleanup of so many spammy links. I just saw patterns, and this is the result of that. I think those patterns have stayed true over the last couple of years for links that haven’t been cleaned up. Some people are still doing these kinds of low-quality link building techniques that actually could get you penalized.

These are some things that I have noticed. They should just pique your interest. If you see something like this, if you detect one of these red flags, it should prompt you to look into it further, not immediately write off those links as “those are bad.” They’re just things to spark your interest so that you can explore further on your own. So with that big disclaimer, let’s dive into the red flags.

7 possible red flags

1. Irrelevance

Countries you don’t serve

A couple of examples of this. Maybe you’re working with a client. They are US-based, all of their locations are in the US, and their entire audience is US-based. But you get a quick glimpse of the inbound links (maybe you’re in Link Explorer’s inbound links report) and you see a bunch of domains linking to you that are .ru and .pl, and that’s kind of confusing. Why is my site getting a huge volume of links from countries we don’t serve, when we don’t have any content in Russian or Polish or anything like that? That might spark my interest to look into it further. It could be a sign of something.
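Here is a hypothetical first-pass check for that red flag, flagging linking domains on country TLDs a US-only business does not serve. The domain list is invented; you would export the real one from your inbound links report.

```python
# Flag linking domains on country TLDs that don't match your market.
linking_domains = [
    "coolblog.ru", "example-directory.pl", "soccernews.com",
    "linkfarm.ru", "jerseyfans.co.uk",
]

unexpected_tlds = {".ru", ".pl", ".cn"}  # tune this to your own market

flagged = [d for d in linking_domains
           if any(d.endswith(tld) for tld in unexpected_tlds)]
print(flagged)  # ['coolblog.ru', 'example-directory.pl', 'linkfarm.ru']
```

As the disclaimer above says, a hit here is a prompt to investigate, not proof of manipulation.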

Off-topic links

Another thing is off-topic links. My favorite example, just because it was so ridiculous, was an Atlanta DUI attorney I was working with who had a huge chunk of backlinks from low-quality party planning directories, and they didn’t make any sense. I clicked on them just to see what they were, and there really was no reason those sites should be linking to each other. It was clear he had just gone to Fiverr and said, “$5, here, build me links,” and he didn’t care where they came from. So you might notice a lot of totally off-topic, irrelevant stuff.

But obviously a disclaimer, it might look irrelevant, but then when you dive in further, they are in the same market and they kind of have a co-marketing relationship going on. Just be careful with that. But it could be a sign that there is some link manipulation going on if you have totally off-topic links in there.

2. Anchor text

The second red flag is anchor text. Again, this is another cool report in Moz Link Explorer. You can go in there and see the anchor text report. When I notice that there’s link manipulation going on, usually what I see is that there is a huge majority of their backlinks coming with the same exact anchor text, and usually it’s the exact match keyword that they want to rank for. That’s usually a huge earmark of, hey, they’ve been doing some shady linking.

The example I like to use for this and why that is concerning — and there’s no percentage that’s like, whoa, that’s manipulative. But if you see a really disproportionate percentage of links coming with the same exact anchor text, it might prompt you to look into it further. The example I like to use is, say you meet with five different friends throughout the course of your day, different occasions. They’re not all in the same room with you. You talk to each of them and they all say, “Hey, yeah, my weekend was great, but like I broke my foot.” You would be suspicious: “What, they all broke their foot? This is weird. What’s going on?”

Same thing with anchor text. If you’re earning links naturally, they’re not all going to look the same and mechanical. Something suspicious is probably going on if they’re all linking with the exact same anchor text. So that’s that.
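One way to quantify that suspicion is to measure how concentrated the anchor text is. A minimal sketch, using an invented anchor text export:

```python
from collections import Counter

# Invented anchor texts, as exported from the anchor text report.
anchors = (
    ["custom soccer jerseys"] * 820
    + ["Brand Name"] * 90
    + ["https://example.com"] * 50
    + ["click here"] * 40
)

top_anchor, top_count = Counter(anchors).most_common(1)[0]
share = top_count / len(anchors)
print(f"'{top_anchor}' accounts for {share:.0%} of all anchors")
if share > 0.5:  # no magic threshold; just a prompt to dig deeper
    print("Disproportionate exact-match anchor text: investigate further.")
```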

3. Nofollow/follow

Nofollow to follow, this is another one — please don’t use this as a sweeping rule, because I think even Russ Jones has come out and said at a mass scale that’s not a good predictor of spamminess. But what I have tended to see is usually if they also have spammy anchor text and they’re irrelevant, usually I also see that there’s a really, really disproportionate ratio of nofollow to follow. Use these red flags in conjunction with each other. When they start to pile on, it’s even more of a sign to me that there’s something fishy going on.

Nofollow to follow, you might see something ridiculous. Again, it’s something you can see in Link Explorer. Maybe like 99% of all of their backlinks are follow, which are the ones that pass PageRank. If you’re going to do a link scheme, you’re going to go out and get the ones that you think are going to pass PageRank to your site. Then one percent or no percent is nofollow. It may be something to look into.

4. Links/domains

Same thing with links to domains. Again, not an overt sign of spamminess. There’s no magic ratio here. But sometimes when I notice all of these other things, I will also notice that there’s a really disproportionate ratio of, say, they have 10,000 inbound links, but they’re coming from only 5 domains. Sometimes this happens. An example of this: I was auditing a client’s backlink profile, and they had set up five different websites, and on those websites they had put site-wide links to all of their other websites. They had created their own little network. By linking to each other, they were hoping to bolster all of their sites’ authority. Obviously, be careful with something like that. It could indicate that you’re self-creating follow links, which is a no-no.
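Both of those ratio checks take a few lines to compute from a link export. A sketch with invented numbers matching the examples above:

```python
# Two quick ratio checks on an exported link profile (numbers invented).
total_links = 10_000
follow_links = 9_900
unique_linking_domains = 5

print(f"Followed links: {follow_links / total_links:.0%}")               # 99%
print(f"Links per domain: {total_links / unique_linking_domains:,.0f}")  # 2,000

# Neither number is damning on its own; there is no magic ratio. But
# alongside spammy anchor text and irrelevant topics, they add up.
```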

5. Domain naming

“DIR” or “directory”

This one is kind of like the eyeball test, which I’ll get to later. If you go through your inbound links, you can start to notice domain names that just look weird, and they’ll look more and more off the more you do this kind of work. When I was doing a lot of backlink auditing, I noticed that many of these spammier links came from low-quality directory submission sites, and a lot of those had “directory” or “DIR” in the domain name, like bestlinkdir.co. When domains have naming conventions like that, I’ve found they tend to be low-quality directory submission sites. You can even just eyeball or scan your list for those “DIR”/directory-type links.

“Article”

Same thing with articles. Back in the day, people used to submit to article sites like EzineArticles or ArticleBase. If a domain has the word “article” in its name, it might be something to look into: maybe they were doing some low-quality article submission with backlinks to their site.

“SEO”/”links”

Then there are SEO- and link-style naming conventions. Unless you’re working on a site that is in the SEO space, it shouldn’t have a bunch of links from domains like bestSEOsite.com or bestlinksforyou.com. I’ve seen a lot of that, and it’s something to watch out for.
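Those naming conventions are easy to scan for programmatically. A rough sketch follows; the patterns are coarse and will produce false positives (for example, “redirect” contains “dir”), so treat hits as prompts for a manual look, not verdicts.

```python
import re

# Coarse patterns based on the naming conventions described above.
SUSPECT = re.compile(r"directory|dir|article|seo|link", re.IGNORECASE)

domains = ["bestlinkdir.co", "mysoccerfans.com",
           "articledump.net", "bestseosite.com"]

for domain in domains:
    hits = SUSPECT.findall(domain)
    if hits:
        print(f"{domain}: matches {hits}, worth a manual look")
```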

6. Tool metrics

These can be super helpful. If you see a really high Spam Score, it’s something to look into. Moz’s Help Hub has a list of all 27 criteria that Spam Score looks at when evaluating a site’s spamminess, which is helpful for understanding how that calculation works.

On DA and PA (Domain Authority and Page Authority): if you see links coming from low-DA or low-PA URLs, just make sure you don’t write those off right off the bat. It could just be that those domains are very new. Maybe they haven’t engaged in a lot of marketing yet. It doesn’t necessarily mean they’re spammy; it just means they haven’t done much to earn any authority. So watch out for writing off links as spammy just because they have a low DA or PA. It’s something to consider.

7. Eyeball test

Then finally we have the eyeball test. Like I said, the more you do this (and it’s not something you should need to engage in constantly nowadays), the more you’ll start to notice patterns if you’re working on clients with spammier link profiles. These low-quality sites tend to use the same template: you’ll see 100 sites that are all blue, with the exact same navigation and the exact same logo, all on the same network.

A lot of times they don’t have any contact information, because no one maintains these things. They’re just up for the purpose of links, and nobody cares about them: no phone number, no email address, nothing. Another telltale sign, which I tend to notice on these self-submission link sites, is a big PayPal button at the top that says “Pay to Submit Links” or, even worse, “Use this PayPal button to get your links removed from this site,” because they know the site is low quality and people ask them all the time. Just something to consider on the eyeball test front.

I hope this was helpful. Hopefully it helped you understand when you might want to do this, when you might not want to do this, and then if you do try to engage in some kind of link audit, some things to watch out for. So I hope that was helpful. If you have any tips for this, if you’ve noticed anything else that you think would be helpful for other SEOs to know, drop it in the comments.

That’s it for this week’s Whiteboard Friday. Come back again next week for another one.


Video transcription by Speechpad.com


What a Two-Tiered SERP Means for Content Strategy – Whiteboard Friday

Posted by willcritchlow

If you’re a big site competing to rank for popular head terms, where’s the best place to focus your content strategy? According to a hypothesis by the good folks at Distilled, the answer may lie in perfectly satisfying searcher intent.

Click on the whiteboard image above to open a high resolution version in a new tab!

If you haven’t heard the news, the Domain Authority metric discussed in this episode will be updated on March 5th, 2019 to better correlate with Google algorithm changes.


Video Transcription

Hi, Whiteboard Friday fans. I’m Will Critchlow, one of the founders at Distilled, and what I want to talk about today is joining the dots between some theoretical work that some of my colleagues have been doing and some of the client work that we’ve been doing recently and the results that we’ve been seeing from that in the wild and what I think it means for strategies for different-sized sites going on from here.

Correlations and a hypothesis

The beginning of this I credit to one of my colleagues, Tom Capper (THCapper on Twitter), who gave a presentation entitled “The Two-Tiered SERP” at our SearchLove London conference; I’m going to describe what that means in just a second. What I’m going to do today is talk about what I think the two-tiered SERP means for content strategy going forward, and base that a little bit on some of what we’re seeing in the wild with some of our client projects.

What Tom presented at SearchLove London started from the fact that the correlation between domain authority and rankings has decreased over time. He pulled some stats from February 2017, looked at those same stats 18 months later, and saw a significant drop in the correlation between domain authority and rankings. This ties into a bunch of work that he’s done and presented elsewhere around potentially less reliance on links going forward and some other data that Google might be using, some other metrics and ranking factors they might be using in their place, particularly branded metrics and so forth.

But Tom saw this drop and had a hypothesis that it wasn’t just an across-the-board drop. This wasn’t just Google not using links anymore or using links less. It was actually a more granular effect than that. This is the two-tiered SERP or what we mean by the two-tiered SERP. So a search engine result page, a SERP, you’ve got some results at the top and some results further down the page.

What Tom found (he had this hypothesis, and it was borne out in the data) was that the correlation between domain authority and rankings was much higher among positions 6 through 10 than among the top half of the search results page. This can be explained by essentially somewhat traditional ranking factors carrying more weight lower down the page and in lower-competition niches, while at the top of the page, where there is more usage data and greater search volume, traditional ranking factors play less of a part.

They maybe get you into the consideration set. There are no domains ranking up here that are very, very weak. But once you’re in the consideration set, there’s much less of a correlation between these different positions. So it’s still true on average that these positions 1 through 5 are probably more authoritative than the sites that are appearing in lower positions. But within this set there’s less predictive value.

Domain authority is less predictive of ranking within the top positions than it is within the lower positions. So this is the two-tiered SERP, and it is consistent with a bunch of data we’ve seen across the place, and in particular with the outcomes we’re seeing from content campaigns and content strategies for different kinds of sites.
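If you want to test the hypothesis on your own data, the measurement underneath is a rank correlation computed separately for each tier of the SERP. Here is a minimal sketch using scipy, with one invented SERP; in practice you would aggregate over many keywords.

```python
from scipy.stats import spearmanr

# Invented sample: (position, DA of the ranking domain) for one SERP.
serp = [(1, 62), (2, 81), (3, 55), (4, 74), (5, 58),
        (6, 70), (7, 66), (8, 61), (9, 52), (10, 47)]

for label, tier in (("positions 1-5", serp[:5]),
                    ("positions 6-10", serp[5:])):
    positions = [p for p, _ in tier]
    das = [d for _, d in tier]
    rho, _ = spearmanr(positions, das)
    print(f"{label}: Spearman rho = {rho:.2f}")
```

Because better rankings mean lower position numbers, a strong DA-to-rankings relationship shows up as rho near -1. In this invented sample the bottom tier comes out at -1.0 and the top tier at only -0.3, which is the two-tiered pattern Tom describes.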

At Distilled, we get quite a lot of clients coming to us wanting either a content strategy put together or in some cases coming to us essentially with their content strategy and saying, “Can you execute this? Can you help us execute this plan?” It’s very common for that plan to be, “We want to create a bunch of big pieces of content that get a ton of links, and we’re going to use that link authority to make our site more authoritative and that is going to result in our whole site doing better and ranking better.”

An anonymized case study

We’ve seen that this approach performs differently in different cases, and in particular it performs better on smaller sites than it does on big sites. So here is a little anonymized case study, a real example of a story that happened with one of our consulting clients. We put in place a content strategy for them that did include a plan to build domain authority, because this was a site that came to us with a domain authority significantly below that of their key competitors, though none of these sites had a ton of domain authority.

This was in a B2B space, with relatively small domains. They came to us with that, and we figured that growing the authority was a key part of this content strategy. Over the next 18 months we put out a bunch of pieces that have done really well and generated a ton of press coverage and traction. Over that time, they’ve actually outstripped their key competitors on domain authority metrics, and crucially, we saw that tie directly to increases in traffic that went hand in hand with the increase in domain authority.

But this contrasts to what we’ve seen with some much larger sites in much more competitive verticals where they’re already very, very high domain authority, maybe they’re already stronger than some of their competitors and adding to that. So adding big content pieces that get even more big authoritative links has not moved the needle in the way that it might have done a few years ago.

That’s totally consistent with this kind of setup, where if you are currently trying to edge in the bottom or you’re competing for less competitive search terms, then this kind of approach might really work for you and it might, in fact, be necessary to get into the consideration set for the more competitive end. But if you’re operating on a much bigger site, you’ve already got the competitive domain authority, you and your competitors are all very powerful sites, then our kind of hypothesis is that you’re going to be needing to look more towards the user experience, the conversion rate, and intent research.

Are you satisfying searcher intent for competitive head terms?

What is somebody who performs this search actually looking to do? Can you satisfy that intent? Can you make sure that they don’t bounce back to the search results and click on a competitor? Can you make sure that in fact they stay on your site, they get done the thing they want to get done, and it all works out for them, because we think that these kinds of things are going to be much more powerful for moving up through the very top end of the most competitive head terms.

So when we’re working on a content strategy or putting our creative team to work on these kinds of things on bigger sites, we’re more likely to be creating content directly designed to rank. We might be creating content based off a ton of this research, and we’re going to be incrementally improving those things to try and say, “Have we actually satisfied the perfect intent for this super competitive head term?”

What we’re seeing is that’s more likely to move the needle up at this top end than growing the domain authority on a big site. So I hope you found that interesting. I’m looking forward to a vigorous discussion in the comments on this one. But thank you for joining me for this week’s Whiteboard Friday. I’ve been Will Critchlow from Distilled. Take care.

Video transcription by Speechpad.com




4 Ways to Improve Your Data Hygiene – Whiteboard Friday

Posted by DiTomaso

We base so much of our livelihood on good data, but managing that data properly is a task in and of itself. In this week’s Whiteboard Friday, Dana DiTomaso shares why you need to keep your data clean and some of the top things to watch out for.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi. My name is Dana DiTomaso. I am President and partner at Kick Point. We’re a digital marketing agency, based in the frozen north of Edmonton, Alberta. So today I’m going to be talking to you about data hygiene.

What I mean by that is the stuff we see every single time we start working with a new client: this stuff is always messed up. Sometimes it’s one of these four things, sometimes it’s all four, and sometimes there are extra things. So I’m going to cover this stuff today in the hope that the next time we get a profile from someone, it’s not quite as bad, or that, if you look at these things and see how bad your own setup is, you’ll sit down and start cleaning it up.

1. Filters

So what we’re going to start with first are filters. By filters, I’m talking about analytics here, specifically Google Analytics. When you go into the admin of Google Analytics, there’s a section called Filters. There’s a section on the left, which holds all the filters for everything in that account, and then there’s a filters section for each view. Filters help you exclude or include specific traffic based on a set of parameters.

Filter out office, home office, and agency traffic

So usually what we’ll find is one Analytics property for your website with one view (“All Website Data,” the default view that Analytics gives you) but no filters, which means that you’re not excluding things like office traffic: your internal people visiting the website from the office or from home offices. If you have a bunch of people who work from home, get their IP addresses and exclude them, because you don’t necessarily want your internal traffic mucking up things like conversions, especially if you’re doing stuff like checking your own forms.

You haven’t had a lead in a while and maybe you fill out the form to make sure it’s working. You don’t want that coming in as a conversion and then screwing up your data, especially if you’re a low-volume website. If you have a million hits a day, then maybe this isn’t a problem for you. But if you’re like the rest of us and don’t necessarily have that much traffic, something like this can be a big problem in terms of the volume of traffic you see. Then agency traffic as well.

So agencies, please make sure that you’re filtering out your own traffic. Again things like your web developer, some contractor you worked with briefly, really make sure you’re filtering out all that stuff because you don’t want that polluting your main profile.
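Google Analytics applies these filters for you once they’re configured in the admin, but conceptually an IP-exclude filter is doing something like the following. The IPs come from documentation-reserved ranges and the hits are invented.

```python
# Conceptual version of an IP-exclude filter: drop internal hits
# before analysis. All IPs below are reserved documentation addresses.
INTERNAL_IPS = {"203.0.113.7", "203.0.113.8"}  # office + home office
AGENCY_IPS = {"198.51.100.21"}                 # your agency / contractors

hits = [
    {"ip": "203.0.113.7", "page": "/contact"},  # you, testing the form
    {"ip": "192.0.2.55", "page": "/jerseys"},   # a real visitor
    {"ip": "198.51.100.21", "page": "/"},       # the agency
]

excluded = INTERNAL_IPS | AGENCY_IPS
clean = [h for h in hits if h["ip"] not in excluded]
print(clean)  # only the real visitor's hit remains
```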

Create a test and staging view

The other thing that I recommend is creating what we call a test and staging view. Usually in our Analytics profiles, we’ll have three different views. One we call master, and that’s the view that has all these filters applied to it.

So you’re only seeing the traffic that isn’t you. It’s the customers, people visiting your website, the real people, not your office people. Then the second view we call test and staging. So this is just your staging server, which is really nice. For example, if you have a different URL for your staging server, which you should, then you can just include that traffic. Then if you’re making enhancements to the site or you upgraded your WordPress instance and you want to make sure that your goals are still firing correctly, you can do all that and see that it’s working in the test and staging view without polluting your main view.

Test on a second property

That’s really helpful. Then the third thing is to test on a second property. This is easy to do with Google Tag Manager. In most of our Google Tag Manager accounts, we’ll have our usual analytics setup, and most of the stuff goes there. But if we’re testing something new, like the content consumption metric we started putting out this summer, we want to make sure we set up a second Analytics property (a property, not just a view) and send the new stuff we’re trying to it.

So you have two different Analytics properties. One is your main property. This is where all the regular stuff goes. Then you have a second property, which is where you test things out, and this is really helpful to make sure that you’re not going to screw something up accidentally when you’re trying out some crazy new thing like content consumption, which can totally happen and has definitely happened as we were testing the product. You don’t want to pollute your main data with something different that you’re trying out.

So send something to a second property. You do this for websites. You always have a staging and a live. So why wouldn’t you do this for your analytics, where you have a staging and a live? So definitely consider setting up a second property.

2. Time zones

The next thing that we have a lot of problems with are time zones. Here’s what happens.

Let’s say your website is a basic install of WordPress and you didn’t change the time zone, so it’s set to UTC, which is the WordPress default unless you change it. So now your website’s data says it’s on UTC. Then let’s say your marketing team is on the East Coast, so they’ve got all of their tools set to Eastern time. Then your sales team is on the West Coast, so all of their tools are set to Pacific time.

So you can end up with a situation where, let’s say, you’ve got a website using a form plugin for WordPress. When someone submits a form, it’s recorded on your website, but that data also gets pushed over to your sales CRM. So now your website says this number of leads came in on this day, because it’s on UTC time, where the day has already ended or hasn’t started yet, while your Eastern-time analytics tools record those leads on a different day.

But then the third wrinkle is then you have Salesforce or HubSpot or whatever your CRM is now recording Pacific time. So that means that you’ve got this huge gap of who knows when this stuff happened, and your data will never line up. This is incredibly frustrating, especially if you’re trying to diagnose why, for example, I’m submitting a form, but I’m not seeing the lead, or if you’ve got other data hygiene issues, you can’t match up the data and that’s because you have different time zones.

So definitely check the time zones of every product you use: website, CRM, analytics, ads, all of it. If it has a time zone, pick one and stick with it. That’s your canonical time zone. It will save you so many headaches down the road, trust me.
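The normalization step is concrete in code. A small sketch (Python 3.9+ for zoneinfo) showing how the same hypothetical form fill lands on different calendar days in different tools:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

# One invented lead, recorded by WordPress in UTC.
form_fill = datetime(2019, 3, 1, 2, 30, tzinfo=ZoneInfo("UTC"))

# Normalize everything to one canonical zone before comparing tools.
CANONICAL = ZoneInfo("America/New_York")
print(form_fill.astimezone(CANONICAL))  # 2019-02-28 21:30:00-05:00

# Note the date changed: a "March 1" lead on the website is a
# "February 28" lead in an Eastern-time CRM report.
```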

3. Attribution

The next thing is attribution. Attribution is a whole other lecture in and of itself, beyond what I’m talking about here today.

Different tools have different ways of showing attribution

But what I find frustrating about attribution is that every tool has its own little special way of doing it. Analytics is like the last non-direct click. That’s great. Ads says, well, maybe we’ll attribute it, maybe we won’t. If you went to the site a week ago, maybe we’ll call it a view-through conversion. Who knows what they’re going to call it? Then Facebook has a completely different attribution window.

You can use a tool, such as Supermetrics, to change the attribution window. But if you don’t understand what the default attribution window is in the first place, you’re just going to make things harder for yourself. Then there’s HubSpot, which says the very first touch is what matters, and so, of course, HubSpot will never agree with Analytics and so on. Every tool has its own little special sauce and how they do attribution. So pick a source of truth.
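To see why the tools disagree, here is a toy model of two of those rules applied to one invented customer journey. Real attribution models add lookback windows and other wrinkles, so this is only a sketch of the underlying logic.

```python
# One invented customer journey, oldest touch first.
touchpoints = ["facebook", "email", "direct", "direct"]

def last_non_direct(touches):
    """Roughly the Analytics default: credit the last non-direct touch."""
    for touch in reversed(touches):
        if touch != "direct":
            return touch
    return "direct"  # the journey was all direct visits

def first_touch(touches):
    """Roughly the HubSpot default: credit the very first touch."""
    return touches[0]

print(last_non_direct(touchpoints))  # email
print(first_touch(touchpoints))      # facebook
```

Same journey, two different “correct” answers, which is why those reports will never reconcile unless you pick one source of truth.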

Pick your source of truth

The best thing to do is just say, “You know what? I trust this tool the most,” and then that is your source of truth. Do not try to get this source of truth to match up with that source of truth. You will go insane. You do have to make sure that things like your time zones are consistent, so that’s all set.

Be honest about limitations

But then after that, really it’s just making sure that you’re being honest about your limitations.

Know where things are going to fall down, and that’s okay, but at least you’ve got this source of truth that you can trust. That’s the most important thing with attribution. Make sure to spend the time and read how each tool handles attribution, so when someone comes to you and says, “Well, I see that we got 300 visits from this ad campaign, but in Facebook it says we got 6,000. Why is that?” you have an answer.

That might be a bit of an extreme example, but I’ve seen weirder things with Facebook attribution versus Analytics attribution, and the same goes for tools like Mixpanel and Kissmetrics. Every tool has its own special way of recording attribution, and it’s never the same as anyone else’s. We don’t have an industry standard for how this stuff works, so make sure you understand these pieces.

4. Interactions

Then the last thing is what I call interactions. The biggest place I find people go wrong here is Google Tag Manager: it gives you a lot of rope, which you can hang yourself with if you’re not careful.

GTM interactive hits

One of the biggest things is what we call an interactive hit versus a non-interactive hit. So let’s say in Google Tag Manager you have a scroll depth trigger.

You want to see how far down the page people scroll. At 25%, 50%, 75%, and 100%, it will send off an event saying how far down the page they scrolled. The thing is, you can also make that hit interactive. So if somebody scrolls down the page 25%, you can say that’s an interactive hit, which means that person no longer counts as a bounce, because an interaction has been recorded, which for your setup might be great.

Gaming bounce rate

But what I’ve seen are unscrupulous agencies who come in and say if the person scrolls 2% of the way down the page, now that’s an interactive hit. Suddenly the client’s bounce rate goes down from say 80% to 3%, and they think, “Wow, this agency is amazing.” They’re not amazing. They’re lying. This is where Google Tag Manager can really manipulate your bounce rate. So be careful when you’re using interactive hits.

Absolutely, maybe it’s totally fair that if someone is reading your content, they might read that one page, hit the back button, and leave. It can be fair to count something like scroll depth, or a certain piece of content entering the user’s viewport, as interactive. But that doesn’t mean everything should be interactive. So dial back the interactions you’re using, or at least make smart decisions about the ones you choose, because bounce rate can be gamed this way.

Goal setup

Then goal setup as well, that’s a big problem. A lot of people have destination goals set up in Analytics by default because they don’t know how to set up event-based goals. By destination goal, I mean you filled out the form, you got to a thank-you page, and you’re recording views of that thank-you page as goals. Yes, that’s one way to do it.

But the problem is that a lot of people, who aren’t super great at interneting, will bookmark that page or keep coming back to it again and again, because maybe you put some really useful information on your thank-you page, which is what you should do. Except that means people keep visiting it again and again without actually filling out the form. So now your conversion rate is all messed up, because you’re basing it on the destination, not on the actual action of the form being submitted.

So be careful on how you set up goals, because that can also really game the way you’re looking at your data.

Ad blockers

Ad blockers could account for anywhere from 2% to 10% of your audience, depending upon how technically sophisticated your visitors are. So you’ll end up in situations where you have a form fill but no corresponding visit to match it with.

It just goes into an attribution black hole. They did fill out the form, so at least you got their data, but you have no idea where they came from. Again, that’s going to be okay. Definitely think about the percentage of your visitors, based on your audience, who probably have an ad blocker installed, and make sure you’re comfortable with that level of error in your data. That’s just the internet, and ad blockers are getting more and more popular.

Apple, too, is changing the way it does tracking. So definitely make sure that you understand these pieces and keep them in mind when you’re looking at your data. Again, these numbers may never match up 100%. That’s okay. You can’t measure everything. Sorry.

Bonus: Audit!

Then the last thing I really want you to think about — this is the bonus tip — audit regularly.

So at least once a year, go through all the different things I’ve covered in this video and make sure nothing has changed or been updated, and that you don’t have some secret, exciting new tracking code that somebody added and then forgot about, because you were trying out a trial of some product, tossed the code on, and it’s been running for a year even though the trial expired nine months ago. So definitely make sure that you’re running only the things you should be running, and do an audit at least on a yearly basis.

If you’re busy and you have a lot of different visitors to your website, it’s a pretty high-volume property, maybe monthly or quarterly would be a better interval, but at least once a year go through and make sure that everything that’s there is supposed to be there, because that will save you headaches when you look at trying to compare year-over-year and realize that something horrible has been going on for the last nine months and all of your data is trash. We really don’t want to have that happen.

So I hope these tips are helpful. Get to know your data a little bit better. It will like you for it. Thanks.

Video transcription by Speechpad.com


Using STAT for Content Strategy – Whiteboard Friday

Posted by DiTomaso

Search results are sophisticated enough to show searchers not only the content they want, but in the format they want it. Being able to identify searcher intent and interest based on ranking results can be a powerful driver of content strategy. In this week’s Whiteboard Friday, we warmly welcome Dana DiTomaso as she describes her preferred tools and methods for developing a modern and effective content strategy.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi, everyone. Welcome to Whiteboard Friday. My name is Dana DiTomaso. I’m President and partner of Kick Point, which is a digital marketing agency based way up in Edmonton, Alberta. Come visit sometime.

What I’m going to be talking about today is using STAT for content strategy. STAT, if you’re not familiar with STAT Search Analytics, which is in my opinion the best ranking tool on the market and Moz is not paying me to say that, although they did pay for STAT, so now STAT is part of the Moz family of products. I really like STAT. I’ve been using it for quite some time. They are also Canadian. That may or may not influence my decision.

But one of the things that STAT does really well is that it doesn’t just show you where you’re ranking; it breaks down what types of rankings you have and where you should be focusing your attention. Typically I find, especially if you’ve been working in this field for a long time, that when you think about rankings you still have in your mind the 10 blue links we used to have forever ago, and that’s long gone. One of the things that’s useful about STAT rankings is you can figure out what you should be pursuing other than, say, the written word, and I think that’s really important for marketers, because a lot of us really enjoy reading stuff.

Consider all the ways searchers like to consume content

Maybe you’re watching this video. Maybe you’re reading the transcript. You might refer to the transcript later. A lot of us are readers. Not a lot of us are necessarily visual people, so sometimes we can forget stuff like video is really popular, or people really do prefer those places packs or whatever it might be. Thinking outside of yourself and thinking about how Google has decided to set up the search results can help you drive better content to your clients’ and your own websites.

The biggest thing that I find that comes of this is you’re really thinking about your audience a lot more because you do have to trust that Google maybe knows what it’s doing when it presents certain types of results to people. It knows the intent of the keyword, and therefore it’s presenting results that make sense for that intent. We can argue all day about whether or not answer boxes are awesome or terrible.

But from a visitor’s perspective and a searcher’s perspective, they like them. I think we need to just make sure that we’re understanding where they might be showing up, and if we’re playing by Google rules, people also ask is not necessarily going anywhere.

All that being said, how can we use ranking results to figure out our content strategy? The first thing about STAT, if you haven’t used STAT before, again check it out, it’s awesome.

Grouping keywords with Data Views

But one of the things that’s really nice is you can do this thing called data views. In data views, you can group together parts of keywords. So you can do something called smart tags and say, “I want to tag everything that has a specific location name together.”

Opportunities — where are you not showing up?

Let’s say, for example, that you’re working with a moving company and they are across Canada. So what I want to see here for opportunities are things like where I’m not ranking, where are there places box showing up that I am not in, or where are the people also ask showing up that I am not involved in. This is a nice way to keep an eye on your competitors.

Locations

Then we’ll also do locations. So we’ll say everything in Vancouver, group this together. Everything in Winnipeg, group this together. Everything in Edmonton and Calgary and Toronto, group all that stuff together.

Attributes (best, good, top, free, etc.)

Then the third thing can be attributes. This is stuff like best, good, top, free, cheap, all those different words that people use to describe your product, because those are definitely intent keywords, and often they will drive very different types of results than what you might consider your head phrases.

So, for example, looking at “movers in Calgary” will drive a very different result than “top movers in Calgary.” In that case, you might get, say, a Yelp top 10 list. Or if you’re looking for “cheapest mover in Calgary,” again, a different type of search result. So grouping your keywords by attribute can also help you determine how those types of keywords are influenced by the type of search results that Google is putting out there.

Products / services

Then the last thing is products/services. So we’ll take each product and service and group it together. One of the nice things about STAT is you can do something called smart tags. So we can, say, figure out every keyword that has the word “best” in it and put it together. Then if we ever add more keywords later that also have the word “best,” they automatically go into that keyword group. It’s really useful, especially if you are adding lots of keywords over time. I recommend starting by setting up some views that make sense.

You can just import everything your client is ranking for and look at the view of all those keywords at once. But the problem is that there’s so much data in that big set of keywords that a lot of the useful stuff gets lost in the noise. By segmenting it down to a really small level, you can start to understand the search landscape for that specific type of term and how you fit in versus your competition.
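If you want to prototype this kind of segmentation outside of STAT (this is not STAT’s API, just a hypothetical local sketch of the same smart-tag grouping logic), a few regexes over a keyword list go a long way:

```python
import re
from collections import defaultdict

# Hypothetical tag lists; swap in your own locations and attribute words.
LOCATIONS = ["vancouver", "winnipeg", "edmonton", "calgary", "toronto"]
ATTRIBUTES = ["best", "good", "top", "free", "cheap", "cheapest"]

def tag_keywords(keywords):
    """Group keywords into location and attribute segments, like smart tags."""
    groups = defaultdict(list)
    for kw in keywords:
        for loc in LOCATIONS:
            if re.search(rf"\b{loc}\b", kw, re.I):
                groups[f"location:{loc}"].append(kw)
        for attr in ATTRIBUTES:
            if re.search(rf"\b{attr}\b", kw, re.I):
                groups[f"attribute:{attr}"].append(kw)
    return dict(groups)

print(tag_keywords(["movers in calgary", "top movers in calgary",
                    "cheapest mover in calgary", "movers winnipeg"]))
# {'location:calgary': ['movers in calgary', ...], 'attribute:top': [...], ...}
```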

A deep dive into SERP features

So put that stuff into STAT, give it a little while, let it collect some data, and then you get into the good stuff, which is the SERP features. I’m covering just a tiny little bit of what STAT does. Again, they didn’t pay me for this. But there’s lots of other stuff that goes on in here. My personal favorite part is the SERP features.

Which features are increasing/decreasing both overall and for you?

So what I like here is that in SERP features it will tell you which features are increasing and decreasing overall and then what features are increasing and decreasing for you.

This is actually from a real set for one of our clients. For them, what they’re seeing are big increases in places version 3, which is the three pack of places. Twitter box is increasing. I did not see that coming. Then AMP is increasing. So that says to me, okay, so I need to make sure that I’m thinking about places, and maybe this is a client who doesn’t necessarily have a lot of local offices.

Maybe it’s not someone you would think of as a local client. So why are there a lot more local properties popping up? Then you can dive in and say, “Okay, only show me the keywords that have places boxes.” Then you can look at that and decide: Is it something where we haven’t thought about local SEO before, but it’s something where searchers are thinking about local SEO? So Google is giving them three pack local boxes, and maybe we should start thinking about can we rank in that box, or is that something we care about.

Again, not necessarily content strategy, but certainly your SEO strategy. The next thing is Twitter box, and this is something where you think Twitter is dead. No one is using Twitter. It’s full of terrible people, and they tweet about politics all day. I never want to use it again, except maybe Google really wants to show more Twitter boxes. So again, looking at it and saying, “Is Twitter something where we need to start thinking about it from a content perspective? Do we need to start focusing our energies on Twitter?”

Maybe you abandoned it and now it’s back. You have to start thinking, “Does this matter for the keywords?” Then AMP. AMP is obviously really tricky. There have been studies that said, “I implemented AMP, and I lost 70% of my traffic and everything was terrible.” But if that’s the case, why would we be seeing more AMP show up in search results if it isn’t actually something people find useful, particularly in mobile search?

Desktop vs mobile

One of the things that I didn’t mention in the tagging is to definitely look at desktop versus mobile, because you are going to see really different feature sets between desktop and mobile for these different types of keywords. Mobile may carry a completely different intent for a type of search. If you’re a restaurant, for example, people looking for reservations on a desktop might have a different intent from “I want a restaurant right now” on mobile, where maybe you’re standing next to it or you’re lost.

What kind of intent is behind the search results?

You really have to think about what that intent means for the type of search results that Google is going to present. So for AMP, you have to look at it and say, “Well, is this newsworthy? Why is more AMP being shown?” Should we consider moving our news or blog or whatever you happen to call it into AMP so that we can start to show up for these search results in mobile? Is that a thing that Google is presenting now?

We can get mad about AMP all day, but how about we actually be there instead? I don’t want the comment section to turn into a whole AMP discussion, but I know there are obviously problems with AMP. Still, if it’s being shown in the search results that searchers who should be finding you are seeing, and you’re not there, that’s definitely something you need to think about for your content strategy: “Is AMP something that we need to pursue? Do we need more newsy content versus evergreen content?”

Build your content strategy around what searchers are looking for

Maybe your content strategy is really focused on posts that could be relevant for years, when in reality your searchers are looking for stuff that’s relevant for them right now. So for example, things with movers, there’s some sort of mover scandal. There’s always a mover who ended up taking someone’s stuff and locking it up forever, and they never gave it back to them. There’s always a story like that in the news.

Maybe that’s why it’s AMP. Definitely investigate before you start to say, “AMP everything.” Maybe it was just like a really bad day for movers, for example. Then you can see the decreases. So the decrease here is organic, which is that traditional 10 blue links. So obviously this new stuff that’s coming in, like AMP, like Twitter, like places is displacing a lot of the organic results that used to be there before.

So instead you think, well, I can do organic all day, but if the results just aren’t there, then I could be limiting the amount of traffic I could be getting to my website. Video, for example: it was really interesting for this particular client that video is a decreasing SERP feature for them, because video is actually a big part of their content strategy. So if we see that video results are decreasing, we can take a step back and ask, “Is it decreasing for the keywords we care about? Why is it decreasing? Do we think this is a test or a longer-term trend?”

Historical data

What’s nice about STAT is you can say “I want to see results for the last 7 days, 30 days, or 60 days.” Once you get a year of data in there, you can look at the whole year and look at that trend and see is it something where we have to maybe rethink our video strategy? Maybe people don’t like video for these phrases. Again, you could say, “But people do like video for these phrases.” But Google, again, has access to more data than you do.

If Google has decided that for these search phrases video is not a thing it wants to show anymore, then maybe people don’t care about video the way you thought they did. Sorry. So that could be something where you’re thinking, well, maybe we need to change the type of content we create. Then the last one that showed up for this particular client is the carousel, the kind where they show lots of different results.

I’m glad that’s dropping because that actually kind of sucks. It’s really hard to show up well there. So I think that’s something to think about in the carousel as well. Maybe we’re pleased that that’s going away and then we don’t have to fight it as much anymore. Then what you can see in the bottom half are what we call share of voice.

Share of voice

Share of voice is calculated based on your ranking and all of your competitors’ ranking and the number of clicks that you’re expected to get based on your ranking position.

The number 1 position obviously gets more clicks than the number 100 position. So share of voice is a percentage calculated from how many of these SERP features you own versus your competitors, as well as your position within those SERP features. What I’m looking at here is share of voice across organic, places, answers, and people also ask, for example.
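As a rough illustration of the idea (the click-through-rate curve below is a made-up assumption, not STAT’s actual model), a CTR-weighted share of voice can be sketched like this:

```python
# Assumed CTR-by-position curve; real models are fitted from click data.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def share_of_voice(rankings):
    """rankings: (domain, position) pairs for one SERP feature's results."""
    weights = {}
    for domain, pos in rankings:
        # Positions past the curve get a tiny residual weight.
        weights[domain] = weights.get(domain, 0) + CTR_BY_POSITION.get(pos, 0.01)
    total = sum(weights.values())
    return {d: round(100 * w / total, 1) for d, w in weights.items()}

serp = [("rival-a.com", 1), ("you.com", 2), ("rival-b.com", 3)]
print(share_of_voice(serp))
# {'rival-a.com': 54.5, 'you.com': 27.3, 'rival-b.com': 18.2}
```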

So what STAT will show you is the percentage of organic, and it’s still, for this client — and obviously this is not an accurate chart, but this is vaguely accurate to what I saw in STAT — organic is still a big, beefy part of this client’s search results. So let’s not panic that it’s decreasing. This is really where this context can come in. But then you can think, all right, so we know that we are doing “eeh” on organic.

Is it something where we think that we can gain more? So the green shows you your percentage that you own of this, and then the black is everyone else. Thinking realistically, you obviously cannot own 100% of all the search results all the time because Google wouldn’t allow that. So instead thinking, what’s a realistic thing? Are we topping out at the point now where we’re going to have diminishing returns if we keep pushing on this?

Identify whether your content efforts support what you’re seeing in STAT

Are we happy with how we’re doing here? Maybe we need to turn our attention to something else, like answers, for example. This particular client does really well on places. They own a lot of it. So for places, it’s maintain, watch, and don’t worry about it that much anymore. That can then drop off when we’re thinking about content. We don’t necessarily need to keep writing blog posts for things that are going to help us rank in the places pack, because it’s not going to influence that ranking any further.

We’re already doing really well. But instead we can look at answers and people also ask, which for this particular client they’re not doing that well. It is something that’s there, and it is something that it may not be one of the top increases, but it’s certainly an increase for this particular client. So what we’re looking at is saying, “Well, you have all these great blog posts, but they’re not really written with people also ask or answers in mind. So how about we go back and rewrite the stuff so that we can get more of these answer boxes?”

That can be the foundation of that content strategy. When you put your keywords into STAT and look at your specific keyword set, really look at the SERP features and determine what they mean for you and the type of content you need to create, whether that’s more images, for example. With some clients, when you’re looking at e-commerce sites, some of the results are really image heavy, or they can be product shopping results or whatever it might be.

There are really specific different features, and I’ve only shown a tiny subset. STAT captures all of the different types of SERP features. So you can definitely look at anything if it’s specific to your industry. If it’s a feature, they’ve got it in here. So definitely take a look and see where are these opportunities. Remember, you can’t have a 100% share of voice because other people are just going to show up there.

You just want to make sure that you’re better than everybody else. Thanks.

Video transcription by Speechpad.com


Redirects: One Way to Make or Break Your Site Migration – Whiteboard Friday

Posted by KameronJenkins

Correctly redirecting your URLs is one of the most important things you can do to make a site migration go smoothly, but there are clear processes to follow if you want to get it right. In this week’s Whiteboard Friday, Kameron Jenkins breaks down the rules of redirection for site migrations to make sure your URLs are set up for success.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, guys. Welcome to this week’s edition of Whiteboard Friday. My name is Kameron Jenkins, and I work here at Moz. What we’re going to be talking about today is redirects and how they’re one way that you can make or break your site migration. Site migration can mean a lot of different things depending on your context.

Migrations?

I wanted to go over quickly what I mean before we dive into some tips for avoiding redirection errors. When I talk about migration, I’m coming from the experience of these primary activities.

CMS moving/URL format

One example of a migration I might be referring to is when we take on a client and they previously used a CMS with a default kind of URL formatting that was date-based.

So it was like /2018/May/ and then the post. Then we’re changing the CMS. We have more flexibility with how our pages, our URLs are structured, so we’re going to move it to just /post or something like that. In that way a lot of URLs are going to be moving around because we’re changing the way that those URLs are structured.

“Keywordy” naming conventions

Another instance is that sometimes we’ll get clients that come to us with kind of dated or keywordy URLs, and we want to change this to be a lot cleaner, shorten them where possible, just make them more human-readable.

An example of that would be maybe the client used URLs like /best-plumber-dallas, and we want to change it to something a little bit cleaner, more natural, and not as keywordy, to just /plumbers or something like that. So that can be another example of lots of URLs moving around if we’re taking over a whole site and we’re kind of wanting to do away with those.

Content overhaul

Another example is if we’re doing a complete content overhaul. Maybe the client comes to us and they say, “Hey, we’ve been writing content and blogging for a really long time, and we’re just not seeing the traffic and the rankings that we want. Can you do a thorough audit of all of our content?” Usually what we notice is that you have maybe even thousands of pages, but four of them are ranking.

So there are a lot of just redundant pages, pages that are thin and would be stronger together, some pages that just don’t really serve a purpose and we want to just let die. So that’s another example where we would be merging URLs, moving pages around, just letting some drop completely. That’s another example of migrating things around that I’m referring to.

Don’t we know this stuff? Yes, but…

That’s what I’m referring to when it comes to migrations. But before we dive in, I kind of wanted to address the fact that like don’t we know this stuff already? I mean I’m talking to SEOs, and we all know or should know the importance of redirection. If there’s not a redirect, there’s no path to follow to tell Google where you’ve moved your page to.

It’s frustrating for users if they click on a link that no longer works, that doesn’t take them to the proper destination. We know it’s important, and we know what it does. It passes link equity. It makes sure people aren’t frustrated. It helps to get the correct page indexed, all of those things. So we know this stuff. But if you’re like me, you’ve also been in those situations where you have to spend entire days fixing 404s to correct traffic loss or whatever after a migration, or you’re fixing 301s that were maybe done but they were sent to all kinds of weird, funky places.

Mistakes still happen even though we know the importance of redirects. So I want to talk about why really quickly.

Unclear ownership

Unclear ownership is something that can happen, especially if you’re on a scrappier team, a smaller team and maybe you don’t handle these things very often enough to have a defined process for this. I’ve been in situations where I assumed the tech was going to do it, and the tech assumed that the project assistant was going to do it.

We’re all kind of pointing fingers at each other with no clear ownership, and then the ball gets dropped because no one really knows whose responsibility it is. So just make sure that you designate someone to do it and that they know and you know that that person is going to be handling it.

Deadlines

Another thing is deadlines. Internal and external deadlines can affect this. So one example that I encountered pretty often is the client would say, “Hey, we really need this project done by next Monday because we’re launching another initiative. We’re doing a TV commercial, and our domain is going to be listed on the TV commercial. So I’d really like this stuff wrapped up when those commercials go live.”

So those kinds of external deadlines can affect how quickly we have to work. A lot of times redirects just get left by the wayside because they’re not a very visible thing. If you don’t know the importance of redirects, you might handle the visible things, like the content, making sure the buttons all work, and the template looking nice. People assume that redirects are just a backend thing that can be taken care of later. Unfortunately, redirects usually fall into that category if the person doing them doesn’t really know their importance.

Another thing with deadlines is internal deadlines. Sometimes you might have a deadline tied to a quarterly or monthly goal: we have to have all of our projects done by this date. The same thing happens there. The redirects, unfortunately, tend to be the thing that misses the cutoff.

Non-SEOs handling the redirection

Then another situation that can cause site migration errors and 404s after moving things around is non-SEOs handling the redirection. Now, you usually don’t have to be a really experienced SEO to handle these types of things. It depends on your CMS and how complicated your redirect implementation is. But if your CMS makes redirection easy, it can be treated as a data entry-type job and delegated to someone who maybe doesn’t know the importance of doing all of them, formatting them properly, or pointing them to the places they’re supposed to go.

The rules of redirection for site migrations

Those are all situations that I’ve encountered issues with. So now that we kind of know what I’m talking about with migrations and why they kind of sometimes still happen, I’m going to launch into some rules that will hopefully help prevent site migration errors because of failed redirects.

1. Create one-to-one redirects

Number one, always create one-to-one redirects. This is super important. What I’ve seen sometimes is, “Oh man, it could save me tons of time if I just used a wildcard and redirected all of these pages to the homepage or to the blog homepage or something like that.” But that tells Google that Page A has moved to Page B, when that’s not the case. You’re not moving all of these pages to the homepage; they haven’t actually moved there. So it’s an irrelevant redirect, and Google has even said, I think, that they treat those essentially as a soft 404. They don’t even count. So make sure you don’t do that. Make sure you’re always mapping each URL to its new location, one-to-one, every single time, for every URL that’s moving.
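One low-tech way to enforce the one-to-one rule is to keep the whole mapping in a spreadsheet and generate explicit rules from it. Here’s a minimal sketch assuming a hypothetical redirects.csv with “old,new” rows; it emits Apache-style rules and refuses duplicate sources:

```python
import csv

def build_rules(mapping_file="redirects.csv"):
    """Emit one explicit 301 rule per moved URL; no wildcards."""
    rules, seen = [], set()
    with open(mapping_file, newline="") as f:
        for old, new in csv.reader(f):
            if old in seen:
                raise ValueError(f"Duplicate source URL: {old}")
            seen.add(old)
            rules.append(f"Redirect 301 {old} {new}")  # Apache mod_alias syntax
    return rules

for rule in build_rules():
    print(rule)
```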

2. Watch out for redirect chains

Two, watch out for chains. I think Google says something oddly specific, like no more than three, maybe five, hops in a chain. Just try to limit it as much as possible. By chains, I mean you have URL A, and you redirect it to B, and then later you decide to move it to a third location. Instead of going through a middleman, A to B to C, shorten it if you can. Go straight from the source to the destination, A to C.

3. Watch out for loops

Three, watch out for loops. Similarly, what can happen is you redirect URL A to URL B, then to another version C, and then back to A. It ends up chasing its tail and will never resolve, because you’re redirecting it in a loop. So watch out for things like that. One nifty way to check for both issues: Screaming Frog has a redirect chains report, so you can see whether you’re encountering any of this after you’ve implemented your redirects.
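If you’d rather sanity-check a redirect map before it ships, collapsing chains and catching loops is only a few lines of work. A sketch, where `redirects` maps each URL to its target:

```python
def flatten(redirects):
    """Collapse chains (A -> B -> C becomes A -> C) and raise on loops."""
    flat = {}
    for start in redirects:
        seen, current = {start}, start
        while current in redirects:
            current = redirects[current]
            if current in seen:  # back at a URL we already visited: a loop
                raise ValueError(f"Redirect loop involving {start}")
            seen.add(current)
        flat[start] = current  # point straight at the final destination
    return flat

print(flatten({"/a": "/b", "/b": "/c"}))
# {'/a': '/c', '/b': '/c'}  (no middleman hops left)
```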

4. 404 strategically

Number four, 404 strategically. The presence of 404s on your site alone is not going to hurt your site’s rankings. What causes issues is letting pages die that were ranking and bringing your site traffic. Obviously, if a page is 404ing, Google will eventually drop it from the index if you don’t redirect it to its new location. If that page was ranking really well or bringing your site traffic, you’re going to lose those benefits, and if it had links pointing to it, you’re going to lose the benefit of those backlinks when it dies.

So if you’re going to 404, just do it strategically. You can let pages die. Like in these situations, maybe you’re just outright deleting a page and it has no new location, nothing relevant to redirect it to. That’s okay. Just know that you’re going to lose any of the benefits that URL was bringing your site.

5. Prioritize “SEO valuable” URLs

Number five, prioritize “SEO valuable” URLs. I say that even though, obviously, I prefer that you redirect everything that’s legitimately moving.

But because of situations like deadlines and things like that, when we’re down to the wire, I think it’s really important to at least have started out with your most important URLs. So those are URLs that are ranking really well, giving you a lot of good traffic, URLs that you’ve earned links to. So those really SEO valuable URLs, if you have a deadline and you don’t get to finish all of your redirects before this project goes live, at least you have those most critical, most important URLs handled first.

Again, obviously, it’s not ideal, I don’t think in my mind, to save any until after the launch. Obviously, I think it’s best to have them all set up by the time it goes live. But if that’s not the case and you’re getting rushed and you have to launch, at least you will have handled the most important URLs for SEO value.

6. Test!

Number six, just to end it off, test. I think it’s super important to monitor these things, because you could think you’ve set these all up right, but maybe there were some formatting errors, or maybe you mistakenly redirected something to the wrong place. So what you can do is run a site:domain.com search and just start clicking on all the results that come up, checking whether any of them redirect to the wrong place or 404.

Just checking all of those indexed URLs to make sure that they’re going to a proper new destination. I think Moz’s Site Crawl is another huge benefit here for testing purposes. What it does, if you have a domain set up or a URL set up in a campaign in Moz Pro, it checks this every week, and you can force another run if you want it to.

But it will scan your site for errors like this, 404s namely. So if there are any issues like that, 500 or 400 type errors, Site Crawl will catch it and notify you. If you’re not managing the domain that you’re working on in a campaign in Moz Pro, there’s on-demand crawl too. So you can run that on any domain that you’re working on to test for things like that.

There are plenty of other ways you can test and find errors. But the most important thing to remember is just to do it, just to test and make sure that even once you’ve implemented these things, that you’re checking and making sure that there are no issues after a launch. I would check right after a launch and then a couple of days later, and then just kind of taper off until you’re absolutely positive that everything has gone smoothly.
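A quick scripted spot check works too. Here’s a rough sketch using the requests library; the URL pair below is a placeholder, and you’d feed in your own redirect map or your most valuable URLs:

```python
import requests

# Placeholder mapping of old URLs to their expected destinations.
EXPECTED = {
    "https://example.com/best-plumber-dallas": "https://example.com/plumbers",
}

for old, expected_new in EXPECTED.items():
    resp = requests.get(old, allow_redirects=True, timeout=10)
    ok = resp.status_code == 200 and resp.url == expected_new
    print(f"{old} -> {resp.url} ({resp.status_code}) {'OK' if ok else 'CHECK'}")
```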

So those are my tips, those are my rules for how to implement redirects properly, why you need to, when you need to, and the risks that can happen with that. If you have any tips of your own that you’d like to share, pop them in the comments and share it with all of us in the SEO community. That’s it for this week’s Whiteboard Friday.

Come back again next week for another one. Thanks, everybody.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in Latest NewsComments Off

Full Funnel Testing: SEO & CRO Together – Whiteboard Friday

Posted by willcritchlow

Testing for only SEO or only CRO isn’t always ideal. Some changes result in higher conversions and reduced site traffic, for instance, while others may rank more highly but convert less well. In today’s Whiteboard Friday, we welcome Will Critchlow as he demonstrates a method of testing for both your top-of-funnel SEO changes and your conversion-focused CRO changes at once.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, everyone. Welcome to another Whiteboard Friday. My name is Will Critchlow, one of the founders at Distilled. If you’ve been following what I’ve been writing and talking about around the web recently, today’s topic may not surprise you that much. I’m going to be talking about another kind of SEO testing.

Over at Distilled, we’ve been investing pretty heavily in building out our capability to do SEO tests and in particular built our optimization delivery network, which has let us do a new kind of SEO testing that hasn’t been previously available to most of our clients. Recently we’ve been working on a new enhancement to this, which is full funnel testing, and that’s what I want to talk about today.

So funnel testing is testing all the way through the funnel, from acquisition at the SEO end through to conversion. It’s SEO testing plus CRO testing together. I’m going to write a little bit more about some of the motivation for this. But, in a nutshell, it boils down to the fact that it is perfectly possible, and in fact we’ve seen cases in the wild, for a test to win in SEO terms and lose in CRO terms, or vice versa.

In other words, tests that maybe you make a change and it converts better, but you lose organic search traffic. Or the other way around, it ranks better, but it converts less well. If you’re only testing one, which is common — I mean most organizations are only testing the conversion rate side of things — it’s perfectly possible to have a winning test, roll it out, and do worse.

CRO testing

So let’s step back a little bit. A little bit of a primer. Conversion rate optimization testing works in an A/B split kind of way. You can test on a single page, if you want to, or a site section. The way it works is you split your audience. So your audience is split. Some of your audience gets one version of the page, and the rest of the audience gets a different version.

Then you can compare the conversion rate between the group who got the control and the group who got the variant. That’s very straightforward. Like I say, it can happen on a single page or across an entire site. SEO testing is a little bit newer. The way it works is you can’t split the audience, because we care very much about the search engine spiders in this case, and for the purposes of this consideration, there’s essentially only one Googlebot. So you couldn’t put Google in Class A or Class B here and expect to get anything meaningful.

SEO testing

So the way that we do an SEO test is we actually split the pages. To do this, you need a substantial site section. So imagine, for example, an e-commerce website with thousands of products. You might have a hypothesis of something that will help those product pages perform better. You take your hypothesis and you only apply it to some of the pages, and you leave some of the pages unchanged as a control.

Then, crucially, search engines and users see the same experience. There’s no cloaking going on. There’s no duplication of content. You simply change some pages and not others. Then you apply fairly advanced statistical analysis to figure out whether these pages get significantly more organic search traffic than we think they would have gotten if we hadn’t made the change. So that’s how an SEO test works.
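As an illustration of the page-split idea (a hypothetical sketch, not Distilled’s actual ODN implementation), hashing each URL gives you a deterministic control/variant assignment, so every request for a given page, including Googlebot’s, sees the same version:

```python
import hashlib

def bucket(url: str) -> str:
    """Deterministically assign a page to the control or variant group."""
    digest = hashlib.md5(url.encode()).hexdigest()
    return "variant" if int(digest, 16) % 2 == 0 else "control"

for url in ["/products/red-widget", "/products/blue-widget"]:
    print(url, "->", bucket(url))  # stable across every request
```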

Now, as I said, the problem that we are trying to tackle here is it’s really plausible, despite Google’s best intentions to do what’s right for users, it’s perfectly plausible that you can have a test that ranks better but converts less well or vice versa. We’ve seen this with, for example, removing content from a page. Sometimes having a cleaner, simpler page can convert better. But maybe that was where the keywords were and maybe that was helping the page rank. So we’re trying to avoid those kinds of situations.

Full funnel testing

That’s where full funnel testing comes in. So I want to just run through how you run a full funnel test. What you do is you first of all set it up in the same way as an SEO test, because we’re essentially starting with SEO at the top of the funnel. So it’s set up exactly the same way.

Some pages are unchanged. Some pages get the hypothesis applied to them. As far as Google is concerned, that’s the end of the story, because on any individual request to these pages, that’s what we serve back. But the critically important thing here is my little character: a human browser who performs a search, “What do badgers eat?”

This was one of our silly examples that we came up with on one of our demo sites. The user lands on this page here, and we then set a cookie. As that user navigates around the site, no matter where they go within this site section, they get the same treatment, either the control or the variant, across the entire section. This part works more like the conversion rate test.

Googlebot = stateless requests

So what I didn’t show in this diagram is if you were running this test across a site section, you would cookie this user and make sure that they always saw the same treatment no matter where they navigated around the site. So because Googlebot is making stateless requests, in other words just independent, one-off requests for each of these of these pages with no cookie set, Google sees the split.

Evaluate SEO test on entrances

Users get whatever their first page impression looks like. They then get that treatment applied across the entire site section. So what we can do then is we can evaluate independently the performance in search, evaluate that on entrances. So do we get significantly more entrances to the variant pages than we would have expected if we hadn’t applied a hypothesis to them?

That tells us the uplift from an SEO perspective. So maybe we say, “Okay, this is plus 11% in organic traffic.” Well, great. So in a vacuum, all else being equal, we’d love to roll out this test.

Evaluate conversion rate on users

But before we do that, what we can do now is we can evaluate the conversion rate, and we do that based on user metrics. So these users are cookied.

We can also set an analytics tag on them and say, “Okay, wherever they navigate around, how many of them end up converting?” Then we can evaluate the conversion rate based on whether they saw treatment A or treatment B. Because we’re looking at conversion rate, the audience size doesn’t exactly have to be the same. So the statistical analysis can take care of that fact, and we can evaluate the conversion rate on a user-centric basis.
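For the statistics, a two-proportion z-test is one common way to compare the cookied control and variant audiences; the numbers below are hypothetical, and unequal sample sizes are fine:

```python
from math import sqrt

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test; |z| > 1.96 is roughly significant at 95%."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical data: control converts 4.0%, variant about 3.7%.
print(round(z_test(conv_a=480, n_a=12000, conv_b=430, n_b=11500), 2))
```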

So then maybe we see that it’s -5% in conversion rate, and we need to decide whether this is something we should roll out. Step 1 is: do we just roll it out? If it’s a win on both sides, then the answer is probably yes. If they point in different directions, there are a couple of things we can do. Firstly, we can weigh the relative performance, bearing in mind that conversion rate applies across all channels. A relatively small drop in conversion rate can be a really big deal compared to even a decent uplift in organic traffic, because the conversion rate change applies to every channel, not just your organic search channel.

But suppose that it’s a small net positive or a small net negative. What we can then do is we might get to the point that it’s a net positive and roll it out. Either way, we might then say, “What can we take from this? What can we actually learn?” So back to our example of the content. We might say, “You know what? Users like this cleaner version of the page with apparently less content on it.The search engines are clearly relying on that content to understand what this page is about. How do we get the best of both worlds?”

Well, that might be a question of a redesign, moving the layout of the page around a little bit, keeping the content on there, but maybe not putting it front and center to the user as they land right at the beginning. We can test those different things, run sequential tests, try and take the best of the SEO tests and the best of the CRO tests and get it working together and crucially avoid those situations where you think you’ve got a win, because your conversion rate is up, but you actually are about to crater your organic search performance.

We think that the more data-driven we get, and the more accountable SEO testing makes us, the more important it’s going to be to join these dots and make sure that we’re getting true uplifts on a net basis when we combine them. So I hope that’s been useful to some of you. Thank you for joining me on this week’s Whiteboard Friday. I’m Will Critchlow from Distilled.

Take care.

Video transcription by Speechpad.com


How to Get Into Google News – Whiteboard Friday

Posted by Polemic

Today we’re tackling a question that many of us have asked over the years: how do you increase your chances of getting your content into Google News? We’re delighted to welcome renowned SEO specialist Barry Adams to share the framework you need to have in place in order to have a chance of appearing in that much-coveted Google News carousel.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, everyone. I’m Barry Adams. I’m a technical SEO consultant at Polemic Digital and a specialist in news SEO. Today we’re going to be talking about how to get into Google News. I get a lot of questions from a lot of people about Google News and specifically how you get a website into Google News, because it’s a really great source of traffic for websites. Once you’re in the Google News Index, you can appear in the top stories carousel in Google search results, and that can send a lot of traffic your way.

How do you get into Google News’ manually curated index?

So how do you get into Google News? How do you go about getting your website into Google News’ manual index so that you can get that top stories traffic for yourself? Well, it’s not always as easy as it might appear. You have to jump through quite a few hoops before you get into Google News.

1. Have a dedicated news website

First of all, you have to have a dedicated news website. You have to keep in mind when you apply to be included in Google News, there’s a team of Googlers who will manually review your website to decide whether or not you’re worthy of being in the News index. That is a manual process, and your website has to be a dedicated news website.

I get a lot of questions from people asking if they have a news section or a blog on their site and if that could be included in Google News. The answer tends to be no. Google doesn’t want news websites in there that aren’t entirely about news, that are commercial websites that have a news section. They don’t really want that. They want dedicated news websites, websites whose sole purpose is to provide news and content on specific topics and specific niches.

So that’s the first hurdle and probably the most important one. If you can’t clear that hurdle, you shouldn’t even try getting into Google News.

2. Meet technical requirements

There are also a lot of other aspects that go into Google News. You have to jump through, like I said, quite a few hoops. Some technical requirements are very important to know as well.

Have static, unique URLs.

Google wants your articles and your section pages to have static, unique URLs so that an article or a section is always on the same URL and Google can crawl it and recrawl it on that URL without having to work with any redirects or other things. If you have content with dynamically generated URLs, that does not tend to work with Google News very well. So you have to keep that in mind and make sure that your content, both your articles and your static section pages are on fixed URLs that tend not to change over time.

Have your content in plain HTML.

It also helps to have all your content in plain HTML. When Google News indexes your content, it’s all about speed. It tries to index articles as fast as possible. So any content that requires client-side JavaScript or other scripting tends not to work for Google News. Google has a two-stage indexing process, where the first stage is based on the HTML source code and the second stage is based on a complete render of the page, including executing JavaScript.

For Google News, that doesn’t work. If your content relies on JavaScript execution, it will never be seen by Google News. Google News only uses the first stage of indexing, based purely on the HTML source code. So keep your JavaScript to a minimum and make sure that the content of your articles is present in the HTML source code and does not require any JavaScript to be seen to be present.

Have clean code.

It also helps to have clean code. By clean code, I mean that the article content in the HTML source code should be one continuous block of code from the headline all the way to the end. That tends to result in the best and most efficient indexing in Google News, because I’ve seen many examples where websites put things in the middle of the article code, like related articles or video carousels, photo galleries, and that can really mess up how Google News indexes the content. So having clean code and make sure the article code is in one continuous block of easily understood HTML code tends to work the best for Google News.

3. Optional (but more or less mandatory) technical considerations

There’s also quite a few other things that are technically optional, but I see them as pretty much mandatory because it really helps with getting your content picked up in Google News very fast and also makes sure you get that top stories carousel position as fast as possible, which is where you will get most of your news traffic from.

Have a news-specific XML sitemap.

Primarily, there’s the news XML sitemap. Google says this is optional but recommended, and I agree with them on that. Having a news-specific XML sitemap that lists the articles you’ve published in the last 48 hours, up to a maximum of 1,000 articles, is absolutely necessary. For me, this is Google News’ primary discovery mechanism when it crawls your website and tries to find new articles.

So that news-specific XML sitemap is absolutely crucial, and you want to make sure you have that in place before you submit your site to Google News.
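For reference, this is roughly what generating one looks like. A sketch with placeholder URLs and publication details, following the news sitemap namespace that Google documents:

```python
from datetime import datetime, timedelta

def news_sitemap(articles, publication="Example News", language="en"):
    """Build a news sitemap: last 48 hours only, capped at 1,000 articles."""
    cutoff = datetime.utcnow() - timedelta(hours=48)
    recent = [a for a in articles if a["published"] >= cutoff][:1000]
    entries = "".join(
        "  <url>\n"
        f"    <loc>{a['url']}</loc>\n"
        "    <news:news>\n"
        "      <news:publication>\n"
        f"        <news:name>{publication}</news:name>\n"
        f"        <news:language>{language}</news:language>\n"
        "      </news:publication>\n"
        f"      <news:publication_date>{a['published']:%Y-%m-%dT%H:%M:%SZ}</news:publication_date>\n"
        f"      <news:title>{a['title']}</news:title>\n"
        "    </news:news>\n"
        "  </url>\n"
        for a in recent)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
            '        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">\n'
            + entries + "</urlset>")

print(news_sitemap([{"url": "https://example.com/story",
                     "title": "Example Story",
                     "published": datetime.utcnow()}]))
```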

Mark up articles with NewsArticle structured data.

I also think it’s very important to mark up your articles with news article structured data. It can be just article structured data or even more specific structured data segments that Google is introducing, like news article analysis and news article opinion for specific types of articles.

But article or news article markup on your article pages is pretty much mandatory. I see your likelihood of getting into the top stories carousel much improved if you have that markup implemented on your article pages.
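Here’s what that markup can look like, serialized as JSON-LD for the article page’s head. All the values are placeholders for your own content:

```python
import json

# Placeholder NewsArticle structured data (schema.org vocabulary).
markup = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example Article Headline",
    "datePublished": "2019-03-01T09:00:00Z",
    "author": {"@type": "Person", "name": "Jane Reporter"},
    "publisher": {
        "@type": "Organization",
        "name": "Example News",
        "logo": {"@type": "ImageObject", "url": "https://example.com/logo.png"},
    },
    "image": ["https://example.com/article-lead.jpg"],
}
print(f'<script type="application/ld+json">{json.dumps(markup)}</script>')
```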

Helpful-to-have extras:

Also, like I said, this is a manually curated index. So there are a few extra hoops that you want to jump through to make sure that when a Googler looks at your website and reviews it, it ticks all the boxes and it appears like a trustworthy, genuine news website.

A. Multiple authors

Having multiple authors contribute to your website is hugely valuable, hugely important, and it does tend to elevate you above all the other blogs and small sites that are out there and makes it a bit more likely that the Googler reviewing your site will press that Approve button.

B. Daily updates

Having daily updates is definitely necessary. You don’t want just one news post every couple of days. Ideally, publish multiple new articles every single day, and they should mostly be unique. You can have some syndicated content on there, from feeds like AP or Reuters or whatever, but the majority of your content needs to be your own unique work. You don’t want to rely too much on syndicated articles to fill your website with news content.

C. Mostly unique content

Try to write as much unique content as you possibly can. There isn’t really a clear ratio for this, but generally speaking, I recommend my clients have at least 70% of their content be unique stuff that they write and publish themselves, and at most 30% syndicated content from external sources.

D. Specialized niche/topic

It really helps to have a specialized niche or a specialized topic that you focus on as a news website. There are plenty of news sites out there that are general news and try to do everything, and Google News doesn’t really need many more of those. What Google is interested in is niche websites on specific topics, specific areas that can provide in-depth reporting on those specific industries or topics. So if you have a very niche topic or a niche industry that you cover with your news, it does tend to improve your chances of getting into that News Index and getting that top stories carousel traffic.

So that, in a nutshell, is how you get into Google News. It might appear to be quite simple, but, like I said, quite a few hoops for you to jump through, a few technical things you have to implement on your website as well. But if you tick all those boxes, you can get so much traffic from the top stories carousel, and the rest is profit. Thank you very much.

This has been my Whiteboard Friday.


Video transcription by Speechpad.com


On-Page SEO for 2019 – Whiteboard Friday

Posted by BritneyMuller

Whew! We made it through another year, and it seems like we’re past due for taking a close look at the health of our on-page SEO practices. What better way to hit the ground running than with a checklist? In today’s Whiteboard Friday, the fabulous Britney Muller shares her best tips for doing effective on-page SEO in 2019.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans. Welcome to another edition of Whiteboard Friday. Today we’re going over all things on-page SEO, and I’ve divided it into three different sections:

  1. How are crawlers and Googlebot crawling through your site and your web pages?
  2. What is the UX of your on-page content?
  3. What is the value of your on-page content?

So let’s just jump right in, shall we?

Crawler/bot-accessible

☑ Meta robots tag allows crawling

Making sure your meta robots tag allows crawling is essential. If it’s blocking Googlebot from crawling, your page will never appear in search. You want to make sure that’s all squared away.

☑ Robots.txt doesn’t disallow crawling

You want to make sure that let’s say this page that you’re trying to get to rank in search engines, that you’re not disallowing this URL from your robots.txt.

☑ URL is included in sitemap

Similarly, you want to make sure that the URL is included in your sitemap.

☑ Schema markup

You also want to add any schema markup, any relevant schema markup that you can. This is essentially spoon-feeding search engines what your page is about and what your content is about.

☑ Internal links pointing to your page with natural anchor text

So let’s say I am trying to rank for chakra stones. Maybe I’m on a yoga website and I want to make sure that I have other internal pages linking to chakra stones with the anchor text “chakra crystals” or “chakra stones” and making sure that I’m showing Google that this is indeed an internally linked page and it’s important and we want to give it some weight.

☑ HTTPS – SSL

You want to make sure that the page is served securely over HTTPS and that Google is taking that into consideration as well.
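Here’s a quick sketch that ties these crawl checks together for a single URL, using only the Python standard library. The URL is a placeholder; a real audit would also verify the sitemap entry and internal links:

```python
import re
import urllib.request
import urllib.robotparser
from urllib.parse import urlsplit

def check_crawlability(url: str, user_agent: str = "Googlebot") -> dict:
    """Check HTTPS, robots.txt permission, and the meta robots tag."""
    report = {"https": url.startswith("https://")}

    parts = urlsplit(url)
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()
    report["robots_txt_allows"] = rp.can_fetch(user_agent, url)

    html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
    meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I)
    report["meta_robots"] = meta.group(0) if meta else "none (indexable by default)"
    return report

print(check_crawlability("https://example.com/chakra-stones"))
```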

User experience

☑ Meets Web Content Accessibility Guidelines

Does it meet Web Content Accessibility Guidelines? Definitely look into that and make sure you check all the boxes.

☑ Responsive mobile design with same content and links

Is it responsive for mobile? Super important with the mobile-first indexing.

☑ Clear CTA

Is there one clear call to action? A lot of pages miss this. So, for this page, maybe I would have a big “Buy Chakra Crystals Here” button or link. That would be a clear CTA. It’s important to have.

☑ Multimedia: Evaluate SERP and add desired media

Are you providing other desired media types? Are there images and video and different forms of content on your page?

☑ Page speed: utilize CDNs, compress images, use reliable hosting

Are you checking the page speed? Are you using CDNs? Are you compressing your images? You want to check all of that.

☑ Integrate social sharing buttons

It’s the easiest thing. Make sure that people can easily share your content.

Content and value

This is where it gets really fun and strategic too.

☑ Unique, high-quality content

Are you providing high-quality content? So if you go to Google and you search “chakra stones” and you take a look at all of those results, are you including all of that good content into your page? Then are you making it even better? Because that should be the goal.

☑ Optimize for intent: Evaluate SERP and PPC, note which SERP features show up

You want to also optimize for intent. So you want to evaluate that SERP. If that search result page is showing tons of images or maybe videos, you should be incorporating that into your page as well, because clearly that’s what people are looking for.

You also want to evaluate the PPC. They have done so much testing on what converts and what doesn’t. So it’s silly not to take that into consideration when optimizing your page.

☑ Title tags and meta descriptions

What are those titles? What are those descriptions? What’s working? Title tags and meta descriptions are still so important. They’re the first impression for many of your visitors in Google. Are you enticing a click? Are you making that an enticing call to action to your site?

☑ Header tags

H1, H2, and H3 header tags are still super important. You want to make sure that the title of your page is the H1, and so forth. It’s worth checking all of that.

☑ Optimize images: compress, title file names, add alt text

Images are the biggest source of on-page bloat and a drag on site speed. So you want to make sure that your images are compressed and optimized, keeping your page fast and easily accessible to your users.

☑ Review for freshness

You want to review for freshness and make sure this is up-to-date content. Maybe take a look at your site's popular content from the last year or two and update it. This should be a continual rinse-and-repeat process: you want to keep updating the content on your site.

☑ Include commonly asked questions

It's such an easy thing to do, but it's commonly overlooked. AnswerThePublic does a great job of surfacing questions. Moz Keyword Explorer has a really great filter that surfaces some of the most commonly asked questions for a keyword term. I highly suggest you check that out and start incorporating some of those questions.


These help you target featured snippets. So if you're incorporating some of those questions, not only do you get the extra traffic, but you also open up opportunities to win featured snippets, expanding your real estate in search. Awesome. People Also Ask (PAA) boxes are also a great way to find commonly asked questions for a particular keyword.
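
One way to present common questions explicitly to search engines is FAQPage markup; a minimal sketch with a made-up question and answer:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What are chakra stones used for?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Chakra stones are crystals that practitioners associate with the body's seven chakras and use during meditation."
    }
  }]
}
</script>
```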

☑ Add summaries

Summaries are also hidden gems. We see Google seeking out summaries for content all of the time, providing them in featured snippets and other SERP features to help distill information for users. So if you can do that, not only will you make your content more easily scannable, but you'll also make it more accessible for search, which is great.

☑ TF-IDF (term frequency-inverse document frequency)

TF-IDF stands for "term frequency-inverse document frequency." It sounds a little intimidating, but it's actually pretty simple: how often is "chakra stones" mentioned on this particular page (term frequency), weighted against how rarely it appears across documents in general (inverse document frequency)? Terms that appear often on your page but are rare elsewhere score as highly relevant to that page. It's a classic relevance calculation in information retrieval, so keeping it in mind when writing your on-page content will help you in the long term.
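
Here's a rough sketch in Python of the classic calculation (the toy corpus is made up):

```python
import math

def tf_idf(term, doc, corpus):
    """Classic TF-IDF: how frequent a term is in one document,
    weighted by how rare it is across the whole corpus."""
    tf = doc.count(term) / len(doc)                       # term frequency
    docs_with_term = sum(1 for d in corpus if term in d)  # document frequency
    idf = math.log(len(corpus) / (1 + docs_with_term))    # inverse document frequency (smoothed)
    return tf * idf

corpus = [
    "chakra stones guide chakra stones meanings".split(),
    "yoga mat buying guide".split(),
    "meditation techniques for beginners".split(),
]
page = corpus[0]
print(tf_idf("chakra", page, corpus))  # higher score = more distinctive to this page
```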

☑ LSI (latent semantic indexing) for relevance

Similarly, LSI (latent semantic indexing), or LSA (latent semantic analysis) as it's sometimes referred to, is also about relevance. The idea is that if I'm talking about chakra stones, my page should also incorporate the other topics commonly related to that topic, which signals relevance.

☑ Flesch-Kincaid Readability Test

What is the readability of this page? Generally, the easier it is to read, the better, but you just want to keep an eye on that.
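
As a rough sketch in Python, here's the standard Flesch-Kincaid Grade Level formula with a naive vowel-group syllable counter (real readability tools count syllables more carefully):

```python
import re

def syllable_count(word):
    # Very rough heuristic: count groups of consecutive vowels
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(syllable_count(w) for w in words)
    # Flesch-Kincaid Grade Level formula
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

print(flesch_kincaid_grade("Chakra stones are crystals. Many people use them in meditation."))
```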

Bonus tip!

One final tip comes from Kameron Jenkins, who shared this on-page SEO trick on Twitter, and I love it so much. Kameron is a world-class writer, one of the best I've ever had the privilege of working with. The trick: find the top three ranking URLs for your target keyword.

So if I were to put "chakra stones" into Google and pull the top three URLs, I'd drop them into Moz Keyword Explorer and see what those three URLs are commonly ranking for. Then I'd use those shared keywords to optimize my page even better. It's genius, and it's very similar to some of the relevance ideas we were talking about over here.


So definitely try some of this stuff out. I hope this helps. I really look forward to any of your comments or questions down below in the comments section.

Thank you so much for joining me on this edition of Whiteboard Friday. I look forward to seeing you all again soon, so thanks. Have a good one.

Video transcription by Speechpad.com
