
3 Big Lessons from Interviewing John Mueller at SearchLove London – Whiteboard Friday

Posted by willcritchlow

When you’ve got one of Google’s most helpful and empathetic voices willing to answer your most pressing SEO questions, what do you ask? Will Critchlow recently had the honor of interviewing Google’s John Mueller at SearchLove London, and in this week’s edition of Whiteboard Friday he shares his best lessons from that session, covering the concept of Domain Authority, the great subdomain versus subfolder debate, and a view into the technical workings of noindex/nofollow.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, Whiteboard Friday fans. I'm Will Critchlow from Distilled. I found myself in Seattle and wanted to record another Whiteboard Friday video to talk through some things I learned when I got to sit down with John Mueller from Google at our recent SearchLove London conference.

So I got to interview John on stage, and, as many of you may know, John is a webmaster relations guy at Google and really a point of contact for many of us in the industry when there are technical questions or questions about how Google is treating different things. If you followed some of the stuff that I’ve written and talked about in the past, you’ll know that I’ve always been a little bit suspicious of some of the official lines that come out of Google and felt like either we don’t get the full story or we haven’t been able to drill in deep enough and really figure out what’s going on.

I was under no illusions that I might be able to completely fix this in one go, but I did want to grill John on a couple of specific things where I felt like we hadn't asked things clearly enough or got the full story. Today I wanted to run through a few things that I learned when John and I sat down together. A little side note: I found it really fascinating doing this kind of interview. I sat on stage in a kind of journalistic setting, which I had never done before. Maybe I'll do a follow-up Whiteboard Friday one day on things I learned about how to run interviews.

1. Does Google have a “Domain Authority” concept?

But the first thing that I wanted to quiz John about was this domain authority idea. So here we are on Moz. Moz has a proprietary metric called domain authority, DA. I feel like when, as an industry, we’ve asked Google, and John in particular, about this kind of thing in the past, does Google have a concept of domain authority, it’s got bundled up with feeling like, oh, he’s had an easy way out of being able to answer and say, “No, no, that’s a proprietary Moz metric. We don’t have that.”

I felt like that had got a bit confusing, because our suspicion is that there is some kind of an authority or a trust metric that Google has and holds at a domain level. We think that’s true, but we felt like they had always been able to wriggle out of answering the question. So I said to John, “Okay, I am not asking you do you use Moz’s domain authority metric in your ranking factors. Like we know that isn’t the case. But do you have something a little bit like it?”

Yes, Google has metrics that map into similar things

John said yes. His exact words were that they have metrics that "map into similar things." My way of phrasing it was: this is stuff that is at the domain level, it's based on things like link authority, and it's used to understand performance or to rank content across an entire domain. John said yes, they have something similar to that.

New content inherits those metrics

They use it in particular when they discover new content on an existing domain. New content, in some sense, can inherit some of the authority from the domain. This is part of the reason we figured they must have something like this: we've seen identical content perform differently on different sites, so we know there's something to it. John confirmed that until a piece of content has been around long enough to accumulate its own link metrics and usage metrics, it can inherit some of this stuff from the domain in the intervening time.

Not wholly link-based

He did also just confirm that it’s not just link-based. This is not just a domain-level PageRank type thing.

2. Subdomains versus subfolders

This led me into the second thing that I really wanted to get out of him, which was — and when I raised this, I got kind of an eye roll, “Are we really going down this rabbit hole” — the subdomain versus subfolder question. You might have seen me talk about this. You might have seen people like Rand talk about this, where we’ve seen cases and we have case studies of moving blog.example.com to example.com/blog and changing nothing else and getting an uplift.

We know something must be going on, and yet the official line out of Google has for a very long time been: “We don’t treat these things differently. There is nothing special about subfolders. We’re perfectly happy with subdomains. Do whatever is right for your business.” We’ve had this kind of back-and-forth a few times. The way I put it to John was I said, “We have seen these case studies. How would you explain this?”

They try to figure out what belongs to the site

To his credit, John said, “Yes, we’ve seen them as well.” So he said, yes, Google has also seen these things. He acknowledged this is true. He acknowledged that it happens. The way he explained it connects back into this Domain Authority thing in my mind, which is to say that the way they think about it is: Are these pages on this subdomain part of the same website as things on the main domain?

That's kind of the main question. They try to figure out, as he put it, "what belongs to this site." We all know of sites where subdomains are entirely different sites. If you think about a blogspot.com or a WordPress.com domain, subdomains might be owned and managed by entirely different people, and there would be no reason for that authority to pass across. What Google is trying to work out is, "Is this subdomain part of this main site?"

Sometimes this includes subdomains and sometimes not

He said sometimes they determine that it is, and sometimes they determine that it is not. If it is part of the site, in their estimation, then they will treat it as equivalent to a subfolder. This, for me, pretty much closes this loop. I think we understand each other now, which is Google is saying, in these certain circumstances, they will be treated identically, but there are circumstances where it can be treated differently.

My recommendation stays what it’s always been, which is 100% if you’re starting from the outset, put it on a subfolder. There’s no upside to the subdomain. Why would you risk the fact that Google might treat it as a separate site? If it is currently on a subdomain, then it’s a little trickier to make that case. I would personally be arguing for the integration and for making that move.

If it’s treated as part of the site, a subdomain is equivalent to a subfolder

Unfortunately, but somewhat predictably, I couldn't tie John down to any particular way of telling whether this is the case. If your content is currently on a subdomain, there isn't really any way of telling if Google is treating it differently, which is a shame, but it's somewhat predictable. At least we understand each other now, and I think we've got to the root of the confusion. These case studies are real. This is a real thing. Certainly in certain circumstances, moving from the subdomain to the subfolder can improve performance.
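If you do decide to consolidate, the mechanical part of the migration is a URL mapping deployed as 301 redirects. Here's a minimal sketch of that mapping in Python; the function name and the `blog` default are illustrative, not from the talk:

```python
from urllib.parse import urlsplit, urlunsplit

def subdomain_to_subfolder(url, subdomain="blog"):
    """Map a subdomain URL onto the equivalent subfolder URL,
    e.g. blog.example.com/my-post -> example.com/blog/my-post.
    This is the shape of the 301 redirect map you'd deploy."""
    parts = urlsplit(url)
    prefix = subdomain + "."
    if not parts.netloc.startswith(prefix):
        return url  # not on that subdomain; leave it alone
    root_host = parts.netloc[len(prefix):]
    path = parts.path if parts.path not in ("", "/") else ""
    return urlunsplit((parts.scheme, root_host,
                       "/" + subdomain + path,
                       parts.query, parts.fragment))
```

In practice you'd express the same rule in your web server or CDN configuration rather than application code, but the mapping itself is this simple.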

3. Noindex’s impact on nofollow

The third thing that I want to talk about is a little more geeked out and technical, and, in some sense, it leads to some bigger-picture lessons and thinking. A little while ago John caught us out by talking about how, if you have a page that you noindex and keep it that way for a long time, Google will eventually treat that as equivalent to noindex, nofollow.

In the long-run, a noindex page’s links effectively become nofollow

In other words, the links off that page, even if you've marked it noindex, follow, will effectively be nofollowed. We found that a little confusing and surprising. I certainly had assumed it didn't work that way, simply because the noindex, follow directive exists, and the fact that it's a thing seems to suggest that it ought to work that way.
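The behavior John described can be sketched as a toy model. To be clear, the 180-day threshold below is invented purely for illustration; Google has never published a timeframe for when a long-lived noindex starts to imply nofollow:

```python
def effective_directives(declared, noindexed_days, threshold_days=180):
    """Toy model of John's point: a page left noindexed long enough
    eventually behaves as noindex, nofollow, regardless of the
    declared follow directive. threshold_days is a made-up value."""
    effective = set(declared)
    if "noindex" in effective and noindexed_days >= threshold_days:
        # Once the page has been out of the index long enough, its
        # outgoing links stop passing signals, as if nofollow were set.
        effective.discard("follow")
        effective.add("nofollow")
    return effective
```

The practical takeaway is the same either way: don't rely on long-term noindex, follow pages to pass link signals.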

It’s been this way for a long time

It wasn't really so much about the specifics of this, but more: How did we not know this? How did this come about, and so forth? John talked about how, firstly, it has been this way for a long time. I think he was making the point that none of us noticed, so how big a deal can this really be? I put it back to him that this is a subtle thing and very hard to test, very hard to extract out the different confounding factors that might be going on.

I’m not surprised that, as an industry, we missed it. But the point being it’s been this way for a long time, and Google’s view and certainly John’s view was that this hadn’t been hidden from us so much as the people who knew this hadn’t realized that they needed to tell anyone. The actual engineers working on the search algorithm, they had a curse of knowledge.

The curse of knowledge: engineers didn’t realize webmasters had the wrong idea

They knew it worked this way, and they had never realized that webmasters didn’t know that or thought any differently. This was one of the things that I was kind of trying to push to John a little more was kind of saying, “More of this, please. Give us more access to the engineers. Give us more insight into their way of thinking. Get them to answer more questions, because then out of that we’ll spot the stuff that we can be like, ‘Oh, hey, that thing there, that was something I didn’t know.’ Then we can drill deeper into that.”

That led us into a little bit of a conversation about how John operates when he doesn’t know the answer, and so there were some bits and pieces that were new to me at least about how this works. John said he himself is generally not attending search quality meetings. The way he works is largely off his knowledge and knowledge base type of content, but he has access to engineers.

They’re not dedicated to the webmaster relations operation. He’s just going around the organization, finding individual Google engineers to answer these questions. It was somewhat interesting to me at least to find that out. I think hopefully, over time, we can generally push and say, “Let’s look for those engineers. John, bring them to the front whenever they want to be visible, because they’re able to answer these kinds of questions that might just be that curse of knowledge that they knew this all along and we as marketers hadn’t figured out this was how things worked.”

That was my quick run-through of some of the things that I learned when I interviewed John. We’ll link over to more resources and transcripts and so forth. But it’s been a blast. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


Content Comprehensiveness – Whiteboard Friday

Posted by KameronJenkins

When Google says they prefer comprehensive, complete content, what does that really mean? In this week’s episode of Whiteboard Friday, Kameron Jenkins explores actionable ways to translate the demands of the search engines into valuable, quality content that should help you rank.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, guys. Welcome to this week’s edition of Whiteboard Friday. My name is Kameron Jenkins, and I work here at Moz.

Today we're going to be talking about content comprehensiveness: what that means and why sometimes it can be confusing. I want to use an example scenario of a conversation that tends to go on between SEOs and Google. So here we go.

An SEO usually says something like, “Okay, Google, you say you want to rank high-quality content. But what does that really mean? What is high quality, because we need more specifics than that.”

Then Google goes, “Okay, high quality is something that’s comprehensive and complete. Yeah, it’s really comprehensive.” SEOs go, “Well, wait. What does that even mean?”

That’s kind of what this was born out of. Just kind of an explanation of what is comprehensive, what does Google mean when they say that, and how that differs depending on the query.

Here we have an example page, and I'll kind of walk you through it. It's just going to serve to demonstrate why, when Google says "comprehensive," that can mean something different for an e-commerce page than it would for a history of soccer page. It's really going to differ depending on the query, because people want all sorts of different kinds of things. Their intent is going to be different depending on what they're searching in Google. So the criteria are going to be different for comprehensiveness. So hopefully, by way of example, we'll be able to walk you through what comprehensiveness looks like for this one particular query. So let's just dive in.

1. Intent

All right. So first I’m going to talk about intent. I have here a Complete Guide to Buying a House. This is the query I used as an example. Before we dive in, even before we look into keyword research tools or anything like that, I think it’s really important to just like let the query sit with you for a little bit. So “guide to buying a house,” okay, I’m going to think about that and think about what the searcher probably wanted based on the query.

So first of all, I noticed “guide.” The word “guide” to me makes it sound like someone wants something very complete, very thorough. They don’t just want quick tips. They don’t want a quick bullet list. This can be longer, because someone is searching for a comprehensive guide.

“To buying a house,” that’s a process. That’s not like an add-to-cart like Amazon. It’s a step-by-step. There are multiple phases to that type of process. It’s really important to realize here that they’re probably looking for something a little lengthier and something that is maybe a step-by-step process.

Also, just looking at the query, "guide to buying a house," people are probably searching that if they've never bought a house before. If so, it's good to remember that your audience is in a phase where they have no idea what they're doing. It's important to understand your audience and understand that they're going to need very, very comprehensive, start-to-finish information.

2. Implications

Two, implications. This is again also before we get into any keyword research tools. By implications, I mean what is going to be the effect on someone after reading this? So the implications here, a guide to buying a house, that is a big financial decision. That’s a big financial purchase. It’s going to affect people’s finances and happiness and well-being, and Google actually has a name for that. In their Quality Rater Guidelines, they call that YMYL. So that stands for “your money, your life.”

Those types of pages are held to a really high standard, and rightfully so. If someone reads this, they’re going to get advice about how to spend their money. It’s important for us, as SEOs and writers crafting these types of pages, to understand that these are going to be held to a really high standard. I think what that could look like on the page is, because they’re making a big purchase like this, it might be a good sign of trustworthiness to maybe have some expert quotes in here. Maybe you kind of sprinkle those throughout your page. Maybe you actually have it written by an expert author instead of just Joe Schmoe blogger. Those are just some ideas for making a page really trustworthy, and I think that’s a key to comprehensiveness.

3. Subtopics

Number three here we have subtopics. There are two ways that I’ll walk you through finding subtopics to fit within your umbrella topic. I’m going to use Moz Keyword Explorer as an example of this.

Use Keyword Explorer to reveal subtopics

In Moz Keyword Explorer, you can search for different keywords and related keywords two different ways. You can type in a query. So you can type in something like “buy a house” or “home buying” or something like that. You start with your main topic, and what you’ll get as a result is a bunch of suggested keywords that you can also incorporate on your page, terms that are related to the term that you searched. This is going to be really great, because you’re going to start to notice themes emerge. Some of the themes I noticed were people tend to search for “home buying calculator,” like a can-I-afford-it type of calculator. A lot of people search financial-related things obviously, bad credit. I filed for bankruptcy, can I still buy a house? You’ll start to see subthemes emerge.

Then I also wanted to mention that, in Moz Keyword Explorer, you can also search by URL. What I might do is query my term that I’m trying to target on my page. I’m going to pick the top three URLs that are ranking. You pop them into Keyword Explorer, and you can compare them and you can see the areas of most overlap. So what you’ll get essentially is a list of keywords that the top ranking pages for that term also rank for. That’s going to be a really good way to mine some extra keyword ideas for your page to make it more comprehensive.
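The "areas of most overlap" idea boils down to counting how many of the top-ranking pages share each keyword. A minimal sketch, assuming you've already exported each page's ranking keywords as a set (the data below is invented):

```python
from collections import Counter

def shared_keywords(rankings_by_url, min_urls=2):
    """Return keywords that at least `min_urls` of the top-ranking
    pages rank for: a plain-sets version of comparing URLs in a
    keyword research tool to find overlap."""
    counts = Counter(kw for kws in rankings_by_url.values() for kw in kws)
    return {kw for kw, n in counts.items() if n >= min_urls}
```

Anything surfaced this way is a candidate subtopic for your own page, since several competitors already rank for it.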

4. Questions

Then here we go. We have step four. After we’ve come up with some subtopics, I think it’s also a really good idea to mine questions and try to find what questions our audience is actually asking. So, for these, I like to use Answer the Public and Keywords Everywhere. Those are two really great tools that I kind of like to use in tandem.

Use Answer the Public to mine questions

Answer the Public, if you’ve never used it, is a really fun tool. You can put in a keyword, and you get a huge list. Depending on how vague your query is, you might get a ton of ideas. If your query is really specific, you might not get as many keyword ideas back. But it’s a really great way to type in a keyword, like “buying a house” or “buy a house” or “home buying” or something like that, and get a whole, big, long list of questions that your audience is asking. People that want to know how to buy a house, they’re also asking these questions.

I think a comprehensive page will answer those questions. But it can be a little bit overwhelming. There’s going to be probably a lot of questions potentially to answer. So how do you prioritize and choose which questions are the best to address on your page?

Use Keywords Everywhere to highlight keywords on a page

That’s where the Keywords Everywhere plug-in comes in handy. I use it in Chrome. You can have it highlight the keywords on the page. I think I have mine set to highlight anything that’s searched 50 or more times a month. That’s a really good way to gauge, just right off the bat you can see, okay, now there are these 10 instead of these 100 questions to potentially answer on my page.

So examples of questions here, I have questions like: Can I afford this? Is now the right time to buy? So you can kind of fit those into your page and answer those questions.
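The prioritization step described above (keep only questions searched 50+ times a month) is a simple threshold-and-sort. A sketch, with invented question data; the function name and structure are mine, not from any tool's API:

```python
def prioritize_questions(questions, min_monthly_searches=50):
    """Keep only questions searched at least `min_monthly_searches`
    times a month, highest volume first, mirroring the highlight
    threshold described above. `questions` maps question -> volume."""
    keep = [(volume, q) for q, volume in questions.items()
            if volume >= min_monthly_searches]
    return [q for volume, q in sorted(keep, reverse=True)]
```

This turns a hundred scraped questions into the ten or so that actually deserve a section on the page.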

5. Trends

Then finally here I have trends. I think this is a really commonly missed step. It’s important to remember that a lot of terms have seasonality attached to them. So what I did with this query, I queried “buy a house,” and I wanted to see if there were any trends for home buying-type of research queries in Google Trends. I zoomed out to five years to see if I could see year-over-year if there were any trends that emerged.

That was totally the case. When people are searching “buy a house,” it’s at its peak kind of around January into spring, and then in the summer it starts to dive, and then it’s at its lowest during the holidays. That kind of shows you that people are researching at the beginning of the year. They’re kind of probably moving into their house during the summertime, and then during the holidays they’ve had all the time to move in and now they’re just enjoying the holidays. That’s kind of the trend flow that it follows. That’s really key information, if you’re going to build a comprehensive page, to kind of understand that there’s seasonality attached with your term.

Because I know now that there’s seasonality with my term, I can incorporate information like what are the pros and cons of buying in peak season versus off-season for buying a house. Maybe what’s the best time of year to buy. Those are, again, other ideas for things that you can incorporate on your page to make it more comprehensive.
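The year-over-year comparison above is just averaging interest by calendar month across several years of data. A sketch, assuming you've exported Google Trends data as ("YYYY-MM", interest) pairs (the numbers below are invented):

```python
from collections import defaultdict

def monthly_seasonality(points):
    """Average interest per calendar month across multiple years of
    ("YYYY-MM", interest) points, e.g. from a Google Trends CSV
    export, to surface the January peak and holiday trough."""
    totals = defaultdict(int)
    counts = defaultdict(int)
    for year_month, value in points:
        month = year_month.split("-")[1]
        totals[month] += value
        counts[month] += 1
    return {month: totals[month] / counts[month] for month in totals}
```

Sorting the result by average tells you when to refresh and promote the page each year.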

This page is not comprehensive. I didn’t have enough room to fit some things. So you don’t just stop at this phase. If you’re really building a comprehensive page on this topic, don’t stop where I stopped. But this is kind of just an example of how to go about thinking through what Google means when they say make a page comprehensive. It’s going to mean something different depending on your query and just keep that in mind. Just think about the query, think about what your audience wanted based on what they searched, and you’ll be off to a great start building a comprehensive page.

I hope that was helpful. If you have any ideas for building your own comprehensive page, how you do that, maybe how it differs in different industries that you’ve worked in, pop it in the comments. That would be really good for us to share that information. Come back again next week for another edition of Whiteboard Friday.

Video transcription by Speechpad.com


What SEOs Can Learn from AdWords – Whiteboard Friday

Posted by DiTomaso

Organic and paid search aren't always at odds; there are times when there's benefit in knowing how they work together. Taking the time to know the ins and outs of AdWords can improve your rankings and on-site experience. In today's edition of Whiteboard Friday, our fabulous guest host Dana DiTomaso explains how SEOs can improve their game by taking cues from paid search.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, my name is Dana DiTomaso. I’m President and Partner at Kick Point, and one of the things that we do at Kick Point is we do both SEO and paid. One of the things that’s really useful is when SEO and paid work together. But what’s even better is when SEOs can learn from paid to make their stuff better.

One of the things that is great about AdWords or Google Ads — whenever you’re watching this, it may be called one thing or the other — is that you can learn a lot from what has a high click-through rate, what performs well in paid, and paid is way faster than waiting for Google to catch up to the awesome title tags you’ve written or the new link building that you’ve done to see how it’s going to perform. So I’m going to talk about four things today that you can learn from AdWords, and really these are easy things to get into in AdWords.

Don’t be intimidated by the interface. You can probably just get in there and look at it yourself, or talk to your AdWords person. I bet they’d be really excited that you know what a callout extension is. So we’re going to start up here.

1. Negative keywords

The first thing is negative keywords. Negative keywords, obviously really important. You don’t want to show up for things that you shouldn’t be showing up for.

Often when we need to take over an AdWords account, there aren’t a lot of negative keywords. But if it’s a well-managed account, there are probably lots of negatives that have been added there over time. What you want to look at is if there’s poor word association. So in your industry, cheap, free, jobs, and then things like reviews and coupons, if these are really popular search phrases, then maybe this is something you need to create content for or you need to think about how your service is presented in your industry.

Then what you can do to change that is to see if there’s something different that you can do to present this kind of information. What are the kinds of things your business doesn’t want? Are you definitely not saying these things in the content of your website? Or is there a way that you can present the opposite opinion to what people might be searching for, for example? So think about that from a content perspective.
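The negative keyword logic itself is simple word-level containment: if a search term contains any negative, it's excluded. A sketch of that split, useful when auditing a search terms export offline (function name and data are mine, not AdWords API calls):

```python
def split_by_negatives(search_terms, negatives):
    """Split search terms into (keep, exclude) lists using word-level
    broad negatives: a term is excluded if it contains any negative
    keyword as a whole word."""
    neg = {n.lower() for n in negatives}
    keep, exclude = [], []
    for term in search_terms:
        bucket = exclude if set(term.lower().split()) & neg else keep
        bucket.append(term)
    return keep, exclude
```

Running your organic query data through the same list is a quick way to spot the "cheap/free/jobs" themes the section above describes.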

2. Title tags and meta descriptions

Then the next thing are title tags and meta descriptions. Title tags and meta descriptions should never be a write it once and forget it kind of thing. If you’re an on-it sort of SEO, you probably go in every once in a while and try to tweak those title tags and meta descriptions. But the problem is that sometimes there are just some that aren’t performing. So go into Google Search Console, find the title tags that have low click-through rate and high rankings, and then think about what you can do to test out new ones.

Then run an AdWords campaign and test out those title tags in the title of the ad. Test out new ad copy — that would be your meta descriptions — and see what actually brings a higher click-through rate. Then whichever one does, ta-da, that’s your new title tags and your meta descriptions. Then add those in and then watch your click-through rate increase or decrease.
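The first step, picking candidates out of Search Console, is a filter on two columns. A sketch, assuming rows shaped like (page, ctr, average_position); the 2% CTR and top-5 position thresholds are arbitrary starting points, not official guidance:

```python
def title_test_candidates(rows, max_ctr=0.02, max_position=5.0):
    """Pages that rank well but get few clicks: the ones worth testing
    new titles and meta descriptions on via ad copy. `rows` mimics a
    Search Console performance export of (page, ctr, avg_position)."""
    return [page for page, ctr, position in rows
            if ctr <= max_ctr and position <= max_position]
```

Whatever survives this filter becomes the pool of pages whose title and description variants you trial as ad headlines and copy.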

Make sure to watch those rankings, because obviously title tag changes can have an impact on your rankings. But if it’s something that’s keyword rich, that’s great. I personally like playing with meta descriptions, because I feel like meta descriptions have a bigger impact on that click-through rate than title tags do, and it’s something really important to think about how are we making this unique so people want to click on us. The very best meta description I’ve ever seen in my life was for an SEO company, and they were ranking number one.

They were obviously very confident in this ranking, because it said, “The people above me paid. The people below me aren’t as good as me. Hire me for your SEO.” I’m like, “That’s a good meta description.” So what can you do to bring in especially that brand voice and your personality into those titles, into those meta descriptions and test it out with ads first and see what’s going to resonate with your audience. Don’t just think about click-through rate for these ads.

Make sure that you’re thinking about conversion rate. If you have a really long sales cycle, make sure those leads that you’re getting are good, because what you don’t want to have happen is have an ad that people click on like crazy, they convert like crazy, and then the customers are just a total trash fire. You really want to make sure you’re driving valuable business through this kind of testing. So this might be a bit more of a longer-term piece for you.

3. Word combinations

The third thing you can look at are word combinations.

So if you’re not super familiar with AdWords, you may not be familiar with the idea of broad match modifier. So in AdWords we have broad phrases that you can search for, recipes, for example, and then anything related to the word “recipe” will show up. But you could put in a phrase in quotes. You could say “chili recipes.” Then if they say, “I would like a chili recipe,” it would come up.

If it says "chili crockpot recipes," it would not come up. Now if you had +chili +recipes, then any query containing both "chili" and "recipes," in any order, would come up, which can be really useful. If you have a lot of different keyword combinations and you don't have time for that, you can use broad match modifier to capture a lot of them. But then you have to have a good negative keyword list, speaking as an AdWords person for a second.
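The difference between the two match types can be sketched in a few lines. This is a simplified model — real AdWords matching also handles close variants like plurals and misspellings, which this ignores:

```python
def phrase_match(keyword, query):
    """Phrase match: the keyword must appear in the query as a
    contiguous run of words, in order."""
    kw, words = keyword.lower().split(), query.lower().split()
    return any(words[i:i + len(kw)] == kw
               for i in range(len(words) - len(kw) + 1))

def broad_match_modifier(keyword, query):
    """Broad match modifier (+chili +recipes): every +term must appear
    somewhere in the query, in any order."""
    required = {t.lstrip("+").lower() for t in keyword.split()}
    return required <= set(query.lower().split())
```

So "chili crockpot recipes" fails phrase match on "chili recipes" (the words aren't adjacent) but passes +chili +recipes, which is exactly the extra reach described above.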

Now one of the things that can really come out of broad match modifier are a lot of great, new content ideas. If you look at the keywords that people had impressions from or clicks from as a result of these broad match modifier keywords, you can find the strangest phrasing that people come up with. There are lots of crazy things that people type into Google. We all know this, especially with voice search, where it's often obvious the query was spoken.

One of the fun things to do is look and see if anybody has “okay Google” and then the search phrase, because they said “okay Google” twice and then Google searched “okay Google” plus the phrase. That’s always fun to pick up. But you can also pick up lots of different content ideas, and this can help you modify poorly performing content for example. Maybe you’re just not saying the thing in the way in which your audience is saying it.

AdWords gives you totally accurate data on what your customers are thinking and feeling and saying and searching. So why not use that kind of data? So definitely check out broad match modifier stuff and see what you can do to make that better.

4. Extensions

Then the fourth thing is extensions. So extensions are those little snippets that can show up under an ad.

You should always have all of the extensions loaded in, and then maybe Google picks some, maybe they won't, but at least they're there as an option. Now one thing that's great are callout and sitelink extensions. Sitelinks are the little links like "free trial" that people click on, or "find out more information" or "menu" or whatever it might be, and callouts are the short snippets shown alongside them. Now testing language in those extensions can help you with your call-to-action buttons.

Especially if you’re thinking about things like people want to download a white paper, well, what’s the best way to phrase that? What do you want to say for things like a submit button for your newsletter or for a contact form? Those little, tiny pieces, that are called micro-copy, what can you do by taking your highest performing callout extensions and then using those as your call-to-action copy on your website?

This is really going to improve your lead click-through rate. You’re going to improve the way people feel about you, and you’re going to have that really nice consistency between the language that you see in your advertising and the language that you have on your website, because one thing you really want to avoid as an SEO is to get into that silo where this is SEO and this is AdWords and the two of you aren’t talking to each other at all and the copy just feels completely disjointed between the paid side and the organic side.

It should all be working together. So by taking the time to understand AdWords a little bit, getting to know it, getting to know what you can do with it, and then using some of that information in your SEO work, you can improve your on-site experience as well as rankings, and your paid person is probably going to appreciate that you talked to them for a little bit.

Thanks.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


Posted in Latest News | Comments Off

YouTube SEO: Top Factors to Invest In – Whiteboard Friday

Posted by randfish

If you have an audience on YouTube, are you doing everything you can to reach them? Inspired by a large-scale study from Justin Briggs, Rand covers the top factors to invest in when it comes to YouTube SEO in this week’s episode of Whiteboard Friday.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about YouTube SEO. So I was lucky enough to be speaking at the SearchLove conference down in San Diego a little while ago, and Justin Briggs was there presenting on YouTube SEO and on a very large-scale study that he had conducted with, I think it was, 100,000 different video rankings across YouTube’s search engine, as well as looking at the performance of many thousands of channels and individual videos in YouTube.

Justin came up with some fascinating results. I’ve called them out here @justinrbriggs on Twitter, and his website is Briggsby.com. You can find this study, including an immense amount of data, there. But I thought I would try and sum up some of the most important points that he brought up and some of the conclusions he came to in his research. I do urge you to check out the full study, especially if you’re doing YouTube SEO.

5 crucial elements for video ranking success

So first off, there are some crucial elements for video ranking success. Now video ranking success, what do we mean by that? We mean if you perform a search query in YouTube for a specific keyword, and not necessarily a branded one, what are the things that will come up? So sort of like the same thing we talk about when we talk about Google success ranking factors, these are success factors for YouTube. That doesn’t necessarily mean that these are the things that will get you the most possible views. In fact, some of them work the other way.

1. Video views and watch time

First off, video views and watch time. So it turns out these are both very well correlated, and in Justin’s opinion probably causal, with higher rankings. So if you have a video and you’re competing against a competitor’s video, and you get more views and a greater amount of watch time on average per view (that’s how many people make it through a greater proportion of the video itself), you tend to do better than your competitors.

2. Keyword matching the searcher’s query in the title

Number two, keyword matching is, we think, still more important on YouTube than it is in classic Google search. That’s not to say it’s unimportant in classic Google, but on YouTube it’s an even bigger factor. Essentially, what Justin’s data showed is that exact-match keywords (exactly matching the keyword phrase in the video title) tended to outperform partial matches by a little bit, and partial matches outperformed none or only some by a considerable margin.

So if you’re trying to rank your video for what pandas eat and your video is called “What Pandas Eat,” that’s going to do much better than, for example, “Panda Consumption Habits” or “Panda Food Choices.” So describe and name your video in the same way that searchers are searching, and you can get insight into how searchers are using YouTube.

You can also use the data that comes back from Google keyword searches, especially if videos appear at the top of Google keyword searches, that means there’s probably a lot of demand on YouTube as well.

3. Shorter titles (<50 characters) with keyword-rich descriptions

Next up, shorter titles, less than 50 characters, with keyword-rich descriptions between 200 and 350 words tended to perform best in this dataset.

So if you’re looking for guidelines around how big should I make my YouTube title, how big should I make my description, that’s generally probably some best practices. If you leak over a little bit, it’s not a huge deal. The curve doesn’t fall off dramatically. But certainly staying around there is a good idea.
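To make those guidelines concrete, here’s a small sketch in Python. The helper is entirely made up (it’s not part of any YouTube API); it just flags metadata that falls outside the ranges the study found performed best.

```python
# Hypothetical helper: check video metadata against the ranges from the
# study, i.e. a title under 50 characters and a description of 200-350 words.

def check_metadata(title, description):
    """Return a list of warnings for metadata outside the studied ranges."""
    warnings = []
    if len(title) >= 50:
        warnings.append(f"title is {len(title)} chars; under 50 performed best")
    word_count = len(description.split())
    if not 200 <= word_count <= 350:
        warnings.append(f"description is {word_count} words; 200-350 performed best")
    return warnings

# A 240-word description and a 15-character title pass with no warnings.
print(check_metadata("What Pandas Eat", "Pandas mostly eat bamboo. " * 60))  # []
```

As the transcript says, leaking over a little isn’t a huge deal, so treat these as soft targets rather than hard validation rules.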

4. Keyword tags

Number four, keyword tags. So YouTube will let you apply keyword tags to a video.

This is something that used to exist in Google SEO decades ago with the meta keywords tag. It still does exist in YouTube. These keyword tags seem to matter a little for rankings, but they seem to matter more for the recommended videos. So those recommended videos are sort of what appear on the right-hand side of the video player if you’re in a desktop view or below the video on a mobile player.

Those recommended videos are also kind of what plays next when you keep watching a video. So both of those figure prominently into earning you more views, which can then help your rankings, of course. So use keyword tags of two to three words each, and usually the videos that Justin’s dataset saw performing best were those with 31 to 40 unique tags, which is a pretty hefty number.

That means folks are going through and they’re taking their “What Pandas Eat” and they’re tagging it with pandas, zoo animals, mammals, and they might even be tagging it with marsupials — I think pandas are a marsupial — but those kinds of things. So they’re adding a lot of different tags on there, 31 to 40, and those tended to do the best.

So if you’re worried that adding too many keyword tags can hurt you, maybe it can, but not up until you get to a pretty high limit here.

5. Certain video lengths perform and rank well

Number five, the videos that perform best — I like that this correlates with how Whiteboard Fridays do well as well — 10 to 16 minutes in length tend to do best in the rankings. Under two minutes in length tend to be very disliked by YouTube’s audience. They don’t perform well. Four to six minutes get the most views. So it depends on what you’re optimizing for. At Whiteboard Friday, we’re trying to convey information and make it useful and interesting and valuable. So we would probably try and stick to 10 to 16 minutes. But if we had a promotional video, for example, for a new product that we were launching, we might try and aim for a four to six minute video to get the most views, the most amplification, the most awareness that we possibly could.

3 takeaways of interest

Three other takeaways of interest that I just found potentially valuable.

Older videos do better on average, but new videos get a boost

One is that older videos on average tend to do better in the rankings, but new videos get a boost when they initially come out. So in the dataset, Justin created a great graph that buckets videos by age: zero to two weeks after a video is published, two to six weeks, six to twelve weeks, after a year, and a few other buckets in between.

But you can see the slope of this curve follows the concept that there’s a fresh boost in those first two to six weeks, and it’s strongest in the first zero to two weeks. So if you are publishing regularly and you have that cycle of, “Oh, this video didn’t hit. Let me try again. This video didn’t hit. Oh, this one got it. This nailed what my audience was looking for. This was really powerful,” that approach seems to do quite well.

Channels help boost their videos

Channels are something Justin looked deeply into. I haven’t covered it much here, but he looked into channel optimization a lot. Channels do help boost their individual videos, with things like subscribers who comment, like, and have a higher watch time on average than viewers who are disconnected from subscribers. He noted that about 1,000 or more subscribers is a really good target to start to benefit from the metrics that a good subscriber base can bring. These tend to have a positive impact on views and also on rankings, although whether that’s causal or merely correlated is hard to say.

Embeds and links are correlated, but unsure if causal

Again on the correlation but not causation, embeds and links. So the study looked at the rankings, higher rankings up here and lower rankings down there, versus embeds.

Videos that received more embeds (they were embedded on more websites) did tend to perform better. But through experimentation, it’s not quite clear that embedding a video a lot will increase its rankings. It could just be that as something ranks well and gets picked up a lot, many people embed it, rather than many embeds leading to better rankings.

All right, everyone, if you’re producing video, which I probably recommend that you do if video is ranking in the SERPs that you care about or if your audience is on YouTube, hopefully this will be helpful, and I urge you to check out Justin’s research. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


The Difference Between URL Structure and Information Architecture – Whiteboard Friday

Posted by willcritchlow

Questions about URL structure and information architecture are easy to get confused, but it’s an important distinction to maintain. IA tends to be more impactful than URL decisions alone, but advice given around IA often defaults to suggestions on how to best structure your URLs. In this Whiteboard Friday, Will Critchlow helps us distinguish between the two disparate topics and shares some guiding questions to ask about each.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, everyone. Welcome to a British Whiteboard Friday. My name is Will Critchlow. I’m one of the founders of Distilled, and I wanted to go back to some basics today. I wanted to cover a little bit of the difference between URL structure and information architecture, because I see these two concepts unfortunately mixed up a little bit too often when people are talking about advice that they want to give.

I’m thinking here particularly from an SEO perspective. So there is a much broader study of information architecture. But here we’re thinking really about: What do the search engines care about, and what do users care about when they’re searching? So we’ll link some basics about things like what is URL structure, but we’re essentially talking here about the path, right, the bit that comes after the domain www.example.com/whatever-comes-next.

There are a couple of main ways of structuring your URLs: a subfolder type of structure, or a much flatter structure where everything is collapsed into one level. There are pros and cons of the different ways of doing this, and there’s a ton of advice. You’re generally trading off competing considerations: in general, it’s better to have shorter URLs than longer URLs, but it’s also better, on average, to have your keyword in there than not to have it.

These are in tension. So there’s a little bit of art that goes into structuring good URLs. But too often I see people, when they’re really trying to give information architecture advice, ending up talking about URL structure, and I want to just kind of tease those things apart so that we know what we’re talking about.

So I think the confusion arises because both of them can involve questions around which pages exist on my website and what hierarchies are there between pages and groups of pages.

URL questions

So what pages exist is clearly a URL question at some level. Literally if I go to /shoes/womens, is that a 200 status? Is that a page that returns things on my website? That is, at its basics, a URL question. But zoom out a little bit and say what are the set of pages, what are the groups of pages that exist on my website, and that is an information architecture question, and, in particular, how they’re structured and how those hierarchies come together is an information architecture question.

But it’s muddied by the fact that there are hierarchy questions in the URL. So when you’re thinking about your red women’s shoes subcategory page on an e-commerce site, for example, you could structure that in a flat way like this or in a subfolder structure. That’s just a pure URL question. But it gets muddied with the information architecture questions, which we’ll come on to.

I think probably one of the key ones that comes up is: Where do your detail-level pages sit? So on an e-commerce site, imagine a product page. You could have just /product-slug. Ideally that would have some kind of descriptive keywords in it, rather than just being an anonymous number. But you can have it just in the root like this, or you can put it in a subfolder, the category it lives in.

So if this is a pair of red women’s shoes, then you could have it in /shoes/women/red slug, for example. There are pros and cons of both of these. I’m not going to get deep into it, but in general the point is you can make any of these decisions about your URLs independent of your information architecture questions.
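As a purely illustrative sketch, here is the same red women’s shoes product page addressed under the two schemes just described. The helper names are invented for this example, not taken from any real platform:

```python
# Two ways of building the URL path for the same detail-level page.

def flat_url(slug):
    # flat structure: every detail page sits in the root
    return f"/{slug}"

def subfolder_url(category_path, slug):
    # subfolder structure: the slug nests under its category hierarchy
    return "/" + "/".join(category_path + [slug])

print(flat_url("red-womens-shoes"))                           # /red-womens-shoes
print(subfolder_url(["shoes", "womens"], "red-womens-shoes")) # /shoes/womens/red-womens-shoes
```

The point of the sketch is that this choice is mechanical: either function can be dropped in without touching any decision about which pages exist or how they link together.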

Information architecture questions

Let’s talk about the information architecture, because these are actually, in general, the more impactful questions for your search performance. So these are things like, as I said at the beginning, it’s essentially what pages exist and what are their hierarchies.

  • How many levels of category and subcategory should we have on our website?
  • What do we do in our faceted navigation?
  • Do we go two levels deep?
  • Do we go three levels deep?
  • Do we allow all those pages to be crawled and indexed?
  • How do we link between things?
  • How do we link between the sibling products that are in the same category or subcategory?
  • How do we link back up the structure to the parent subcategory or category?
  • How do we crucially build good link paths out from the big, important pages on our website, so our homepage or major category pages?
  • What’s the link path that you can follow by clicking multiple links from there to get to detail level for every product on your website?

Those kind of questions are really impactful. They make a big difference, on an SEO front, both in terms of crawl depth, so literally a search engine spider coming in and saying, “I need to discover all these pages, all these detail-level pages on your website.” So what’s the click depth and crawl path out from those major pages?
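Click depth, as described above, is just the shortest link path from a major page to each detail page, which you can compute with a breadth-first search. Here’s a minimal sketch over a made-up internal link graph (the pages and links are invented for illustration):

```python
from collections import deque

# A tiny made-up internal link graph; each key links out to the listed pages.
links = {
    "/": ["/shoes", "/bags"],
    "/shoes": ["/shoes/womens", "/"],
    "/shoes/womens": ["/red-womens-shoes"],
    "/bags": [],
    "/red-womens-shoes": [],
}

def click_depths(graph, start="/"):
    """Breadth-first search: shortest click path from `start` to each page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for linked in graph.get(page, []):
            if linked not in depths:  # first discovery = shortest path
                depths[linked] = depths[page] + 1
                queue.append(linked)
    return depths

print(click_depths(links))
# {'/': 0, '/shoes': 1, '/bags': 1, '/shoes/womens': 2, '/red-womens-shoes': 3}
```

In practice you’d build the graph from a crawl of your site, then look for detail-level pages whose depth is uncomfortably large.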

Think about link authority and your link paths

It’s also a big factor in a link authority sense. Your internal linking structure is how your PageRank and other link metrics get distributed out around your website, and so it’s really critical that you have these great linking paths down into the products, between important products, and between categories and back up the hierarchy. How do we build the best link paths from our important pages down to our detail-level pages and back up?

Make your IA decisions before your URL structure decisions

After you have made whatever IA decisions you like, then you can independently choose your preferred URLs for each page type.

These are SEO information architecture questions, and the critical thing to realize is that you can make all of your information architecture decisions (which pages exist, which subcategories we’re going to have indexed, how we link between sibling products, all of this linking stuff) and then, independently of whatever decisions we made, choose any URL structure we like for what those actual pages’ paths are.

We need to not get those muddied, and I see that getting muddied too often. People talk about these decisions as if they’re information architecture questions, and they make them first, when actually you should be making the IA decisions first and then picking the best URLs. Like I said, it’s a bit more art than science sometimes, deciding between longer, more descriptive URLs and shorter URL paths.

So I hope that’s been a helpful intro to a basic topic. I’ve written a bunch of this stuff up in a blog post, and we’ll link to that. But yeah, I’ve enjoyed this Whiteboard Friday. I hope you have too. See you soon.

Video transcription by Speechpad.com


How Do Sessions Work in Google Analytics? – Whiteboard Friday

Posted by Tom.Capper

One of these sessions is not like the other. Google Analytics data is used to support tons of important work, ranging from our everyday marketing reporting all the way to investment decisions. To that end, it’s essential that we’re aware of just how that data works.

In this week’s edition of Whiteboard Friday, we welcome Tom Capper to explain how the sessions metric in Google Analytics works, several ways that it can have unexpected results, and as a bonus, how sessions affect the time on page metric (and why you should rethink using time on page for reporting).

How do sessions work in Google Analytics?

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hello, Moz fans, and welcome to another edition of Whiteboard Friday. I am Tom Capper. I am a consultant at Distilled, and today I’m going to be talking to you about how sessions work in Google Analytics. Obviously, all of us use Google Analytics. Pretty much all of us use Google Analytics in our day-to-day work.

Data from the platform is used these days in everything from investment decisions to press reporting to the actual marketing that we use it for. So it’s important to understand the basic building blocks of these platforms. Up here I’ve got the absolute basics. So in the blue squares I’ve got hits being sent to Google Analytics.

So when you first put Google Analytics on your site, you get that bit of tracking code, you put it on every page, and what that means is when someone loads the page, it sends a page view. So those are the ones I’ve marked P. So we’ve got page view and page view and so on as you’re going around the site. I’ve also got events with an E and transactions with a T. Those are two other hit types that you might have added.

The job of Google Analytics is to take all this hit data that you’re sending it and try and bring it together into something that actually makes sense as sessions. So they’re grouped into sessions that I’ve put in black, and then if you have multiple sessions from the same browser, then that would be a user that I’ve marked in pink. The issue here is it’s kind of arbitrary how you divide these up.

These eight hits could be one long session. They could be eight tiny ones or anything in between. So I want to talk today about the different ways that Google Analytics will actually split up those hit types into sessions. So over here I’ve got some examples I’m going to go through. But first I’m going to go through a real-world example of a brick-and-mortar store, because I think that’s what they’re trying to emulate, and it kind of makes more sense with that context.

Brick-and-mortar example

So in this example, say a supermarket, we enter via passing trade; that’s going to be our source. Then we’ve got an entrance in the lobby of the supermarket when we walk in. We go from there to the beer aisle to the cashier, or at least I do. So that’s one big, long session with the source of passing trade. That makes sense.

In the case of a brick-and-mortar store, it’s not too difficult to divide that up and decide how many sessions are going on here. There’s not really any ambiguity. In the case of websites, when you have people leaving their keyboard for a while, or leaving the computer on while they go on holiday, or just using the same computer over a period of time, it becomes harder to divide things up, because you don’t know when people are actually coming and going.

So what they’ve tried to do is in the very basic case something quite similar: arrive by Google, category page, product page, checkout. Great. We’ve got one long session, and the source is Google. Okay, so what are the different ways that that might go wrong or that that might get divided up?

Several things that can change the meaning of a session

1. Time zone

The first, and possibly most annoying one, although it doesn’t tend to be a huge issue for most sites: whatever time zone you’ve set in your Google Analytics settings, midnight in that time zone can break up a session. So say we’ve got midnight here. This is 12:00 at night, and we happen to be browsing. We’re doing some shopping quite late.

Because Google Analytics won’t allow a session to have two dates, this is going to be one session with the source Google, and this is going to be one session and the source will be this page. So this is a self-referral unless you’ve chosen to exclude that in your settings. So not necessarily hugely helpful.

2. Half-hour cutoff for “coffee breaks”

Another thing that can happen is you might go and make a cup of coffee. Ideally, if you went and had a cup of coffee while you’re in Tesco, or whatever supermarket is popular in your country, you might want to consider that one long session. Google has made the executive decision that there will actually be a cutoff of half an hour by default.

If you leave for half an hour, then again you’ve got two sessions: one where the category page is the landing page and the source is Google, and one where, in this case, the blog is the landing page, and this would be another self-referral, because when you come back after your coffee break, you’re going to click through from here to here. This 30-minute time period is actually adjustable in your settings, but most people just leave it as it is, and there isn’t really an obvious number that would make this always correct either. It’s, like I said earlier, a kind of arbitrary distinction.
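The two splitting rules covered so far (the midnight boundary and the 30-minute inactivity timeout) can be sketched in a few lines of Python. This is a simplified model for intuition only; real Google Analytics also splits sessions on campaign-source changes, which this ignores:

```python
from datetime import datetime, timedelta

TIMEOUT = timedelta(minutes=30)  # GA's default, adjustable in settings

def sessionize(hit_times):
    """Group hit timestamps into sessions: new session after 30 min of
    inactivity, or when a hit falls on a new calendar date (midnight split)."""
    sessions = []
    for t in sorted(hit_times):
        last = sessions[-1][-1] if sessions else None
        if last is None or t - last > TIMEOUT or t.date() != last.date():
            sessions.append([t])   # start a new session
        else:
            sessions[-1].append(t)
    return sessions

hits = [
    datetime(2018, 11, 5, 23, 50),  # late-night browsing...
    datetime(2018, 11, 6, 0, 5),    # ...split at midnight despite a 15-min gap
    datetime(2018, 11, 6, 0, 10),
    datetime(2018, 11, 6, 1, 0),    # 50-min coffee break: another new session
]
print(len(sessionize(hits)))  # 3
```

Four hits from one person become three sessions, which is exactly the kind of arbitrariness the transcript is describing.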

3. Leaving the site and coming back

The next issue I want to talk about is if you leave the site and come back. So obviously it makes sense that if you enter the site from Google, browse for a bit, and then enter again from Bing, you might want to count that as two different sessions with two different sources. However, where this gets a little murky is with things like external payment providers.

If you had to click through from the category page to PayPal to the checkout, then unless PayPal is excluded in your referral exclusion list, this would be two sessions: one with an entrance from Google, and one with an entrance at the checkout with PayPal as the source. The last issue I want to talk about is not necessarily a way that sessions are divided, but a quirk of how they’re reported.

4. Return direct sessions

If you were to enter by Google to the category page, go on holiday and then use a bookmark or something or just type in the URL to come back, then obviously this is going to be two different sessions. You would hope that it would be one session from Google and one session from direct. That would make sense, right?

But instead, what actually happens is that, because Google Analytics and most of its reports use last non-direct click attribution, that source gets passed all the way through, so you’ve got two sessions from Google. Again, you can change this timeout period. So those are some ways that sessions work that you might not expect.

As a bonus, I want to give you some extra information about how this affects a certain metric, mainly because I want to persuade you to stop using it, and that metric is time on page.

Bonus: Three scenarios where this affects time on page

So I’ve got three different scenarios here that I want to talk you through, and we’ll see how the time on page metric works out.

I want you to bear in mind that, basically, because Google Analytics really has very little data to work with typically, they only know that you’ve landed on a page, and that sent a page view and then potentially nothing else. If you were to have a single page visit to a site, or a bounce in other words, then they don’t know whether you were on that page for 10 seconds or the rest of your life.

They’ve got no further data to work with. So what they do is they say, “Okay, we’re not going to include that in our average time on page metrics.” So we’ve got the formula of time divided by views minus exits. However, this fudge has some really unfortunate consequences. So let’s talk through these scenarios.

Example 1: Intuitive time on page = actual time on page

In the first scenario, I arrive on the page. It sends a page view. Great. Ten seconds later I trigger some kind of event that the site has added. Twenty seconds later I click through to the next page on the site. In this case, everything is working as intended in a sense, because there’s a next page on the site, so Google Analytics has that extra data of another page view 20 seconds after the first one. So they know that I was on here for 20 seconds.

In this case, the intuitive time on page is 20 seconds, and the actual time on page is also 20 seconds. Great.

Example 2: Intuitive time on page is higher than measured time on page

However, let’s think about this next example. We’ve got a page view, event 10 seconds later, except this time instead of clicking somewhere else on the site, I’m going to just leave altogether. So there’s no data available, but Google Analytics knows we’re here for 10 seconds.

So the intuitive time on page here is still 20 seconds. That’s how long I actually spent looking at the page. But the measured time or the reported time is going to be 10 seconds.

Example 3: Measured time on page is zero

The last example, I browse for 20 seconds. I leave. I haven’t triggered an event. So we’ve got an intuitive time on page of 20 seconds and an actual time on page or a measured time on page of 0.

The interesting bit is when we then come to calculate the average time on page for this page that appeared in all three examples. You would initially hope it would be 20 seconds, because that’s how long we actually spent. Your next guess, looking at the reported data that Google Analytics actually has for how long we were on these pages, might be the average of those three numbers: 10 seconds.

So that would make some sense. But what they actually do, because of this formula, is end up with 30 seconds. You’ve got the total time here, which is 30, divided by the number of views, 3, minus the 2 exits: 30 divided by (3 minus 2) is 30 divided by 1, so we’ve got 30 seconds as the average across these three page views.

That’s longer than any of the individual visits, and it doesn’t make any sense given the constituent data. So that’s just one final tip: please don’t use average time on page as a reporting metric.
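The arithmetic from the three examples above can be reproduced directly; the function is just the formula the transcript quotes, time divided by (views minus exits):

```python
def average_time_on_page(total_time, pageviews, exits):
    # Google Analytics' formula: total measured time / (pageviews - exits)
    return total_time / (pageviews - exits)

measured_times = [20, 10, 0]  # examples 1, 2, and 3 above, in seconds
exits = 2                     # examples 2 and 3 both left the site

print(average_time_on_page(sum(measured_times), len(measured_times), exits))  # 30.0
```

Thirty seconds: longer than any single visit, because the two exit pageviews contribute their measured time to the numerator while being subtracted from the denominator.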

I hope that’s all been useful to you. I’d love to hear what you think in the comments below. Thanks.

Video transcription by Speechpad.com


Log File Analysis 101 – Whiteboard Friday

Posted by BritneyMuller

Log file analysis can provide some of the most detailed insights about what Googlebot is doing on your site, but it can be an intimidating subject. In this week’s Whiteboard Friday, Britney Muller breaks down log file analysis to make it a little more accessible to SEOs everywhere.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans. Welcome to another edition of Whiteboard Friday. Today we’re going over all things log file analysis, which is so incredibly important because it really tells you the ins and outs of what Googlebot is doing on your sites.

So I’m going to walk you through the three primary areas, the first being the types of logs that you might see from a particular site, what that looks like, what that information means. The second being how to analyze that data and how to get insights, and then the third being how to use that to optimize your pages and your site.

For a primer on what log file analysis is and its application in SEO, check out our article: How to Use Server Log Analysis for Technical SEO

1. Types

So let’s get right into it. There are three primary types of logs, the primary one being Apache. But you’ll also see W3C, elastic load balancing, which you might see a lot with things like Kibana. But you also will likely come across some custom log files. So for those larger sites, that’s not uncommon. I know Moz has a custom log file system. Fastly is a custom type setup. So just be aware that those are out there.

Log data

So what are you going to see in these logs? The data that comes in is primarily in these colored ones here.

So you will hopefully for sure see:

  • the request server IP;
  • the timestamp, meaning the date and time that this request was made;
  • the URL requested, so what page are they visiting;
  • the HTTP status code, was it a 200, did it resolve, was it a 301 redirect;
  • the user agent; for us SEOs, we’re mainly looking for the Googlebot user agents.

So log files traditionally house all data (all visits from individuals and traffic), but for SEO we want to analyze the Googlebot traffic specifically. The method (GET/POST), the time taken, the client IP, and the referrer are sometimes included as well. So what does this look like? It’s kind of like glibbery gloop.

It’s a word I just made up, and it just looks like that. It’s just like bleh. What is that? It looks crazy. It’s a new language. But essentially you’ll likely see that IP, so that red IP address, that timestamp, which will commonly look like that, that method (get/post), which I don’t completely understand or necessarily need to use in some of the analysis, but it’s good to be aware of all these things, the URL requested, that status code, all of these things here.
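To make that “glibbery gloop” less intimidating, here’s a rough sketch of pulling one line of a combined-format Apache log apart in Python. The sample line and field names are mine, made up for illustration; adjust the pattern to whatever format your server actually writes:

```python
import re

# Fields of an Apache "combined" log line: IP, identity, user, timestamp,
# request (method + URL + protocol), status, bytes, referrer, user agent.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

line = ('66.249.66.1 - - [05/Nov/2018:10:15:32 +0000] '
        '"GET /shoes/womens HTTP/1.1" 200 5123 "-" '
        '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

fields = LOG_PATTERN.match(line).groupdict()
print(fields["ip"], fields["url"], fields["status"])  # 66.249.66.1 /shoes/womens 200
```

Once each line is broken into named fields like this, all the analysis below (top pages, top folders, status codes) becomes simple counting.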

2. Analyzing

So what are you going to do with that data? How do we use it? So there’s a number of tools that are really great for doing some of the heavy lifting for you. Screaming Frog Log File Analyzer is great. I’ve used it a lot. I really, really like it. But you have to have your log files in a specific type of format for them to use it.

Splunk is also a great resource, as is Sumo Logic, and I know there are a bunch of others. If you’re working with really large sites, like I have in the past, you’re going to run into problems here because the data won’t be in a common log file format. So what you can do is manually do some of this yourself, which I know sounds a little bit crazy.

Manual Excel analysis

But hang in there. Trust me, it’s fun and super interesting. So what I’ve done in the past is I will import a CSV log file into Excel, and I will use the Text Import Wizard and you can basically delineate what the separators are for this craziness. So whether it be a space or a comma or a quote, you can sort of break those up so that each of those live within their own columns. I wouldn’t worry about having extra blank columns, but you can separate those. From there, what you would do is just create pivot tables. So I can link to a resource on how you can easily do that.

Top pages

But essentially what you can look at in Excel is: Okay, what are the top pages that Googlebot hits by frequency? What are those top pages by the number of times it’s requested?

Top folders

You can also look at the top folder requests, which is really interesting and really important. On top of that, you can also look into: What are the most common Googlebot types that are hitting your site? Is it Googlebot mobile? Is it Googlebot images? Are they hitting the correct resources? Super important. You can also do a pivot table with status codes and look at that. I like to apply some of these purple things to the top pages and top folders reports. So now you’re getting some insights into: Okay, how did some of these top pages resolve? What are the top folders looking like?
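If spreadsheets aren’t your thing, the same pivot-table counts can be sketched in a few lines of plain Python. The URLs and status codes below are made up just to show the idea; in practice they’d come from your parsed log file:

```python
from collections import Counter

# Parsed Googlebot hits as (url, status) pairs; illustrative data
# standing in for the rows you'd pull out of your log file
hits = [
    ("/blog/seo-tips", 200), ("/blog/seo-tips", 200), ("/products/widget", 301),
    ("/blog/log-files", 200), ("/products/widget", 301), ("/blog/seo-tips", 404),
]

# Top pages by Googlebot request frequency (the "top pages" report)
top_pages = Counter(url for url, _ in hits).most_common()

# Top folders: count hits by the first path segment of each URL
top_folders = Counter("/" + url.split("/")[1] for url, _ in hits).most_common()

# Status codes per page (layering status codes onto the top pages report)
status_by_page = Counter(hits)

print(top_pages[0])    # the most-requested page and its hit count
print(top_folders[0])  # the most-requested folder and its hit count
```

That’s really all a pivot table is doing here: grouping rows by a field and counting them.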

You can also do that for Googlebot IPs. This is the best hack I have found with log file analysis. I will create a pivot table just with Googlebot IPs, this right here. So I will usually get, sometimes it’s a bunch of them, but I’ll get all the unique ones, and then I can go to the terminal on my computer, which is available on most standard computers.

I tried to draw it. It looks like that. But all you do is type in “host” and then you put in that IP address. You can do it in your terminal with this IP address, and you will see it resolve to a googlebot.com or google.com hostname. That verifies that it’s indeed Googlebot and not some other crawler spoofing Google. So that’s something that these tools tend to automatically take care of, but there are ways to do it manually too, which is just good to be aware of.
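As a rough sketch, that manual host lookup can also be scripted. This follows the reverse-DNS-then-forward-confirm verification approach Google documents; the hostname check is the pure-logic part, and the actual lookups will only work on a machine with network access:

```python
import socket

def looks_like_google_host(hostname):
    # Googlebot reverse-DNS hostnames end in googlebot.com or google.com
    return hostname.endswith((".googlebot.com", ".google.com"))

def is_verified_googlebot(ip):
    """Reverse DNS the IP, check the domain, then forward-confirm that
    the hostname resolves back to the same IP (requires network access)."""
    try:
        host = socket.gethostbyaddr(ip)[0]  # e.g. crawl-66-249-66-1.googlebot.com
        return looks_like_google_host(host) and ip in socket.gethostbyname_ex(host)[2]
    except (socket.herror, socket.gaierror):
        return False
```

The forward-confirmation step matters: a spoofer can fake a user agent, but they can’t make Google’s DNS point a googlebot.com hostname back at their own IP.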

3. Optimize pages and crawl budget

All right, so how do you optimize for this data and really start to enhance your crawl budget? When I say “crawl budget,” it primarily just means the number of times that Googlebot is coming to your site and the number of pages that it typically crawls. So what does that crawl budget look like, and how can you make it more efficient?

  • Server error awareness: So server error awareness is a really important one. It’s good to keep an eye on an increase in 500 errors on some of your pages.
  • 404s: Valid? Referrer?: Another thing to take a look at is all the 404s that Googlebot is finding. It’s so important to see: Okay, is that a valid 404? Does that page genuinely not exist? Or is it a page that should exist and no longer does, but you could maybe fix? If there is an error there or if it shouldn’t be there, what is the referrer? How is Googlebot finding that, and how can you start to clean some of those things up?
  • Isolate 301s and fix frequently hit 301 chains: 301s, so a lot of questions about 301s in these log files. The best trick that I’ve sort of discovered, and I know other people have discovered, is to isolate and fix the most frequently hit 301 chains. So you can do that in a pivot table. It’s actually a lot easier to do this when you have kind of paired it up with crawl data, because now you have some more insights into that chain. What you can do is you can look at the most frequently hit 301s and see: Are there any easy, quick fixes for that chain? Is there something you can remove and quickly resolve to just be like a one hop or a two hop?
  • Mobile first: You can keep an eye on mobile first. If your site has gone mobile-first, you can dig into the logs and evaluate what that looks like. Interestingly, Googlebot is still going to identify itself as that compatible Googlebot/2.1. However, it’s going to have all of the mobile implications in the parentheses before it. So I’m sure these tools can automatically detect that. But if you’re doing some of this manually, it’s good to be aware of what that looks like.
  • Missed content: So what’s really important is to take a look at: What’s Googlebot finding and crawling, and what are they just completely missing? So the easiest way to do that is to cross-compare with your site map. It’s a really great way to take a look at what might be missed and why and how can you maybe reprioritize that data in the site map or integrate it into navigation if at all possible.
  • Compare frequency of hits to traffic: This was an awesome tip I got on Twitter, and I can’t remember who said it. They said compare frequency of Googlebot hits to traffic. I thought that was brilliant, because one, not only do you see a potential correlation, but you can also see where you might want to increase crawl traffic or crawls on a specific, high-traffic page. Really interesting to kind of take a look at that.
  • URL parameters: Take a look at whether Googlebot is hitting any URLs with parameter strings. You don’t want that. It’s typically just duplicate content or something that can be assigned in the parameter section of Google Search Console. So any e-commerce site out there, definitely check that out and get that all straightened out.
  • Evaluate days, weeks, months: You can evaluate the days, weeks, and months when your site gets hit. So is there a spike every Wednesday? Is there a spike every month? It’s kind of interesting to know, though not totally critical.
  • Evaluate speed and external resources: You can evaluate the speed of the requests and if there’s any external resources that can potentially be cleaned up and speed up the crawling process a bit.
  • Optimize navigation and internal links: You also want to optimize your navigation and internal links, like I said earlier.
  • Meta noindex and robots.txt disallow: So if there are things that you don’t want in the index and if there are things that you don’t want to be crawled from your robots.txt, you can add all those things and start to help some of this stuff out as well.
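To sketch the 301-chain idea from the list above: once you’ve pulled a redirect map out of your crawl data, collapsing each chain tells you what the single-hop redirect should point at. The URLs here are hypothetical:

```python
# A redirect map pulled from crawl data: source URL -> redirect target.
# These URLs are hypothetical; in practice this comes from your crawler.
redirects = {
    "/old-page": "/interim-page",
    "/interim-page": "/newer-page",
    "/newer-page": "/final-page",
}

def resolve_chain(url, redirects, max_hops=10):
    """Follow a redirect chain to its final destination and count the hops.
    max_hops guards against redirect loops in messy real-world data."""
    hops = 0
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
    return url, hops

# A three-hop chain that should be collapsed to a single 301
final, hops = resolve_chain("/old-page", redirects)
print(final, hops)
```

Run this over your most frequently hit redirecting URLs first, since fixing those chains saves Googlebot the most wasted requests.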
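The sitemap cross-compare from the list above boils down to two set differences. These URL sets are illustrative; in practice you’d build them by parsing your XML sitemap and your filtered Googlebot log:

```python
# URLs listed in the XML sitemap vs. URLs Googlebot actually requested.
# Both sets are illustrative stand-ins for your real data.
sitemap_urls = {"/", "/blog/seo-tips", "/products/widget", "/about"}
crawled_urls = {"/", "/blog/seo-tips", "/search?q=widget"}

# Pages you want crawled that Googlebot never requested (missed content)
missed = sitemap_urls - crawled_urls

# Crawled URLs that aren't in the sitemap (parameter URLs, orphans, etc.)
unexpected = crawled_urls - sitemap_urls

print(sorted(missed))
print(sorted(unexpected))
```

The `missed` set is your candidate list for reprioritizing in the sitemap or adding to navigation; the `unexpected` set often surfaces the parameter URLs mentioned above.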

Reevaluate

Lastly, it’s really helpful to connect the crawl data with some of this data. So if you’re using something like Screaming Frog or DeepCrawl, they allow these integrations with different server log files, and it gives you more insight. From there, you just want to reevaluate. So you want to kind of continue this cycle over and over again.

You want to look at what’s going on, have some of your efforts worked, is it being cleaned up, and go from there. So I hope this helps. I know it was a lot, but I want it to be sort of a broad overview of log file analysis. I look forward to all of your questions and comments below. I will see you again soon on another Whiteboard Friday. Thanks.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in Latest News | Comments Off

Overcoming Blockers: How to Build Your Red Tape Toolkit – Whiteboard Friday

Posted by HeatherPhysioc

Have you ever made SEO recommendations that just don’t go anywhere? Maybe you run into a lack of budget, or you can’t get buy-in from your boss or colleagues. Maybe your work just keeps getting deprioritized in favor of other initiatives. Whatever the case, it’s important to set yourself up for success when it comes to the tangled web of red tape that’s part and parcel of most organizations.

In this week’s Whiteboard Friday, Heather Physioc shares her tried-and-true methods for building yourself a toolkit that’ll help you tear through roadblocks and bureaucracy to get your work implemented.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

What up, Moz fans? This is Heather Physioc. I’m the Director of the Discoverability Group at VML, headquartered in Kansas City. So today we’re going to talk about how to build your red tape toolkit to overcome obstacles to getting your search work implemented. So do you ever feel like your recommendations are overlooked, ignored, forgotten, deprioritized, or otherwise just not getting implemented?

Common roadblocks to implementing SEO recommendations

#SEOprobs

If so, you’re not alone. So I asked 140-plus of our industry colleagues about the blockers that they run into and how they overcome them.

  • Low knowledge. So if you’re anything like every other SEO ever, you might be running into low knowledge and understanding of search, either on the client side or within your own agency.
  • Low buy-in. You may be running into low buy-in. People don’t care about SEO as much as you do.
  • Poor prioritization. So other things frequently come to the top of the list while SEO keeps falling further behind.
  • High bureaucracy. So a lot of red tape or slow approvals or no advocacy within the organization.
  • Not enough budget. A lot of times it’s not enough budget, not enough resources to get the work done.
  • Unclear and overcomplicated process. So people don’t know where they fit or even how to get started implementing your SEO work.
  • Bottlenecks. And finally bottlenecks where you’re just hitting blockers at every step along the way.

So if you’re in-house, you probably said that not enough budget and resources was your biggest problem. But on the agency side or individual practitioners, they said low understanding or knowledge of search on the client side was their biggest blocker.

So a lot of the time when we run into these blockers and it seems like nothing is getting done, we start to play the blame game. We start to complain that it’s the client who held up the project, or if only the client had listened, or it’s something wrong with the client’s business.

Build out your red tape toolkit

But I don’t buy it. So we’re going to not do that. We’re going to build out our red tape toolkit. So here are some of the suggestions that came out of that survey.

1. Assess client maturity

First is to assess your client’s maturity. This could include their knowledge and capabilities for doing SEO, but also their organizational search program, the people, process, ability to plan, knowledge, capacity.

These are the problems that tend to stand in the way of getting our best work done. So I’m not going to go in-depth here because we’ve actually put out a full-length article on the Moz blog and another Whiteboard Friday. So if you need to pause, watch that and come back, no problem.

2. Speak your client’s language

So the next thing to put in your toolkit is to speak your client’s language. I think a lot of times we’re guilty of talking to fellow SEOs instead of the CMOs and CEOs who buy into our work. So unless your client is a super technical mind or they have a strong search background, it’s in our best interests to lift up and stay at 30,000 feet. Let’s talk about things that they care about, and I promise you that is not canonicalization or SSL encryption and HTTPS.

They’re thinking about ROI and their customers and operational costs. Let’s translate and speak their language. Now this could also mean using analogies that they can relate to or visual examples and data visualizations that tell the story of search better than words ever could. Help them understand. Meet them in the middle.

3. Seek greater perspective

Now let’s seek greater perspective. So what this means is SEO does not or should not operate in a silo. We’re one small piece of your client’s much larger marketing mix. They have to think about the big picture. A lot of times our clients aren’t just dedicated to SEO. They’re not even dedicated to just digital sometimes. A lot of times they have to think about how all the pieces fit together. So we need to have the humility to understand where search fits into that and ladder our SEO goals up to the brand goals, campaign goals, business and revenue goals. We also need to understand that every SEO project we recommend comes with a time and a cost associated with it.

Everything we recommend to a CMO is an opportunity cost as well for something else that they could be working on. So we need to show them where search fits into that and how to make those hard choices. Sometimes SEO doesn’t need to be the leader. Sometimes we’re the follower, and that’s okay.

4. Get buy-in

The next tool in your toolkit is to get buy-in. So there are two kinds of buy-in you can get.

Horizontal buy-in

One is horizontal buy-in. So a lot of times search is dependent on other disciplines to get our work implemented. We need copywriters. We need developers. So the number-one complaint SEOs have is not being brought in early. That’s the same complaint all your teammates on development and copywriting and everywhere else have.

Respect the expertise and the value that they bring to this project and bring them to the table early. Let them weigh in on how this project can get done. Build mockups together. Put together a plan together. Estimate the level of effort together.

Vertical buy-in

Which leads us to vertical buy-in. Vertical is up and down. When you do this horizontal buy-in first, you’re able to go to the client with a much smarter, better vetted recommendation. So a lot of times your day-to-day client isn’t the final decision maker. They have to sell this opportunity internally. So give them the tools and the voice that they need to do that by the really strong recommendation you put together with your peers and make it easy for them to take it up to their boss and their CMO and their CEO. Then you really increase the likelihood that you’re going to get that work done.

5. Build a bulletproof plan

Next, build a bulletproof plan.

Case studies

So the number-one recommendation that came out of this survey was case studies. Case studies are great. They talk about the challenge that you tried to overcome, the solution, how you actually tackled it, and the results you got out of that.

Clients love case studies. They show that you have the chops to do the work. They better explain the outcomes and the benefits of doing this kind of work, and you took the risk on that kind of project with someone else’s money first. So that’s going to reduce the perceived risk in the client’s mind and increase the likelihood that they’re going to do the work.

Make your plan simple and clear, with timelines

Another thing that helps here is building a really simple, clear plan so it’s stupid-easy for everybody who needs to be a part of it to know where they fit in and what they’re responsible for. So do the due diligence to put together a step-by-step plan and assign ownership to each step and put timelines to it so they know what pace they should be following.

Forecast ROI

Finally, forecast ROI. This is not optional. So a lot of times I think SEOs are hesitant to forecast the potential outcomes or ROI of a project because of the sheer volume of unknowns.

We live in a world of theory, and it’s very hard to commit to something that we can’t be certain about. But we have to give the client some sense of return. We have to know why we are recommending this project over others. There’s a wealth of resources out there to help you do that, even for a heavily caveated and conservative estimate, including case studies that others have published online.

Show the cost of inaction

Now sometimes forecasting the opportunity of ROI isn’t enough to light a fire for clients. Sometimes we need to show them the cost of inaction. I find that with clients the risk is not so much that they’re going to make the wrong move. It’s that they’ll make no move at all. So a lot of times we will visualize what that might look like. So we’ll show them this is the kind of growth we think that you can get if you invest and you follow this plan we put together.

Here’s what it will look like if you invest just a little to monitor and maintain, but you’re not aggressively investing in search. Oh, and here, dropping down and to the right, is what happens when you don’t invest at all. You stagnate and you get surpassed by your competitors. That can be really helpful for clients to contrast those different levels of investment and convince them to do the work that you’re recommending.

6. Use headlines & soundbites

Next use headlines, taglines, and sound bites. What we recommend is really complicated to some clients. So let’s help translate that into simple, usable language that’s memorable so they can go repeat those lines to their colleagues and their bosses and get that work sold internally. We also need to help them prioritize.

So if you’re anything like me, you love it when the list of SEO action items is about a mile long. But when we dump that in their laps, it’s too much. They get overwhelmed and bombarded, and they tune out. So instead, you are the expert consultant. Use what you know about search and know about your client to help them prioritize the single most important thing that they should be focusing on.

7. Patience, persistence, and parallel paths

Last in your toolkit, patience, persistence, and parallel paths. So getting this work done is a combination of communication, follow-up, patience, and persistence. While you’ve got your client working on this one big thing that you recommended, you can be building parallel paths, things that have fewer obstacles that you can own and run with.

They may not be as high impact as the one big thing, but you can start to get small wins that get your client excited and build momentum for more of the big stuff. But the number one thing out of all of the responses in the survey that our colleagues recommended to you is to stay strong. Have empathy and understanding for the hard decisions that your client has to make. But come with a strong, confident point of view on where to go next.

All right, gang, these are a lot of great tips to start your red tape toolkit and overcome obstacles to get your best search work done. Try these out. Let us know what you think. If you have other great ideas on how you overcome obstacles to get your best work done with clients, let us know down in the comments. Thank you so much for watching, and we’ll see you next week for another edition of Whiteboard Friday.

Video transcription by Speechpad.com


How to Create a Local Marketing Results Dashboard in Google Data Studio – Whiteboard Friday

Posted by DiTomaso

Showing clients that you’re making them money is one of the most important things you can communicate to them, but it’s tough to know how to present your results in a way they can easily understand. That’s where Google Data Studio comes in. In this week’s edition of Whiteboard Friday, our friend Dana DiTomaso shares how to create a client-friendly local marketing results dashboard in Google Data Studio from start to finish.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, Moz fans. My name is Dana DiTomaso. I’m President and partner of Kick Point. We’re a digital marketing agency way up in the frozen north of Edmonton, Alberta. We work with a lot of local businesses, both in Edmonton and around the world, and small local businesses usually have the same questions when it comes to reporting.

Are we making money?

What I’m going to share with you today is our local marketing dashboard that we share with clients. We build this in Google Data Studio because we love Google Data Studio. If you haven’t watched my Whiteboard Friday yet on how to do formulas in Google Data Studio, I recommend you hit Pause right now, go back and watch that, and then come back to this because I am going to talk about what happened there a little bit in this video.

The Google Data Studio dashboard

This is a Google Data Studio dashboard which I’ve tried to represent in the medium of whiteboard as best as I could. Picture it being a little better designed than my left-handedness can manage on a whiteboard, but you get the idea. Every local business wants to know, “Are we making money?” This is the big thing that people care about, and really every business cares about making money. Even charities, for example: money is important obviously because that’s what keeps the lights on, but there’s also perhaps a mission that they have.

But they still want to know: Are people filling out our donation form? Are people contacting us? These are important things for every business, organization, not-for-profit, whatever to understand and know. What we’ve tried to do in this dashboard is really boil it down to the absolute basics, one thing you can look at, see a couple of data points, know whether things are good or things are bad.

Are people contacting you?

Let’s start with this up here. The first thing is: Are people contacting you? Now you can break this out into separate columns. You can do phone calls and emails for example. Some of our clients prefer that. Some clients just want one mashed up number. So we’ll take the number of calls that people are getting.

If you’re using a call tracking tool, such as CallRail, you can import this in here. Emails, for example, or forms, just add it all together and then you have one single number of the number of times people contacted you. Usually this is a way bigger number than people think it is, which is also kind of cool.

Are people taking the action you want them to take?

The next thing is: Are people doing the thing that you want them to do? This is really going to decide on what’s meaningful to the client.

For example, if you have a client, again thinking about a charity, how many people filled out your donation form, your online donation form? For a psychologist client of ours, how many people booked an appointment? For a client of ours who offers property management, how many people booked a viewing of a property? What is the thing you want them to do? If they have online e-commerce, for example, then maybe this is how many sales did you have.

Maybe this will be two different things: people walking into the store versus sales. We’ve also represented that in this field: if a client has a people counter in their store, then we would pull that people counter data in here. Usually we can get the people counter data into a Google Sheet and then pull it into Data Studio. It’s not the prettiest thing in the world, but it certainly represents all their data in one place, which is really the whole point of why we do these dashboards.

Where did visitors come from, and where are your customers coming from?

People contacting you, people doing the thing you want them to do, those are the two major metrics. Then we do have a little bit deeper further down. On this side here we start with: Where did visitors come from, and where are your customers coming from? Because they’re really two different things, right? Not every visitor to the website is going to become a customer. We all know that. No one has a 100% conversion rate, and if you do, you should just retire.

Filling out the dashboard

We really need to differentiate between the two. In this case we’re looking at channel, and there probably is a better word for channel. We’re always trying to think about, “What would clients call this?” But I feel like clients are kind of aware of the word “channel” and that’s how they’re getting there. But then the next column, by default this would be called users or sessions. Both of those are kind of cruddy. You can rename fields in Data Studio, and we can call this the number of people, for example, because that’s what it is.

Then you would use users as the metric, and you would just call it number of people instead of users, because personally I hate the word “users.” It boils the humanity of a person down to a metric. Users are terrible. Call them people or visitors at least. Then unfortunately, in Data Studio, when you add a comparison field, you cannot rename it and call it “comparison.” It does this nice percentage delta, which I hate.

It’s just like a programmer clearly came up with this. But for now, we have to deal with it. Although by the time this video comes out, maybe it will be something better, and then I can go back and correct myself in the comments. But for now it’s percentage delta. Then goal percentage and then again delta. They can sort by any of these columns in Data Studio, and it’s real live data.

Put a time period on this, and people can pick whatever time period they want and then they can look at this data as much as they want, which is delightful. If you’re not delivering great results, it may be a little terrifying for you, but really you shouldn’t be hiding that anyway, right? Like if things aren’t going well, be honest about it. That’s another talk for another time. But start with this kind of chart. Then on the other side, are you showing up on Google Maps?

We use the Supermetrics Google My Business plug-in to grab this kind of information. We hook it into the customer’s Google Maps account. Then we’re looking at branded searches and unbranded searches and how many times they came up in the map pack. Usually we’ll have a little explanation here. This is how many times you came up in the map pack and search results as well as Google Maps searches, because it’s all mashed in together.

Then what happens when they find you? So number of direction requests, number of website visits, number of phone calls. Now the tricky thing is phone calls here may be captured in phone calls here. You may not want to add these two pieces of data or just keep this off on its own separately, depending upon how your setup is. You could be using a tracking number, for example, in your Google My Business listing and that therefore would be captured up here.

Really just try to be honest about where that data comes from instead of double counting. You don’t want to have that happen. The last thing is if a client has messages set up, then you can pull that message information as well.

Tell your clients what to do

Then at the very bottom of the report we have a couple of columns, and usually this is a longer chart and this is shorter, so we have room down here to do this. Obviously, my drawing skills are not as good as aligning things in Data Studio, so forgive me.

But we tell them what to do. Usually when we work with local clients, they can’t necessarily afford a monthly retainer for us to do stuff for them forever. Instead, we tell them, “Here’s what you have to do this month. Here’s what you have to do next month. Hey, did you remember you’re supposed to be blogging?” That sort of thing. Just put it in here, because clients are looking at results, but they often forget the things that may get them those results. This is a really nice reminder that if you’re not happy with these numbers, maybe you should do these things.

Tell your clients how to use the report

Then the next thing is how to use the report. This is a good reference because, if they only open the report say once every couple of months, they’ve probably forgotten how to do things in it, even simple things like setting the time period up at the top. This is a good reminder of how to do that as well.

Because the report is totally editable by you at any time, you can always go in and change stuff later, and because the client can view the report at any time, they have a dashboard that is extremely useful to them and they don’t need to bug you every single time they want to see a report. It saves you time and money. It saves them time and money. Everybody is happy. Everybody is saving money. I really recommend setting up a really simple dashboard like this for your clients, and I bet you they’ll be impressed.

Thanks so much.

Video transcription by Speechpad.com


Faceted Navigation Intro – Whiteboard Friday

Posted by sergeystefoglo

The topic of faceted navigation is bound to come up at some point in your SEO career. It’s a common solution to product filtering for e-commerce sites, but managing it on the SEO side can quickly spin out of control with the potential to cause indexing bloat and crawl errors. In this week’s Whiteboard Friday, we welcome our friend Sergey Stefoglo to give us a quick refresher on just what faceted nav is and why it matters, then dive into a few key solutions that can help you tame it.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans. My name is Serge. I’m from Distilled. I work at the Seattle office as a consultant. For those of you that don’t know about Distilled, we’re a full-service digital marketing agency specializing in SEO, but have branched out since to work on all sorts of things like content, PR, and recently a split testing tool, ODN.

Today I’m here to talk to you guys about faceted navigation, just the basics. We have a few minutes today, so I’m just going to cover kind of the 101 version of this. But essentially we’re going to go through what the definition is, why we should care as SEOs, why it’s important, what are some options we have with this, and then also what a solution could look like.

1. What is faceted navigation?

For those that don’t know, faceted navigation is essentially something like this, probably a lot nicer than this to be honest. But it’s essentially a page that allows you to filter down or allows a user to filter down based on what they’re looking for. So this is an example we have here of a list of products on a page that sells laptops, Apple laptops in this case.

Right here on the left side, in the green, we have a bunch of facets. Essentially, if you’re a user and you’re going in here, you could look at the size of the screen you might want. You could look at the price of the laptop, etc. That’s what faceted navigation is. When I worked at my previous agency, I worked on a lot of local SEO things, not really e-commerce or big-scale websites, so I didn’t run into this issue often. I actually didn’t even know it was a thing until I started at Distilled. So this might be interesting for you even if it doesn’t apply at the moment.

2. Why does faceted navigation matter?

Essentially, we should care as SEOs because this can get out of control really quickly. While faceted navigation is very useful to users, since obviously it’s helpful to be able to filter down to the specific thing you want, this can get kind of ridiculous for Googlebot.

Faceted navigation can result in indexing bloat and crawl issues

We’ve had clients come to us at Distilled, e-commerce brands with millions of pages in the index being crawled that really shouldn’t be. Those pages don’t bring any value to the site, any revenue, etc. The main reason we should care is that we want to avoid indexation bloat and crawl errors or issues.

3. What options do we have when it comes to controlling which pages are indexed/crawled?

The third thing we’ll talk about is what are some options we have in terms of controlling some of that, so controlling whether a page gets indexed or crawled, etc. I’m not going to get into the specifics of each of these today, but I have a blog post on this topic that we’ll link to at the bottom.

The most common options that we have for controlling this kind of thing would be noindexing a page to stop Google from indexing it, using canonical tags to choose the page that’s essentially the canonical version, using a disallow rule in robots.txt to stop Google from crawling a certain part of the site, or using the nofollow meta directive. Those are some of the most common options. Again, we’re not going to go into the nitty-gritty of each one. They each have their pros and cons, so you can research those for yourselves.

4. What could a solution look like?

So okay, we know all of this. What could an ideal solution look like? Before I jump into this, I don't want you guys to run to your bosses and say, "This is what we need to do."

Please, please do your research beforehand because it’s going to vary a lot based on your site. Based on the dev resources you have, you might have to get scrappy with it. Also, do some keyword research mainly around the long tail. There are a lot of instances where you could and might want to have three or four facets indexed.

So again, a huge caveat: this isn't the end-all be-all solution. It's something that we've recommended at times, when appropriate, to clients. So let's jump into what an ideal, or at least a possible, solution could look like.

Category, subcategory, and sub-subcategory pages open to indexing and crawling

What we’re looking at here is we’re going to have our category, subcategory, and sub-subcategory pages open to indexation and open to being crawled. In our example here, that would be this page, so /computers/laptops/apple. Perfectly fine. People are probably searching for Apple laptops. In fact, I know they are.

Any page with exactly one facet selected = indexed; once a facet is selected, facet links get nofollowed

The second step here is any page that has one facet selected, so for example, if I was on this page and I wanted an Apple laptop with a solid state drive in it, I would select that from these options. Those are fine to be indexed. But any time you have one or more facets selected, we want to make sure to nofollow all of these internal links pointing to other facets, essentially to stop link equity from being wasted and to stop Google from wasting time crawling those pages.

Any pages with 2+ facets selected = noindex tag gets added

Then, past that point, if a user selects two or more facets, say I was interested in an Apple laptop with a solid state drive in the $1,000 price range, the chances of there being a lot of search volume for an Apple laptop for $1,000 with a solid state drive are pretty low.

So what we want to do here is add a noindex tag to those two-plus facet options, and that will again help us control crawl bloat and indexation bloat.
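Pulling the whole rule together, the logic is simple enough to sketch in a few lines. The two-facet threshold is this post's suggestion, not a universal standard, so treat it as an assumption to validate against your own keyword research:

```python
def robots_meta(facet_count):
    # Sketch of the rule described above: category, subcategory, and
    # single-facet pages stay indexable; any page with two or more
    # facets selected gets a noindex tag to control index bloat.
    if facet_count >= 2:
        return '<meta name="robots" content="noindex">'
    return '<meta name="robots" content="index, follow">'
```

So /computers/laptops/apple (zero facets) and the solid-state-drive page (one facet) stay in the index, while the SSD-plus-$1,000-price-range combination (two facets) gets noindexed.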

Already set up faceted nav? Think about keyword search volume, then go back and whitelist

The final thing I want to mention here, I touched on it a little bit earlier. But essentially, if you're doing this after the fact, after the faceted navigation is already set up, which you probably are, it's worth, again, having a good think about where there is keyword search volume. If you do this, it's also worth checking back a few months in to see the impact and whether there's anything you might want to whitelist. There might be a certain set of facets that do have search volume, so you might want to throw them back into the index.

That’s what faceted navigation is as a quick intro. Thank you for watching. I’d be really interested to hear what you guys think in the comments. Again, like I said, there isn’t a one-size-fits-all solution. So I’d be really interested to hear what’s worked for you, or if you have any questions, please ask them below.

Thank you.

Video transcription by Speechpad.com


