Tag Archive | "Pages"

Do You Need Local Pages? – Whiteboard Friday

Posted by Tom.Capper

Does it make sense for you to create local-specific pages on your website? Regardless of whether you own or market a local business, it may make sense to compete for space in the organic SERPs using local pages. Please give a warm welcome to our friend Tom Capper as he shares a 4-point process for determining whether local pages are something you should explore in this week’s Whiteboard Friday!


Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hello, Moz fans. Welcome to another Whiteboard Friday. I’m Tom Capper. I’m a consultant at Distilled, and today I’m going to be talking to you about whether you need local pages. Just to be clear right off the bat what I’m talking about, I’m not talking about local rankings as we normally think of them, the local map pack results that you see in search results, the Google Maps rankings, that kind of thing.

A 4-step process to deciding whether you need local pages

I’m talking about conventional, 10 blue links rankings but for local pages, and by local pages I mean pages from a national or international business that are location-specific. What are some examples of that? Maybe on Indeed.com they would have a page for jobs in Seattle. Indeed doesn’t have a bricks-and-mortar premises in Seattle, but they do have a page that is about jobs in Seattle.

You might get a similar thing with flower delivery. You might get a similar thing with used cars, all sorts of different verticals. I think it can actually be quite a broadly applicable tactic. There’s a four-step process I’m going to outline for you. The first step is actually not on the board. It’s just doing some keyword research.

1. Know (or discover) your key transactional terms

I haven’t done much on that here because hopefully you’ve already done that. You already know what your key transactional terms are. Whatever happens, you don’t want to end up developing location pages for too many different keyword types, because that’s going to bloat your site, so you probably just need to pick one or two key transactional terms to build your local variants from. For this purpose, I’m going to talk through an SEO job board as an example.

2. Categorize your keywords as implicit, explicit, or near me and log their search volumes

We might have “SEO jobs” as our core head term. We then want to figure out what the implicit, explicit, and near me versions of that keyword are and what the different volumes are. In this case, the implicit version is probably just “SEO jobs.” If you search for “SEO jobs” now, like if you open a new tab in your browser, you’re probably going to find that a lot of locally oriented results appear, because that is an implicitly local term. An awful lot of terms are using local data to affect rankings now, which does affect how you should consider your rank tracking, but we’ll get to that later.

“SEO jobs,” maybe “SEO vacancies,” that kind of thing, those are all going to be going into your implicitly local terms bucket. The next bucket is your explicitly local terms. That’s going to be things like “SEO jobs in Seattle,” “SEO jobs in London,” and so on. You’re never going to get complete coverage of different locations. Try to keep it simple.

You’re just trying to get a rough idea here. Lastly you’ve got your near me or nearby terms, and it turns out that for SEO jobs not many people search SEO jobs near me or SEO jobs nearby. This is also going to vary a lot by vertical. I would imagine that if you’re in food delivery or something like that, then that would be huge.
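If you have a long keyword list, this bucketing can be scripted. Below is a minimal sketch, assuming a hand-rolled city list and illustrative search volumes; swap in your own keyword export and location names.

```python
# Bucket keywords into implicit, explicit, and near-me groups.
# City list, keywords, and volumes are hypothetical examples.
CITIES = {"seattle", "london", "portland"}

def bucket_keyword(keyword: str) -> str:
    kw = keyword.lower()
    if "near me" in kw or "nearby" in kw:
        return "near me"
    if any(city in kw for city in CITIES):
        return "explicit"
    return "implicit"  # no location modifier: implicitly local

keywords = {
    "seo jobs": 1900,           # monthly volumes are illustrative
    "seo vacancies": 300,
    "seo jobs in seattle": 90,
    "seo jobs near me": 30,
}

totals = {}
for kw, volume in keywords.items():
    bucket = bucket_keyword(kw)
    totals[bucket] = totals.get(bucket, 0) + volume

print(totals)  # {'implicit': 2200, 'explicit': 90, 'near me': 30}
```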

3. Examine the SERPs to see whether local-specific pages are ranking

Now we’ve categorized our keywords. We want to figure out what kind of results are going to do well for what kind of keywords, because obviously if local pages are the answer, then we might want to build some.

In this case, I’m looking at the SERP for “SEO jobs.” This is imaginary. The rankings don’t really look like this. But we’ve got SEO jobs in Seattle from Indeed. That’s an example of a local page, because this is a national business with a location-specific page. Then we’ve got SEO jobs Glassdoor. That’s a national page, because in this case they’re not putting anything on this page that makes it location specific.

Then we’ve got SEO jobs Seattle Times. That’s a local business. The Seattle Times only operates in Seattle. It probably has a bricks-and-mortar location. If you’re going to be pulling a lot of data of this type, maybe from a rank tracker like STAT, tracking from the locations that you’re mentioning, then you’re probably going to want to categorize these at scale rather than going through one at a time.

I’ve drawn up a little flowchart here that you could encapsulate in an Excel formula or something like that. It’s just a rule of thumb, but most of the time it holds: if the location is mentioned in the URL and in the domain, then we know we’ve got a local business. If the location is mentioned in the URL but not in the domain, then we know we’ve got a local page, and so on.
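In code rather than an Excel formula, that rule of thumb might look something like this sketch. The URL parsing is deliberately crude, and the example URLs are placeholders.

```python
from urllib.parse import urlparse

def classify_result(url: str, location: str) -> str:
    """Rule-of-thumb classifier from the flowchart: local business,
    local page, or national page. Deliberately simplified."""
    parsed = urlparse(url.lower())
    loc = location.lower().replace(" ", "")
    in_domain = loc in parsed.netloc.replace("-", "")
    in_path = loc in parsed.path.replace("-", "")
    if in_domain:
        return "local business"  # e.g. seattletimes.com
    if in_path:
        return "local page"      # e.g. indeed.com/jobs-in-seattle
    return "national page"       # location not mentioned at all

# Hypothetical example URLs:
print(classify_result("https://www.indeed.com/jobs-in-seattle", "Seattle"))   # local page
print(classify_result("https://www.seattletimes.com/jobs/seo", "Seattle"))    # local business
print(classify_result("https://www.glassdoor.com/seo-jobs", "Seattle"))       # national page
```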

4. Compare & decide where to focus your efforts

You can just sort of categorize at scale all the different result types that we’ve got. Then we can start to fill out a chart like this using the rankings. What I’d recommend doing is finding a click-through rate curve that you are happy to use. You could go to somewhere like AdvancedWebRanking.com, download some example click-through rate curves.

Again, this doesn’t have to be super precise. We’re looking to get a proportionate directional indication of what would be useful here. I’ve got Implicit, Explicit, and Near Me keyword groups. I’ve got Local Business, Local Page, and National Page result types. Then I’m just figuring out what the visibility share of all these types is. In my particular example, it turns out that for explicit terms, it could be worth building some local pages.
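Here’s a hedged sketch of that visibility-share tally. The CTR curve and ranking rows are made-up placeholders; substitute a downloaded curve and your own rank-tracking export.

```python
# Tally visibility share by keyword group and result type.
# CTR curve and SERP rows are illustrative placeholders.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

# (keyword group, result type, ranking position) per tracked keyword
serp_rows = [
    ("explicit", "local page", 1),
    ("explicit", "national page", 2),
    ("implicit", "national page", 1),
    ("implicit", "local business", 2),
    ("near me", "local business", 1),
]

share = {}
for group, result_type, position in serp_rows:
    ctr = CTR_BY_POSITION.get(position, 0.02)  # flat estimate for the tail
    share[(group, result_type)] = share.get((group, result_type), 0.0) + ctr

for (group, result_type), visibility in sorted(share.items()):
    print(f"{group:>8} | {result_type:<14} | {visibility:.2f}")
```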

That’s all. I’d love to hear your thoughts in the comments. Thanks.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in Latest News | Comments Off

Digital Marketing News: Facebook’s Playable Ads & Business Pages Update, Gen Z Mom Trends, & B2B’s Video Uptick

Facebook Business Pages

Facebook redesigns biz Pages for utility as feed reach declines
Facebook has released a slew of changes to its popular Business Pages offering, including updates to mobile, recommendations, events, jobs, and Facebook Local. The updates bring marketers new opportunities along with the need to re-think certain strategies that may no longer be relevant. TechCrunch

Twitter loses ability to let users auto-post tweets & retweets to Facebook
Facebook changed how its API is utilized by some 60K apps, including Twitter’s, doing away with cross-posted auto-tweets unless going through the more limited posting options of Facebook’s Share feature. Marketing Land

Move Over Millennials: It’s Time To Discuss How To Win With Generation Z Moms
An examination of digital native Gen Z moms and their online brand engagement traits and habits. Forbes

Making B2B video content work: marketers from LinkedIn, Dailymotion and The Smalls share best practices
Marketers from LinkedIn (client), The Smalls, and Dailymotion take a serious look at what’s working in B2B video marketing, what isn’t, and why. The Drum

Facebook launches playable ads, tests retention optimization for app advertising
With Facebook’s recent launch, are playable ads likely to make their way into other, non-gaming areas of digital marketing? Marketing Land

‘Better ROI than influencers’: Meme accounts attract growing interest on Instagram
Brand and publisher partnerships look at engagement via meme, where even small follower counts can produce high engagement rates. Digiday


We Analyzed 43 Million Facebook Posts From the Top 20,000 Brands (New Research)
A new study from Buffer and BuzzSumo examined Facebook posts from some 20,000 top brands, and results show posting volume has been up while page engagement has decreased. Buffer

Snapchat launches ad marketplace for Discover partners & brings Commercials to Ads Manager
Snapchat’s Private Marketplace and non-skippable ad options were among several new beta features recently rolled out to publishers. Marketing Land

ON THE LIGHTER SIDE:


A lighthearted look at the ROI of marketing by Marketoonist Tom Fishburne — Marketoonist

Anti-Poser CAPTCHA Asks User to Click ‘Every Real Punk Band’ — The Hard Times

TOPRANK MARKETING & CLIENTS IN THE NEWS:

  • TopRank Marketing — Top 10 Content Marketing Blogs on the Internet Today — Blogging.org
  • Lee Odden — 50 Tips for Ad Agency New Business — Michael Gass
  • Lee Odden — Natural Language Generation Accelerates Content Marketing, But Keep Your Hands on the Wheel — CMSWire
  • Lee Odden — 9 Expert Guides: How to Win at Influencer Marketing — Marx Communications
  • Lee Odden — Main Stage Spotlight Speakers at Pubcon Pro Las Vegas — Pubcon

What are some of your top influencer marketing news items for this week?

Thanks for reading, and we hope you’ll join us again next week for the latest digital marketing news, and in the meantime you can follow us at @toprank on Twitter for even more timely daily news. Also, don’t miss the full video summary on our TopRank Marketing TV YouTube Channel.


Email Newsletter
Gain a competitive advantage by subscribing to the
TopRank® Online Marketing Newsletter.

© Online Marketing Blog – TopRank®, 2018. |
Digital Marketing News: Facebook’s Playable Ads & Business Pages Update, Gen Z Mom Trends, & B2B’s Video Uptick | http://www.toprankblog.com

The post Digital Marketing News: Facebook’s Playable Ads & Business Pages Update, Gen Z Mom Trends, & B2B’s Video Uptick appeared first on Online Marketing Blog – TopRank®.

Online Marketing Blog – TopRank®

Posted in Latest News | Comments Off

How a Few Pages Can Make or Break Your Website

Posted by Jeff_Baker

A prospect unequivocally disagreed with a recommendation I made recently.

I told him a few pages of content could make a significant impact on his site. Even when presented with hard numbers backing up my assertions, he still balked. My ego started gnawing at me: would a painter tell a mathematician how to do trigonometry?

Unlike art, content marketing and SEO aren’t subjective. The quality of the words you write can be quantified, and they can generate a return for your business.

Most of your content won’t do anything

In order to have this conversation, we really need to deal with this fact.

Most content created lives deep on page 7 of Google, ranking for an obscure keyword completely unrelated to your brand. A lack of scientific (objective math) process is to blame. But more on that later.

Case in point: Brafton used to employ a volume play with regard to content strategy. Volume = keyword rankings. It was spray-and-pray, and it worked.

Looking back on current performance for old articles, we find that the top 100 pages of our site (1.2% of all indexed pages) drive 68% of all organic traffic.

Further, 94.5% of all indexed pages drive five clicks or less from search every three months.

So what gives?

Here’s what has changed: easy content is a thing of the past. Writing content and “using keywords” is a plan destined for a lonely death on page 7 of the search results. The process for creating content needs to be rigorous and heavily supported by data. It needs to start with keyword research.

1. Keyword research:

Select content topics from keywords that are regularly being searched. Search volume implies interest, which guarantees what you are writing about is of interest to your target audience. The keywords you choose also need to be reasonable. Using organic difficulty metrics from Moz or SEMrush will help you determine if you stand a realistic chance of ranking somewhere meaningful.

2. SEO content writing:

Your goal is to get the page you’re writing to rank for the keyword you’re targeting. The days of using a keyword in blog posts and linking to a product landing page are over. One page, one keyword. Therefore, if you want your page to rank for the chosen keyword, that page must be the very best piece of content on the web for that keyword. It needs to be in-depth, covering a wide swath of related topics.

How to project results

Build out your initial list of keyword targets. Filter the list down to the keywords with the optimal combination of search volume, organic difficulty, SERP crowding, and searcher intent. You can use this template as a guide — just make a copy and you’re set.

Get the keyword target template

Once you’ve narrowed down your list to top contenders, tally up the total search volume potential — this is the total number of searches that are made on a monthly basis for all your keyword targets. You will not capture this total number of searches. A good rule of thumb is that if you rank, on average, at the bottom of page 1 and top of page 2 for all keywords, your estimated CTR will be a maximum of 2%. The mid-bottom of page 1 will be around 4%. The top-to-middle of page 1 will be 6%.

In the instance above, if we were to rank poorly, with a 2% CTR for 20 pages, we would drive an additional 42–89 targeted, commercial-intent visitors per month.

The website in question drives an average of 343 organic visitors per month, via a random assortment of keywords from 7,850 indexed pages in Google. At the very worst, 20 pages, or .3% of all pages, would drive 10.9% of all traffic. At best (if the client followed the steps above to a T), the .3% additional pages would drive 43.7% of all traffic!

Whoa.

That’s .3% of a site’s indexed pages driving an additional 77.6% of traffic every. single. month.
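As a sanity check, here’s a sketch of the arithmetic behind those figures. The combined volume of 2,100 monthly searches is a hypothetical figure chosen so the 2% tier reproduces the worst-case numbers above (42 extra visitors, 10.9% of traffic).

```python
baseline_visitors = 343   # current organic visitors/month (from the post)
total_volume = 2100       # hypothetical combined monthly search volume

CTR_TIERS = {
    "bottom of p1 / top of p2": 0.02,
    "mid-bottom of page 1":     0.04,
    "top-to-middle of page 1":  0.06,
}

for tier, ctr in CTR_TIERS.items():
    new_visitors = total_volume * ctr
    share_of_total = new_visitors / (baseline_visitors + new_visitors)
    print(f"{tier:<25} -> +{new_visitors:>4.0f} visits/mo, "
          f"{share_of_total:.1%} of all organic traffic")
# 2% tier: +42 visits/mo, 10.9% of traffic (the worst case above)
```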

How a few pages can make a difference

Up until now, everything we’ve discussed has been hypothetical keyword potential. Fortunately, we have tested this method with 37 core landing pages on our site (.5% of all indexed pages). The result of deploying the method above was 24 of our targeted keywords ranking on page 1, driving an estimated 716 high-intent visitors per month.

That amounts to .5% of all pages driving 7.7% of all traffic. At an average CPC of $12.05 per keyword, the total cost of paying for these keywords would be $8,628 per month.

Our 37 pages (.5% of all pages), which were a one-time investment, drive 7.7% of all traffic at an estimated value of $103,533 yearly.

Can a few pages make or break your website? You bet your butt.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

More Articles

Posted in Latest News | Comments Off

Diagnosing Why a Site’s Set of Pages May Be Ranking Poorly – Whiteboard Friday

Posted by randfish

Your rankings have dropped and you don’t know why. Maybe your traffic dropped as well, or maybe just a section of your site has lost rankings. It’s an important and often complex mystery to solve, and there are a number of boxes to check off while you investigate. In this Whiteboard Friday, Rand shares a detailed process to follow to diagnose what went wrong to cause your rankings drop, why it happened, and how to start the recovery process.

Diagnosing why a site's pages may be ranking poorly

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to talk about diagnosing a site and specifically a section of a site’s pages and why they might be performing poorly, why their traffic may have dropped, why rankings may have dropped, why both of them might have dropped. So we’ve got a fairly extensive process here, so let’s get started.

Step 1: Uncover the problem

First off, our first step is uncovering the problem, or finding whether there is actually a problem. A good way to think about this: if we’re talking about a site that’s 20 or 30 or even a couple hundred pages, this is not a big issue. But many websites that SEOs are working on these days are thousands, tens of thousands, hundreds of thousands of pages. So what I like to urge folks to do is to

A. Treat different site sections as unique segments for investigation. You should look at them individually.

A lot of times subfolders or URL structures are really helpful here. So I might say, okay, MySite.com, I’m going to look exclusively at the /news section. Did that fall in rankings? Did it fall in traffic? Or was it /posts, where my blog posts and my content is? Or was it /cities? Let’s say I have a website that’s dealing with data about the population of cities. So I rank for lots of those types of queries, and it seems like I’m ranking for fewer of them, and it’s my cities pages that are poorly performing in comparison to where they were a few months ago or last year at this time.

B. Check traffic from search over time.

So I go to my Google Analytics or whatever analytics you’re using, and you might see something like, okay, I’m going to look exclusively at the /cities section. If you can structure your URLs in this fashion, use subfolders, this is a great way to do it. Then take a look and see, oh, hang on, that’s a big traffic drop. We fell off a cliff there for these particular pages.

This data can be hiding inside your analytics because it could be that the rest of your site is performing well. It’s going sort of up and to the right, and so you see this slow plateauing or a little bit of a decline, but it’s not nearly as sharp as it is if you look at the traffic specifically for a single subsection that might be performing poorly, like this /cities section.
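If you want to script that segmentation, here’s a minimal pandas sketch, assuming you’ve exported organic landing-page sessions to CSV; the filename and column names are hypothetical.

```python
import pandas as pd

# Hypothetical analytics export: one row per landing page per month,
# with columns "landing_page", "month", "organic_sessions".
df = pd.read_csv("organic_landing_pages.csv")

# Segment by top-level subfolder, e.g. /cities/denver -> /cities
df["section"] = df["landing_page"].str.extract(r"^(/[^/]+)")

trend = (df.groupby(["section", "month"])["organic_sessions"]
           .sum()
           .unstack("month"))
print(trend.loc["/cities"])  # spot the cliff for one section
```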

From there, I’m going to next urge you to use Google Trends. Why? Why would I go to Google Trends? Because what I want you to do is I want you to look at some of your big keywords and topics in Google Trends to see if there has been a serious decline in search volume at the same time. If search demand is rising or staying stable over the course of time where you have lost traffic, it’s almost certainly something you’ve done, not something searchers are doing. But if you see that traffic has declined, for example, maybe you were ranking really well for population data from 2015. It turns out people are now looking for population data for 2016 or ’17 or ’18. Maybe that is part of the problem, that search demand has fallen and your curve matches that.
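There’s no official Google Trends API, but the community-maintained pytrends library can pull the same demand curves. A sketch (it can break whenever Google changes its endpoints):

```python
from pytrends.request import TrendReq  # pip install pytrends

pytrends = TrendReq(hl="en-US")
pytrends.build_payload(["denver population growth"], timeframe="today 5-y")
demand = pytrends.interest_over_time()

# If demand is flat or rising while your traffic fell, the problem is
# almost certainly on your end, not a drop in searcher interest.
print(demand["denver population growth"].tail(12))
```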

C. Perform some diagnostic queries or use your rank tracking data if you have it on these types of things.

This is one of the reasons I like to rank track for even these types of queries that don’t get a lot of traffic.

1. Target keywords. In this case, it might be “Denver population growth,” maybe that’s one of your keywords. You would see, “Do I still rank for this? How well do I rank for this? Am I ranking more poorly than I used to?”

2. Check brand name plus target keyword. So, in this case, it would be my site plus the above here plus “Denver population growth,” so My Site or MySite.com Denver population growth. If you’re not ranking for that, that’s usually an indication of a more serious problem, potentially a penalty or some type of dampening that’s happening around your brand name or around your website.

3. Look for a 10 to 20-word text string from page content without quotes. It could be shorter, only six or seven words, or it could be longer, 25 words if you really need it. But essentially, I want to take a string of text that exists on the page and search for it in Google, not in quotes. I do not want to use quotes here, and I want to see how it performs. This might be several lines of text here.

4. Look for a 10 to 20-word text string with quotes. So those same lines of text, but searched in Google inside quotes. If I’m not ranking for the unquoted version but I am ranking for the quoted one, I might surmise this is probably not duplicate content. It’s probably something to do with my content quality or maybe my link profile, or Google has penalized or dampened me in some way.

5. site: urlstring/ So I would search for “site:MySite.com/cities/Denver.” I would see: Wait, has Google actually indexed my page? When did they index it? Oh, it’s been a month. I wonder why they haven’t come back. Maybe there’s some sort of crawl issue, robots.txt issue, meta robots issue, something. I’m preventing Google from potentially getting there. Or maybe they can’t get there at all, and this results in zero results. That means Google hasn’t even indexed the page. Now we have another type of problem.
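If you’re running these checks across many pages, a tiny script can generate the five diagnostic queries per URL. A sketch; the example page, brand, and snippet are placeholders:

```python
def diagnostic_queries(url, brand, keyword, snippet):
    """Build the five diagnostic searches described above for one page.
    `snippet` should be a 10-20 word string copied from the page."""
    return [
        keyword,               # 1. target keyword
        f"{brand} {keyword}",  # 2. brand name + target keyword
        snippet,               # 3. page text, without quotes
        f'"{snippet}"',        # 4. page text, in quotes
        f"site:{url}",         # 5. is the page even indexed?
    ]

# Hypothetical page, brand, and snippet:
for q in diagnostic_queries(
        "mysite.com/cities/denver", "MySite", "denver population growth",
        "denver has grown faster than any other front range city since 2010"):
    print(q)
```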

D. Check your tools.

1. Google Search Console. I would start there, especially in the site issues section.

2. Check your rank tracker or whatever tool you’re using, whether that’s Moz or something else.

3. On-page and crawl monitoring. Hopefully you have something like that. It could be through Screaming Frog. Maybe you’ve run some crawls over time, or maybe you have a tracking system in place. Moz has a crawl system. OnPage.org has a really good one.

4. Site uptime. So I might check Pingdom or other things that alert me to, “Oh, wait a minute, my site was down for a few days last week. That obviously is why traffic has fallen,” those types of things.

Step 2: Offer hypothesis for falling rankings/traffic

Okay, you’ve done your diagnostics. Now it’s time to offer some hypotheses. So now that we understand which problem I might have, I want to understand what could be resulting in that problem. There are basically two situations you can have: rankings have stayed stable or gone up but traffic has fallen, or rankings and traffic have both fallen.

A. If rankings are up, but traffic is down…

In those cases, these are the five things that are most typically to blame.

1. New SERP features. There’s a bunch of featured snippets that have entered the “population growth for cities” search results, and so now number one is not what number one used to be. If you don’t get that featured snippet, you’re losing out to one of your competitors.

2. Lower search demand. Like we talked about in Google Trends. I’m looking at search demand, and there are just not as many people searching as there used to be.

3. Brand or reputation issues. I’m ranking just fine, but people now for some reason hate me. People who are searching this sector think my brand is evil or bad or just not as helpful as it used to be. So I have issues, and people are not clicking on my results. They’re choosing someone else actively because of reputation issues.

4. Snippet problems. I’m ranking in the same place I used to be, but I’m no longer the sexiest, most click-drawing snippet in the search results, and other people are earning those clicks instead.

5. Shift in personalization or location biasing by Google. It used to be the case that everyone who searched for city name plus population growth got the same results, but now suddenly people are seeing different results based on maybe their device or things they’ve clicked in the past or where they’re located. Location is often a big cause for this.

So for many SEOs for many years, “SEO consultant” resulted in the same search results. Then Google introduced the Maps results and pushed down a lot of those folks, and now “SEO consultant” results in different ranked results in each city and each geography that you search in. So that can often be a cause for falling traffic even though rankings remain high.

B. If rankings and traffic are down…

If you’re seeing that rankings have fallen and traffic has fallen in conjunction, there’s a bunch of other things that are probably going on that are not necessarily these things. A few of these could be responsible still, like snippet problems could cause your rankings and your traffic to fall, or brand and reputation issues could cause your click-through rate to fall, which would cause you to get dampened. But oftentimes it’s things like this:

1. & 2. Duplicate content and low-quality or thin content. Google thinks that what you’re providing just isn’t good enough.

3. Change in searcher intent. People who were searching for population growth used to want what you had to offer, but now they want something different and other people in the SERP are providing that, but you are not, so Google is ranking you lower. Even though your content is still good, it’s just not serving the new searcher intent.

4. Loss to competitors. So maybe you have worse links than they do now or less relevance or you’re not solving the searcher’s query as well. Your user interface, your UX is not as good. Your keyword targeting isn’t as good as theirs. Your content quality and the unique value you provide isn’t as good as theirs. If you see that one or two competitors are consistently outranking you, you might diagnose that this is the problem.

5. Technical issues. So if I saw from over here that the crawl was the problem, I wasn’t getting indexed, or Google hasn’t updated my pages in a long time, I might look into accessibility things, maybe speed, maybe I’m having problems like letting Googlebot in, HTTPS problems, or indexable content, maybe Google can’t see the content on my page anymore because I made some change in the technology of how it’s displayed, or crawlability, internal link structure problems, robots.txt problems, meta robots tag issues, that kind of stuff.

Maybe at the server level, someone on the tech ops team of my website decided, “Oh, there’s this really problematic bot coming from Mountain View that’s costing us a bunch of bandwidth. Let’s block bots from Mountain View.” No, don’t do that. Bad. Those kinds of technical issues can happen.

6. Spam and penalties. We’ll talk a little bit more about how to diagnose those in a second.

7. CTR, engagement, or pogo-sticking issues. There could be click-through rate issues or engagement issues, meaning pogo-sticking: people are coming to your site, but they are clicking back because they weren’t satisfied by your results, maybe because their expectations have changed or market issues have changed.

Step 3: Make fixes and observe results

All right. Next and last in this process, what we’re going to do is make some fixes and observe the results. Hopefully, we’ve been able to correctly diagnose and form some wise hypotheses about what’s going wrong, and now we’re going to try and resolve them.

A. On-page and technical issues should solve after a new crawl + index.

So on-page and technical issues, if we’re fixing those, they should usually resolve, especially on small sections of sites, pretty fast. As soon as Google has crawled and indexed the page, you should generally see performance improve. But this can take a few weeks if we’re talking about a large section on a site, many thousands of pages, because Google has to crawl and index all of them to get the new sense that things are fixed and traffic is coming in. Since it’s long tail to many different pages, you’re not going to see that instant traffic gain and rise as fast.

B. Link issues and spam penalty problems can take months to show results.

Look, if you have crappier links or not a good enough link profile as your competitors, growing that can take months or years even to fix. Penalty problems and spam problems, same thing. Google can take sometimes a long time. You’ve seen a lot of spam experts on Twitter saying, “Oh, well, all my clients who had issues over the last nine months suddenly are ranking better today,” because Google made some fix in their latest index rollout or their algorithm changed, and it’s sort of, okay, well we’ll reward the people for all the fixes that they’ve made. Sometimes that’s in batches that take months.

C. Fixing a small number of pages in a section that’s performing poorly might not show results very quickly.

For example, let’s say you go and you fix /cities/Milwaukee. You determine from your diagnostics that the problem is a content quality issue. So you go and you update these pages. They have new content. It serves the searchers much better, doing a much better job. You’ve tested it. People really love it. You fixed two cities, Milwaukee and Denver, to test it out. But you’ve left 5,000 other cities pages untouched.

Sometimes Google will sort of be like, “No, you know what? We still think your cities pages, as a whole, don’t do a good job solving this query. So even though these two that you’ve updated do a better job, we’re not necessarily going to rank them, because we sort of think of your site as this whole section and we grade it as a section or apply some grades as a section.” That is a real thing that we’ve observed happening in Google’s results.

Because of this, one of the things that I would urge you to do is if you’re seeing good results from the people you’re testing it with and you’re pretty confident, I would roll out the changes to a significant subset, 30%, 50%, 70% of the pages rather than doing only a tiny, tiny sample.

D. Sometimes when you encounter these issues, a remove and replace strategy works better than simply upgrading old URLs.

So if Google has decided /cities, your /cities section is just awful, has all sorts of problems, not performing well on a bunch of different vectors, you might take your /cities section and actually 301 redirect them to a new URL, /location, and put the new UI and the new content that better serves the searcher and fixes a lot of these issues into that location section, such that Google now goes, “Ah, we have something new to judge. Let’s see how these location pages on MySite.com perform versus the old cities pages.”

So I know we’ve covered a ton today and there are a lot of diagnostic issues that we haven’t necessarily dug deep into, but I hope this can help you if you’re encountering rankings challenges with sections of your site or with your site as a whole. Certainly, I look forward to your comments and your feedback. If you have other tips for folks facing this, that would be great. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Find More Articles

Posted in Latest News | Comments Off

Designing a Page’s Content Flow to Maximize SEO Opportunity – Whiteboard Friday

Posted by randfish

Controlling and improving the flow of your on-site content can actually help your SEO. What’s the best way to capitalize on the opportunity present in your page design? Rand covers the questions you need to ask (and answer) and the goals you should strive for in today’s Whiteboard Friday.

Designing a page's content flow to maximize SEO opportunity

Click on the whiteboard image above to open a high-resolution version in a new tab!


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about designing a page’s content flow to help with your SEO.

Now, unfortunately, somehow in the world of SEO tactics, this one has gotten left by the wayside. I think a lot of people in the SEO world are investing in things like content and solving searchers’ problems and getting to the bottom of searcher intent. But unfortunately, the page design and the flow of the elements, the UI elements, the content elements that sit on a page, are discarded or left aside. That’s unfortunate because it can actually make a huge difference to your SEO.

Q: What needs to go on this page, in what order, with what placement?

So if we’re asking ourselves like, “Well, what’s the question here?” Well, it’s what needs to go on this page. I’m trying to rank for “faster home Wi-Fi.” Right now, Lifehacker and a bunch of other people are ranking in these results. It gets a ton of searches. I can drive a lot of revenue for my business if I can rank there. But what needs to go on this page in what order with what placement in order for me to perform the best that I possibly can? It turns out that sometimes great content gets buried in a poor page design and poor page flow. But if we want to answer this question, we actually have to ask some other ones. We need answers to at least these three:

A. What is the searcher in this case trying to accomplish?

When they enter “faster home Wi-Fi,” what’s the task that they want to get done?

B. Are there multiple intents behind this query, and which ones are most popular?

What’s the popularity of those intents in what order? We need to know that so that we can design our flow around the most common ones first and the secondary and tertiary ones next.

C. What’s the business goal of ranking? What are we trying to accomplish?

That’s always going to have to be balanced out with what is the searcher trying to accomplish. Otherwise, in a lot of cases, there’s no point in ranking at all. If we can’t get our goals met, we should just rank for something else where we can.

Let’s assume we’ve got some answers:

Let’s assume that, in this case, we have some good answers to these questions so we can proceed. So pretty simple. If I search for “faster home Wi-Fi,” what I want is usually going to be…

A. Faster download speed at home.

That’s what the searcher is trying to accomplish. But there are multiple intents behind this. Sometimes the searcher is looking to do that…

B1. With their current ISP and their current equipment.

They want to know things they can optimize that don’t cause them to spend money. Can they place their router in different places? Can they change out a cable? Do they need to put it in a different room? Do they need to move their computer? Is the problem something else that’s interfering with their Wi-Fi in their home that they need to turn off? Those kinds of issues.

B2. With a new ISP.

Or can they get a new ISP? They might be looking for an ISP that can provide them with faster home internet in their area, and they want to know what’s available, which is a very different intent than the first one.

B3. With current ISP but new equipment.

Maybe they want to keep their ISP, but they are willing to upgrade to new equipment. So they’re looking for the equipment they could buy to make their current ISP faster. In many cases in the United States, sadly, there’s only one ISP that can provide service in a given area, so they can’t change ISP, but they can change out their equipment.

C. Affiliate revenue with product referrals.

Let’s assume that (C) is we know that what we’re trying to accomplish is affiliate revenue from product referrals. So our business is basically we’re going to send people to new routers or the Google Mesh Network home device, and we get affiliate revenue by passing folks off to those products and recommending them.

Now we can design a content flow.

Okay, fair enough. We now have enough to be able to take care of this design flow. The design flow can involve lots of things. There are a lot of things that could live on a page, everything from navigation to headline to the lead-in copy or the header image or body content, graphics, reference links, the footer, a sidebar potentially.

The elements that go in here are not actually what we’re talking about today. We can have that conversation too. I want a headline that’s going to tell people that I serve all of these different intents. I want to have a lead-in that has a potential to be the featured snippet in there. I want a header image that can rank in image results and be in the featured snippet panel. I’m going to want body content that serves all of these in the order that’s most popular. I want graphics and visuals that suggest to people that I’ve done my research and I can provably show that the results that you get with this different equipment or this different ISP will be relevant to them.

But really, what we’re talking about here is the flow that matters. The content itself, the problem is that it gets buried. What I see many times is folks will take a powerful visual or a powerful piece of content that’s solving the searcher’s query and they’ll put it in a place on the page where it’s hard to access or hard to find. So even though they’ve actually got great content, it is buried by the page’s design.

5 big goals that matter.

The goals that matter here and the ones that you should be optimizing for when you’re thinking about the design of this flow are:

1. How do I solve the searcher’s task quickly and enjoyably?

So that’s about user experience as well as the UI. I know that many people are going to want to see a list, and in fact the result that’s ranking up here at the top is Lifehacker’s top 10 list for how to get your home Wi-Fi faster. They include things like upgrading your ISP, and here’s a tool to see what’s available in your area. They include maybe you need a better router, and here are the best ones. Maybe you need a different network or something that expands your network in your home, and here’s a link out to those. So they’re serving that purpose up front, up top.

2. Serve these multiple intents in the order of demand.

So if we can intuit that most people want to stick with their ISP but are willing to change equipment, we can serve that intent first (B3). We can serve the current-setup optimizations second (B1), and we can serve changing ISP third (B2), which is actually the ideal fit in this scenario for us. That helps us…

3. Optimize for the business goal without sacrificing one and two.

I would urge you to design generally with the searcher in mind and if you can fit in the business goal, that is ideal. Otherwise, what tends to happen is the business goal comes first, the searcher comes second, and you come tenth in the results.

4. If possible, try to claim the featured snippet and the visual image that go up there.

That means using the lead-in up at the top. It’s usually the first paragraph or the first few lines of text in an ordered or unordered list, along with a header image or visual in order to capture that featured snippet. That’s very powerful for search results that are still showing it.

5. Limit our bounce back to the SERP as much as possible.

In many cases, this means limiting some of the UI or design flow elements that hamper people from solving their problems or that annoy or dissuade them. So, for example, advertising that pops up or overlays that come up before I’ve gotten two-thirds of the way down the page really tend to hamper efforts, really tend to increase this bounce back to the SERP (what search engines call pogo-sticking) and can harm your rankings dramatically. Design flows where the content that actually solves the problem is below an advertising block or a promotional block are also very limiting.

So to the degree that we can control the design of our pages and optimize for that, we can actually take existing content that you might already have and improve its rankings without having to remake it, without needing new links, simply by improving the flow.

I hope we’ll see lots of examples of those in the comments, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in Latest News | Comments Off

How to Diagnose Pages that Rank in One Geography But Not Another – Whiteboard Friday

Posted by randfish

Are you ranking pretty well in one locale, only to find out your rankings tank in another? It’s not uncommon, even for sites without an intent to capture local queries. In today’s Whiteboard Friday, Rand shows you how to diagnose the issue with a few clever SEO tricks, then identify the right strategy to get back on top.

Diagnosing why a page ranks in one geography but not another

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to this edition of Whiteboard Friday. This week we’re going to chat about rankings that differ from geography to geography. Many of you might see that you are ranking particularly well in one city, but when you perform that search in another city or in another country perhaps, that still speaks the same language and has very similar traits, that maybe you’re not performing well.

Maybe you do well in Canada, but you don’t do well in the United States. Maybe you do well in Portland, Oregon, but you do poorly in San Diego, California. Sometimes you might be thinking to yourself, “Well, wait, this search is not particularly local, or at least I didn’t think of it as being particularly local. Why am I ranking in one and not the other?” So here’s a process that you can use to diagnose.

Confirm the rankings you see are accurate:

The first thing we need to do is confirm that the rankings you see or that you’ve heard about are accurate. This is actually much more difficult than it used to be. It used to be you could scroll to the bottom of Google and change your location to whatever you wanted. Now Google will geolocate you by your IP address or by a precise location on your mobile device, and unfortunately you can’t just specify one particular location or another — unless you know some of these SEO hacks.

A. Google’s Ad Preview Tool – Google has an ad preview tool, where you can specify and set a particular location. That’s at AdWords.Google.com slash a bunch of junk slash ad preview. We’ll make sure that the link is down in the notes below.

B. The ampersand-near-equals parameter (&near=) – Now, some SEOs have said that this is not perfect, and I agree it is imperfect, but it is pretty close. We’ve done some comparisons here at Moz. I’ve done them while I’m traveling. It’s not bad. Occasionally, you’ll see one or two things that are not the same. The advertisements are frequently not the same. In fact, they don’t seem to work well. But the organic results look pretty darn close. The maps results look pretty darn close. So I think it’s a reasonable tool that you can use.

That is by basically changing the Google search query — so this is the URL in the search query — from Google.com/search?q= and then you might have ice+cream or WordPress+web+design, and then you use this, &near= and the city and state here in the United States or city and province in Canada or city and region in another country. In this case, I’m going with Portland+OR. This will change my results. You can give this a try yourself. You can see that you will see the ice cream places that are in Portland, Oregon, when you perform this search query.

For countries, you can use another approach. You can either go directly to the country-code version of Google, so for the UK Google.co.uk, for New Zealand Google.co.nz, or for Canada Google.ca, and then type your query in. You can also use the &gl= parameter instead of &near. This is “global location” equals the country code, and then you could put in CA for Canada, UK for the UK, or NZ for New Zealand.
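Here’s a small sketch of building those geo-modified URLs. Note that neither &near= nor &gl= is officially documented for this purpose, so treat the output as a diagnostic aid that may stop working at any time:

```python
from urllib.parse import urlencode

def geo_search_url(query, near=None, country=None):
    """Build a Google search URL with the unofficial geo parameters.
    `near` is a city string; `country` is a two-letter gl code."""
    params = {"q": query}
    if near:
        params["near"] = near    # e.g. "Portland, OR"
    if country:
        params["gl"] = country   # e.g. "nz", "ca", "uk"
    return "https://www.google.com/search?" + urlencode(params)

print(geo_search_url("wordpress web design", near="Portland, OR"))
print(geo_search_url("ice cream", country="nz"))
```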

C. The MozBar’s search profiles – You can also do this with the MozBar. The MozBar kind of hacks the near parameter for you, and you can just specify a location and create a search profile. Do that right inside the MozBar. That’s one of the very nice things about using it.

D. Rank tracking with a platform that supports location-specific rankings – Some of them don’t, some of them do. Moz does right now. I believe Searchmetrics does if you use the enterprise version. Oh, I’m trying to remember if Rob Bucci said STAT does. Well, Rob will answer in the comments, and he’ll tell us whether STAT does. I think that they do.

Look at who IS ranking and what features they may have:

So next, once you’ve figured out whether this ranking anomaly you perceive is real or not, step two is to look at who is ranking in the geography where you’re not and figure out what factors they might have going for them.

  • Have they gotten a lot of local links, location-specific links from these websites that are in that specific geography or serve that geography, local chambers of commerce, local directories, those kinds of things?
  • Do they have a more hyper-local service area? On a map, if this is the city, do they serve that specific region? Maybe you serve a broad set of locations all over the place and don’t have a geo-specific region that you’re serving.
  • Do they have localized listings, listings in places like where Moz Local or a competitor like Yext or Whitespark might push all their data to? Those could be things like Google Maps and Bing Maps, directories, local data aggregators, Yelp, TripAdvisor, etc., etc.
  • Do they have rankings in Google Maps? If you go and look and you see that this website is ranking particularly well in Google Maps for that particular region and you are not, that might be another signal that hyper-local intent and hyper-local ranking signals, ranking algorithm is in play there.
  • Are they running local AdWords ads? I know this might seem like, “Wait a minute. Rand, I thought ads were not directly connected to organic search results.” They’re not, but it tends to be the case that if you bid on AdWords, you tend to increase your organic click-through rate as well, because people see your ad up at the top, and then they see you again a second time, and so they’re a little more biased to click. Therefore, buying local ads can sometimes increase organic click-through rate as well. It can also brand people with your particular business. So that is one thing that might make a difference here.

Consider location-based searcher behaviors:

Now we’re not considering who is ranking, but we’re considering who is doing the searching, these location-based searchers and what their behavior is like.

  • Are they less likely to search for your brand because you’re not as well known in that region?
  • Are they less likely to click your site in the SERPs because you’re not as well known?
  • Is their intent somehow different because of their geography? Maybe there’s a language issue or a regionalism of some kind. This could be a local language thing even here in the United States, where parts of the country say “soda” and parts of the country say “pop.” Maybe those mean two different things, and “pop” means, “Oh, it’s a popcorn store in Seattle,” because there’s the Pop brand, but in the Midwest, “pop” clearly refers to types of soda beverages.
  • Are they more or less sensitive to a co-located solution? So it could be that in many geographies, a lot of your market doesn’t care about whether the solution that they’re getting is from their local region, and in others it does. A classic one on a country level is France, whose searchers tend to care tremendously more that they are getting .fr results and that the location of the business they are clicking on is in France versus other folks in Europe who might click a .com or a .co.uk with no problem.


Divide into three buckets:

You’re going to divide the search queries that you care about that have these challenges into three different types of buckets:

Bucket one: Hyper-geo-sensitive

This would be sort of the classic geo-specific search, where you see maps results right up at the top. The SERPs change completely from geo to geo. So if you perform the search in Portland and then you perform it in San Diego, you see very, very different results. Seven to nine of the top ten at least are changing up, and it’s the case that almost no non-local listings are showing in the top five results. When you see these, this is probably non-targetable without a physical location in that geography. So if you don’t have a physical location, you’re kind of out of business until you get there. If you do, then you can work on the local ranking signals that might be holding you back.

Bucket two: Semi-geo-sensitive

I’ve actually illustrated this one over here, because this can be a little bit challenging to describe. But basically, you’re getting a mix of geo-specific and global results. So, for example, I use the &near=Portland, Oregon, because I’m in Seattle and I want to see Portland’s results for WordPress web design.

WordPress web design, when I do the search all over the United States, the first one or two results are pretty much always the same. They’re always this Web Savvy Marketing link and this Creative Bloq, and they’re very broad. They are not specifically about a local provider of WordPress web design.

But then you get to number three and four and five, and the results change to be local-specific businesses. So in Portland, it’s these Mozak Design guys. Mozak, no relation to Moz, to my knowledge anyway. In San Diego, it’s Kristin Falkner, who’s ranking number three, and then other local San Diego WordPress web design businesses at four and five. So it’s kind of this mix of geo and non-geo. You can generally tell this by looking and changing your geography in this fashion seeing those different things.

Some of the top search results usually will be like this, and they’ll stay consistent from geography to geography. In these cases, what you want to do is work on boosting those local-specific signals. So if you are ranking number five or six and you want to be number three, go for that, or you can try and be in the global results, in which case you’re trying to boost the classic ranking signals, not the local ones so you can get up there.

Bucket three: Non-geo-sensitive

Those would be, “I do this search, and I don’t see any local-specific results.” It’s just a bunch of nationwide or worldwide brands. There are no maps, usually only one, maybe two geo-specific results in the top 10, and they tend to be further down, and the SERPs barely change from geo to geo. They’re pretty much the same throughout the country.

So once you put these into these three buckets, then you know which thing to do. Here, it’s pursue classic signals. You probably don’t need much of a local boost.

Here, you have the option of going one way or the other, boosting local signals to get into these rankings or boosting the classic signals to get into those global ones.

Here you’re going to need the physical business.
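If you track rankings in two or three test geographies, you can assign these buckets at scale by measuring how much the top 10 overlaps between locations. A sketch with made-up URL lists and thresholds:

```python
def serp_overlap(serp_a, serp_b):
    """Fraction of top-10 URLs shared between two geographies."""
    return len(set(serp_a[:10]) & set(serp_b[:10])) / 10.0

def geo_bucket(serp_a, serp_b):
    overlap = serp_overlap(serp_a, serp_b)
    if overlap < 0.2:    # SERPs change almost completely
        return "hyper-geo-sensitive"
    if overlap > 0.8:    # SERPs barely change from geo to geo
        return "non-geo-sensitive"
    return "semi-geo-sensitive"

# Hypothetical top-10 URL lists for one keyword in two cities:
portland = ["websavvy", "creativebloq", "mozakdesign"] + [f"pdx{i}" for i in range(7)]
san_diego = ["websavvy", "creativebloq", "kristinfalkner"] + [f"sd{i}" for i in range(7)]
print(geo_bucket(portland, san_diego))  # -> semi-geo-sensitive
```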

All right, everyone. I hope you’ve enjoyed this edition of Whiteboard Friday, and we’ll see you again next week. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in Latest News | Comments Off

New Site Crawl: Rebuilt to Find More Issues on More Pages, Faster Than Ever!

Posted by Dr-Pete

First, the good news — as of today, all Moz Pro customers have access to the new version of Site Crawl, our entirely rebuilt deep site crawler and technical SEO auditing platform. The bad news? There isn’t any. It’s bigger, better, faster, and you won’t pay an extra dime for it.

A moment of humility, though — if you’ve used our existing site crawl, you know it hasn’t always lived up to your expectations. Truth is, it hasn’t lived up to ours, either. Over a year ago, we set out to rebuild the back end crawler, but we realized quickly that what we wanted was an entirely re-imagined crawler, front and back, with the best features we could offer. Today, we launch the first version of that new crawler.

Code name: Aardwolf

The back end is entirely new. Our completely rebuilt “Aardwolf” engine crawls twice as fast, while digging much deeper. For larger accounts, it can support up to ten parallel crawlers, for actual speeds of up to 20X the old crawler. Aardwolf also fully supports SNI sites (including Cloudflare), correcting a major shortcoming of our old crawler.

View/search *all* URLs

One major limitation of our old crawler is that you could only see pages with known issues. Click on “All Crawled Pages” in the new crawler, and you’ll be brought to a list of every URL we crawled on your site during the last crawl cycle:

You can sort this list by status code, total issues, Page Authority (PA), or crawl depth. You can also filter by URL, status codes, or whether or not the page has known issues. For example, let’s say I just wanted to see all of the pages crawled for Moz.com in the “/blog” directory…

I just click the [+], select “URL,” enter “/blog,” and I’m on my way.

Do you prefer to slice and dice the data on your own? You can export your entire crawl to CSV, with additional data including per-page fetch times and redirect targets.
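For example, here’s a quick pandas sketch of slicing that export. The column names are guesses for illustration; check the header row of your actual CSV.

```python
import pandas as pd

# Column names below are guesses for illustration; check your export.
crawl = pd.read_csv("site_crawl_export.csv")

# All 4xx/5xx pages, slowest fetches first:
errors = crawl[crawl["status_code"] >= 400]
print(errors.sort_values("fetch_time_ms", ascending=False).head(10))

# Everything under /blog with at least one known issue:
blog_issues = crawl[crawl["url"].str.contains("/blog")
                    & (crawl["total_issues"] > 0)]
print(len(blog_issues), "blog pages with issues")
```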

Recrawl your site immediately

Sometimes, you just can’t wait a week for a new crawl. Maybe you relaunched your site or made major changes, and you have to know quickly if those changes are working. No problem, just click “Recrawl my site” from the top of any page in the Site Crawl section, and you’ll be on your way…

Starting at our Medium tier, you’ll get 10 recrawls per month, in addition to your automatic weekly crawls. When the stakes are high or you’re under tight deadlines for client reviews, we understand that waiting just isn’t an option. Recrawl allows you to verify that your fixes were successful and refresh your crawl report.

Ignore individual issues

As many customers have reminded us over the years, technical SEO is not a one-size-fits-all task, and what’s critical for one site is barely a nuisance for another. For example, let’s say I don’t care about a handful of overly dynamic URLs (for many sites, it’s a minor issue). With the new Site Crawl, I can just select those issues and then “Ignore” them (see the green arrow for location):

If you make a mistake, no worries — you can manage and restore ignored issues. We’ll also keep tracking any new issues that pop up over time. Just because you don’t care about something today doesn’t mean you won’t need to know about it a month from now.

Fix duplicate content

Under “Content Issues,” we’ve launched an entirely new duplicate content detection engine and a better, cleaner UI for navigating that content. Duplicate content is now automatically clustered, and we do our best to consistently detect the “parent” page. Here’s a sample from Moz.com:

You can view duplicates by the total number of affected pages, PA, and crawl depth, and you can filter by URL. Click on the arrow (far-right column) for all of the pages in the cluster (shown in the screenshot). Click anywhere in the current table row to get a full profile, including the source page we found that link on.

Prioritize quickly & tactically

Prioritizing technical SEO problems requires deep knowledge of a site. In the past, in the interest of simplicity, I fear that we’ve misled some of you. We attempted to give every issue a set priority (high, medium, or low), when the difficult reality is that what’s a major problem on one site may be deliberate and useful on another.

With the new Site Crawl, we decided to categorize crawl issues tactically, using five buckets:

  • Critical Crawler Issues
  • Crawler Warnings
  • Redirect Issues
  • Metadata Issues
  • Content Issues

Hopefully, you can already guess what some of these contain. Critical Crawler Issues still reflect issues that matter first to most sites, such as 5XX errors and redirects to 404s. Crawler Warnings represent issues that might be very important for some sites, but require more context, such as meta NOINDEX.

Prioritization often depends on scope, too. All else being equal, one 500 error may be more important than one duplicate page, but 10,000 duplicate pages is a different matter. Go to the bottom of the Site Crawl Overview Page, and we’ve attempted to balance priority and scope to target your top three issues to fix:

Moving forward, we’re going to be launching more intelligent prioritization, including grouping issues by folder and adding data visualization of your known issues. Prioritization is a difficult task and one we haven’t helped you do as well as we could. We’re going to do our best to change that.
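To make the priority-times-scope idea concrete, here’s an illustrative sketch with made-up severity weights. It’s a toy model of the concept, not Moz’s actual scoring.

```python
# Toy model: severity weight x affected pages, surface the top three.
SEVERITY = {
    "5xx error": 5,            # critical crawler issue
    "redirect to 404": 5,      # critical crawler issue
    "meta noindex": 3,         # crawler warning: needs context
    "duplicate content": 2,    # content issue
    "missing title": 1,        # metadata issue
}

issue_counts = {               # hypothetical crawl results
    "5xx error": 12,
    "duplicate content": 10000,
    "missing title": 450,
    "meta noindex": 30,
}

scored = sorted(issue_counts.items(),
                key=lambda kv: SEVERITY[kv[0]] * kv[1],
                reverse=True)
for issue, count in scored[:3]:
    print(f"{issue}: {count} pages (score {SEVERITY[issue] * count})")
```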

Dive in & tell us what you think!

All existing customers should have access to the new Site Crawl as of earlier this morning. Even better, we’ve been crawling existing campaigns with the Aardwolf engine for a couple of weeks, so you’ll have history available from day one! Stay tuned for a blog post tomorrow on effectively prioritizing Site Crawl issues, and be sure to register for the upcoming webinar.



Moz Blog

Related Articles

Posted in Latest News

How Google assesses the ‘authority’ of web pages

Google has no single authority metric but rather uses a bucket of signals to determine authority on a page-by-page basis.

The post How Google assesses the ‘authority’ of web pages appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in Latest News

How to Build Backlinks Using Your Competitors’ Broken Pages

Posted by TomCaulton

We all know building backlinks is one of the most important aspects of any successful SEO and digital marketing campaign. However, I believe there is an untapped resource out there for link building: finding your competitors’ broken pages that have been linked to by external sources.

Allow me to elaborate.

Finding the perfect backlink often takes hours, and acquiring it can take days, weeks, or even longer. That’s where the link building method I’ve outlined below comes in. I use it on a regular basis to build relevant backlinks from competitors’ 404 pages.

Please note: In this post, I will be using Search Engine Land as an example to make my points.

Ready to dive in? Great, because I’m going to walk you through the entire link building process now.

First, you need to find your competitor(s). This is as easy as searching on Google for the keyword you’re targeting and noting the websites that rank above you in the SERPs. Once you have a list of competitors, create a spreadsheet that records each competitor, their position in the rankings, and the date you added them.
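If you’d like to seed that spreadsheet programmatically, here’s a minimal sketch that writes a CSV you can open in Excel or Google Sheets; the competitors and positions listed are placeholders.

import csv
from datetime import date

# Placeholder competitors and rankings; replace with your own findings
competitors = [
    ("https://searchengineland.com", 1),
    ("https://moz.com", 2),
]

with open("competitors.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Competitor", "SERP position", "Date added"])
    for url, position in competitors:
        writer.writerow([url, position, date.today().isoformat()])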

Next, download Screaming Frog SEO Spider [a freemium tool]. This software will allow you to crawl your competitors’ websites, revealing all of their 404 pages. To do this, simply enter your competitors’ URLs in the search bar one at a time.

Once the crawl is complete, click “Response Codes.”


Then, click on the dropdown arrow next to “filter” and select “Client Error 4xx.”


Now you’ll be able to see the brand’s 404 pages.

Once you’ve completed the step above, simply press the “Export” button to export all of their 404 pages into a file. Next, import this file into a spreadsheet in Excel or Google Sheets. In this spreadsheet, create columns labeled “Trust Flow,” “Citation Flow,” “Referring Domains,” and “External Backlinks.”

Now that you’ve imported all of their 404 pages, you need to filter out the images and external links, if there are any. A quick way to do this is to highlight the URL column by clicking the cell at the top, then press “Filter” under the “Data” tab. Look for the drop-down arrow on the first cell of that column. Click the drop-down arrow, and underneath “Filter by values,” you will see two links: “Select all” and “Clear.”

Press “Clear,” like this:

This will clear all preset options. Now, type the URL of the competitor’s website into the search box and click “Select all.”

This will filter out all external links and just leave you with their 404 pages. Go through the whole list, highlighting the pages you think you can rewrite.
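If you’d rather script this step than filter by hand, here’s a pandas sketch that does the same thing. The filename and the “Address” column are assumptions; check them against your actual Screaming Frog export, as its headers can vary by version.

import pandas as pd

# Load the exported "Client Error 4xx" rows (filename is an assumption)
export = pd.read_csv("client_error_4xx.csv")

# Keep only 404s on the competitor's own domain, dropping external
# resources (images, third-party links) that also returned errors
competitor = "https://searchengineland.com"
internal_404s = export[export["Address"].str.startswith(competitor)]

internal_404s.to_csv("competitor_404s.csv", index=False)
print(f"{len(internal_404s)} internal 404 pages found")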

Now that you have all of your relevant 404 pages in place, run them through Majestic [a paid tool] or Moz’s Open Site Explorer (OSE) [a freemium tool] to see if their 404 pages actually have any external links (which is what we’re ultimately looking for). Add the details from Majestic or Moz to the spreadsheet. No matter which tool you use (I use OSE), hit “Request a CSV” for the backlink data. (Import the data into a new tab on your spreadsheet, or create a new spreadsheet altogether if you wish.)
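You can also cross-reference that backlink CSV against your 404 list programmatically instead of eyeballing it. Here’s a sketch, again with assumed column names (“Address” for your 404 list, “Target URL” for the backlink export); match them to whatever your tool actually outputs.

import pandas as pd

four_oh_fours = pd.read_csv("competitor_404s.csv")  # from the earlier step
backlinks = pd.read_csv("backlink_export.csv")      # OSE/Majestic CSV

# Keep only the broken pages that external links actually point at
linked_404s = four_oh_fours[
    four_oh_fours["Address"].isin(backlinks["Target URL"])
]
print(f"{len(linked_404s)} broken pages with external links to pursue")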

Find the relevant backlinks pointing to each broken page. Once you’ve identified all of the relevant referring websites, you can either highlight them or remove the ones that aren’t relevant from your spreadsheet.

Please note: It’s worth running each of the websites you’re potentially going to be reaching out to through Majestic and Moz to find out their citation flow, trust flow, and domain authority (DA). You may only want to go for the highest DA; however, in my opinion, if it’s relevant to your niche and will provide useful information, it’s worth targeting.

With the 404s and link opportunities in hand, focus on creating content that’s relevant for the brands you hope to earn a link from. Find the contact information for someone at the brand you want the link from. This will usually be clear on their website; but if not, you can use tools such as VoilaNorbert and Email Hunter to get the information you need. Once you have this information, you need to send them an email similar to this one:


Hi [THEIR NAME],

My name is [YOUR NAME], and I carry out the [INSERT JOB ROLE – e.g., MARKETING] at [YOUR COMPANY'S NAME or WEBSITE].

I just came across your blog post regarding [INSERT THEIR POST TITLE], and when I clicked on one of the links in that post, it led to a 404 page. As you’re probably aware, this is bad for user experience, which is why I’m emailing you today.

We recently published an in-depth article regarding the same subject of the broken link you have on your website: [INSERT YOUR POST TITLE].

Here’s the link to our article: [URL].

I was wondering if you wouldn’t mind linking to our article instead of the 404 page you’re currently linking to, as our article will provide your readers with a better user experience.

We will be keeping this article up to date, so your readers will always have the very latest information as the industry evolves.

Thank you for reading this email, and I look forward to hearing from you.

[YOUR NAME]


Disclaimer: The email above is just an example and should be tailored to your own style of writing.

In closing, remember to keep detailed notes of the conversations you have with people during outreach, and always follow up with people you connect with.

I hope this tactic helps your SEO efforts in the future. It’s certainly helped me find new places to earn links. Not only that, but it gives me new content ideas on a regular basis.

Do you use a similar process to build links? I’d love to hear about it in the comments.



Moz Blog

Posted in Latest News

Google AMP has reached 125 million documents & is expanding to apps & recipe pages

AMP is growing and expanding; here is some of the latest information from Richard Gingras of Google, out of Google I/O.

The post Google AMP has reached 125 million documents & is expanding to apps & recipe pages appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in Latest News
