
The One-Hour Guide to SEO: Keyword Targeting & On-Page Optimization – Whiteboard Friday

Posted by randfish

We’ve covered strategy, keyword research, and how to satisfy searcher intent — now it’s time to tackle optimizing the webpage itself! In the fourth part of the One-Hour Guide to SEO, Rand offers up an on-page SEO checklist to start you off on your way towards perfectly optimized and keyword-targeted pages.

If you missed them, check out the other episodes in the series so far.

A picture of the whiteboard. The content is all detailed within the transcript below.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans. Welcome to another edition of our special One-Hour Guide to SEO. We are now on Part IV – Keyword Targeting and On-Page Optimization. So hopefully, you’ve watched Part III, where we talked about searcher satisfaction, how to make sure searchers are happy with the page content that you create and the user experience that you build for them, as well as Part II, where we talked about keyword research and how to make sure that you are targeting the right words and phrases that searchers are actually looking for, that you think you can actually rank for, and that actually get real organic click-through rate, because Google’s zero-click searches are rising.

A depiction of a site with important on-page SEO elements highlighted, drawn on the whiteboard.

Now we’re into on-page SEO. So this is essentially taking the words and phrases that we know we want to rank for with the content that we know will help searchers accomplish their task. Now how do we make sure that the page is optimal for ranking in Google?

On-page SEO has evolved

Well, this is very different from the way it was years ago. A long time ago, and unfortunately many people still believe this to be true about SEO, it was: How do I stuff my keywords into all the right tags and places on the page? How do I take advantage of things like the meta keywords tag, which hasn’t been used in a decade, maybe two? How do I take advantage of putting all the words and phrases stuffed into my title, my URL, my description, my headline, my H2 through H6 tags, all these kinds of things?

Most of that does not matter, but some of it still does. Some of it is still important, and we need to run through what those are so that you give yourself the best possible chance for ranking.

The on-page SEO checklist

So what I’ve done here is created a sort of brief on-page SEO checklist. This is not comprehensive, especially on the technical portion, because we’re saving that for Part V, the technical SEO section of this guide. But some of the most important things are on here.

☑ Descriptive, compelling, keyword-rich title element

Those include things like a descriptive, compelling, keyword-rich but not stuffed title element, also called the page title or a title tag. So, for example, if I am a tool website, like toolsource.com (I made that domain name up; I assume it’s registered to somebody), and I want to rank for “best online survey tools,” well, “The Best Online Survey Tools for 2019” is a great title tag, and it’s very different from “best online survey tools, best online survey software, best online survey software 2019.” You’ve seen title tags like that. You’ve seen pages that contain stuff like that. That is no longer a good SEO practice.

So we want that title to be descriptive, compelling, something that makes me want to click. Remember that this title is also going to show up in the search results as the title of the snippet that your website appears in.
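For reference, here’s a minimal sketch of what that element looks like in the page markup (the domain and wording are just illustrative):

<head>
  <!-- Descriptive, compelling, keyword-rich (but not stuffed) page title -->
  <title>The Best Online Survey Tools for 2019 | ToolSource</title>
</head>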

☑ Meta description designed to draw the click

Second, a meta description. This is still used by search engines, not for rankings though. Sort of think of it like ad text. You are drawing a click, or you’re attempting to draw the click. So what you want to do is have a description that tells people what’s on the page and inspires them, incites them, makes them want to click on your result instead of somebody else’s. That’s your chance to say, “Here’s why we’re valuable and useful.”
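As a rough sketch with made-up copy, the meta description sits in the page’s head alongside the title:

<head>
  <!-- Ad-style copy written to earn the click, not to rank -->
  <meta name="description" content="We tested 25 online survey tools for pricing, question types, and reporting. See which ones are worth your time in 2019.">
</head>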

☑ Easy-to-read, sensible, short URL

An easy-to-read, sensible, short URL. For example, toolsource.com/reviews/best-online-surveys-2019. Perfect, very legible, very readable. I see that in the results, I think, “Okay, I know what that page is going to be.” I see that copied and pasted somewhere on the web, I think, “I know what’s going to be at that URL. That looks relevant to me.”

Or reviews.best-online-tools.info/oldseqs?ide=17, followed by a bunch of weird letters, a tab parameter set to this, and a UTM parameter set to that. Okay, well, first off, that’s a freaking terrible domain name, and I don’t know what the rest of it means. By the way, having more than one or two URL parameters correlates poorly with ranking and is not recommended if you’re trying to rank in search results. So you want to try and rewrite these to be more friendly, shorter, more sensible, and readable by a human being. That will help Google as well.

☑ First paragraph optimized for appearing in featured snippets

The first paragraph of the content, or the first few words of the page, should be optimized for appearing in what Google calls featured snippets. Now, a featured snippet is the box you’ll sometimes see when you perform a search: for many queries, instead of just a list of pages, there’s a box, often with an image and a bunch of descriptive text drawn from the page, often from the first paragraph or two. So if you want to get that featured snippet, you have to be able to rank on page one, and you need to be optimized to answer the query right in your first paragraph. This is an opportunity to rank in position three or four or five but still have the featured snippet answer appear above all the other results. It’s awesome when you can do this in SEO, a very, very powerful thing. There are a bunch of featured snippet optimization resources on Moz’s website that we can point you to as well.
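A minimal sketch of what that opening might look like in markup (the copy and product names here are purely hypothetical):

<h1>The Best Online Survey Tools for 2019</h1>
<!-- The first paragraph answers the query directly so Google can lift it as a featured snippet -->
<p>The best online survey tools balance question types, response limits, and price. After testing 25 options, our top picks for 2019 are Typeform, SurveyMonkey, and Google Forms.</p>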

☑ Use the keyword target intelligently in…

☑ The headline

So if I’m trying to rank for “best online survey tools,” I would try and use that in my headline. Generally speaking, I like to have the headline and the title of the piece nearly the same or exactly the same so that when someone clicks on that title, they get the same headline on the page and they don’t get this cognitive dissonance between the two.

☑ The first paragraph

The first paragraph, we talked about. 

☑ The page content

The page’s content, you don’t want to have a page that’s talking about best online survey tools and you never mention online surveys. That would be a little weird. 

☑ Internal link anchors

An internal link anchor. So if other places on your website talk about online survey tools, you should be linking to this page. This is helpful for Google finding it, helpful for visitors finding it, and helpful to say this is the page that is about this on our website.
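For example, an internal link from a related blog post might look something like this (the URL and anchor text are illustrative):

<!-- Descriptive anchor text tells both Google and visitors what the target page is about -->
<p>Before you pick a platform, see our roundup of the <a href="/reviews/best-online-surveys-2019">best online survey tools</a>.</p>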

A whiteboard drawing depicting how to target one page with multiple keywords vs multiple pages targeting single keywords.

I do strongly recommend taking the following advice, which is we are no longer in a world where it makes sense to target one keyword per page. For example, best online survey tools, best online survey software, and best online survey tools 2019 are technically three unique keyword phrases. They have different search volumes. Slightly different results will show up for each of them. But it is no longer the case, as it was maybe a decade ago, that I would go and create a page for each one of those separate things.

Instead, because these all share the same searcher intent, I want to go with one page, just a single URL that targets all the keywords that share the exact same searcher intent. If searchers are looking to find exactly the same thing but with slightly modified or slight variations in how they phrase things, you should have a page that serves all of those keywords with that same searcher intent rather than multiple pages that try to break those up, for a bunch of reasons. One, it’s really hard to get links to all those different pages. Getting links just period is very challenging, and you need them to rank.

Second off, the difference between those pages is going to be very, very subtle, and it will seem very awkward to Google that you have these slight variations on almost the same thing. It might even look to them like duplicate or very similar or low-quality content, which can get you down-ranked. So stick to one page per set of shared-intent keywords.

☑ Leverage appropriate rich snippet options

Next, you want to leverage appropriate rich snippet options. So, for example, if you are in the recipes space, you can use schema markup for recipes to show Google that you’ve got a picture of the recipe and a cooking time and all these different details. Google offers this in a wide variety of places. When you’re doing reviews, they offer you the star ratings. Schema.org has a full list of these, and Google’s rich snippets markup page offers a bunch more. So we’ll point you to both of those as well.
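As an illustration, recipe markup is often added as JSON-LD in the page’s head; the values below are made up, so check Schema.org and Google’s structured data documentation for the properties that apply to your content type:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Weeknight Veggie Chili",
  "image": "https://www.example.com/images/veggie-chili.jpg",
  "cookTime": "PT45M",
  "aggregateRating": { "@type": "AggregateRating", "ratingValue": "4.7", "reviewCount": "212" }
}
</script>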

☑ Images on the page employ…

Last, but certainly not least, because image search is such a huge portion of where Google’s search traffic comes from and goes to, it is very wise to optimize the images on the page. Image search can now send significant traffic to you, and optimizing for images can sometimes mean that other people will find your images through Google Images, put them on their own websites, and link back to you, which solves a huge problem. Getting links is very hard. Images are a great way to do it.

☑ Descriptive, keyword-rich filenames

The images on your page should employ descriptive, keyword-rich filenames, meaning if I have one for Typeform, I don’t want it to be pic1, pic2, or pic3. I want it to be typeform-logo or typeform-survey-software as the name of the file.

☑ Descriptive alt attributes

The alt attribute or alt tag is part of how you describe that for screen readers and other accessibility-focused devices, and Google also uses that text too. 

☑ Caption text (if appropriate)

Caption text, if that’s appropriate, if you have like a photograph and a caption describing it, you want to be descriptive of what’s actually in the picture.

☑ Stored in same domain and subdomain

In order to perform well, these files generally need to be hosted on the same domain and subdomain as the page. If, for example, all your images are stored on an Amazon Web Services domain and you don’t bother rewriting or making sure that the domain looks like it’s toolsource.com/photos or /images, that can cause real ranking problems. Oftentimes you won’t perform at all in Google Images because they don’t associate the image with the same domain. The same subdomain is preferable as well.
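Putting those together, an optimized image might be marked up roughly like this (the filename, domain, and caption are illustrative):

<figure>
  <!-- Descriptive, keyword-rich filename, hosted on the same domain and subdomain as the page -->
  <img src="https://toolsource.com/images/typeform-survey-software.png"
       alt="Typeform survey builder showing a multiple-choice question">
  <figcaption>Typeform’s survey editor with a multiple-choice question open.</figcaption>
</figure>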

If you do all these things and you nail searcher intent and you’ve got your keyword research, you are ready to move on to technical SEO and link building and then start ranking. So we’ll see you for that next edition next week. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


Rewriting the Beginner’s Guide to SEO, Chapter 7: Measuring, Prioritizing, & Executing SEO

Posted by BritneyMuller

It’s finally here, for your review and feedback: Chapter 7 of the new Beginner’s Guide to SEO, the last chapter. We cap off the guide with advice on how to measure, prioritize, and execute on your SEO. And if you missed them, check out the drafts of our outline, Chapter One, Chapter Two, Chapter Three, Chapter Four, Chapter Five, and Chapter Six for your reading pleasure. As always, let us know what you think of Chapter 7 in the comments!


Set yourself up for success.

They say if you can measure something, you can improve it.

In SEO, it’s no different. Professional SEOs track everything from rankings and conversions to lost links and more to help prove the value of SEO. Measuring the impact of your work and ongoing refinement is critical to your SEO success, client retention, and perceived value.

It also helps you pivot your priorities when something isn’t working.

Start with the end in mind

While it’s common to have multiple goals (both macro and micro), establishing one specific primary end goal is essential.

The only way to know what a website’s primary end goal should be is to have a strong understanding of the website’s goals and/or client needs. Good client questions are not only helpful in strategically directing your efforts, but they also show that you care.

Client question examples:

  1. Can you give us a brief history of your company?
  2. What is the monetary value of a newly qualified lead?
  3. What are your most profitable services/products (in order)?

Keep the following tips in mind while establishing a website’s primary goal, additional goals, and benchmarks:

Goal setting tips

  • Measurable: If you can’t measure it, you can’t improve it.
  • Be specific: Don’t let vague industry marketing jargon water down your goals.
  • Share your goals: Studies have shown that writing down and sharing your goals with others boosts your chances of achieving them.

Measuring

Now that you’ve set your primary goal, evaluate which additional metrics could help support your site in reaching its end goal. Measuring additional (applicable) benchmarks can help you keep a better pulse on current site health and progress.

Engagement metrics

How are people behaving once they reach your site? That’s the question that engagement metrics seek to answer. Some of the most popular metrics for measuring how people engage with your content include:

Conversion rate – The number of conversions (for a single desired action/goal) divided by the number of unique visits. A conversion rate can be applied to anything, from an email signup to a purchase to account creation. Knowing your conversion rate can help you gauge the return on investment (ROI) your website traffic might deliver.
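For example, with made-up numbers:

120 email signups ÷ 4,800 unique visits = 0.025, or a 2.5% conversion rate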

In Google Analytics, you can set up goals to measure how well your site accomplishes its objectives. If your objective for a page is a form fill, you can set that up as a goal. When site visitors accomplish the task, you’ll be able to see it in your reports.

Time on page – How long did people spend on your page? If you have a 2,000-word blog post that visitors are only spending an average of 10 seconds on, the chances are slim that this content is being consumed (unless they’re a mega-speed reader). However, if a URL has a low time on page, that’s not necessarily bad either. Consider the intent of the page. For example, it’s normal for “Contact Us” pages to have a low average time on page.

Pages per visit – Was the goal of your page to keep readers engaged and take them to a next step? If so, then pages per visit can be a valuable engagement metric. If the goal of your page is independent of other pages on your site (ex: visitor came, got what they needed, then left), then low pages per visit are okay.

Bounce rate – “Bounced” sessions indicate that a searcher visited the page and left without browsing your site any further. Many people try to lower this metric because they believe it’s tied to website quality, but it actually tells us very little about a user’s experience. We’ve seen cases of bounce rate spiking for redesigned restaurant websites that are doing better than ever. Further investigation discovered that people were simply coming to find business hours, menus, or an address, then bouncing with the intention of visiting the restaurant in person. A better metric to gauge page/site quality is scroll depth.

Scroll depth – This measures how far visitors scroll down individual webpages. Are visitors reaching your important content? If not, test different ways of providing the most important content higher up on your page, such as multimedia, contact forms, and so on. Also consider the quality of your content. Are you omitting needless words? Is it enticing for the visitor to continue down the page? Scroll depth tracking can be set up in your Google Analytics.

Search traffic

Ranking is a valuable SEO metric, but measuring your site’s organic performance can’t stop there. The goal of showing up in search is to be chosen by searchers as the answer to their query. If you’re ranking but not getting any traffic, you have a problem.

But how do you even determine how much traffic your site is getting from search? One of the most precise ways to do this is with Google Analytics.

Using Google Analytics to uncover traffic insights

Google Analytics (GA) is bursting at the seams with data — so much so that it can be overwhelming if you don’t know where to look. This is not an exhaustive list, but rather a general guide to some of the traffic data you can glean from this free tool.

Isolate organic traffic – GA allows you to view traffic to your site by channel. This will mitigate any scares caused by changes to another channel (ex: total traffic dropped because a paid campaign was halted, but organic traffic remained steady).

Traffic to your site over time – GA allows you to view total sessions/users/pageviews to your site over a specified date range, as well as compare two separate ranges.

How many visits a particular page has received – Site Content reports in GA are great for evaluating the performance of a particular page — for example, how many unique visitors it received within a given date range.

Traffic from a specified campaign – You can use UTM (urchin tracking module) codes for better attribution. Designate the source, medium, and campaign, then append the codes to the end of your URLs. When people start clicking on your UTM-code links, that data will start to populate in GA’s “campaigns” report.
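For instance, a tagged URL for a hypothetical spring newsletter campaign might look like this:

https://www.example.com/landing-page?utm_source=newsletter&utm_medium=email&utm_campaign=spring-sale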

Click-through rate (CTR) – Your CTR from search results to a particular page (meaning the percent of people that clicked your page from search results) can provide insights on how well you’ve optimized your page title and meta description. You can find this data in Google Search Console, a free Google tool.

In addition, Google Tag Manager is a free tool that allows you to manage and deploy tracking pixels to your website without having to modify the code. This makes it much easier to track specific triggers or activity on a website.

Additional common SEO metrics

  • Domain Authority & Page Authority (DA/PA) – Moz’s proprietary authority metrics provide powerful insights at a glance and are best used as benchmarks relative to your competitors’ Domain Authority and Page Authority.
  • Keyword rankings – A website’s ranking position for desired keywords. This should also include SERP feature data, like featured snippets and People Also Ask boxes that you’re ranking for. Try to avoid vanity metrics, such as rankings for competitive keywords that are desirable but often too vague and don’t convert as well as longer-tail keywords.
  • Number of backlinks – Total number of links pointing to your website or the number of unique linking root domains (meaning one per unique website, as websites often link out to other websites multiple times). While these are both common link metrics, we encourage you to look more closely at the quality of backlinks and linking root domains your site has.

How to track these metrics

There are lots of different tools available for keeping track of your site’s position in SERPs, site crawl health, SERP features, and link metrics, such as Moz Pro and STAT.

The Moz and STAT APIs (among other tools) can also be pulled into Google Sheets or other customizable dashboard platforms for clients and quick at-a-glance SEO check-ins. This also allows you to provide more refined views of only the metrics you care about.

Dashboard tools like Data Studio, Tableau, and PowerBI can also help to create interactive data visualizations.

Evaluating a site’s health with an SEO website audit

By having an understanding of certain aspects of your website — its current position in search, how searchers are interacting with it, how it’s performing, the quality of its content, its overall structure, and so on — you’ll be able to better uncover SEO opportunities. Leveraging the search engines’ own tools can help surface those opportunities, as well as potential issues:

  • Google Search Console – If you haven’t already, sign up for a free Google Search Console (GSC) account and verify your website(s). GSC is full of actionable reports you can use to detect website errors, opportunities, and user engagement.
  • Bing Webmaster Tools – Bing Webmaster Tools has similar functionality to GSC. Among other things, it shows you how your site is performing in Bing and opportunities for improvement.
  • Lighthouse Audit – Google’s automated tool for measuring a website’s performance, accessibility, progressive web apps, and more. This data improves your understanding of how a website is performing. Gain specific speed and accessibility insights for a website here.
  • PageSpeed Insights – Provides website performance insights using Lighthouse and Chrome User Experience Report data from real user measurement (RUM) when available.
  • Structured Data Testing Tool – Validates that a website is using schema markup (structured data) properly.
  • Mobile-Friendly Test – Evaluates how easily a user can navigate your website on a mobile device.
  • Web.dev – Surfaces website improvement insights using Lighthouse and provides the ability to track progress over time.
  • Tools for web devs and SEOs – Google often provides new tools for web developers and SEOs alike, so keep an eye on any new releases here.

While we don’t have room to cover every SEO audit check you should perform in this guide, we do offer an in-depth Technical SEO Site Audit course for more info. When auditing your site, keep the following in mind:

Crawlability: Are your primary web pages crawlable by search engines, or are you accidentally blocking Googlebot or Bingbot via your robots.txt file? Does the website have an accurate sitemap.xml file in place to help direct crawlers to your primary pages?
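As a quick illustration, a minimal robots.txt that keeps crawlers out of an admin area while pointing them to the sitemap might look like this (the paths are hypothetical; the thing to verify is that you aren’t disallowing pages you want ranked):

User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml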

Indexed pages: Can your primary pages be found using Google? Doing a site:yoursite.com OR site:yoursite.com/specific-page check in Google can help answer this question. If you notice some are missing, check to make sure a meta robots=noindex tag isn’t excluding pages that should be indexed and found in search results.
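The tag in question lives in the page’s head and looks like this; a stray copy of it on an important page will keep that page out of search results:

<meta name="robots" content="noindex">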

Check page titles & meta descriptions: Do your titles and meta descriptions do a good job of summarizing the content of each page? How are their CTRs in search results, according to Google Search Console? Are they written in a way that entices searchers to click your result over the other ranking URLs? Which pages could be improved? Site-wide crawls are essential for discovering on-page and technical SEO opportunities.

Page speed: How does your website perform on mobile devices and in Lighthouse? Which images could be compressed to improve load time?

Content quality: How well does the current content of the website meet the target market’s needs? Is the content 10X better than other ranking websites’ content? If not, what could you do better? Think about things like richer content, multimedia, PDFs, guides, audio content, and more.

Pro tip: Website pruning!

Removing thin, old, low-quality, or rarely visited pages from your site can help improve your website’s perceived quality. Performing a content audit will help you discover these pruning opportunities. Three primary ways to prune pages include:

  1. Delete the page (4XX): Use when a page adds no value (ex: traffic, links) and/or is outdated.
  2. Redirect (3XX): Redirect the URLs of pages you’re pruning when you want to preserve the value they add to your site, such as inbound links to that old URL.
  3. NoIndex: Use this when you want the page to remain on your site but be removed from the index.

Keyword research and competitive website analysis (performing audits on your competitors’ websites) can also provide rich insights on opportunities for your own website.

For example:

  • Which keywords are competitors ranking on page 1 for, but your website isn’t?
  • Which keywords is your website ranking on page 1 for that also have a featured snippet? You might be able to provide better content and take over that snippet.
  • Which websites link to more than one of your competitors, but not to your website?

Discovering website content and performance opportunities will help devise a more data-driven SEO plan of attack! Keep an ongoing list in order to prioritize your tasks effectively.

Prioritizing your SEO fixes

In order to prioritize SEO fixes effectively, it’s essential to first have specific, agreed-upon goals established between you and your client.

While there are a million different ways you could prioritize SEO, we suggest you rank them in terms of importance and urgency. Which fixes could provide the most ROI for a website and help support your agreed-upon goals?

Stephen Covey, author of The 7 Habits of Highly Effective People, developed a handy time management grid that can ease the burden of prioritization:


Source: Stephen Covey, The 7 Habits of Highly Effective People

Putting out small, urgent SEO fires might feel most effective in the short term, but this often leads to neglecting non-urgent important fixes. The not urgent & important items are ultimately what often move the needle for a website’s SEO. Don’t put these off.

SEO planning & execution

“Without strategy, execution is aimless. Without execution, strategy is useless.”
- Morris Chang

Much of your success depends on effectively mapping out and scheduling your SEO tasks. You can use free tools like Google Sheets to plan out your SEO execution (we have a free template here), but you can use whatever method works best for you. Some people prefer to schedule out their SEO tasks in their Google Calendar, in a kanban or scrum board, or in a daily planner.

Use what works for you and stick to it.

Measuring your progress along the way via the metrics mentioned above will help you monitor your effectiveness and allow you to pivot your SEO efforts when something isn’t working. Say, for example, you changed a primary page’s title and meta description, only to notice that the CTR for that page decreased. Perhaps you changed it to something too vague or strayed too far from the on-page topic — it might be good to try a different approach. Keeping an eye on drops in rankings, CTRs, organic traffic, and conversions can help you manage hiccups like this early, before they become a bigger problem.

Communication is essential for SEO client longevity

Many SEO fixes are implemented without being noticeable to a client (or user). This is why it’s essential to employ good communication skills around your SEO plan, the time frame in which you’re working, and your benchmark metrics, as well as frequent check-ins and reports.




Moz Blog


A New Domain Authority Is Coming Soon: What’s Changing, When, & Why

Posted by rjonesx.

Howdy Moz readers,

I’m Russ Jones, Principal Search Scientist at Moz, and I am excited to announce a fantastic upgrade coming next month to one of the most important metrics Moz offers: Domain Authority.

Domain Authority has become the industry standard for measuring the strength of a domain relative to ranking. We recognize that stability plays an important role in making Domain Authority valuable to our customers, so we didn’t want to change it lightly; we wanted to make sure that the new Domain Authority brought meaningful improvements to the table.

Learn more about the new DA

What’s changing?

What follows is an account of some of the technical changes behind the new Domain Authority and why they matter.

The training set:

Historically, we’ve relied on training Domain Authority against an unmanipulated, large set of search results. In fact, this has been the standard methodology across our industry. But we have found a way to improve upon it that fundamentally, from the ground up, makes Domain Authority more reliable.

The training algorithm:

Rather than relying on a complex linear model, we’ve made the switch to a neural network. This offers several benefits, including a much more nuanced model that can detect link manipulation.

The model factors:

We have greatly improved upon the ranking factors behind Domain Authority. In addition to looking at link counts, we’ve now been able to integrate our proprietary Spam Score and complex distributions of links based on quality and traffic, along with a bevy of other factors.

The backbone:

At the heart of Domain Authority is the industry’s leading link index, our new Moz Link Explorer. With over 35 trillion links, our exceptional data turns the brilliant statistical work by Neil Martinsen-Burrell, Chas Williams, and so many more amazing Mozzers into a true industry standard.

What does this mean?

These fundamental improvements to Domain Authority will deliver a better, more trustworthy metric than ever before. We can remove spam, improve correlations, and, most importantly, update Domain Authority relative to all the changes that Google makes.

It means that you will see some changes to Domain Authority when the launch occurs. We staked the model to our existing Domain Authority which minimizes changes, but with all the improvements there will no doubt be some fluctuation in Domain Authority scores across the board.

What should we do?

Use DA as a relative metric, not an absolute one.

First, make sure that you use Domain Authority as a relative metric. Domain Authority is meaningless when it isn’t compared to other sites. What matters isn’t whether your site drops or increases — it’s whether it drops or increases relative to your competitors. When we roll out the new Domain Authority, make sure you check your competitors’ scores as well as your own, as they will likely fluctuate in a similar direction.

Know how to communicate changes with clients, colleagues, and stakeholders

Second, be prepared to communicate with your clients or webmasters about the changes and improvements to Domain Authority. While change is always disruptive, the new Domain Authority is better than ever and will allow them to make smarter decisions about search engine optimization strategies going forward.

Expect DA to keep pace with Google

Finally, expect that we will be continuing to improve Domain Authority. Just like Google makes hundreds of changes to their algorithm every year, we intend to make Domain Authority much more responsive to Google’s changes. Even when Google makes fundamental algorithm updates like Penguin or Panda, you can feel confident that Moz’s Domain Authority will be as relevant and useful as ever.

When is it happening?

We plan on rolling out the new Domain Authority on March 5th, 2019. We will have several more communications between now and then to help you and your clients best respond to the new Domain Authority, including a webinar on February 21st. We hope you’re as excited as we are and look forward to continuing to bring you the most reliable, cutting-edge metrics our industry has to offer.


Be sure to check out the resources we’ve prepared to help you acclimate to the change, including an educational whitepaper and a presentation you can download to share with your clients, team, and stakeholders:

Explore more resources here



Moz Blog



Rewriting the Beginner’s Guide to SEO, Chapter 6: Link Building & Establishing Authority

Posted by BritneyMuller

In Chapter 6 of the new Beginner’s Guide to SEO, we’ll be covering the dos and don’ts of link building and ways your site can build its authority. If you missed them, we’ve got the drafts of our outline, Chapter One, Chapter Two, Chapter Three, Chapter Four, and Chapter Five for your reading pleasure. Be sure to let us know what you think of Chapter 6 in the comments!


Chapter 6: Link Building & Establishing Authority

Turn up the volume.

You’ve created content that people are searching for, that answers their questions, and that search engines can understand, but those qualities alone don’t mean it’ll rank. To outrank the rest of the sites with those qualities, you have to establish authority. That can be accomplished by earning links from authoritative websites, building your brand, and nurturing an audience who will help amplify your content.

Google has confirmed that links and quality content (which we covered back in Chapter 4) are two of the three most important ranking factors for SEO. Trustworthy sites tend to link to other trustworthy sites, and spammy sites tend to link to other spammy sites. But what is a link, exactly? How do you go about earning them from other websites? Let’s start with the basics.

What are links?

Inbound links, also known as backlinks or external links, are HTML hyperlinks that point from one website to another. They’re the currency of the Internet, as they act a lot like real-life reputation. If you went on vacation and asked three people (all completely unrelated to one another) what the best coffee shop in town was, and they all said, “Cuppa Joe on Main Street,” you would feel confident that Cuppa Joe is indeed the best coffee place in town. Links do that for search engines.

Since the late 1990s, search engines have treated links as votes for popularity and importance on the web.

Internal links, or links that connect internal pages of the same domain, work very similarly for your website. A large number of internal links pointing to a particular page on your site will signal to Google that the page is important, so long as it’s done naturally and not in a spammy way.

The engines themselves have refined the way they view links, now using algorithms to evaluate sites and pages based on the links they find. But what’s in those algorithms? How do the engines evaluate all those links? It all starts with the concept of E-A-T.

You are what you E-A-T

Google’s Search Quality Rater Guidelines put a great deal of importance on the concept of E-A-T — an acronym for expert, authoritative, and trustworthy. Sites that don’t display these characteristics tend to be seen as lower-quality in the eyes of the engines, while those that do are subsequently rewarded. E-A-T is becoming more and more important as search evolves and increases the importance of solving for user intent.

Creating a site that’s considered expert, authoritative, and trustworthy should be your guiding light as you practice SEO. Not only will it simply result in a better site, but it’s future-proof. After all, providing great value to searchers is what Google itself is trying to do.

E-A-T and links to your site

The more popular and important a site is, the more weight the links from that site carry. A site like Wikipedia, for example, has thousands of diverse sites linking to it. This indicates it provides lots of expertise, has cultivated authority, and is trusted among those other sites.

To earn trust and authority with search engines, you’ll need links from websites that display the qualities of E-A-T. These don’t have to be Wikipedia-level sites, but they should provide searchers with credible, trustworthy content.

  • Tip: Moz has proprietary metrics to help you determine how authoritative a site is: Domain Authority, Page Authority, and Spam Score. In general, you’ll want links from sites with a higher Domain Authority than your site.

Followed vs. nofollowed links

Remember how links act as votes? The rel=nofollow attribute (pronounced as two words, “no follow”) allows you to link to a resource while removing your “vote” for search engine purposes.

Just like it sounds, “nofollow” tells search engines not to follow the link. Some engines still follow them simply to discover new pages, but these links don’t pass link equity (the “votes of popularity” we talked about above), so they can be useful in situations where a page is either linking to an untrustworthy source or was paid for or created by the owner of the destination page (making it an unnatural link).

Say, for example, you write a post about link building practices, and want to call out an example of poor, spammy link building. You could link to the offending site without signaling to Google that you trust it.

Standard links (ones that haven’t had nofollow added) look like this:

<a href="https://moz.com">I love Moz</a>

Nofollow link markup looks like this:

<a href="https://moz.com" rel="nofollow">I love Moz</a>

If followed links pass all the link equity, shouldn’t that mean you want only followed links?

Not necessarily. Think about all the legitimate places you can create links to your own website: a Facebook profile, a Yelp page, a Twitter account, etc. These are all natural places to add links to your website, but they shouldn’t count as votes for your website. (Setting up a Twitter profile with a link to your site isn’t a vote from Twitter that they like your site.)

It’s natural for your site to have a balance between nofollowed and followed backlinks in its link profile (more on link profiles below). A nofollow link might not pass authority, but it could send valuable traffic to your site and even lead to future followed links.

  • Tip: Use the MozBar extension for Google Chrome to highlight links on any page to find out whether they’re nofollow or follow without ever having to view the source code!

Your link profile

Your link profile is an overall assessment of all the inbound links your site has earned: the total number of links, their quality (or spamminess), their diversity (is one site linking to you hundreds of times, or are hundreds of sites linking to you once?), and more. The state of your link profile helps search engines understand how your site relates to other sites on the Internet. There are various SEO tools that allow you to analyze your link profile and begin to understand its overall makeup.

How can I see which inbound links point to my website?

Visit Moz Link Explorer and type in your site’s URL. You’ll be able to see how many and which websites are linking back to you.

What are the qualities of a healthy link profile?

When people began to learn about the power of links, they began manipulating them for their benefit. They’d find ways to gain artificial links just to increase their search engine rankings. While these dangerous tactics can sometimes work, they are against Google’s terms of service and can get a website deindexed (removal of web pages or entire domains from search results). You should always try to maintain a healthy link profile.

A healthy link profile is one that indicates to search engines that you’re earning your links and authority fairly. Just like you shouldn’t lie, cheat, or steal, you should strive to ensure your link profile is honest and earned via your hard work.

Links are earned or editorially placed

Editorial links are links added naturally by sites and pages that want to link to your website.

The foundation of acquiring earned links is almost always through creating high-quality content that people genuinely wish to reference. This is where creating 10X content (a way of describing extremely high-quality content) is essential! If you can provide the best and most interesting resource on the web, people will naturally link to it.

Naturally earned links require no specific action from you, other than the creation of worthy content and the ability to create awareness about it.

  • Tip: Earned mentions are often unlinked! When websites are referring to your brand or a specific piece of content you’ve published, they will often mention it without linking to it. To find these earned mentions, use Moz’s Fresh Web Explorer. You can then reach out to those publishers to see if they’ll update those mentions with links.

Links are relevant and from topically similar websites

Links from websites within a topic-specific community are generally better than links from websites that aren’t relevant to your site. If your website sells dog houses, a link from the Society of Dog Breeders matters much more than one from the Roller Skating Association. Additionally, links from topically irrelevant sources can send confusing signals to search engines regarding what your page is about.

  • Tip: Linking domains don’t have to match the topic of your page exactly, but they should be related. Avoid pursuing backlinks from sources that are completely off-topic; there are far better uses of your time.

Anchor text is descriptive and relevant, without being spammy

Anchor text helps tell Google what the topic of your page is about. If dozens of links point to a page with a variation of a word or phrase, the page has a higher likelihood of ranking well for those types of phrases. However, proceed with caution! Too many backlinks with the same anchor text could indicate to the search engines that you’re trying to manipulate your site’s ranking in search results.

Consider this. You ask ten separate friends, at separate times, how their day is going, and they each respond with the same phrase:

“Great! I started my day by walking my dog, Peanut, and then had a picante beef Top Ramen for lunch.”

That’s strange, and you’d be quite suspicious of your friends. The same goes for Google. Describing the content of the target page with the anchor text helps them understand what the page is about, but the same description over and over from multiple sources starts to look suspicious. Aim for relevance; avoid spam.

  • Tip: Use the “Anchor Text” report in Moz’s Link Explorer to see what anchor text other websites are using to link to your content.

Links send qualified traffic to your site

Link building should never be solely about search engine rankings. Esteemed SEO and link building thought leader Eric Ward used to say that you should build your links as though Google might disappear tomorrow. In essence, you should focus on acquiring links that will bring qualified traffic to your website — another reason why it’s important to acquire links from relevant websites whose audience would find value in your site, as well.

  • Tip: Use the “Referral Traffic” report in Google Analytics to evaluate websites that are currently sending you traffic. How can you continue to build relationships with similar types of websites?

Link building don’ts & things to avoid

Spammy link profiles are just that: full of links built in unnatural, sneaky, or otherwise low-quality ways. Practices like buying links or engaging in a link exchange might seem like the easy way out, but doing so is dangerous and could put all of your hard work at risk. Google penalizes sites with spammy link profiles, so don’t give in to temptation.

A guiding principle for your link building efforts is to never try to manipulate a site’s ranking in search results. But isn’t that the entire goal of SEO? To increase a site’s ranking in search results? And herein lies the confusion. Google wants you to earn links, not build them, but the line between the two is often blurry. To avoid penalties for unnatural links (known as “link spam”), Google has made clear what should be avoided.

Purchased links

Google and Bing both seek to discount the influence of paid links in their organic search results. While a search engine can’t know which links were earned vs. paid for just from viewing the link itself, there are clues it uses to detect patterns that indicate foul play. Websites caught buying or selling followed links risk penalties that can severely drop their rankings. (By the way, exchanging goods or services for a link is also a form of payment and qualifies as buying links.)

Link exchanges / reciprocal linking

If you’ve ever received a “you link to me and I’ll link to you” email from someone you have no affiliation with, you’ve been targeted for a link exchange. Google’s quality guidelines caution against “excessive” link exchanges and similar partner programs conducted exclusively for the sake of cross-linking, so there is some indication that this type of exchange on a smaller scale might not trigger any link spam alarms.

It is acceptable, and even valuable, to link to people you work with, partner with, or have some other affiliation with and have them link back to you.

It’s the exchange of links at mass scale with unaffiliated sites that can warrant penalties.

Low-quality directory links

These used to be a popular source of manipulation. A large number of pay-for-placement web directories exist to serve this market and pass themselves off as legitimate, with varying degrees of success. These types of sites tend to look very similar, with large lists of websites and their descriptions (typically, the site’s critical keyword is used as the anchor text to link back to the submitter’s site).

There are many more manipulative link building tactics that search engines have identified. In most cases, they have found algorithmic methods for reducing their impact. As new spam systems emerge, engineers will continue to fight them with targeted algorithms, human reviews, and the collection of spam reports from webmasters and SEOs. By and large, it isn’t worth finding ways around them.

If your site does get a manual penalty, there are steps you can take to get it lifted.

How to build high-quality backlinks

Link building comes in many shapes and sizes, but one thing is always true: link campaigns should always match your unique goals. With that said, there are some popular methods that tend to work well for most campaigns. This is not an exhaustive list, so visit Moz’s blog posts on link building for more detail on this topic.

Find customer and partner links

If you have partners you work with regularly, or loyal customers that love your brand, there are ways to earn links from them with relative ease. You might send out partnership badges (graphic icons that signify mutual respect), or offer to write up testimonials of their products. Both of those offer things they can display on their website along with links back to you.

Publish a blog

This content and link building strategy is so popular and valuable that it’s one of the few recommended personally by the engineers at Google. Blogs have the unique ability to contribute fresh material on a consistent basis, generate conversations across the web, and earn listings and links from other blogs.

Careful, though — you should avoid low-quality guest posting just for the sake of link building. Google has advised against this and your energy is better spent elsewhere.

Create unique resources

Creating unique, high-quality resources is no easy task, but it’s well worth the effort. High-quality content that is promoted in the right ways can be widely shared, and it helps to create pieces with broad, reference-worthy appeal.

Creating a resource like this is a great way to attract a lot of links with one page. You could also create a highly specific resource, without as broad an appeal, that targets a handful of websites. You might see a higher rate of success with that approach, but it isn’t as scalable.

Users who see this kind of unique content often want to share it with friends, and bloggers/tech-savvy webmasters who see it will often do so through links. These high quality, editorially earned votes are invaluable to building trust, authority, and rankings potential.

Build resource pages

Resource pages are a great way to build links. However, to find them you’ll want to know some advanced Google search operators to make discovering them a bit easier.

For example, if you were doing link building for a company that made pots and pans, you could search for: cooking intitle:”resources” and see which pages might be good link targets.
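A few more query patterns in the same spirit (swap in your own topic and keywords):

cookware inurl:links
cookware intitle:"useful resources"
"pots and pans" intitle:"recommended sites"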

This can also give you great ideas for content creation — just think about which types of resources you could create that these pages would all like to reference/link to.

Get involved in your local community

For a local business (one that meets its customers in person), community outreach can result in some of the most valuable and influential links.

  • Engage in sponsorships and scholarships.
  • Host or participate in community events, seminars, workshops, and organizations.
  • Donate to worthy local causes and join local business associations.
  • Post jobs and offer internships.
  • Promote loyalty programs.
  • Run a local competition.
  • Develop real-world relationships with related local businesses to discover how you can team up to improve the health of your local economy.

All of these smart and authentic strategies provide good local link opportunities.

Refurbish top content

You likely already know which of your site’s content earns the most traffic, converts the most customers, or retains visitors for the longest amount of time.

Take that content and refurbish it for other platforms (Slideshare, YouTube, Instagram, Quora, etc.) to expand your acquisition funnel beyond Google.

You can also dust off, update, and simply republish older content on the same platform. If you discover that a few trusted industry websites all linked to a popular resource that’s gone stale, update it and let those industry websites know — you may just earn a good link.

You can also do this with images. Reach out to websites that are using your images and not citing/linking back to you and ask if they’d mind including a link.

Be newsworthy

Earning the attention of the press, bloggers, and news media is an effective, time-honored way to earn links. Sometimes this is as simple as giving something away for free, releasing a great new product, or stating something controversial. Since so much of SEO is about creating a digital representation of your brand in the real world, to succeed in SEO, you have to be a great brand.

Be personal and genuine

The most common mistake new SEOs make when trying to build links is not taking the time to craft a custom, personal, and valuable initial outreach email. You know as well as anyone how annoying spammy emails can be, so make sure yours doesn’t make people roll their eyes.

Your goal for an initial outreach email is simply to get a response. These tips can help:

  • Make it personal by mentioning something the person is working on, where they went to school, their dog, etc.
  • Provide value. Let them know about a broken link on their website or a page that isn’t working on mobile.
  • Keep it short.
  • Ask one simple question (typically not for a link; you’ll likely want to build a rapport first).

Pro Tip:

Earning links can be very resource-intensive, so you’ll likely want to measure your success to prove the value of those efforts.

Metrics for link building should match up with the site’s overall KPIs. These might be sales, email subscriptions, page views, etc. You should also evaluate Domain and/or Page Authority scores, the ranking of desired keywords, and the amount of traffic to your content — but we’ll talk more about measuring the success of your SEO campaigns in Chapter 7.

Beyond links: How awareness, amplification, and sentiment impact authority

A lot of the methods you’d use to build links will also indirectly build your brand. In fact, you can view link building as a great way to increase awareness of your brand, the topics on which you’re an authority, and the products or services you offer.

Once your target audience knows about you and you have valuable content to share, let your audience know about it! Sharing your content on social platforms will not only make your audience aware of your content, but it can also encourage them to amplify that awareness to their own networks, thereby extending your own reach.

Are social shares the same as links? No. But shares to the right people can result in links. Social shares can also promote an increase in traffic and new visitors to your website, which can grow brand awareness, and with a growth in brand awareness can come a growth in trust and links. The connection between social signals and rankings seems indirect, but even indirect correlations can be helpful for informing strategy.

Trustworthiness goes a long way

For search engines, trust is largely determined by the quality and quantity of the links your domain has earned, but that’s not to say that there aren’t other factors at play that can influence your site’s authority. Think about all the different ways you come to trust a brand:

  • Awareness (you know they exist)
  • Helpfulness (they provide answers to your questions)
  • Integrity (they do what they say they will)
  • Quality (their product or service provides value; possibly more than others you’ve tried)
  • Continued value (they continue to provide value even after you’ve gotten what you needed)
  • Voice (they communicate in unique, memorable ways)
  • Sentiment (others have good things to say about their experience with the brand)

That last point is what we’re going to focus on here. Reviews of your brand, its products, or its services can make or break a business.

In your effort to establish authority from reviews, follow these review rules of thumb:

  • Never pay any individual or agency to create a fake positive review for your business or a fake negative review of a competitor.
  • Don’t review your own business or the businesses of your competitors. Don’t have your staff do so either.
  • Never offer incentives of any kind in exchange for reviews.
  • All reviews must be left directly by customers in their own accounts; never post reviews on behalf of a customer or employ an agency to do so.
  • Don’t set up a review station/kiosk in your place of business; many reviews stemming from the same IP can be viewed as spam.
  • Read the guidelines of each review platform where you’re hoping to earn reviews.

Be aware that review spam is a problem that’s taken on global proportions, and that violation of governmental truth-in-advertising guidelines has led to legal prosecution and heavy fines. It’s just too dangerous to be worth it. Playing by the rules and offering exceptional customer experiences is the winning combination for building both trust and authority over time.

Authority is built when brands are doing great things in the real world, making customers happy, creating and sharing great content, and earning links from reputable sources.

In the next and final section, you’ll learn how to measure the success of all your efforts, as well as tactics for iterating and improving upon them. Onward!



Moz Blog



Full Funnel Testing: SEO & CRO Together – Whiteboard Friday

Posted by willcritchlow

Testing for only SEO or only CRO isn’t always ideal. Some changes result in higher conversions and reduced site traffic, for instance, while others may rank more highly but convert less well. In today’s Whiteboard Friday, we welcome Will Critchlow as he demonstrates a method of testing for both your top-of-funnel SEO changes and your conversion-focused CRO changes at once.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, everyone. Welcome to another Whiteboard Friday. My name is Will Critchlow, one of the founders at Distilled. If you’ve been following what I’ve been writing and talking about around the web recently, today’s topic may not surprise you that much. I’m going to be talking about another kind of SEO testing.

Over at Distilled, we’ve been investing pretty heavily in building out our capability to do SEO tests and in particular built our optimization delivery network, which has let us do a new kind of SEO testing that hasn’t been previously available to most of our clients. Recently we’ve been working on a new enhancement to this, which is full funnel testing, and that’s what I want to talk about today.

So full funnel testing is testing all the way through the funnel, from acquisition at the SEO end to conversion. In other words, it's SEO testing plus CRO testing together. I'm going to write a little bit more about some of the motivation for this, but in a nutshell, it boils down to the fact that it is perfectly possible (in fact, we've seen cases of it in the wild) for a test to win in SEO terms and lose in CRO terms, or vice versa.

In other words, tests where you make a change and it converts better, but you lose organic search traffic. Or the other way around: it ranks better, but it converts less well. If you're only testing one of these, which is common (most organizations are only testing the conversion rate side of things), it's perfectly possible to have a winning test, roll it out, and do worse.

CRO testing

So let’s step back a little bit. A little bit of a primer. Conversion rate optimization testing works in an A/B split kind of way. You can test on a single page, if you want to, or a site section. The way it works is you split your audience. So your audience is split. Some of your audience gets one version of the page, and the rest of the audience gets a different version.

Then you can compare the conversion rate among the group who got the control and the group who got the variant. That’s very straightforward. Like I say, it can happen on a single page or across an entire site. SEO testing, a little bit newer. The way this works is you can’t split the audience, because we care very much about the search engine spiders in this case. For the purposes of this consideration, there’s essentially only one Googlebot. So you couldn’t put Google in Class A or Class B here and expect to get anything meaningful.

SEO testing

So the way that we do an SEO test is we actually split the pages. To do this, you need a substantial site section. So imagine, for example, an e-commerce website with thousands of products. You might have a hypothesis of something that will help those product pages perform better. You take your hypothesis and you only apply it to some of the pages, and you leave some of the pages unchanged as a control.

Then, crucially, search engines and users see the same experience. There's no cloaking going on. There's no duplication of content. You simply change some pages and leave others unchanged. Then you apply statistical analysis to figure out whether the changed pages get significantly more organic search traffic than we think they would have gotten if we hadn't made the change. So that's how an SEO test works.
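To make that analysis a bit more tangible, here's a minimal sketch of the idea, comparing the change in organic entrances for variant pages against control pages. All page paths and numbers are hypothetical, and the real analysis Will describes uses more sophisticated forecasting methods than a simple rank-based test:

```python
# Rough sketch of evaluating an SEO split test: compare the pre/post change
# in organic entrances for variant pages vs. control pages.
# All paths and numbers are hypothetical.
from scipy import stats

def pct_change(before, after):
    """Relative change in organic entrances for one page."""
    return (after - before) / before

# {page: (entrances_before, entrances_after)} -- hypothetical data
control = {"/product-1": (120, 118), "/product-2": (90, 95), "/product-3": (200, 190)}
variant = {"/product-4": (110, 131), "/product-5": (80, 92), "/product-6": (150, 168)}

control_changes = [pct_change(*counts) for counts in control.values()]
variant_changes = [pct_change(*counts) for counts in variant.values()]

# Did the variant pages' traffic change more than the control pages' did?
stat, p_value = stats.mannwhitneyu(variant_changes, control_changes,
                                   alternative="greater")

avg_uplift = (sum(variant_changes) / len(variant_changes)
              - sum(control_changes) / len(control_changes))
print(f"Estimated uplift vs. control: {avg_uplift:+.1%} (p = {p_value:.3f})")
```

In practice you'd use many more pages and a proper forecast of what the pages "would have done," but the shape of the question is the same: did the variant group outperform its expected baseline?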

Now, as I said, the problem we're trying to tackle here is that, despite Google's best intentions to do what's right for users, it's perfectly plausible that you can have a test that ranks better but converts less well, or vice versa. We've seen this with, for example, removing content from a page. Sometimes having a cleaner, simpler page can convert better. But maybe that content was where the keywords were, and maybe it was helping the page rank. So we're trying to avoid those kinds of situations.

Full funnel testing

That’s where full funnel testing comes in. So I want to just run through how you run a full funnel test. What you do is you first of all set it up in the same way as an SEO test, because we’re essentially starting with SEO at the top of the funnel. So it’s set up exactly the same way.

Some pages are unchanged. Some pages get the hypothesis applied to them. As far as Google is concerned, that's the end of the story, because on any individual request to these pages, that's what we serve back. But the critically important thing here is my little character: a human in a browser who performs a search, "What do badgers eat?"

This was one of our silly examples that we came up with on one of our demo sites. The user lands on this page here, and we then set a cookie. From that point on, as they navigate around the site, no matter where they go within this site section, they get the same treatment, either the control or the variant, across the entire site section. This is more like the conversion rate test here.

Googlebot = stateless requests

So what I didn't show in this diagram is that if you were running this test across a site section, you would cookie this user and make sure that they always saw the same treatment no matter where they navigated around the site. Because Googlebot makes stateless requests, in other words just independent, one-off requests for each of these pages with no cookie set, Google sees the split.
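To make the serving mechanics concrete, here's a minimal sketch of how a server might implement this. It is not Distilled's actual optimization delivery network; the framework (Flask) and helper names are illustrative. Pages decide the treatment for cookieless, stateless requests like Googlebot's, while cookied human visitors stick to one treatment across the whole section:

```python
# Minimal sketch: page-level split for stateless requests (e.g., Googlebot),
# sticky cookie-based treatment for human visitors. Illustrative only.
import hashlib
from flask import Flask, request, make_response

app = Flask(__name__)

def page_bucket(path: str) -> str:
    """Deterministically assign each page to control or variant."""
    digest = hashlib.md5(path.encode()).hexdigest()
    return "variant" if int(digest, 16) % 2 else "control"

def render(path: str, treatment: str) -> str:
    # Placeholder for real template rendering.
    return f"<html><body>{path} rendered as {treatment}</body></html>"

@app.route("/<path:path>")
def serve(path):
    path = "/" + path
    cookie_treatment = request.cookies.get("seo_test_treatment")
    if cookie_treatment:
        # Returning human visitor: same treatment everywhere in the section.
        treatment = cookie_treatment
    else:
        # Stateless request (Googlebot) or a human's first pageview:
        # the page itself decides the treatment.
        treatment = page_bucket(path)

    resp = make_response(render(path, treatment))
    if not cookie_treatment:
        # Humans keep this cookie; Googlebot ignores cookies, so it keeps
        # seeing the clean page-level split.
        resp.set_cookie("seo_test_treatment", treatment)
    return resp
```

Because bots never send the cookie back, every Googlebot request falls through to the page-level bucket, which is exactly the clean split the SEO analysis needs.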

Evaluate SEO test on entrances

Users get whatever their first page impression looks like. They then get that treatment applied across the entire site section. So what we can do then is we can evaluate independently the performance in search, evaluate that on entrances. So do we get significantly more entrances to the variant pages than we would have expected if we hadn’t applied a hypothesis to them?

That tells us the uplift from an SEO perspective. So maybe we say, “Okay, this is plus 11% in organic traffic.” Well, great. So in a vacuum, all else being equal, we’d love to roll out this test.

Evaluate conversion rate on users

But before we do that, what we can do now is we can evaluate the conversion rate, and we do that based on user metrics. So these users are cookied.

We can also set an analytics tag on them and say, “Okay, wherever they navigate around, how many of them end up converting?” Then we can evaluate the conversion rate based on whether they saw treatment A or treatment B. Because we’re looking at conversion rate, the audience size doesn’t exactly have to be the same. So the statistical analysis can take care of that fact, and we can evaluate the conversion rate on a user-centric basis.

So then we maybe see that it's -5% in conversion rate. We then need to evaluate, "Is this something we should roll out?" Step 1 is: Do we just roll it out? If it's a win on both fronts, then the answer is probably yes. If they're in different directions, then there are a couple of things we can do. Firstly, we can weigh the relative performance in each direction, remembering that the conversion rate applies across all channels. A relatively small drop in conversion rate can therefore be a really big deal compared to even a decent uplift in organic traffic, because the conversion rate change affects all channels, not just your organic traffic.
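A quick back-of-the-envelope calculation shows why that weighting matters. The traffic and conversion figures below are hypothetical; only the +11% and -5% numbers come from the example above:

```python
# Back-of-the-envelope net impact of rolling the change out.
organic_sessions = 40_000   # monthly organic entrances (affected by the SEO uplift)
other_sessions = 60_000     # paid, direct, email, etc. (unaffected by SEO)
baseline_cvr = 0.02         # sitewide conversion rate before the change

seo_uplift = 0.11           # +11% organic entrances
cvr_change = -0.05          # -5% relative conversion rate, across ALL channels

before = (organic_sessions + other_sessions) * baseline_cvr
after = (organic_sessions * (1 + seo_uplift) + other_sessions) \
        * baseline_cvr * (1 + cvr_change)

print(f"Conversions before: {before:.0f}, after: {after:.0f} "
      f"({after / before - 1:+.1%})")
```

With these particular numbers the rollout is a net loss of just under 1%, even though the SEO test on its own looked like a clear win.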

But suppose that it’s a small net positive or a small net negative. What we can then do is we might get to the point that it’s a net positive and roll it out. Either way, we might then say, “What can we take from this? What can we actually learn?” So back to our example of the content. We might say, “You know what? Users like this cleaner version of the page with apparently less content on it. The search engines are clearly relying on that content to understand what this page is about. How do we get the best of both worlds?”

Well, that might be a question of a redesign, moving the layout of the page around a little bit, keeping the content on there, but maybe not putting it front and center to the user as they land right at the beginning. We can test those different things, run sequential tests, try and take the best of the SEO tests and the best of the CRO tests and get it working together and crucially avoid those situations where you think you’ve got a win, because your conversion rate is up, but you actually are about to crater your organic search performance.

We think that the more data-driven we get and the more accountable SEO testing makes us, the more important it's going to be to join these dots and make sure that we're getting true uplifts on a net basis when we combine them. So I hope that's been useful to some of you. Thank you for joining me on this week's Whiteboard Friday. I'm Will Critchlow from Distilled.

Take care.

Video transcription by Speechpad.com


The SEO Cyborg: How to Resonate with Users & Make Sense to Search Bots

Posted by alexis-sanders

SEO is about understanding how search bots and users react to an online experience. As search professionals, we’re required to bridge gaps between online experiences, search engine bots, and users. We need to know where to insert ourselves (or our teams) to ensure the best experience for both users and bots. In other words, we strive for experiences that resonate with humans and make sense to search engine bots.

This article seeks to answer the following questions:

  • How do we drive sustainable growth for our clients?
  • What are the building blocks of an organic search strategy?

What is the SEO cyborg?

A cyborg (or cybernetic organism) is defined as “a being with both organic and biomechatronic body parts, whose physical abilities are extended beyond normal human limitations by mechanical elements.”

With the ability to relate between humans, search bots, and our site experiences, the SEO cyborg is an SEO (or team) able to work seamlessly across both technical and content initiatives (whose skills are extended beyond normal human limitations) to drive organic search performance. An SEO cyborg is able to strategically pinpoint where to place organic search efforts to maximize performance.

So, how do we do this?

The SEO model

Like so many classic triads (think: primary colors, the Three Musketeers, Destiny’s Child [the canonical version, of course]) the traditional SEO model, known as the crawl-index-rank method, packages SEO into three distinct steps. At the same time, however, this model fails to capture the breadth of work that we SEOs are expected to do on a daily basis, and not having a functioning model can be limiting. We need to expand this model without reinventing the wheel.

The enhanced model involves adding in a rendering, signaling, and connection phase.

You might be wondering: why do we need these?

  • Rendering: There is increased prevalence of JavaScript, CSS, imagery, and personalization.
  • Signaling: HTML <link> tags, status codes, and even GSC signals are powerful indicators that tell search engines how to process and understand the page, determine its intent, and ultimately rank it. In the previous model, it didn’t feel as if these powerful elements really had a place.
  • Connecting: People are a critical component of search. The ultimate goal of search engines is to identify and rank content that resonates with people. In the previous model, “rank” felt cold, hierarchical, and indifferent towards the end user.

All of this brings us to the question: how do we find success in each stage of this model?

Note: When using this piece, I recommend skimming ahead and leveraging those sections of the enhanced model that are most applicable to your business’ current search program.

The enhanced SEO model

Crawling

Technical SEO starts with the search engine’s ability to find a site’s webpages (hopefully efficiently).

Finding pages

Initially finding pages can happen a few ways, via:

  • Links (internal or external)
  • Redirected pages
  • Sitemaps (XML, RSS 2.0, Atom 1.0, or .txt)

Side note: This information (although at first pretty straightforward) can be really useful. For example, if you’re seeing weird pages popping up in site crawls or performing in search, try checking:

  • Backlink reports
  • Internal links to URL
  • Redirected into URL

Obtaining resources

The second component of crawling relates to the ability to obtain resources (which later becomes critical for rendering a page’s experience).

This typically relates to two elements:

  1. Appropriate robots.txt declarations
  2. Proper HTTP status code (namely 200 HTTP status codes)
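As a quick spot check of both elements, something like the following works; the URLs are placeholders, and it leans on Python's built-in robots.txt parser plus the requests library:

```python
# Quick spot check: is a URL allowed by robots.txt, and does it return a 200?
# URLs are placeholders; swap in your own.
import requests
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"
URL = SITE + "/category/widgets"

robots = RobotFileParser()
robots.set_url(SITE + "/robots.txt")
robots.read()

allowed = robots.can_fetch("Googlebot", URL)
status = requests.get(URL, allow_redirects=False).status_code

print(f"Allowed by robots.txt: {allowed}")
print(f"HTTP status: {status} ({'OK' if status == 200 else 'worth investigating'})")
```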

Crawl efficiency

Finally, there’s the idea of how efficiently a search engine bot can traverse your site’s most critical experiences.

Action items:

  • Is the site’s main navigation simple, clear, and useful?
  • Are there relevant on-page links?
  • Is internal linking clear and crawlable (i.e., <a href="/">)?
  • Is an HTML sitemap available?
    • Side note: Make sure to check the HTML sitemap’s next page flow (or behavior flow reports) to find where those users are going. This may help to inform the main navigation.
  • Do footer links contain tertiary content?
  • Are important pages close to root?
  • Are there no crawl traps?
  • Are there no orphan pages? (see the spot-check sketch after this list)
  • Are pages consolidated?
  • Do all pages have purpose?
  • Has duplicate content been resolved?
  • Have redirects been consolidated?
  • Are canonical tags on point?
  • Are parameters well defined?
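A couple of the items on this checklist (orphan pages in particular) lend themselves to a quick, scripted spot check: compare the URLs in your XML sitemap against the URLs a crawler actually discovered by following links. This is a rough sketch; the sitemap URL, the crawl export file, and its "url" column are all placeholders for whatever your own tooling produces:

```python
# Rough orphan-page spot check: URLs in the XML sitemap that never showed up
# in a link-following crawl are candidates for orphans (no internal links).
import csv
import requests
from xml.etree import ElementTree

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url: str) -> set:
    tree = ElementTree.fromstring(requests.get(sitemap_url).content)
    return {loc.text.strip() for loc in tree.findall(".//sm:loc", NS)}

def crawled_urls(csv_path: str) -> set:
    # Assumes a crawl export with a "url" column (adjust to your crawler).
    with open(csv_path, newline="") as f:
        return {row["url"] for row in csv.DictReader(f)}

in_sitemap = sitemap_urls("https://www.example.com/sitemap.xml")
in_crawl = crawled_urls("crawl_export.csv")

orphan_candidates = in_sitemap - in_crawl
print(f"{len(orphan_candidates)} possible orphan pages")
for url in sorted(orphan_candidates)[:20]:
    print(" ", url)
```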

Information architecture

The organization of information extends past the bots, requiring an in-depth understanding of how users engage with a site.

Some seed questions to begin research include:

  • What trends appear in search volume (by location, device)? What are common questions users have?
  • Which pages get the most traffic?
  • What are common user journeys?
  • What are users’ traffic behaviors and flow?
  • How do users leverage site features (e.g., internal site search)?

Rendering

Rendering a page relates to search engines’ ability to capture the page’s desired essence.

JavaScript

The big kahuna in the rendering section is JavaScript. For Google, rendering of JavaScript occurs during a second wave of indexing and the content is queued and rendered as resources become available.

Image based off of Google I/O ’18 presentation by Tom Greenway and John Mueller, Deliver search-friendly JavaScript-powered websites

As an SEO, it’s critical that we be able to answer the question — are search engines rendering my content?

Action items:

  • Are direct “quotes” from content indexed?
  • Is the site using <a href="/"> links (not onclick() handlers)?
  • Is the same content being served to search engine bots (user-agent)?
  • Is the content present within the DOM?
  • What does Google’s Mobile-Friendly Testing Tool’s JavaScript console (click “view details”) say?
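One low-tech way to tackle the first and fourth questions above is to compare the raw HTML response (what a non-rendering crawler sees) with the rendered DOM. This sketch assumes the requests library and Playwright for headless rendering, though any headless browser would do; the URL and the quoted sentence are placeholders:

```python
# Sketch: is a "direct quote" from the page present in the raw HTML vs. only
# in the rendered DOM? URL and quote are placeholders.
import requests
from playwright.sync_api import sync_playwright

URL = "https://www.example.com/some-js-heavy-page"
QUOTE = "a distinctive sentence from the page's main content"

raw_html = requests.get(URL).text

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(URL)
    rendered_html = page.content()
    browser.close()

print("In raw HTML:     ", QUOTE in raw_html)
print("In rendered DOM: ", QUOTE in rendered_html)
# If the quote only appears after rendering, you're relying on Google's
# second wave of indexing to pick that content up.
```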

Infinite scroll and lazy loading

Another hot topic relating to JavaScript is infinite scroll (and lazy load for imagery). Since search engine bots are lazy users, they won’t scroll to attain content.

Action items:

Ask ourselves – should all of the content really be indexed? Is it content that provides value to users?

  • Infinite scroll: a user experience (and occasionally a performance optimizing) tactic to load content when the user hits a certain point in the UI; typically the content is exhaustive.

Solution one (updating AJAX):

1. Break out content into separate sections

  • Note: The breakout of pages can be /page-1, /page-2, etc.; however, it would be best to delineate meaningful divides (e.g., /voltron, /optimus-prime, etc.)

2. Implement History API (pushState(), replaceState()) to update URLs as a user scrolls (i.e., push/update the URL into the URL bar)

3. Add the <link> tag’s rel="next" and rel="prev" attributes to the relevant pages

Solution two (create a view-all page)
Note: This is not recommended for large amounts of content.

1. If it’s possible (i.e., there’s not a ton of content within the infinite scroll), create one page encompassing all content

2. Site latency/page load should be considered

  • Lazy load imagery is a web performance optimization tactic in which images load as the user scrolls (the idea is to save time by downloading images only when they’re needed)
  • Add <img> tags in <noscript> tags
  • Use JSON-LD structured data
    • Schema.org “image” attributes nested in appropriate item types
    • Schema.org ImageObject item type
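A quick sanity check for either pattern is to look at what a single, scroll-free request actually returns, since that's all a non-scrolling bot gets. The CSS selectors and URL below are placeholders for your own markup:

```python
# Sanity check: how much of an "infinite scroll" listing is present in one
# scroll-free HTML response? Selectors and URL are placeholders.
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/category/widgets"
soup = BeautifulSoup(requests.get(URL).text, "html.parser")

items = soup.select("li.product-card")        # items present without scrolling
noscript_imgs = soup.select("noscript img")   # lazy-load image fallbacks
next_link = soup.find("link", rel="next")     # paginated alternative to scrolling

print(f"Items in the initial HTML: {len(items)}")
print(f"<noscript> image fallbacks: {len(noscript_imgs)}")
print(f"rel='next' pagination link present: {bool(next_link)}")
```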

CSS

I only have a few elements relating to the rendering of CSS.

Action items:

  • CSS background images are not picked up in image search, so don’t rely on them for important imagery
  • CSS animations are not interpreted, so make sure to add surrounding textual content
  • Layouts for page are important (use responsive mobile layouts; avoid excessive ads)

Personalization

Although there’s a trend across broader digital marketing toward 1:1, people-based marketing, Google doesn’t save cookies across sessions and thus will not interpret personalization based on cookies, meaning there must be an average, base-user, default experience. The data from other digital channels can be exceptionally useful when building out audience segments and gaining a deeper understanding of the base user.

Action item:

  • Ensure there is a base-user, unauthenticated, default experience

Technology

Google’s rendering engine is leveraging Chrome 41. Canary (Chrome’s testing browser) is currently operating on Chrome 69. Using CanIUse.com, we can infer that this affects Google’s abilities relating to HTTP/2, service workers (think: PWAs), certain JavaScript, specific advanced image formats, resource hints, and new encoding methods. That said, this does not mean we shouldn’t progress our sites and experiences for users — we just must ensure that we use progressive development (i.e., there’s a fallback for less advanced browsers [and Google too ☺]).

Action items:

  • Ensure there’s a fallback for less advanced browsers

Indexing

Getting pages into Google’s databases is what indexing is all about. From what I’ve experienced, this process is straightforward for most sites.

Action items:

  • Ensure URLs are able to be crawled and rendered
  • Ensure nothing is preventing indexing (e.g., robots meta tag)
  • Submit sitemap in Google Search Console
  • Fetch as Google in Google Search Console

Signaling

A site should strive to send clear signals to search engines. Unnecessarily confusing search engines can significantly impact a site’s performance. Signaling relates to suggesting best representation and status of a page. All this means is that we’re ensuring the following elements are sending appropriate signals.

Action items:

  • <link> tag: This represents the relationship between documents in HTML.
    • Rel=”canonical”: This represents appreciably similar content.
      • Are canonicals a secondary solution to 301-redirecting experiences?
      • Are canonicals pointing to end-state URLs?
      • Is the content appreciably similar?
        • Since Google maintains prerogative over determining end-state URL, it’s important that the canonical tags represent duplicates (and/or duplicate content).
      • Are all canonicals in HTML?
      • Is there safeguarding against incorrect canonical tags?
    • Rel=”next” and rel=”prev”: These represent a collective series and are not considered duplicate content, which means that all URLs can be indexed. That said, typically the first page in the chain is the most authoritative, so usually it will be the one to rank.
    • Rel=”alternate”
      • media: typically used for separate mobile experiences.
      • hreflang: indicates the appropriate language/country
        • The hreflang attribute is quite unforgiving, and it’s very easy to make errors (a spot-check sketch follows this list).
        • Ensure the documentation is followed closely.
        • Check GSC International Target reports to ensure tags are populating.
  • HTTP status codes can also be signals, particularly the 304, 404, 410, and 503 status codes.
    • 304 – a valid page that simply hasn’t been modified
    • 404 – file not found
    • 410 – file not found (and it is gone, forever and always)
    • 503 – server maintenance

  • Google Search Console settings: Make sure the following reports are all sending clear signals. Occasionally Google decides to honor these signals.
    • International Targeting
    • URL Parameters
    • Data Highlighter
    • Remove URLs
    • Sitemaps
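Here's a rough spot check for two of the signals above: whether the canonical resolves to a 200 end-state URL, and whether hreflang alternates link back to the page (reciprocity). It's a sketch rather than a full validator, and the URL is a placeholder:

```python
# Spot check canonical and hreflang signals for one URL. Sketch only.
import requests
from bs4 import BeautifulSoup

def head_signals(url: str):
    """Return (canonical href, {hreflang: href}) for a page."""
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    alternates = {
        link.get("hreflang"): link.get("href")
        for link in soup.find_all("link", rel="alternate")
        if link.get("hreflang")
    }
    return (canonical.get("href") if canonical else None), alternates

URL = "https://www.example.com/en-gb/widgets"
canonical, alternates = head_signals(URL)

if canonical:
    status = requests.get(canonical, allow_redirects=False).status_code
    note = "" if status == 200 else "  <-- not an end-state URL"
    print(f"Canonical: {canonical} -> HTTP {status}{note}")

for lang, href in alternates.items():
    _, their_alternates = head_signals(href)
    reciprocal = URL in their_alternates.values()
    print(f"hreflang={lang}: {href} links back: {reciprocal}")
```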

Rank

Rank relates to how search engines arrange web experiences, stacking them against each other to see who ends up on top for each individual query (taking into account numerous data points surrounding the query).

Two critical questions recur often when understanding ranking pages:

  • Does or could your page have the best response?
  • Are you or could you become semantically known (on the Internet and in the minds of users) for the topics? (i.e., are you worthy of receiving links and people traversing the web to land on your experience?)

On-page optimizations

These are the elements webmasters control. Off-page is a critical component to achieving success in search; however, in an ideal world, we shouldn’t have to worry about links and/or mentions – they should come naturally.

Action items:

  • Textual content:
    • Make content both people and bots can understand
    • Answer questions directly
    • Write short, logical, simple sentences
    • Ensure subjects are clear (not to be inferred)
    • Create scannable content (i.e., make sure <h#> tags are an outline, use bullets/lists, use tables, charts, and visuals to delineate content, etc.)
    • Define any uncommon vocabulary or link to a glossary
  • Multimedia (images, videos, engaging elements):
    • Use imagery, videos, engaging content where applicable
    • Ensure that image optimization best practices are followed
  • Meta elements (<title> tags, meta descriptions, OGP, Twitter cards, etc.)
  • Structured data

Image courtesy of @abbynhamilton

  • Is content accessible?
    • Is there keyboard functionality?
    • Are there text alternatives for non-text media? Example:
      • Transcripts for audio
      • Images with alt text
      • In-text descriptions of visuals
    • Is there adequate color contrast?
    • Is text resizable?

Finding interesting content

Researching and identifying useful content happens in three formats:

  • Keyword and search landscape research
  • On-site analytic deep dives
  • User research

Visual modified from @smrvl via @DannyProl

Audience research

When looking for audiences, we need to concentrate on segments that represent a high percentage of our users (super high index rates are great, but not required). Push channels (particularly ones with strong targeting capabilities) do better with high index rates. This makes sense: we need to know that 80% of our customers have certain leanings (because we’re looking for the base case), not that five users over-index on a niche topic (those five niche-topic lovers are perfect for targeted ads).

Some seed research questions:

  • Who are users?
  • Where are they?
  • Why do they buy?
  • How do they buy?
  • What do they want?
  • Are they new or existing users?
  • What do they value?
  • What are their motivators?
  • What is their relationship w/ tech?
  • What do they do online?
  • Are users engaging with other brands?
    • Is there an opportunity for synergy?
  • What can we borrow from other channels?
    • Digital presents a wealth of data, in which 1:1, closed-loop, people-based marketing exists. Leverage any data you can get and find useful.

Content journey maps

All of this data can then go into creating a map of the user journey and overlaying relevant content. Below are a few types of mappings that are useful.

Illustrative user journey map

Sometimes when trying to process complex problems, it’s easier to break it down into smaller pieces. Illustrative user journeys can help with this problem! Take a single user’s journey and map it out, aligning relevant content experiences.

Funnel content mapping

This chart is deceptively simple; however, working through this graph can help sites to understand how each stage in the funnel affects users (note: the stages can be modified). This matrix can help with mapping who writers are talking to, their needs, and how to push them to the next stage in the funnel.

Content matrix

Mapping out content by intent and branding helps to visualize conversion potential. I find these extremely useful for prioritizing top-converting content initiatives (i.e., start with ensuring branded, transactional content is delivering the best experience, then move towards more generic, higher-funnel terms).

Overviews

Regardless of how the data is broken down, it’s vital to have a high-level view on the audience’s core attributes, opportunities to improve content, and strategy for closing the gap.

Connecting

Connecting is all about resonating with humans. Connecting is about understanding that customers are human (and we have certain constraints). Our mind is constantly filtering, managing, multitasking, processing, coordinating, organizing, and storing information. It is literally in our mind’s best interest to not remember 99% of the information and sensations that surround us (think of the lights, sounds, tangible objects, people surrounding you, and you’re still able to focus on reading the words on your screen — pretty incredible!).

To become psychologically sticky, we must:

  1. Get past the mind’s natural filter. A positive aspect of being a pull marketing channel is that individuals are already seeking out information, making it possible to intersect their user journey in a micro-moment.
  2. From there we must be memorable. The brain tends to hold onto what’s relevant, useful, or interesting. Luckily, the searcher’s interest is already piqued (even if they aren’t consciously aware of why they searched for a particular topic).

This means we have a unique opportunity to “be there” for people. This leads to a very simple, abstract philosophy: a great brand is like a great friend.

We have similar relationship stages, we interweave throughout each other’s lives, and we have the ability to impact happiness. This comes down to the question: Do your online customers use adjectives they would use for a friend to describe your brand?

Action items:

  • Is all content either relevant, useful, or interesting?
  • Does the content honor your user’s questions?
  • Does your brand have a personality that aligns with reality?
  • Are you treating users as you would a friend?
  • Do your users use friend-like adjectives to describe your brand and/or site?
  • Do the brand’s actions align with overarching goals?
  • Is your experience trust-inspiring?
  • Is the site served over HTTPS?
  • Are ads in the layout kept to a minimum?
  • Does the site have proof of claims?
  • Does the site use relevant reviews and testimonials?
  • Is contact information available and easily findable?
  • Is relevant information intuitively available to users?
  • Is it as easy to buy/subscribe as it is to return/cancel?
  • Is integrity visible throughout the entire conversion process and experience?
  • Does the site have a credible reputation across the web?

Ultimately, being able to strategically, seamlessly create compelling user experiences which make sense to bots is what the SEO cyborg is all about. ☺

tl;dr

  • Ensure site = crawlable, renderable, and indexable
  • Ensure all signals = clear, aligned
  • Answer related, semantically salient questions
  • Research keywords, the search landscape, site performance, and develop audience segments
  • Use audience segments to map content and prioritize initiatives
  • Ensure content is relevant, useful, or interesting
  • Treat users as friends; be worthy of their trust

This article is based on my MozCon talk (with a few slides from the Appendix pulled forward). The full deck is available on Slideshare, and the official videos can be purchased here. Please feel free to reach out with any questions in the comments below or via Twitter @AlexisKSanders.


Local Business Transparency & Empathy for the Holidays: Tips + Downloadable Checklist

Posted by MiriamEllis

Your local business will invest its all in stocking shelves and menus with the right goods and services in advance of the 2018 holiday season, but does your inventory include the on-and-offline experiences consumers say they want most?

Right now, a potential patron near you is having an experience that will inform their decision of whether to do business with you at year’s end, and their takeaway is largely hinging on two things: your brand’s transparency and empathy.

An excellent SproutSocial survey of 1,000 consumers found that people define transparency as being:

  • Open (59%)
  • Clear (53%)
  • Honest (49%)

Meanwhile, after a trying year of fake news, bad news, and privacy breaches, Americans could certainly use some empathy from brands that respect their rights, needs, aspirations, and time.

Today, let’s explore how your local brand can gift customers with both transparency and empathy before and during the holiday season, and let’s make it easy for your team with a shareable, downloadable checklist, complete with 20 tips for in-store excellence and holiday Google My Business best practices:

Grab the Holiday Checklist now!

For consumers, even the little things mean a lot

Your brother eats at that restaurant because its owner fed 10,000 meals to displaced residents during a wildfire. My sister won’t buy merchandise from that shop because their hiring practices are discriminatory. A friend was so amazed when the big brand CEO responded personally to her complaint that she’s telling all her social followers about it now.

Maybe it’s always been a national pastime for Americans to benefit one another with wisdom gained from their purchasing experiences. I own one of the first cookbooks ever published in this country and ‘tis full of wyse warnings about how to avoid “doctored” meats and grains in the marketplace. Social media has certainly amplified our voices, but it has done something else that truly does feel fresh and new. Consider SproutSocial’s findings that:

  • 86% of Americans say transparency from businesses is more important than ever before.
  • 40% of people who say brand transparency is more important than ever before attribute it to social media.
  • 63% of people say CEOs who have their own social profiles are better representatives for their companies than CEOs who do not.

What were customers’ chances of seeking redress and publicity just 20 years ago if a big brand treated them poorly? Today, they can document with video, write a review, tweet to the multitudes, even get picked up by national news. They can use a search engine to dig up the truth about a company’s past and present practices. And… they can find the social profiles of a growing number of brand representatives and speak to them directly about their experiences, putting the ball in the company’s court to respond for all to see.

In other words, people increasingly assume brands should be directly accessible. That’s new!

Should this increased expectation of interactive transparency terrify businesses?

Absolutely not, if their intentions and policies are open, clear, and honest. It’s a little thing to treat a customer with fairness and regard, but its impacts in the age of social media are not small. In fact, SproutSocial found that transparent practices are golden as far as consumer loyalty is concerned:

  • 85% of people say a business’ history of being transparent makes them more likely to give it a second chance after a bad experience.
  • 89% of people say a business can regain their trust if it admits to a mistake and is transparent about the steps it will take to resolve the issue.

I highly recommend reading the entire SproutSocial study, and while it focuses mainly on general brands and general social media, my read of it correlated again and again to the specific scenario of local businesses. Let’s talk about this!

How transparency & empathy relate to local brands

“73.8% of customers were either likely or extremely likely to continue to do business with a merchant once the complaint had been resolved.”
- GetFiveStars

On the local business scene, we’re also witnessing the rising trend of consumers who expect accountability and accessibility, and who speak up when they don’t encounter it. Local businesses need to commit to openness in terms of their business practices, just as digital businesses do, but there are some special nuances at play here, too.

I can’t count the number of negative reviews I’ve read that cited inconvenience caused by local business listings containing wrong addresses and incorrect hours. These reviewers have experienced a sense of ill-usage stemming from a perceived lack of respect for their busy schedules and a lack of brand concern for their well-being. Neglected online local business information leads to neglected-feeling customers who sometimes even believe that a company is hiding the truth from them!

These are avoidable outcomes. As the above quote from a GetFiveStars survey demonstrates, local brands that fully participate in anticipating, hearing, and responding to consumer needs are rewarded with loyalty. Given this, as we begin the countdown to holiday shopping, be sure you’re fostering basic transparency and empathy with simple steps like:

  • Checking your core citations for accurate names, addresses, phone numbers, and other info and making necessary corrections
  • Updating your local business listing hours to reflect extended holiday hours and closures
  • Updating your website and all local landing pages to reflect this information

Next, bolster more advanced transparency by:

  • Using Google Posts to clearly highlight your major sale dates so people don’t feel tricked or left out
  • Answering all consumer questions via Google Questions & Answers in your Google Knowledge Panels
  • Responding swiftly to both positive and negative reviews on core platforms
  • Monitoring and participating on all social discussion of your brand when concerns or complaints arise, letting customers know you are accessible
  • Posting in-store signage directing customers to complaint phone/text hotlines

And, finally, create an empathetic rapport with customers via efforts like:

  • Developing and publishing a consumer-centric service policy both on your website and in signage or print materials in all of your locations
  • Using Google My Business attributes to let patrons know about features like wheelchair accessibility, available parking, pet-friendliness, etc.
  • Publishing your company giving strategies so that customers can feel spending with you supports good things — for example, X% of sales going to a local homeless shelter, children’s hospital, or other worthy cause
  • Creating a true welcome for all patrons, regardless of gender, identity, race, creed, or culture — for example, gender neutral bathrooms, feeding stations for mothers, fragrance-free environments for the chemically sensitive, or even a few comfortable chairs for tired shoppers to rest in

A company commitment to standards like TAGFEE coupled with a basic regard for the rights, well-being, and aspirations of customers year-round can stand a local brand in very good stead at the holidays. Sometimes it’s the intangible goods a brand stocks — like goodwill towards one’s local community — that yield a brand of loyalty nothing else can buy.

Why not organize for it, for the mutual benefit of business and society, with a detailed, step-by-step checklist you can take to your next team meeting?

Download the 2018 Holiday Local SEO Checklist


How We More than Doubled Conversions & Leads for a New ICO [Case Study]

Posted by jkuria

Summary

We helped Repux generate 253% more leads, nearly 100% more token sales and millions of dollars in incremental revenue during their initial coin offering (ICO) by using our CRO expertise.

The optimized site also helped them get meetings with some of the biggest names in the venture capital community — a big feat for a Poland-based team without the pedigree typically required (no MIT, Stanford, Ivy League, Google, Facebook, Amazon, Microsoft background).

The details:

Repux is a marketplace that lets small and medium businesses sell anonymized data to developers. The developers use the data to build “artificially intelligent” apps, which they then sell back to businesses. Business owners and managers use the apps to make better business decisions.

Below is the original page, which linked to a dense whitepaper. We don’t know who decided that an ICO requires a long, dry whitepaper, but this seems to be the norm!


This page above suffers from several issues:

  • The headline is pretty meaningless (“Decentralized Data & Applications Protocol for SMEs”). Remember, as David Ogilvy noted, 90% of the success of an ad (in our case, a landing page) is determined by the headline. Visitors quickly scan the headline and, if it doesn’t hold their interest, bounce immediately. With so much content on the web, attention is scarce — the average time spent on a page is a few seconds and the average bounce rate is about 85%.
  • The call to action is “Get Whitelisted,” which is also meaningless. What’s in it for me? Why should I want to “Get Whitelisted”?
  • A lack of urgency to act. There is a compelling reason to do so, but it was not being clearly articulated (“Get 50% OFF on the tokens if you buy before a certain date.”)
  • Lack of “evidentials”: Evidentials are elements that lend credibility or reduce anxiety and include things like mentions in trusted publications, well-known investors or advisors, industry seals, association affiliations, specific numbers (e.g. 99% Net Promoter Score), and so on.
  • Too much jargon and arcane technical language: Our research using Mouseflow’s on-page feedback feature showed that the non-accredited-investor ICO audience isn’t sophisticated. They typically reside outside of the US and have a limited command of English. Most are younger men (18–35) who made money from speculative activities on the Internet (affiliate marketing, Adsense arbitrage, and of course other crypto-currencies). When we surveyed them, many did not initially understand the concept. In our winning page (below), we dumbed down things a lot!

Below is the new page that produced a 253% gain in leads (email opt-ins). Coupled with the email follow-up sequence shown below, it produced a nearly 100% gain in token sales.

Winning page (above the fold):

Here are few of the elements that we believe made a difference:

  • Much clearer headline (which we improved upon further in a subsequent treatment).
  • Simple explanation of what the company is doing
  • Urgency to buy now — get 50% off on tokens if you buy before the countdown timer expires
  • Solicited and used press mentions
  • Social proof from the Economist; tapping a meme can be powerful as it’s always easier to swim downstream than upstream. “Data is the new oil” is a current meme.

More persuasive elements (below the fold):

In the second span (the next screenful below the fold) we added a few more persuasive elements.

For one, we highlighted key Repux accomplishments and included bios of two advisors who are well known in the crypto-community.

Having a working platform was an important differentiator because only one in 10 ICOs had a working product. Most launched with just a whitepaper!

A survey of the token buyers showed that mentioning well-known advisors worked — several respondents said it was the decisive factor in persuading them to buy. Before, the advisors were buried in a little-visited page. We featured them more prominently.

Interestingly, this seemed to cut both ways. One of the non-contributors said he was initially interested because of a certain advisor’s involvement. He later chose not to contribute because he felt this advisor’s other flagship project had been mismanaged!

We also used 3 concrete examples to show how the marketplace functions and how the tokens would be used:

When your product is highly abstract and technical, using concrete examples aids understanding. We also found this to be true when pitching to professional investors. They often asked, “Can you give me an example of how this would work in the real world?”

We like long-form pages because unlike a live selling situation, there’s no opportunity for a back-and-forth conversation. The page must therefore overcorrect and address every objection a web visitor might have.

Lastly, we explained why Repux is likely to succeed. We quoted Victor Hugo for good measure, to create an air of inevitability:

How much impact did Victor Hugo have? I don’t know, but the page did much better overall. Our experience shows that radical redesigns (that change many page elements at the same time) produce higher conversion lifts.

Once you attain a large lift, if you like, you can then do isolation testing of specific variables to determine how much each change contributed.

13% lift: Simplified alternate page

The page below led to a further 13% lift.

The key elements we changed were:

  • Simplified the headline even further: “Repux Monetizes Data from Millions of Small Enterprises.” What was previously the headline is now stated in the bullet points.
  • Added a “5 Reasons Why Repux is Likely to Succeed” section: When you number things, visitors are more likely to engage with the content. They may not read all the text but will at least skim over the numbered sub-headlines to learn what all the points are — just like power abhors a vacuum, the mind can’t seem to stand incompleteness!

We’ve seen this in Mouseflow heatmaps. You can do this test yourself: list a bunch of bullet points versus a numbered list with a compelling headline: The 7 Reasons Why 20,000 Doctors Recommend Product X or The 3 Key Things You Need to Know to Make an Informed Decision.


Follow-up email sequence

We also created a follow-up email sequence for Repux that led to more token sales.


As you can see, the average open rate is north of 40%, and the goal attained (token sales) is above 8%. According to Mailchimp, the average email marketing campaign open rate is about 20%, while the average CTR is about 3%.

We got more sales than most people get clicks. Here’s a link to three sample emails we sent.

Our emails are effective because:

  • They’re educational (versus pure sales pitch). This is also important to avoid “burning out” your list. If all you do is send pitch after pitch, soon you’ll be lucky to get a 1.3% open rate!
  • They employ storytelling. We use a technique known as the “Soap Opera Sequence.” Each email creates anticipation for the next one and also refers to some interesting fact in previous ones. If a person would only have opened one email, they are now likely to want to open future ones as well as look up older ones to “solve the puzzle.” This leads to higher open rates for the entire sequence, and more sales.
  • The calls to action are closer to the bottom, having first built up some value. Counterintuitively, this works better, but you should always test radically different approaches.

Email is a massively underutilized medium. Most businesses are sitting on goldmines (their email list) without realizing it! You can — and should — make at least 2x to 3x as many sales from your email list as you do from direct website sales.

It takes a lot of work to write an effective sequence, but once you do you can run it on autopilot for years, making money hand over fist. As customer acquisition gets ever more competitive and expensive, how well you monetize your list can make the difference between success and failure.

Conclusion

To increase the conversion rate on your website and get more sales, leads, or app downloads, follow these simple steps:

  • Put in the work to understand why the non-converting visitors are leaving and then systematically address their specific objections. This is what “research-driven” optimization means, as opposed to redesign based purely on aesthetic appeal or “best practices.”
  • Find out why the converting visitors took the desired action — and then accentuate these things.
  • Capture emails and use a follow-up sequence to educate and tell stories to those who were not convinced by the website. Done correctly, this can produce 2x to 3x as many sales as the website.

Simple, but not easy. It takes diligence and discipline to do these things well. But if you do, you will be richly rewarded!

And if you’d like to learn more about conversion rate optimization or review additional case studies, we encourage you to take our free course.

Thanks to Jon Powell, Hayk Saakian, Vlad Mkrtumyan, and Nick Jordan for reading drafts of this post.


Internal Linking & Mobile First: Large Site Crawl Paths in 2018 & Beyond

Posted by Tom.Capper

By now, you’ve probably heard as much as you can bear about mobile first indexing. For me, there’s been one topic that’s been conspicuously missing from all this discussion, though, and that’s the impact on internal linking and previous internal linking best practices.

In the past, there have been a few popular methods for providing crawl paths for search engines — bulky main navigations, HTML sitemap-style pages that exist purely for internal linking, or blocks of links at the bottom of indexed pages. Larger sites have typically used at least two or often three of these methods. I’ll explain in this post why all of these are now looking pretty shaky, and what I suggest you do about it.

Quick refresher: WTF are “internal linking” & “mobile-first,” Tom?

Internal linking is and always has been a vital component of SEO — it’s easy to forget in all the noise about external link building that some of our most powerful tools to affect the link graph are right under our noses. If you’re looking to brush up on internal linking in general, it’s a topic that gets pretty complex pretty quickly, but there are a couple of resources I can recommend to get started:

I’ve also written in the past that links may be mattering less and less as a ranking factor for the most competitive terms, and though that may be true, they’re still the primary way you qualify for that competition.

A great example I’ve seen recently of what happens if you don’t have comprehensive internal linking is eflorist.co.uk. (Disclaimer: eFlorist is not a client or prospective client of Distilled, nor are any other sites mentioned in this post)

eFlorist has local landing pages for all sorts of locations, targeting queries like “Flower delivery in [town].” However, even though these pages are indexed, they’re not linked to internally. As a result, if you search for something like “flower delivery in London,” despite eFlorist having a page targeted at this specific query (which can be found pretty much only through use of advanced search operators), they end up ranking on page 2 with their “flowers under £30” category page:

¯\_(ツ)_/¯

If you’re looking for a reminder of what mobile-first indexing is and why it matters, these are a couple of good posts to bring you up to speed:

In short, though, Google is increasingly looking at pages as they appear on mobile for all the things it was previously using desktop pages for — namely, establishing ranking factors, the link graph, and SEO directives. You may well have already seen an alert from Google Search Console telling you your site has been moved over to primarily mobile indexing, but if not, it’s likely not far off.

Get to the point: What am I doing wrong?

If you have more than a handful of landing pages on your site, you’ve probably given some thought in the past to how Google can find them and how to make sure they get a good chunk of your site’s link equity. A rule of thumb often used by SEOs is how many clicks a landing page is from the homepage, also known as “crawl depth.”
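Crawl depth is easy to compute once you have internal link data: it's just a breadth-first search from the homepage. The link graph below is a hypothetical stand-in for what you'd actually build from a crawl export:

```python
# Crawl depth = shortest click path from the homepage, i.e. a breadth-first
# search over the internal link graph. Hypothetical graph for illustration.
from collections import deque

links = {
    "/": ["/flights", "/hotels"],
    "/flights": ["/flights/london", "/flights/australia"],
    "/hotels": ["/hotels/london"],
    "/flights/london": ["/flights/london/heathrow"],
}

def crawl_depths(start: str = "/") -> dict:
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:        # first time this page is reached
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

for page, depth in sorted(crawl_depths().items(), key=lambda kv: kv[1]):
    print(f"{depth}  {page}")
# Any landing page missing from this output has no internal crawl path at all.
```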

Mobile-first indexing impacts this on two fronts:

  1. Some of your links aren’t present on mobile (as is common), so your internal linking simply won’t work in a world where Google is going primarily with the mobile-version of your page
  2. If your links are visible on mobile, they may be hideous or overwhelming to users, given the reduced on-screen real estate vs. desktop

If you don’t believe me on the first point, check out this Twitter conversation between Will Critchlow and John Mueller:

In particular, that section I’ve underlined in red should be of concern — it’s unclear how much time we have, but sooner or later, if your internal linking on the mobile version of your site doesn’t cut it from an SEO perspective, neither does your site.

And for the links that do remain visible, an internal linking structure that can be rationalized on desktop can quickly look overbearing on mobile. Check out this example from Expedia.co.uk’s “flights to London” landing page:

Many of these links are part of the site-wide footer, but they vary according to what page you’re on. For example, on the “flights to Australia” page, you get different links, allowing a tree-like structure of internal linking. This is a common tactic for larger sites.

In this example, there’s more unstructured linking both above and below the section screenshotted. For what it’s worth, although it isn’t pretty, I don’t think this is terrible, but it’s also not the sort of thing I can be particularly proud of when I go to explain to a client’s UX team why I’ve asked them to ruin their beautiful page design for SEO reasons.

I mentioned earlier that there are three main methods of establishing crawl paths on large sites: bulky main navigations, HTML-sitemap-style pages that exist purely for internal linking, or blocks of links at the bottom of indexed pages. I’ll now go through these in turn, and take a look at where they stand in 2018.

1. Bulky main navigations: Fail to scale

The most extreme example I was able to find of this is from Monoprice.com, with a huge 711 links in the sitewide top-nav:

Here’s how it looks on mobile:

This is actually fairly usable, but you have to consider the implications of having this many links on every page of your site — this isn’t going to concentrate equity where you need it most. In addition, you’re potentially asking customers to do a lot of work in terms of finding their way around such a comprehensive navigation.

I don’t think mobile-first indexing changes the picture here much; it’s more that this was never the answer in the first place for sites above a certain size. Many sites have tens of thousands (or more), not hundreds of landing pages to worry about. So simply using the main navigation is not a realistic option, let alone an optimal option, for creating crawl paths and distributing equity in a proportionate or targeted way.

2. HTML sitemaps: Ruined by the counterintuitive equivalence of noindex,follow & noindex,nofollow

This is a slightly less common technique these days, but still used reasonably widely. Take this example from Auto Trader UK:

This page isn’t mobile-friendly, although that doesn’t necessarily matter, as it isn’t supposed to be a landing page. The idea is that this page is linked to from Auto Trader’s footer, and allows link equity to flow through into deeper parts of the site.

However, there’s a complication: this page would, in an ideal world, be “noindex,follow.” It turns out, though, that over time, Google ends up treating “noindex,follow” like “noindex,nofollow.” It’s not 100% clear what John Mueller meant by this, but it does make sense that given the low crawl priority of “noindex” pages, Google could eventually stop crawling them altogether, causing them to behave in effect like “noindex,nofollow.” Anecdotally, this is also how third-party crawlers like Moz and Majestic behave, and it’s how I’ve seen Google behave with test pages on my personal site.

That means that at best, Google won’t discover new links you add to your HTML sitemaps, and at worst, it won’t pass equity through them either. The jury is still out on this worst case scenario, but it’s not an ideal situation in either case.

So, you have to index your HTML sitemaps. For a large site, this means you’re indexing potentially dozens or hundreds of pages that are just lists of links. It is a viable option, but if you care about the quality and quantity of pages you’re allowing into Google’s index, it might not be an option you’re so keen on.

3. Link blocks on landing pages: Good, bad, and ugly, all at the same time

I already mentioned that example from Expedia above, but here’s another extreme example from the Kayak.co.uk homepage:

Example 1

Example 2

It’s no coincidence that both these sites come from the travel search vertical, where having to sustain a massive number of indexed pages is a major challenge. Just like their competitor, Kayak have perhaps gone overboard in the sheer quantity here, but they’ve taken it an interesting step further — notice that the links are hidden behind dropdowns.

This is something that was mentioned in the post from Bridget Randolph I mentioned above, and I agree so much I’m just going to quote her verbatim:

Note that with mobile-first indexing, content which is collapsed or hidden in tabs, etc. due to space limitations will not be treated differently than visible content (as it may have been previously), since this type of screen real estate management is actually a mobile best practice.

Combined with a more sensible quantity of internal linking, and taking advantage of the significant height of many mobile landing pages (i.e., this needn’t be visible above the fold), this is probably the most broadly applicable method for deep internal linking at your disposal going forward. As always, though, we need to be careful as SEOs not to see a working tactic and rush to push it to its limits — usability and moderation are still important, just as with overburdened main navigations.

Summary: Bite the on-page linking bullet, but present it well

Overall, the most scalable method for getting large numbers of pages crawled, indexed, and ranking on your site is going to be on-page linking — simply because you already have a large number of pages to place the links on, and in all likelihood a natural “tree” structure, by very nature of the problem.

Top navigations and HTML sitemaps have their place, but lack the scalability or finesse to deal with this situation, especially given what we now know about Google’s treatment of “noindex,follow” tags.

However, the more we emphasize mobile experience, while simultaneously relying on this method, the more we need to be careful about how we present it. In the past, as SEOs, we might have been fairly nervous about placing on-page links behind tabs or dropdowns, just because it felt like deceiving Google. And on desktop, that might be true, but on mobile, this is increasingly going to become best practice, and we have to trust Google to understand that.

All that said, I’d love to hear your strategies for grappling with this — let me know in the comments below!


NEW On-Demand Crawl: Quick Insights for Sales, Prospecting, & Competitive Analysis

Posted by Dr-Pete

In June of 2017, Moz launched our entirely rebuilt Site Crawl, helping you dive deep into crawl issues and technical SEO problems, fix those issues in your Moz Pro Campaigns (tracked websites), and monitor weekly for new issues. Many times, though, you need quick insights outside of a Campaign context, whether you’re analyzing a prospect site before a sales call or trying to assess the competition.

For years, Moz had a lab tool called Crawl Test. The bad news is that Crawl Test never made it to prime-time and suffered from some neglect. The good news is that I’m happy to announce the full launch (as of August 2018) of On-Demand Crawl, an entirely new crawl tool built on the engine that powers Site Crawl, but with a UI designed around quick insights for prospecting and competitive analysis.

While you don’t need a Campaign to run a crawl, you do need to be logged into your Moz Pro subscription. If you don’t have a subscription, you can sign up for a free trial and give it a whirl.

How can you put On-Demand Crawl to work? Let’s walk through a short example together.


All you need is a domain

Getting started is easy. From the “Moz Pro” menu, find “On-Demand Crawl” under “Research Tools”:

Just enter a root domain or subdomain in the box at the top and click the blue button to kick off a crawl. While I don’t want to pick on anyone, I’ve decided to use a real site. Our recent analysis of the August 1st Google update identified some sites that were hit hard, and I’ve picked one (lilluna.com) from that list.

Please note that Moz is not affiliated with Lil’ Luna in any way. For the most part, it seems to be a decent site with reasonably good content. Let’s pretend, just for this post, that you’re looking to help this site out and determine if they’d be a good fit for your SEO services. You’ve got a call scheduled and need to spot-check for any major problems so that you can go into that call as informed as possible.

On-Demand Crawls aren’t instantaneous (crawling is a big job), but they’ll generally take anywhere from a few minutes to an hour. We know these are time-sensitive situations. You’ll soon receive an email that looks like this:

The email includes the number of URLs crawled (On-Demand will currently crawl up to 3,000 URLs), the total issues found, and a summary table of crawl issues by category. Click on the [View Report] link to dive into the full crawl data.


Assess critical issues quickly

We’ve designed On-Demand Crawl to assist your own human intelligence. You’ll see some basic stats at the top, but then immediately move into a graph of your top issues by count. The graph only displays issues that occur at least once on your site – you can click “See More” to show all of the issues that On-Demand Crawl tracks (the top two bars have been truncated)…

Issues are also color-coded by category. Some items are warnings, and whether they matter depends a lot on context. Other issues, like “Critical Errors” (in red), almost always demand attention. So, let’s check out those 404 errors. Scroll down and you’ll see a list of “Pages Crawled” with filters. You’re going to select “4xx” in the “Status Codes” dropdown…

You can then pretty easily spot-check these URLs and find out that they do, in fact, seem to be returning 404 errors. Some appear to be legitimate content that has either internal or external links (or both). So, within a few minutes, you’ve already found something useful.
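If you'd like to double-check a handful of the flagged URLs yourself, outside of the report, a status-code spot check is easy to script. Here's a generic sketch of that check (it's not a feature of On-Demand Crawl, and the example paths are hypothetical):

```python
# Generic spot check (not part of the Moz tool): confirm that flagged URLs
# really return 4xx status codes. Assumes the third-party `requests` package;
# the example paths below are hypothetical.
import requests

suspect_urls = [
    "https://lilluna.com/some-missing-recipe/",
    "https://lilluna.com/another-missing-page/",
]

for url in suspect_urls:
    try:
        # HEAD keeps the check lightweight; fall back to GET if a server rejects HEAD.
        response = requests.head(url, allow_redirects=True, timeout=10)
        label = "4xx" if 400 <= response.status_code < 500 else "ok"
        print(f"{response.status_code}  {label}  {url}")
    except requests.RequestException as exc:
        print(f"error  {url}  ({exc})")
```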

Let’s look at those yellow “Meta Noindex” errors next. This is a tricky one, because you can’t easily determine intent. An intentional Meta Noindex may be fine. An unintentional one (or hundreds of unintentional ones) could be blocking crawlers and causing serious harm. Here, you’ll filter by issue type…

Like the top graph, issues appear in order of prevalence. You can also filter by all pages that have issues (any issues) or pages that have no issues. Here’s a sample of what you get back (the full table also includes status code, issue count, and an option to view all issues)…

Notice the “?s=” common to all of these URLs. Clicking on a few, you can see that these are internal search pages. These URLs have no particular SEO value, and the Meta Noindex is likely intentional. Good technical SEO is also about avoiding false alarms because you lack internal knowledge of a site. On-Demand Crawl helps you semi-automate and summarize insights to put your human intelligence to work quickly.
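If you want to confirm that pattern page by page, a small script can fetch a URL, look for a robots noindex directive, and check whether the URL matches the internal search pattern. This is a generic sketch under a few assumptions: it uses the `requests` and `beautifulsoup4` packages, the “?s=” check is specific to this site, and the example URL is hypothetical.

```python
# Generic sketch: does a page carry a robots "noindex" directive, and does its
# URL look like an internal search page (which would suggest the noindex is
# intentional)? Assumes `requests` and `beautifulsoup4`; the URL is hypothetical.
import requests
from bs4 import BeautifulSoup

def check_noindex(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    has_noindex = bool(robots and "noindex" in robots.get("content", "").lower())
    looks_like_search = "?s=" in url  # this site's internal search parameter
    return has_noindex, looks_like_search

noindex, is_search = check_noindex("https://lilluna.com/?s=chicken")
print(f"noindex: {noindex}, internal search page: {is_search}")
```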


Dive deeper with exports

Let’s go back to those 404s. Ideally, you’d like to know where those URLs are showing up. We can’t fit everything into one screen, but if you scroll up to the “All Issues” graph you’ll see an “Export CSV” option…

The export will honor any filters set in the page list, so let’s re-apply that “4xx” filter and pull the data. Your export should download almost immediately. The full export contains a wealth of information, but I’ve zeroed in on just what’s critical for this particular case…

Now you know not only which pages are missing, but exactly where they’re linked from internally, and you can easily pass along suggested fixes to the customer or prospect. Some of these turn out to be link-heavy pages that could probably benefit from some clean-up or updating (if newer recipes are a good fit).
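If you'd rather slice the export programmatically, a few lines of scripting will group the broken targets by the pages that link to them. This is a rough sketch; the column names ("URL", "Status Code", "Referring URL") are assumptions, so check them against the headers in your own export file.

```python
# Rough sketch for slicing the CSV export: group 4xx target URLs by the pages
# that link to them internally. The column names used here are assumptions;
# check them against the headers in your own export file.
import csv
from collections import defaultdict

broken_by_source = defaultdict(list)

with open("on-demand-crawl-export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        status = int(row.get("Status Code") or 0)
        if 400 <= status < 500:
            broken_by_source[row.get("Referring URL", "unknown")].append(row.get("URL", ""))

# Pages with the most broken internal links are the best clean-up candidates.
for source, targets in sorted(broken_by_source.items(), key=lambda kv: -len(kv[1])):
    print(f"{len(targets):>3} broken links found on {source}")
```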

Let’s try another one. You’ve got 8 duplicate content errors. Potentially thin content could fit theories about the August 1st update, so this is worth digging into. If you filter by “Duplicate Content” issues, you’ll see the following message…

The 8 duplicate issues actually represent 18 pages, and the table returns all 18 affected pages. In some cases, the duplicates will be obvious from the title and/or URL, but in this case there’s a bit of mystery, so let’s pull that export file. In this case, there’s a column called “Duplicate Content Group,” and sorting by it reveals something like the following (there’s a lot more data in the original export file)…

I’ve renamed “Duplicate Content Group” to just “Group” and included the word count (“Words”), which could be useful for verifying true duplicates. Look at group #7 – it turns out that these “Weekly Menu Plan” pages are very image-heavy and have a common block of text before any unique text. While not 100% duplicated, these otherwise valuable pages could easily look like thin content to Google and represent a broader problem.
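If you'd rather eyeball the groups outside the spreadsheet, you can sort the same export by that group column and word count with another rough sketch. “Duplicate Content Group” is the column named above, while the “URL” and “Words” names are assumptions to verify against your own export.

```python
# Rough sketch: list the export's pages by duplicate-content group and word count.
# "Duplicate Content Group" is the column mentioned above; the "URL" and "Words"
# column names are assumptions to verify against your own export.
import csv
from collections import defaultdict

groups = defaultdict(list)

with open("on-demand-crawl-export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        group = (row.get("Duplicate Content Group") or "").strip()
        if group:
            groups[group].append((row.get("URL", ""), row.get("Words", "0")))

for group_id, pages in sorted(groups.items()):
    print(f"Group {group_id}:")
    for url, words in pages:
        # Similar word counts within a group suggest near-duplicates; very low
        # counts suggest thin content worth a closer look.
        print(f"  {words:>6} words  {url}")
```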


Real insights in real-time

Not counting the time spent writing this post, running the crawl and digging into the results took less than an hour, and even that short session uncovered more potential issues than I could cover here. In less than an hour, you can walk into a client meeting or sales call with in-depth knowledge of any domain.

Keep in mind that many of these features also exist in our Site Crawl tool. If you’re looking for long-term, campaign insights, use Site Crawl (if you just need to update your data, use our “Recrawl” feature). If you’re looking for quick, one-time insights, check out On-Demand Crawl. Standard Pro users currently get 5 On-Demand Crawls per month (with limits increasing at higher tiers).

Your On-Demand Crawls are currently stored for 90 days. When you re-enter the feature, you’ll see a table of all of your recent crawls (the image below has been truncated):

Click on any row to go back to see the crawl data for that domain. If you get the sale and decide to move forward, congratulations! You can port that domain directly into a Moz campaign.

We hope you’ll try On-Demand Crawl out and let us know what you think. We’d love to hear your case studies, whether it’s sales, competitive analysis, or just trying to solve the mysteries of a Google update.
