
Rewriting the Beginner’s Guide to SEO, Chapter 5: Technical Optimization

Posted by BritneyMuller

After a short break, we’re back to share our working draft of Chapter 5 of the Beginner’s Guide to SEO with you! This one was a whopper, and we’re really looking forward to your input. Giving beginner SEOs a solid grasp of just what technical optimization for SEO is and why it matters — without overwhelming them or scaring them off the subject — is a tall order indeed. We’d love to hear what you think: did we miss anything you think is important for beginners to know? Leave us your feedback in the comments!

And in case you’re curious, check back on our outline, Chapter One, Chapter Two, Chapter Three, and Chapter Four to see what we’ve covered so far.


Chapter 5: Technical Optimization

Basic technical knowledge will help you optimize your site for search engines and establish credibility with developers.

Now that you’ve crafted valuable content on the foundation of solid keyword research, it’s important to make sure it’s not only readable by humans, but by search engines too!

You don’t need to have a deep technical understanding of these concepts, but it is important to grasp what these technical assets do so that you can speak intelligently about them with developers. Speaking your developers’ language is important because you will likely need them to carry out some of your optimizations. They’re unlikely to prioritize your asks if they can’t understand your request or see its importance. When you establish credibility and trust with your devs, you can begin to tear away the red tape that often blocks crucial work from getting done.

Pro tip: SEOs need cross-team support to be effective

It’s vital to have a healthy relationship with your developers so that you can successfully tackle SEO challenges from both sides. Don’t wait until a technical issue causes negative SEO ramifications to involve a developer. Instead, join forces for the planning stage with the goal of avoiding the issues altogether. If you don’t, it can cost you in time and money later.

Beyond cross-team support, understanding technical optimization for SEO is essential if you want to ensure that your web pages are structured for both humans and crawlers. To that end, we’ve divided this chapter into three sections:

  1. How websites work
  2. How search engines understand websites
  3. How users interact with websites

Since the technical structure of a site can have a massive impact on its performance, it’s crucial for everyone to understand these principles. It might also be a good idea to share this part of the guide with your programmers, content writers, and designers so that all parties involved in a site’s construction are on the same page.

1. How websites work

If search engine optimization is the process of optimizing a website for search, SEOs need at least a basic understanding of the thing they’re optimizing!

Below, we outline the website’s journey from domain name purchase all the way to its fully rendered state in a browser. An important component of the website’s journey is the critical rendering path, which is the process of a browser turning a website’s code into a viewable page.

Understanding this journey is important for SEOs for a few reasons:

  • The steps in this webpage assembly process can affect page load times, and speed is not only important for keeping users on your site, but it’s also one of Google’s ranking factors.
  • Google renders certain resources, like JavaScript, on a “second pass.” Google will look at the page without JavaScript first, then a few days to a few weeks later, it will render JavaScript, meaning SEO-critical elements that are added to the page using JavaScript might not get indexed.

Imagine that the website loading process is your commute to work. You get ready at home, gather your things to bring to the office, and then take the fastest route from your home to your work. It would be silly to put on just one of your shoes, take a longer route to work, drop your things off at the office, then immediately return home to get your other shoe, right? That’s sort of what inefficient websites do. This chapter will teach you how to diagnose where your website might be inefficient, what you can do to streamline, and the positive ramifications on your rankings and user experience that can result from that streamlining.

Before a website can be accessed, it needs to be set up!

  1. Domain name is purchased. Domain names like moz.com are purchased from a domain name registrar such as GoDaddy or HostGator. These registrars are just organizations that manage the reservations of domain names.
  2. Domain name is linked to IP address. The Internet doesn’t understand names like “moz.com” as website addresses without the help of domain name servers (DNS). The Internet uses a series of numbers called an Internet protocol (IP) address (ex: 127.0.0.1), but we want to use names like moz.com because they’re easier for humans to remember. We need to use a DNS to link those human-readable names with machine-readable numbers.

How a website gets from server to browser

  1. User requests domain. Now that the name is linked to an IP address via DNS, people can request a website by typing the domain name directly into their browser or by clicking on a link to the website.
  2. Browser makes requests. That request for a web page prompts the browser to make a DNS lookup request to convert the domain name to its IP address. The browser then makes a request to the server for the code your web page is constructed with, such as HTML, CSS, and JavaScript.
  3. Server sends resources. Once the server receives the request for the website, it sends the website files to be assembled in the searcher’s browser.
  4. Browser assembles the web page. The browser has now received the resources from the server, but it still needs to put them all together and render the web page so that the user can see it in their browser. As the browser parses and organizes all the web page’s resources, it’s creating a Document Object Model (DOM). The DOM is what you can see when you right click + “inspect element” on a web page in your Chrome browser (learn how to inspect elements in other browsers).
  5. Browser makes final requests. The browser will only show a web page after all the page’s necessary code is downloaded, parsed, and executed, so at this point, if the browser needs any additional code in order to show your website, it will make an additional request from your server.
  6. Website appears in browser. Whew! After all that, your website has now been transformed (rendered) from code to what you see in your browser.

Pro tip: Talk to your developers about async!

Something you can bring up with your developers is shortening the critical rendering path by setting scripts to “async” when they’re not needed to render content above the fold, which can make your web pages load faster. Async tells the DOM that it can continue to be assembled while the browser is fetching the scripts needed to display your web page. If the DOM has to pause assembly every time the browser fetches a script (called “render-blocking scripts”), it can substantially slow down your page load.

It would be like going out to eat with your friends and having to pause the conversation every time one of you went up to the counter to order, only resuming once they got back. With async, you and your friends can continue to chat even when one of you is ordering. You might also want to bring up other optimizations that devs can implement to shorten the critical rendering path, such as removing unnecessary scripts entirely, like old tracking scripts.
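 
To make that conversation with your developers concrete, here’s a minimal sketch of how scripts can be marked up in the HTML (the file names are hypothetical). The related “defer” attribute, which also downloads in parallel but waits to execute until parsing finishes, is worth asking about too:

<head>
  <!-- Render-blocking: HTML parsing pauses while this downloads and runs -->
  <script src="critical-layout.js"></script>
  <!-- Async: downloads in parallel and runs as soon as it arrives -->
  <script async src="analytics.js"></script>
  <!-- Defer: downloads in parallel, but waits to run until parsing is done -->
  <script defer src="below-the-fold-widget.js"></script>
</head>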

Now that you know how a website appears in a browser, we’re going to focus on what a website is made of — in other words, the code (programming languages) used to construct those web pages.

The three most common are:

  • HTML – What a website says (titles, body content, etc.)
  • CSS – How a website looks (color, fonts, etc.)
  • JavaScript – How it behaves (interactive, dynamic, etc.)

HTML: What a website says

HTML stands for hypertext markup language, and it serves as the backbone of a website. Elements like headings, paragraphs, lists, and content are all defined in the HTML.

Here’s an example of a webpage, and what its corresponding HTML looks like:
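 
As a minimal, hypothetical illustration, a simple page’s HTML might look like this:

<!DOCTYPE html>
<html>
  <head>
    <title>Chocolate Pie Recipe</title>
  </head>
  <body>
    <h1>Easy Chocolate Pie</h1>
    <p>This rich, five-ingredient chocolate pie comes together in an hour.</p>
    <ul>
      <li>Step one: Make the crust</li>
      <li>Step two: Make the filling</li>
    </ul>
  </body>
</html>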

HTML is important for SEOs to know because it’s what lives “under the hood” of any page they create or work on. While your CMS likely doesn’t require you to write your pages in HTML (ex: selecting “hyperlink” will allow you to create a link without you having to type in “a href=”), it is what you’re modifying every time you do something to a web page such as adding content, changing the anchor text of internal links, and so on. Google crawls these HTML elements to determine how relevant your document is to a particular query. In other words, what’s in your HTML plays a huge role in how your web page ranks in Google organic search!

CSS: How a website looks

CSS stands for cascading style sheets, and this is what causes your web pages to take on certain fonts, colors, and layouts. HTML was created to describe content, rather than to style it, so when CSS entered the scene, it was a game-changer. With CSS, web pages could be “beautified” without requiring manual coding of styles into the HTML of every page — a cumbersome process, especially for large sites.
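 
As a rough sketch (the file name and style rules are hypothetical), a page opts into an external stylesheet with a single line in its head, and every page that references that file picks up the same styles:

<head>
  <!-- One external stylesheet can style every page that links to it -->
  <link rel="stylesheet" href="/styles.css">
</head>

<!-- The hypothetical styles.css file might contain rules like:
     h1 { font-family: Georgia, serif; color: #333333; }
     p  { font-size: 16px; line-height: 1.5; }
-->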

It wasn’t until 2014 that Google’s indexing system began to render web pages more like an actual browser, as opposed to a text-only browser. A black-hat SEO practice that tried to capitalize on Google’s older indexing system was hiding text and links via CSS for the purpose of manipulating search engine rankings. This “hidden text and links” practice is a violation of Google’s quality guidelines.

Components of CSS that SEOs, in particular, should care about:

  • Since style directives can live in external stylesheet files (CSS files) instead of your page’s HTML, it makes your page less code-heavy, reducing file transfer size and making load times faster.
  • Browsers still have to download resources like your CSS file, so compressing them can make your web pages load faster, and page speed is a ranking factor.
  • Having your pages be more content-heavy than code-heavy can lead to better indexing of your site’s content.
  • Using CSS to hide links and content can get your website manually penalized and removed from Google’s index.

JavaScript: How a website behaves

In the earlier days of the Internet, web pages were built with HTML. When CSS came along, webpage content had the ability to take on some style. When the programming language JavaScript entered the scene, websites could now not only have structure and style, but they could be dynamic.

JavaScript has opened up a lot of opportunities for non-static web page creation. When someone attempts to access a page that is enhanced with this programming language, that user’s browser will execute the JavaScript against the static HTML that the server returned, resulting in a web page that comes to life with some sort of interactivity.

You’ve definitely seen JavaScript in action — you just may not have known it! That’s because JavaScript can do almost anything to a page. It could create a pop up, for example, or it could request third-party resources like ads to display on your page.

JavaScript can pose some problems for SEO, though, since search engines don’t view JavaScript the same way human visitors do. That’s because of client-side versus server-side rendering. Most JavaScript is executed in a client’s browser. With server-side rendering, on the other hand, the files are executed at the server and the server sends them to the browser in their fully rendered state.

SEO-critical page elements such as text, links, and tags that are loaded on the client’s side with JavaScript, rather than represented in your HTML, are invisible in your page’s source code until they are rendered. This means that search engine crawlers won’t see what’s in your JavaScript — at least not initially.
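 
Here’s a simplified, hypothetical sketch of the problem: the HTML the server sends is nearly empty, and the content only exists once the browser runs the script:

<body>
  <div id="app"></div>
  <script>
    // This heading and link exist only after the script executes, so they're
    // missing from the raw HTML a crawler sees before rendering.
    document.getElementById('app').innerHTML =
      '<h1>Welcome to Our Store</h1><a href="/products">Shop all products</a>';
  </script>
</body>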

Google says that, as long as you’re not blocking Googlebot from crawling your JavaScript files, they’re generally able to render and understand your web pages just like a browser can, which means that Googlebot should see the same things as a user viewing a site in their browser. However, due to this “second wave of indexing” for client-side JavaScript, Google can miss certain elements that are only available once JavaScript is executed.

There are also some other things that could go wrong during Googlebot’s process of rendering your web pages, which can prevent Google from understanding what’s contained in your JavaScript:

  • You’ve blocked Googlebot from JavaScript resources (ex: with robots.txt, like we learned about in Chapter 2)
  • Your server can’t handle all the requests to crawl your content
  • The JavaScript is too complex or outdated for Googlebot to understand
  • JavaScript “lazy loads” content into the page only after the crawler has finished with the page and moved on.

Needless to say, while JavaScript does open a lot of possibilities for web page creation, it can also have some serious ramifications for your SEO if you’re not careful. Thankfully, there is a way to check whether Google sees the same thing as your visitors. To see a page the way Googlebot views it, use Google Search Console’s “Fetch and Render” tool. From your site’s Google Search Console dashboard, select “Crawl” from the left navigation, then “Fetch as Google.”

From this page, enter the URL you want to check (or leave blank if you want to check your homepage) and click the “Fetch and Render” button. You also have the option to test either the desktop or mobile version.

In return, you’ll get a side-by-side view of how Googlebot saw your page versus how a visitor to your website would have seen the page. Below, Google will also show you a list of any resources they may not have been able to get for the URL you entered.

Understanding the way websites work lays a great foundation for what we’ll talk about next, which is technical optimizations to help Google understand the pages on your website better.

2. How search engines understand websites

Search engines have gotten incredibly sophisticated, but they can’t (yet) find and interpret web pages quite like a human can. The following sections outline ways you can better deliver content to search engines.

Help search engines understand your content by structuring it with Schema

Imagine being a search engine crawler scanning down a 10,000-word article about how to bake a cake. How do you identify the author, recipe, ingredients, or steps required to bake a cake? This is where schema (Schema.org) markup comes in. It allows you to spoon-feed search engines more specific classifications for what type of information is on your page.

Schema is a way to label or organize your content so that search engines have a better understanding of what certain elements on your web pages are. This code provides structure to your data, which is why schema is often referred to as “structured data.” The process of structuring your data is often referred to as “markup” because you are marking up your content with organizational code.

JSON-LD is Google’s preferred schema markup (announced in May ‘16), which Bing also supports. To view a full list of the thousands of available schema markups, visit Schema.org or view the Google Developers Introduction to Structured Data for additional information on how to implement structured data. After you implement the structured data that best suits your web pages, you can test your markup with Google’s Structured Data Testing Tool.
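 
Tying this back to the cake example, here’s a rough sketch of what JSON-LD recipe markup might look like (the values are hypothetical; see Schema.org for the full Recipe property list). It lives in a script tag in your page’s HTML:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Easy Chocolate Cake",
  "author": { "@type": "Person", "name": "Jane Baker" },
  "recipeIngredient": ["2 cups flour", "3/4 cup cocoa powder"],
  "recipeInstructions": [
    { "@type": "HowToStep", "text": "Mix the dry ingredients." },
    { "@type": "HowToStep", "text": "Bake at 350°F for 30 minutes." }
  ]
}
</script>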

In addition to helping bots like Google understand what a particular piece of content is about, schema markup can also enable special features to accompany your pages in the SERPs. These special features are referred to as “rich snippets,” and you’ve probably seen them in action. They’re things like:

  • Top Stories carousel
  • Review stars
  • Sitelinks search boxes
  • Recipes

Remember, using structured data can help enable a rich snippet to be present, but does not guarantee it. Other types of rich snippets will likely be added in the future as the use of schema markup increases.

Some last words of advice for schema success:

  • You can use multiple types of schema markup on a page. However, if you mark up one element, like a product for example, and there are other products listed on the page, you must also mark up those products.
  • Don’t mark up content that is not visible to visitors and follow Google’s Quality Guidelines. For example, if you add review structured markup to a page, make sure those reviews are actually visible on that page.
  • If you have duplicate pages, Google asks that you mark up each duplicate page with your structured markup, not just the canonical version.
  • Provide original and updated (if applicable) content on your structured data pages.
  • Structured markup should be an accurate reflection of your page.
  • Try to use the most specific type of schema markup for your content.
  • Marked-up reviews should not be written by the business. They should be genuine unpaid business reviews from actual customers.

Tell search engines about your preferred pages with canonicalization

When Google crawls the same content on different web pages, it sometimes doesn’t know which page to index in search results. This is why the canonical tag was invented: to help search engines better index the preferred version of content and not all its duplicates.

The rel=”canonical” tag allows you to tell search engines where the original, master version of a piece of content is located. You’re essentially saying, “Hey search engine! Don’t index this; index this source page instead.” So, if you want to republish a piece of content, whether exactly or slightly modified, but don’t want to risk creating duplicate content, the canonical tag is here to save the day.

Proper canonicalization ensures that every unique piece of content on your website has only one URL. To prevent search engines from indexing multiple versions of a single page, Google recommends having a self-referencing canonical tag on every page on your site. Without a canonical tag telling Google which version of your web page is the preferred one, http://www.example.com could get indexed separately from http://example.com, creating duplicates.
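 
As a minimal sketch (URL hypothetical), a self-referencing canonical tag is a single line in the page’s head, and any duplicates point to the same preferred URL:

<!-- On https://www.example.com/mens-shirts, and on every duplicate of it: -->
<head>
  <link rel="canonical" href="https://www.example.com/mens-shirts" />
</head>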

“Avoid duplicate content” is an Internet truism, and for good reason! Google wants to reward sites with unique, valuable content — not content that’s taken from other sources and repeated across multiple pages. Because engines want to provide the best searcher experience, they will rarely show multiple versions of the same content, opting instead to show only the canonicalized version, or if a canonical tag does not exist, whichever version they deem most likely to be the original.

Pro tip: Distinguishing between content filtering & content penalties
There is no such thing as a duplicate content penalty. However, you should try to keep duplicate content from causing indexing issues by using the rel=”canonical” tag when possible. When duplicates of a page exist, Google will choose a canonical and filter the others out of search results. That doesn’t mean you’ve been penalized. It just means that Google only wants to show one version of your content.

It’s also very common for websites to have multiple duplicate pages due to sort and filter options. For example, on an e-commerce site, you might have what’s called a faceted navigation that allows visitors to narrow down products to find exactly what they’re looking for, such as a “sort by” feature that reorders results on the product category page from lowest to highest price. This could create a URL that looks something like this: example.com/mens-shirts?sort=price_ascending. Add in more sort/filter options like color, size, material, brand, etc. and just think about all the variations of your main product category page this would create!

To learn more about different types of duplicate content, this post by Dr. Pete helps distill the different nuances.

3. How users interact with websites

In Chapter 1, we said that despite SEO standing for search engine optimization, SEO is as much about people as it is about search engines themselves. That’s because search engines exist to serve searchers. This goal helps explain why Google’s algorithm rewards websites that provide the best possible experiences for searchers, and why some websites, despite having qualities like robust backlink profiles, might not perform well in search.

When we understand what makes searchers’ web browsing experiences optimal, we can create those experiences for maximum search performance.

Ensuring a positive experience for your mobile visitors

Given that well over half of all web traffic today comes from mobile, it’s safe to say that your website should be accessible and easy to navigate for mobile visitors. In April 2015, Google rolled out an update to its algorithm that would promote mobile-friendly pages over non-mobile-friendly pages. So how can you ensure that your website is mobile friendly? Although there are three main ways to configure your website for mobile, Google recommends responsive web design.

Responsive design

Responsive websites are designed to fit the screen of whatever type of device your visitors are using. You can use CSS to make the web page “respond” to the device size. This is ideal because it prevents visitors from having to double-tap or pinch-and-zoom in order to view the content on your pages. Not sure if your web pages are mobile friendly? You can use Google’s mobile-friendly test to check!

AMP

AMP stands for Accelerated Mobile Pages, and it is used to deliver content to mobile visitors at speeds much greater than with non-AMP delivery. AMP is able to deliver content so fast because it delivers content from its cache servers (not the original site) and uses a special AMP version of HTML and JavaScript. Learn more about AMP.

Mobile-first indexing

As of 2018, Google started switching websites over to mobile-first indexing. That change sparked some confusion between mobile-friendliness and mobile-first, so it’s helpful to disambiguate. With mobile-first indexing, Google crawls and indexes the mobile version of your web pages. Making your website compatible with mobile screens is good for users and your performance in search, but mobile-first indexing happens independently of mobile-friendliness.

This has raised some concerns for websites that lack parity between mobile and desktop versions, such as showing different content, navigation, links, etc. on their mobile view. A mobile site with different links, for example, will alter the way in which Googlebot (mobile) crawls your site and sends link equity to your other pages.

Breaking up long content for easier digestion

When sites have very long pages, they have the option of breaking them up into multiple parts of a whole. This is called pagination and it’s similar to pages in a book. In order to avoid giving the visitor too much all at once, you can break up your single page into multiple parts. This can be great for visitors, especially on e-commerce sites where there are a lot of product results in a category, but there are some steps you should take to help Google understand the relationship between your paginated pages: the rel=”next” and rel=”prev” attributes.

You can read more about pagination in Google’s official documentation, but the main takeaways are that:

  • The first page in a sequence should only have rel=”next” markup
  • The last page in a sequence should only have rel=”prev” markup
  • Pages that have both a preceding and following page should have both rel=”next” and rel=”prev”
  • Since each page in the sequence is unique, don’t canonicalize them to the first page in the sequence. Only use a canonical tag to point to a “view all” version of your content, if you have one.
  • When Google sees a paginated sequence, it will typically consolidate the pages’ linking properties and send searchers to the first page

Pro tip: rel=”next/prev” should still have anchor text and live within an <a> link
This helps Google ensure that they pick up the rel=”next/prev”.
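 
As a sketch (URLs hypothetical), the head of page 2 in a three-page sequence would declare both neighbors, and, per the pro tip above, the body would also link to them with real anchors:

<head>
  <link rel="prev" href="https://www.example.com/mens-shirts?page=1">
  <link rel="next" href="https://www.example.com/mens-shirts?page=3">
</head>

<!-- In the body, crawlable anchor links to the neighboring pages: -->
<a href="https://www.example.com/mens-shirts?page=1">Previous page</a>
<a href="https://www.example.com/mens-shirts?page=3">Next page</a>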

Improving page speed to mitigate visitor frustration

Google wants to serve content that loads lightning-fast for searchers. We’ve come to expect fast-loading results, and when we don’t get them, we’ll quickly bounce back to the SERP in search of a better, faster page. This is why page speed is a crucial aspect of on-site SEO. We can improve the speed of our web pages by taking advantage of tools like the ones we’ve mentioned below. Click on the links to learn more about each.

Images are one of the main culprits of slow pages!

As discussed in Chapter 4, images are one of the main reasons for slow-loading web pages! In addition to image compression, optimizing image alt text, choosing the right image format, and submitting image sitemaps, there are other technical ways to optimize the speed and way in which images are shown to your users. Some primary ways to improve image delivery are as follows:

SRCSET: How to deliver the best image size for each device

The SRCSET attribute allows you to have multiple versions of your image and then specify which version should be used in different situations. This piece of code is added to the <img> tag (where your image is located in the HTML) to provide unique images for specific-sized devices.

This is like the concept of responsive design that we discussed earlier, except for images!

This doesn’t just speed up your image load time; it’s also a unique way to enhance your on-page user experience by providing different, optimal images to different device types.
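 
Here’s a minimal sketch of the attribute in action (file names and breakpoints are hypothetical); the browser picks the candidate best suited to the device’s screen width and resolution:

<img src="cake-800.jpg"
     srcset="cake-400.jpg 400w, cake-800.jpg 800w, cake-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 50vw"
     alt="A chocolate layer cake on a white cake stand">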

Pro tip: There are more than just three image size versions!
It’s a common misconception that you just need a desktop, tablet, and mobile-sized version of your image. There are a huge variety of screen sizes and resolutions. Learn more about SRCSET.

Show visitors image loading is in progress with lazy loading

Lazy loading occurs when you go to a webpage and, instead of seeing a blank white space where an image will be, a blurry lightweight version of the image or a colored box appears in its place while the surrounding text loads. After a few seconds, the image loads in full resolution. The popular blogging platform Medium does this really well.

The low-resolution version is loaded first, and then the full high-resolution version replaces it. This also helps to optimize your critical rendering path! While all of your other page resources are being downloaded, you’re showing a low-resolution teaser image that tells users that things are happening/being loaded. For more information on how you should lazy load your images, check out Google’s Lazy Loading Guidance.
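 
As one sketch of how this can be implemented (file names hypothetical), a tiny blurry placeholder loads up front and the full image is swapped in only when it nears the viewport:

<img class="lazy" src="cake-tiny-blurry.jpg" data-src="cake-full.jpg"
     alt="A chocolate layer cake">
<script>
  // Swap in the full-resolution image once the element scrolls into view
  const observer = new IntersectionObserver((entries) => {
    entries.forEach((entry) => {
      if (entry.isIntersecting) {
        entry.target.src = entry.target.dataset.src;
        observer.unobserve(entry.target);
      }
    });
  });
  document.querySelectorAll('img.lazy').forEach((img) => observer.observe(img));
</script>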

Improve speed by condensing and bundling your files

Page speed audits will often make recommendations such as “minify resource,” but what does that actually mean? Minification condenses a code file by removing things like line breaks and spaces, as well as abbreviating code variable names wherever possible.

“Bundling” is another common term you’ll hear in reference to improving page speed. The process of bundling combines several files of the same coding language into one single file. For example, a number of JavaScript files could be combined into one larger file to reduce the number of JavaScript files a browser has to request.

By both minifying and bundling the files needed to construct your web page, you’ll speed up your website and reduce the number of your HTTP (file) requests.
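 
At the HTML level, the effect looks something like this (file names hypothetical):

<!-- Before: three separate requests, each with its own round-trip overhead -->
<script src="/js/menu.js"></script>
<script src="/js/carousel.js"></script>
<script src="/js/forms.js"></script>

<!-- After bundling and minifying: one smaller request -->
<script src="/js/bundle.min.js"></script>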

Improving the experience for international audiences

Websites that target audiences from multiple countries should familiarize themselves with international SEO best practices in order to serve up the most relevant experiences. Without these optimizations, international visitors might have difficulty finding the version of your site that caters to them.

There are two main ways a website can be internationalized:

  • Language
    Sites that target speakers of multiple languages are considered multilingual websites. These sites should add something called an hreflang tag to show Google that your page has copy for another language (see the sketch after this list). Learn more about hreflang.
  • Country
    Sites that target audiences in multiple countries are called multi-regional websites and they should choose a URL structure that makes it easy to target their domain or pages to specific countries. This can include the use of a country code top level domain (ccTLD) such as “.ca” for Canada, or a generic top-level domain (gTLD) with a country-specific subfolder such as “example.com/ca” for Canada. Learn more about locale-specific URLs.
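 
To make the hreflang tag concrete, here’s a minimal sketch (URLs hypothetical): every version of a page lists all of its language/country alternates in its head:

<head>
  <link rel="alternate" hreflang="en-us" href="https://example.com/" />
  <link rel="alternate" hreflang="en-ca" href="https://example.com/ca/" />
  <link rel="alternate" hreflang="fr-ca" href="https://example.com/fr-ca/" />
</head>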

You’ve researched, you’ve written, and you’ve optimized your website for search engines and user experience. The next piece of the SEO puzzle is a big one: establishing authority so that your pages will rank highly in search results.




Rewriting the Beginner’s Guide to SEO, Chapter 4: On-Page Optimization

Posted by BritneyMuller

Chapter Four of the Beginner’s Guide to SEO rewrite is chock full of on-page SEO learnings. After all the great feedback you’ve provided thus far on our outline, Chapter One, Chapter Two, and Chapter Three, we’re eager to hear how you feel about Chapter Four. What really works for you? What do you think is missing? Read on, and let us know your thoughts in the comments!


Chapter 4: On-Page Optimization

Use your research to craft your message.

Now that you know how your target market is searching, it’s time to dive into on-page optimization, the practice of crafting web pages that answer searchers’ questions. On-page SEO is multifaceted, and extends beyond content into other things like schema and meta tags, which we’ll discuss more at length in the next chapter on technical optimization. For now, put on your wordsmithing hats — it’s time to create your content!

Creating your content

Applying your keyword research

In the last chapter, we learned methods for discovering how your target audience is searching for your content. Now, it’s time to put that research into practice. Here is a simple outline to follow for applying your keyword research:

  1. Survey your keywords and group those with similar topics and intent. Those groups will be your pages, rather than creating individual pages for every keyword variation.
  2. If you haven’t done so already, evaluate the SERP for each keyword or group of keywords to determine what type and format your content should be. Some characteristics of ranking pages to take note of:
    1. Are they image or video heavy?
    2. Is the content long-form or short and concise?
    3. Is the content formatted in lists, bullets, or paragraphs?
  3. Ask yourself, “What unique value could I offer to make my page better than the pages that are currently ranking for my keyword?”

On-page optimization allows you to turn your research into content your audience will love. Just make sure to avoid falling into the trap of low-value tactics that could hurt more than help!

Low-value tactics to avoid

Your web content should exist to answer searchers’ questions, to guide them through your site, and to help them understand your site’s purpose. Content should not be created for the purpose of ranking highly in search alone. Ranking is a means to an end, the end being to help searchers. If we put the cart before the horse, we risk falling into the trap of low-value content tactics.

Some of these tactics were introduced in Chapter 2, but by way of review, let’s take a deeper dive into some low-value tactics you should avoid when crafting search engine optimized content.

Thin content

While it’s common for a website to have unique pages on different topics, an older content strategy was to create a page for every single iteration of your keywords in order to rank on page 1 for those highly specific queries.

For example, if you were selling bridal dresses, you might have created individual pages for bridal gowns, bridal dresses, wedding gowns, and wedding dresses, even if each page was essentially saying the same thing. A similar tactic for local businesses was to create multiple pages of content for each city or region from which they wanted clients. These “geo pages” often had the same or very similar content, with the location name being the only unique factor.

Tactics like these clearly weren’t helpful for users, so why did publishers do it? Google wasn’t always as good as it is today at understanding the relationships between words and phrases (or semantics). So, if you wanted to rank on page 1 for “bridal gowns” but you only had a page on “wedding dresses,” that may not have cut it.

This practice created tons of thin, low-quality content across the web, which Google addressed specifically with its 2011 update known as Panda. This algorithm update penalized low-quality pages, which resulted in more quality pages taking the top spots of the SERPs. Google continues to iterate on this process of demoting low-quality content and promoting high-quality content today.

Google is clear that you should have a comprehensive page on a topic instead of multiple, weaker pages for each variation of a keyword.

Depiction of distinct pages for each keyword variation versus one page covering multiple variations

Duplicate content

Like it sounds, “duplicate content” refers to content that is shared between domains or between multiple pages of a single domain. “Scraped” content goes a step further, and entails the blatant and unauthorized use of content from other sites. This can include taking content and republishing as-is, or modifying it slightly before republishing, without adding any original content or value.

There are plenty of legitimate reasons for internal or cross-domain duplicate content, so Google encourages the use of a rel=canonical tag to point to the original version of the web content. While you don’t need to know about this tag just yet, the main thing to note for now is that your content should be unique in word and in value.

Depiction of how duplicate content looks between pages.

Cloaking

A basic tenet of search engine guidelines is to show the same content to the engine’s crawlers that you’d show to a human visitor. This means that you should never hide text in the HTML code of your website that a normal visitor can’t see.

When this guideline is broken, search engines call it “cloaking” and take action to prevent these pages from ranking in search results. Cloaking can be accomplished in any number of ways and for a variety of reasons, both positive and negative. Below is an example of an instance where Spotify showed different content to users than to Google.

Spotify shows a login page to Google.
Spotify shows a National Philharmonic Orchestra landing page to logged-in visitors.

In some cases, Google may let practices that are technically cloaking pass because they contribute to a positive user experience. For more on the subject of cloaking and the levels of risk associated with various tactics, see our article on White Hat Cloaking.

Keyword stuffing

If you’ve ever been told, “You need to include {critical keyword} on this page X times,” you’ve seen the confusion over keyword usage in action. Many people mistakenly think that if you just include a keyword within your page’s content X times, you will automatically rank for it. The truth is, although Google looks for mentions of keywords and related concepts on your site’s pages, the page itself has to add value outside of pure keyword usage. If a page is going to be valuable to users, it won’t sound like it was written by a robot, so incorporate your keywords and phrases naturally in a way that is understandable to your readers.

Below is an example of a keyword-stuffed page of content that also uses another old method: bolding all your targeted keywords. Oy.

Screenshot of a site that bolds keywords in a paragraph.

Auto-generated content

Arguably one of the most offensive forms of low-quality content is the kind that is auto-generated, or created programmatically with the intent of manipulating search rankings and not helping users. You may recognize some auto-generated content by how little sense it makes when read — the words are technically words, but they’re strung together by a program rather than a human being.

Gibberish text on a webpage

It is worth noting that advancements in machine learning have contributed to more sophisticated auto-generated content that will only get better over time. This is likely why in Google’s quality guidelines on automatically generated content, Google specifically calls out the brand of auto-generated content that attempts to manipulate search rankings, rather than any-and-all auto-generated content.

What to do instead: 10x it!

There is no “secret sauce” to ranking in search results. Google ranks pages highly because it has determined they are the best answers to the searcher’s questions. In today’s search engine, it’s not enough that your page isn’t duplicated, spammy, or broken. Your page has to provide value to searchers and be better than any other page Google is currently serving as the answer to a particular query. Here’s a simple formula for content creation:

  • Search the keyword(s) you want your page to rank for
  • Identify which pages are ranking highly for those keywords
  • Determine what qualities those pages possess
  • Create content that’s better than that

We like to call this 10x content. If you create a page on a keyword that is 10x better than the pages being shown in search results (for that keyword), Google will reward you for it, and better yet, you’ll naturally get people linking to it! Creating 10x content is hard work, but will pay dividends in organic traffic.

Just remember, there’s no magic number when it comes to words on a page. What we should be aiming for is whatever sufficiently satisfies user intent. Some queries can be answered thoroughly and accurately in 300 words while others might require 1,000 words!

Pro tip: Don’t reinvent the wheel!
If you already have content on your website, save yourself time by evaluating which of those pages are already bringing in good amounts of organic traffic and converting well. Refurbish that content on different platforms to help get more visibility to your site. On the other side of the coin, evaluate what existing content isn’t performing as well and adjust it, rather than starting from square one with all new content.

NAP: A note for local businesses

If you’re a business that makes in-person contact with your customers, be sure to include your business name, address, and phone number (NAP) prominently, accurately, and consistently throughout your site’s content. This information is often displayed in the footer or header of a local business website, as well as on any “contact us” pages. You’ll also want to mark up this information using local business schema. Schema and structured data are discussed more at length in the “Code” section of this chapter.

If you are a multi-location business, it’s best to build unique, optimized pages for each location. For example, a business that has locations in Seattle, Tacoma, and Bellevue should consider having a page for each:

example.com/seattle
example.com/tacoma
example.com/bellevue

Each page should be uniquely optimized for that location, so the Seattle page would have unique content discussing the Seattle location, list the Seattle NAP, and even testimonials specifically from Seattle customers. If there are dozens, hundreds, or even thousands of locations, a store locator widget could be employed to help you scale.

Hope you still have some energy left after handling the difficult-yet-rewarding task of putting together a page that is 10x better than your competitors’ pages, because there are just a few more things needed before your page is complete! In the next sections, we’ll talk about the other on-page optimizations your pages need, as well as naming and organizing your content.

Beyond content: Other optimizations your pages need

Can I just bump up the font size to create paragraph headings?

How can I control what title and description show up for my page in search results?

After reading this section, you’ll understand other important on-page elements that help search engines understand the 10x content you just created, so let’s dive in!

Header tags

Header tags are an HTML element used to designate headings on your page. The main header tag, called an H1, is typically reserved for the title of the page. It looks like this:

 <h1>Page Title</h1>

There are also sub-headings that go from H2 (<h2>) to H6 (<h6>) tags, although using all of these on a page is not required. The hierarchy of header tags goes from H1 to H6 in descending order of importance.

Each page should have a unique H1 that describes the main topic of the page; this is often automatically generated from the title of the page. As the main descriptive title of the page, the H1 should contain that page’s primary keyword or phrase. You should avoid using header tags to mark up non-heading elements, such as navigational buttons and phone numbers. Use header tags to introduce what the following content will discuss.

Take this page about touring Copenhagen, for example:

<h1>Copenhagen Travel Guide</h1>
<h2>Copenhagen by the Seasons</h2>
<h3>Visiting in Winter</h3>
<h3>Visiting in Spring</h3>

The main topic of the page is introduced in the main <h1> heading, and each additional heading is used to introduce a new sub-topic. In this example, the <h2> is more specific than the <h1>, and the <h3> tags are more specific than the <h2>. This is just an example of a structure you could use.

Although what you choose to put in your header tags can be used by search engines to evaluate and rank your page, it’s important to avoid inflating their importance. Header tags are one among many on-page SEO factors, and typically would not move the needle like quality backlinks and content would, so focus on your site visitors when crafting your headings.

Internal links

In Chapter 2, we discussed the importance of having a crawlable website. Part of a website’s crawlability lies in its internal linking structure. When you link to other pages on your website, you ensure that search engine crawlers can find all your site’s pages, you pass link equity (ranking power) to other pages on your site, and you help visitors navigate your site.

The importance of internal linking is well established, but there can be confusion over how this looks in practice.

Link accessibility

Links that require a click to view (like those tucked inside a navigation drop-down) are often hidden from search engine crawlers, so if the only links to internal pages on your website are through these types of links, you may have trouble getting those pages indexed. Opt instead for links that are directly accessible on the page.

Anchor text

Anchor text is the text with which you link to pages. Below, you can see an example of what a hyperlink without anchor text and a hyperlink with anchor text would look like in the HTML.

<a href="http://www.domain.com/"></a>
<a href="http://www.domain.com/" title="Keyword Text">Keyword Text</a>

On live view, that would look like this:

http://www.example.com/

Keyword Text

The anchor text sends signals to search engines regarding the content of the destination page. For example, if I link to a page on my site using the anchor text “learn SEO,” that’s a good indicator to search engines that the targeted page is one at which people can learn about SEO. Be careful not to overdo it, though. Too many internal links using the same, keyword-stuffed anchor text can appear to search engines that you’re trying to manipulate a page’s ranking. It’s best to make anchor text natural rather than formulaic.

Link volume

In Google’s General Webmaster Guidelines, they say to “limit the number of links on a page to a reasonable number (a few thousand at most).” This is part of Google’s technical guidelines, rather than the quality guideline section, so having too many internal links isn’t something that on its own is going to get you penalized, but it does affect how Google finds and evaluates your pages.

The more links on a page, the less equity each link can pass to its destination page. A page only has so much equity to go around.

Depiction of how link equity works between pages

So it’s safe to say that you should only link when you mean it! You can learn more about link equity from our SEO Learning Center.

Aside from passing authority between pages, a link is also a way to help users navigate to other pages on your site. This is a case where doing what’s best for search engines is also doing what’s best for searchers. Too many links not only dilute the authority of each link, but they can also be unhelpful and overwhelming. Consider how a searcher might feel landing on a page that looks like this:

Welcome to our gardening website! We have many articles on gardening, how to garden, and helpful tips on herbs, fruits, vegetables, perennials, and annuals. Learn more about gardening from our gardening blog.

Whew! Not only is that a lot of links to process, but it also reads pretty unnaturally and doesn’t contain much substance (which could be considered “thin content” by Google). Focus on quality and helping your users navigate your site, and you likely won’t have to worry about too many links.

Redirection

Removing and renaming pages is a common practice, but in the event that you do move a page, make sure to update the links to that old URL! At the very least, you should make sure to redirect the URL to its new location, but if possible, update all internal links to that URL at the source so that users and crawlers don’t have to pass through redirects to arrive at the destination page. If you choose to redirect only, be careful to avoid redirect chains that are too long (Google says, “Avoid chaining redirects… keep the number of redirects in the chain low, ideally no more than 3 and fewer than 5.”)

Example of a redirect chain:

(original location of content) example.com/location1 >> example.com/location2 >> (current location of content) example.com/location3

Better:

example.com/location1 >> example.com/location3

Image optimization

Images are the biggest culprits of slow web pages! The best way to solve for this is to compress your images. While there is no one-size-fits-all approach to image compression, the way to go is to test various options, like “save for web,” image sizing, and compression tools such as Optimizilla or ImageOptim for Mac (or Windows alternatives), and evaluate what works best.

Another way to help optimize your images (and improve your page speed) is by choosing the right image format.

How to choose which image format to use:

Flowchart for how to choose image formats. Source: Google’s image optimization guide.

Choosing image formats:

  • If your image requires animation, use a GIF.
  • If you don’t need to preserve high image resolution, use JPEG (and test out different compression settings).
  • If you do need to preserve high image resolution, use PNG.
    • If your image has a lot of colors, use PNG-24.
    • If your image doesn’t have a lot of colors, use PNG-8.

There are different ways to keep visitors on a semi-slow loading page by using images that produce a colored box or a very blurry/low resolution version while rendering to help visitors feel as if things are loading faster. We will discuss these options in more detail in Chapter 5.

Pro tip: Don’t forget about thumbnails!
Thumbnails (especially for E-Commerce sites) can be a huge page speed slow down. Optimize thumbnails properly to avoid slow pages and to help retain more qualified visitors.

Alt text

Alt text (alternative text) within images is a principle of web accessibility, and is used to describe images to the visually impaired via screen readers. It’s important to have alt text descriptions so that any visually impaired person can understand what the pictures on your website depict.

Search engine bots also crawl alt text to better understand your images, which gives you the added benefit of providing better image context to search engines. Just ensure that your alt descriptions read naturally for people, and avoid stuffing them with keywords for search engines.

Bad:

<img src="grumpycat.gif" alt="grumpy cat, cat is grumpy, grumpy cat gif">

Good:

<img src="grumpycat.gif" alt="A black cat looking very grumpy at a big spotted dog">

Submit an image sitemap

To ensure that Google can crawl and index your images, submit an image sitemap in your Google Search Console account. This helps Google discover images they may have otherwise missed.
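 
An image sitemap is a standard XML sitemap with image entries added to each URL. A minimal sketch (URLs hypothetical) looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/desserts/chocolate-pie</loc>
    <image:image>
      <image:loc>https://example.com/images/chocolate-pie.jpg</image:loc>
    </image:image>
  </url>
</urlset>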

Formatting for readability & featured snippets

Your page could contain the best content ever written on a subject, but if it’s formatted improperly, your audience might never read it! While we can never guarantee that visitors will read our content, there are some principles that can promote readability, including:

  • Text size and color – Avoid fonts that are too tiny. Google recommends 16+px font to minimize the need for “pinching and zooming” on mobile. The text color in relation to the page’s background color should also promote readability. Additional information on text can be found in Google’s web accessibility fundamentals.
  • Headings – Breaking up your content with helpful headings can help readers navigate the page. This is especially useful on long pages where a reader might be looking only for information from a particular section.
  • Bullet points – Great for lists, bullet points can help readers skim and more quickly find the information they need.
  • Paragraph breaks – Avoiding walls of text can help prevent page abandonment and encourage site visitors to read more of your page.
  • Supporting media – When appropriate, include images, videos, and widgets that would complement your content.
  • Bold and italics for emphasis – Putting words in bold or italics can add emphasis, so they should be the exception, not the rule. Appropriate use of these formatting options can call out important points you want to communicate.

Formatting can also affect your page’s ability to show up in featured snippets, those “position 0” results that appear above the rest of organic results.

Screenshot of a featured snippet

There is no special code that you can add to your page to show up here, nor can you pay for this placement, but taking note of the query intent can help you better structure your content for featured snippets. For example, if you’re trying to rank for “cake vs. pie,” it might make sense to include a table in your content, with the benefits of cake in one column and the benefits of pie in the other. Or if you’re trying to rank for “best restaurants to try in Portland,” that could indicate Google wants a list, so formatting your content in bullets could help.

Title tags

A page’s title tag is a descriptive HTML element that specifies the title of a particular web page. Title tags are nested within the head tag of each page and look like this:

<head>
  <title>Example Title</title>
</head>

Each page on your website should have a unique, descriptive title tag. What you input into your title tag field will show up here in search results, although in some cases Google may adjust how your title tag appears in search results.

Screenshot with the page title highlighted in the SERPs

It can also show up in web browsers…

Screenshot of a page title in a browser window

Or when you share the link to your page on certain external websites…

Screenshot of a page title shared on an external website

Your title tag has a big role to play in people’s first impression of your website, and it’s an incredibly effective tool for drawing searchers to your page over any other result on the SERP. The more compelling your title tag, combined with high rankings in search results, the more visitors you’ll attract to your website. This underscores that SEO is not only about search engines, but rather the entire user experience.

What makes an effective title tag?

  • Keyword usage: Having your target keyword in the title can help both users and search engines understand what your page is about. Also, the closer to the front of the title tag your keywords are, the more likely a user will be to read them (and hopefully click) and the more helpful they can be for ranking.
  • Length: On average, search engines display the first 50–60 characters (~512 pixels) of a title tag in search results. If your title tag exceeds the characters allowed on that SERP, an ellipsis “…” will appear where the title was cut off. While sticking to 50–60 characters is safe, never sacrifice quality for strict character counts. If you can’t get your title tag down to 60 characters without harming its readability, go longer (within reason).
  • Branding: At Moz, we love to end our title tags with a brand name mention because it promotes brand awareness and creates a higher click-through rate among people who are familiar with Moz. Sometimes it makes sense to place your brand at the beginning of the title tag, such as on your homepage, but be mindful of what you are trying to rank for and place those words closer toward the beginning of your title tag.

Meta descriptions

Like title tags, meta descriptions are HTML elements that describe the contents of the page that they’re on. They are also nested in the head tag, and look like this:

<head>
  <meta name="description" content="Description of page here."/>
</head>

What you input into the description field will show up here in search results:

In many cases though, Google will choose different snippets of text to display in search results, dependent upon the searcher’s query.

For example, if you search “find backlinks,” Google will provide this meta description, as it deems it more relevant to the specific search:

The meta description pulls each step from the page content and lists it out.

While the actual meta description is:

How to find backlinks? Step 1: Navigate to Link Explorer, a tool used to research the backlink profile of a website. It will show you the quality of backlinks using metrics like Domain Authority, Page Authority, and Spam Score. You can do a good amount of backlink research with the free version or pay to receive unlimited backlink data.

This dynamic snippet selection often improves how your page is presented for unique searches. However, don’t let this deter you from writing a default page meta description — they’re still extremely valuable.

What makes an effective meta description?

The qualities that make an effective title tag also apply to effective meta descriptions. Although Google says that meta descriptions are not a ranking factor, they are, like title tags, incredibly important for click-through rate.

  • Relevance: Meta descriptions should be highly relevant to the content of your page, so it should summarize your key concept in some form. You should give the searcher enough information to know they’ve found a page relevant enough to answer their question, without giving away so much information that it eliminates the need to click through to your web page.
  • Length: Search engines tend to truncate meta descriptions to around 300 characters. It’s best to write meta descriptions between 150–300 characters in length. On some SERPs, you’ll notice that Google gives much more real estate to the descriptions of some pages. This usually happens for web pages ranking right below a featured snippet.

URL structure: Naming and organizing your pages

URL stands for Uniform Resource Locator. URLs are the locations or addresses for individual pieces of content on the web. Like title tags and meta descriptions, search engines display URLs on the SERPs, so URL naming and format can impact click-through rates. Not only do searchers use them to make decisions about which web pages to click on, but URLs are also used by search engines in evaluating and ranking pages.

Clear page naming

Search engines require unique URLs for each page on your website so they can display your pages in search results, but clear URL structure and naming is also helpful for people who are trying to understand what a specific URL is about. For example, which URL is clearer?

example.com/desserts/chocolate-pie

OR

example.com/asdf/453?=recipe-23432-1123

Searchers are more likely to click on URLs that reinforce and clarify what information is contained on that page, and less likely to click on URLs that confuse them.

Page organization

If you discuss multiple topics on your website, you should also make sure to avoid nesting pages under irrelevant folders. For example:

example.com/commercial-litigation/alimony

It would have been better for this fictional multi-practice law firm website to nest alimony under “/family-law/” than to host it under the irrelevant “/commercial-litigation/” section of the website.

The folders in which you locate your content can also send signals about the type, not just the topic, of your content. For example, dated URLs can indicate time-sensitive content. While appropriate for news-based websites, dated URLs for evergreen content can actually turn searchers away because the information seems outdated. For example:

example.com/2015/april/what-is-seo/

vs.

example.com/what-is-seo/

Since the topic “What is SEO?” isn’t confined to a specific date, it’s best to host it on a non-dated URL structure, or else risk your information appearing stale.

As you can see, what you name your pages, and in what folders you choose to organize your pages, is an important way to clarify the topic of your page to users and search engines.

URL length

While it is not necessary to have a completely flat URL structure, many click-through rate studies indicate that, when given the choice between a longer URL and a shorter one, searchers often prefer shorter URLs. Like title tags and meta descriptions that are too long, overlong URLs will also be cut off with an ellipsis. Just remember that a descriptive URL is equally important, so don’t cut down on length if it means sacrificing the URL’s descriptiveness.

example.com/services/plumbing/plumbing-repair/toilets/leaks/

vs.

example.com/plumbing-repair/toilets/

Minimizing length, both by including fewer words in your page names and removing unnecessary subfolders, makes your URLs easier to copy and paste, as well as more clickable.

Keywords in URL

If your page is targeting a specific term or phrase, make sure to include it in the URL. However, don’t go overboard by trying to stuff in multiple keywords for purely SEO purposes. It’s also important to watch out for repeat keywords in different subfolders. For example, you may have naturally incorporated a keyword into a page name, but if located within other folders that are also optimized with that keyword, the URL could begin to appear keyword-stuffed.

Example:

example.com/seattle-dentist/dental-services/dental-crowns/

Keyword overuse in URLs can appear spammy and manipulative. If you aren’t sure whether your keyword usage is too aggressive, just read your URL through the eyes of a searcher and ask, “Does this look natural? Would I click on this?”

Static URLs

The best URLs are those that can easily be read by humans, so you should avoid the overuse of parameters, numbers, and symbols. Using technologies like mod_rewrite for Apache and ISAPI_rewrite for Microsoft, you can easily transform dynamic URLs like this:

http://moz.com/blog?id=123

into a more readable static version like this:

https://moz.com/google-algorithm-change
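
The rewrite itself happens at the web server level (a mod_rewrite rule, for instance), but conceptually it is just a mapping from a URL parameter to a readable slug. The Python sketch below is purely illustrative, with a hypothetical lookup table; it is not how Moz’s site actually works.

from urllib.parse import urlparse, parse_qs

# Hypothetical lookup table: post ID -> readable slug
REWRITE_MAP = {"123": "google-algorithm-change"}

def readable_url(dynamic_url):
    """Map a parameter-based URL to its static, human-readable version."""
    params = parse_qs(urlparse(dynamic_url).query)
    post_id = params.get("id", [None])[0]
    slug = REWRITE_MAP.get(post_id)
    return f"https://moz.com/{slug}" if slug else dynamic_url

print(readable_url("http://moz.com/blog?id=123"))
# https://moz.com/google-algorithm-change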

Hyphens for word separation

Not all web applications accurately interpret separators like underscores (_), plus signs (+), or spaces (%20). Search engines also do not understand how to separate words in URLs when they run together without a separator (example.com/optimizefeaturedsnippets/). Instead, use the hyphen character (-) to separate words in a URL.
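
If you generate URLs programmatically, you can enforce this rule in code. Here is a minimal slug helper in Python, offered as a sketch rather than a complete solution (real slug generators also handle accented characters, stop words, and length limits):

import re

def slugify(page_name):
    """Lowercase a page name and separate its words with hyphens."""
    slug = re.sub(r"[^a-z0-9]+", "-", page_name.lower())
    return slug.strip("-")

print(slugify("Optimize Featured Snippets"))  # optimize-featured-snippets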

Geographic Modifiers in URLs

Some local business owners omit geographic terms that describe their physical location or service area because they believe that search engines can figure this out on their own. On the contrary, it’s vital that local business websites’ content, URLs, and other on-page assets make specific mention of city names, neighborhood names, and other regional descriptors. Let both consumers and search engines know exactly where you are and where you serve, rather than relying on your physical location alone.

Protocols: HTTP vs. HTTPS

A protocol is that “http” or “https” preceding your domain name. Google recommends that all websites have a secure protocol (the “s” in “https” stands for “secure”). To ensure that your URLs are using the https:// protocol instead of http://, you must obtain an SSL (Secure Sockets Layer) certificate. SSL certificates are used to encrypt data. They ensure that any data passed between the web server and browser of the searcher remains private. As of July 2018, Google Chrome displays “not secure” for all HTTP sites, which could cause these sites to appear untrustworthy to visitors and result in them leaving the site.
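
If you are partway through a migration, a quick sweep can flag any URLs still on the insecure protocol. A minimal sketch:

from urllib.parse import urlparse

def insecure_urls(urls):
    """Return the URLs that are not served over https://."""
    return [url for url in urls if urlparse(url).scheme != "https"]

print(insecure_urls([
    "http://example.com/desserts/chocolate-pie",
    "https://example.com/what-is-seo/",
]))  # ['http://example.com/desserts/chocolate-pie']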


If you’ve made it this far, congratulations on surpassing the halfway point of the Beginner’s Guide to SEO! So far, we’ve learned how search engines crawl, index, and rank content, how to find keyword opportunities to target, and now, you know the on-page optimization strategies that can help your pages get found. Next, buckle up, because we’ll be diving into the exciting world of technical SEO!



Moz Blog


Email Testing: 7 tips from your peers for email conversion optimization

We recently asked the MarketingSherpa audience for tips on running effective email tests. Here are a few of the most helpful responses to consider as you start to develop an email testing program.
MarketingSherpa Blog


Declining Organic Traffic? How to Tell if it’s a Tracking or Optimization Issue

Posted by andrewchoco

Picture this scenario. You’re a new employee who has just been brought into a struggling marketing department (or an agency brought on to help recover lost numbers). You get access to Google Analytics, and see something like this:

(Actual screenshot of the client I audited)

This can generate two types of emotional response: excitement or fear (or both). The steady decline in organic traffic excites you because you have so many tactics and ideas that you think can save this company from spiraling downward out of control. But there’s also the fear that these tactics won’t be enough to correct the course.

Regardless of whether these new tactics would work or not, it’s important to understand the history of the account and determine not only what is happening, but why.

The company may have an idea of why the traffic is declining (e.g., competitors have come in and made ranking for keywords much harder, or they did a website redesign and have never recovered).

Essentially, this boils down to two things: 1) either you’re struggling with organic optimization, or 2) something was off with your tracking in Google Analytics, was later corrected, and the correction was never caught, meaning your historical numbers were inflated all along.

In this article, I’ll go over an audit I did for one of my clients to help determine if the decline we saw in organic traffic was due to actual poor SEO performance, an influx in competitors, tracking issues, or a combination of these things.

I’ll be breaking it down into five different areas of investigation:

  1. Keyword ranking differences from 2015–2017
    1. Did the keywords we were ranking for in 2015 change drastically in 2017? Did we lose rankings and therefore lose organic traffic?
  2. Top organic landing pages from 2015–2017
    1. Are the top ranking organic landing pages the same currently as they were in 2015? Are we missing any pages due to a website redesign?
  3. On-page metrics
    1. Did something happen to site speed, bounce rate, page views, etc.?
  4. SEMrush/Moz keyword, traffic, and domain authority data
    1. Looking at the SEMrush organic traffic cost metric, as well as Moz metrics like Domain Authority and competitor data.
  5. Goal completions
    1. Did our conversion numbers stay consistent throughout the traffic drop? Or did the conversions drop in correlation with the traffic drop?

By the end of this post, my goal is that you’ll be able to replicate this audit to determine exactly what’s causing your organic traffic decline and how to get back on the right track.

Let’s dive in!

Keyword ranking differences from 2015–2017

This was my initial starting point for the audit. I started here because the most obvious explanation for a decline in traffic is a decline in keyword rankings.

I wanted to look at what keywords we were ranking for in 2015 to see if we significantly dropped in the rankings or if the search volume had dropped. If the company you’re auditing has had a long-running Moz account, start by looking at the keyword rankings from the initial start of the decline, compared to current keyword rankings.

I exported keyword data from both SEMrush and Moz, and looked specifically at the ranking changes of core keywords.

March was a particularly strong month across the board, so I narrowed it down and exported the keyword rankings in:

  • March 2015
  • March 2016
  • March 2017
  • December 2017 (so I could get the most current rankings)

Once the keywords were exported, I went in and highlighted in red the keywords that we were ranking for in 2015 (and driving traffic from) that we were no longer ranking for in 2017. I also highlighted in yellow the keywords we were ranking for in 2015 that were still ranking in 2017.
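
If your exports run to thousands of rows, the same red/yellow comparison reduces to a set operation. A minimal sketch in Python with hypothetical export data:

# Hypothetical exports: keyword -> ranking position
rankings_2015 = {"online scheduling": 13, "online scheduler": 9}
rankings_2017 = {"online scheduling": 15, "appointment scheduler": 11}

lost = set(rankings_2015) - set(rankings_2017)  # would be highlighted red
kept = set(rankings_2015) & set(rankings_2017)  # would be highlighted yellow

print(sorted(lost))  # ['online scheduler']
print(sorted(kept))  # ['online scheduling']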

2015 keywords:

2017 keywords:

(Brand-related queries and URLs are blurred out for anonymity)

One thing immediately stood out: in 2015, this company was ranking for five keywords that included the word “free.” They have since changed their offering, so it made sense that we weren’t ranking for those keywords in 2017.

After removing the free queries, we pulled the “core” keywords to look at their differences.

March 2015 core keywords:

  • Appointment scheduling software: position 9
  • Online appointment scheduling: position 11
  • Online appointment scheduling: position 9
  • Online scheduling software: position 9
  • Online scheduler: position 9
  • Online scheduling: position 13

December 2017 core keywords:

  • Appointment scheduler: position 11
  • Appointment scheduling software: position 10
  • Online schedule: position 6
  • Online appointment scheduler: position 11
  • Online appointment scheduling: position 12
  • Online scheduling software: position 12
  • Online scheduling tool: position 10
  • Online scheduling: position 15
  • SaaS appointment scheduling: position 2

There were no particular red flags here. While some of the keywords have moved down 1–2 spots, we had new ones jump up. These small changes in movement didn’t explain the nearly 30–40% drop in organic traffic. I checked this off my list and moved on to organic landing pages.

Top organic landing page changes

Since the dive into keyword rankings didn’t explain the decline in traffic, the next thing I looked at was the organic landing pages. I knew this client had switched CMS platforms in early 2017 and had done a few small redesign projects over the past three years.

After exporting our organic landing pages for 2015, 2016, and 2017, we compared the top ten (by organic sessions) and got the following results.

2015 top organic landing pages:

2016 top organic landing pages:

2017 top organic landing pages:

Because of the redesign, you can see that the subfolders changed from 2015/2016 to 2017. What really got my attention, however, is the /get-started page. In 2015/2016, the Get Started page accounted for nearly 16% of all organic traffic. In 2017, the Get Started page was nowhere to be found.

If you run into this problem and notice there are pages missing from your current top organic pages, a great way to uncover why is to use the Wayback Machine. It’s a great tool that allows you to see what a web page looked like in the past.

When we looked at the /get-started URL in the Wayback Machine, we noticed something pretty interesting:

In 2015, their /get-started page also acted as their login page. When people were searching on Google for “[Company Name] login,” this page was ranking, bringing in a significant amount of organic traffic.

Their current setup sends logins to a subdomain that doesn’t have a GA code (as it’s strictly used as a portal to the actual application).

That helped explain some of the organic traffic loss, but knowing that this client had gone through a few website redesigns, I wanted to make sure that all redirects were done properly. Regardless of whether or not your traffic has changed, if you’ve recently done a website redesign where you’re changing URLs, it’s smart to look at your top organic landing pages from before the redesign and double check to make sure they’re redirecting to the correct pages.

While this helped explain some of the traffic loss, the next thing we looked at was the on-page metrics to see if we could spot any obvious tracking issues.

Comparing on-page engagement metrics

Looking at the keyword rankings and organic landing pages provided a little bit of insight into the organic traffic loss, but it was nothing definitive. Because of this, I moved to the on-page metrics for further clarity. As a disclaimer, when I talk about on-page metrics, I’m talking about bounce rate, page views, average page views per session, and time on site.

Looking at the same top organic pages, I compared the on-page engagement metrics.

2015 on-page metrics:

2016 on-page metrics:

2017 on-page metrics:

While the overall engagement metrics changed slightly, the biggest and most interesting discrepancy I saw was in the bounce rates for the home page and Get Started page.

According to a number of different studies (like this one, this one, or even this one), the average bounce rate for a B2B site is around 40–60%. Seeing the home page with a bounce rate under 20% was definitely a red flag.

This led me to look into some other metrics as well. I compared key metrics between 2015 and 2017, and was utterly confused by the findings:

Looking at the organic sessions (overall), we saw a decrease of around 80,000 sessions, or 27.93%.

Looking at the organic users (overall) we saw a similar number, with a decrease of around 38,000 users, or 25%.

When we looked at page views, however, we saw a much more drastic drop:

For the entire site, we saw a 50% decrease in pageviews, or a decrease of nearly 400,000 page views.

This didn’t make much sense, because even if we had those extra 38,000 users, and each user averaged roughly 2.49 pages per session (looking above), that would only account for, at most, 100,000 more page views. This left 300,000 page views unaccounted for.
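
The back-of-the-envelope math, treating each lost user as roughly one session the way the estimate above does:

lost_users = 38_000         # approximate drop in organic users
pages_per_session = 2.49    # average pages per session
pageview_drop = 400_000     # observed drop in pageviews

explained = lost_users * pages_per_session
print(round(explained))                  # 94620: at most ~100,000 pageviews
print(round(pageview_drop - explained))  # 305380: still unaccounted for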

This led me to believe that there was definitely some sort of tracking issue. The high number of page views and low bounce rate made me suspect that some users were being double counted.

However, to confirm these assumptions, I took a look at some external data sources.

Using SEMrush and Moz data to exclude user error

If you have a feeling that your tracking was messed up in previous years, a good way to confirm or deny this hypothesis is to check external sources like Moz and SEMrush.

Unfortunately, this particular client was fairly new, so their Moz campaign data didn’t reach back to the high-traffic period in 2015. If yours does, a good place to start is the Search Visibility metric (as long as the primary keywords have stayed the same). If this metric has changed drastically over the years, it’s a good indicator that your organic rankings have slipped quite a bit.

Another good thing to look at is Domain Authority and the Page Authority of your core pages. If your site has had a few redesigns or moved URLs, it’s important to make sure that Domain Authority has carried over. It’s also important to check the Page Authority of your core pages: if these are much lower than they were before the organic traffic slide, there’s a good chance your redirects weren’t done properly and authority isn’t being passed to the new URLs.

If, like me, you don’t have Moz data that dates back far enough, a good thing to check is the organic traffic cost in SEMrush.

Organic traffic cost can change because of a few reasons:

  1. Your site is ranking for more valuable keywords, making the organic traffic cost rise.
  2. More competitors have entered the space, making the keywords you were ranking for more expensive to bid on.

Usually it’s a combination of both of these.

If our organic traffic really was steadily decreasing for the past 2 years, we’d likely see a similar trendline looking at our organic traffic cost. However, that’s not what we saw.

In March of 2015, the organic traffic cost of my client’s site was $14,300.

In March of 2016, the organic traffic cost was $22,200.

In December of 2017, the organic traffic cost spiked all the way up to $69,200. According to SEMrush, we also saw increases in keywords and traffic.

Looking at all of this external data reaffirmed the assumption that something must have been off with our tracking.

However, as a final check, I went back to internal metrics to see if the conversion data had decreased at a similar rate as the organic traffic.

Analyzing and comparing conversion metrics

This seemed like a natural final step in unraveling the mystery of this traffic drop. After all, it’s not organic traffic by itself that profits your business (although it’s a key component). The big revenue drivers are goal completions and form fills.

This was a fairly simple procedure. I went into Google Analytics to compare goal completion numbers and goal completion conversion rates over the past three years.

If your company is like my client’s, there’s a good chance you’re taking advantage of the maximum 20 goal completions that can be simultaneously tracked in Analytics. However, to make things easier and more consistent (since goal completions can change), I looked at only buyer intent conversions. In this case it was Enterprise, Business, and Personal edition form fills, as well as Contact Us form fills.

If you’re doing this on your own site, I would recommend doing the same thing. Gated content goal completions usually have a natural shelf life, and this natural slowdown in goal completions can skew the data. I’d look at the most important conversion on your site (usually a contact us or a demo form) and go strictly off those numbers.

For my client, you can see those goal completion numbers below:

Goal completion name      2015     2016     2017
Contact Us                 579      525      478
Individual Edition       3,372    2,621    3,420
Business Edition         1,147    1,437    1,473
Enterprise Edition       1,178    1,053      502
Total                    6,276    5,636    5,873

Conversion rates:

Goal completion name      2015     2016     2017
Contact Us               0.22%    0.22%    0.23%
Individual Edition       1.30%    1.09%    1.83%
Business Edition         0.46%    0.60%    0.76%
Enterprise Edition       0.46%    0.44%    0.29%
Average                  0.61%    0.58%    0.77%

This was pretty interesting. Although there was clearly some fluctuation in goal completions and conversion rates, nothing dropped in proportion to the nearly 40,000-user decline from 2015 through 2017.
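
That is exactly the signature you would expect if sessions, not conversions, were the corrupted number: dividing steady goal completions by an inflated session count depresses the measured conversion rate. A minimal sketch with hypothetical session counts:

def conversion_rate(goal_completions, sessions):
    """Goal conversion rate as a percentage of sessions."""
    return 100 * goal_completions / sessions

# Hypothetical: the same 5,873 completions against a double-counted
# session total versus a corrected one.
print(round(conversion_rate(5_873, 1_000_000), 2))  # 0.59 (inflated sessions)
print(round(conversion_rate(5_873, 760_000), 2))    # 0.77 (corrected sessions)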

All of these findings further confirmed that we had been chasing an inaccurate goal. In fact, we spent the first three months of the engagement trying to win back a 40% loss that, quite frankly, was never there in the first place.

Tying everything together and final thoughts

For this particular case, we had to go down all five of these roads in order to reach the conclusion that we did: Our tracking was off in the past.

However, this may not be the case for your company or your clients. You may start by looking at keyword rankings, and realize that you’re no longer ranking on the first page for ten of your core keywords. If that’s the case, you quickly discovered your issue, and your game plan should be investing in your core pages to help get them ranking again for these core keywords.

If your goal completions are way down (by a similar percentage as your traffic), that’s also a good clue that your declining traffic numbers are correct.

If you’ve looked at all of these metrics and still can’t figure out the reason for the decrease, and you’re blindly trying tactics while struggling to crawl your way back up, this is a great checklist to run through to answer the ominous question: tracking issue or optimization issue?

If you’re facing a similar issue, I hope this post helps you get to the root of the problem quickly and gets you one step closer to creating realistic organic traffic goals for the future!



Moz Blog


Call-to-Action Optimization: 132% increase in clickthrough from changing four simple words

Small changes to call-to-action wording can have a large impact on conversion.
MarketingSherpa Blog


Marketing 101: What is CRO (Conversion Rate Optimization)?

If you’re in advertising or marketing, it helps to have an understanding of what conversion rate optimization is. CRO can be a powerful tool to improve the success of every marketing campaign, initiative and website you work on.
MarketingSherpa Blog


Your Daily SEO Fix: Keywords, Concepts, Page Optimization, and Happy NAPs

Posted by FeliciaCrawford

Howdy, readers! We’re back with our last round of videos for this go of the Daily SEO Fix series. To recap, here are the other topics we’ve covered previously:

Today we’ll be delving into more keyword and concept research, quick wins for on-page optimization, and a neat way to stay abreast of duplicates and inaccuracies in your local listings. We use Moz Pro, the MozBar, and Moz Local in this week’s fixes.


Fix #1: Grouping and analyzing keywords by label to judge how well you’re targeting a concept


The idea of “concepts over keywords” has been around for a little while now, but tracking rankings for a concept isn’t quite as straightforward as it is for keywords. In this fix, Kristina shows you how to label groups of keywords to track and sort their rankings in Moz Pro so you can easily see how you’re ranking for grouped terms, chopping and analyzing the data as you see fit.


Fix #2: Adding alternate NAP details to uncover and clean up duplicate or inaccurate listings


If you work in local SEO, you know how important it is for listings to have an accurate NAP (name, address, phone number). When those details change for a business, it can wreak absolute havoc and confuse potential searchers. Jordan walks you through adding alternate NAP details in Moz Local to make sure you uncover and clean up old and/or duplicate listings, making closure requests a breeze. (This Whiteboard Friday is an excellent explanation of why that’s really important; I like it so much that I link to it in the resources below, too. ;)

Remember, you can always use the free Check Listing tool to see how your local listings and NAP are popping up on search engines:

Is my NAP accurate?


Fix #3: Research keywords and concepts to fuel content suggestions — on the fly

You’re already spying on your competitors’ sites; you might as well do some keyword research at the same time, right? Chiaryn walks you through how to use MozBar to get keyword and content suggestions and discover how highly ranking competitor sites are using those terms. (Plus a cameo from Lettie Pickles, star of our 2015 Happy Holidays post!)


Fix #4: Discover whether your pages are well-optimized as you browse — then fix them with these suggestions


A fine accompaniment to your on-the-go keyword research is on-the-go on-page optimization. (Try saying that five times fast.) Janisha gives you the low-down on how to check whether a page is well-optimized for a keyword and identify which fixes you should make (and how to prioritize them) using the SEO tool bar.


Further reading & fond farewells

I’ve got a whole passel of links if you’re interested in reading more educational content around these topics. And by “reading,” I mean “watching,” because I really stacked the deck with Whiteboard Fridays this time. Here you are:

And of course, if you need a better handle on all this SEO stuff and reading blog posts just doesn’t cut the mustard, we now offer classes that cover all the essentials.

My sincere thanks to all of you tuning in to check out our Daily SEO Fix video series over the past couple of weeks — it’s been fun writing to you and hearing from you in the comments! Be sure to keep those ideas and questions comin’ — we’re listening.



Moz Blog


5 Content Optimization Mistakes You’ll Wish You Fixed Sooner

"Show your site visitors that you’re a match for them — faster." – Stefanie Flaxman

By now you know that — technical details aside — SEO is not separate from content marketing; it’s an integrated aspect of content marketing.

Optimizing your content for search engines is part of your craft and a skill you can strengthen with practice.

But even when you rank well for search terms your audience uses, the real test is what happens when someone clicks through to your website. As Brian wrote on Monday:

“There’s nothing worse than a quick bounce.”

To avoid a quick bounce, you need to focus on content optimization. Since you don’t want to miss any opportunities to connect with your site visitors, study this list of five common mistakes — and how to fix them.

Mistake #1: Your visitors can’t tell if your content’s right for them

A row of four new houses that all look basically the same was just built on the street where I live. When a real estate agent starts taking potential buyers on tours of the houses, do you know what’s going to happen?

The potential buyers are going to examine the properties and make judgments about the differences they notice.

A woman is going to dislike the filigree on one of the front gates and select the house with the simple brown gate and extra large balconies. A man is going to love the house with the filigree on the front gate. Another woman is going to hate the house with the extra large balconies and prefer the house with additional living room space.

You get the point.

While these houses appear roughly similar from the outside, visitors quickly assess which property is right for them based on their personal preferences.

The same thing happens when people search for information about a topic. The websites that appear at the top of search results for a keyword phrase might all look the same at first, so visitors will quickly inspect your content to see if it contains the qualities that are right for them.

If your special qualities (your proverbial front gate with filigree, large balcony, or spacious living room) aren’t clear, you won’t convince the people you want to attract that you can satisfy their preferences.

How to fix it

Take 15 Minutes to Find Your Winning Difference

When you stop trying to attract everyone, it’s easier to attract those who recognize and appreciate your unique selling proposition (USP).

You’re right for some visitors and your competitors may be right for others. That’s okay.

Mistake #2: Your headlines aren’t specific

The quickest way to a quick bounce is a generic headline that could appear on any other website in your niche.

Typically, these weak headlines fail to offer a benefit, or offer one so vague that it fails to capture the attention of the very people you created the content for.

They could also be boring.

How to fix it

Ask Yourself These 3 Simple Questions to Craft Better Headlines

If you immediately communicate details about why your content is helpful, you’ll grab the attention of people who need that kind of help.

Aim to infuse your headlines with the essence of your USP and show your site visitors that you’re a match for them — faster.

Mistake #3: You don’t edit

Plenty of websites have success publishing first-draft content. If rough drafts form a bond with the people you aim to serve … cool.

But if your content isn’t striking a chord with the people you want to attract and develop relationships with, you may need to push yourself further.

How to fix it

Discover Why Content Marketers Need Editors

Rough drafts often fail to effectively convey your messages. They may contain too much information or tangents that distract busy readers and make your content less useful.

Editing is about creating a content experience. Rather than expressing raw thoughts, you craft a thoughtful presentation that helps solve a problem. When you click on the link above, you’ll learn how to think like an editor.

Mistake #4: You don’t give visitors more opportunities to learn

Websites with a lot of content may still look like “brochure” websites if they don’t present a different angle or perspective that makes visitors think, “I like this specific approach to this topic.”

When visitors feel you offer them something they can’t find on other websites, they want to hear more from you and stay connected.

If you don’t anticipate a reader’s desire to learn more, he might bounce to other sites to see if they offer more resources.

How to fix it

Add a Tantalizing Incentive that Will Build Your Email List

Ideally, you want to have so much great content that when visitors land on your site they’re frustrated that they don’t have enough time to consume it all in one sitting.

They’ll have to make a note to come back. Now the question is:

Do they sign up for your email list so they don’t miss any new content?

Make signing up for your email list a no-brainer by providing an incentive that is a perfect match for their needs. Your email list could also offer exclusive content the public doesn’t see.

Visitors will feel like they hit the jackpot that day on their journey.

Mistake #5: You don’t empower visitors to make a purchase

Information is … information. It doesn’t spark the buying process.

If you don’t give visitors a taste of what it’s like to do business with you, you won’t convert prospects to customers.

How to fix it

Educate to Convert Your Prospects

When you convince your website visitors to keep up with everything you publish, you’re able to build the relationships that will build your business. And the right balance of content and copy helps your prospects imagine what it’s like to buy from you.

Demonstrate why your product or service will give them the transformation they desire.

Optimize your content to grow your audience

Here’s a suggestion:

Assign each of the mistakes above to a day next week, Monday through Friday, and spend a couple hours each day identifying where you might make those errors and how you can fix them. By the end of the week, you’ll have a wealth of new ideas about how you can improve going forward.

What’s your process for producing exceptional content that impresses your website visitors? In the comments below, let us know how you stand out.



Copyblogger


The 4 Fundamental Steps of Conversion Optimization

Once upon a time, I was sitting in my office looking over data for one of our new clients and reviewing the conversion project roadmap. The phone rang, and on the other end was the VP of marketing for a multi-billion-dollar company. It is very unusual to get an unannounced call from someone at his level, but he had an urgent problem to solve: a good number of his website visitors were not converting.

His problem did not surprise me. We deal with conversion rates optimization every day.

He invited me to meet with his team to discuss the problem further. The account would be a huge win for Invesp, so we agreed on a time that worked for both of us. When the day came, our team went to the company’s location.

We started the discussion, and things did NOT go as I expected. The VP, who led the meeting, said, “we have a conversion problem.”

“First-time visitors to our website convert at a rate of 48%. Repeat visitors convert at 80%!”

I was puzzled.

I wasn’t sure what exactly puzzled me: the high conversion numbers, or the fact that the VP was not happy with them. He wanted more.

I thought he had his conversion numbers wrong. But nope. We looked at his analytics, and he was correct. The numbers were simply amazing by all standards. The VP, however, had a different mindset. The company runs thousands of stores around the US. When someone picks up the phone and calls them, they convert callers at a 90% rate. He was expecting the same conversion rate for his online store.

Let’s face it: a typical e-commerce store converts at an average of 3%. Few websites manage anywhere from 10 to 18%, and those are considered the stars of the conversion world.

The sad truth about a website with a 15% conversion rate is that 85% of its visitors simply leave without converting. That’s money left on the table, cash the store will never capture. Whichever way you think about it, we can agree that there is a huge opportunity here, but also a very difficult one to conquer.

The Problem with Conversion Optimization

Most companies jump into conversion optimization with a lot of excitement. As you talk to teams conducting conversion optimization, you notice a common thread. They take different pages of the website and run tests on them. Some tests produce results; others do not. After a while, the teams run out of ideas. The managers run out of excitement.

The approach of randomly running tests on different pages treats conversion rate optimization as linear. The real problem is that no one shops online in a linear fashion. We do not follow a linear path when we navigate from one area of a website to the next. Most of the time, humans are random, or at least they appear to be.

What does that mean?

The right approach to increasing conversion rates needs to be systematic, because it deals with irrational and random human behavior.

So, how do you do this?

The Four Steps to Breaking into Double-Digit Conversion Rates

After ten years of doing conversion optimization at Invesp, I can claim that we have a process that works for many online businesses. The truth is that it continues to be a work in progress.

These are the four steps you should follow to achieve your desired conversion rate:

Create Personas for Your Website

I could never stop talking about personas and the impact they have on your website. While most companies talk about their target market, personas help you translate your generalized and somewhat abstract target market data into a personalized experience that impacts your website design, copy and layout.

Let’s take the example of a consulting company that targets “e-commerce companies with a revenue of 10 million dollars or more.” There are two problems with this statement:

  • The statement is too general about the target market (no verticals and no geography, for example)
  • I am not sure how to translate this statement into actionable items on my website or marketing activity

You should first think about the actual person who would hire the services of this consulting company. Most likely, the sale is made to:

  • A business owner for a company with annual revenue from 10 to 20 million dollars.
  • A marketing director for a company with annual revenue from 20 to 50 million dollars.
  • A VP of marketing for a company with annual revenue over 50 million dollars.

Now, translate each of these three different cases into a persona.

So, instead of talking about a business owner for a company that is generating annual revenue from 10 to 20 million dollars, we will talk about:

John Riley, 43 years old, completed his B.A. in physics from the University of Michigan-Ann Arbor. He is a happy father of three. He started the company in 2007 and financed it from his own pocket. His company generated 13.5 million dollars of revenue in 2014 and expects to see a modest 7% increase in sales in 2015. John is highly competitive, but he also cares about his customers and thinks of them as an extended family. He would like to find a way to increase this year’s revenue by 18%, but he is not sure how to do so. He is conservative when it comes to using new marketing techniques. In general, John does not trust consultants and thinks of them as overpaid.

This is an oversimplification of the persona creation process and its final product. But you get the picture. If you are the consulting company that targets John, then what type of website design, copy and visitor flow would you use to persuade him to do business with you?

What data points do you use to create personas for your website? I would start with this:

  • Market research
  • Demographic studies
  • Usability studies
  • Zip code analysis
  • Existing customer surveys
  • Competitive landscape
  • AB and Multivariate testing data

A website or a business should typically target four to seven personas.

Add Traffic Sources

So, you have the personas. These personas should impact your design, copy and visitor flow.

But how?

Let’s start by looking at analytics data. Pick a period of six months to one year and identify the top traffic sources/mediums. If your website has been online for a while, you will probably have hundreds of different sources. Start with your top 10 sources/mediums and create a matrix of personas, traffic sources, and landing pages:

Now, your job is to evaluate each top landing page for each traffic source through the eyes of your website personas. For each page, you will answer eight questions.

The persona questions: Eight questions to ask

  • What type of information would persona “x” need to see to click on to the next page on the website?
  • What top concerns would persona “x” have when looking at the page?
  • What kind of copy does persona “x” need to see?
  • What type of trigger words are important to include on the page for persona “x”?
  • What words should I avoid for persona “x”?
  • What kind of headline should I use to persuade persona “x” to stay on my website?
  • What kind of images should I use to capture persona “x”’s attention?
  • What elements on the page could distract persona “x”?

As you answer these questions for each of the personas, you will end up with a large set of answers and actions. The challenge and the art will be to combine all these and make the same landing page work for all different personas. This is not a small task, but this is where the fun begins.
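
One way to keep that work manageable is to build the matrix explicitly and fill in cells as you answer the questions. A minimal sketch with hypothetical personas, sources, and pages:

from itertools import product

personas = ["business owner", "marketing director", "vp of marketing"]
sources = ["google / organic", "newsletter / email"]
landing_pages = ["/", "/pricing"]

# Each cell collects the answers to the eight persona questions above.
matrix = {cell: [] for cell in product(personas, sources, landing_pages)}
matrix[("business owner", "google / organic", "/pricing")].append(
    "Top concern: is this worth the cost?"
)
print(len(matrix))  # 12 combinations to evaluate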

Consider the Buying Stages 

You thought the previous work was complex? Well, you haven’t seen anything just yet!

Not every visitor who lands on your website is ready to buy. Visitors come to your website in different buying stages, and only 15-20% are in the action stage. The sequential buying stages of a visitor are:

  • Awareness stage (top of the sales funnel)
  • Research stage
  • Evaluating alternatives
  • Action stage
  • Post action

A typical buying funnel looks like this:

How does that translate into actionable items on your website?

In the previous exercise, we created a list of changes on different screens or sections of your website based on the different personas. Now, we are going to think about each persona landing on the website in one of the first four buying stages.

Instead of thinking of how to adjust a particular screen for John Riley, now you think of a new scenario:
Persona “x” is in the “evaluating alternatives” stage of the buying funnel. He lands on a particular landing page. What do I need to adjust in the website design and copy to persuade persona “x” to convert?

Our previous table looks like this now:

Next, answer all eight persona-questions again, based on the different buying stages.

Test your different scenarios

This goes without saying; you should NEVER introduce changes to your website without actually testing them. You can find plenty of blogs and books out there on how to conduct testing correctly if you are interested in learning more about AB testing and multivariate testing.

For a start, keep the five No’s of AB testing in mind:

1. No to “Large and complex tests”

Your goal is NOT to conduct large AB or multivariate tests. Your goal is to discover what elements on the page cause visitors to act a specific way. Break complex tests into smaller ones. The more you can isolate the changes to one or two elements, the easier it will be to understand the impact of different design and copy elements on visitors’ actions.

2. No to “Tests without a hypothesis”

I can never say it enough. A test without a good hypothesis is a gambling exercise. A hypothesis is a predictive statement about a problem or set of problems on your page and the impact of solving these problems on visitor behavior.

3. No to “Polluted data”

Do not run tests for less than seven days or longer than four weeks. In both scenarios, you are leaving yourself open to the chance of inconsistent and polluted data. When you run a test for less than seven days, website data inconsistencies you are not aware of may affect your results. So, give the test results a chance to stabilize. If you run a test for more than four weeks, you are allowing external factors to have a larger impact on your results.

4. No to “Quick fixes”

Human psychology is complex. Conversion optimization is about understanding visitor behavior and adjusting website design, copy and process to persuade these visitors to convert. Conversion optimization is not a light switch you turn on and off. It is a long-term commitment. Some tests will produce results and some will not. Increases in conversion rates are great but what you are looking for is a window to visitor behavior.

5. No to “Tests without marketing insights”

Call it whatever you like: forensic analysis, post-test analysis, test results assessment. You should extract actionable marketing insights from each test to deploy across channels and verticals. The real power of any testing program lies beyond the results.
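
As one small illustration of reading results with a hypothesis in mind (a supplement to, not a substitute for, the analysis described above), a two-variant test can be sanity-checked with a two-proportion z-test. The figures below are hypothetical:

from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

z = two_proportion_z(conv_a=120, n_a=4_000, conv_b=160, n_b=4_000)
print(round(z, 2))  # 2.43; |z| > 1.96 suggests significance at roughly 95%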

If you follow the steps outlined in this post, you will have a lot to do.

So, happy testing!

About the author: This guide was written by Khalid Saleh. He is the CEO of Invesp, a conversion optimization software and services firm with clients in 11 different countries.


SEO Book


How 3 Mega E-Commerce Websites Approach Title Tag & Meta Description Optimization

ecommerce-shopping

Here’s a test: Google “fairy wings” right now. Your job is to quickly find which result will sell you a set of glitter fairy wings and preferably include free shipping.

Now that you’ve begun your search, how do you know which result will bring you to the most qualified products? One way is to visually scan the title and description snippets in your search results.

There are nearly 12 billion Google searches per month. Consumers search for products or services they need, and often use the snippets in search results as a deciding factor in whether to click or keep scrolling.

The examples below show how three mega e-commerce sites approach title tags and meta descriptions, what they’re doing right and some additional opportunities.

Alibaba.com

alibaba logo

Alibaba.com takes a keyword-heavy approach to title tags and meta descriptions. Jam-packed with keywords, their title tags and meta descriptions often exceed recommended character counts and do not make a compelling argument for click-throughs.

What they’re doing right

  • Specifying title tags and meta descriptions on every page
  • Including keywords in title tags and meta descriptions
  • Using action-oriented meta descriptions to call readers to “Find quality [product name here]”

Strategic recommendations

  • Reduce title tag length: Lengths are consistently over 100 characters. Limiting the character count to 50–60 will reduce truncation in search results and allow Alibaba.com to rein in its optimization strategy to focus on 1–2 top-priority keywords per page.
  • Reduce meta description length: Descriptions tend to run upwards of 200 characters on this site. Limiting them to 160 characters or fewer will allow Alibaba.com to lead with a complete, cohesive sentence in search results.
  • Draft unique and compelling meta descriptions: Meta descriptions on this site’s product pages simply reorder the keywords listed in the title tag and tend to trail off into lists of keywords for the bots to read. Drafting descriptions for readers instead of search bots will improve click-through rates with concise, actionable language that emphasizes Alibaba.com’s value proposition.

alibaba meta example

Amazon

amazon logo

Amazon appears to take a minimalistic approach to title tags and meta descriptions. Template-style descriptions leave room for improvement in terms of providing useful and compelling reasons to click.

What they’re doing right

  • Specifying title tags and meta descriptions on every page
  • Not exceeding character limitations in most cases

Strategic recommendations

  • Draft more robust meta descriptions: Second-level category pages such as Toys & Games or Electronics appear to have thin, default meta titles and descriptions. Because of the incomplete description in the example below, Google has opted to pull additional copy from the page that it feels better represents the content. Amazon could take control of this lost meta description real estate by providing a detailed and compelling description.
  • Draft compelling title tags: Although “Toys & Games” may be the actual page title, this title tag does not compel a reader to click. It does not evoke interest, curiosity, or excitement. We recommend drafting a title that highlights the value proposition or ties into an overarching brand voice.

Amazon meta example

Best Buy

best buy logo

Best Buy’s approach to meta titles and descriptions is the perfect mix of taglines, keywords and marketing objectives to provide attractive page snippets you can’t help but click.

What they’re doing right

  • Concise language that defines the benefit of shopping with them: In-store pickup, free shipping on thousands of products, expert service.
  • Appropriate lengths to avoid truncation in search results
  • Unique title tags and meta descriptions on every page

Strategic Recommendations

  • Keep rocking your mad meta skills!

best buy meta example

For years, optimization experts have been told that keywords within meta titles and descriptions do not affect organic ranking – and while Google’s ranking algorithm may not be reading these keywords, users are. They’re deciding which search result to click based on their perception of each result page’s relevance.

How compelling are your title tags and meta descriptions? Ensure they follow recommended character limits, include 1-2 keywords most relevant to the page’s content and concisely pitch your value proposition. If you do, your glittery fairy wings should be flying off the shelves in no time.

For more real-life examples of search engine optimization strategies and results, check out TopRank Marketing’s integrated marketing case studies.

Header image via Shutterstock.



Online Marketing Blog – TopRank®
