Tag Archive | "Optimization"

Ask MarketingSherpa: Maturity of conversion rate optimization (CRO) industry

Marketers and experts weigh in on where CRO is in the adoption lifecycle.
MarketingSherpa Blog



Landing Page Optimization: Original MarketingSherpa Landing Page Handbook now available for free download

The MarketingSherpa Landing Page Handbook is one of the most popular resources we have offered in 20 years of publishing, and we are now offering this handbook free to you, the MarketingSherpa reader.
MarketingSherpa Blog


The One-Hour Guide to SEO: Keyword Targeting & On-Page Optimization – Whiteboard Friday

Posted by randfish

We’ve covered strategy, keyword research, and how to satisfy searcher intent — now it’s time to tackle optimizing the webpage itself! In the fourth part of the One-Hour Guide to SEO, Rand offers up an on-page SEO checklist to start you off on your way towards perfectly optimized and keyword-targeted pages.

If you missed them, check out the other episodes in the series so far.

A picture of the whiteboard. The content is all detailed within the transcript below.


Video Transcription

Howdy, Moz fans. Welcome to another edition of our special One-Hour Guide to SEO. We are now on Part IV – Keyword Targeting and On-Page Optimization. So hopefully you’ve watched Part III, where we talked about searcher satisfaction: how to make sure searchers are happy with the page content you create and the user experience you build for them. Hopefully you’ve also seen Part II, where we talked about keyword research: how to make sure you target words and phrases that searchers are actually looking for, that you think you can actually rank for, and that actually get real organic click-through rate, because Google’s zero-click searches are rising.

A depiction of a site with important on-page SEO elements highlighted, drawn on the whiteboard.

Now we’re into on-page SEO. So this is essentially taking the words and phrases that we know we want to rank for with the content that we know will help searchers accomplish their task. Now how do we make sure that the page is optimal for ranking in Google?

On-page SEO has evolved

Well, this is very different from the way it was years ago. A long time ago, and unfortunately many people still believe this to be true about SEO, the question was: How do I stuff my keywords into all the right tags and places on the page? How do I take advantage of things like the meta keywords tag, which hasn’t been used in a decade, maybe two? How do I stuff all my words and phrases into my title, my URL, my description, my headline, my H2 through H6 tags, all these kinds of things?

Most of that does not matter, but some of it still does. Some of it is still important, and we need to run through what those are so that you give yourself the best possible chance for ranking.

The on-page SEO checklist

So what I’ve done here is created a brief on-page SEO checklist. It is not comprehensive, especially on the technical side, because we’re saving that for Part V of this Guide, the technical SEO section. But it does cover some of the most important things.

☑ Descriptive, compelling, keyword-rich title element

Those include things like a descriptive, compelling, keyword-rich but not stuffed title element, also called the page title or a title tag. So, for example, if I am a tool website, like toolsource.com — I made that domain name up, I assume it’s registered to somebody — and I want to rank for “best online survey tools,” well, “The Best Online Survey Tools for 2019” is a great title tag, and it’s very different from “best online survey tools, best online survey software, best online survey software 2019.” You’ve seen title tags like that. You’ve seen pages that contain stuff like that. That is no longer good SEO practice.

So we want a title that is descriptive, compelling, and makes me want to click. Remember that this title is also going to show up in the search results as the title of the snippet that your website appears in.
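For instance, the title element lives in the page’s <head>. A minimal sketch, reusing the made-up toolsource.com example:

  <head>
    <title>The Best Online Survey Tools for 2019 | ToolSource</title>
  </head>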

☑ Meta description designed to draw the click

Second, a meta description. This is still used by search engines, not for rankings though. Sort of think of it like ad text. You are drawing a click, or you’re attempting to draw the click. So what you want to do is have a description that tells people what’s on the page and inspires them, incites them, makes them want to click on your result instead of somebody else’s. That’s your chance to say, “Here’s why we’re valuable and useful.”
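As an illustration, the meta description is a single tag in the <head>; the copy below is invented:

  <meta name="description" content="We tested 10 of the best online survey tools on price, features, and ease of use, so you can pick the right one for your team.">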

☑ Easy-to-read, sensible, short URL

An easy-to-read, sensible, short URL. For example, toolsource.com/reviews/best-online-surveys-2019. Perfect, very legible, very readable. I see that in the results, I think, “Okay, I know what that page is going to be.” I see that copied and pasted somewhere on the web, I think, “I know what’s going to be at that URL. That looks relevant to me.”

Or reviews.best-online-tools.info. Okay, well, first off, that’s a freaking terrible domain name. Then the path is /oldseqs?ide=17, plus a bunch of weird letters, a tab detail parameter equals this, and a UTM parameter equals that. I don’t know what this is. I don’t know what all this means. By the way, having more than one or two URL parameters is very poorly correlated with and not recommended for trying to rank in search results. So you want to try and rewrite these to be more friendly, shorter, more sensible, and readable by a human being. That will help Google as well.

☑ First paragraph optimized for appearing in featured snippets

That first paragraph, the first paragraph of the content or the first few words of the page, should be optimized for appearing in what Google calls featured snippets. Now, a featured snippet is what appears when I perform a search and, for many queries, I don’t just see a list of pages. Sometimes I’ll see this box, often with an image and a bunch of descriptive text that’s drawn from the page, often from the first paragraph or two. So if you want to get that featured snippet, you have to be able to rank on page one, and you need to be optimized to answer the query right in your first paragraph. This is an opportunity for you: you might be ranking in position three or four or five, but still have the featured snippet answer above all the other results. It’s awesome when you can do this in SEO, a very, very powerful thing. For featured snippet optimization, there are a bunch of resources on Moz’s website that we can point you to as well.

☑ Use the keyword target intelligently in…

☑ The headline

So if I’m trying to rank for “best online survey tools,” I would try and use that in my headline. Generally speaking, I like to have the headline and the title of the piece nearly the same or exactly the same so that when someone clicks on that title, they get the same headline on the page and they don’t get this cognitive dissonance between the two.

☑ The first paragraph

The first paragraph, we talked about. 

☑ The page content

The page’s content, you don’t want to have a page that’s talking about best online survey tools and you never mention online surveys. That would be a little weird. 

☑ Internal link anchors

An internal link anchor. So if other places on your website talk about online survey tools, you should be linking to this page. This is helpful for Google finding it, helpful for visitors finding it, and helpful to say this is the page that is about this on our website.
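For illustration, an internal link pointing at this page might look like the following (the URL reuses the made-up example above):

  <a href="https://toolsource.com/reviews/best-online-surveys-2019">best online survey tools</a>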

A whiteboard drawing depicting how to target one page with multiple keywords vs multiple pages targeting single keywords.

I do strongly recommend taking the following advice: we are no longer in a world where it makes sense to target one keyword per page. For example, best online survey tools, best online survey software, and best online survey tools 2019 are technically three unique keyword phrases. They have different search volumes, and slightly different results will show up for each of them. But it is no longer the case, as it was maybe a decade ago, that you should create a separate page for each one of those phrases.

Instead, because these all share the same searcher intent, I want to go with one page, just a single URL that targets all the keywords sharing that intent. If searchers are looking for exactly the same thing, just with slight variations in how they phrase it, you should have one page that serves all of those keywords rather than multiple pages that try to break them up, for a bunch of reasons. One, it’s really hard to get links to all those different pages. Getting links, period, is very challenging, and you need them to rank.

Second, the difference between those pages would be very, very subtle, and it would seem very awkward to Google that you have these slight variations of almost the same thing. It might even look to them like duplicate, very similar, or low-quality content, which can get you down-ranked. So stick to one page per set of shared-intent keywords.

☑ Leverage appropriate rich snippet options

Next, you want to leverage appropriate rich snippet options. So, for example, if you are in the recipes space, you can use schema markup for recipes to show Google that you’ve got a picture of the recipe, a cooking time, and all these different details. Google offers this in a wide variety of places. When you’re doing reviews, they offer you the star ratings. Schema.org has a full list of these, and Google’s rich snippets markup page offers a bunch more. So we’ll point you to both of those as well.
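As a rough sketch, recipe markup in JSON-LD, the format Google recommends, might look something like this (all the values are made up):

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Classic Chocolate Chip Cookies",
    "image": "https://example.com/images/chocolate-chip-cookies.jpg",
    "cookTime": "PT25M",
    "recipeIngredient": [
      "2 cups flour",
      "1 cup chocolate chips"
    ]
  }
  </script>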

☑ Images on the page employ…

Last, but certainly not least, because image search is such a huge portion of where Google’s search traffic comes from and goes to, it is very wise to optimize the images on the page. Image search can now send significant traffic your way, and optimizing images can sometimes mean that other people will find them through Google Images, put them on their own website, and link back to you, which solves a huge problem. Getting links is very hard, and images are a great way to do it.

☑ Descriptive, keyword-rich filenames

The images on your page should employ descriptive, keyword-rich filenames, meaning if I have one for Typeform, I don’t want it to be pic1, pic2, or pic3. I want the filename to be typeformlogo or typeformsurveysoftware.

☑ Descriptive alt attributes

The alt attribute or alt tag is part of how you describe that for screen readers and other accessibility-focused devices, and Google also uses that text too. 

☑ Caption text (if appropriate)

Caption text, if that’s appropriate, if you have like a photograph and a caption describing it, you want to be descriptive of what’s actually in the picture.

☑ Stored in same domain and subdomain

In order to perform well, these files generally need to be hosted on the same domain and subdomain as the page. If, for example, all your images are stored on an Amazon Web Services domain and you don’t bother rewriting or making sure that the domain looks like it’s on toolsource.com/photos or /images, that can cause real ranking problems. Oftentimes you won’t perform at all in Google Images because Google doesn’t associate the image with the same domain. The same subdomain is preferable as well.
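Pulling those image items together, a minimal sketch (the filename, alt text, and caption are invented, and the image lives on the site’s own domain):

  <figure>
    <!-- Descriptive, keyword-rich filename, hosted on the same domain -->
    <img src="/images/typeform-survey-software.png" alt="Typeform's survey builder showing a multiple-choice question">
    <figcaption>Building a multiple-choice question in Typeform's survey editor.</figcaption>
  </figure>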

If you do all these things and you nail searcher intent and you’ve got your keyword research, you are ready to move on to technical SEO and link building and then start ranking. So we’ll see you for that next edition next week. Take care.

Video transcription by Speechpad.com



Moz Blog


Page Speed Optimization: Metrics, Tools, and How to Improve

Posted by BritneyMuller

Page speed is an important consideration for your SEO work, but it’s a complex subject that tends to be very technical. What are the most crucial things to understand about your site’s page speed, and how can you begin to improve? In this week’s edition of Whiteboard Friday, Britney Muller goes over what you need to know to get started.


Video Transcription

Hey, Moz fans. Welcome to another edition of Whiteboard Friday. Today we’re going over all things page speed and really getting to the bottom of why it’s so important for you to be thinking about and working on as you do your work.

At the very fundamental level I’m going to briefly explain just how a web page is loaded. That way we can sort of wrap our heads around why all this matters.

How a webpage is loaded

A user goes to a browser, puts in your website, and there is a DNS request. This points at your domain name provider, so maybe GoDaddy, and that points to your server, where your files are located. This is where it gets interesting. The DOM starts to load all of your HTML, your CSS, and your JavaScript, but very rarely does that first pull fetch all of the scripts or code needed to render or load the web page.

Typically the DOM will need to request additional resources from your server to make everything happen, and this is where things start to really slow down your site. I hope that bit of background knowledge will help us triage some of these issues.

Issues that could be slowing down your site

What are some of the most common culprits?

  1. First and foremost is images. Large images are the biggest culprit of slow loading web pages.
  2. Hosting can cause issues.
  3. Plugins, apps, and widgets, basically any third-party script as well can slow down load time.
  4. Your theme and any large files beyond that can really slow things down as well.
  5. Redirects, the number of hops needed to get to a web page will slow things down.
  6. Then JavaScript, which we’ll get into in a second.

But all of these things can be a culprit. So we’re going to go over some resources, some of the metrics and what they mean, and then what are some of the ways that you can improve your page speed today.

Page speed tools and resources

The primary resources I have listed here are Google tools and Google suggested insights. I think what’s really interesting about these is we get to see what their concerns are as far as page speed goes and really start to see the shift towards the user. We should be thinking about that anyway. But first and foremost, how is this affecting people that come to your site, and then secondly, how can we also get the dual benefit of Google perceiving it as higher quality?

We know that Google suggests a website load in two to three seconds. The faster the better, obviously, but that’s roughly the range. I also highly suggest you take a competitive view of that: put your competitors into some of these tools and benchmark your speed goals against what’s competitive in your industry. I think that’s a cool way to go into this.

Chrome User Experience Report

This is Chrome real user metrics. Unfortunately, it’s only available for larger, popular websites, but you get some really good data out of it. It’s housed on BigQuery, so some basic SQL knowledge is needed.

Lighthouse

Lighthouse, one of my favorites, is available right in Chrome DevTools. If you are on a web page and you click Inspect Element and you open up Chrome DevTools, in the far right tab where it says Audits, you can run a Lighthouse report right in your browser.

What I love about it is it gives you very specific examples and fixes that you can do. A fun fact to know is it will automatically be on the simulated fast 3G, and notice they’re focused on mobile users on 3G. I like to switch that to applied fast 3G, because it has Lighthouse do an actual run of that load. It takes a little bit longer, but it seems to be a little bit more accurate. Good to know.

PageSpeed Insights

PageSpeed Insights is really interesting. They’ve now incorporated the Chrome User Experience Report. But if you’re not one of those large sites, it’s not even going to measure your actual page speed. It’s going to look at how your site is configured and provide feedback according to that and score it. Just something good to be aware of. It still provides good value.

Test your mobile website speed and performance

I don’t know what the title of this is. If you do, please comment down below. But it’s located on testmysite.thinkwithgoogle.com. This one is really cool because it tests the mobile speed of your site. If you scroll down, it directly ties it into ROI for your business or your website. We see Google leveraging real-world metrics, tying it back to what’s the percentage of people you’re losing because your site is this slow. It’s a brilliant way to sort of get us all on board and fighting for some of these improvements.

Pingdom and GTmetrix are non-Google products or non-Google tools, but super helpful as well.

Site speed metrics

So what are some of the metrics?

First paint

We’re going to go over first paint, which is basically just the first non-blank paint on a screen. It could be just the first pixel change. That initial change is first paint.

First contentful paint

First contentful paint is when the first content appears. This might be part of the nav or the search bar or whatever it might be. That’s the first contentful paint.

First meaningful paint

First meaningful paint is when primary content is visible. When you sort of get that reaction of, “Oh, yeah, this is what I came to this page for,” that’s first meaningful paint.

Time to interactive

Time to interactive is when it’s visually usable and engage-able. So we’ve all gone to a web page and it looks like it’s done, but we can’t quite use it yet. That’s where this metric comes in. So when is it usable for the user? Again, notice how user-centric even these metrics are. Really, really neat.

DOM content loaded

The DOM content loaded, this is when the HTML is completely loaded and parsed. So some really good ones to keep an eye on and just to be aware of in general.

Ways to improve your page speed

HTTP/2

HTTP/2 can definitely speed things up. As to what extent, you have to sort of research that and test.

Preconnect, prefetch, preload

Preconnect, prefetch, and preload are really interesting and important in speeding up a site. We see Google doing this on their SERPs. If you inspect an element, you can see Google prefetching some of the URLs so that they load faster for you if you click on some of those results. You can similarly do this on your site. It helps to load and speed up that process.
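Each of these hints is a single line in the page’s <head>. A hedged sketch (hosts and paths are hypothetical):

  <!-- Open a connection to a third-party host before it's needed -->
  <link rel="preconnect" href="https://cdn.example.com">

  <!-- Fetch a critical resource for the current page at high priority -->
  <link rel="preload" href="/css/main.css" as="style">

  <!-- Fetch a resource the visitor will likely need on the next page -->
  <link rel="prefetch" href="/pricing.html">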

Enable caching & use a content delivery network (CDN)

Caching is so, so important. Definitely do your research and make sure that’s set up properly. Same with CDNs, so valuable in speeding up a site, but you want to make sure that your CDN is set up properly.

Compress images

The easiest and probably quickest way for you to speed up your site today is really just to compress those images. It’s such an easy thing to do. There are all sorts of free tools available for you to compress them. Optimizilla is one. You can even use tools on your computer, like Save for Web, to compress them properly.

Minify resources

You can also minify resources. So it’s really good to be aware of what minification, bundling, and compression do so you can have some of these more technical conversations with developers or with anyone else working on the site.

So this is sort of a high-level overview of page speed. There’s a ton more to cover, but I would love to hear your input and your questions and comments down below in the comment section.

I really appreciate you checking out this edition of Whiteboard Friday, and I will see you all again soon. Thanks so much. See you.

Video transcription by Speechpad.com



Moz Blog


Rewriting the Beginner’s Guide to SEO, Chapter 5: Technical Optimization

Posted by BritneyMuller

After a short break, we’re back to share our working draft of Chapter 5 of the Beginner’s Guide to SEO with you! This one was a whopper, and we’re really looking forward to your input. Giving beginner SEOs a solid grasp of just what technical optimization for SEO is and why it matters — without overwhelming them or scaring them off the subject — is a tall order indeed. We’d love to hear what you think: did we miss anything you think is important for beginners to know? Leave us your feedback in the comments!

And in case you’re curious, check back on our outline, Chapter One, Chapter Two, Chapter Three, and Chapter Four to see what we’ve covered so far.


Chapter 5: Technical Optimization

Basic technical knowledge will help you optimize your site for search engines and establish credibility with developers.

Now that you’ve crafted valuable content on the foundation of solid keyword research, it’s important to make sure it’s not only readable by humans, but by search engines too!

You don’t need to have a deep technical understanding of these concepts, but it is important to grasp what these technical assets do so that you can speak intelligently about them with developers. Speaking your developers’ language is important because you will likely need them to carry out some of your optimizations. They’re unlikely to prioritize your asks if they can’t understand your request or see its importance. When you establish credibility and trust with your devs, you can begin to cut through the red tape that often blocks crucial work from getting done.

Pro tip: SEOs need cross-team support to be effective

It’s vital to have a healthy relationship with your developers so that you can successfully tackle SEO challenges from both sides. Don’t wait until a technical issue causes negative SEO ramifications to involve a developer. Instead, join forces for the planning stage with the goal of avoiding the issues altogether. If you don’t, it can cost you in time and money later.

Beyond cross-team support, understanding technical optimization for SEO is essential if you want to ensure that your web pages are structured for both humans and crawlers. To that end, we’ve divided this chapter into three sections:

  1. How websites work
  2. How search engines understand websites
  3. How users interact with websites

Since the technical structure of a site can have a massive impact on its performance, it’s crucial for everyone to understand these principles. It might also be a good idea to share this part of the guide with your programmers, content writers, and designers so that all parties involved in a site’s construction are on the same page.

1. How websites work

If search engine optimization is the process of optimizing a website for search, SEOs need at least a basic understanding of the thing they’re optimizing!

Below, we outline the website’s journey from domain name purchase all the way to its fully rendered state in a browser. An important component of the website’s journey is the critical rendering path, which is the process of a browser turning a website’s code into a viewable page.

Knowing this about websites is important for SEOs to understand for a few reasons:

  • The steps in this webpage assembly process can affect page load times, and speed is not only important for keeping users on your site, but it’s also one of Google’s ranking factors.
  • Google renders certain resources, like JavaScript, on a “second pass.” Google will look at the page without JavaScript first, then a few days to a few weeks later, it will render JavaScript, meaning SEO-critical elements that are added to the page using JavaScript might not get indexed.

Imagine that the website loading process is your commute to work. You get ready at home, gather your things to bring to the office, and then take the fastest route from your home to your work. It would be silly to put on just one of your shoes, take a longer route to work, drop your things off at the office, then immediately return home to get your other shoe, right? That’s sort of what inefficient websites do. This chapter will teach you how to diagnose where your website might be inefficient, what you can do to streamline, and the positive ramifications on your rankings and user experience that can result from that streamlining.

Before a website can be accessed, it needs to be set up!

  1. Domain name is purchased. Domain names like moz.com are purchased from a domain name registrar such as GoDaddy or HostGator. These registrars are just organizations that manage the reservations of domain names.
  2. Domain name is linked to IP address. The Internet doesn’t understand names like “moz.com” as website addresses without the help of the Domain Name System (DNS). The Internet uses a series of numbers called an Internet protocol (IP) address (ex: 127.0.0.1), but we want to use names like moz.com because they’re easier for humans to remember. We need DNS to link those human-readable names with machine-readable numbers.

How a website gets from server to browser

  1. User requests domain. Now that the name is linked to an IP address via DNS, people can request a website by typing the domain name directly into their browser or by clicking on a link to the website.
  2. Browser makes requests. That request for a web page prompts the browser to make a DNS lookup request to convert the domain name to its IP address. The browser then makes a request to the server for the code your web page is constructed with, such as HTML, CSS, and JavaScript.
  3. Server sends resources. Once the server receives the request for the website, it sends the website files to be assembled in the searcher’s browser.
  4. Browser assembles the web page. The browser has now received the resources from the server, but it still needs to put it all together and render the web page so that the user can see it in their browser. As the browser parses and organizes all the web page’s resources, it’s creating a Document Object Model (DOM). The DOM is what you can see when you right click + “inspect element” on a web page in your Chrome browser (learn how to inspect elements in other browsers).
  5. Browser makes final requests. The browser will only show a web page after all the page’s necessary code is downloaded, parsed, and executed, so at this point, if the browser needs any additional code in order to show your website, it will make an additional request from your server.
  6. Website appears in browser. Whew! After all that, your website has now been transformed (rendered) from code to what you see in your browser.

Pro tip: Talk to your developers about async!

Something you can bring up with your developers is shortening the critical rendering path by setting scripts to “async” when they’re not needed to render content above the fold, which can make your web pages load faster. Async tells the DOM that it can continue to be assembled while the browser is fetching the scripts needed to display your web page. If the DOM has to pause assembly every time the browser fetches a script (called “render-blocking scripts”), it can substantially slow down your page load.

It would be like going out to eat with your friends and having to pause the conversation every time one of you went up to the counter to order, only resuming once they got back. With async, you and your friends can continue to chat even when one of you is ordering. You might also want to bring up other optimizations that devs can implement to shorten the critical rendering path, such as removing unnecessary scripts entirely, like old tracking scripts.
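As a small illustration (the script name is made up), the difference is a single attribute:

  <!-- Render-blocking: HTML parsing pauses while this downloads and runs -->
  <script src="/js/tracking.js"></script>

  <!-- Async: the script downloads in parallel while the DOM keeps assembling -->
  <script src="/js/tracking.js" async></script>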

Now that you know how a website appears in a browser, we’re going to focus on what a website is made of — in other words, the code (programming languages) used to construct those web pages.

The three most common are:

  • HTML – What a website says (titles, body content, etc.)
  • CSS – How a website looks (color, fonts, etc.)
  • JavaScript – How it behaves (interactive, dynamic, etc.)

HTML: What a website says

HTML stands for hypertext markup language, and it serves as the backbone of a website. Elements like headings, paragraphs, lists, and content are all defined in the HTML.

Here’s a simplified, made-up example of a small web page’s HTML, with headings, paragraphs, and lists each defined by their own tags:
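  <!-- A made-up page, stripped to the essentials -->
  <!DOCTYPE html>
  <html>
    <head>
      <title>How to Bake a Cake</title>
    </head>
    <body>
      <h1>How to Bake a Cake</h1>
      <p>Baking a cake from scratch is easier than you might think.</p>
      <ul>
        <li>Flour</li>
        <li>Sugar</li>
        <li>Eggs</li>
      </ul>
    </body>
  </html>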

HTML is important for SEOs to know because it’s what lives “under the hood” of any page they create or work on. While your CMS likely doesn’t require you to write your pages in HTML (ex: selecting “hyperlink” will allow you to create a link without you having to type in “a href=”), it is what you’re modifying every time you do something to a web page such as adding content, changing the anchor text of internal links, and so on. Google crawls these HTML elements to determine how relevant your document is to a particular query. In other words, what’s in your HTML plays a huge role in how your web page ranks in Google organic search!

CSS: How a website looks

CSS stands for cascading style sheets, and this is what causes your web pages to take on certain fonts, colors, and layouts. HTML was created to describe content, rather than to style it, so when CSS entered the scene, it was a game-changer. With CSS, web pages could be “beautified” without requiring manual coding of styles into the HTML of every page — a cumbersome process, especially for large sites.

It wasn’t until 2014 that Google’s indexing system began to render web pages more like an actual browser, as opposed to a text-only browser. A black-hat SEO practice that tried to capitalize on Google’s older indexing system was hiding text and links via CSS for the purpose of manipulating search engine rankings. This “hidden text and links” practice is a violation of Google’s quality guidelines.

Components of CSS that SEOs, in particular, should care about:

  • Since style directives can live in external stylesheet files (CSS files) instead of your page’s HTML, it makes your page less code-heavy, reducing file transfer size and making load times faster.
  • Browsers still have to download resources like your CSS file, so compressing them can make your web pages load faster, and page speed is a ranking factor.
  • Having your pages be more content-heavy than code-heavy can lead to better indexing of your site’s content.
  • Using CSS to hide links and content can get your website manually penalized and removed from Google’s index.

JavaScript: How a website behaves

In the earlier days of the Internet, web pages were built with HTML. When CSS came along, webpage content had the ability to take on some style. When the programming language JavaScript entered the scene, websites could now not only have structure and style, but they could be dynamic.

JavaScript has opened up a lot of opportunities for non-static web page creation. When someone attempts to access a page that is enhanced with this programming language, that user’s browser will execute the JavaScript against the static HTML that the server returned, resulting in a web page that comes to life with some sort of interactivity.

You’ve definitely seen JavaScript in action — you just may not have known it! That’s because JavaScript can do almost anything to a page. It could create a pop up, for example, or it could request third-party resources like ads to display on your page.

JavaScript can pose some problems for SEO, though, since search engines don’t view JavaScript the same way human visitors do. That’s because of client-side versus server-side rendering. Most JavaScript is executed in a client’s browser. With server-side rendering, on the other hand, the files are executed at the server and the server sends them to the browser in their fully rendered state.

SEO-critical page elements such as text, links, and tags that are loaded on the client’s side with JavaScript, rather than represented in your HTML, are invisible from your page’s code until they are rendered. This means that search engine crawlers won’t see what’s in your JavaScript — at least not initially.

Google says that, as long as you’re not blocking Googlebot from crawling your JavaScript files, they’re generally able to render and understand your web pages just like a browser can, which means that Googlebot should see the same things as a user viewing a site in their browser. However, due to this “second wave of indexing” for client-side JavaScript, Google can miss certain elements that are only available once JavaScript is executed.

There are also some other things that could go wrong during Googlebot’s process of rendering your web pages, which can prevent Google from understanding what’s contained in your JavaScript:

  • You’ve blocked Googlebot from JavaScript resources (ex: with robots.txt, like we learned about in Chapter 2)
  • Your server can’t handle all the requests to crawl your content
  • The JavaScript is too complex or outdated for Googlebot to understand
  • JavaScript doesn’t “lazy load” content into the page until after the crawler has finished with the page and moved on.

Needless to say, while JavaScript does open a lot of possibilities for web page creation, it can also have some serious ramifications for your SEO if you’re not careful. Thankfully, there is a way to check whether Google sees the same thing as your visitors. To see your page the way Googlebot views it, use Google Search Console’s “Fetch and Render” tool. From your site’s Google Search Console dashboard, select “Crawl” from the left navigation, then “Fetch as Google.”

From this page, enter the URL you want to check (or leave blank if you want to check your homepage) and click the “Fetch and Render” button. You also have the option to test either the desktop or mobile version.

In return, you’ll get a side-by-side view of how Googlebot saw your page versus how a visitor to your website would have seen the page. Below, Google will also show you a list of any resources they may not have been able to get for the URL you entered.

Understanding the way websites work lays a great foundation for what we’ll talk about next, which is technical optimizations to help Google understand the pages on your website better.

2. How search engines understand websites

Search engines have gotten incredibly sophisticated, but they can’t (yet) find and interpret web pages quite like a human can. The following sections outline ways you can better deliver content to search engines.

Help search engines understand your content by structuring it with Schema

Imagine being a search engine crawler scanning down a 10,000-word article about how to bake a cake. How do you identify the author, recipe, ingredients, or steps required to bake a cake? This is where schema (Schema.org) markup comes in. It allows you to spoon-feed search engines more specific classifications for what type of information is on your page.

Schema is a way to label or organize your content so that search engines have a better understanding of what certain elements on your web pages are. This code provides structure to your data, which is why schema is often referred to as “structured data.” The process of structuring your data is often referred to as “markup” because you are marking up your content with organizational code.

JSON-LD is Google’s preferred schema markup (announced in May ‘16), which Bing also supports. To view a full list of the thousands of available schema markups, visit Schema.org or view the Google Developers Introduction to Structured Data for additional information on how to implement structured data. After you implement the structured data that best suits your web pages, you can test your markup with Google’s Structured Data Testing Tool.
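To make that concrete, a minimal JSON-LD block for the cake article mentioned earlier might look like this (all values invented):

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How to Bake a Cake",
    "author": {
      "@type": "Person",
      "name": "Jane Doe"
    },
    "datePublished": "2019-03-15"
  }
  </script>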

In addition to helping bots like Google understand what a particular piece of content is about, schema markup can also enable special features to accompany your pages in the SERPs. These special features are referred to as “rich snippets,” and you’ve probably seen them in action. They’re things like:

  • Top Stories carousel
  • Review stars
  • Sitelinks search boxes
  • Recipes

Remember, using structured data can help enable a rich snippet to be present, but does not guarantee it. Other types of rich snippets will likely be added in the future as the use of schema markup increases.

Some last words of advice for schema success:

  • You can use multiple types of schema markup on a page. However, if you mark up one element, like a product for example, and there are other products listed on the page, you must also mark up those products.
  • Don’t mark up content that is not visible to visitors and follow Google’s Quality Guidelines. For example, if you add review structured markup to a page, make sure those reviews are actually visible on that page.
  • If you have duplicate pages, Google asks that you mark up each duplicate page with your structured markup, not just the canonical version.
  • Provide original and updated (if applicable) content on your structured data pages.
  • Structured markup should be an accurate reflection of your page.
  • Try to use the most specific type of schema markup for your content.
  • Marked-up reviews should not be written by the business. They should be genuine unpaid business reviews from actual customers.

Tell search engines about your preferred pages with canonicalization

When Google crawls the same content on different web pages, it sometimes doesn’t know which page to index in search results. This is why the rel=”canonical” tag was invented: to help search engines index the preferred version of content rather than all of its duplicates.

The rel=”canonical” tag allows you to tell search engines where the original, master version of a piece of content is located. You’re essentially saying, “Hey search engine! Don’t index this; index this source page instead.” So, if you want to republish a piece of content, whether exactly or slightly modified, but don’t want to risk creating duplicate content, the canonical tag is here to save the day.

Proper canonicalization ensures that every unique piece of content on your website has only one URL. To prevent search engines from indexing multiple versions of a single page, Google recommends having a self-referencing canonical tag on every page on your site. Without a canonical tag telling Google which version of your web page is the preferred one, http://www.example.com could get indexed separately from http://example.com, creating duplicates.
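In practice, that’s one line in the <head> of every version of the page, each pointing at the preferred URL (hypothetical here):

  <link rel="canonical" href="https://www.example.com/blog/original-post">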

“Avoid duplicate content” is an Internet truism, and for good reason! Google wants to reward sites with unique, valuable content — not content that’s taken from other sources and repeated across multiple pages. Because engines want to provide the best searcher experience, they will rarely show multiple versions of the same content, opting instead to show only the canonicalized version, or if a canonical tag does not exist, whichever version they deem most likely to be the original.

Pro tip: Distinguishing between content filtering & content penalties
There is no such thing as a duplicate content penalty. However, you should try to keep duplicate content from causing indexing issues by using the rel=”canonical” tag when possible. When duplicates of a page exist, Google will choose a canonical and filter the others out of search results. That doesn’t mean you’ve been penalized. It just means that Google only wants to show one version of your content.

It’s also very common for websites to have multiple duplicate pages due to sort and filter options. For example, on an e-commerce site, you might have what’s called a faceted navigation that allows visitors to narrow down products to find exactly what they’re looking for, such as a “sort by” feature that reorders results on the product category page from lowest to highest price. This could create a URL that looks something like this: example.com/mens-shirts?sort=price_ascending. Add in more sort/filter options like color, size, material, brand, etc. and just think about all the variations of your main product category page this would create!

To learn more about different types of duplicate content, this post by Dr. Pete helps distill the different nuances.

3. How users interact with websites

In Chapter 1, we said that despite SEO standing for search engine optimization, SEO is as much about people as it is about search engines themselves. That’s because search engines exist to serve searchers. This goal helps explain why Google’s algorithm rewards websites that provide the best possible experiences for searchers, and why some websites, despite having qualities like robust backlink profiles, might not perform well in search.

When we understand what makes searchers’ web browsing experiences optimal, we can create those experiences for maximum search performance.

Ensuring a positive experience for your mobile visitors

Given that well over half of all web traffic today comes from mobile, it’s safe to say that your website should be accessible and easy to navigate for mobile visitors. In April 2015, Google rolled out an update to its algorithm that would promote mobile-friendly pages over non-mobile-friendly pages. So how can you ensure that your website is mobile-friendly? Although there are three main ways to configure your website for mobile, Google recommends responsive web design.

Responsive design

Responsive websites are designed to fit the screen of whatever type of device your visitors are using. You can use CSS to make the web page “respond” to the device size. This is ideal because it prevents visitors from having to double-tap or pinch-and-zoom in order to view the content on your pages. Not sure if your web pages are mobile friendly? You can use Google’s mobile-friendly test to check!
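Under the hood, responsive design rests on a viewport meta tag plus CSS media queries. A minimal sketch (the class name is invented):

  <meta name="viewport" content="width=device-width, initial-scale=1">

  <style>
    /* Make the sidebar full-width on narrow screens */
    @media (max-width: 600px) {
      .sidebar { width: 100%; }
    }
  </style>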

AMP

AMP stands for Accelerated Mobile Pages, and it is used to deliver content to mobile visitors at speeds much greater than with non-AMP delivery. AMP is able to deliver content so fast because it delivers content from its cache servers (not the original site) and uses a special AMP version of HTML and JavaScript. Learn more about AMP.

Mobile-first indexing

As of 2018, Google started switching websites over to mobile-first indexing. That change sparked some confusion between mobile-friendliness and mobile-first, so it’s helpful to disambiguate. With mobile-first indexing, Google crawls and indexes the mobile version of your web pages. Making your website compatible with mobile screens is good for users and your performance in search, but mobile-first indexing happens independently of mobile-friendliness.

This has raised some concerns for websites that lack parity between mobile and desktop versions, such as showing different content, navigation, links, etc. on their mobile view. A mobile site with different links, for example, will alter the way in which Googlebot (mobile) crawls your site and sends link equity to your other pages.

Breaking up long content for easier digestion

When sites have very long pages, they have the option of breaking them up into multiple parts of a whole. This is called pagination, and it’s similar to pages in a book. In order to avoid giving the visitor too much all at once, you can break up your single page into multiple parts. This can be great for visitors, especially on e-commerce sites where there are a lot of product results in a category, but there are some steps you should take to help Google understand the relationship between your paginated pages. That markup is called rel=”next” and rel=”prev.”

You can read more about pagination in Google’s official documentation, but the main takeaways are that:

  • The first page in a sequence should only have rel=”next” markup
  • The last page in a sequence should only have rel=”prev” markup
  • Pages that have both a preceding and following page should have both rel=”next” and rel=”prev”
  • Since each page in the sequence is unique, don’t canonicalize them to the first page in the sequence. Only use a canonical tag to point to a “view all” version of your content, if you have one.
  • When Google sees a paginated sequence, it will typically consolidate the pages’ linking properties and send searchers to the first page

Pro tip: rel=”next/prev” should still have anchor text and live within an <a> link
This helps Google ensure that they pick up the rel=”next/prev”.
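Putting the list and the pro tip together, page 2 of a 4-page category might carry markup like this (URLs hypothetical):

  <head>
    <link rel="prev" href="https://example.com/mens-shirts?page=1">
    <link rel="next" href="https://example.com/mens-shirts?page=3">
  </head>
  <body>
    <!-- Per the pro tip: rel="next" also lives on a real anchor with anchor text -->
    <a href="https://example.com/mens-shirts?page=3" rel="next">Next page</a>
  </body>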

Improving page speed to mitigate visitor frustration

Google wants to serve content that loads lightning-fast for searchers. We’ve come to expect fast-loading results, and when we don’t get them, we’ll quickly bounce back to the SERP in search of a better, faster page. This is why page speed is a crucial aspect of on-site SEO. We can improve the speed of our web pages by taking advantage of tools like the ones we’ve mentioned below. Click on the links to learn more about each.

Images are one of the main culprits of slow pages!

As discussed in Chapter 4, images are one of the top reasons for slow-loading web pages! In addition to image compression, optimizing image alt text, choosing the right image format, and submitting image sitemaps, there are other technical ways to optimize the speed and the way in which images are shown to your users. Some primary ways to improve image delivery are as follows:

SRCSET: How to deliver the best image size for each device

The SRCSET attribute allows you to have multiple versions of your image and then specify which version should be used in different situations. This piece of code is added to the <img> tag (where your image is located in the HTML) to provide unique images for specific-sized devices.

This is like the concept of responsive design that we discussed earlier, except for images!

This doesn’t just speed up your image load time; it’s also a unique way to enhance your on-page user experience by providing different, optimal images to different device types.

Pro tip: There are more than just three image size versions!
It’s a common misconception that you just need a desktop, tablet, and mobile-sized version of your image. There are a huge variety of screen sizes and resolutions. Learn more about SRCSET.
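A hedged sketch of SRCSET in action (the filenames and breakpoints are made up):

  <img src="cake-800.jpg"
       srcset="cake-400.jpg 400w, cake-800.jpg 800w, cake-1600.jpg 1600w"
       sizes="(max-width: 600px) 100vw, 800px"
       alt="Frosted chocolate layer cake on a cake stand">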

Show visitors image loading is in progress with lazy loading

Lazy loading occurs when you go to a web page and, instead of seeing a blank white space where an image will be, a blurry lightweight version of the image or a colored box appears in its place while the surrounding text loads. After a few seconds, the image loads in full resolution. The popular blogging platform Medium does this really well.

The low resolution version is initially loaded, and then the full high resolution version. This also helps to optimize your critical rendering path! So while all of your other page resources are being downloaded, you’re showing a low-resolution teaser image that helps tell users that things are happening/being loaded. For more information on how you should lazy load your images, check out Google’s Lazy Loading Guidance.
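Implementations vary, and Medium’s blur-up effect takes a bit of JavaScript, but modern browsers also support a native attribute that defers offscreen images. A minimal sketch:

  <img src="/images/cake-800.jpg" loading="lazy" alt="Frosted chocolate layer cake">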

Improve speed by condensing and bundling your files

Page speed audits will often make recommendations such as “minify resource,” but what does that actually mean? Minification condenses a code file by removing things like line breaks and spaces, as well as abbreviating code variable names wherever possible.

“Bundling” is another common term you’ll hear in reference to improving page speed. The process of bundling combines a bunch of the same coding language files into one single file. For example, a bunch of JavaScript files could be put into one larger file to reduce the amount of JavaScript files for a browser.

By both minifying and bundling the files needed to construct your web page, you’ll speed up your website and reduce the number of your HTTP (file) requests.
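The effect on the page itself is fewer, smaller requests. A hypothetical before and after:

  <!-- Before: three separate JavaScript requests -->
  <script src="/js/nav.js"></script>
  <script src="/js/carousel.js"></script>
  <script src="/js/forms.js"></script>

  <!-- After bundling and minification: one smaller request -->
  <script src="/js/bundle.min.js"></script>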

Improving the experience for international audiences

Websites that target audiences from multiple countries should familiarize themselves with international SEO best practices in order to serve up the most relevant experiences. Without these optimizations, international visitors might have difficulty finding the version of your site that caters to them.

There are two main ways a website can be internationalized:

  • Language
    Sites that target speakers of multiple languages are considered multilingual websites. These sites should add something called an hreflang tag to show Google that a page has copy for another language (see the sketch after this list). Learn more about hreflang.
  • Country
    Sites that target audiences in multiple countries are called multi-regional websites and they should choose a URL structure that makes it easy to target their domain or pages to specific countries. This can include the use of a country code top level domain (ccTLD) such as “.ca” for Canada, or a generic top-level domain (gTLD) with a country-specific subfolder such as “example.com/ca” for Canada. Learn more about locale-specific URLs.
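Combining the two, a page with US and Canadian English versions might carry hreflang annotations like these (URLs hypothetical):

  <link rel="alternate" hreflang="en-us" href="https://example.com/us/">
  <link rel="alternate" hreflang="en-ca" href="https://example.com/ca/">
  <link rel="alternate" hreflang="x-default" href="https://example.com/">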

You’ve researched, you’ve written, and you’ve optimized your website for search engines and user experience. The next piece of the SEO puzzle is a big one: establishing authority so that your pages will rank highly in search results.



Moz Blog


Rewriting the Beginner’s Guide to SEO, Chapter 4: On-Page Optimization

Posted by BritneyMuller

Chapter Four of the Beginner’s Guide to SEO rewrite is chock full of on-page SEO learnings. After all the great feedback you’ve provided thus far on our outline, Chapter One, Chapter Two, and Chapter Three, we’re eager to hear how you feel about Chapter Four. What really works for you? What do you think is missing? Read on, and let us know your thoughts in the comments!


Chapter 4: On-Page Optimization

Use your research to craft your message.

Now that you know how your target market is searching, it’s time to dive into on-page optimization, the practice of crafting web pages that answer searchers’ questions. On-page SEO is multifaceted, and it extends beyond content into other things like schema and meta tags, which we’ll discuss more at length in the next chapter on technical optimization. For now, put on your wordsmithing hats — it’s time to create your content!

Creating your content

Applying your keyword research

In the last chapter, we learned methods for discovering how your target audience is searching for your content. Now, it’s time to put that research into practice. Here is a simple outline to follow for applying your keyword research:

  1. Survey your keywords and group those with similar topics and intent. Those groups will be your pages, rather than creating individual pages for every keyword variation.
  2. If you haven’t done so already, evaluate the SERP for each keyword or group of keywords to determine what type and format your content should be. Some characteristics of ranking pages to take note of:
    1. Are they image or video heavy?
    2. Is the content long-form or short and concise?
    3. Is the content formatted in lists, bullets, or paragraphs?
  3. Ask yourself, “What unique value could I offer to make my page better than the pages that are currently ranking for my keyword?”

On-page optimization allows you to turn your research into content your audience will love. Just make sure to avoid falling into the trap of low-value tactics that could hurt more than help!

Low-value tactics to avoid

Your web content should exist to answer searchers’ questions, to guide them through your site, and to help them understand your site’s purpose. Content should not be created for the purpose of ranking highly in search alone. Ranking is a means to an end, the end being to help searchers. If we put the cart before the horse, we risk falling into the trap of low-value content tactics.

Some of these tactics were introduced in Chapter 2, but by way of review, let’s take a deeper dive into some low-value tactics you should avoid when crafting search engine optimized content.

Thin content

While it’s common for a website to have unique pages on different topics, an older content strategy was to create a page for every single iteration of your keywords in order to rank on page 1 for those highly specific queries.

For example, if you were selling bridal dresses, you might have created individual pages for bridal gowns, bridal dresses, wedding gowns, and wedding dresses, even if each page was essentially saying the same thing. A similar tactic for local businesses was to create multiple pages of content for each city or region from which they wanted clients. These “geo pages” often had the same or very similar content, with the location name being the only unique factor.

Tactics like these clearly weren’t helpful for users, so why did publishers do it? Google wasn’t always as good as it is today at understanding the relationships between words and phrases (or semantics). So, if you wanted to rank on page 1 for “bridal gowns” but you only had a page on “wedding dresses,” that may not have cut it.

This practice created tons of thin, low-quality content across the web, which Google addressed specifically with its 2011 update known as Panda. This algorithm update penalized low-quality pages, which resulted in more quality pages taking the top spots of the SERPs. Google continues to iterate on this process of demoting low-quality content and promoting high-quality content today.

Google is clear that you should have a comprehensive page on a topic instead of multiple, weaker pages for each variation of a keyword.

Depiction of distinct pages for each keyword variation versus one page covering multiple variations

Duplicate content

Like it sounds, “duplicate content” refers to content that is shared between domains or between multiple pages of a single domain. “Scraped” content goes a step further, and entails the blatant and unauthorized use of content from other sites. This can include taking content and republishing as-is, or modifying it slightly before republishing, without adding any original content or value.

There are plenty of legitimate reasons for internal or cross-domain duplicate content, so Google encourages the use of a rel=canonical tag to point to the original version of the web content. While you don’t need to know about this tag just yet, the main thing to note for now is that your content should be unique in word and in value.

Depiction of how duplicate content looks between pages.

Cloaking

A basic tenet of search engine guidelines is to show the same content to the engine’s crawlers that you’d show to a human visitor. This means that you should never hide text in the HTML code of your website that a normal visitor can’t see.

When this guideline is broken, search engines call it “cloaking” and take action to prevent these pages from ranking in search results. Cloaking can be accomplished in any number of ways and for a variety of reasons, both positive and negative. Below is an example of an instance where Spotify showed different content to users than to Google.

Spotify shows a login page to Google.
Spotify shows a National Philharmonic Orchestra landing page to logged-in visitors.

In some cases, Google may let practices that are technically cloaking pass because they contribute to a positive user experience. For more on the subject of cloaking and the levels of risk associated with various tactics, see our article on White Hat Cloaking.

Keyword stuffing

If you’ve ever been told, “You need to include {critical keyword} on this page X times,” you’ve seen the confusion over keyword usage in action. Many people mistakenly think that if you just include a keyword within your page’s content X times, you will automatically rank for it. The truth is, although Google looks for mentions of keywords and related concepts on your site’s pages, the page itself has to add value outside of pure keyword usage. If a page is going to be valuable to users, it won’t sound like it was written by a robot, so incorporate your keywords and phrases naturally in a way that is understandable to your readers.

Below is an example of a keyword-stuffed page of content that also uses another old method: bolding all your targeted keywords. Oy.

Screenshot of a site that bolds keywords in a paragraph.

Auto-generated content

Arguably one of the most offensive forms of low-quality content is the kind that is auto-generated, or created programmatically with the intent of manipulating search rankings rather than helping users. You may recognize some auto-generated content by how little it makes sense when read: the words are real, but they're strung together by a program rather than a human being.

Gibberish text on a webpage

It is worth noting that advancements in machine learning have contributed to more sophisticated auto-generated content that will only get better over time. This is likely why in Google’s quality guidelines on automatically generated content, Google specifically calls out the brand of auto-generated content that attempts to manipulate search rankings, rather than any-and-all auto-generated content.

What to do instead: 10x it!

There is no "secret sauce" to ranking in search results. Google ranks pages highly because it has determined they are the best answers to the searcher's questions. In today's search engine, it's not enough that your page isn't duplicative, spammy, or broken. Your page has to provide value to searchers and be better than any other page Google is currently serving as the answer to a particular query. Here's a simple formula for content creation:

  • Search the keyword(s) you want your page to rank for
  • Identify which pages are ranking highly for those keywords
  • Determine what qualities those pages possess
  • Create content that’s better than that

We like to call this 10x content. If you create a page on a keyword that is 10x better than the pages being shown in search results (for that keyword), Google will reward you for it, and better yet, you’ll naturally get people linking to it! Creating 10x content is hard work, but will pay dividends in organic traffic.

Just remember, there’s no magic number when it comes to words on a page. What we should be aiming for is whatever sufficiently satisfies user intent. Some queries can be answered thoroughly and accurately in 300 words while others might require 1,000 words!

Pro tip: Don’t reinvent the wheel!
If you already have content on your website, save yourself time by evaluating which of those pages already bring in good amounts of organic traffic and convert well. Repurpose that content on different platforms to gain more visibility for your site. On the other side of the coin, evaluate what existing content isn't performing as well and adjust it, rather than starting from square one with all-new content.

NAP: A note for local businesses

If you’re a business that makes in-person contact with your customers, be sure to include your business name, address, and phone number (NAP) prominently, accurately, and consistently throughout your site’s content. This information is often displayed in the footer or header of a local business website, as well as on any “contact us” pages. You’ll also want to mark up this information using local business schema. Schema and structured data are discussed more at length in the “Code” section of this chapter.
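
As a rough sketch of what that markup can look like, here is a minimal LocalBusiness snippet in JSON-LD; the business details are invented for illustration:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Dental Care",
  "telephone": "+1-206-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Seattle",
    "addressRegion": "WA",
    "postalCode": "98101"
  }
}
</script>

Keeping the marked-up NAP identical to the NAP displayed on the page helps search engines trust both.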

If you are a multi-location business, it’s best to build unique, optimized pages for each location. For example, a business that has locations in Seattle, Tacoma, and Bellevue should consider having a page for each:

example.com/seattle
example.com/tacoma
example.com/bellevue

Each page should be uniquely optimized for that location, so the Seattle page would have unique content discussing the Seattle location, list the Seattle NAP, and even testimonials specifically from Seattle customers. If there are dozens, hundreds, or even thousands of locations, a store locator widget could be employed to help you scale.

Hope you still have some energy left after handling the difficult-yet-rewarding task of putting together a page that is 10x better than your competitors’ pages, because there are just a few more things needed before your page is complete! In the next sections, we’ll talk about the other on-page optimizations your pages need, as well as naming and organizing your content.

Beyond content: Other optimizations your pages need

Can I just bump up the font size to create paragraph headings?

How can I control what title and description show up for my page in search results?

After reading this section, you’ll understand other important on-page elements that help search engines understand the 10x content you just created, so let’s dive in!

Header tags

Header tags are an HTML element used to designate headings on your page. The main header tag, called an H1, is typically reserved for the title of the page. It looks like this:

 <h1>Page Title</h1>

There are also sub-headings that go from H2 (<h2>) to H6 (<h6>) tags, although using all of these on a page is not required. The hierarchy of header tags goes from H1 to H6 in descending order of importance.

Each page should have a unique H1 that describes the main topic of the page; this is often generated automatically from the page's title. As the main descriptive title of the page, the H1 should contain that page's primary keyword or phrase. You should avoid using header tags to mark up non-heading elements, such as navigational buttons and phone numbers. Use header tags to introduce what the following content will discuss.

Take this page about touring Copenhagen, for example:

<h1>Copenhagen Travel Guide</h1>
<h2>Copenhagen by the Seasons</h2>
<h3>Visiting in Winter</h3>
<h3>Visiting in Spring</h3>

The main topic of the page is introduced in the main <h1> heading, and each additional heading is used to introduce a new sub-topic. In this example, the <h2> is more specific than the <h1>, and the <h3> tags are more specific than the <h2>. This is just an example of a structure you could use.

Although what you choose to put in your header tags can be used by search engines to evaluate and rank your page, it’s important to avoid inflating their importance. Header tags are one among many on-page SEO factors, and typically would not move the needle like quality backlinks and content would, so focus on your site visitors when crafting your headings.

Internal links

In Chapter 2, we discussed the importance of having a crawlable website. Part of a website’s crawlability lies in its internal linking structure. When you link to other pages on your website, you ensure that search engine crawlers can find all your site’s pages, you pass link equity (ranking power) to other pages on your site, and you help visitors navigate your site.

The importance of internal linking is well established, but there can be confusion over how this looks in practice.

Link accessibility

Links that require a click to reveal (like those tucked inside a navigation drop-down) are often hidden from search engine crawlers, so if the only links to internal pages on your website are through these types of links, you may have trouble getting those pages indexed. Opt instead for links that are directly accessible on the page.

Anchor text

Anchor text is the clickable text you use when linking to a page. Below, you can see what a hyperlink whose anchor text is just the bare URL and a hyperlink with descriptive anchor text look like in the HTML:

<a href="http://www.example.com/">http://www.example.com/</a>
<a href="http://www.example.com/">Keyword Text</a>

On the live page, those would look like this:

http://www.example.com/

Keyword Text

The anchor text sends signals to search engines regarding the content of the destination page. For example, if I link to a page on my site using the anchor text "learn SEO," that's a good indicator to search engines that the targeted page is one where people can learn about SEO. Be careful not to overdo it, though. Too many internal links using the same, keyword-stuffed anchor text can signal to search engines that you're trying to manipulate a page's ranking. It's best to make anchor text natural rather than formulaic.

Link volume

Google's General Webmaster Guidelines say to "limit the number of links on a page to a reasonable number (a few thousand at most)." This is part of Google's technical guidelines rather than the quality guidelines, so having too many internal links isn't something that will get you penalized on its own, but it does affect how Google finds and evaluates your pages.

The more links on a page, the less equity each link can pass to its destination page. A page only has so much equity to go around.

Depiction of how link equity works between pages

So it’s safe to say that you should only link when you mean it! You can learn more about link equity from our SEO Learning Center.

Aside from passing authority between pages, a link is also a way to help users navigate to other pages on your site. This is a case where doing what’s best for search engines is also doing what’s best for searchers. Too many links not only dilute the authority of each link, but they can also be unhelpful and overwhelming. Consider how a searcher might feel landing on a page that looks like this:

Welcome to our gardening website! We have many articles on gardening, how to garden, and helpful tips on herbs, fruits, vegetables, perennials, and annuals. Learn more about gardening from our gardening blog.

Whew! Not only is that a lot of links to process, but it also reads pretty unnaturally and doesn’t contain much substance (which could be considered “thin content” by Google). Focus on quality and helping your users navigate your site, and you likely won’t have to worry about too many links.

Redirection

Removing and renaming pages is a common practice, but in the event that you do move a page, make sure to update the links to that old URL! At the very least, you should make sure to redirect the URL to its new location, but if possible, update all internal links to that URL at the source so that users and crawlers don’t have to pass through redirects to arrive at the destination page. If you choose to redirect only, be careful to avoid redirect chains that are too long (Google says, “Avoid chaining redirects… keep the number of redirects in the chain low, ideally no more than 3 and fewer than 5.”)

Example of a redirect chain:

(original location of content) example.com/location1 >> example.com/location2 >> (current location of content) example.com/location3

Better:

example.com/location1 >> example.com/location3
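
On an Apache server, for example, collapsing that chain into a single permanent redirect might look like this (a sketch using the mod_alias Redirect directive; the paths are placeholders):

Redirect 301 /location1 https://example.com/location3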

Image optimization

Images are often the biggest culprits behind slow web pages! The best way to solve for this is to compress your images. There is no one-size-fits-all approach to image compression, so test various options (like "save for web," image resizing, and compression tools such as Optimizilla or ImageOptim for Mac, or their Windows alternatives) and evaluate what works best.

Another way to help optimize your images (and improve your page speed) is by choosing the right image format.

How to choose which image format to use:

Flowchart for how to choose image formatsSource: Google’s image optimization guide

Choosing image formats:

  • If your image requires animation, use a GIF.
  • If you don’t need to preserve high image resolution, use JPEG (and test out different compression settings).
  • If you do need to preserve high image resolution, use PNG.
    • If your image has a lot of colors, use PNG-24.
    • If your image doesn’t have a lot of colors, use PNG-8.

There are also ways to make a slower-loading page feel faster, such as rendering a colored placeholder box or a very blurry, low-resolution version of an image while the full version loads. We will discuss these options in more detail in Chapter 5.

Pro tip: Don’t forget about thumbnails!
Thumbnails (especially for e-commerce sites) can be a huge drag on page speed. Optimize them properly to avoid slow pages and to help retain more qualified visitors.

Alt text

Alt text (alternative text) within images is a principle of web accessibility, and is used to describe images to the visually impaired via screen readers. It’s important to have alt text descriptions so that any visually impaired person can understand what the pictures on your website depict.

Search engine bots also crawl alt text to better understand your images, which gives you the added benefit of providing better image context to search engines. Just ensure that your alt descriptions read naturally for people, and avoid stuffing them with keywords for search engines.

Bad:

<img src="grumpycat.gif" alt="grumpy cat, cat is grumpy, grumpy cat gif">

Good:

<img src="grumpycat.gif" alt="A black cat looking very grumpy at a big spotted dog">

Submit an image sitemap

To ensure that Google can crawl and index your images, submit an image sitemap in your Google Search Console account. This helps Google discover images they may have otherwise missed.
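
As a rough sketch, a single entry in an image sitemap uses Google's image sitemap extension and might look like this (the URLs are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/desserts/chocolate-pie</loc>
    <image:image>
      <image:loc>https://example.com/images/chocolate-pie.jpg</image:loc>
    </image:image>
  </url>
</urlset>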

Formatting for readability & featured snippets

Your page could contain the best content ever written on a subject, but if it’s formatted improperly, your audience might never read it! While we can never guarantee that visitors will read our content, there are some principles that can promote readability, including:

  • Text size and color – Avoid fonts that are too tiny. Google recommends 16px or larger fonts to minimize the need for "pinching and zooming" on mobile. The text color should also contrast well with the page's background color to promote readability. Additional guidance is available in Google's web accessibility fundamentals.
  • Headings – Breaking up your content with helpful headings can help readers navigate the page. This is especially useful on long pages where a reader might be looking only for information from a particular section.
  • Bullet points – Great for lists, bullet points can help readers skim and more quickly find the information they need.
  • Paragraph breaks – Avoiding walls of text can help prevent page abandonment and encourage site visitors to read more of your page.
  • Supporting media – When appropriate, include images, videos, and widgets that would complement your content.
  • Bold and italics for emphasis – Putting words in bold or italics can add emphasis, so they should be the exception, not the rule. Appropriate use of these formatting options can call out important points you want to communicate.

Formatting can also affect your page’s ability to show up in featured snippets, those “position 0” results that appear above the rest of organic results.

Screenshot of a featured snippet

There is no special code that you can add to your page to show up here, nor can you pay for this placement, but taking note of the query intent can help you better structure your content for featured snippets. For example, if you’re trying to rank for “cake vs. pie,” it might make sense to include a table in your content, with the benefits of cake in one column and the benefits of pie in the other. Or if you’re trying to rank for “best restaurants to try in Portland,” that could indicate Google wants a list, so formatting your content in bullets could help.
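
For instance, the cake-vs.-pie comparison could be marked up as a simple HTML table (the rows here are invented for illustration):

<table>
  <tr>
    <th>Benefits of cake</th>
    <th>Benefits of pie</th>
  </tr>
  <tr>
    <td>Easier to decorate</td>
    <td>Flakier, more buttery crust</td>
  </tr>
</table>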

Title tags

A page's title tag is a descriptive HTML element that specifies the title of a particular web page. It is nested within the head tag of each page and looks like this:

<head>
  <title>Example Title</title>
</head>

Each page on your website should have a unique, descriptive title tag. What you input into your title tag field will show up here in search results, although in some cases Google may adjust how your title tag appears in search results.

Screenshot with the page title highlighted in the SERPs

It can also show up in web browsers…

Screenshot of a page title in a browser window

Or when you share the link to your page on certain external websites…

Screenshot of a page title shared on an external website

Your title tag has a big role to play in people’s first impression of your website, and it’s an incredibly effective tool for drawing searchers to your page over any other result on the SERP. The more compelling your title tag, combined with high rankings in search results, the more visitors you’ll attract to your website. This underscores that SEO is not only about search engines, but rather the entire user experience.

What makes an effective title tag?

  • Keyword usage: Having your target keyword in the title can help both users and search engines understand what your page is about. Also, the closer to the front of the title tag your keywords are, the more likely a user will be to read them (and hopefully click) and the more helpful they can be for ranking.
  • Length: On average, search engines display the first 50–60 characters (~512 pixels) of a title tag in search results. If your title tag exceeds the characters allowed on that SERP, an ellipsis “…” will appear where the title was cut off. While sticking to 50–60 characters is safe, never sacrifice quality for strict character counts. If you can’t get your title tag down to 60 characters without harming its readability, go longer (within reason).
  • Branding: At Moz, we love to end our title tags with a brand name mention because it promotes brand awareness and creates a higher click-through rate among people who are familiar with Moz. Sometimes it makes sense to place your brand at the beginning of the title tag, such as on your homepage, but be mindful of what you are trying to rank for and place those words closer toward the beginning of your title tag.

Meta descriptions

Like title tags, meta descriptions are HTML elements that describe the contents of the page that they’re on. They are also nested in the head tag, and look like this:

<head>
  <meta name="description" content="Description of page here."/>
</head>

What you input into the description field will show up in search results beneath your page title.

In many cases though, Google will choose different snippets of text to display in search results, dependent upon the searcher’s query.

For example, if you search "find backlinks," Google will provide this meta description instead, as it deems it more relevant to the specific search:

The meta description pulls each step from the page content and lists it out.

While the actual meta description is:

How to find backlinks? Step 1: Navigate to Link Explorer, a tool used to research the backlink profile of a website. It will show you the quality of backlinks using metrics like Domain Authority, Page Authority, and Spam Score. You can do a good amount of backlink research with the free version or pay to receive unlimited backlink data.

This often helps to improve your meta descriptions for unique searches. However, don’t let this deter you from writing a default page meta description — they’re still extremely valuable.

What makes an effective meta description?

The qualities that make an effective title tag also apply to effective meta descriptions. Although Google says that meta descriptions are not a ranking factor, they are, like title tags, incredibly important for click-through rate.

  • Relevance: Meta descriptions should be highly relevant to the content of your page, so it should summarize your key concept in some form. You should give the searcher enough information to know they’ve found a page relevant enough to answer their question, without giving away so much information that it eliminates the need to click through to your web page.
  • Length: Search engines tend to truncate meta descriptions to around 300 characters. It’s best to write meta descriptions between 150–300 characters in length. On some SERPs, you’ll notice that Google gives much more real estate to the descriptions of some pages. This usually happens for web pages ranking right below a featured snippet.

URL structure: Naming and organizing your pages

URL stands for Uniform Resource Locator. URLs are the locations or addresses for individual pieces of content on the web. Like title tags and meta descriptions, search engines display URLs on the SERPs, so URL naming and format can impact click-through rates. Not only do searchers use them to make decisions about which web pages to click on, but URLs are also used by search engines in evaluating and ranking pages.

Clear page naming

Search engines require unique URLs for each page on your website so they can display your pages in search results, but clear URL structure and naming is also helpful for people who are trying to understand what a specific URL is about. For example, which URL is clearer?

example.com/desserts/chocolate-pie

OR

example.com/asdf/453?=recipe-23432-1123

Searchers are more likely to click on URLs that reinforce and clarify what information is contained on that page, and less likely to click on URLs that confuse them.

Page organization

If you discuss multiple topics on your website, you should also make sure to avoid nesting pages under irrelevant folders. For example:

example.com/commercial-litigation/alimony

It would have been better for this fictional multi-practice law firm website to nest alimony under “/family-law/” than to host it under the irrelevant “/commercial-litigation/” section of the website.

The folders in which you locate your content can also send signals about the type, not just the topic, of your content. For example, dated URLs can indicate time-sensitive content. While appropriate for news-based websites, dated URLs for evergreen content can actually turn searchers away because the information seems outdated. For example:

example.com/2015/april/what-is-seo/

vs.

example.com/what-is-seo/

Since the topic "What is SEO?" isn't confined to a specific date, it's best to host it on a non-dated URL structure, or else risk your information appearing stale.

As you can see, what you name your pages, and in what folders you choose to organize your pages, is an important way to clarify the topic of your page to users and search engines.

URL length

While it is not necessary to have a completely flat URL structure, many click-through rate studies indicate that, given the choice between a longer URL and a shorter URL, searchers often prefer shorter URLs. Like title tags and meta descriptions that are too long, too-long URLs will also be cut off with an ellipsis. Just remember, having a descriptive URL is just as important, so don't cut down on URL length if it means sacrificing the URL's descriptiveness.

example.com/services/plumbing/plumbing-repair/toilets/leaks/

vs.

example.com/plumbing-repair/toilets/

Minimizing length, both by including fewer words in your page names and removing unnecessary subfolders, makes your URLs easier to copy and paste, as well as more clickable.

Keywords in URL

If your page is targeting a specific term or phrase, make sure to include it in the URL. However, don’t go overboard by trying to stuff in multiple keywords for purely SEO purposes. It’s also important to watch out for repeat keywords in different subfolders. For example, you may have naturally incorporated a keyword into a page name, but if located within other folders that are also optimized with that keyword, the URL could begin to appear keyword-stuffed.

Example:

example.com/seattle-dentist/dental-services/dental-crowns/

Keyword overuse in URLs can appear spammy and manipulative. If you aren’t sure whether your keyword usage is too aggressive, just read your URL through the eyes of a searcher and ask, “Does this look natural? Would I click on this?”

Static URLs

The best URLs are those that can easily be read by humans, so you should avoid the overuse of parameters, numbers, and symbols. Using technologies like mod_rewrite for Apache and ISAPI_rewrite for Microsoft, you can easily transform dynamic URLs like this:

http://moz.com/blog?id=123

into a more readable static version like this:

https://moz.com/google-algorithm-change
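
A minimal mod_rewrite sketch of that transformation might look like the following; the slug-to-ID mapping is hypothetical, and real sites typically generate rules like this through their CMS:

RewriteEngine On
# Internally serve the dynamic page when the readable URL is requested
RewriteRule ^google-algorithm-change$ /blog?id=123 [L]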

Hyphens for word separation

Not all web applications accurately interpret separators like underscores (_), plus signs (+), or spaces (%20). Search engines also do not understand how to separate words in URLs when they run together without a separator (example.com/optimizefeaturedsnippets/). Instead, use the hyphen character (-) to separate words in a URL.

Geographic modifiers in URLs

Some local business owners omit geographic terms that describe their physical location or service area because they believe that search engines can figure this out on their own. On the contrary, it’s vital that local business websites’ content, URLs, and other on-page assets make specific mention of city names, neighborhood names, and other regional descriptors. Let both consumers and search engines know exactly where you are and where you serve, rather than relying on your physical location alone.

Protocols: HTTP vs. HTTPS

A protocol is that “http” or “https” preceding your domain name. Google recommends that all websites have a secure protocol (the “s” in “https” stands for “secure”). To ensure that your URLs are using the https:// protocol instead of http://, you must obtain an SSL (Secure Sockets Layer) certificate. SSL certificates are used to encrypt data. They ensure that any data passed between the web server and browser of the searcher remains private. As of July 2018, Google Chrome displays “not secure” for all HTTP sites, which could cause these sites to appear untrustworthy to visitors and result in them leaving the site.
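
Once a certificate is installed, it's common to force the secure protocol by redirecting all HTTP requests to HTTPS. On Apache, one widely used mod_rewrite pattern looks like this (a sketch; test it before deploying site-wide):

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]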


If you’ve made it this far, congratulations on surpassing the halfway point of the Beginner’s Guide to SEO! So far, we’ve learned how search engines crawl, index, and rank content, how to find keyword opportunities to target, and now, you know the on-page optimization strategies that can help your pages get found. Next, buckle up, because we’ll be diving into the exciting world of technical SEO!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Related Articles

Posted in Latest NewsComments Off

Email Testing: 7 tips from your peers for email conversion optimization

We recently asked the MarketingSherpa audience for tips on running effective email tests. Here are a few of the most helpful responses to consider as you start to develop an email testing program.
MarketingSherpa Blog

Posted in Latest NewsComments Off

Declining Organic Traffic? How to Tell if it’s a Tracking or Optimization Issue

Posted by andrewchoco

Picture this scenario. You’re a new employee that has just been brought in to a struggling marketing department (or an agency brought on to help recover lost numbers). You get access to Google Analytics, and see something like this:

(Actual screenshot of the client I audited)

This can generate two types of emotional response: excitement or fear (or both). The steady decline in organic traffic excites you because you have so many tactics and ideas that you think can save this company from spiraling downward out of control. But there's also the fear that these tactics won't be enough to correct the course.

Regardless of whether these new tactics would work or not, it’s important to understand the history of the account and determine not only what is happening, but why.

The company may have an idea of why the traffic is declining (e.g., competitors have come in and made ranking for keywords much harder, or they did a website redesign and have never recovered).

Essentially, this boils down to two possibilities: 1) you're genuinely struggling with organic optimization, or 2) something was off with your tracking in Google Analytics, it has since been corrected, and nobody caught the change.

In this article, I’ll go over an audit I did for one of my clients to help determine if the decline we saw in organic traffic was due to actual poor SEO performance, an influx in competitors, tracking issues, or a combination of these things.

I’ll be breaking it down into five different areas of investigation:

  1. Keyword ranking differences from 2015–2017
    • Did the keywords we were ranking for in 2015 change drastically in 2017? Did we lose rankings and therefore lose organic traffic?
  2. Top organic landing pages from 2015–2017
    • Are the top organic landing pages the same now as they were in 2015? Are we missing any pages due to a website redesign?
  3. On-page metrics
    • Did something happen to site speed, bounce rate, page views, etc.?
  4. SEMrush/Moz keyword, traffic, and domain authority data
    • Looking at SEMrush's organic traffic cost metric, as well as Moz metrics like Domain Authority and competitors.
  5. Goal completions
    • Did our conversion numbers stay consistent throughout the traffic drop? Or did conversions drop in correlation with the traffic drop?

By the end of this post, my goal is that you’ll be able to replicate this audit to determine exactly what’s causing your organic traffic decline and how to get back on the right track.

Let’s dive in!

Keyword ranking differences from 2015–2017

This was my initial starting point for the audit, because the most obvious explanation for a decline in traffic is a decline in keyword rankings.

I wanted to look at what keywords we were ranking for in 2015 to see if we significantly dropped in the rankings or if the search volume had dropped. If the company you’re auditing has had a long-running Moz account, start by looking at the keyword rankings from the initial start of the decline, compared to current keyword rankings.

I exported keyword data from both SEMrush and Moz, and looked specifically at the ranking changes of core keywords.

March was a particularly strong month across the board, so I narrowed it down and exported the keyword rankings in:

  • March 2015
  • March 2016
  • March 2017
  • December 2017 (so I could get the most current rankings)

Once the keywords were exported, I went in and highlighted in red the keywords that we were ranking for in 2015 (and driving traffic from) that we were no longer ranking for in 2017. I also highlighted in yellow the keywords we were ranking for in 2015 that were still ranking in 2017.

2015 keywords:

2017 keywords:

(Brand-related queries and URLs are blurred out for anonymity)

One thing that immediately stood out: in 2015, this company was ranking for five keywords that included the word "free." They have since changed their offering, so it made sense that we weren't ranking for those keywords in 2017.

After removing the free queries, we pulled the “core” keywords to look at their differences.

March 2015 core keywords:

  • Appointment scheduling software: position 9
  • Online appointment scheduler: position 11
  • Online appointment scheduling: position 9
  • Online scheduling software: position 9
  • Online scheduler: position 9
  • Online scheduling: position 13

December 2017 core keywords:

  • Appointment scheduler: position 11
  • Appointment scheduling software: position 10
  • Online schedule: position 6
  • Online appointment scheduler: position 11
  • Online appointment scheduling: position 12
  • Online scheduling software: position 12
  • Online scheduling tool: position 10
  • Online scheduling: position 15
  • SaaS appointment scheduling: position 2

There were no particular red flags here. While some of the keywords have moved down 1–2 spots, we had new ones jump up. These small changes in movement didn’t explain the nearly 30–40% drop in organic traffic. I checked this off my list and moved on to organic landing pages.

Top organic landing page changes

Since the dive into keyword rankings didn’t provide the answer for the decline in traffic, the next thing I looked at were the organic landing pages. I knew this client had switched over CMS systems in early 2017, and had done a few small redesign projects the past three years.

After exporting our organic landing pages for 2015, 2016, and 2017, we compared the top ten (by organic sessions) and got the following results.

2015 top organic landing pages:

2016 top organic landing pages:

2017 top organic landing pages:

Because of their redesign, you can see that the subfolders changed between 2015/2016 to 2017. What really got my attention, however, is the /get-started page. In 2015/2016, the Get Started page accounted for nearly 16% of all organic traffic. In 2017, the Get Started page was nowhere to be found.

If you run into this problem and notice there are pages missing from your current top organic pages, a great way to uncover why is to use the Wayback Machine. It’s a great tool that allows you to see what a web page looked like in the past.

When we looked at the /get-started URL in the Wayback Machine, we noticed something pretty interesting:

In 2015, their /get-started page also acted as their login page. When people were searching on Google for “[Company Name] login,” this page was ranking, bringing in a significant amount of organic traffic.

Their current setup sends logins to a subdomain that doesn’t have a GA code (as it’s strictly used as a portal to the actual application).

That helped explain some of the organic traffic loss, but knowing that this client had gone through a few website redesigns, I wanted to make sure that all redirects were done properly. Regardless of whether or not your traffic has changed, if you’ve recently done a website redesign where you’re changing URLs, it’s smart to look at your top organic landing pages from before the redesign and double check to make sure they’re redirecting to the correct pages.

While this helped explain some of the traffic loss, the next thing we looked at was the on-page metrics to see if we could spot any obvious tracking issues.

Comparing on-page engagement metrics

Looking at the keyword rankings and organic landing pages provided a little bit of insight into the organic traffic loss, but it was nothing definitive. Because of this, I moved to the on-page metrics for further clarity. As a disclaimer, when I talk about on-page metrics, I’m talking about bounce rate, page views, average page views per session, and time on site.

Looking at the same top organic pages, I compared the on-page engagement metrics.

2015 on-page metrics:

2016 on-page metrics:

2017 on-page metrics:

While the overall engagement metrics changed slightly, the biggest and most interesting discrepancy I saw was in the bounce rates for the home page and Get Started page.

According to a number of different studies (like this one, this one, or even this one), the average bounce rate for a B2B site is around 40–60%. Seeing the home page with a bounce rate under 20% was definitely a red flag.

This led me to look into some other metrics as well. I compared key metrics between 2015 and 2017, and was utterly confused by the findings:

Looking at the organic sessions (overall), we saw a decrease of around 80,000 sessions, or 27.93%.

Looking at the organic users (overall) we saw a similar number, with a decrease of around 38,000 users, or 25%.

When we looked at page views, however, we saw a much more drastic drop:

For the entire site, we saw a 50% decrease in pageviews, or a decrease of nearly 400,000 page views.

This didn't make much sense, because even if we had kept those extra 38,000 users, each averaging roughly 2.49 pages per session (as shown above), that would only account for around 95,000 more page views (38,000 × 2.49 ≈ 94,600), or 100,000 at most. That left roughly 300,000 page views unaccounted for.

This led me to believe that there was definitely some sort of tracking issue. The high number of page views and low bounce rate made me suspect that some users were being double counted.

However, to confirm these assumptions, I took a look at some external data sources.

Using SEMrush and Moz data to exclude user error

If you have a feeling that your tracking was messed up in previous years, a good way to confirm or deny this hypothesis is to check external sources like Moz and SEMrush.

Unfortunately, this particular client was fairly new, so as a result, their Moz campaign data wasn’t around during the high organic traffic times in 2015. However, if it was, a good place to start would be looking at the search visibility metric (as long as the primary keywords have stayed the same). If this metric has changed drastically over the years, it’s a good indicator that your organic rankings have slipped quite a bit.

Another good thing to look at is domain authority and core page authority. If your site has had a few redesigns, moved URLs, or anything like that, it's important to make sure that domain authority has carried over. It's also important to look at the page authorities of your core pages. If these are much lower than they were before the organic traffic slide, there's a good chance your redirects weren't done properly, and page authority isn't being carried over to the new URLs.

If, like me, you don’t have Moz data that dates back far enough, a good thing to check is the organic traffic cost in SEMrush.

Organic traffic cost can change because of a few reasons:

  1. Your site is ranking for more valuable keywords, making the organic traffic cost rise.
  2. More competitors have entered the space, making the keywords you were ranking for more expensive to bid on.

Usually it’s a combination of both of these.

If our organic traffic really was steadily decreasing for the past 2 years, we’d likely see a similar trendline looking at our organic traffic cost. However, that’s not what we saw.

In March of 2015, the organic traffic cost of my client's site was $14,300.

In March of 2016, the organic traffic cost was $22,200.

In December of 2017, the organic traffic cost spiked all the way up to $69,200. According to SEMrush, we also saw increases in keywords and traffic.

Looking at all of this external data re-affirmed the assumption that something must have been off with our tracking.

However, as a final check, I went back to internal metrics to see if the conversion data had decreased at a similar rate as the organic traffic.

Analyzing and comparing conversion metrics

This seemed like a natural final step into uncovering the mystery in this traffic drop. After all, it’s not organic traffic that’s going to profit your business (although it’s a key component). The big revenue driver is goal completions and form fills.

This was a fairly simple procedure. I went into Google Analytics to compare goal completion numbers and goal completion conversion rates over the past three years.

If your company is like my client’s, there’s a good chance you’re taking advantage of the maximum 20 goal completions that can be simultaneously tracked in Analytics. However, to make things easier and more consistent (since goal completions can change), I looked at only buyer intent conversions. In this case it was Enterprise, Business, and Personal edition form fills, as well as Contact Us form fills.

If you’re doing this on your own site, I would recommend doing the same thing. Gated content goal completions usually have a natural shelf life, and this natural slowdown in goal completions can skew the data. I’d look at the most important conversion on your site (usually a contact us or a demo form) and go strictly off those numbers.

For my client, you can see those goal completion numbers below:

Goal completion name    2015     2016     2017
Contact Us              579      525      478
Individual Edition      3,372    2,621    3,420
Business Edition        1,147    1,437    1,473
Enterprise Edition      1,178    1,053    502
Total                   6,276    5,636    5,873

Conversion rates:

Goal completion name    2015     2016     2017
Contact Us              0.22%    0.22%    0.23%
Individual Edition      1.30%    1.09%    1.83%
Business Edition        0.46%    0.60%    0.76%
Enterprise Edition      0.46%    0.44%    0.29%
Average                 0.61%    0.58%    0.77%

This was pretty interesting. Although there was clearly some fluctuation in goal completions and conversion rates, there was no drop large enough to match the nearly 40,000-user decline we saw from 2015 through 2017.

All of these findings further confirmed that we were chasing an inaccurate goal. In fact, we spent the first three months working together to try and get back a 40% loss that, quite frankly, was never even there in the first place.

Tying everything together and final thoughts

For this particular case, we had to go down all five of these roads in order to reach the conclusion that we did: Our tracking was off in the past.

However, this may not be the case for your company or your clients. You may start by looking at keyword rankings, and realize that you’re no longer ranking on the first page for ten of your core keywords. If that’s the case, you quickly discovered your issue, and your game plan should be investing in your core pages to help get them ranking again for these core keywords.

If your goal completions are way down (by a similar percentage as your traffic), that’s also a good clue that your declining traffic numbers are correct.

If you've looked at all of these metrics and still can't figure out the reason for the decrease, and you're blindly trying tactics while struggling to claw your way back up, this is a great checklist to go through to settle the ominous question: tracking issue or optimization issue?

If you're facing a similar issue, I hope this post helps you get to the root of the problem quickly and gets you one step closer to creating realistic organic traffic goals for the future!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in Latest NewsComments Off

Call-to-Action Optimization: 132% increase in clickthrough from changing four simple words

Small changes to call-to-action wording can have a large impact on conversion.
MarketingSherpa Blog

Posted in Latest NewsComments Off

Marketing 101: What is CRO (Conversion Rate Optimization)?

If you’re in advertising or marketing, it helps to have an understanding of what conversion rate optimization is. CRO can be a powerful tool to improve the success of every marketing campaign, initiative and website you work on.
MarketingSherpa Blog

Posted in Latest NewsComments Off
