Tag Archive | "Pages"

Diagnosing Why a Site’s Set of Pages May Be Ranking Poorly – Whiteboard Friday

Posted by randfish

Your rankings have dropped and you don’t know why. Maybe your traffic dropped as well, or maybe just a section of your site has lost rankings. It’s an important and often complex mystery to solve, and there are a number of boxes to check off while you investigate. In this Whiteboard Friday, Rand shares a detailed process to follow to diagnose what went wrong to cause your rankings drop, why it happened, and how to start the recovery process.

Diagnosing why a site's pages may be ranking poorly

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to talk about diagnosing a site and specifically a section of a site’s pages and why they might be performing poorly, why their traffic may have dropped, why rankings may have dropped, why both of them might have dropped. So we’ve got a fairly extensive process here, so let’s get started.

Step 1: Uncover the problem

First off, our first step is uncovering the problem, or finding out whether there actually is a problem. This matters most if you have a larger website. If we're talking about a site that's 20 or 30 or even a couple hundred pages, this is not a big issue. But many websites that SEOs are working on these days are thousands, tens of thousands, or hundreds of thousands of pages. So what I like to urge folks to do is to

A. Treat different site sections as unique segments for investigation. You should look at them individually.

A lot of times subfolders or URL structures are really helpful here. So I might say, okay, MySite.com, I’m going to look exclusively at the /news section. Did that fall in rankings? Did it fall in traffic? Or was it /posts, where my blog posts and my content is? Or was it /cities? Let’s say I have a website that’s dealing with data about the population of cities. So I rank for lots of those types of queries, and it seems like I’m ranking for fewer of them, and it’s my cities pages that are poorly performing in comparison to where they were a few months ago or last year at this time.

B. Check traffic from search over time.

So I go to my Google Analytics or whatever analytics you’re using, and you might see something like, okay, I’m going to look exclusively at the /cities section. If you can structure your URLs in this fashion, use subfolders, this is a great way to do it. Then take a look and see, oh, hang on, that’s a big traffic drop. We fell off a cliff there for these particular pages.

This data can be hiding inside your analytics because it could be that the rest of your site is performing well. It’s going sort of up and to the right, and so you see this slow plateauing or a little bit of a decline, but it’s not nearly as sharp as it is if you look at the traffic specifically for a single subsection that might be performing poorly, like this /cities section.
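If your URLs use subfolders like this, you can also confirm the pattern outside the analytics UI by exporting landing page data and segmenting it yourself. Here's a minimal sketch in Python, assuming a hypothetical CSV export with date, landing_page, and organic_sessions columns; your analytics tool's export will name these differently.

import pandas as pd

# Hypothetical export of daily organic landing page traffic from your analytics tool.
df = pd.read_csv("organic_landing_pages.csv", parse_dates=["date"])

# Split the /cities section out from the rest of the site.
is_cities = df["landing_page"].str.startswith("/cities/")
cities_weekly = (df[is_cities]
                 .groupby(pd.Grouper(key="date", freq="W"))["organic_sessions"].sum())
rest_weekly = (df[~is_cities]
               .groupby(pd.Grouper(key="date", freq="W"))["organic_sessions"].sum())

# If the /cities trend falls off a cliff while the rest of the site holds steady,
# the problem is specific to that section.
print(cities_weekly.tail(12))
print(rest_weekly.tail(12))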

From there, I’m going to next urge you to use Google Trends. Why? Why would I go to Google Trends? Because what I want you to do is I want you to look at some of your big keywords and topics in Google Trends to see if there has been a serious decline in search volume at the same time. If search demand is rising or staying stable over the course of time where you have lost traffic, it’s almost certainly something you’ve done, not something searchers are doing. But if you see that traffic has declined, for example, maybe you were ranking really well for population data from 2015. It turns out people are now looking for population data for 2016 or ’17 or ’18. Maybe that is part of the problem, that search demand has fallen and your curve matches that.
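If you want to check the demand side programmatically rather than eyeballing the Google Trends UI, the unofficial pytrends package can pull the same interest-over-time data. A small sketch with an example keyword; pytrends is a third-party wrapper, not an official Google API, so treat it as a convenience rather than a guarantee.

# Sketch: pull 12 months of Google Trends interest for an example keyword
# using the unofficial pytrends wrapper (pip install pytrends).
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US")
pytrends.build_payload(["denver population growth"], timeframe="today 12-m", geo="US")
interest = pytrends.interest_over_time()

# If interest is flat or rising over the period where your traffic fell,
# search demand probably isn't the culprit.
print(interest["denver population growth"].tail(12))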

C. Perform some diagnostic queries, or use your rank tracking data if you have it for these types of queries.

This is one of the reasons I like to rank track for even these types of queries that don’t get a lot of traffic.

1. Target keywords. In this case, it might be “Denver population growth,” maybe that’s one of your keywords. You would see, “Do I still rank for this? How well do I rank for this? Am I ranking more poorly than I used to?”

2. Check brand name plus target keyword. So, in this case, it would be my site plus the above here plus “Denver population growth,” so My Site or MySite.com Denver population growth. If you’re not ranking for that, that’s usually an indication of a more serious problem, potentially a penalty or some type of dampening that’s happening around your brand name or around your website.

3. Look for a 10 to 20-word text string from page content without quotes. It could be shorter, maybe only six or seven words, or it could be longer, 25 words if you really need it. But essentially, I want to take a string of text that exists on the page and search for it in Google, not in quotes. I do not want to use quotes here, and I want to see how it performs. This might be several lines of text.

4. Look for a 10 to 20-word text string with quotes. So the same lines of text, but searched in Google in quotes. If I'm not ranking for the string without quotes, but I am ranking for it in quotes, I might surmise that this is probably not a duplicate content problem. It's probably something to do with my content quality, or maybe my link profile, or Google has penalized or dampened me in some way.

5. site: urlstring/ So I would search for “site:MySite.com/cities/Denver.” I would see: Wait, has Google actually indexed my page? When did they index it? Oh, it’s been a month. I wonder why they haven’t come back. Maybe there’s some sort of crawl issue, robots.txt issue, meta robots issue, something. I’m preventing Google from potentially getting there. Or maybe they can’t get there at all, and this results in zero results. That means Google hasn’t even indexed the page. Now we have another type of problem.
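If you want to run these five diagnostic queries for more than a handful of pages, it's easy to script the query construction. A minimal sketch; the keyword, brand, URL, and text string below are placeholders, and you'd still check the rankings manually or through your rank tracker.

# Sketch: build the five diagnostic Google queries described above for one page.
from urllib.parse import quote_plus

keyword = "Denver population growth"             # example target keyword
brand = "MySite.com"                             # example brand / site name
page_url = "MySite.com/cities/Denver"            # example page
text_string = "population data for the Denver metro area updated annually"  # example on-page snippet

queries = [
    keyword,                       # 1. target keyword
    f"{brand} {keyword}",          # 2. brand name plus target keyword
    text_string,                   # 3. 10-20 word text string, no quotes
    f'"{text_string}"',            # 4. the same string, in quotes
    f"site:{page_url}",            # 5. site: query for the exact URL
]

for q in queries:
    print(f"https://www.google.com/search?q={quote_plus(q)}")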

D. Check your tools.

1. Google Search Console. I would start there, especially in the site issues section.

2. Check your rank tracker or whatever tool you’re using, whether that’s Moz or something else.

3. On-page and crawl monitoring. Hopefully you have something like that. It could be through Screaming Frog. Maybe you’ve run some crawls over time, or maybe you have a tracking system in place. Moz has a crawl system. OnPage.org has a really good one.

4. Site uptime. So I might check Pingdom or other things that alert me to, “Oh, wait a minute, my site was down for a few days last week. That obviously is why traffic has fallen,” those types of things.

Step 2: Offer hypothesis for falling rankings/traffic

Okay, you've done your diagnostics. Now it's time to offer some hypotheses. Now that we understand which problem I might have, I want to understand what could be causing it. There are basically two situations you can have: either rankings have stayed stable or gone up but traffic has fallen, or rankings and traffic have both fallen.

A. If rankings are up, but traffic is down…

In those cases, these are the five things that are most typically to blame.

1. New SERP features. There’s a bunch of featured snippets that have entered the population growth for cities search results, and so now number one is not what number one used to be. If you don’t get that featured snippet, you’re losing out to one of your competitors.

2. Lower search demand. Like we talked about in Google Trends. I’m looking at search demand, and there are just not as many people searching as there used to be.

3. Brand or reputation issues. I’m ranking just fine, but people now for some reason hate me. People who are searching this sector think my brand is evil or bad or just not as helpful as it used to be. So I have issues, and people are not clicking on my results. They’re choosing someone else actively because of reputation issues.

4. Snippet problems. I’m ranking in the same place I used to be, but I’m no longer the sexiest, most click-drawing snippet in the search results, and other people are earning those clicks instead.

5. Shift in personalization or location biasing by Google. It used to be the case that everyone who searched for city name plus population growth got the same results, but now suddenly people are seeing different results based on maybe their device or things they’ve clicked in the past or where they’re located. Location is often a big cause for this.

So for many SEOs for many years, “SEO consultant” resulted in the same search results. Then Google introduced the Maps results and pushed down a lot of those folks, and now “SEO consultant” results in different ranked results in each city and each geography that you search in. So that can often be a cause for falling traffic even though rankings remain high.

B. If rankings and traffic are down…

If you’re seeing that rankings have fallen and traffic has fallen in conjunction, there’s a bunch of other things that are probably going on that are not necessarily these things. A few of these could be responsible still, like snippet problems could cause your rankings and your traffic to fall, or brand and reputation issues could cause your click-through rate to fall, which would cause you to get dampened. But oftentimes it’s things like this:

1. & 2. Duplicate content and low-quality or thin content. Google thinks that what you’re providing just isn’t good enough.

3. Change in searcher intent. People who were searching for population growth used to want what you had to offer, but now they want something different and other people in the SERP are providing that, but you are not, so Google is ranking you lower. Even though your content is still good, it’s just not serving the new searcher intent.

4. Loss to competitors. So maybe you have worse links than they do now or less relevance or you’re not solving the searcher’s query as well. Your user interface, your UX is not as good. Your keyword targeting isn’t as good as theirs. Your content quality and the unique value you provide isn’t as good as theirs. If you see that one or two competitors are consistently outranking you, you might diagnose that this is the problem.

5. Technical issues. So if I saw from the earlier diagnostics that crawling was the problem, that I wasn't getting indexed, or that Google hasn't updated my pages in a long time, I might look into accessibility issues: maybe speed, maybe problems letting Googlebot in, HTTPS problems, or indexable content, where Google can't see the content on my page anymore because I made some change in the technology used to display it. Or it could be crawlability: internal link structure problems, robots.txt problems, meta robots tag issues, that kind of stuff. (A quick way to script the robots.txt part of this check is sketched just after this list.)

Maybe at the server level, someone on the tech ops team of my website decided, “Oh, there’s this really problematic bot coming from Mountain View that’s costing us a bunch of bandwidth. Let’s block bots from Mountain View.” No, don’t do that. Bad. Those kinds of technical issues can happen.

6. Spam and penalties. We’ll talk a little bit more about how to diagnose those in a second.

7. CTR, engagement, or pogo-sticking issues. There could be click-through rate issues or engagement issues, meaning pogo sticking, like people are coming to your site, but they are clicking back because they weren’t satisfied by your results, maybe because their expectations have changed or market issues have changed.
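To follow up on the technical issues in item 5: one quick, scriptable check is whether your robots.txt is blocking Googlebot from the affected URLs. Here's a minimal sketch using only Python's standard library; the domain and paths are examples, and this only tests robots.txt, not meta robots tags or server-level bot blocking.

# Sketch: check whether Googlebot is allowed to fetch a few example URLs
# according to the live robots.txt file.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.mysite.com/robots.txt")  # example domain
rp.read()

for url in ["https://www.mysite.com/cities/denver",
            "https://www.mysite.com/cities/milwaukee"]:
    if rp.can_fetch("Googlebot", url):
        print(url, "-> allowed for Googlebot")
    else:
        print(url, "-> BLOCKED by robots.txt")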

Step 3: Make fixes and observe results

All right. Next and last in this process, what we’re going to do is make some fixes and observe the results. Hopefully, we’ve been able to correctly diagnose and form some wise hypotheses about what’s going wrong, and now we’re going to try and resolve them.

A. On-page and technical issues should resolve after a new crawl + index.

So on-page and technical issues, if we’re fixing those, they should usually resolve, especially on small sections of sites, pretty fast. As soon as Google has crawled and indexed the page, you should generally see performance improve. But this can take a few weeks if we’re talking about a large section on a site, many thousands of pages, because Google has to crawl and index all of them to get the new sense that things are fixed and traffic is coming in. Since it’s long tail to many different pages, you’re not going to see that instant traffic gain and rise as fast.

B. Link issues and spam penalty problems can take months to show results.

Look, if you have crappier links or not as good a link profile as your competitors, fixing that can take months or even years. Penalty problems and spam problems, same thing. Google can sometimes take a long time. You've seen a lot of spam experts on Twitter saying, "Oh, well, all my clients who had issues over the last nine months suddenly are ranking better today," because Google made some fix in their latest index rollout or their algorithm changed, and it's sort of, okay, well, we'll reward the people for all the fixes that they've made. Sometimes that happens in batches that take months.

C. Fixing a small number of pages in a section that’s performing poorly might not show results very quickly.

For example, let’s say you go and you fix /cities/Milwaukee. You determine from your diagnostics that the problem is a content quality issue. So you go and you update these pages. They have new content. It serves the searchers much better, doing a much better job. You’ve tested it. People really love it. You fixed two cities, Milwaukee and Denver, to test it out. But you’ve left 5,000 other cities pages untouched.

Sometimes Google will sort of be like, “No, you know what? We still think your cities pages, as a whole, don’t do a good job solving this query. So even though these two that you’ve updated do a better job, we’re not necessarily going to rank them, because we sort of think of your site as this whole section and we grade it as a section or apply some grades as a section.” That is a real thing that we’ve observed happening in Google’s results.

Because of this, one of the things that I would urge you to do is if you’re seeing good results from the people you’re testing it with and you’re pretty confident, I would roll out the changes to a significant subset, 30%, 50%, 70% of the pages rather than doing only a tiny, tiny sample.

D. Sometimes when you encounter these issues, a remove and replace strategy works better than simply upgrading old URLs.

So if Google has decided /cities, your /cities section is just awful, has all sorts of problems, not performing well on a bunch of different vectors, you might take your /cities section and actually 301 redirect them to a new URL, /location, and put the new UI and the new content that better serves the searcher and fixes a lot of these issues into that location section, such that Google now goes, “Ah, we have something new to judge. Let’s see how these location pages on MySite.com perform versus the old cities pages.”
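At the implementation level, the remove-and-replace move is just a permanent redirect from every old URL to its new equivalent. Most sites would do this in nginx, Apache, or CDN rules, but here's a minimal sketch of the same mapping in a hypothetical Flask app, just to make the pattern concrete:

# Sketch: 301-redirect every old /cities/<slug> URL to its /locations/<slug>
# replacement so search engines transfer their signals to the new section.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/cities/<slug>")
def legacy_city_page(slug):
    # 301 = permanent redirect; the new /locations page carries the upgraded content.
    return redirect(f"/locations/{slug}", code=301)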

So I know we’ve covered a ton today and there are a lot of diagnostic issues that we haven’t necessarily dug deep into, but I hope this can help you if you’re encountering rankings challenges with sections of your site or with your site as a whole. Certainly, I look forward to your comments and your feedback. If you have other tips for folks facing this, that would be great. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog



Designing a Page’s Content Flow to Maximize SEO Opportunity – Whiteboard Friday

Posted by randfish

Controlling and improving the flow of your on-site content can actually help your SEO. What’s the best way to capitalize on the opportunity present in your page design? Rand covers the questions you need to ask (and answer) and the goals you should strive for in today’s Whiteboard Friday.

Designing a page's content flow to maximize SEO opportunity

Click on the whiteboard image above to open a high-resolution version in a new tab!


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat about designing a page's content flow to help with your SEO.

Now, unfortunately, somehow in the world of SEO tactics, this one has gotten left by the wayside. I think a lot of people in the SEO world are investing in things like content and solving searchers’ problems and getting to the bottom of searcher intent. But unfortunately, the page design and the flow of the elements, the UI elements, the content elements that sit in a page is discarded or left aside. That’s unfortunate because it can actually make a huge difference to your SEO.

Q: What needs to go on this page, in what order, with what placement?

So if we’re asking ourselves like, “Well, what’s the question here?” Well, it’s what needs to go on this page. I’m trying to rank for “faster home Wi-Fi.” Right now, Lifehacker and a bunch of other people are ranking in these results. It gets a ton of searches. I can drive a lot of revenue for my business if I can rank there. But what needs to go on this page in what order with what placement in order for me to perform the best that I possibly can? It turns out that sometimes great content gets buried in a poor page design and poor page flow. But if we want to answer this question, we actually have to ask some other ones. We need answers to at least these three:

A. What is the searcher in this case trying to accomplish?

When they enter “faster home Wi-Fi,” what’s the task that they want to get done?

B. Are there multiple intents behind this query, and which ones are most popular?

How popular are those intents, and in what order? We need to know that so we can design our flow around the most common ones first and the secondary and tertiary ones next.

C. What’s the business goal of ranking? What are we trying to accomplish?

That’s always going to have to be balanced out with what is the searcher trying to accomplish. Otherwise, in a lot of cases, there’s no point in ranking at all. If we can’t get our goals met, we should just rank for something else where we can.

Let’s assume we’ve got some answers:

Let's assume that, in this case, we have some good answers to these questions so we can proceed. So pretty simple. If I search for "faster home Wi-Fi," what I want is usually going to be…

A. Faster download speed at home.

That's what the searcher is trying to accomplish. But there are multiple intents behind this. Sometimes the searcher is looking to do that…

B1. With their current ISP and their current equipment.

They want to know things they can optimize that don’t cause them to spend money. Can they place their router in different places? Can they change out a cable? Do they need to put it in a different room? Do they need to move their computer? Is the problem something else that’s interfering with their Wi-Fi in their home that they need to turn off? Those kinds of issues.

B2. With a new ISP.

Or can they get a new ISP? They might be looking for an ISP that can provide them with faster home internet in their area, and they want to know what’s available, which is a very different intent than the first one.

B3. With current ISP but new equipment.

Maybe they want to keep their ISP, but they are willing to upgrade to new equipment. So they're looking for the equipment they could buy that would make their current ISP faster. In many cases in the United States, sadly, there's only one ISP that can provide service in a given area, so they can't change ISPs, but they can change out their equipment.

C. Affiliate revenue with product referrals.

Let’s assume that (C) is we know that what we’re trying to accomplish is affiliate revenue from product referrals. So our business is basically we’re going to send people to new routers or the Google Mesh Network home device, and we get affiliate revenue by passing folks off to those products and recommending them.

Now we can design a content flow.

Okay, fair enough. We now have enough to be able to take care of this design flow. The design flow can involve lots of things. There are a lot of things that could live on a page, everything from navigation to headline to the lead-in copy or the header image or body content, graphics, reference links, the footer, a sidebar potentially.

The elements that go in here are not actually what we’re talking about today. We can have that conversation too. I want a headline that’s going to tell people that I serve all of these different intents. I want to have a lead-in that has a potential to be the featured snippet in there. I want a header image that can rank in image results and be in the featured snippet panel. I’m going to want body content that serves all of these in the order that’s most popular. I want graphics and visuals that suggest to people that I’ve done my research and I can provably show that the results that you get with this different equipment or this different ISP will be relevant to them.

But really, what we're talking about here is the flow that matters. The problem is that the content itself gets buried. What I see many times is folks will take a powerful visual or a powerful piece of content that's solving the searcher's query and they'll put it in a place on the page where it's hard to access or hard to find. So even though they've actually got great content, it is buried by the page's design.

5 big goals that matter.

The goals that matter here and the ones that you should be optimizing for when you’re thinking about the design of this flow are:

1. How do I solve the searcher’s task quickly and enjoyably?

So that's about user experience as well as the UI. I know what many people are going to want to see, and in fact, the result that's ranking up at the top is Lifehacker's top 10 list for how to get your home Wi-Fi faster. They include things like upgrading your ISP, and here's a tool to see what's available in your area. They include maybe you need a better router, and here are the best ones. Maybe you need a different network or something that expands your network in your home, and here's a link out to those. So they're serving that purpose up front, up top.

2. Serve these multiple intents in the order of demand.

So if we can intuit that most people want to stick with their ISP but are willing to change equipment, we can serve that one first (B3). We can serve this one second (B1), and we can serve the change-my-ISP option third (B2), which is actually the ideal fit in this scenario for us. That helps us with the next goal.

3. Optimize for the business goal without sacrificing one and two.

I would urge you to design generally with the searcher in mind and if you can fit in the business goal, that is ideal. Otherwise, what tends to happen is the business goal comes first, the searcher comes second, and you come tenth in the results.

4. If possible, try to claim the featured snippet and the visual image that go up there.

That means using the lead-in up at the top. It’s usually the first paragraph or the first few lines of text in an ordered or unordered list, along with a header image or visual in order to capture that featured snippet. That’s very powerful for search results that are still showing it.

5. Limit our bounce back to the SERP as much as possible.

In many cases, this means limiting some of the UI or design flow elements that hamper people from solving their problems or that annoy or dissuade them. So, for example, advertising that pops up or overlays that come up before I've gotten two-thirds of the way down the page really tend to increase this bounce back to the SERP, what search engines call pogo-sticking, and that can harm your rankings dramatically. Design elements and design flows where the content that actually solves the problem sits below an advertising block or a promotional block are also very limiting.

So to the degree that we can control the design of our pages and optimize for that, we can actually take existing content that you might already have and improve its rankings without having to remake it, without needing new links, simply by improving the flow.

I hope we’ll see lots of examples of those in the comments, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



Moz Blog


How to Diagnose Pages that Rank in One Geography But Not Another – Whiteboard Friday

Posted by randfish

Are you ranking pretty well in one locale, only to find out your rankings tank in another? It’s not uncommon, even for sites without an intent to capture local queries. In today’s Whiteboard Friday, Rand shows you how to diagnose the issue with a few clever SEO tricks, then identify the right strategy to get back on top.

Diagnose Why Pages Rank in One Geography But Not Another

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to this edition of Whiteboard Friday. This week we're going to chat about rankings that differ from geography to geography. Many of you might see that you are ranking particularly well in one city, but when you perform that search in another city, or perhaps in another country that still speaks the same language and has very similar traits, maybe you're not performing so well.

Maybe you do well in Canada, but you don’t do well in the United States. Maybe you do well in Portland, Oregon, but you do poorly in San Diego, California. Sometimes you might be thinking to yourself, “Well, wait, this search is not particularly local, or at least I didn’t think of it as being particularly local. Why am I ranking in one and not the other?” So here’s a process that you can use to diagnose.

Confirm the rankings you see are accurate:

The first thing we need to do is confirm that the rankings you see or that you’ve heard about are accurate. This is actually much more difficult than it used to be. It used to be you could scroll to the bottom of Google and change your location to whatever you wanted. Now Google will geolocate you by your IP address or by a precise location on your mobile device, and unfortunately you can’t just specify one particular location or another — unless you know some of these SEO hacks.

A. Google’s AdPreview Tool – Google has an ad preview tool, where you can specify and set a particular location. That’s at AdWords.Google.com slash a bunch of junk slash ad preview. We’ll make sure that the link is down in the notes below.

B. The ampersand-near-equals parameter (&near=) - Now, some SEOs have said that this is not perfect, and I agree it is imperfect, but it is pretty close. We’ve done some comparisons here at Moz. I’ve done them while I’m traveling. It’s not bad. Occasionally, you’ll see one or two things that are not the same. The advertisements are frequently not the same. In fact, they don’t seem to work well. But the organic results look pretty darn close. The maps results look pretty darn close. So I think it’s a reasonable tool that you can use.

That is by basically changing the Google search query — so this is the URL in the search query — from Google.com/search?q= and then you might have ice+cream or WordPress+web+design, and then you use this, &near= and the city and state here in the United States or city and province in Canada or city and region in another country. In this case, I’m going with Portland+OR. This will change my results. You can give this a try yourself. You can see that you will see the ice cream places that are in Portland, Oregon, when you perform this search query.

For countries, you can use another approach. You can either go directly to the country-code version of Google, so for the UK Google.co.uk, for New Zealand Google.co.nz, or for Canada Google.ca, and type your query in there. You can also use the parameter &gl= instead of &near=. This sets the country, so you could put in CA for Canada, UK for the UK, or NZ for New Zealand.
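If you want to spot-check several locations quickly, the URL construction is easy to script. A small sketch that builds geo-biased search URLs from the &near= and &gl= parameters described above; the query and locations are examples, and as noted, these parameters are imperfect hacks rather than official features.

# Sketch: build search URLs biased to different cities and countries using the
# &near= and &gl= parameters discussed above.
from urllib.parse import quote_plus

query = "wordpress web design"                      # example query
cities = ["Portland OR", "San Diego CA"]            # example US cities
countries = {"Canada": ("www.google.ca", "CA"),     # country-code domain + gl value
             "UK": ("www.google.co.uk", "UK"),
             "New Zealand": ("www.google.co.nz", "NZ")}

for city in cities:
    print(f"https://www.google.com/search?q={quote_plus(query)}&near={quote_plus(city)}")

for name, (domain, gl) in countries.items():
    print(f"https://{domain}/search?q={quote_plus(query)}&gl={gl}")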

C. The MozBar's search profiles – You can also do this with the MozBar. The MozBar kind of hacks the near parameter for you, and you can just specify a location and create a search profile. Do that right inside the MozBar. That's one of the very nice things about using it.

D. Rank tracking with a platform that supports location-specific rankings - Some of them don't, some of them do. Moz does right now. I believe Searchmetrics does if you use their enterprise version. Oh, I'm trying to remember if Rob Bucci said STAT does. Well, Rob will answer in the comments, and he'll tell us whether STAT does. I think that they do.

Look at who IS ranking and what features they may have:

So next, once you've figured out whether the ranking anomaly you perceive is real, step two is to look at who is ranking in the geography where you're not and figure out what factors they might have going for them.

  • Have they gotten a lot of local links, location-specific links from these websites that are in that specific geography or serve that geography, local chambers of commerce, local directories, those kinds of things?
  • Do they have a more hyper-local service area? On a map, if this is the city, do they serve that specific region? You serve a broad set of locations all over the place, and maybe you don’t have a geo-specific region that you’re serving.
  • Do they have localized listings, listings in places like where Moz Local or a competitor like Yext or Whitespark might push all their data to? Those could be things like Google Maps and Bing Maps, directories, local data aggregators, Yelp, TripAdvisor, etc., etc.
  • Do they have rankings in Google Maps? If you go and look and you see that this website is ranking particularly well in Google Maps for that particular region and you are not, that might be another signal that hyper-local intent and a hyper-local ranking algorithm are in play there.
  • Are they running local AdWords ads? I know this might seem like, “Wait a minute. Rand, I thought ads were not directly connected to organic search results.” They’re not, but it tends to be the case that if you bid on AdWords, you tend to increase your organic click-through rate as well, because people see your ad up at the top, and then they see you again a second time, and so they’re a little more biased to click. Therefore, buying local ads can sometimes increase organic click-through rate as well. It can also brand people with your particular business. So that is one thing that might make a difference here.

Consider location-based searcher behaviors:

Now we’re not considering who is ranking, but we’re considering who is doing the searching, these location-based searchers and what their behavior is like.

  • Are they less likely to search for your brand because you’re not as well known in that region?
  • Are they less likely to click your site in the SERPs because you’re not as well known?
  • Is their intent somehow different because of their geography? Maybe there’s a language issue or a regionalism of some kind. This could be a local language thing even here in the United States, where parts of the country say “soda” and parts of the country say “pop.” Maybe those mean two different things, and “pop” means, “Oh, it’s a popcorn store in Seattle,” because there’s the Pop brand, but in the Midwest, “pop” clearly refers to types of soda beverages.
  • Are they more or less sensitive to a co-located solution? So it could be that in many geographies, a lot of your market doesn’t care about whether the solution that they’re getting is from their local region, and in others it does. A classic one on a country level is France, whose searchers tend to care tremendously more that they are getting .fr results and that the location of the business they are clicking on is in France versus other folks in Europe who might click a .com or a .co.uk with no problem.


Divide into three buckets:

You’re going to divide the search queries that you care about that have these challenges into three different types of buckets:

Bucket one: Hyper-geo-sensitive

This would be sort of the classic geo-specific search, where you see maps results right up at the top. The SERPs change completely from geo to geo. So if you perform the search in Portland and then you perform it in San Diego, you see very, very different results. Seven to nine of the top ten at least are changing up, and it’s the case that almost no non-local listings are showing in the top five results. When you see these, this is probably non-targetable without a physical location in that geography. So if you don’t have a physical location, you’re kind of out of business until you get there. If you do, then you can work on the local ranking signals that might be holding you back.

Bucket two: Semi-geo-sensitive

I’ve actually illustrated this one over here, because this can be a little bit challenging to describe. But basically, you’re getting a mix of geo-specific and global results. So, for example, I use the &near=Portland, Oregon, because I’m in Seattle and I want to see Portland’s results for WordPress web design.

WordPress web design, when I do the search all over the United States, the first one or two results are pretty much always the same. They’re always this Web Savvy Marketing link and this Creative Bloq, and they’re very broad. They are not specifically about a local provider of WordPress web design.

But then you get to number three and four and five, and the results change to be local-specific businesses. So in Portland, it's these Mozak Design guys. Mozak, no relation to Moz, to my knowledge anyway. In San Diego, it's Kristin Falkner, who's ranking number three, and then other local San Diego WordPress web design businesses at four and five. So it's kind of this mix of geo and non-geo. You can generally tell this by changing your geography in this fashion and seeing those different results.

Some of the top search results usually will be like this, and they’ll stay consistent from geography to geography. In these cases, what you want to do is work on boosting those local-specific signals. So if you are ranking number five or six and you want to be number three, go for that, or you can try and be in the global results, in which case you’re trying to boost the classic ranking signals, not the local ones so you can get up there.

Bucket three: Non-geo-sensitive

Those would be, “I do this search, and I don’t see any local-specific results.” It’s just a bunch of nationwide or worldwide brands. There are no maps, usually only one, maybe two geo-specific results in the top 10, and they tend to be further down, and the SERPs barely change from geo to geo. They’re pretty much the same throughout the country.

So once you put these into these three buckets, then you know which thing to do. Here, it’s pursue classic signals. You probably don’t need much of a local boost.

Here, you have the option of going one way or the other, boosting local signals to get into these rankings or boosting the classic signals to get into those global ones.

Here you’re going to need the physical business.

All right, everyone. I hope you’ve enjoyed this edition of Whiteboard Friday, and we’ll see you again next week. Take care.

Video transcription by Speechpad.com



Moz Blog


New Site Crawl: Rebuilt to Find More Issues on More Pages, Faster Than Ever!

Posted by Dr-Pete

First, the good news — as of today, all Moz Pro customers have access to the new version of Site Crawl, our entirely rebuilt deep site crawler and technical SEO auditing platform. The bad news? There isn’t any. It’s bigger, better, faster, and you won’t pay an extra dime for it.

A moment of humility, though — if you’ve used our existing site crawl, you know it hasn’t always lived up to your expectations. Truth is, it hasn’t lived up to ours, either. Over a year ago, we set out to rebuild the back end crawler, but we realized quickly that what we wanted was an entirely re-imagined crawler, front and back, with the best features we could offer. Today, we launch the first version of that new crawler.

Code name: Aardwolf

The back end is entirely new. Our completely rebuilt “Aardwolf” engine crawls twice as fast, while digging much deeper. For larger accounts, it can support up to ten parallel crawlers, for actual speeds of up to 20X the old crawler. Aardwolf also fully supports SNI sites (including Cloudflare), correcting a major shortcoming of our old crawler.

View/search *all* URLs

One major limitation of our old crawler is that you could only see pages with known issues. Click on “All Crawled Pages” in the new crawler, and you’ll be brought to a list of every URL we crawled on your site during the last crawl cycle:

You can sort this list by status code, total issues, Page Authority (PA), or crawl depth. You can also filter by URL, status codes, or whether or not the page has known issues. For example, let’s say I just wanted to see all of the pages crawled for Moz.com in the “/blog” directory…

I just click the [+], select “URL,” enter “/blog,” and I’m on my way.

Do you prefer to slice and dice the data on your own? You can export your entire crawl to CSV, with additional data including per-page fetch times and redirect targets.

Recrawl your site immediately

Sometimes, you just can’t wait a week for a new crawl. Maybe you relaunched your site or made major changes, and you have to know quickly if those changes are working. No problem, just click “Recrawl my site” from the top of any page in the Site Crawl section, and you’ll be on your way…

Starting at our Medium tier, you’ll get 10 recrawls per month, in addition to your automatic weekly crawls. When the stakes are high or you’re under tight deadlines for client reviews, we understand that waiting just isn’t an option. Recrawl allows you to verify that your fixes were successful and refresh your crawl report.

Ignore individual issues

As many customers have reminded us over the years, technical SEO is not a one-size-fits-all task, and what's critical for one site is barely a nuisance for another. For example, let's say I don't care about a handful of overly dynamic URLs (for many sites, it's a minor issue). With the new Site Crawl, I can just select those issues and then "Ignore" them (see the green arrow for location):

If you make a mistake, no worries — you can manage and restore ignored issues. We’ll also keep tracking any new issues that pop up over time. Just because you don’t care about something today doesn’t mean you won’t need to know about it a month from now.

Fix duplicate content

Under “Content Issues,” we’ve launched an entirely new duplicate content detection engine and a better, cleaner UI for navigating that content. Duplicate content is now automatically clustered, and we do our best to consistently detect the “parent” page. Here’s a sample from Moz.com:

You can view duplicates by the total number of affected pages, PA, and crawl depth, and you can filter by URL. Click on the arrow (far-right column) for all of the pages in the cluster (shown in the screenshot). Click anywhere in the current table row to get a full profile, including the source page we found that link on.

Prioritize quickly & tactically

Prioritizing technical SEO problems requires deep knowledge of a site. In the past, in the interest of simplicity, I fear that we’ve misled some of you. We attempted to give every issue a set priority (high, medium, or low), when the difficult reality is that what’s a major problem on one site may be deliberate and useful on another.

With the new Site Crawl, we decided to categorize crawl issues tactically, using five buckets:

  • Critical Crawler Issues
  • Crawler Warnings
  • Redirect Issues
  • Metadata Issues
  • Content Issues

Hopefully, you can already guess what some of these contain. Critical Crawler Issues still reflect issues that matter first to most sites, such as 5XX errors and redirects to 404s. Crawler Warnings represent issues that might be very important for some sites, but require more context, such as meta NOINDEX.

Prioritization often depends on scope, too. All else being equal, one 500 error may be more important than one duplicate page, but 10,000 duplicate pages is a different matter. Go to the bottom of the Site Crawl Overview Page, and we’ve attempted to balance priority and scope to target your top three issues to fix:

Moving forward, we’re going to be launching more intelligent prioritization, including grouping issues by folder and adding data visualization of your known issues. Prioritization is a difficult task and one we haven’t helped you do as well as we could. We’re going to do our best to change that.

Dive in & tell us what you think!

All existing customers should have access to the new Site Crawl as of earlier this morning. Even better, we’ve been crawling existing campaigns with the Aardwolf engine for a couple of weeks, so you’ll have history available from day one! Stay tuned for a blog post tomorrow on effectively prioritizing Site Crawl issues, and be sure to register for the upcoming webinar.



Moz Blog



How Google assesses the ‘authority’ of web pages

Google has no single authority metric but rather uses a bucket of signals to determine authority on a page-by-page basis.

The post How Google assesses the ‘authority’ of web pages appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


How to Build Backlinks Using Your Competitors’ Broken Pages

Posted by TomCaulton

We all know building backlinks is one of the most important aspects of any successful SEO and digital marketing campaign. However, I believe there is an untapped resource out there for link building: finding your competitors’ broken pages that have been linked to by external sources.

Allow me to elaborate.

Finding the perfect backlink often takes hours, and it can take days, weeks, or even longer to acquire. That's where the link building method I've outlined below comes in. I use it on a regular basis to build relevant backlinks from competitors' 404 pages.

Please note: In this post, I will be using Search Engine Land as an example to make my points.

Ready to dive in? Great, because I’m going to walk you through the entire link building process now.

First, you need to find your competitor(s). This is as easy as searching for the keyword you’re targeting on Google and selecting websites that are above you in the SERPs. Once you have a list of competitors, create a spreadsheet to put all of your competitors on, including their position in the rankings and the date you listed them.

Next, download Screaming Frog SEO Spider [a freemium tool]. This software will allow you to crawl all of your competitors' websites, revealing all their 404 pages. To do this, simply enter your competitors' URLs in the search bar one at a time.

Once the crawl is complete, click “Response Codes.”


Then, click on the dropdown arrow next to “filter” and select “Client Error 4xx.”


Now you’ll be able to see the brand’s 404 pages.

Once you've completed the step above, simply press the "Export" button to export all of their 404 pages into a file. Next, import this file into a spreadsheet in Excel or Google Docs. On this part of the spreadsheet, create tabs called "Trust Flow," "Citation Flow," "Referring Domains," and "External Backlinks."

Now that you've imported all of their 404 pages, you need to dissect the images and external links if there are any. A quick way to do this is to highlight the cell block by pressing on the specific cell at the top, then press "Filter" under the "Data" tab. Look for the drop-down arrow on the first cell of that block. Click the drop-down arrow, and underneath "Filter by values," you will see two links: "Select all" and "Clear."

Press "Clear."

This will clear all preset options. Now, type in the URL of the competitor's website in the search box and click "Select all."

This will filter out all external links and just leave you with their 404 pages. Go through the whole list, highlighting the pages you think you can rewrite.

Now that you have all of your relevant 404 pages in place, run them through Majestic [a paid tool] or Moz’s Open Site Explorer (OSE) [a freemium tool] to see if their 404 pages actually have any external links (which is what we’re ultimately looking for). Add the details from Majestic or Moz to the spreadsheet. No matter which tool you use (I use OSE), hit “Request a CSV” for the backlink data. (Import the data into a new tab on your spreadsheet, or create a new spreadsheet altogether if you wish.)
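If the spreadsheet filtering gets tedious, the same cross-referencing can be scripted. A sketch in Python, assuming you've exported the competitor's response codes from Screaming Frog and the backlink data from OSE or Majestic as CSVs; the column names below are examples and will differ depending on the tool and export version.

# Sketch: cross-reference a competitor's 4xx pages with exported backlink data to
# find broken pages that still have external links pointing at them.
import pandas as pd

crawl = pd.read_csv("competitor_response_codes.csv")   # Screaming Frog export (example name)
backlinks = pd.read_csv("competitor_backlinks.csv")    # OSE / Majestic export (example name)

# Example column names -- adjust to match your actual exports.
broken = crawl[crawl["Status Code"].between(400, 499)]["Address"]
linked_broken = backlinks[backlinks["Target URL"].isin(broken)]

# Rank the broken pages by how many unique pages link to them.
summary = (linked_broken.groupby("Target URL")["Source URL"]
           .nunique()
           .sort_values(ascending=False))

print(summary.head(20))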

Find relevant backlinks linking to (X's) website. Once you've found all of the relevant websites, you can either highlight them or remove the irrelevant ones from your spreadsheet.

Please note: It’s worth running each of the websites you’re potentially going to be reaching out to through Majestic and Moz to find out their citation flow, trust flow, and domain authority (DA). You may only want to go for the highest DA; however, in my opinion, if it’s relevant to your niche and will provide useful information, it’s worth targeting.

With the 404s and link opportunities in hand, focus on creating content that’s relevant for the brands you hope to earn a link from. Find the contact information for someone at the brand you want the link from. This will usually be clear on their website; but if not, you can use tools such as VoilaNorbert and Email Hunter to get the information you need. Once you have this information, you need to send them an email similar to this one:


Hi [THEIR NAME],

My name is [YOUR NAME], and I carry out the [INSERT JOB ROLE – i.e., MARKETING] at [YOUR COMPANY'S NAME or WEBSITE].

I have just come across your blog post regarding [INSERT THEIR POST TITLE] and when I clicked on one of the links on that post, it happened to go to a 404 page. As you’re probably aware, this is bad for user experience, which is the reason I’m emailing you today.

We recently published an in-depth article regarding the same subject of the broken link you have on your website: [INSERT YOUR POST TITLE].

Here’s the link to our article: [URL].

I was wondering if you wouldn’t mind linking to our article instead of the 404 page you’re currently linking to, as our article will provide your readers with a better user experience.

We will be updating this article so we can keep people provided with the very latest information as the industry evolves.

Thank you for reading this email and I look forward to hearing from you.

[YOUR NAME]


Disclaimer: The email example above is just an example and should be tailored to your own style of writing.

In closing, remember to keep detailed notes of the conversations you have with people during outreach, and always follow up with people you connect with.

I hope this tactic helps your SEO efforts in the future. It’s certainly helped me find new places to earn links. Not only that, but it gives me new content ideas on a regular basis.

Do you use a similar process to build links? I’d love to hear about it in the comments.



Moz Blog


Google AMP has reached 125 million documents & is expanding to apps & recipe pages

AMP is growing and expanding. Here is some of the latest information from Richard Gingras of Google out of Google I/O.

The post Google AMP has reached 125 million documents & is expanding to apps & recipe pages appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Google Rethinking Payday Loans & Doorway Pages?

Nov 12, 2013 WSJ: Google Ventures Backs LendUp to Rethink Payday Loans

Google Ventures Partner Blake Byers joined LendUp’s board of directors with his firm’s investment. The investor said he expects LendUp to make short-term lending reasonable and favorable for the “80 million people banks won’t give credit cards to,” and help reshape what had been “a pretty terrible industry.”

What sort of strategy is helping to drive that industry transformation?

How about doorway pages.

That's in spite of Google going out of their way last year to say they were going to kill those sorts of strategies.

March 16, 2015 Google To Launch New Doorway Page Penalty Algorithm

Google does not want to rank doorway pages in their search results. The purpose behind many of these doorway pages is to maximize their search footprint by creating pages both externally on the web or internally on their existing web site, with the goal of ranking multiple pages in the search results, all leading to the same destination.

These sorts of doorway pages are still live to this day.

Simply look at the footer area of lendup.com/payday-loans

But the pages existing doesn’t mean they rank.

For that let’s head over to SEMrush and search for LendUp.com



Hot damn, they rank for just about every “payday” keyword.

And you know their search traffic is only going to increase now that competitors are getting scrubbed from the marketplace.

Today we get journalists acting as conduits for Google’s public relations efforts, writing headlines like: Google: Payday Loans Are Too Harmful to Advertise.

Today those sorts of stories are literally everywhere.

Tomorrow the story will be over.

And when it is.

Precisely zero journalists will have covered the above contrasting behaviors.

As they weren’t in the press release.

Best yet, not only does Google maintain their investment in payday loans via LendUp, but there is also a bubble in the personal loans space, so Google will be able to show effectively the same ads for effectively the same service & by the time the P2P loan bubble pops some of the payday lenders will have followed LendUp’s lead in re-branding their offers as being something else in name.

A user comment on Google’s announcement blog post gets right to the point…

Are you disgusted by Google’s backing of LendUp, which lends money at rates of ~ 395% for short periods of time? Check it out. GV (formerly known as Google Ventures) has an investment in LendUp. They currently hold that position.

Oh, the former CIO and VP of Engineering of Google is the CEO of Zest Finance and Zest Cash. Zest Cash lends at an APR of 390%.

Meanwhile, off to revolutionize the next industry by claiming everyone else is greedy and scummy and there is a wholesome way to do the same thing leveraging new technology, when in reality the primary difference between the business models is simply a thin veneer of tech utopian PR misinformation.

Don’t expect to see a link to this blog post on TechCrunch.

There you’ll read some hard-hitting cutting edge tech news like:

Banks are so greedy that LendUp can undercut them, help people avoid debt, and still make a profit on its payday loans and credit card.

#MomentOfZeroTruth #ZMOT


SEO Book


How to Create Product Pages that Produce the Results You Want

does your product page really work?

Do me a favor, dear content marketer. Type this phrase into Google: “You Are a Writer (So Start Acting Like One).”

I’ll wait.

Okay, did you do it? What is the top result?

It should be the product page for Jeff Goins’s ebook on Amazon.

In fact, the first entry should be the Kindle version and the second entry should be the paperback version — both on Amazon.

The third entry should be the author’s own product page dedicated to this book.

So, here’s a question for you: Why do you think Amazon has the top result and not the author’s own product page? Got any guesses?

If not, let me explain.

3 reasons certain product pages dominate in search results


The first reason why Amazon dominates the search term for someone else’s book is the age and authority of the site.

Amazon has been online since 1995 and has established itself as the go-to place for many products, particularly books. Moreover, millions of transactions over more than two decades have helped establish Amazon as a site that can be trusted.

Jeff is a good guy who’s been killing it for the last five years, but he’s no Amazon.

Second, Amazon’s product page has the most links pointing to it (an affiliate program has something to do with this). According to the Majestic SEO tool, it has far more inbound links than the product page on Jeff’s website.

Google interprets all those links to Amazon as a sign that it is the most authoritative page for the search phrase “You Are a Writer (So Start Acting Like One).” Granted, Jeff’s page also drives traffic to the Amazon page.

Third, Amazon’s page is loaded with relevant keywords, and those keywords can be found in the:

  • Title
  • Description
  • Editorial reviews
  • Product details (which are lower on the page)
  • Customer reviews (testimonials)

Of course, Google considers many factors when evaluating the content of a web page. But these three factors provide a dramatic boost over other pages about the book, including the one on Jeff’s own website:

  1. Age and authority of a site
  2. Links
  3. Keyword-rich content

The Amazon page is an invaluable resource for anyone who wants to buy the book. The product description is rich in tantalizing detail, as are the editorial reviews, which are essentially third-party endorsements. The strong customer reviews are user-generated content.

Each of these features contributes to keyword-rich copy and builds trust with a prospect.

While not everyone will read every single line on the page or sift through every single review (Jeff has 566!), there is more than enough content to present a clear, accurate product description.

Not to mention, the more you tell, the more you raise the perceived value of a product while lowering the prospect’s resistance to buy. Now, pay attention to discover what you can learn from Amazon’s copy to improve your own product pages.

Here’s why this is important

About three or four times a year, I get the privilege of participating in one of our Authority Business Coaching calls, where Pamela Wilson, Sonia Simone, and I evaluate people’s businesses, websites, and (sometimes) product pages.

In addition, I sit on the review committee for our Certified Content Marketer program, where I critique three pieces of content from an applicant — one of which is a landing page.

Product pages and landing pages share a lot of the same features. Over time, I’ve seen detrimental patterns among applicants when it comes to landing page copy, which are also problematic for product pages.

One such pattern is what I call the “short copy trap.” It can happen in any type of content, but many copywriters make this mistake when writing copy for free opt-in content. In other words, copywriters and content marketers fall into this trap when they believe that “free” is enough to convert a prospect.

Our thinking goes something like this: because we understand the value of the headline, introduction copy, three bullet points, and call to action, we assume our visitors have that same knowledge.

But when your prospect visits a landing page that’s light on copy, something entirely different happens. Your prospect examines the copy and assumes that if the copy is thin, then the free content is probably thin, too.

Paid products fall into this trap, too

I’ve also seen this mistake with paid products. Here are two examples we recently evaluated during Authority sessions (both companies gave me permission to use their pages as examples).

Mike Cerio said he wanted to start his own business, but he just wasn’t sure what it would be. Then he decided, “Hey, if I can make it selling bird seed, I can make it selling anything.”

He’s off to a good start, but his product pages could be improved. Here’s one for a premium mix called Woody Worm – Woodpecker Blend.

[Screenshot: the Woody Worm – Woodpecker Blend product page]

Mike’s site looks professional and there is a clear image of the product, but you aren’t presented with a reason why you should buy this bird seed over a competitor’s product. All you see are a few sentences on how much your woodpecker is going to enjoy this blend, other types of birds the seed will attract, and the ingredients.

Here are a few questions I have:

  • Do I just put the seeds out and they attract woodpeckers?
  • Do I need a special type of feeder? (If I do, this is a chance to offer me another product)
  • What makes this worth $17.88? I’m going to check to see if I can get it cheaper. If I can, why should I pay more?
  • What’s the source of your ingredients? Is there a unique source you use that provides an advantage over the competition?
  • What’s the story behind your blend? Did you get a scientist, ornithologist, or another expert to help you mix it?
  • Where are the reviews?
  • Who has endorsed this product?
  • Has the blend won any awards?

As you can see, this product page offers many opportunities to illustrate why someone would choose Mike’s product over the competition.

For the moment, let’s move on to our next example, Middle Mountain Mead.

[Screenshot: the Middle Mountain Mead product page]

The wine market is glutted, so you have to stand out — particularly if you are selling a $68 bottle of wine.

In Middle Mountain Mead’s copy, we get a sense of what it takes to make such a wine and when you could use the wine (celebrations, honeymoons, etc.).

Unfortunately, strangers usually don’t care about all the hard work you put into your product … but in this case, the copy tells a story of how this wine evolved. It’s a pretty elaborate maturation process, years in the making. So that’s something special.

But why should potential customers care? Why should they choose this $68 mead over a less expensive one?

  • Will it give them prestige? Bragging rights? Help them throw the best parties?
  • Will they feel like they are part of something bigger than themselves? Will they feel like they are trendsetters? That they are making history?
  • Will the mead take them to a new dimension of intoxication?

Ultimately, it comes down to this: Why should we buy wine from this company and not another one?

Later in this article, I’ll summarize how to write product pages that produce the results you want. Next, however, I’m going to focus on one of the most powerful features of effective selling on product pages: stories.

And I’m going to start by talking about another wine merchant.

How to turn the ordinary into the irresistible

Consider Garagiste’s email newsletter.

Jon Rimmerman is the founder of Garagiste, the world’s largest email-based wine business. That line alone should give you pause.

You can sell wine through email? Indeed. Keep in mind, he hasn’t advertised since 1996.

So, what is Rimmerman’s secret? He is the author of one of the most popular email newsletters on wine.

In the latest numbers I could find from 2012, he boasted an email subscriber list of 136,000 and annual sales of $30 million. I’m sure his subscriber numbers and sales have grown since then.

And let me be clear on this. Rimmerman sells cases of wine through an email newsletter. Once he procures a new stock, he sits down, writes an email, and hits send.

In this case, his emails are product pages. You purchase through a link to his website. And he never fails to move all the cases. Even at several hundred dollars a case.

What makes his emails so great?

His meandering missives are part salesmanship in print, part trivia junk drawer, covering topics like vintage 1960s tube amplifiers or 100 km bike rides in the high hills of France (after two bottles of Beaujolais).

According to a story in The New York Times on Rimmerman:

“Rimmerman has built his reputation by differentiating his tastes from those of other critics, favoring the austere, eccentric and putatively authentic over what you might call the merely delicious.”

Then there are the stories.

Stories about dealing with a flat bicycle tire in the French countryside. Chasing down an elusive contact in a remote European village who simply screams “The Quatre Saison!”

It makes for compelling reading (and compelling wine). Rimmerman sees himself as the intermediary for wine visionaries.

Here, he’s talking about a certain Washington state winemaker’s products:

“What I’m trying to uncover is something that is culturally important or of the moment, which this definitely is. This is cutting-edge Washington State winemaking. So, first: Are the wines sound? And then, would people that read everything that I write every day be interested not just for the wine but for the story, the cats, the meth lab, the geologist, the maybe-no-woman-in-his-life. Would they like to kind of taste that story in the bottle.”

Rimmerman feels many wines can be good, even great. But not all wines can boast a history that involves cats, illegal drug production, an EPA agent, and monk-like dedication to winemaking.

That’s the difference, which unfolds through Rimmerman’s stories.

But just in case I haven’t convinced you of the power of stories to increase the value of a product and actually sell it, let me take you on another journey.

More proof for those skeptical of stories

The quasi-anthropological Significant Objects project started with the hunch that “stories can add measurable value to near-worthless trinkets.”

Journalist Rob Walker and writer Joshua Glenn bought cheap trinkets at thrift stores and garage sales. Then they paired each object with a writer (such as Jonathan Lethem or Nicholson Baker) who wrote a fictional story about the object.

A photo of the trinket and the story were then published on eBay.

Let’s look at a Utah Snow Globe. A story about it was written by Blake Butler:

My granddad’s granddad had a box under his bed. If you got to open the box (you had to beg) you would find a little door. The little door had a combination on it that you had to know to get inside the second box, which I did. I had the combination tattooed on my spinemeat when I was four while on a trip to see the circus. The tattoo was free. My granddad’s granddad was very powerful and rich.

You can read the rest of the story here.

This mundane item was originally bought for 99 cents and sold on eBay for $59.00.

A silly story increased its value by $58.01.

Here’s how some of the other items fared.

Why were people bidding on these insignificant objects on eBay?

Stories.

How to write stories for a product page

The stories you include on your product page don’t have to be linear like Jon Rimmerman’s.

For instance:

  • You can tell one story in the product description and tell several other stories through testimonials.
  • You can tell a story through photographs of the product, like Saddleback Leather (one of Sonia’s favorite places to shop) did for their classic briefcase. And notice the customer photographs.
  • You can tell a story about a unique product-creation process. Each fountain pen in this series by Goulet is hand-torched to get a distinctive look. In other words, each pen is one of a kind.

Check out these resources that will help you write great stories:

Now let’s summarize how to create a product page that produces results.

6 essential qualities every product page needs

A product page that dominates search engine results and makes sales has these six qualities:

  1. Noteworthy age and authority (both the website and the page). This is why it’s important to launch your site as soon as possible. The longer it is online, the more trust it earns.
  2. A growing list of links pointing to the page. One of the best ways to encourage people to link to a product page is through an affiliate program. But if that’s not your cup of tea, then build a product and page that becomes a resource people want to share with others. Each link is another way to drive traffic to your product page.
  3. Keyword-rich copy. As I’ve said before, the more you tell, the more you sell. And all of this content doesn’t have to come from you. You can publish great stories with relevant keywords through testimonials and reviews.
  4. Engaging stories. See above.
  5. A variety of photographs. Encourage customers to share their own photographs to continue the tradition of storytelling through pictures.
  6. A video demonstrating the product’s benefits. You could even interview customers. And don’t forget the transcript! It satisfies those who prefer to read and provides more keyword-rich copy.

These essentials are part of winning product pages whether you sell physical objects, services (consulting), or digital products like software, online courses, or ebooks.

We covered a lot of material today, so let me know if you have any questions. Just drop a note in the comments section below. And while you’re at it, share your favorite product pages. I’d love to hear from you!

The post How to Create Product Pages that Produce the Results You Want appeared first on Copyblogger.



Google’s AMP: The Fun and User-Friendly Guide to Accelerated Mobile Pages

[Image: What does Google AMP mean for you?]

Watch the video. Ignore the copy.

That’s my advice to you once you land on Google’s site dedicated to the new Accelerated Mobile Pages (AMP) Project:

“The Accelerated Mobile Pages (AMP) Project is an open source initiative that embodies the vision that publishers can create mobile optimized content once and have it load instantly everywhere.”

If you are not a developer and you read the copy, you will be swallowed alive by jargon.

Watch the video, however, and you’ll almost immediately understand what AMP is all about (not to mention a funny Spinal Tap reference, see below).

Or you could just read this guide because it will be the most fun you’ve ever had reading about AMP and how it affects your content marketing.

I promise.

What is Google’s AMP Project?

Since the birth of Google’s Zero Moment of Truth philosophy back in 2011, it’s been no secret that Google wants to “dramatically improve the performance of the mobile web.”

And I probably don’t need to tell you that there is a small problem with the performance of content on the mobile web.

Chances are, you have a mobile device. And chances are that you’ve clicked a link on that device from a search results page, social media site, or inside your email inbox … eager to consume the content.

But it never comes.

Well, it comes, but in a convulsing patchwork of lurching, jerky images, videos, and ads as the page loads. You look on in horror, eyes dilated, bouncing around in your subway seat like someone who has to go to the bathroom.

“How long is this going to take?” you cry out to no one in particular.

Who knows, but you’ll never find out because, like 40 percent of the population, you’ll bail on the loading page within 3 seconds. Then you will say, like my black cat Apollo Monkeystrap, “Le sigh.”

It’s insufferable mobile moments like these that Google wants to eliminate with AMP.

How fast will AMP make your pages?

On a scale from 1 to 10, with one being “not loading at all” and ten being “loading in one second,” AMP content will load at a page speed of 11. (That’s the Spinal Tap reference I warned you about).

But seriously, how fast?

Jon Parise, software engineer at Pinterest, said, “Accelerated Mobile Pages load four times faster and use eight times less data than traditional mobile-optimized pages.”

Four times faster is good! But what does that mean for you, our friendly subway commuter trying to download a web page on his mobile phone?

According to NiemanLab, an AMP-optimized New York Times article fully downloaded on mobile in 2.99 seconds. For comparison, in a test in Chrome on a fast iMac, the desktop version of that page took 3.82 seconds (yes, the AMP mobile version was faster).

If you weren’t aware, NYTimes.com was notorious for being one of the slowest loading mobile news websites this side of Saturn.

They’ve since fixed that.

Are you sure that’s fast enough?

But wait, you say, 2.99 seconds is still at the upper reaches of the time frame our subway commuter is willing to wait. He, like 40 percent of the population, bailed at 3 seconds. 2.99 seconds is cutting it close! That’s not much of an improvement.

True, but the difference is that in the first, non-AMP desktop scenario, the page was still loading after 3 seconds. In the AMP scenario, it was fully downloaded in 2.99 seconds.

More importantly, the AMP version reached “domContentLoaded — a key point in a webpage’s load where the HTML is fully downloaded and certain important parsing has been completed” in 0.857 seconds.

A blink of an eye takes around 0.33 seconds.

In other words, blink your eye twice and you, our subway commuter, can start reading the useful part of the content almost instantly, thanks to AMP.
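If you’re curious where your own regular (non-AMP) pages hit that domContentLoaded milestone, here’s a rough way to check. This is only a sketch that uses the browser’s Navigation Timing API; drop it near the end of a page, or run the inner lines in your developer console after the page has loaded.

    <script>
      // Logs how long this page took to reach the end of DOMContentLoaded,
      // measured from the start of navigation. AMP pages won't allow custom
      // scripts like this, so use it on your regular pages.
      window.addEventListener('load', function () {
        var t = performance.timing;
        console.log('domContentLoaded after ' +
          (t.domContentLoadedEventEnd - t.navigationStart) + ' ms');
      });
    </script>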

What makes traditional mobile pages load so slowly?

I think I know, but since I’m not a developer, designer, or programmer (and not even sure if those are different disciplines), I rang up one of our developers here at Rainmaker Digital, Mike Hale, to help me translate.

Mike said that when a desktop site gets pulled into a mobile browser, several dozen packets of information get “called” into that browser, often from different hosts.

In the old, slow New York Times example above, one article could have “192 requests, some to Times’s hosts, most to a flurry of other servers hosting scores of scripts.”

The most useful part might download within 5 seconds of your click, but the rest is still coming in, which is why the screen bounces as the browser renders the page and then re-renders it.

Still lost, I pressed Mike, “But what kind of things are being requested?! What’s being called?!”

Mike told me that your mobile could be calling the scripts for social plugins, image carousels, SlideShares, and videos. And then there is probably analytics software, ads, and tracking scripts running in the background.

Sensible, everyday things, but they add up. And fast.

The problem is your mobile device simply doesn’t have the processing muscle to pull this off wicked fast.

This illustration explains the problem with mobile content

Imagine if you went to the bank, walked up to the teller and requested $100, but in a peculiar order: fifteen one-dollar bills, three five-dollar bills, five ten-dollar bills, and one twenty-dollar bill.

If the bank operated like the mobile web does now, then the teller would make multiple trips to give you each bill separately.

Each one of those trips is a “call.”

If the bank was optimized with AMP, however, the teller would give you all of your bills at once. Furthermore, he would probably ignore your specific request for certain bills and just deliver a hundred-dollar bill.

AMP aims to simplify the requests a browser has to make.

“The only script you can call is the AMP javascript,” Mike said. “Everything is wrapped in one bundle. I’m handing you everything at once.”
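To picture what Mike means, compare a typical mobile page head with its AMP equivalent. This is only an illustrative sketch; the first four script URLs are made-up placeholders standing in for analytics, ads, social plugins, and a carousel.

    <!-- A typical mobile page: every script below is another request, often to another host -->
    <script src="https://analytics.example.com/tracker.js"></script>
    <script src="https://ads.example.com/display.js"></script>
    <script src="https://social.example.com/share-buttons.js"></script>
    <script src="https://cdn.example.com/carousel.js"></script>

    <!-- An AMP page: the only script you call is the AMP runtime, loaded asynchronously -->
    <script async src="https://cdn.ampproject.org/v0.js"></script>

Fewer requests means fewer round trips your phone has to wait on before it can draw the page.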

And here is how Yoast described AMP mobile content:

“Let’s compare this to a race car. If you want to make a race car faster, you give it a faster engine and you strip all the weight. In this weight stripping, you also remove things like back seats, air conditioning, etc. AMP is not unlike that. It’s the trimmed-down version of a normal web, because Google cares for speed more than for nifty features.”

Feel that sting of bitterness in the last line? It’s not your imagination. We’ll get to that in a moment. First, let’s look at an AMP experience to help you see what I mean.

What does AMP-optimized mobile content look like?

Fortunately, you can test a demo of AMP news content.

  1. Whip out that smartphone of yours.
  2. Open up a browser (any browser).
  3. Type in g.co/ampdemo.
  4. Type in popular keyword phrases like “Justin Bieber Birthday,” “Jennifer Lawrence,” or “Yoko Ono” into the Google search box.
  5. Hit “Send.”

[Screenshot: the Google AMP demo search for “Justin Bieber”]

This is an example of how the results will look:

[Screenshot: Google AMP demo search results for “Justin Bieber”]

As you can see, AMP content gets top billing. It’s in the carousel that sits above the fold. The standard entries sit below it.

You can scroll through that carousel and when you find an AMP-powered article you like, click on it. This is how it will look:

[Screenshot: an AMP article opened from the demo results]

Fun reading, eh!

Why is Google doing this, and should you care? (Sort of)

This is where it gets strange.

As I mentioned above, Google has been obsessed with mobile web performance optimization for the last five years, which is why we’ve been writing about the importance of seriously fast website speed (there are six tools to test your site speed in that article, by the way) for just as long.

It’s why we warned you last year about Google’s Mobile Update, “Mobilegeddon.”

So, at this point, AMP is sort of a culmination of Google’s focus on mobile web optimization, but how does it help publishers?

It sounds simplistic, but Google’s thinking goes like this: users love content, they love fast content, and the faster content loads, the more of it gets consumed.

There is something else to this move, though: Google is trying to win the news consumption war.

How is AMP different from Instant Articles and Apple News?

Last year, both Apple and Facebook launched applications that allow users to consume news articles: Apple News and Instant Articles.

Both of these applications are stand-alone products.

Christian Cantrell, a Senior Experience Development Engineer on Adobe’s XD team, wrote on Smashing Magazine, “They’re essentially fancy news aggregators with custom renderers built on top of proprietary syndication formats.”

In other words, Instant Articles and Apple News are RSS reborn.

Google, on the other hand, seeks to go straight to the publishers and get them to optimize mobile content for consumption on the open web, which allows for effortless entry and easy distribution, something you can’t get in an app.

So far, some big-name news publishers have taken part:

  • Time Inc.
  • The Atlantic
  • Vox
  • BBC
  • The Huffington Post

A number of technology companies like WordPress, Twitter, Adobe Analytics, Chartbeat, LinkedIn, and Pinterest joined, too.

Two common complaints about AMP

The complaints come in at least two varieties:

  1. Detractors hate the restrictions on what mobile content can do.
  2. Detractors lament that the small publisher will be punished.

Yoast falls into the first category. Joost de Valk wrote, “AMP restricts what you can do in HTML pages. Fancy design is stripped out in favor of speed. AMP is very much a function over form project.”

Be aware that it’s not just looks that are affected — out of the box, AMP doesn’t support forms, which means AMP pages won’t help you grow your email list. Technical workarounds exist, but the format is still too new to know how they will evolve.

You’ll notice mostly large publishers are using AMP, but does that mean AMP is out to hurt smaller publishers?

Paul Shapiro at Search Engine Land had this to say:

“Although experienced developers can often achieve similar results through intensive performance optimizations, publishers often neglect this due to resource constraints. AMP allows these optimizations to be easily achieved without altering the primary mobile web experience.”

AMP may prove to be a solid tool for optimizing mobile experience. So should you AMP?

Who should worry about AMP?

It depends.

  • Are you a giant publisher of news? Then panic, unless you already have a solid strategy for mobile user experience. Which you should have already had in place months ago, but hey.
  • Do you get a significant share of your traffic from mobile? Again, you need to make sure your strategy for mobile experience is rock solid. You probably want to watch and see how AMP develops, but it might be worth some experiments.
  • Are you in a new market without much competition? Make it a point to optimize your mobile content in the next 30 days, but don’t panic.
  • Don’t fit in any of the categories above? Then sit on your hands and monitor what happens.

“Definitely something to keep on a publisher’s radar,” our Chief Product Officer and StudioPress founder Brian Gardner told me. “But my guess is that it’ll be a fluid deal for some time.”

Who knows? There might be an easier way to do all of this down the road — something Jake Goldman, president and founder of 10up, a WordPress content management consulting agency, also suggested:

“Given time, we suspect that AMP will eventually become as complex as ‘unrestricted’ HTML or be rendered moot by advances in mobile hardware, broadband speeds, and standardized privacy features — a solution for a problem we no longer have.”

But let’s say you do want to play around with AMP.

How to create your first AMP page

This is where you get to pull up your big boy/girl developer pants, because it takes some basic markup to get started with AMP.

I recommend you work your way through this basic tutorial. It will teach you how to:

  • Create your first AMP page using boilerplate code.
  • Stage it.
  • Validate that it’s AMP compliant (this is a super important step) using Google’s validator.
  • Prepare for publication and distribution.

In your markup, you’ll see that some elements get an AMP-specific tag. Thus, the <img> tag becomes <amp-img>, animated GIFs are served with <amp-anim>, and the <video> tag becomes <amp-video>.

And so on.
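To give you a feel for it, here’s a minimal sketch of an AMP page. The canonical URL and image file are placeholders, and I’ve left out the required amp-boilerplate style block for readability, so copy that part verbatim from the tutorial above before you validate.

    <!doctype html>
    <html amp>
      <head>
        <meta charset="utf-8">
        <title>Hello AMP</title>
        <!-- Points to the regular (non-AMP) version of this page; placeholder URL -->
        <link rel="canonical" href="https://example.com/hello.html">
        <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
        <!-- Required amp-boilerplate <style> block omitted here; copy it from the tutorial -->
        <script async src="https://cdn.ampproject.org/v0.js"></script>
      </head>
      <body>
        <h1>Hello, AMP world</h1>
        <!-- Images use <amp-img> and must declare width and height -->
        <amp-img src="welcome.jpg" alt="Welcome" width="800" height="400"></amp-img>
      </body>
    </html>

Run the result through the validator (step three above) before you publish; a page that doesn’t validate won’t get the AMP treatment.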

What should WordPress users do?

If you want to try AMP out, you’re in luck.

On February 24, 2016, Automattic released an official AMP plugin. Therefore, WordPress users are just a downloadable plugin away from AMP-optimized content.

Keep this in mind, however: according to the official AMP plugin page, the plugin does not support pages or archives. Just posts.

But once you activate the plugin — bam — all of your posts are automatically AMPlified!

With the plugin active, all posts on your site will have dynamically generated AMP-compatible versions.

You can see the AMP version of your posts by adding “/amp” to the end of the URL. For instance, yourwebsite.com/amp-wicked-fast becomes yourwebsite.com/amp-wicked-fast/amp.

And if you’re a Rainmaker Platform customer: we’ve got AMP on our radar screens. We’re not going to rush into anything, given Google’s history of dramatic 180 degree turnarounds. But if AMP does prove to be important in the future, we’ll offer an easy AMP solution to our users.

Is AMP a search ranking factor?

No.

Google’s John Mueller stated that AMP is not a ranking factor. He did say that “Converting pages to AMP format will satisfy the mobile-friendly ranking signal, but there’s no ranking signal that’s solely associated with AMP.”

Search Engine Journal staff writer Matt Southern points out:

“My question is, what does it matter if AMP is a ranking signal or not if AMP content already has a one-way ticket to the top of the first page? For the most part AMP content is already ranking above organic results, which is one of the greatest ranking boosts one can ask for.”

So, there is an advantage to getting out there and being ahead of your competitors in the mobile content game — a carrot on a stick the Distilled folks wave in your face in this Whiteboard Friday video.

Another reason I recommend you watch that Whiteboard Friday video is that the folks at Distilled emphasize another important distinction AMP offers: super-fast delivery through a caching server.

Tom Anthony, head of R&D at Distilled, says:

“And then all of this is designed to be really heavily cached so that Google can host these pages, host your actual content right there, and so they don’t even need to fetch it from you anymore.”

Just for grins, this is what the Google Blog had to say about their new approach to caching:

“So, as part of this effort, we’ve designed a new approach to caching that allows the publisher to continue to host their content while allowing for efficient distribution through Google’s high performance global cache. We intend to open our cache servers to be used by anyone free of charge.”

I like free.

Will AMP affect your advertising?

It shouldn’t.

According to Google, “We want to support a comprehensive range of ad formats, ad networks and technologies. Any sites using AMP HTML will retain their choice of ad networks, as well as any formats that don’t detract from the user experience.”

And in case you are wondering, here’s a list of supported ad networks from the official Google AMP Project site:

  • A9
  • Adform
  • AdReactor
  • AdSense
  • AdTech
  • Dot and Media
  • Doubleclick
  • Flite
  • plista
  • Smart AdServer
  • Yieldmo
  • Revcontent

By the way, I think AMP is also meant to soothe publishers’ fears of the growing adoption of ad blockers. But that’s another story for another time. It’s probably time we wrap this up.

Over to you …

Wow. We covered a lot of ground here.

Thank you for sticking with me. Hopefully you found this useful and I answered all of your questions about the Google AMP Project.

If not, feel free to drop me a line in the comments below. Plus, leave a comment if you have anything to add to what I wrote or if I missed something.

Either way, I look forward to hearing from you.

The post Google’s AMP: The Fun and User-Friendly Guide to Accelerated Mobile Pages appeared first on Copyblogger.

