Tag Archive | "Need"

Why Local Businesses Will Need Websites More than Ever in 2019

Posted by MiriamEllis

64% of 1,411 surveyed local business marketers agree that Google is becoming the new “homepage” for local businesses. Via Moz State of Local SEO Industry Report

…but please don’t come away with the wrong storyline from this statistic.

As local brands and their marketers watch Google play Trojan horse, shifting from top benefactor to top competitor by replacing former “free” publicity with paid packs, Local Service Ads, zero-click SERPs, and related structures, it’s no surprise to see forum members asking, “Do I even need a website anymore?”

Our answer to this question is, “Yes, you’ve never needed a website more than you will in 2019.” In this post, we’ll examine:

  • Why it looks like local businesses don’t need websites
  • Statistical proofs of why local businesses need websites now more than ever
  • The current status of local business websites and most-needed improvements

How Google stopped bearing so many gifts

Within recent memory, a Google query with local intent brought up a big pack of ten nearby businesses, with each entry taking the user directly to these brands’ websites for all of their next steps. A modest amount of marketing effort was rewarded with a shower of Google gifts in the form of rankings, traffic, and conversions.

Then these generous SERPs shrank to seven spots, and then three, with the mobile sea change thrown into the bargain and consisting of layers and layers of Google-owned interfaces instead of direct-to-website links. In 2018, when we rustle through the wrapping paper, the presents we find from Google look cheaper, smaller, and less magnificent.

Consider these five key developments:

1) Zero-click mobile SERPs

This slide from a recent presentation by Rand Fishkin encapsulates his findings regarding the growth of no-click SERPs between 2016–2018. Mobile users have experienced a 20% increase in delivery of search engine results that don’t require them to go any deeper than Google’s own interface.

2) The encroachment of paid ads into local packs

When Dr. Peter J. Meyers surveyed 11,000 SERPs in 2018, he found that 35% of competitive local packs feature ads.

3) Google becoming a lead gen agency

At last count, Google’s Local Service Ads program, via which Google interposes itself as the paid lead gen agent between businesses and consumers, has taken over 23 business categories in 77 US cities.

4) Even your branded SERPs don’t belong to you

When a user specifically searches for your brand and your Google Knowledge Panel pops up, you can likely cope with the long-standing “People Also Search For” set of competitors at the bottom of it. But that’s not the same as Google allowing Groupon to advertise at the top of your KP, or putting lead gen from DoorDash and GrubHub front and center to nickel-and-dime you on your own customers’ orders.

5) Google is being called the new “homepage” for local businesses

As highlighted at the beginning of this post, 64% of marketers agree that Google is becoming the new “homepage” for local businesses. This concept, coined by Mike Blumenthal, signifies that a user looking at a Google Knowledge Panel can get basic business info, make a phone call, get directions, book something, ask a question, take a virtual tour, read microblog posts, see hours of operation, thumb through photos, see busy times, and read and leave reviews. Without ever having to click through to a brand’s domain, the user may be fully satisfied.

“Nothing is enough for the man to whom enough is too little.”
- Epicurus

There are many more examples we could gather, but they can all be summed up in one way: None of Google’s most recent local initiatives are about driving customers to brands’ own websites. Local SERPs have shrunk and have been re-engineered to keep users within Google’s platforms to generate maximum revenue for Google and their partners.

You may be as philosophical as Epicurus about this and say that Google has every right to be as profitable as they can with their own product, even if they don’t really need to siphon more revenue off local businesses. But if Google’s recent trajectory causes your brand or agency to conclude that websites have become obsolete in this heavily controlled environment, please keep reading.

Your website is your bedrock

“65% of 1,411 surveyed marketers observe strong correlation between organic and local rank.” – Via Moz State of Local SEO Industry Report

What this means is that businesses which rank highly organically are very likely to have high associated local pack rankings. In the following screenshot, if you take away the directory-type platforms, you will see how the brand websites ranking on page 1 for “deli athens ga” are also the two businesses that have made it into Google’s local pack:

How often do the top 3 Google local pack results also have 1st-page organic rankings?

In a small study, we looked at 15 head keywords across 7 US cities and towns. This yielded 315 possible entries in Google’s local pack. Of those 315, 235 of the businesses ranking in the local packs also had page 1 organic rankings. That’s a 75% correlation between organic website rankings and local pack presence.

*It’s worth noting that where local and organic results did not correlate, it was sometimes due to the presence of spam GMB listings, or to mystery SERPs that did not make sense at first glance, perhaps as a result of Google testing in some cases.

Additionally, in some categories, many local businesses are no longer making it to the first page of Google because the organic SERPs are inundated with best-of lists and directories. Often, local business websites are pushed down to the second page of the organic results. In other words, if spam, “best-ofs,” and mysteries were removed, the local-organic correlation would likely be much higher than 75%.

Further, one recent study found that even when Google’s Local Service Ads are present, 43.9% of clicks went to the organic SERPs. Obviously, if you can make it to the top of the organic SERPs, this puts you in very good CTR shape from a purely organic standpoint.

Your takeaway from this

The local businesses you market may not be able to stave off the onslaught of Google’s zero-click SERPs, paid SERPs, and lead gen features, but where “free” local 3-packs still exist, your very best bet for being included in them is to have the strongest possible website. Moreover, organic SERPs remain a substantial source of clicks.

Far from it being the case that websites have become obsolete, they are the firmest bedrock for maintaining free local SERP visibility amidst an increasing scarcity of opportunities.

This calls for an industry-wide doubling down on organic metrics that matter most.

Bridging the local-organic gap

“We are what we repeatedly do. Excellence, then, is not an act, but a habit.”
- Aristotle

A 2017 CNBC survey found that 45% of small businesses have no website, and, while most large enterprises have websites, many local businesses qualify as “small.”

Moreover, a recent audit of 9,392 Google My Business listings found that 27% have no website link.

When the 1,411 surveyed marketers were asked which one task they want clients to devote more resources to, it’s no coincidence that 66% listed a website-oriented asset. This includes local content development, on-site optimization, local link building, technical analysis of rankings/traffic/conversions, and website design, as shown in the following Moz survey graphic:

In an environment in which websites are table stakes for competitive local pack rankings, virtually all local businesses not only need one, but they need it to be as strong as possible so that it achieves maximum organic rankings.

What makes a website strong?

The Moz Beginner’s Guide to SEO offers incredibly detailed guidelines for creating the best possible website. While we recommend that everyone marketing a local business read through this in-depth guide, we can sum up its contents here by stating that strong websites combine:

  • Technical basics
  • Excellent usability
  • On-site optimization
  • Relevant content publication
  • Publicity

For our present purpose, let’s take a special look at those last three elements.

On-site optimization and relevant content publication

There was a time when on-site SEO and content development were treated almost independently of one another. And while local businesses will need to make a little extra effort to put their basic contact information in prominent places on their websites (such as the footer and Contact Us page), publication and optimization should be viewed as a single topic. A modern strategy takes all of the following into account:

  • Keyword and real-world research tell a local business what consumers want
  • These consumer desires are then reflected in what the business publishes on its website, including its homepage, location landing pages, about page, blog and other components
  • Full reflection of consumer desires includes ensuring that human language (discovered via keyword and real-world research) is implemented in all elements of each page, including its tags, headings, descriptions, text, and in some cases, markup

What we’re describing here isn’t a set of disconnected efforts. It’s a single effort that’s integral to researching, writing, and publishing the website. Far from stuffing keywords into a tag or a page’s content, focus has shifted to building topical authority in the eyes of search engines like Google by building an authoritative resource for a particular consumer demographic. The more closely a business is able to reflect customers’ needs (including the language of their needs), in every possible component of its website, the more relevant it becomes.

A hypothetical example of this would be a large medical clinic in Dallas. Last year, their phone staff was inundated with basic questions about flu shots, like where and when to get them, what they cost, would they cause side effects, what about side effects on people with pre-existing health conditions, etc. This year, the medical center’s marketing team took a look at Moz Keyword Explorer and saw that there’s an enormous volume of questions surrounding flu shots:

This tiny segment of the findings of the free keyword research tool, Answer the Public, further illustrates how many questions people have about flu shots:

The medical clinic need not compete nationally for these topics, but at a local level, a page on the website can answer nearly every question a nearby patient could have about this subject. The page, created properly, will reflect human language in its tags, headings, descriptions, text, and markup. It will tell all patients where to come and when to come for this procedure. It has the potential to cut down on time-consuming phone calls.

And, finally, it will build topical authority in the eyes of Google to strengthen the clinic’s chances of ranking well organically… which can then translate to improved local rankings.

It’s important to note that keyword research tools typically do not reflect location very accurately, so research is typically done at a national level, and then adjusted to reflect regional or local language differences and geographic terms, after the fact. In other words, a keyword tool may not accurately reflect exactly how many local consumers in Dallas are asking “Where do I get a flu shot?”, but keyword and real-world research signals that this type of question is definitely being asked. The local business website can reflect this question while also adding in the necessary geographic terms.
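
As one concrete illustration of the markup piece (a minimal sketch, assuming the clinic chose schema.org FAQPage structured data and these hypothetical questions and answers; none of this comes from the original example), the question-and-answer content could also be expressed as JSON-LD generated by the page template:

  // Hypothetical Q&A pairs distilled from keyword and real-world research.
  interface QA {
    question: string;
    answer: string;
  }

  const fluShotQAs: QA[] = [
    {
      question: "Where can I get a flu shot in Dallas?",
      answer: "Walk-in flu shots are available at our main clinic; no appointment is needed.",
    },
    {
      question: "How much does a flu shot cost?",
      answer: "Most insurance plans cover flu shots in full; self-pay pricing is listed on our pricing page.",
    },
    {
      question: "Does the flu shot have side effects?",
      answer: "Mild soreness or a low-grade fever can occur; patients with pre-existing conditions should call us first.",
    },
  ];

  // Build a schema.org FAQPage object and serialize it to JSON-LD for the page's <head>.
  function buildFaqJsonLd(qas: QA[]): string {
    return JSON.stringify(
      {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        mainEntity: qas.map((qa) => ({
          "@type": "Question",
          name: qa.question,
          acceptedAnswer: { "@type": "Answer", text: qa.answer },
        })),
      },
      null,
      2
    );
  }

  // The resulting string goes inside a <script type="application/ld+json"> tag on the flu shot page.
  console.log(buildFaqJsonLd(fluShotQAs));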

Local link building must be brought to the fore of publicity efforts

Moz’s industry survey found that more than one-third of respondents had no local link building strategy in place. Meanwhile, link building was listed as one of the top three tasks to which marketers want their clients to devote more resources. There’s clearly a disconnect going on here. Given the fundamental role links play in building Domain Authority, organic rankings, and subsequent local rankings, building strong websites means bridging this gap.

First, it might help to examine old prejudices that could cause local business marketers and their clients to feel dubious about link building. These most likely stem from link spam which has gotten so out of hand in the general world of SEO that Google has had to penalize it and filter it to the best of their ability.

Not long ago, many digital-only businesses were having a heyday with paid links, link farms, reciprocal links, abusive link anchor text, and the like. An online company might accrue thousands of links from completely irrelevant sources, all in hopes of escalating rank. Clearly, these practices aren’t ones an ethical business can feel good about investing in, but they do serve as an interesting object lesson, especially when a local marketer can point out to a client that the best local links typically result from real-world relationship-building.

Local businesses are truly special because they serve a distinct, physical community made up of their own neighbors. The more involved a local business is in its own community, the more naturally link opportunities arise from things like local:

  • Sponsorships
  • Event participation and hosting
  • Online news
  • Blogs
  • Business associations
  • B2B cross-promotions

There are so many ways a local business can build genuine topical and domain authority in a given community by dint of the relationships it develops with neighbors.

An excellent way to get started on this effort is to look at high-ranking local businesses in the same or similar business categories to discover what work they’ve put in to achieve a supportive backlink profile. Moz Link Intersect is an extremely actionable resource for this, enabling a business to input its top competitors to find who is linking to them.

In the following example, a small B&B in Albuquerque looks up two luxurious Tribal resorts in its city:

Link Intersect then lists out a blueprint of opportunities, showing which links one or both competitors have earned. Drilling down, the B&B finds that Marriott.com is linking to both Tribal resorts on an Albuquerque things-to-do page:

The small B&B can then try to earn a spot on that same page, because it hosts lavish tea parties as a thing-to-do. Outreach could depend on the B&B owner knowing someone who works at the local Marriott personally. It could include meeting with them in person, or on the phone, or even via email. If this outreach succeeds, an excellent, relevant link will have been earned to boost organic rank, underpinning local rank.

Then, repeat the process. Aristotle might well have been speaking of link building when he said we are what we repeatedly do and that excellence is a habit. Good marketers can teach customers to have excellent habits in recognizing a good link opportunity when they see it.

Taken altogether

Without a website, a local business lacks the brand-controlled publishing and link-earning platform that so strongly influences organic rankings. In the absence of this, the chances of ranking well in competitive local packs will be significantly less. Taken altogether, the case is clear for local businesses investing substantially in their websites.

Acting now is actually a strategy for the future

“There is nothing permanent except change.”
- Heraclitus

You’ve now determined that strong websites are fundamental to local rankings in competitive markets. You’ve absorbed numerous reasons to encourage local businesses you market to prioritize care of their domains. But there’s one more thing you’ll need to be able to convey, and that’s a sense of urgency.

Right now, every single customer you can still earn from a free local pack listing is immensely valuable for the future.

This isn’t a customer you’ve had to pay Google for, as you very well might six months, a year, or five years from now. Yes, you’ve had to invest plenty in developing the strong website that contributed to the high local ranking, but you haven’t paid a penny directly to Google for this particular lead. Soon, you may be having to fork over commissions to Google for a large portion of your new customers, so acting now is like insurance against future spend.

For this to work out properly, local businesses must take the leads Google is sending them right now for free, and convert them into long-term, loyal customers, with an ultimate value of multiple future transactions without Google as the middleman. And if these freely won customers can be inspired to act as word-of-mouth advocates for your brand, you will have done something substantial to develop a stream of non-Google-dependent revenue.

This offer may well expire as time goes by. When it comes to the capricious local SERPs, marketers resemble the Greek philosophers who knew that change is the only constant. The Trojan horse has rolled into every US city, and it’s a gift with a questionable shelf life. We can’t predict if or when free packs might become obsolete, but we share your concerns about the way the wind is blowing.

What we can see clearly right now is that websites will be anything but obsolete in 2019. Rather, they are the building blocks of local rankings, precious free leads, and loyal revenue, regardless of how SERPs may alter in future.

For more insights into where local businesses should focus in 2019, be sure to explore the Moz State of Local SEO industry report:

Read the State of Local SEO industry report



Moz Blog

Posted in Latest News | Comments Off

Why Content Marketers Need Editors

I’m good at math. If you looked at my standardized test results from when I was back in school, you’d…

The post Why Content Marketers Need Editors appeared first on Copyblogger.


Copyblogger


Posted in Latest News | Comments Off

Do You Need Local Pages? – Whiteboard Friday

Posted by Tom.Capper

Does it make sense for you to create local-specific pages on your website? Regardless of whether you own or market a local business, it may make sense to compete for space in the organic SERPs using local pages. Please give a warm welcome to our friend Tom Capper as he shares a 4-point process for determining whether local pages are something you should explore in this week’s Whiteboard Friday!


Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hello, Moz fans. Welcome to another Whiteboard Friday. I’m Tom Capper. I’m a consultant at Distilled, and today I’m going to be talking to you about whether you need local pages. Just to be clear right off the bat what I’m talking about, I’m not talking about local rankings as we normally think of them, the local map pack results that you see in search results, the Google Maps rankings, that kind of thing.

A 4-step process to deciding whether you need local pages

I’m talking about conventional, 10 blue links rankings but for local pages, and by local pages I mean pages from a national or international business that are location-specific. What are some examples of that? Maybe on Indeed.com they would have a page for jobs in Seattle. Indeed doesn’t have a bricks-and-mortar premises in Seattle, but they do have a page that is about jobs in Seattle.

You might get a similar thing with flower delivery. You might get a similar thing with used cars, all sorts of different verticals. I think it can actually be quite a broadly applicable tactic. There’s a four-step process I’m going to outline for you. The first step is actually not on the board. It’s just doing some keyword research.

1. Know (or discover) your key transactional terms

I haven’t done much on that here because hopefully you’ve already done that. You already know what your key transactional terms are. Whatever happens, you don’t want to end up developing location pages for too many different keyword types, because it’s going to bloat your site, so you probably just need to pick one or two key transactional terms that you’re going to make the local variants of. For this purpose, I’m going to talk through an SEO job board as an example.

2. Categorize your keywords as implicit, explicit, or near me and log their search volumes

We might have “SEO jobs” as our core head term. We then want to figure out what the implicit, explicit, and near me versions of that keyword are and what the different volumes are. In this case, the implicit version is probably just “SEO jobs.” If you search for “SEO jobs” now, like if you open a new tab in your browser, you’re probably going to find that a lot of local orientated results appear because that is an implicitly local term and actually an awful lot of terms are using local data to affect rankings now, which does affect how you should consider your rank tracking, but we’ll get on to that later.

SEO jobs, maybe SEO vacancies, that kind of thing, those are all going to be going into your implicitly local terms bucket. The next bucket is your explicitly local terms. That’s going to be things like SEO jobs in Seattle, SEO jobs in London, and so on. You’re never going to get a complete coverage of different locations. Try to keep it simple.

You’re just trying to get a rough idea here. Lastly you’ve got your near me or nearby terms, and it turns out that for SEO jobs not many people search SEO jobs near me or SEO jobs nearby. This is also going to vary a lot by vertical. I would imagine that if you’re in food delivery or something like that, then that would be huge.
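
As a rough sketch of how that bucketing might be automated at scale (the city list and patterns here are illustrative assumptions, not part of the process described above), you could classify each keyword and then total the search volume per bucket:

  type Bucket = "implicit" | "explicit" | "near me";

  // Illustrative location list; in practice this would come from your own market data.
  const cities = ["seattle", "london", "new york"];

  function bucketKeyword(keyword: string): Bucket {
    const kw = keyword.toLowerCase();
    if (/\bnear me\b|\bnearby\b/.test(kw)) return "near me";
    if (cities.some((city) => kw.includes(city))) return "explicit";
    return "implicit"; // no location modifier, so the term is implicitly local
  }

  // Total search volume per bucket from a keyword/volume export.
  function volumeByBucket(rows: { keyword: string; volume: number }[]): Record<Bucket, number> {
    const totals: Record<Bucket, number> = { implicit: 0, explicit: 0, "near me": 0 };
    for (const row of rows) {
      totals[bucketKeyword(row.keyword)] += row.volume;
    }
    return totals;
  }

  console.log(
    volumeByBucket([
      { keyword: "seo jobs", volume: 1900 },
      { keyword: "seo jobs in seattle", volume: 90 },
      { keyword: "seo jobs near me", volume: 30 },
    ])
  );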

3. Examine the SERPs to see whether local-specific pages are ranking

Now we’ve categorized our keywords. We want to figure out what kind of results are going to do well for what kind of keywords, because obviously if local pages is the answer, then we might want to build some.

In this case, I’m looking at the SERP for “SEO jobs.” This is imaginary. The rankings don’t really look like this. But we’ve got SEO jobs in Seattle from Indeed. That’s an example of a local page, because this is a national business with a location-specific page. Then we’ve got SEO jobs Glassdoor. That’s a national page, because in this case they’re not putting anything on this page that makes it location specific.

Then we’ve got SEO jobs Seattle Times. That’s a local business. The Seattle Times only operates in Seattle. It probably has a bricks-and-mortar location. If you’re going to be pulling a lot of data of this type, maybe from stats or something like that, obviously tracking from the locations that you’re mentioning, where you are mentioning locations, then you’re probably going to want to categorize these at scale rather than going through one at a time.

I’ve drawn up a little flowchart here that you could encapsulate in an Excel formula or something like that. If the location is mentioned in the URL and in the domain, then we know we’ve got a local business. Most of the time it’s just a rule of thumb. If the location is mentioned in the URL but not mentioned in the domain, then we know we’ve got a local page, and so on.
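
A minimal sketch of that flowchart as a function you could run over a ranking export (the location matching below is a simplifying assumption; treat it as the same rule of thumb):

  type ResultType = "local business" | "local page" | "national page";

  // Classify a ranking URL by whether the location appears in the domain and/or the rest of the URL.
  function classifyResult(url: string, location: string): ResultType {
    const { hostname, pathname, search } = new URL(url);
    const compact = location.toLowerCase().replace(/\s+/g, "");
    const hyphenated = location.toLowerCase().replace(/\s+/g, "-");
    const rest = (pathname + search).toLowerCase();
    if (hostname.toLowerCase().includes(compact)) return "local business"; // e.g. seattletimes.com
    if (rest.includes(compact) || rest.includes(hyphenated)) return "local page"; // e.g. indeed.com/.../seattle
    return "national page"; // no location anywhere in the URL
  }

  console.log(classifyResult("https://www.indeed.com/q-SEO-l-Seattle-jobs.html", "Seattle")); // "local page"
  console.log(classifyResult("https://www.seattletimes.com/jobs/seo", "Seattle")); // "local business"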

4. Compare & decide where to focus your efforts

You can just sort of categorize at scale all the different result types that we’ve got. Then we can start to fill out a chart like this using the rankings. What I’d recommend doing is finding a click-through rate curve that you are happy to use. You could go to somewhere like AdvancedWebRanking.com, download some example click-through rate curves.

Again, this doesn’t have to be super precise. We’re looking to get a proportionate directional indication of what would be useful here. I’ve got Implicit, Explicit, and Near Me keyword groups. I’ve got Local Business, Local Page, and National Page result types. Then I’m just figuring out what the visibility share of all these types is. In my particular example, it turns out that for explicit terms, it could be worth building some local pages.
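
As a sketch of how that chart could be filled in programmatically (the CTR values below are placeholders, and the row format is an assumption; substitute whichever published curve you downloaded):

  // Placeholder CTR-by-position curve; swap in the curve you chose.
  const ctrCurve = [0.3, 0.15, 0.1, 0.07, 0.05, 0.04, 0.03, 0.025, 0.02, 0.015];

  interface RankingRow {
    keywordBucket: "implicit" | "explicit" | "near me";
    resultType: "local business" | "local page" | "national page";
    position: number; // 1-based organic position
    searchVolume: number; // monthly volume for the keyword
  }

  // Estimated clicks for a row = volume x CTR at its position (0 beyond the curve).
  function estimatedClicks(row: RankingRow): number {
    const ctr = ctrCurve[row.position - 1] ?? 0;
    return row.searchVolume * ctr;
  }

  // Share of estimated clicks captured by each result type within each keyword bucket.
  function visibilityShare(rows: RankingRow[]): Record<string, Record<string, number>> {
    const clicks: Record<string, Record<string, number>> = {};
    const totals: Record<string, number> = {};
    for (const row of rows) {
      const c = estimatedClicks(row);
      clicks[row.keywordBucket] = clicks[row.keywordBucket] ?? {};
      clicks[row.keywordBucket][row.resultType] = (clicks[row.keywordBucket][row.resultType] ?? 0) + c;
      totals[row.keywordBucket] = (totals[row.keywordBucket] ?? 0) + c;
    }
    for (const bucket of Object.keys(clicks)) {
      for (const type of Object.keys(clicks[bucket])) {
        clicks[bucket][type] /= totals[bucket] || 1;
      }
    }
    return clicks; // e.g. { explicit: { "local page": 0.55, "national page": 0.45 } }
  }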

That’s all. I’d love to hear your thoughts in the comments. Thanks.

Video transcription by Speechpad.com



Moz Blog

Posted in Latest News | Comments Off

The Minimum Viable Knowledge You Need to Work with JavaScript & SEO Today

Posted by sergeystefoglo

If your work involves SEO at some level, you’ve most likely been hearing more and more about JavaScript and the implications it has on crawling and indexing. Frankly, Googlebot struggles with it, and many websites utilize modern-day JavaScript to load in crucial content today. Because of this, we need to be equipped to discuss this topic when it comes up in order to be effective.

The goal of this post is to equip you with the minimum viable knowledge required to do so. This post won’t go into the nitty gritty details, describe the history, or give you extreme detail on specifics. There are a lot of incredible write-ups that already do this — I suggest giving them a read if you are interested in diving deeper (I’ll link out to my favorites at the bottom).

In order to be effective consultants when it comes to the topic of JavaScript and SEO, we need to be able to answer three questions:

  1. Does the domain/page in question rely on client-side JavaScript to load/change on-page content or links?
  2. If yes, is Googlebot seeing the content that’s loaded in via JavaScript properly?
  3. If not, what is the ideal solution?

With some quick searching, I was able to find three examples of landing pages that utilize JavaScript to load in crucial content.

I’m going to be using Sitecore’s Symposium landing page through each of these talking points to illustrate how to answer the questions above.

We’ll cover the “how do I do this” aspect first, and at the end I’ll expand on a few core concepts and link to further resources.

Question 1: Does the domain in question rely on client-side JavaScript to load/change on-page content or links?

The first step to diagnosing any issues involving JavaScript is to check if the domain uses it to load in crucial content that could impact SEO (on-page content or links). Ideally this will happen anytime you get a new client (during the initial technical audit), or whenever your client redesigns/launches new features of the site.

How do we go about doing this?

Ask the client

Ask, and you shall receive! Seriously though, one of the quickest/easiest things you can do as a consultant is contact your POC (or developers on the account) and ask them. After all, these are the people who work on the website day-in and day-out!

“Hi [client], we’re currently doing a technical sweep on the site. One thing we check is if any crucial content (links, on-page content) gets loaded in via JavaScript. We will do some manual testing, but an easy way to confirm this is to ask! Could you (or the team) answer the following, please?

1. Are we using client-side JavaScript to load in important content?

2. If yes, can we get a bulleted list of where/what content is loaded in via JavaScript?”

Check manually

Even on a large e-commerce website with millions of pages, there are usually only a handful of important page templates. In my experience, it should only take an hour max to check manually. I use the Chrome Web Developers plugin, disable JavaScript from there, and manually check the important templates of the site (homepage, category page, product page, blog post, etc.)

In the example above, once we turn off JavaScript and reload the page, we can see that we are looking at a blank page.

As you make progress, jot down notes about content that isn’t being loaded in, is being loaded in wrong, or any internal linking that isn’t working properly.

At the end of this step we should know if the domain in question relies on JavaScript to load/change on-page content or links. If the answer is yes, we should also know where this happens (homepage, category pages, specific modules, etc.)

Crawl

You could also crawl the site (with a tool like Screaming Frog or Sitebulb) with JavaScript rendering turned off, and then run the same crawl with JavaScript turned on, and compare the differences with internal links and on-page elements.

For example, it could be that when you crawl the site with JavaScript rendering turned off, the title tags don’t appear. In my mind this would trigger an action to crawl the site with JavaScript rendering turned on to see if the title tags do appear (as well as checking manually).
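
One way to make that comparison concrete is a small script over the two exports (a rough sketch; the three-column CSV format and file names are assumptions, not a fixed export format from any particular crawler):

  import * as fs from "fs";

  interface PageRow {
    url: string;
    title: string;
    h1: string;
  }

  // Naive reader for an assumed url,title,h1 export; real exports may need a proper CSV parser.
  function readCrawl(path: string): Map<string, PageRow> {
    const rows = new Map<string, PageRow>();
    const lines = fs.readFileSync(path, "utf8").trim().split("\n").slice(1); // skip header row
    for (const line of lines) {
      const [url, title = "", h1 = ""] = line.split(",");
      rows.set(url, { url, title, h1 });
    }
    return rows;
  }

  // Flag pages whose title or h1 only shows up once JavaScript has been rendered.
  function compareCrawls(noJsPath: string, jsPath: string): void {
    const noJs = readCrawl(noJsPath);
    const withJs = readCrawl(jsPath);
    for (const [url, rendered] of withJs) {
      const raw = noJs.get(url);
      if (!raw || (!raw.title && rendered.title) || (!raw.h1 && rendered.h1)) {
        console.log(`JavaScript-dependent content: ${url}`);
      }
    }
  }

  compareCrawls("crawl-no-js.csv", "crawl-js.csv");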

Example

For our example, I went ahead and did a manual check. As we can see from the screenshot below, when we disable JavaScript, the content does not load.

In other words, the answer to our first question for this page is “yes, JavaScript is being used to load in crucial parts of the site.”

Question 2: If yes, is Googlebot seeing the content that’s loaded in via JavaScript properly?

If your client is relying on JavaScript on certain parts of their website (in our example they are), it is our job to try and replicate how Google is actually seeing the page(s). We want to answer the question, “Is Google seeing the page/site the way we want it to?”

In order to get a more accurate depiction of what Googlebot is seeing, we need to attempt to mimic how it crawls the page.

How do we do that?

Use Google’s new mobile-friendly testing tool

At the moment, the quickest and most accurate way to try and replicate what Googlebot is seeing on a site is by using Google’s new mobile friendliness tool. My colleague Dom recently wrote an in-depth post comparing Search Console Fetch and Render, Googlebot, and the mobile friendliness tool. His findings were that most of the time, Googlebot and the mobile friendliness tool resulted in the same output.

In Google’s mobile friendliness tool, simply input your URL, hit “run test,” and then once the test is complete, click on “source code” on the right side of the window. You can take that code and search for any on-page content (title tags, canonicals, etc.) or links. If they appear here, Google is most likely seeing the content.

Search for visible content in Google

It’s always good to sense-check. Another quick way to check whether Googlebot has indexed content on your page is by simply selecting visible text on your page and doing a site:search for it in Google with quotation marks around said text.

In our example there is visible text on the page that reads…

“Whether you are in marketing, business development, or IT, you feel a sense of urgency. Or maybe opportunity?”

When we do a site:search for this exact phrase, for this exact page, we get nothing. This means Google hasn’t indexed the content.
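
To be clear about the format, the check boils down to a query of the form site:yourdomain.com "an exact sentence that is visible on the page" (domain and phrase substituted for your own page). If Google returns nothing for text that plainly appears on the page, the rendered content most likely hasn’t made it into the index.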

Crawling with a tool

Most crawling tools have the functionality to crawl JavaScript now. For example, in Screaming Frog you can head to Configuration > Spider > Rendering, then select “JavaScript” from the dropdown and hit save. DeepCrawl and Sitebulb both have this feature as well.

From here you can input your domain/URL and see the rendered page/code once your tool of choice has completed the crawl.

Example:

When attempting to answer this question, my preference is to start by inputting the domain into Google’s mobile friendliness tool, copying the source code, and searching for important on-page elements (think title tag, <h1>, body copy, etc.). It’s also helpful to use a tool like Diff Checker to compare the rendered HTML with the original HTML (Screaming Frog also has a function where you can do this side by side).

For our example, here is what the output of the mobile friendliness tool shows us.

After a few searches, it becomes clear that important on-page elements are missing here.

We also did the second test and confirmed that Google hasn’t indexed the body content found on this page.

The implication at this point is that Googlebot is not seeing our content the way we want it to, which is a problem.

Let’s jump ahead and see what we can recommend to the client.

Question 3: If we’re confident Googlebot isn’t seeing our content properly, what should we recommend?

Now we know that the domain is using JavaScript to load in crucial content and we know that Googlebot is most likely not seeing that content, the final step is to recommend an ideal solution to the client. Key word: recommend, not implement. It’s 100% our job to flag the issue to our client, explain why it’s important (as well as the possible implications), and highlight an ideal solution. It is 100% not our job to try to do the developer’s job of figuring out an ideal solution with their unique stack/resources/etc.

How do we do that?

You want server-side rendering

The main reason Google is having trouble seeing Sitecore’s landing page right now is that Sitecore’s landing page is asking the user (us, Googlebot) to do the heavy work of loading the JavaScript on the page. In other words, they’re using client-side JavaScript.

Googlebot is literally landing on the page, trying to execute JavaScript as best as possible, and then needing to leave before it has a chance to see any content.

The fix here is to instead have Sitecore’s landing page load on their server. In other words, we want to take the heavy lifting off of Googlebot, and put it on Sitecore’s servers. This will ensure that when Googlebot comes to the page, it doesn’t have to do any heavy lifting and instead can crawl the rendered HTML.

In this scenario, Googlebot lands on the page and already sees the HTML (and all the content).
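
To illustrate the difference (a minimal sketch using Node’s built-in http module; this is in no way Sitecore’s actual stack or a production setup), a server-rendered response already contains the content in the initial HTML, so nothing has to be executed in the browser before a crawler can see it:

  import * as http from "http";

  // Content that, in a client-side setup, would only appear after the browser ran JavaScript.
  function renderPage(): string {
    const heading = "Symposium";
    const body =
      "Whether you are in marketing, business development, or IT, you feel a sense of urgency. Or maybe opportunity?";
    // The HTML is assembled on the server, so the very first response already carries the content.
    return [
      "<!doctype html>",
      "<html>",
      `  <head><title>${heading}</title></head>`,
      "  <body>",
      `    <h1>${heading}</h1>`,
      `    <p>${body}</p>`,
      "  </body>",
      "</html>",
    ].join("\n");
  }

  http
    .createServer((req, res) => {
      res.writeHead(200, { "Content-Type": "text/html" });
      res.end(renderPage()); // Googlebot receives fully rendered HTML on the first request.
    })
    .listen(3000);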

There are more specific options (like isomorphic setups)

This is where it gets to be a bit in the weeds, but there are hybrid solutions. The best one at the moment is called isomorphic.

In this model, we’re asking the client to load the first request on their server, and then any future requests are made client-side.

So Googlebot comes to the page, the client’s server has already executed the initial JavaScript needed for the page, sends the rendered HTML down to the browser, and anything after that is done on the client-side.

If you’re looking to recommend this as a solution, please read this post from the Airbnb team, which covers isomorphic setups in detail.

AJAX crawling = no go

I won’t go into details on this, but just know that Google’s previous AJAX crawling solution for JavaScript has since been discontinued and will eventually not work. We shouldn’t be recommending this method.

(However, I am interested to hear any case studies from anyone who has implemented this solution recently. How has Google responded? Also, here’s a great write-up on this from my colleague Rob.)

Summary

At the risk of severely oversimplifying, here’s what you need to do in order to start working with JavaScript and SEO in 2018:

  1. Know when/where your client’s domain uses client-side JavaScript to load in on-page content or links.
    1. Ask the developers.
    2. Turn off JavaScript and do some manual testing by page template.
    3. Crawl using a JavaScript crawler.
  2. Check to see if GoogleBot is seeing content the way we intend it to.
    1. Google’s mobile friendliness checker.
    2. Doing a site:search for visible content on the page.
    3. Crawl using a JavaScript crawler.
  3. Give an ideal recommendation to client.
    1. Server-side rendering.
    2. Hybrid solutions (isomorphic).
    3. Not AJAX crawling.

Further resources

I’m really interested to hear about any of your experiences with JavaScript and SEO. What are some examples of things that have worked well for you? What about things that haven’t worked so well? If you’ve implemented an isomorphic setup, I’m curious to hear how that’s impacted how Googlebot sees your site.



Moz Blog

Posted in Latest News | Comments Off

Want to target position 0? Here’s what you need to make that happen

Hey Google, how do you become the answer people hear on their voice assistants? Contributor Karen Bone explains how to make that happen by doing your homework on featured snippets.

The post Want to target position 0? Here’s what you need to make that happen appeared first on Search Engine…



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in Latest News | Comments Off

Do Content Writers Really Need to Think about SEO?

In my experience, creative writing pros have an endless appetite for writing advice. How to add more color and texture to your writing, storytelling techniques, endless discussions about the serial comma and finer points of usage. Elements like copywriting and conversion strategy? That tends to start to divide people up. Some writers want to pick
Read More…

The post Do Content Writers Really Need to Think about SEO? appeared first on Copyblogger.


Copyblogger

Posted in Latest News | Comments Off

You Need Both of These Skill Sets to Keep Your Audience Coming Back for More

When I’m not performing my typical duties as Rainmaker Digital’s Marketing Technologist, I’m cooking up a storm in my kitchen. Amidst the rhythmic chopping of fresh produce, the clashing of pots and pans, and the roar of boiling water, I realized that my two roles have a lot in common. They both require a balance
Read More…

The post You Need Both of These Skill Sets to Keep Your Audience Coming Back for More appeared first on Copyblogger.


Copyblogger

Posted in Latest News | Comments Off

Three Killer Skills Professional Writers Need to Succeed in 2018

What brought you here today? What are you hoping to learn, be, become, do, or change by reading Copyblogger? We’ll be asking that question a lot in the coming year, but while we wait (feel free to answer in the comments below — we’d love to hear it), allow us to talk about why we
Read More…

The post Three Killer Skills Professional Writers Need to Succeed in 2018 appeared first on Copyblogger.


Copyblogger

Posted in Latest News | Comments Off

State of Enterprise SEO 2017: Overworked SEOs Need Direction

Posted by NorthStarInbound

This survey and its analysis was co-authored with North Star Inbound’s senior creative strategist, Andrea Pretorian.

In the spring of 2017, North Star Inbound partnered up with seoClarity and BuzzStream to survey the state of enterprise SEO. We had a fair share of anecdotal evidence from our clients, but we wanted a more objective measurement of how SEO teams are assembled, what resources are allocated to them, what methods they use, and how they perform.

We hadn’t seen such data collected, particularly for enterprise SEO. We found this surprising given its significance, evident even in the number of “enterprise SEO tools” and solutions being marketed.

What is enterprise SEO?

There is no single fixed industry definition of “enterprise” beyond “large business.” For the purposes of this survey, we defined enterprise businesses as those with 500 or more employees. “Small enterprise” means 500–1000 employees, while “large enterprise” means over 1000 employees.

Industry discussion often points to the number of pages as being a potential defining factor for enterprise SEO, but even that is not necessarily a reliable measure.

What was our survey methodology?

We developed the widest enterprise SEO survey to date, made up of 29 questions that delved into every aspect of the enterprise SEO practice. From tools and tactics to content development, keyword strategy, and more, we left no stone unturned. We then picked the brains of 240 SEO specialists across the country. You can check out our complete survey, methodology, and results here.

Team size matters — or does it?

Let’s start by looking at enterprise team size and the resources allocated to them. We focused on companies with an in-house SEO team, and broke them down in terms of small (500–1000 employees) and large enterprise (>1000 employees).

We found that 76% of small enterprise companies have in-house SEO teams of 5 people or less, but were surprised that 68% of large enterprise companies also had teams of this size. We expected a more pronounced shift into larger team sizes paralleling the larger size of their parent company; we did not expect to see roughly the same team size across small and large enterprise companies.


Interestingly, in larger companies we also see less confidence in the team’s experience in SEO. Of the companies with in-house SEO, only 31.67% of large enterprise teams called themselves “leaders” in the SEO space, which was defined in this survey as part of a team engaged broadly and critically within the business. 40% of small enterprise teams called themselves “leaders.” In terms of viewing themselves more positively (leaders, visionaries) or less (SEO pioneers in their company or else new SEO teams), we did not notice a big difference between small or large enterprise in-house SEO teams.

Large enterprise companies should have more resources at their disposal — HR teams to hire the best talent, reliable onboarding practices in place, access to more sophisticated project management tools, and more experience managing teams — which makes these results surprising. Why are large enterprise companies not more confident about their SEO skills and experience?

Before going too far in making assumptions about their increased resources, we made sure to ask our survey-takers about this. Specifically, we asked for how much budget is allocated to SEO activity per month — not including the cost of employees’ salaries, or the overhead costs of keeping the lights on — since this would result in a figure easier to report consistently across all survey takers.

It turns out that 57% of large enterprise companies had over $10K dedicated strictly to SEO activity each month, in contrast to just 24% of small enterprise companies allocating this much budget. 40% of large enterprise had over $20K dedicated to SEO activity each month, suggesting that SEO is a huge priority for them. And yet, as we saw earlier, they are not sold on their team having reached leader status.

Enterprise SEO managers in large companies value being scalable and repeatable

We asked survey takers to rate the success of their current SEO strategy, per the scale mapped below, and here are the results:


A smaller percentage of large enterprise SEOs had a clearly positive rating of the current success of their SEO strategy than did small enterprise SEOs. We even see more large enterprise SEOs “on the fence” about their strategy’s performance as opposed to small. This suggests that, from the enterprise SEOs we surveyed, the ones who work for smaller companies tend to be slightly more optimistic about their campaigns’ performance than the larger ones.

What’s notable about the responses to this question is that 18.33% of managers at large enterprise companies would rate themselves as successful — calling themselves “scalable and repeatable.” No one at a small enterprise selected this to describe their strategy. We clearly tapped into an important value for these teams, who use it enough to measure their performance that it’s a value they can report on to others as a benchmark of their success.

Anyone seeking to work with large enterprise clients needs to make sure their processes are scalable and repeatable. This also suggests that one way for a growing company to step up its SEO team’s game as it grows is by achieving these results. This would be a good topic for us to address in greater detail in articles, webinars, and other industry communication.

Agencies know best? (Agencies think they know best.)

Regardless of the resources available to them, across the board we see that in-house SEOs do not show as much confidence as agencies. Agencies are far more likely to rate their SEO strategy as successful: 43% of survey takers who worked for agencies rated their strategy as outright successful, as opposed to only 13% of in-house SEOs. That’s huge!

While nobody said their strategy was a total disaster — we clearly keep awesome company — 7% of in-house SEOs expressed frustration with their strategy, as opposed to only 1% of agencies.

Putting our bias as a link building agency aside, we would expect in-house SEO enterprise teams to work like in-house agencies. With the ability to hire top talent and purchase enterprise software solutions to automate and track campaigns, we expect them to have the appropriate tools and resources at their disposal to generate the same results and confidence as any agency.

So why the discrepancy? It’s hard to say for sure. One theory might be that those scalable, repeatable results we found earlier that serve as benchmarks for enterprise are difficult to attain, but the way agencies evolve might serve them better. Agencies tend to develop somewhat organically — expanding their processes over time and focusing on SEO from day one — as opposed to an in-house team in a company, which rarely was there from day one and, more often than not, sprouted up when the company’s growth made it such that marketing became a priority.

One clue for answering this question might come from examining the differences between how agencies and in-house SEO teams responded to the question asking them what they find to be the top two most difficult SEO obstacles they are currently facing.

Agencies have direction, need budget; in-house teams have budget, need direction

If we look at the top three obstacles faced by agencies and in-house teams, both of them place finding SEO talent up there. Both groups also say that demonstrating ROI is an issue, although it’s more of an obstacle for agencies rather than in-house SEO teams.

When we look at the third obstacle for each group, we find the biggest reveal. While agencies find themselves hindered by trying to secure enough budget, in-house SEO teams struggle to develop the right content; this seems in line with the point we made in the previous section comparing agency versus in-house success. Agencies have the processes down, but need to work hard to fit their clients’ budgets. In-house teams have the budget they need, but have trouble lining it up with the exact processes their company needs to grow as desired. The fact that almost half of the in-house SEOs rank developing the right content as their biggest obstacle, as opposed to just over a quarter of agencies, further supports this, particularly given how important content is to any marketing campaign.

Now, let’s take a step back and dig deeper into that second obstacle we noted: demonstrating ROI.

Everyone seems to be measuring success differently

One question that we asked of survey takers was about the top two technical SEO issues they monitor:

The spread across the different factors was roughly the same across the two groups. The most notable difference was that even more in-house SEO teams looked at page speed, although this was the top factor for both groups. Indexation was the second biggest factor for both groups, followed by duplicate content. There seems to be some general consensus about monitoring technical SEO issues.

But when we asked everyone what their top two factors are when reviewing their rankings, we got these results:

For both agencies and in-house SEO teams, national-level keywords were the top factor, although this was true for almost three-quarters of in-house SEOs and about half of agencies. Interestingly, agencies focused a bit more on geo/local keywords as well as mobile. When we first opened up this data, we found this striking, because it suggests a narrative where in-house SEO teams focus on more conservative, “seasoned” methods, while agencies are more likely to stay on the cutting edge.

Looking at the “Other” responses (free response), we had several write-ins from both subgroups who answered that traffic and leads were important to them. One agency survey-taker brought up a good point: that what they monitor “differs by client.” We would be remiss if we did not mention the importance of vertical-specific and client-specific approaches — even if you are working in-house, and your only client is your company. From this angle, it makes sense that everyone is measuring rankings and SEO differently.

However, we would like to see a bit more clarity within the community on setting these parameters, and we hope that these results will foster that sort of discussion. Please do feel free to reply in the comments:

  • How do you measure ROI on your SEO efforts?
  • How do you show your campaigns’ value?
  • What would you change about how you’re currently measuring the success of your efforts?

So what’s next?

We’d love to hear about your experiences, in-house or agency, and how you’ve been able to demonstrate ROI on your campaigns.

We’re going to repeat this survey again next year, so stay tuned. We hope to survey a larger audience so that we can break down the groups we examine further and analyze response trends among the resulting subgroups. We wanted to do this here in this round of analysis, but were hesitant because of how small the resulting sample size would be.



Moz Blog

Posted in Latest News | Comments Off

Google (Almost Certainly) Has an Organic Quality Score (Or Something a Lot Like It) that SEOs Need to Optimize For – Whiteboard Friday

Posted by randfish

Entertain the idea, for a moment, that Google assigned a quality score to organic search results. Say it was based off of click data and engagement metrics, and that it would function in a similar way to the Google AdWords quality score. How exactly might such a score work, what would it be based off of, and how could you optimize for it?

While there’s no hard proof it exists, the organic quality score is a concept that’s been pondered by many SEOs over the years. In today’s Whiteboard Friday, Rand examines this theory inside and out, then offers some advice on how one might boost such a score.

Google's Organic Quality Score

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about organic quality score.

So this is a concept. This is not a real thing that we know Google definitely has. But there’s this concept that SEOs have been feeling for a long time, that similar to what Google has in their AdWords program with a paid quality score, where a page has a certain score assigned to it, that on the organic side Google almost definitely has something similar. I’ll give you an example of how that might work.

So, for example, if on my site.com I have these three — this is a very simplistic website — but I have these three subfolders: Products, Blog, and About. I might have a page in my products, 14axq.html, and it has certain metrics that Google associates with it through activity that they’ve seen from browser data, from clickstream data, from search data, and from visit data from the searches and bounces back to the search results, and all these kinds of things, all the engagement and click data that we’ve been talking about a lot this year on Whiteboard Friday.

So they may have these metrics, pogo stick rate and bounce rate and a deep click rate (the rate with which someone clicks to the site and then goes further in from that page), the time that they spend on the site on average, the direct navigations that people make to it each month through their browsers, the search impressions and search clicks, perhaps a bunch of other statistics, like whether people search directly for this URL, whether they perform branded searches. What rate do unique devices in one area versus another area do this with? Is there a bias based on geography or device type or personalization or all these kinds of things?

But regardless of that, you get this idea that Google has this sort of sense of how the page performs in their search results. That might be very different across different pages and obviously very different across different sites. So maybe this blog post over here on /blog is doing much, much better in all these metrics and has a much higher quality score as a result.
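
Purely as a thought experiment (this is an invented toy formula to make the idea tangible, not anything Google has confirmed or that Rand is claiming exists in this form), such a page-level score might blend the metrics above roughly like this:

  interface PageMetrics {
    pogoStickRate: number; // share of searchers who bounce straight back to the SERP
    deepClickRate: number; // share of visitors who click further into the site
    avgTimeOnSite: number; // seconds
    directNavigations: number; // monthly direct visits to the URL
    searchImpressions: number;
    searchClicks: number;
  }

  // Toy scoring: reward click-through and engagement, penalize pogo-sticking.
  // The weights are arbitrary; the point is only that many signals blend into one page-level number.
  function toyOrganicQualityScore(m: PageMetrics): number {
    const ctr = m.searchImpressions > 0 ? m.searchClicks / m.searchImpressions : 0;
    const engagement = Math.min(m.avgTimeOnSite / 180, 1); // cap at three minutes
    const brandPull = Math.min(m.directNavigations / 1000, 1); // cap at 1,000 direct visits
    return (
      0.3 * ctr +
      0.25 * m.deepClickRate +
      0.2 * engagement +
      0.15 * brandPull +
      0.1 * (1 - m.pogoStickRate)
    ); // 0 to 1, higher is "better"
  }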

Current SEO theories about organic quality scoring:

Now, when we talk to SEOs, and I spend a lot of time talking to my fellow SEOs about theories around this, a few things emerge. I think most folks are generally of the opinion that if there is something like an organic quality score…

1. It is probably based on this type of data — queries, clicks, engagements, visit data of some kind.

We don’t doubt for a minute that Google has much more sophistication than the super-simplified stuff that I’m showing you here. I think Google publicly denies a lot of single types of metric like, “No, we don’t use time on site. Time on site could be very variable, and sometimes low time on site is actually a good thing.” Fine. But there’s something in there, right? They use some more sophisticated format of that.

2. We also are pretty sure that this is applying on three different levels:

This is an observation from experimentation as well as from Google statements which is…

  • Domain-wide, so that would be across one domain, if there are many pages with high quality scores, Google might view that domain differently from a domain with a variety of quality scores on it or one with generally low ones.
  • Same thing for a subdomain. So it could be that a subdomain is looked at differently than the main domain, or that two different subdomains may be viewed differently. If content appears to have high quality scores on this one, but not on this one, Google might generally not pass all the ranking signals or give the same weight to the quality scores over here or to the subdomain over here.
  • Same thing is true with subfolders, although to a lesser extent. In fact, this is kind of in descending order. So you can generally surmise that Google will pass these more across subfolders than they will across subdomains and more across subdomains than across root domains.

3. A higher density of good scores to bad ones can mean a bunch of good things:

  • More rankings in visibility even without other signals. So even if a page is sort of lacking in these other quality signals, if it is in this blog section, this blog section tends to have high quality scores for all the pages, Google might give that page an opportunity to rank well that it wouldn’t ordinarily for a page with those ranking signals in another subfolder or on another subdomain or on another website entirely.
  • Some sort of what we might call “benefit of the doubt”-type of boost, even for new pages. So a new page is produced. It doesn’t yet have any quality signals associated with it, but it does particularly well.

    As an example, within a few minutes of this Whiteboard Friday being published on Moz’s website, which is usually late Thursday night or very early Friday morning, at least Pacific time, I will bet that you can search for “Google organic quality score” or even just “organic quality score” in Google’s engine, and this Whiteboard Friday will perform very well. One of the reasons that probably is, is because many other Whiteboard Friday videos, which are in this same subfolder, Google has seen them perform very well in the search results. They have whatever you want to call it — great metrics, a high organic quality score — and because of that, this Whiteboard Friday that you’re watching right now, the URL that you see in the bar up above is almost definitely going to be ranking well, possibly in that number one position, even though it’s brand new. It hasn’t yet earned the quality signals, but Google assumes, it gives it the benefit of the doubt because of where it is.

  • We surmise that there’s also more value that gets passed from links, both internal and external, from pages with high quality scores. That is right now a guess, but something we hope to validate more, because we’ve seen some signs and some testing that that’s the case.

3 ways to boost your organic quality score

If this is true — and it’s up to you whether you want to believe that it is or not — even if you don’t believe it, you’ve almost certainly seen signs that something like it’s going on. I would urge you to do these three things to boost your organic quality score or whatever you believe is causing these same elements.

1. You could add more high-performing pages. So if you know that pages perform well and you know what those look like versus ones that perform poorly, you can make more good ones.

2. You can improve the quality score of existing pages. So if this one is kind of low, you’re seeing that these engagement and use metrics, the SERP click-through rate metrics, the bounce rate metrics from organic search visits, all of these don’t look so good in comparison to your other stuff, you can boost it, improve the content, improve the navigation, improve the usability and the user experience of the page, the load time, the visuals, whatever you’ve got there to hold searchers’ attention longer, to keep them engaged, and to make sure that you’re solving their problem. When you do that, you will get higher quality scores.

3. Remove low-performing pages through a variety of means. You could take a low-performing page and you might say, “Hey, I’m going to redirect that to this other page, which does a better job answering the query anyway.” Or, “Hey, I’m going to 404 that page. I don’t need it anymore. In fact, no one needs it anymore.” Or, “I’m going to no index it. Some people may need it, maybe the ones who are visitors to my website, who need it for some particular direct navigation purpose or internal purpose. But Google doesn’t need to see it. Searchers don’t need it. I’m going to use the no index, either in the meta robots tag or in the robots.txt file.”
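
For reference, the meta robots version of that directive is a single tag in the page’s <head>, <meta name="robots" content="noindex">, which leaves the page reachable for the visitors who still need it while asking search engines to drop it from their index.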

One thing that’s really interesting to note is we’ve seen a bunch of case studies, especially since MozCon, when Britney Muller, Moz’s Head of SEO, shared the fact that she had done some great testing around removing tens of thousands of low-quality, really low-quality performing pages from Moz’s own website and seen our rankings and our traffic for the remainder of our content go up quite significantly, even controlling for seasonality and other things.

That was pretty exciting. When we shared that, we got a bunch of other people from the audience and on Twitter saying, “I did the same thing. When I removed low-performing pages, the rest of my site performed better,” which really strongly suggests that there’s something like a system in this fashion that works in this way.

So I’d urge you to go look at your metrics, go find pages that are not performing well, see what you can do about improving them or removing them, see what you can do about adding new ones that are high organic quality score, and let me know your thoughts on this in the comments.

We’ll look forward to seeing you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



Moz Blog

Posted in Latest News | Comments Off
