Tag Archive | "2018"

Martin Luther King Jr. Day 2018 Google doodle honors Dr. King & his dream for a better world

The image was designed by guest artist Cannaday Chapman and created in collaboration with the Black Googlers Network.

The post Martin Luther King Jr. Day 2018 Google doodle honors Dr. King & his dream for a better world appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in Latest News | Comments Off

SearchCap: Smart devices, SEO in 2018 & ad targeting

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Smart devices, SEO in 2018 & ad targeting appeared first on Search Engine Land.




9 Predictions for SEO in 2018

Posted by randfish

For the last decade, I’ve made predictions about how the year in SEO and web marketing would go. So far, my track record is pretty decent — the correct guesses outweigh the wrong ones. But today’s the day of reckoning, to grade my performance from 2017 and, if the tally is high enough, share my list for the year ahead.

In keeping with tradition, my predictions will be graded on the following scale:

  • Nailed It (+2) – When a prediction is right on the money and the primary criteria are fulfilled
  • Partially Accurate (+1) – Predictions that are in the ballpark, but are somewhat different than reality
  • Not Completely Wrong (-1) – Those that got near the truth, but are more “incorrect” than “correct”
  • Way Off (-2) – Guesses which didn’t come close

Breaking even or better means I make new predictions for the year ahead; anything under that means my predicting days are over. Let’s see how this shakes out… I’m not nervous… You’re nervous! This sweat on my brow… It’s because… because it was raining outside. It’s Seattle! Yeesh.

Grading Rand’s 2017 Predictions

#1: Voice search will be more than 25% of all US Google searches within 12 months. Despite this, desktop volume will stay nearly flat and mobile (non-voice) will continue to grow.

+1 – We have data for desktop and mobile search volume via Jumpshot, showing that the former did indeed stay relatively flat and the latter kept growing.

But, unfortunately, we don’t know the percent of searches that are done with voice rather than keyboards or screens. My guess is 25% of all searches is too high, but until Google decides to share an updated number, all we have is the old 2016 stat that 20% of mobile searches happened via voice input.

#2: Google will remain the top referrer of website traffic by 5X+. Neither Facebook, nor any other source, will make a dent.

+2 – Nailed it! Although, to be fair, there’s no serious challenger. The social networks and e-commerce leaders of the web want people to stay on their sites, not leave and go elsewhere. No surprise Google’s the only big traffic referrer left.

#3: The Marketing Technology space will not have much consolidation (fewer exits and acquisitions, by percentage, than 2015 or 2016), but there will be at least one major exit or IPO among the major SEO software providers.

+2 – As best I can tell from Index.co’s thorough database (which, BTW, deserves more attention than Crunchbase, whose data I’ve found to be of far lower quality), Martech as a whole had nearly half the number of acquisitions in 2017 (22) versus 2016 (39). 2017 did, however, see the Yext IPO, so I’m taking full credit on this one.

#4: Google will offer paid search ads in featured snippets, knowledge graph, and/or carousels.

0 – Turns out, Google had actually done a little of this prior to 2017, which I think invalidates the prediction. Thus I’m giving myself no credit either way, though Google did expand their testing and ad types in this direction last year.

#5: Amazon search will have 4% or more of Google’s web search volume by end of year.

-2 – Way off, Rand. From the Jumpshot data, it looks like Amazon’s not even at 1% of Google’s search volume yet. I was either way too early on this one, or Amazon searches may never compete, volume-wise, with how Google’s users employ their search system.

#6: Twitter will remain independent, and remain the most valuable and popular network for publishers and influencers.

+2 – I’m actually shocked that I made this prediction given the upheaval Twitter has faced in the last few years. Still, it’s good to see a real competitor (despite their much smaller size) to Facebook stay independent.

#7: The top 10 mobile apps will remain nearly static for the year ahead, with, at most, one new entrant and 4 or fewer position changes.

+1 – I was slightly aggressive in wording this prediction, though the reality is pretty accurate. The dominance of a few companies in the mobile app world remains unchallenged. Here’s 2016’s top apps, and here’s 2017’s. The only real change was Apple Music and Amazon falling a couple spots and Pandora and Snapchat sneaking into the latter half of the list.

#8: 2017 will be the year Google admits publicly they use engagement data as an input to their ranking systems, not just for training/learning.

-2 – I should have realized Google will continue to use engagement data for rankings, but they’re not gonna talk about it. They have nothing to gain from being open, and a real risk that transparency would invite spammers and manipulators to mimic searchers and click for rankings (a practice that, sadly, has popped up in the gray hat SEO world, and does sometimes, unfortunately, work).

Final Score: +4 — not too shabby, so let’s continue this tradition and see what 2018 holds. I’m going to be a little more cavalier with this year’s predictions, just to keep things exciting :-)
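For the record, the tally checks out; here’s a quick sanity check in Python, with the scores taken straight from the grades assigned above:

```python
# Grades assigned to each 2017 prediction in this post, keyed by prediction
# number. Prediction #4 was scored 0 (no credit either way).
grades_2017 = {1: 1, 2: 2, 3: 2, 4: 0, 5: -2, 6: 2, 7: 1, 8: -2}

final_score = sum(grades_2017.values())
print(final_score)  # 4 -- breakeven or better, so the 2018 predictions are on
```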


Rand’s 9 Predictions for 2018

#1: The total number of organic clicks Google refers will drop by ~5% by the end of the year

In 2017, we saw the start of a concerning trend — fewer clicks being generated by Google search on desktop and mobile. I don’t think that was a blip. In my estimation, Google’s actions around featured snippets, knowledge panels, and better instant answers in the SERPs overall, combined with more aggressive ads and slowing search growth (at least in the United States), will lead to there being slightly less SEO opportunity in 2018 than what we had in 2017.

I don’t think this trend will accelerate much long term (i.e. it’s certainly not the end for SEO, just a time of greater competition for slightly fewer click opportunities).

#2: Twitter and LinkedIn will both take active steps to reduce the amount of traffic they refer out to other sites

Facebook, Instagram, and Snapchat have all had success algorithmically or structurally limiting clicks off their platforms and growing as a result. I think in 2018, Twitter and LinkedIn are gonna take their own steps to keep link-bearing content from doing as well, to limit the visibility of external links on their platforms, and to better reward content that keeps people on their sites.

#3: One or more major SEO software providers will shutter as a result of increased pressure from Google and heavy competition

Google Search Console is, slowly but surely, getting better. Google’s getting a lot more aggressive about making rank tracking more difficult (some rank tracking folks I’m friendly with told me that Q4 2017 was particularly gut-punching), and the SEO software field is way, way more densely packed with competitors than ever before. I estimate at least ten SEO software firms are over $10 million US in annual revenue (DeepCrawl, SEMrush, Majestic, Ahrefs, Conductor, BrightEdge, SISTRIX, GinzaMetrics, SEOClarity, and Moz), and I’m probably underestimating at least 4 or 5 others (in local SEO, Yext is obviously huge, and 3–4 of their competitors are also above $10mm).

I predict this combination of factors will mean that 2018 sees one or more casualties (possibly through a less-than-rewarding acquisition rather than straight-out bankruptcy) in the SEO software space.

#4: Alexa will start to take market share away from Google, especially via devices with screens like the Echo Show

Voice search devices are useful, but somewhat limited by virtue of missing a screen. The Echo Show was the first stab at solving this, and I think in 2018 we’re going to see more and better devices as well as vastly better functionality. Even just the “Alexa, show me a photo of Rodney Dangerfield from 1965.” (see, Rand, I told you he used to be handsome!) will take away a lot of the more simplistic searches that today happen on Google and Google Images (the latter of which is a silent giant in the US search world).

#5: One of the non-Google tech giants will start on a more serious competitor to YouTube

Amazon’s feud with Google and the resulting loss of YouTube on certain devices isn’t going unnoticed in major tech company discussions. I think in 2018, that turns into a full-blown decision to invest in a competitor to the hosted video platform. There’s too much money, time, attention, and opportunity for some of the big players not to at least dip a toe in the water.

Side note: If I were an investor, I’d be pouring meetings and dollars into startups that might become this. I think acquisitions are a key way for a Facebook, an Amazon, or a Microsoft to reduce their risk here.

#6: Facebook Audience Network (that lets publishers run FB ads on their own sites) will get the investment it needs and become a serious website adtech player

Facebook ads on the web should be as big or bigger than anything Google does in this realm, mostly because the web functions more like Facebook than it does like search results pages, and FB’s got the data to make those ads high quality and relevant. Unfortunately, they’ve underinvested in Audience Network the last couple years, but I think with Facebook usage in developed countries leveling out and the company seeking ways to grow their ad reach and effectiveness, it’s time.

#7: Mobile apps will fade as the default for how brands, organizations, and startups of all sizes invest in the mobile web; PWAs and mobile-first websites will largely take their place

I’m calling it. Mobile apps, for 95% of companies and organizations who want to do well on the web, are the wrong decision. Not only that, most everyone now realizes and agrees on it. PWAs (and straightforward mobile websites) are there to pick up the slack. That’s not to say the app stores won’t continue to generate downloads or make money — they will. But those installs and dollars will flow to a small number of apps and app developers at the very top of the charts, while the long tail of apps, which never really took off, fades into obscurity.

Side note: games are probably an exception (though even there, Nintendo Switch proved in 2017 that mobile isn’t the only or best platform for games).

#8: WordPress will continue its dominance over all other CMSes, growing its use from ~25% to 35%+ of the top few million sites on the web

While it depends what you consider “the web” to be, there’s no doubt WordPress has dominated every other CMS in the market among the most popular few million sites on it. I think 2018 will be a year when WordPress extends their lead, mostly because they’re getting more aggressive about investments in growth and marketing, and secondarily because no one is stepping up to be a suitable (free) alternative.

35%+ might sound like a bold step, but I’m seeing more and more folks moving off of other platforms for a host of reasons, and migrating to WordPress for its flexibility, its cost structure, its extensibility, and its strong ecosystem of plugins, hosting providers, security options, and developers.

#9: The United States will start to feel the pain of net neutrality’s end with worse Internet connectivity, more limitations, and a less free-and-open web

Tragically, we lost the battle to maintain Title II protections on net neutrality here in the US, and the news is a steady drumbeat of awfulness around this topic. Just recently, Trump’s FCC announced that they’d be treating far slower connections as “broadband,” thus lessening requirements for what’s considered “penetration” and “access,” all the way down to mobile connection speeds.

It’s hard to notice what this means right now, but by the end of 2018, I predict we’ll be feeling the pain through even slower average speeds and restrictions on web usage (like what we saw before Title II protections, when Verizon and T-Mobile blocked services and favored sites). In fact, my guess is that some enterprising ISP is gonna try to block cryptocurrency mining, trading, or usage as an early step.

Over time, I suspect this will lead to a tiered Internet access world here in the US, where the top 10% of American earners (and those in a few cities and states that implement their own net neutrality laws) have vastly better and freer access (probably with more competitive pricing, too).


Now it’s time for your feedback! I want to know:

  1. Which of these predictions do you find most likely?
  2. Which do you find most outlandish?
  3. What obvious predictions do you think I’ve shamefully missed? ;-)

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


How to Troubleshoot Local Ranking Failures [Updated for 2018]

Posted by MiriamEllis

I love a mystery… especially a local search ranking mystery I can solve for someone.

Now, the truth is, some ranking puzzles are so complex, they can only be solved by a formal competitive audit. But there are many others that can be cleared up by spending 15 minutes or less going through an organized 10-point checklist of the commonest problems that can cause a business to rank lower than the owner thinks it should. By zipping through the following checklist, there’s a good chance you’ll be able to find one or more obvious “whodunits” contributing to poor Google local pack visibility for a given search.

Since I wrote the original version of this post in 2014, so much has changed. Branding, tools, tactics — things are really different in 2018. Definitely time for a complete overhaul, with the goal of making you a super sleuth for your forum friends, clients, agency teammates, or executive superiors.

Let’s emulate the Stratemeyer Syndicate, which earned lasting fame by hitting on a simple formula for surfacing and solving mysteries in a most enjoyable way.

Before we break out our magnifying glass, it’s critical to stress one very important thing. The local rankings I see from an office in North Beach, San Francisco are not the rankings you see while roaming around Golden Gate park in the same city. The rankings your client in Des Moines sees for things in his town are not the same rankings you see from your apartment in Albuquerque when you look at Des Moines results. With the user having become the centroid of search for true local searches, it is no mystery at all that we see different results when we are different places, and it is no cause for concern.

And now that we’ve gotten that out of the way and are in the proper detective spirit, let’s dive into how to solve for each item on our checklist!


☑ Google updates/bugs

The first thing to ask if a business experiences a sudden change in rankings is whether Google has done something. Search Engine Land strikes me as the fastest reporter of Google updates, with MozCast offering an ongoing weather report of changes in the SERPs. Also, check out the Moz Google Algo Change history list and the Moz Blog for some of the most in-depth strategic coverage of updates, penalties, and filters.

For local-specific bugs (or even just suspected tests), check out the Local Search Forum, the Google My Business forum, and Mike Blumenthal’s blog. See if the effects being described match the weirdness you are seeing in your local packs. If so, it’s a matter of fixing a problematic practice (like iffy link building) that has been caught in an update, waiting to see how the update plays out, or waiting for Google to fix a bug or turn a dial down to normalize results.

Pro tip: Don’t make the mistake of thinking organic updates have nothing to do with local SEO. Crack detectives know organic and local are closely connected.

☑ Eligibility to list and rank

When a business owner wants to know why he isn’t ranking well locally, always ask these four questions:

  1. Does the business have a real address? (Not a PO box, virtual office, or a string of employees’ houses!)
  2. Does the business make face-to-face contact with its customers?
  3. What city is the business in?
  4. What is the exact keyword phrase they are hoping to rank for?

If the answer is “no” to either of the first two questions, the business isn’t eligible for a Google My Business listing. And while spam does flow through Google, a lack of eligibility could well be the key to a lack of rankings.

For the third question, you need to know the city the business is in so that you can see if it’s likely to rank for the search phrase cited in the fourth question. For example, a plumber with a street address in Sugar Land, TX should not expect to rank for “plumber Dallas TX.” If a business lacks a physical location in a given city, it’s atypical for it to rank for queries that stem from or relate to that locale. It’s amazing just how often this simple fact solves local pack mysteries.

☑ Guideline spam

To be an ace local sleuth, you must commit to memory the guidelines for representing your business on Google so that you can quickly spot violations. Common acts of spam include:

  • Keyword stuffing the business name field
  • Improper wording of the business name field
  • Creating listings for ineligible locations, departments, or people
  • Category spam
  • Incorrect phone number implementation
  • Incorrect website URL implementation
  • Review guideline violations

If any of the above conundrums are new to you, definitely spend 10 minutes reading the guidelines. Make flash cards, if necessary, to test yourself on your spam awareness until you can instantly detect glaring errors. With this enhanced perception, you’ll be able to see problems that may possibly be leading to lowered rankings, or even… suspensions!

☑ Suspensions

There are two key things to look for here when a local business owner comes to you with a ranking woe:

  1. If the listing was formerly verified, but has mysteriously become unverified, you should suspect a soft suspension. Soft suspensions might occur around something like a report of keyword-stuffing the GMB business name field. Oddly, however, there is little anecdotal evidence to support the idea that soft suspensions cause ranking drops. Nevertheless, it’s important to spot the un-verification clue and tell the owner to stop breaking guidelines. It’s possible that the listing may lose reviews or images during this type of suspension, but in most cases, the owner should be able to re-verify his listing. Just remember: a soft suspension is not a likely cause of low local pack rankings.
  2. If the listing’s rankings totally disappear and you can’t even find the listing via a branded search, it’s time to suspect a hard suspension. Hard suspensions can result from a listing falling afoul of a Google guideline or new update, a Google employee, or just a member of the public who has reported the business for something like an ineligible location. If the hard suspension is deserved, as in the case of creating a listing at a fake address, then there’s nothing you can do about it. But, if a hard suspension results from a mistake, I recommend taking it to the Google My Business forum to plead for help. Be prepared to prove that you are 100% guideline-compliant and eligible in hopes of getting your listing reinstated with its authority and reviews intact.

☑ Duplicates

Notorious for their ability to divide ranking strength, duplicate listings are at their worst when there is more than one verified listing representing a single entity. If you encounter a business that seems like it should be ranking better than it is for a given search, always check for duplicates.

The quickest way to do this is to get all present and past NAP (name, address, phone) from the business and plug it into the free Moz Check Listing tool. Pay particular attention to any GMB duplicates the tool surfaces. Then:

  1. If the entity is a brick-and-mortar business or service area business, and the NAP exactly matches between the duplicates, contact Google to ask them to merge the listings. If the NAP doesn’t match and represents a typo or error on the duplicate, use the “suggest an edit” link in Google Maps, set the “yes/no” toggle to “yes,” and then select the radio button for “never existed.”
  2. If the duplicates represent partners in a multi-practitioner business, Google won’t simply delete them. Things get quite complicated in this scenario, and if you discover practitioner duplicates, tread carefully. There are half a dozen nuances here, including whether you’re dealing with actual duplicates, whether they represent current or past staffers, whether they are claimed or unclaimed, and even whether a past partner is deceased. There isn’t perfect industry agreement on the handling of all of the ins-and-outs of practitioner listings. Given this, I would advise an affected business to research the current industry guidance thoroughly before making a move in any direction.
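At its core, duplicate detection is a comparison of normalized NAP records. Here’s a minimal sketch of that idea in Python; the normalization rules are illustrative assumptions, not the actual logic of Check Listing or any other tool:

```python
import re

def normalize_nap(name, address, phone):
    """Normalize a NAP (name, address, phone) record for comparison."""
    scrub = lambda s: re.sub(r"[^a-z0-9]", "", s.lower())  # drop case/punctuation
    digits = re.sub(r"\D", "", phone)[-10:]  # keep the last 10 digits of the phone
    return (scrub(name), scrub(address), digits)

def is_duplicate(record_a, record_b):
    """Two listings whose normalized NAP matches are likely duplicates."""
    return normalize_nap(*record_a) == normalize_nap(*record_b)

# Cosmetic differences in case, punctuation, and phone formatting still match:
print(is_duplicate(
    ("Acme Plumbing", "123 Main St.", "(555) 867-5309"),
    ("ACME Plumbing", "123 Main St", "555-867-5309"),
))  # True
```

Real-world matching is fuzzier than this (suite numbers, “St” vs. “Street,” tracking phone numbers), which is exactly why purpose-built tools are worth using.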

☑ Missing/inaccurate listings

While you’ve got Moz Check Listing fired up, pay attention to anything it tells you about missing or inaccurate listings. The tool will show you how accurate and complete your listings are on the major local business data aggregators, plus other important platforms like Google My Business, Facebook, Factual, Yelp, and more. Why does this matter?

  1. Google can pull information from anywhere on the web and plunk it into your Google My Business listing.
  2. While no one can quantify the exact degree to which citation/listing consistency directly impacts Google local rankings for every possible search query, it has been a top 5 ranking factor in the annual Local Search Ranking Factors survey as far back as I can remember. Recently, I’ve seen some industry discussion as to whether citations still matter, with some practitioners claiming they can’t see the difference they make. I believe that conclusion may stem from working mainly in ultra-competitive markets where everyone has already got their citations in near-perfect order, forcing practitioners to look for differentiation tactics beyond the basics. But without those basics, you’re missing table stakes in the game.
  3. Indirectly, listing absence or inconsistency impacts local rankings in that it undermines the quest for good local KPIs as well as organic authority. Every lost or misdirected consumer represents a failure to have someone click-for-directions, click-to-call, click-to-your website, or find your website at all. Online and offline traffic, conversions, reputation, and even organic authority all hang in the balance of active citation management.

☑ Lack of organic authority

Full website or competitive audits are not the work of a minute. They really take time, and deep delving. But, at a glance, you can access some quick metrics to let you know whether a business’ lack of achievement on the organic side of things could be holding them back in the local packs. Get yourself the free MozBar SEO toolbar and try this:

  1. Turn the MozBar on by clicking the little “M” at the top of your browser so that it is blue.
  2. Perform your search and look at the first few pages of the organic results, ignoring anything from major directory sites like Yelp (they aren’t competing with you for local pack rankings, eh?).
  3. Note down the Page Authority, Domain Authority, and link counts for each of the businesses coming up on the first 3 pages of the organic results.
  4. Finally, bring up the website of the business you’re investigating. If you see that the top competitors have Domain Authorities of 50 and links numbering in the hundreds or thousands, whereas your target site is well below in these metrics, chances are good that organic authority is playing a strong role in lack of local search visibility. How do we know this is true? Do some local searches and note just how often the businesses that make it into the 3-pack or the top of the local finder view have correlating high organic rankings.
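Steps 3 and 4 above boil down to a simple gap check. A rough sketch, with made-up metric values standing in for the numbers you’d note down from the MozBar (the 60% threshold is an arbitrary illustration, not a Moz rule):

```python
# Hypothetical metrics noted from the MozBar: (Domain Authority, linking domains)
competitors = {
    "competitor-a.com": (52, 340),
    "competitor-b.com": (48, 210),
    "competitor-c.com": (55, 1200),
}
target = ("target-business.com", (21, 15))

# Average DA of the sites actually ranking on the first few organic pages
avg_da = sum(da for da, _ in competitors.values()) / len(competitors)

name, (da, links) = target
if da < avg_da * 0.6:  # well below the competitive baseline
    print(f"{name}: DA {da} vs. competitor average {avg_da:.0f} -- "
          "organic authority is likely holding back local pack visibility")
```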

Where organic authority is poor, a business has a big job of work ahead. They need to focus on content dev + link building + social outreach to begin building up their brand in the minds of consumers and the “RankBrain” of Google.

One other element needs to be mentioned here, and that’s the concept of how time affects authority. When you’re talking to a business with a ranking problem, it’s very important to ascertain whether they just launched their website or just built their local business listings last week, or even just a few months ago. Typically, if they have, the fruits of their efforts have yet to fully materialize. That being said, it’s not a given that a new business will have little authority. Large brands have marketing departments which exist solely to build tremendous awareness of new assets before they even launch. It’s important to keep that in mind, while also realizing that if the business is smaller, building authority will likely represent a longer haul.

☑ Possum effect

Where local rankings are absent, always ask:

“Are there any other businesses in your building or even on your street that share your Google category?”

If the answer is “yes,” search for the business’ desired keyword phrase and look at the local finder view in Google Maps. Note which companies are ranking. Then begin to zoom in on the map, level by level, noting changes in the local finder as you go. If, a few levels in, the business you’re advising suddenly appears on the map and in the local finder, chances are good it’s the Possum filter that’s causing their apparent invisibility at the automatic zoom level.

Google Possum rolled out in September 2016, and its observable effects included a geographic diversification of the local results, filtering out many listings that share a category and are in close proximity to one another. Then, about one year later, Google initiated the Hawk update, which appears to have tightened the radius of Possum, with the result that while many businesses in the same building are still being filtered out, a number of nearby neighbors have reappeared at the automatic zoom level of the results.

If your sleuthing turns up a brand that is being impacted by Possum/Hawk, the only surefire way to beat the filter is to put in the necessary work to become the most authoritative answer for the desired search phrase. It’s important to remember that filters are the norm in Google’s local results, and have long been observed impacting listings that share an address, share a phone number, etc. If it’s vital for a particular listing to outrank all others that possess shared characteristics, then authority must be built around it in every possible way to make it one of the most dominant results.

☑ Local Service Ads effect

The question you ask here is:

“Is yours a service-area business?”

And if the answer is “yes,” then brace yourself for ongoing results disruption in the coming year.

Google’s Local Service Ads (formerly Home Service Ads) make Google the middleman between consumers and service providers, and in the 2+ years since early testing began, they’ve caused some pretty startling things to happen to local search results.

Suffice it to say, rollout to an ever-increasing number of cities and categories hasn’t been for the faint of heart, and I would hazard a guess that Google’s recent re-brand of this program signifies their intention to move beyond the traditional SAB market. One possible benefit of Google getting into this type of lead gen is that it could decrease spam, but I’m not sold on this, given that fake locations have ended up qualifying for LSA inclusion. While I honor Google’s need to be profitable, I share some of the qualms business owners have expressed about the potential impacts of this venture.

Since I can’t offer a solid prediction of what precise form these impacts will take in the coming months, the best I can do here is to recommend that if an SAB experiences a ranking change/loss, the first thing to look for is whether LSA has come to town. If so, alteration of the SERPs may be unavoidable, and the only strategy left for overcoming vanished visibility may be to pay for it… by qualifying for the program.

☑ GMB neglect

Sometimes, a lack of competitive rankings can simply be chalked up to a lack of effort. If a business wonders why they’re not doing better in the local packs, pull up their GMB listing and do a quick evaluation of:

  • Verification status – While you can rank without verifying, lack of verification is a hallmark of listing neglect.
  • Basic accuracy – If NAP or map markers are incorrect, it’s a sure sign of neglect.
  • Category choices – Wrong categories make right rankings impossible.
  • Image optimization – Every business needs a good set of the most professional, persuasive photos it can acquire, and should even consider periodic new photo shoots for seasonal freshness; imagery impacts KPIs, which are believed to impact rank.
  • Review count, sentiment and management – Too few reviews, low ratings, and lack of responses = utter neglect of this core rank/reputation-driver.
  • Hours of operation – If they’re blank or incorrect, conversions are being missed.
  • Main URL choice – Does the GMB listing point to a strong, authoritative website page or a weak one?
  • Additional URL choices – If menus, bookings, reservations, or placing orders is part of the business model, a variety of optional URLs are supported by Google and should be explored.
  • Google Posts – Early-days testing indicates that regular posting may impact rank.
  • Google Questions and Answers – Pre-populate with best FAQs and actively manage incoming questions.

There is literally no business, large or small, with a local footprint that can afford to neglect its Google My Business listing. And while some fixes and practices move the ranking needle more than others, the increasing number of consumer actions that take place within Google is reason enough to put active GMB management at the top of your list.


Closing the case

The Hardy Boys never went anywhere without their handy kit of detection tools. Their father was so confident in their utter preparedness that he even let them chase down gangs in Hong Kong and dictators in the Guyanas (which, on second thought, doesn’t seem terribly wise). But I have that kind of confidence in you. I hope my troubleshooting checklist is one you’ll bookmark and share to be prepared for the local ranking mysteries awaiting you and your digital marketing colleagues in 2018. Happy sleuthing!



Moz Blog


2018 Search Engine Land Awards open now

Exciting news! The 4th annual awards ceremony & celebration will be held in Seattle, WA during SMX Advanced and the entire industry is invited to participate!

The post 2018 Search Engine Land Awards open now appeared first on Search Engine Land.




Three Killer Skills Professional Writers Need to Succeed in 2018

What brought you here today? What are you hoping to learn, be, become, do, or change by reading Copyblogger? We’ll be asking that question a lot in the coming year, but while we wait (feel free to answer in the comments below — we’d love to hear it), allow us to talk about why we
Read More…

The post Three Killer Skills Professional Writers Need to Succeed in 2018 appeared first on Copyblogger.


Copyblogger


New Year’s Day 2018 Google doodle brings in the new year with a bright sunrise

The illustrated image includes the penguins featured in all of Google’s holiday doodle series back home on a snow-filled landscape.

The post New Year’s Day 2018 Google doodle brings in the new year with a bright sunrise appeared first on Search Engine Land.




How to Rank in 2018: The SEO Checklist – Whiteboard Friday

Posted by randfish

It’s hard enough as it is to explain to non-SEOs how to rank a webpage. In an increasingly complicated field, to do well you’ve got to have a good handle on a wide variety of detailed subjects. This edition of Whiteboard Friday covers a nine-point checklist of the major items you’ve got to cross off to rank in the new year — and maybe get some hints on how to explain it to others, too.

How to Rank in 2018: An SEO Checklist

Click on the whiteboard image above to open a high-resolution version in a new tab!


Video Transcription

Howdy, Moz fans, and welcome to a special New Year’s edition of Whiteboard Friday. This week we’re going to run through how to rank in 2018 in a brief checklist format.

So I know that many of you sometimes wonder, “Gosh, it feels overwhelming to try and explain to someone outside the SEO profession how to get a web page ranked.” Well, you know what? Let’s explore that a little bit this week on Whiteboard Friday. I sent out a tweet asking folks, “Send me a brief checklist in 280 characters or less,” and I got back some amazing responses. I have credited some folks here when they’ve contributed. There is a ton of detail to ranking in the SEO world, to try and rank in Google’s results. But when we pull out, when we go broad, I think that just a few items, in fact just the nine we’ve got here can basically take you through the majority of what’s required to rank in the year ahead. So let’s dive into that.

I. Crawlable, accessible URL whose content Google can easily crawl and parse.

So we want Googlebot’s spiders to be able to come to this page, to understand the content that’s on there in a text readable format, to understand images and visuals or video or embeds or anything else that you’ve got on the page in a way that they are going to be able to put into their web index. That is crucial. Without it, none of the rest of this stuff even matters.

II. Keyword research

We need to know and to uncover the words and phrases that searchers are actually using to solve or to get answers to the problem that they are having in your world. Those should be problems that your organization, your website is actually working to solve, that your content will help them to solve.

What you want here is a primary keyword and hopefully a set of related secondary keywords that share the searcher’s intent. So the intent behind all of these terms and phrases should be the same so that the same content can serve it. When you do that, we now have a primary and a secondary set of keywords that we can target in our optimization efforts.

III. Investigate the SERP to find what Google believes to be relevant to the keyword’s searches

I want you to do some SERP investigation, meaning perform a search query in Google, see what comes back to you, and then figure out from there what Google believes to be relevant to the keyword’s searches. What does Google think is the content that will answer this searcher’s query? You’re trying to figure out intent, the type of content that’s required, and whatever missing pieces might be there. If you can find holes where, hey, no one is serving this, but I know that people want the answer to it, you might be able to fill that gap and take over that ranking position. Thanks to Gaetano, @gaetano_nyc, for the great suggestion on this one.

IV. Have the most credible, amplifiable person or team available create content that’s going to serve the searcher’s goal and solve their task better than anyone else on page one.

There are three elements here. First, we want an actually credible, worthy of amplification person or persons to create the content. Why is that? Well, because if we do that, we make amplification, we make link building, we make social sharing way more likely to happen, and our content becomes more credible, both in the eyes of searchers and visitors as well as in Google’s eyes too. So to the degree that that is possible, I would certainly urge you to do it.

Next, we’re trying to serve the searcher’s goal and solve their task, and we want to do that better than anyone else does it on page one, because if we don’t, even if we’ve optimized a lot of these other things, over time Google will realize, you know what? Searchers are frustrated with your result compared to other results, and they’re going to rank those other people higher. Huge credit to Dan Kern, @kernmedia on Twitter, for the great suggestion on this one.

V. Craft a compelling title, meta description.

Yes, Google still does use the meta description quite frequently. I know it seems like sometimes they don’t. But, in fact, there’s a high percentage of the time when the actual meta description from the page is used. There’s an even higher percentage where the title is used. The URL, while Google sometimes truncates it, is also used in the snippet, along with other elements. We’ll talk about schema and other kinds of markup later on. But the snippet is something that is crucial to your SEO efforts, because that determines how it displays in the search result. How Google displays your result determines whether people want to click on your listing or someone else’s. The snippet is your opportunity to say, “Come click me instead of those other guys.” If you can optimize this, both from a keyword perspective using the words and phrases that people want, as well as from a relevancy and a pure drawing the click perspective, you can really win.
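To make this concrete (an illustrative sketch, not from the transcript), a simple pre-flight check can flag snippet copy that risks truncation. The length limits here are assumptions — roughly 60 characters for titles and 300 for meta descriptions are commonly cited figures — and Google may rewrite or truncate snippets regardless:

```python
# Hypothetical helper: warn about title/meta description copy that may
# be truncated or missing. Limits are assumed, not official values.
def snippet_warnings(title, meta_description, title_limit=60, meta_limit=300):
    warnings = []
    if not meta_description:
        warnings.append("empty meta description: Google will write its own")
    if len(title) > title_limit:
        warnings.append("title may be truncated (%d chars)" % len(title))
    if len(meta_description) > meta_limit:
        warnings.append("meta description may be truncated (%d chars)" % len(meta_description))
    return warnings

print(snippet_warnings("How to Rank in 2018: The SEO Checklist", "x" * 320))
```

A check like this catches mechanical problems only; crafting copy that actually draws the click is still a human job.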

VI. Intelligently employ those primary, secondary, and related keywords

Related keywords meaning those that are semantically connected that Google is going to view as critical to proving to them that your content is relevant to the searcher’s query — in the page’s text content. Why am I saying text content here? Because if you put it purely in visuals or in video or some other embeddable format that Google can’t necessarily easily parse out, eeh, they might not count it. They might not treat it as that’s actually content on the page, and you need to prove to Google that you have the relevant keywords on the page.
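A crude way to audit this (illustrative only — simple substring matching is no substitute for real semantic relevance, and the sample text is invented):

```python
# Hypothetical audit: which target keywords never appear in the page's
# visible text content? Case-insensitive substring match only.
def missing_keywords(page_text, keywords):
    text = page_text.lower()
    return [kw for kw in keywords if kw.lower() not in text]

body = "Our guide covers non compete agreements and employment contracts."
print(missing_keywords(body, ["non compete agreement", "severance"]))  # → ['severance']
```

A result like this would tell you which terms still need to be worked into the crawlable text, rather than living only in images or video.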

VII. Where relevant and possible, use rich snippets and schema markup to enhance the potential visibility that you’re going to get.

This is not possible for everyone. But in some cases, in the case that you’re getting into Google News, or in the case that you’re in the recipe world and you can get visuals and images, or in the case where you have a featured snippet opportunity and you can get the visual for that featured snippet along with that credit, or in the case where you can get rich snippets around travel or around flights, other verticals that schema is supporting right now, well, that’s great. You should take advantage of those opportunities.
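As one hedged illustration (the values are invented, and the schema.org Recipe type is just one of the supported verticals mentioned above), structured data is typically embedded in the page as JSON-LD:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Classic Banana Bread",
  "image": "https://example.com/banana-bread.jpg",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "prepTime": "PT15M",
  "cookTime": "PT1H",
  "recipeIngredient": ["3 ripe bananas", "2 cups flour"]
}
</script>
```

Which properties are required or recommended varies by content type, so it's worth validating markup against Google's structured data documentation before shipping it.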

VIII. Optimize the page to load as fast as possible and look great.

I mean look great from a visual, UI perspective and look great from a user experience perspective, letting someone go all the way through and accomplish their task in an easy, fulfilling way on every device, at every speed, and make it secure too. Security is critically important. HTTPS is not the only thing, but it is a big part of what Google cares about right now, and HTTPS was a big focus in 2016 and 2017. It will certainly continue to be a focus for Google in 2018.

IX. You need to have a great answer to the question: Who will help amplify this and why?

When you have that great answer, I mean a specific list of people and publications who are going to help you amplify it, you’ve got to execute to earn solid links and mentions and word of mouth across the web and across social media so that your content can be seen by Google’s crawlers and by human beings, by people as highly relevant and high quality.

You do all this stuff, you’re going to rank very well in 2018. Look forward to your comments, your additions, your contributions, and feel free to look through the tweet thread as well.

Thanks to all of you who contributed via Twitter and to all of you who followed us here at Moz and Whiteboard Friday in 2017. We hope you have a great year ahead. Thanks for watching. Take care.

Video transcription by Speechpad.com


SearchCap: Google images, algorithm updates & top in 2018

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google images, algorithm updates & top in 2018 appeared first on Search Engine Land.




How Long Should Your Meta Description Be? (2018 Edition)

Posted by Dr-Pete

Summary: The end of November saw a spike in the average length of SERP snippets. Across 10K keywords (90K results), we found a definite increase but many oddities, such as video snippets. Our data suggests that many snippets are exceeding 300 characters, and going into 2018 we recommend a new meta description limit of 300 characters.

Back in spring of 2015, we reported that Google search snippets seemed to be breaking the 155-character limit, but our data suggested that these cases were fairly rare. At the end of November, RankRanger’s tools reported a sizable jump in the average search snippet length (to around 230 characters). Anecdotally, we’re seeing many long snippets in the wild, such as this 386-character one on a search for “non compete agreement”:

Search Engine Land was able to get confirmation from Google of a change in how they handle snippets, but we don’t have a lot of details. Is it time to revisit our guidelines on meta description limits heading into 2018? We dug into our daily 10,000-keyword tracking data to find out…

The trouble with averages

In our 10K tracking data for December 15th, which consisted of 89,909 page-one organic results, the average display snippet (stripped of HTML, of course) was 215 characters long, slightly below RankRanger’s numbers, but well above historical trends.

This number is certainly interesting, but it leaves out quite a bit. First of all, the median character length is 186, suggesting that some big numbers are potentially skewing the average. On the other hand, some snippets are very short because their meta descriptions are very short. Take this snippet for Vail.com:

Sure enough, this is Vail.com’s meta description tag (I’m not gonna ask):

Do we really care that a lot of people just write ridiculously short meta descriptions? No, what we really want to know is at what point Google is cutting off long descriptions. So, let’s just look at the snippets that were cut (determined by the “…” at the end). In our data set, this leaves just about 33% (29,664), so we can already see that about two-thirds of descriptions aren’t getting cut off.
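The filtering and binning steps described here can be sketched in a few lines of Python (the sample snippets are invented; the real data set had roughly 90K results):

```python
# Sketch of the analysis: keep only snippets Google cut off (ending in
# "…"), then compute the share cut and a 25-character frequency binning.
from collections import Counter

snippets = [
    "Short description.",
    "A long description that Google truncated somewhere around here…",
    "Another full description that fit within the display limit.",
    "Yet another long description that trails off mid-sentence…",
]

cut = [s for s in snippets if s.endswith("…")]    # the "…" marks a cut
share_cut = len(cut) / len(snippets)
bins = Counter(25 * (len(s) // 25) for s in cut)  # 25-char bins, as in the post

print(f"{share_cut:.0%} of snippets were cut")    # → 50% of snippets were cut
```

The same binning over the full data set is what produces the frequency distribution discussed next.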

Looking at just the descriptions that were cut off, the average jumps all the way up to 292, but let’s look at the frequency distribution of the lengths of just the cut snippets. The graph below shows cut-snippet lengths in bins of 25 (0-25, 25-50, etc.):

We’ve got a clear spike in the 300-325 character range, but why are we seeing descriptions being cut off in the 100-200 character range (and some even below 100 characters? Digging in deeper, we discovered that two things were going on here…

Oddity #1: Video snippets

Spot-checking some of the descriptions cut off in the under-200 character range, we realized that a number of them were video snippets, which seem to have shorter limits:

These snippets seem to generally max out at two lines, and they’re further restricted by the space the video thumbnail occupies. In our data set, a full 88% of video snippets were cut off (ended in “…”). Separating out video, only 32% of organic snippets (removing the video results) were cut off.

Oddity #2: Pre-cut metas

A second oddity was that some meta description tags seem to be pre-truncated (possibly by a CMS). So, the “…” in those cases is an unreliable indicator. Take this snippet, for example:

This clocks in at 150 characters, right around the old limit. Now, let’s look at the meta description:

This Goodreads snippet is being pre-truncated. This was true for almost all of the Goodreads meta descriptions in our data set, and may be a CMS setting or a conscious choice by their SEO team. Either way, it’s not very useful for our current analysis.

So, we attempted to gather all of the original meta description tags to check for pre-truncated data. We were unable to gather data from all sites, and some sites don’t use meta description tags at all, but we were still able to remove some of the noise.

Let’s try this again (…)

So, let’s pull out all of the cut snippets with video thumbnails and the ones where we know the meta description ended in “…”. This cuts us down to 26,766 snippets (about 30% of the original 89,909). Here’s what the frequency distribution of lengths looks like now:

We’ve cleaned up some of the lower end, but it’s not a dramatic difference. We’re still seeing some snippets cut at less than 200 characters. Some of these may be situations where we couldn’t retrieve the original Meta Description tag, but others seem to be legitimate cuts.

The bulk of these snippets are being cut off in the 275–350 character range. In this cleaned-up distribution, we’ve got a mean of 309 characters and a median of 317 characters. There’s still a bit of a tail to the left, so the distribution isn’t quite normal, but it’s clear that the lion’s share of cut-offs is happening in that 300-325 bin.

What about the snippets over 350 characters? It’s hard to see from this graph, but they maxed out at 375 characters. In some cases, Google is appending their own information:

While the entire snippet is 375 characters, the “Jump…” link is added by Google. The rest of the snippet is 315 characters long. Google also adds result counts and dates to the front of some snippets. These characters don’t seem to count against the limit, but it’s a bit hard to tell, because we don’t have a lot of data points.

Do metas even matter?

Before we reveal the new limit, here’s an uncomfortable question — when it seems like Google is rewriting so many snippets, is it worth having meta description tags at all? Across the data set, we were able to successfully capture 70,059 original meta description tags (in many of the remaining cases, the sites simply didn’t define one). Of those, just over one-third (35.9%) were used as-is for display snippets.

Keep in mind, though, that Google truncates some of these and appends extra data to some. In 15.4% of cases, Google used the original meta description tag, but added some text. This number may seem high, but most of these cases were simply Google adding a period to the end of the snippet. Apparently, Google is a stickler for complete sentences. So, now we’re up to 51.3% of cases where either the display snippet perfectly matched the meta description tag or fully contained it.

What about cases where the display snippet used a truncated version of the meta description tag? Just 3.2% of snippets matched this scenario. Putting it all together, we’re up to almost 55% of cases where Google is using all or part of the original meta description tag. This number is probably low, as we’re not counting cases where Google used part of the original meta description but modified it in some way.

It’s interesting to note that, in some cases, Google rewrote a meta description because the original description was too short or not descriptive enough. Take this result, for example:

Now, let’s check out the original meta description tag…

In this case, the original meta description was actually too short for Google’s tastes. Also note that, even though Google created the snippet themselves, they still cut it off with a “…”. This strongly suggests that cutting off a snippet isn’t a sign that Google thinks your description is low quality.

On the flip side, I should note that some very large sites don’t use meta description tags at all, and they seem to fare perfectly well in search results. One notable example is Wikipedia, a site for which defining meta descriptions would be nearly impossible without automation, and any automation would probably fall short of Google’s own capabilities.

I think you should be very careful using Wikipedia as an example of what to do (or what not to do) when it comes to technical SEO, but it seems clear from the data that, in the absence of a meta description tag, Google is perfectly capable of ranking sites and writing their own snippets.

At the end of the day, I think it comes down to control. For critical pages, writing a good meta description is like writing ad copy — there’s real value in crafting that copy to drive interest and clicks. There’s no guarantee Google will use that copy, and that fact can be frustrating, but the odds are still in your favor.

Is the 155 limit dead?

Unless something changes, and given Google’s partial confirmation (light on details though it was), I think it’s safe to experiment with longer meta description tags. Looking at the clean distribution, and just to give it a nice even number, I think 300 characters is a pretty safe bet. Some snippets that length may get cut off, but the potential gain of getting in more information (when needed) offsets that relatively small risk.

That’s not to say you should pad out your meta descriptions just to cash in on more characters. Snippets should be useful and encourage clicks. In part, that means not giving so much away that there’s nothing left to drive the click. If you’re artificially limiting your meta descriptions, though, or if you think more text would be beneficial to search visitors and create interest, then I would definitely experiment with expanding.

Update (December 19): My sincere apologies, but I discovered a substantial error in my original analysis which caused me to exclude many data points from the cut-off analysis. All counts, percentages, and graphs have been updated. The 300-character conclusion did not change based on this re-analysis.

