Tag Archive | "back"

You Need Both of These Skill Sets to Keep Your Audience Coming Back for More

When I’m not performing my typical duties as Rainmaker Digital’s Marketing Technologist, I’m cooking up a storm in my kitchen. Amidst the rhythmic chopping of fresh produce, the clashing of pots and pans, and the roar of boiling water, I realized that my two roles have a lot in common. They both require a balance
Read More…

The post You Need Both of These Skill Sets to Keep Your Audience Coming Back for More appeared first on Copyblogger.


Copyblogger


A Look Back at a Great 2017: 5 Major Moz Product Investments and a Sneak Peek Into 2018

Posted by adamf

It’s hard to believe that 2017 is already past. We entered the year with big ambitions and we’ve made some great strides. As has become tradition, I’ve compiled a rundown of some of the most interesting updates that you may have seen (or missed) this past year. We’ve intentionally focused on significant product updates, but I’ve also shared a little about some newer programs that provide value for customers in different ways.

TL;DR, here are some of the larger and more interesting additions to Moz in 2017:

  1. Keywords by Site: Keyword Explorer adds site-based keyword research and competitive intelligence
  2. Site Crawl V2: Overhauled Site Crawl for better auditing and workflow
  3. Major investments in infrastructure: Better performance and resilience across the Moz toolset
  4. New instructor-led training programs: Targeted classes to level up your SEO knowledge
  5. Customer Success: Custom walkthroughs to help you get the most out of Moz
  6. Bonus! MozPod: Moz’s new free podcast keeps you up to date on the latest industry topics and trends

Big updates

This year and last, we've focused disproportionately on releasing large infrastructural improvements, new datasets, and foundational product updates. We feel these are crucial elements that serve the core needs of SEOs and will fuel frequent improvements and iterations for years to come.

To kick things off, I wanted to share some details about two big updates from 2017.


1) Keywords by Site: Leveling up keyword research and intelligence

Rank tracking provides useful benchmarks and insights for specific, targeted keywords, but you can’t track all of the keywords that are relevant to you. Sometimes you need a broader look at how visible your sites (and your competitors’ sites) are in Google results.

We built Keywords by Site to provide this powerful view into your Google presence. This brand-new dataset in Moz significantly extends Keyword Explorer and improves the quality of results in many other areas throughout Moz Pro. Our US corpus currently includes 40 million Google SERPs updated every two weeks, and allows you to do the following:

See how visible your site is in Google results

This view not only shows how authoritative a site is from a linking perspective, but also shows how prominent a site is in Google search results.

Compare your ranking prominence to your competitors'

Compare up to three sites to get a feel for their relative scale of visibility and keyword ranking overlap. Click on any section in the Venn diagram to view the keywords that fall into that section.

Dig deep: Sort, filter, and find opportunities, then stash them in keyword lists

For example, let's say you're looking to determine which pages or content on your site might only require a little nudge to garner meaningful search visibility and traffic. Run a report for your site in Keyword Explorer, then use the filters to quickly home in on these opportunities.

Our focus on data quality

We've made a few decisions to help ensure the freshness and accuracy of our keyword corpus. These choices add to the cost and work of maintaining this dataset, but we feel they make a discernible difference in quality.

  • We recollect all of our keyword data every 2 weeks. This means that the results you see are more recent and more similar to the results on the day that you’re researching.
  • We cycle out up to 15 million of our keywords every month. This means that as new keywords or terms trend up in popularity, we add them to our corpus, replacing terms that are no longer getting much search volume.

A few improvements we’ve made since launch:

  • Keyword recommendations in your campaigns (tracked sites) are much improved and now backed by our keyword corpus.
  • These keyword suggestions are also included in your weekly insights, suggesting new keywords worth tracking and pages worth optimizing.
  • Coming very soon: We’re also on the cusp of launching keyword corpuses for the UK, Canada, and Australia. Stay tuned.


Try out Keywords by Site!


2) Site Crawl V2: Big enhancements to site crawling and auditing

Another significant project we completed in 2017 was a complete rewrite of our aging Site Crawler. In short, our new crawler is faster, more reliable, can crawl more pages, and surfaces more issues. We've also made some enhancements to the workflow to make regular crawls more customizable and easier to manage. Here are a few highlights:

Week-over-week crawl comparisons

Our new crawler keeps tabs on what happened in your previous crawl to show you which specific issues are no longer present, and which are brand new.
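Conceptually, that comparison boils down to set differences between two crawls' issue lists. Here's a tiny sketch of the idea in Python, using hypothetical (URL, issue) pairs rather than Moz's actual data format:

```python
# Week-over-week comparison reduces to set differences between two crawls.
# The (URL, issue-type) pairs below are hypothetical examples.
previous_crawl = {
    ("/pricing", "missing title"),
    ("/blog/a", "4xx error"),
    ("/old-page", "redirect chain"),
}
current_crawl = {
    ("/pricing", "missing title"),
    ("/new-page", "thin content"),
}

fixed = previous_crawl - current_crawl      # present last week, gone this week
brand_new = current_crawl - previous_crawl  # appeared since the last crawl

print("No longer present:", fixed)
print("Brand new:", brand_new)
```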

Ignore (to hide) individual issues or whole issue types

This feature was added in response to a bunch of customer requests. While Moz does its best to call out the issues and priorities that apply to most sites, not all sites or SEOs have the same needs. For example, if you regularly noindex a big portion of your site, you don't need us to keep reminding you that you've applied noindex to a huge number of pages. If you don't want them showing up in your reports, just ignore individual issues or the entire issue type.

Another workflow improvement we added was the ability to mark an issue as fixed. This allows you to get it out of your way until the next crawl runs and verifies the fix.

All Pages view with improved sorting and filtering

If you're prioritizing across a large number of pages or trying to track down an issue in a certain area of your site, you can now sort all pages crawled by Issue Count, Page Authority, or Crawl Depth. You can also filter to show, for instance, all pages in the /blog section of your site that are redirects and have a crawl issue.
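If you'd rather slice the data outside the app, the same sort-and-filter logic works on a CSV export of your crawl. Here's a rough sketch; the filename and column names are assumptions, so check them against the header row of your actual export:

```python
# Filter an exported crawl for "pages in /blog that are redirects and have
# at least one crawl issue." Filename and column names are assumptions --
# adjust them to match your own export's header row.
import csv

with open("site_crawl_export.csv", newline="") as f:
    rows = list(csv.DictReader(f))

matches = [
    row for row in rows
    if "/blog/" in row["URL"]
    and row["Status Code"].startswith("3")   # 3xx responses are redirects
    and int(row["Issue Count"]) > 0
]

# Sort the worst offenders to the top, mirroring the Issue Count sort.
for row in sorted(matches, key=lambda r: -int(r["Issue Count"])):
    print(row["URL"], row["Status Code"], row["Issue Count"])
```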

Recrawl to verify fixes

Moz's crawler monitors your site by crawling it every week. But if you've made some changes and want to verify them, you can now recrawl your site in between regular weekly crawls instead of waiting for the next scheduled crawl to start.

Seven new issues checked and tracked

These include such favorites as detecting Thin Content, Redirect Chains, and Slow Pages. While we were at it, we revamped duplicate page detection and improved the UI to help you better analyze clusters of duplicate content and figure out which page should be canonical.
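Moz hasn't published the internals of its duplicate detection, but clustering like this conceptually starts with a similarity measure over page text. Here's a toy sketch using Jaccard similarity on word shingles, which is an illustrative assumption, not Moz's actual method:

```python
# Toy near-duplicate detection: Jaccard similarity over 3-word shingles.
# This illustrates the general idea only, not Moz's implementation.

def shingles(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Similarity between two shingle sets: |A ∩ B| / |A ∪ B|."""
    return len(a & b) / len(a | b) if a | b else 0.0

page_a = "our denver office offers kitchen remodeling and bathroom remodeling"
page_b = "our boulder office offers kitchen remodeling and bathroom remodeling"
page_c = "read our guide to choosing the right contractor for your remodel"

print(f"A vs B: {jaccard(shingles(page_a), shingles(page_b)):.2f}")  # high -> likely duplicates
print(f"A vs C: {jaccard(shingles(page_a), shingles(page_c)):.2f}")  # low  -> distinct content
```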



3) Major investments in infrastructure for performance and resilience

You may not have directly noticed many of the updates we’ve made this year. We made some significant investments in Moz Pro and Moz Local to make them faster, more reliable, and allow us to build new features more quickly. But here are a few tangible manifestations of these efforts:

“Infinite” history on organic Moz Pro search traffic reports

Okay, infinite is a bit of a stretch, but we used to show only the last 12 weeks or months of data. Now we'll show data from the very inception of a campaign, broken down by weeks or months. This is made possible by an updated architecture that makes full historical data easy to surface and present in the application. It also allows for custom access to selected date ranges.

Also worth noting is that the new visualization shows how many different pages were receiving organic search traffic in context with total organic search traffic. This can help you figure out whether a traffic increase was due to improved rankings across many pages, or just a spike in organic traffic for one or a few pages.

More timely and reliable access to Moz Local data at all scales

As Moz Local has brought on more and bigger customers with large numbers of locations, the team discovered a need to bolster systems for speed and reliability. A completely rebuilt scheduling system and improved core location data systems help ensure all of your data is collected and easy to access when you need it.

Improved local data distribution

Moz Local distributes your location data through myriad partners, each of which has its own formats and interfaces. The Local team updated and fine-tuned those third-party connections to improve the quality of the data and speed of distribution.


4) New instructor-led training programs: Never stop learning

Not all of our improvements this year have shown up in the product. Another investment we’ve made is in training. We’ve gotten a lot of requests for this over the years and are finally delivering. Brian Childs, our trainer extraordinaire, has built this program from the ground up. It includes:

  • Boot camps to build up core skills
  • Advanced Seminars to dig into more intensive topics
  • Custom Training for businesses that want a more tailored approach

We have even more ambitious plans for 2018, so if training interests you, check out all of our training offerings here.


5) Customer Success: Helping customers get the most out of Moz

Our customer success program took off this year and has one core purpose: to help customers get maximum value from Moz. Whether you’re a long-time customer looking to explore new features or you’re brand new to Moz and figuring out how to get started, our success team offers product webinars every week, as well as one-on-one product walkthroughs tailored to your needs, interests, and experience level.

The US members of our customer success team hone their skills at a local chocolate factory (Not pictured: our fantastic team members in the UK, Australia, and Dubai)

If you want to learn more about Moz Pro, check out a webinar or schedule a walkthrough.


Bonus! MozPod: Moz’s new free podcast made its debut

Okay, this really strays from product news, but another fun project that's been gaining momentum is MozPod. This came about as a side passion project by our ever-ambitious head trainer. Lord knows that SEO and digital marketing are fast-moving and ever-changing; to help you keep up on hot topics and new developments, we've started MozPod. This podcast covers a range of topics, drawing from the brains of key folks in the industry. With topics ranging from structured data and app store optimization to machine learning and even blockchain, there's always something interesting to learn about. If you've got an idea for an episode or a topic you'd like to hear about, submit it here.

Join Brian every week for a new topic and guest.


What’s next?

We have a lot planned for 2018 — probably way too much. But one thing I can promise is that it won’t be a dull year. I prefer not to get too specific about projects that we’ve not yet started, but here are a few things already in the works:

  • A significant upgrade to our link data and toolset
  • On-demand Site Crawl
  • Added keyword research corpuses for the UK, Australia, and Canada
  • Expanded distribution channels for local to include Facebook, Waze, and Uber
  • More measurement and analytics features around local rankings, categories, & keywords
  • Verticalized solutions to address specific local search needs in the restaurant, hospitality, financial, legal, & medical sectors

On top of these and many other features we’re considering, we also plan to make it a lot easier for you to use our products. Right now, we know it can be a bit disjointed within and between products. We plan to change that.

We’ve also waited too long to solve for some specific needs of our agency customers. We’re prioritizing some key projects that’ll make their jobs easier and their relationships with Moz more valuable.


Thank you!

Before I go, I just want to thank you all for sharing your support, suggestions, and critical feedback. We strive to build the best SEO data and platform for our diverse and passionate customers. We could not succeed without you. If you’d like to be a part of making Moz a better platform, please let us know. We often reach out to customers and community members for feedback and insight, so if you’re the type who likes to participate in user research studies, customer interviews, beta tests, or surveys, please volunteer here.



Moz Blog


People Buy From People: Five examples of how to bring the humanity back to marketing

As a B2B marketer, you understand how important the right tech stack is. But don’t forget why you have that technology to begin with – to serve customers. So get some ideas for how to best serve prospective customers as people in today’s blog post.
MarketingSherpa Blog



Are Any of These 5 Reasonable (but Wrong) Fears Holding You Back?

Let’s get something important out of the way: It’s okay to have fears. Actually, it can be good to have fears. Especially when it comes to any kind of entrepreneurial venture. Fear can keep you working hard. Fear can keep you asking the right questions. Fear can keep you from taking any success for granted.
Read More…

The post Are Any of These 5 Reasonable (but Wrong) Fears Holding You Back? appeared first on Copyblogger.


Copyblogger


"SEO Is Always Changing"… Or Is It?: Debunking the Myth and Getting Back to Basics

Posted by bridget.randolph

Recently I made the shift to freelancing full-time, and it’s led me to participate in a few online communities for entrepreneurs, freelancers, and small business owners. I’ve noticed a trend in the way many of them talk about SEO; specifically, the blocks they face in attempting to “do SEO” for their businesses. Again and again, the concept that “SEO is too hard to stay on top of… it’s always changing” was being stated as a major reason that people feel a) overwhelmed by SEO; b) intimidated by SEO; and c) uninformed about SEO.

And it’s not just non-SEOs who use this phrase. The concept of “the ever-changing landscape of SEO” is common within SEO circles as well. In fact, I’ve almost certainly used this phrase myself.

But is it actually true?

To answer that question, we have to separate the theory of search engine optimization from the various tactics which we as SEO professionals spend so much time debating and testing. The more that I work with smaller businesses and individuals, the clearer it becomes to me that although the technology is always evolving and developing, and tactics (particularly those that attempt to trick Google rather than follow their guidelines) do need to adapt fairly rapidly, there are certain fundamentals of SEO that change very little over time, and which a non-specialist can easily understand.

The unchanging fundamentals of SEO

Google's algorithm is based on an academia-inspired model of categorization and citations, which uses keywords to decipher the topic of a page, and links from other sites (known as "backlinks") to determine the relative authority of that site. Their method and technology keep getting more sophisticated over time, but the principles have remained the same.
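That citation model was formalized in the original PageRank paper. As a rough illustration of the principle only (Google's production systems layer many more signals on top of it), here's a toy PageRank calculation:

```python
# Toy PageRank: a sketch of the citation-based authority idea, not Google's
# production algorithm. Pages that attract more links from other well-linked
# pages accumulate more "authority."

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1 - damping) / len(pages) for page in pages}
        for page, targets in links.items():
            for target in targets:
                # Each page passes a share of its rank to the pages it cites.
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# A hypothetical four-site web: the most-cited site ends up most authoritative.
toy_web = {
    "a.com": ["c.com"],
    "b.com": ["c.com"],
    "c.com": ["d.com"],
    "d.com": ["c.com"],
}
for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```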

So what are these basic principles?

It comes down to answering the following questions:

  1. Can the search engine find your content? (Crawlability)
  2. How should the search engine organize and prioritize this content? (Site structure)
  3. What is your content about? (Keywords)
  4. How does the search engine know that your content provides trustworthy information about this topic? (Backlinks)

If your website is set up to help Google and other search engines answer these 4 questions, you will have covered the basic fundamentals of search engine optimization.

There is a lot more that you can do to optimize in all of these areas and beyond, but for businesses that are just starting out and/or on a tight budget, these are the baseline concepts you’ll need to know.

Crawlability

You could have the best content in the world, but it won’t drive any search traffic if the search engines can’t find it. This means that the crawlability of your site is one of the most important factors in ensuring a solid SEO foundation.

In order to find your content and rank it in the search results, a search engine needs to be able to:

  1. Access the content (at least the pages that you want to rank)
  2. Read the content

This is primarily a technical task, although it is related to having a good site structure (the next core area). You may need to adapt the code, and/or use an SEO plugin if your site runs on WordPress.
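For the "access" half, one quick sanity check is making sure robots.txt isn't blocking crawlers from pages you want ranked. A minimal sketch using Python's standard library (the example.com URLs are placeholders for your own site):

```python
# Check whether robots.txt allows a crawler to fetch specific pages.
# The example.com URLs are placeholders -- substitute your own.
import urllib.robotparser

robots = urllib.robotparser.RobotFileParser()
robots.set_url("https://www.example.com/robots.txt")
robots.read()  # fetch and parse the live robots.txt

pages = [
    "https://www.example.com/",
    "https://www.example.com/services/",
]
for url in pages:
    ok = robots.can_fetch("Googlebot", url)
    print(f"{url}: {'crawlable' if ok else 'blocked by robots.txt'}")
```

Keep in mind this covers only robots.txt; a page can also be kept out of the index by a noindex meta tag or header, so check those too.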


Site structure

In addition to making sure that your content is accessible and crawlable, it’s also important to help search engines understand the hierarchy and relative importance of that content. It can be tempting to think that every page is equally important to rank, but failing to structure your site in a hierarchical way often dilutes the impact of your “money” pages. Instead, you should think about what the most important pages are, and structure the rest of your site around these.

When Google and other search engine crawlers visit a site, they typically start at the homepage and then follow every link. Googlebot assumes that the pages it sees most often are the most important pages. So when a page can be reached with a single click from the homepage, or when it is linked to on every page (for example, in a top or side navigation bar, or a site footer section), Googlebot will see that page more often and will therefore consider it more important. For less important pages, you'll still need to link to them from somewhere so search engines can see them, but you don't need to emphasize them quite as frequently or keep them as close to the homepage.

The main question to ask is: Can search engines tell what your most important pages are, just by looking at the structure of your website? Google's goal is to save users steps, so the easier you make it for them to find and prioritize your content, the more they'll like it.
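One way to audit this yourself is to measure click depth, that is, how many clicks each page sits from the homepage. A toy sketch over a made-up internal link graph:

```python
# Click-depth audit: breadth-first search from the homepage across internal
# links. The link graph below is a made-up example; in practice you'd build
# it from a crawl of your own site.
from collections import deque

internal_links = {
    "/":             ["/services", "/blog", "/about"],
    "/services":     ["/services/seo", "/services/ppc"],
    "/blog":         ["/blog/post-1", "/blog/post-2"],
    "/about":        [],
    "/services/seo": [],
    "/services/ppc": [],
    "/blog/post-1":  ["/blog/post-2"],
    "/blog/post-2":  [],
    "/orphan-page":  [],  # exists, but nothing links to it
}

depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for linked in internal_links.get(page, []):
        if linked not in depth:  # first visit = shortest click path
            depth[linked] = depth[page] + 1
            queue.append(linked)

for page, clicks in sorted(depth.items(), key=lambda kv: kv[1]):
    print(f"{clicks} clicks from home: {page}")

# Anything never reached is an orphan: crawlers following links won't find it.
print("Orphans:", set(internal_links) - set(depth) or "none")
```

Your "money" pages should show up at depth 1 or 2; anything important sitting several clicks deep, or orphaned entirely, is worth restructuring.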


Keywords

Once the content you create is accessible to crawlers, the next step is to make sure that you’re giving the search engines an accurate picture of what that content is about, to help them understand which search queries your pages would be relevant to. This is where keywords come into the mix.

We use keywords to tell the search engine what each page is about, so that it can rank our content for the queries that are most relevant to our website. You might hear advice to use your keywords over and over again on a page in order to rank well. The problem with this approach is that it doesn't always create a great experience for users, and over time Google has stopped ranking pages that it perceives as providing a poor user experience.

Instead, what Google is looking for in terms of keyword usage is that you:

  1. Answer the questions that real people actually have about your topic
  2. Use the terminology that real people (specifically, your target audience) actually use to refer to your topic
  3. Use the term in the way that Google thinks real people use it (this is often referred to as “user intent” or “searcher intent”).

You should only ever target one primary keyword (or phrase) per page. You can include “secondary” keywords, which are related to the primary keyword directly (think category vs subcategory). I sometimes see people attempting to target too many topics with a single page, in an effort to widen the net. But it is better to separate these out so that there’s a different page for each different angle on the topic.

The easiest way to think about this is in physical terms. Search engines’ methods are roughly based on the concept of library card catalogs, and so we can imagine that Google is categorizing pages in a similar way to a library using the Dewey decimal system to categorize books. You might have a book categorized as Romance, subcategory Gothic Romance; but you wouldn’t be able to categorize it as Romance and also Horror, even though it might be related to both topics. You can’t have the same physical book on 2 different shelves in 2 different sections of the library. Keyword targeting works the same way: 1 primary topic per page.
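A lightweight way to enforce this rule is a keyword-to-page map that flags two pages targeting the same primary term (often called keyword cannibalization). A minimal sketch with hypothetical pages and keywords:

```python
# One primary keyword per page, and one page per primary keyword.
# The pages and keywords below are hypothetical examples.
keyword_map = {
    "/services/kitchen-remodel": "kitchen remodeling denver",
    "/services/bathroom-remodel": "bathroom remodeling denver",
    "/blog/cost-guide": "kitchen remodel cost",
    "/landing/kitchens": "kitchen remodeling denver",  # conflict!
}

def check_keyword_map(keyword_map):
    """Flag pages competing for the same primary keyword."""
    seen = {}
    for page, keyword in keyword_map.items():
        if keyword in seen:
            print(f"Conflict: {page} and {seen[keyword]} both target '{keyword}'")
        else:
            seen[keyword] = page

check_keyword_map(keyword_map)
```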


Backlinks

Another longstanding ranking factor is the number of links from other sites to your content, known as backlinks.

It’s not enough for you to say that you’re the expert in something, if no one else sees it that way. If you were looking for a new doctor, you wouldn’t just go with the guy who says “I’m the world’s best doctor.” But if a trusted friend told you that they loved their doctor and that they thought you’d like her too, you’d almost certainly make an appointment.

When other websites link to your site, it helps to answer the question: “Do other people see you as a trustworthy resource?” Google wants to provide correct and complete information to people’s queries. The more trusted your content is by others, the more that indicates the value of that information and your authority as an expert.

When Google looks at a site’s backlinks, they are effectively doing the same thing that humans do when they read reviews and testimonials to decide which product to buy, which movie to see, or which restaurant to go to for dinner. If you haven’t worked with a product or business, other people’s reviews point you to what’s good and what’s not. In Google’s case, a link from another site serves as a vote of confidence for your content.

That being said, not all backlinks are treated equally when it comes to boosting your site’s rankings. They are weighted differently according to how Google perceives the quality and authority of the site that’s doing the linking. This can feel a little confusing, but when you think about it in the context of a recommendation, it becomes a lot easier to understand whether the backlinks your site is collecting are useful or not. After all, think about the last time you saw a movie. How did you choose what to see? Maybe you checked well-known critics’ reviews, checked Rotten Tomatoes, asked friends’ opinions, looked at Netflix’s suggestions list, or saw acquaintances posting about the film on social media.

When it comes to making a decision, who do you trust? As humans, we tend to use an (often unconscious) hierarchy of trust:

  1. Personalized recommendation: Close friends who know me well are most likely to recommend something I’ll like;
  2. Expert recommendation: Professional reviewers who are authorities on the art of film are likely to have a useful opinion, although it may not always totally match my personal taste;
  3. Popular recommendation: If a high percentage of random people liked the movie, this might mean it has a wide appeal and will likely be a good experience for me as well;
  4. Negative association: If someone is raving about a movie on social media and I know that they’re a terrible human with terrible taste… well, in the absence of other positive signals, that fact might actually influence me not to see the movie.

To bring this back to SEO, you can think about backlinks as the SEO version of reviews. And the same hierarchy comes into play.

  1. Personalized/contextual recommendation: For local businesses or niche markets, very specific websites like a local city’s tourism site, local business directory or very in-depth, niche fan site might be the equivalent of the “best friend recommendation”. They may not be an expert in what everyone likes, but they definitely know what works for you as an individual and in some cases, that’s more valuable.
  2. Expert recommendation: Well-known sites with a lot of inherent trust, like the BBC or Harvard University, are like the established movie critics. Broadly speaking they are the most trustworthy, but possibly lacking the context for a specific person’s needs. In the absence of a highly targeted type of content or service, these will be your strongest links.
  3. Popular recommendation: All things being equal, a lot of backlinks from a lot of different sites is seen as a signal that the content is relevant and useful.
  4. Negative association: Links that are placed via spam tactics, that you buy in bulk, or that sit on sites that look like garbage, are the website equivalent of that terrible person whose recommendation actually turns you off the movie.

If a site collects too many links from poor-quality sites, it could look like those links were bought, rather than “earned” recommendations (similar to businesses paying people to write positive reviews). Google views the buying of links as a dishonest practice, and a way of gaming their system, and therefore if they believe that you are doing this intentionally it may trigger a penalty. Even if they don’t cause a penalty, you won’t gain any real value from poor quality links, so they’re certainly not something to aim for. Because of this, some people become very risk-averse about backlinks, even the ones that came to them naturally. But as long as you are getting links from other trustworthy sources, and these high quality links make up a substantially higher percentage of your total, having a handful of lower quality sites linking to you shouldn’t prevent you from benefiting from the high quality ones.

For more in-depth guides to backlinks, check out posts on the theory of links, getting more links, and mitigating the risk of links.

Does anything about SEO actually change?

If SEO is really this simple, why do people talk about how it changes all the time? This is where we have to separate the theory of SEO from the tactics we use as SEO professionals to grow traffic and optimize for better rankings.

The fundamentals that we’ve covered here — crawlability, keywords, backlinks, and site structure — are the theory of SEO. But when it comes to actually making it work, you need to use tactics to optimize these areas. And this is where we see a lot of changes happening on a regular basis, because Google and the other search engines are constantly tweaking the way the algorithm understands and utilizes information from those four main areas in determining how a site’s content should rank on a results page.

The important thing to know is that, although the tactics which people use will change all the time, the goal for the search engine is always the same: to provide searchers with the information they need, as quickly and easily as possible. That means that whatever tactics and strategies you choose to pursue, the important thing is that they enable you to optimize for your main keywords, structure your site clearly, keep your site accessible, and get more backlinks from more sites, while still keeping the quality of the site and the backlinks high.

The quality test (EAT)

Because Google’s goal is to provide high-quality results, the changes that they make to the algorithm are designed to improve their ability to identify the highest quality content possible. Therefore, when tactics stop working (or worse, backfire and incur penalties), it is usually related to the fact that these tactics didn’t create high-quality outputs.

Like the fundamentals of SEO theory which we’ve already covered, the criteria that Google uses to determine whether a website or page is good quality haven’t changed all that much since the beginning. They’ve just gotten better at enforcing them. This means that you can use these criteria as a “sniff test” when considering whether a tactic is likely to be a sustainable approach long-term.

Google themselves refer to these criteria in their Search Quality Rating Guidelines with the acronym EAT, which stands for:

  • Expertise
  • Authoritativeness
  • Trustworthiness

In order to be viewed as high-quality content (on your own site) or a high-quality link (from another site to your site), the content needs to tick at least one of these boxes.

Expertise

Does this content answer a question people have? Is it a *good* answer? Do you have a more in-depth degree of knowledge about this topic than most people?

This is why you will see people talk about Google penalizing “thin” content — that just refers to content which isn’t really worth having on its own page, because it doesn’t provide any real value to the reader.

Authority

Are you someone who is respected and cited by others who know something about this topic?

This is where the value of backlinks can come in. One way to demonstrate that you are an authority on a topic is if Google sees a lot of other reputable sources referring to your content as a source or resource.

Trust

Are you a reputable person or business? Can you be trusted to take good care of your users and their information?

Because trustworthiness is a factor in determining a site’s quality, Google has compiled a list of indicators which might mean a site is untrustworthy or spammy. These include things like a high proportion of ads to regular content, behavior that forces or manipulates users into taking actions they didn’t want to take, hiding some content and only showing it to search engines to manipulate rankings, not using a secure platform to take payment information, etc.

It’s always the same end goal

Yes, SEO can be technical, and yes, it can change rapidly. But at the end of the day, what doesn’t change is the end goal. Google and the other search engines make money through advertising, and in order to get more users to see (and click on) their ads, they have to provide a great user experience. Therefore, their goal is always going to be to give the searchers the best information they can, as easily as they can, so that people will keep using their service.

As long as you understand this, the theory of SEO is pretty straightforward. It’s just about making it easy for Google to answer these questions:

  1. What is your site about?
    1. What information does it provide?
    2. What service or function does it provide?
  2. How do we know that you’ll provide the best answer or product or service for our users’ needs?
  3. Does your content demonstrate Expertise, Authoritativeness, and/or Trustworthiness (EAT)?

This is why the fundamentals have changed so little, despite the fact that the industry, technology and tactics have transformed rapidly over time.

A brief caveat

My goal with this post is not to provide step-by-step instruction in how to “do SEO,” but rather to demystify the basic theory for those who find the topic too overwhelming to know where to start, or who believe that it’s too complicated to understand without years of study. With this goal in mind, I am intentionally taking a simplified and high-level perspective. This is not to dismiss the importance of an SEO expert in driving strategy and continuing to develop and maximize value from the search channel. My hope is that those business owners and entrepreneurs who currently feel overwhelmed by this topic can gain a better grasp on the way SEO works, and a greater confidence and ease in approaching their search strategy going forward.

I have provided a few in-depth resources for each of the key areas — but you will likely want to hire a specialist or consultant to assist with analysis and implementation (certainly if you want to develop your search strategy beyond simply the “table stakes” as Rand calls it, you will need a more nuanced understanding of the topic than I can provide in a single blog post).

At the end of the day, the ideas behind SEO are actually pretty simple — it’s the execution that can be more complex or simply time-consuming. That’s why it’s important to understand that theory — so that you can be more informed if and when you do decide to partner with someone who is offering that expertise. As long as you understand the basic concepts and end goal, you’ll be able to go into that process with confidence. Good luck!



Moz Blog



Father’s Day 2017 Google Doodle brings back the cactus family from Mother’s Day

Google’s Father’s Day doodle shares the same cactus-themed artwork that was used for its Mother’s Day doodle in May.

The post Father’s Day 2017 Google Doodle brings back the cactus family from Mother’s Day appeared first on Search Engine Land.





Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Local SEO Spam Tactics Are Working: How You Can Fight Back

Posted by Casey_Meraz

For years, I’ve been saying that if you have a problem with spammers in local results, you can just wait it out. I mean, if Google cared about removing spam and punishing those who are regular spammers we’d see them removed fast and often, right?

While there are instances where spam has been removed, the fixes are rarely fast, permanent, or even common; in fact, they seem few and far between. So today I'm changing my tune a bit to call more attention to the spam tactics people employ that violate Google My Business terms and yet continue to win in the SERPs.

The problems are rampant and blatant. I’ve heard and seen many instances of legitimate businesses changing their names just to rank better and faster for their keywords.

Another problem is that Google is shutting down MapMaker at the end of March. Edits will still be allowed, but they’ll need to be made through Google Maps.

If Google is serious about rewarding brands in local search, they need to encourage it through their local search algorithms.

For some people, it’s gotten so bad that they’re actually suing Google. On January 13, 2017, for instance, a group of fourteen locksmiths sued Google, Yahoo, and Bing over fake spam listings, as reported by Joy Hawkins.

While some changes — like the Possum update — seemed to have a positive impact overall, root problems (such as multiple business listings) and many other issues still exist in the local search ecosystem.

And there are other technically non-spammy ways that users are also manipulating Google results. Let’s look at a couple of these examples.

It’s not all spam. Businesses are going to great lengths to stay within the GMB guidelines & manipulate results.

Let’s look at an example of a personal injury attorney in the Denver market. Recently, I came across these results when doing a search for trial attorneys:

[Screenshot: Denver local results for "trial attorneys," with a listing named "Denver Trial Lawyers" in the #2 position]

Look at the #2 result listing, entitled “Denver Trial Lawyers.” I originally thought this was spam and wanted to report it, but I had to do my due diligence first.

To start, I needed to verify that the listing was actually spam by looking at the official business name. I pulled up their website and, to my surprise, the business name in the logo is actually “Denver Trial Lawyers.”


This intrigued me, so I decided to see if they were using a deceptive logo to advertise the business name or if this was the actual business name.

I checked out the Colorado Secretary of State’s website and did a little digging around. After a few minutes I found the legally registered trade name through their online search portal. The formation date of this entity was 7/31/2008, so they appear to have been planning on using the name for some time.

I also reviewed their MapMaker listing history to see when this change was made and whether it reflected the trade name registration. I saw that on October 10, 2016 the business updated their MapMaker listing to reflect the new business name.


After all of this, I decided to take this one step further and called the business. When I did, the auto-attendant answered with “Thank you for calling Denver Trial Lawyers,” indicating that this is their legitimate business name.

I guess that, according to the Google My Business Guidelines, this can be considered OK. They state:

“Your name should reflect your business’ real-world name, as used consistently on your storefront, website, stationery, and as known to customers. Accurately representing your business name helps customers find your business online.”

But what does that mean for everyone else?

Recently, Gyi Tsakalakis also shared a beautiful screenshot on Twitter of a SERP with three businesses using their keywords in the business name.

It seems they’re becoming more and more prominent because people see they’re working.

To play devil's advocate, there are also businesses that legitimately sport less-than-creative names, so where do you draw the line? (Note: I've been following some of the above businesses for years; I can confirm they've changed their business names to include keywords.)

Here’s another example

If you look closely, you’ll find more keyword- and location-stuffed business names popping up every day.

Here’s an interesting case of a business (also located in Denver) that might have been trying to take advantage of Near Me searches, as pointed out by Matt Lacuesta:

[Screenshot: a Denver business listing with "Near Me" worked into its business name]

Do you think this business wanted to rank for Near Me searches in Denver? Maybe it’s just a coincidence. It’s funny, nonetheless.

How are people actively manipulating local results?

While there are many ways to manipulate a Google My Business result, today we’re going to focus on several tactics and identify the steps you can take to help fight back.

Tactic #1: Spammy business names

Probably the biggest problem in Google’s algorithm is the amount of weight they put into a business name. At a high level, it makes sense that they would treat this with a lot of authority. After all, if I’m looking for a brand name, I want to find that specific brand when I’m doing a search.

The problem is that people quickly figured out that Google gives a massive priority to businesses with keywords or locations in their business names.

In the example below, I did a search for “Fresno Personal Injury Lawyers” and was given an exact match result, as you can see in the #2 position:

[Screenshot: a "Fresno Personal Injury Lawyers" listing appearing as the #2 local result]

However, when I clicked through to the website, I found it was for a firm with a different name. In this case, they blatantly spammed their listing and have been floating by with nice rankings for quite some time.

I reported their listing a couple of times and nothing was done until I was able to escalate this. It’s important to note that the account I used to edit this listing didn’t have a lot of authority. Once an authoritative account approved my edit, it went live.

The spam listing in question has both the keyword and the location in the business name.

We reported this listing using the process outlined below, but sadly the business owner noticed and changed it back within hours.
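When you're triaging a whole market, a crude script can shortlist names worth a closer look. As the Denver Trial Lawyers example above shows, a flagged name is only a prompt for manual verification (website, state registration, phone greeting), never proof of spam. A toy heuristic with made-up listings:

```python
# Crude triage: flag listing names containing both a target city and a money
# keyword. The listings are made-up examples, and a flag is only a reason to
# investigate further -- some businesses legitimately use names like these.
CITIES = {"denver", "fresno", "colorado springs"}
KEYWORDS = {"personal injury", "lawyer", "attorney", "locksmith", "plumber"}

def looks_stuffed(name):
    lowered = name.lower()
    return any(city in lowered for city in CITIES) and any(
        keyword in lowered for keyword in KEYWORDS
    )

listings = [
    "Smith & Jones LLP",
    "Denver Personal Injury Lawyers",
    "Fresno 24/7 Locksmith Pros",
]
for name in listings:
    if looks_stuffed(name):
        print("Investigate:", name)
```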

How can you fight back against spammy business names?

Figuring out how to fight back against people manipulating results is now part of your job as an SEO. In the past, some in the industry gave the acronym "SEO" a bad name through manipulative practices. Now it's our job to earn a better reputation by helping to police these issues.

Since Google MapMaker is now disappearing, you’ll need to make edits in Google Maps directly. This is also a bit of a problem, as there’s no room to leave comments for evidence.

Here are the steps you should take to report a listing with incorrect information:

  1. Make sure you’re signed into Google
  2. Locate the business on maps.google.com
  3. Once the business is located, open it up and look for the "Suggest an edit" option.

  4. Once you select it, you'll be able to choose the field you want to change.
  5. Make the necessary change and then hit submit! (Don’t worry — I didn’t make the change above.)

Now, don't expect anything to happen right away. It can take time for changes to take place. Also, the trust level of your profile seems to play a big role in how Google evaluates these changes. Getting approval from someone with a high level of trust can make your edits go live quickly.

Make sure you check out all of these great tips from Joy Hawkins on The Ultimate Guide to Fighting Spam on Google Maps, as well.

Tactic #2: Fake business listings

Another issue that we see commonly with maps spam is fake business listings. These listings are completely false businesses that black-hat SEOs build just to rank and get more leads.

Typically we see a lot of these in the locksmith niche — it’s full of people creating fake listings. This is one of the reasons Google started doing advanced verification for locksmiths and plumbers. You can read more about that on Mike Blumenthal’s blog.

Joy Hawkins pointed out a handy tip for identifying these listings on her blog, saying:

“Many spammers who create tons of fake listings answer their phone with something generic like ‘Hello, locksmith’ or ‘Hello, service.’”

I did a quick search in Denver for a plumber and it wasn’t long before I found a listing with an exact match name. Using Joy’s tips, I called the number and it was disconnected. This seemed like an illegitimate listing to me.

Thankfully, in this case, the business wasn't ranking highly in the search results.

When you run into these types of listings, you’ll want to take a similar approach as we did above and report the issue.

Tactic #3: Review spam

Review spam can come in many different forms. It’s clear that Google’s putting a lot of attention into reviews by adding sorting features and making stars more prominent. I think Google knows they can do a better job with their reviews overall, and I hope we see them take it a little bit more seriously.

Let’s look at a few different ways that review spam appears in search results.

Self-reviews & competitor shaming

Pretty much every business knows they need reviews, but they have trouble getting them. One way people get them is to leave them on their own business.

Recently, we saw a pretty blatant example where someone left a positive five-star review for a law firm and then five other one-star reviews for all of their competitors. You can see this below:

[Screenshot: one reviewer's profile showing a five-star review for the firm and one-star reviews for five competitors]

Although these types of reviews are blatantly unethical, they show up every day. Google's review and photo policies ask users to:

“Make sure that the reviews and photos on your business listing, or those that you leave at a business you’ve visited, are honest representations of the customer experience. Those that aren’t may be removed.”

While I’d say that this does violate the policies, figuring out which rule applies best is a little tricky. It appears to be a conflict of interest, as defined by Google’s review guidelines below:

"Conflict of interest: Reviews are most valuable when they are honest and unbiased. If you own or work at a place, please don’t review your own business or employer. Don’t offer or accept money, products, or services to write reviews for a business or to write negative reviews about a competitor. If you're a business owner, don't set up review stations or kiosks at your place of business just to ask for reviews written at your place of business."

In this particular case, a member of our staff, Dillon Brickhouse, reached out to Google to see what they would say.

Unfortunately, Google told Dillon that since there was no text in the review, nothing could be done. They refused to edit the review.

And, of course, this is not an isolated case. Tim Capper recently wrote an article — “Are Google My Business Guidelines & Spam Algos Working?” — in which he identified similar situations and nothing had been done.
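If you suspect this pattern in your own market, grouping reviews by reviewer makes it easy to spot. A toy sketch over made-up review data (with real listings you'd gather this by hand from the reviewer profiles):

```python
# Spot the "one five-star, several one-stars" pattern by grouping reviews
# per reviewer. The data below is made up for illustration.
from collections import defaultdict

reviews = [  # (reviewer, business, stars)
    ("Reviewer A", "Firm 1", 5),
    ("Reviewer A", "Firm 2", 1),
    ("Reviewer A", "Firm 3", 1),
    ("Reviewer A", "Firm 4", 1),
    ("Reviewer B", "Firm 2", 4),
]

by_reviewer = defaultdict(list)
for reviewer, business, stars in reviews:
    by_reviewer[reviewer].append((business, stars))

for reviewer, given in by_reviewer.items():
    fives = [b for b, s in given if s == 5]
    ones = [b for b, s in given if s == 1]
    if len(fives) == 1 and len(ones) >= 2:
        print(f"{reviewer}: 5 stars for {fives[0]}, "
              f"1 star for {', '.join(ones)} -- worth a closer look")
```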

How can you fight against review spam?

Although there will still be cases where spammy reviews are ignored until Google steps up their game, there is something you can try to remove bad reviews. In fact, Google published the exact steps on their review guidelines page here.

You can view the steps and flag a review for removal using the method below:

  1. Navigate to Google Maps.
  2. Search for your business using its name or address.
  3. Select your business from the search results.
  4. In the panel on the left, scroll to the "Review summary" section.
  5. Under the average rating, click [number of] reviews.
  6. Scroll to the review you'd like to flag and click the flag icon.
  7. Complete the form in the window that appears and click Submit.

What can you do if the basics don’t work?

There are a ton of different ways to spam local listings. What can you do if you’ve reported the issue and nothing changes?

While edits may take up to six weeks to go live, the next step involves you getting more public about the issue. The key to the success of this approach is documentation. Take screenshots, record dates, and keep a file for each issue you’re fighting. That way you can address it head-on when you finally get the appropriate exposure.

Depending on whether or not the listing is verified, you’ll want to try posting in different forums:

Verified listings

If the listing you’re having trouble with is a verified listing, you’ll want to make a public post about it in the Google My Business Community forum. When posting, make sure to provide all corresponding evidence, screenshots, etc. to make the case very clear to the moderators. There’s a Spam and Policy section on the forum where you can do this.

Unverified listings

However, some spam listings are not verified. In these cases, Joy Hawkins recommends that you engage with the Local Guides Connect Forum here.

Key takeaways

Sadly, there’s not a lot we can do outside of the basics of reporting results, but hopefully being more proactive about it and making some noise will encourage Google to take steps in the right direction.

  1. Start being more proactive about reporting listings and reviews that ignore the guidelines. Be sure to take screenshots and record the evidence.
  2. If the listings still aren’t being fixed after some time, escalate them to the Google My Business Community forum.
  3. Read Joy Hawkins' post from start to finish on The Ultimate Guide to Fighting Spam in Google Maps.
  4. Don’t spam local results. Seriously. It’s annoying. Continually follow and stay up-to-date on the Google My Business guidelines.
  5. Lastly, don't assume that an edit you've made is the final say or that it'll stick around forever. The reality is that spam listings can come back. During testing for this post, the listing for "Doug Allen Personal Injury Attorney Colorado Springs" came back within hours based on an owner edit.

In the future, I’m personally looking forward to seeing some major changes from Google with regards to how they rank local results and how they monitor reviews. I would love to see local penalties become as serious as manual penalties.

How do you think Google can fight this better? What are your suggestions? Let me know in the comments below.



Moz Blog


Google brings back emojis in the search results snippets for relevant queries

Google has brought back emojis in search results snippets when it deems them relevant to the query.

The post Google brings back emojis in the search results snippets for relevant queries appeared first on Search Engine Land.





Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing



SearchCap: Google mobile-friendly bug, emojis are back & Google app update

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google mobile-friendly bug, emojis are back & Google app update appeared first on Search Engine Land.





Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


SearchCap: Google mobile index, back to top button & a Doodle

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google mobile index, back to top button & a Doodle appeared first on Search Engine Land.





Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


