Tag Archive | "from"

Be careful what content you cut from your site

Contributor Janet Driscoll Miller helps you determine how to make your website lean and mean without eliminating big traffic drivers.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Google web spam report: Less than 1% of sites visited from search results are spammy

Google doubled down on removing unnatural links and reduced link spam by almost half.




Moz’s Mid-Year Retrospective: Exciting Upgrades from the First Half of 2018

Posted by NeilCrist

Every year, we publish an overview of all the upgrades we’ve made to our tools and how those changes benefit our customers and Moz Community members. So far, 2018 has been a whirlwind of activity here at Moz — not only did we release a massive, long-awaited update to our link building tool, we’ve also been improving and updating systems and tools across the board to make your Moz experience even better. To that end, we’re sharing a mid-year retrospective to keep up with the incredible amount of progress we’ve made.

We receive a lot of amazing feedback from our customers on pain points they experience and improvements they’d like to see. Folks, we hear you.

We not only massively restructured some of our internal systems to provide you with better data, we also innovated new ways to display and report on that data, making the tools more accurate and more useful than ever before.

If you’ve been tasked with achieving organic success, we know your job isn’t easy. You need tools that get the job done, and done well. We think Moz delivered.

Check out our 2018 improvements so far:

Our new link index: Bigger, fresher, better than ever

Our link index underwent a major overhaul: it’s now 20x larger and 30x fresher than it was previously. This new link index data has been made available via our Mozscape API, as well as integrated into many Moz Pro tools, including Campaigns, Keyword Explorer, the MozBar, and Fresh Web Explorer. But undoubtedly the largest and most-anticipated improvement the new link index allowed us to make was the launch of Link Explorer, which we rolled out at the end of April as a replacement for Open Site Explorer.

Link Explorer addresses and improves upon its predecessor by providing more data, fresher data, and better ways to visualize that data. Answering a long-standing feature request for OSE, Link Explorer includes historical metrics, and it also surfaces newly discovered and lost links:

Below are just a few of the many ways Link Explorer is providing some of the best link data available:

  • Link Explorer’s link index contains approximately 4.8 trillion URLs — that’s 20x larger than OSE and surpasses Ahrefs’ index (~3 trillion pages) and Majestic’s fresh index (~1 trillion pages).
  • Link Explorer is 30x fresher than OSE. All data updates every 24 hours.
  • We believe Link Explorer is unique in how accurately our link index represents the web, resulting in data quality you can trust.
  • Link Explorer has the closest robots.txt profile to Google among the three major link indexes, which means we get more of the links Google gets.
  • We also improved Domain Authority, Page Authority, and Spam Score. The size and freshness of our index has allowed us to offer more stable DA and PA scores. They will still fluctuate as the index fluctuates (which has always been by design), but not as dramatically as they did in Open Site Explorer.

Explore your link profile

You can learn more about Link Explorer by reading Sarah Bird’s announcement, watching Rand’s Whiteboard Friday, or visiting our Link Explorer Help Guide. Even though it’s still in beta, Link Explorer already blows away OSE’s data quality, freshness, and capabilities. Look for steady improvements to Link Explorer as we continue to iterate on it and add more key features.

New-and-improved On-Page Grader

Moz’s On-Page Grader got a thorough and much-needed overhaul! Not only did we freshen up the interface with a new look and feel, but we also added new features and improved upon our data.

Inside the new On-Page Grader, you’ll find:

  • An updated metrics bar to show you Page Title, Meta Description, and the number of Keywords Found. No need to dig through source code!
  • An updated Optimization Score to align with the Page Optimization feature that’s inside Campaigns and in the MozBar. Instead of a letter grade (A–F), you now have Page Score (0–100) for a more precise measurement of page optimization performance.
  • On-page factors are now categorized so you can see what is hurting or helping your Page Score.
  • On-page factors are organized by importance so you can prioritize your efforts. Red indicates high importance, yellow indicates moderate importance, and blue indicates low importance.

On-Page Grader is a great way to take a quick look at how well a page is optimized for a specified keyword. Here’s how it works.

Input your page and the keyword you want that page to rank for…

… and On-Page Grader will return a list of suggestions for improving your on-site optimization.

Check it out!

Keyword ranking data now available for Canada, UK, and Australia

We’re very excited to announce that, as of just last week, international data has been added to the Keywords by Site feature of Keyword Explorer! Moz Pro customers can now see which keywords they rank for and assess their visibility across millions of SERPs spanning the US, Canada, the United Kingdom, and Australia. Keywords by Site is a newer feature within Keyword Explorer, added just last October to show which (and how many) keywords any domain, subdomain, or page ranks for.

Want to see which keywords your site ranks for in the US, UK, Canada, or Australia?

See what you rank for

It’s easy to use — just select a country from the dropdown menu to the right. This will show you which keywords a domain or page is ranking for from a particular country.

On-Demand Crawl now available

We know it can be important to track your site changes in real time. That’s why, on June 29th, we’re replacing our legacy site audit tool, Crawl Test, with the new and improved On-Demand Crawl:

Whether you need to double-check a change you’ve made or need a one-off report, the new On-Demand Crawl offers an updated experience for Moz Pro customers:

  • Crawl reports are now faster and available sooner, allowing you to quickly assess your site, a new client or prospect’s, or the competition.
  • Your site issues are now categorized by issue type and quantity, making it easier to identify what to work on and how to prioritize:

  • Recommendations are now provided for how to fix each issue, along with resources detailing why it matters:

  • Site audit reports are now easier than ever to package and present with PDF exports.
  • An updated, fresh design and UX!

On-Demand Crawl is already available now in Moz Pro. If you’re curious how it works, check it out:

Try On-Demand Crawl

Improvements to tool notifications & visuals

Moz’s email notification system and tools dashboard didn’t always sync up perfectly with the actual data update times. Sometimes, customers would receive an email or see updated dates on their dashboard before the data had rolled out, resulting in confusion. We’ve streamlined the process, and now customers no longer have to wonder where their data is — you can rest assured that your data is waiting for you in Moz Pro as soon as you’re notified.

Rank Tracker is sticking around

While we had originally planned to retire Rank Tracker at the beginning of June, we’ve decided to hold off in light of the feedback we received from our customers. Our goal in retiring Rank Tracker was to make Moz Pro easier to navigate by eliminating the redundancy of having two options for tracking keyword rankings (Rank Tracker and Campaigns), but after hearing how many people use and value Rank Tracker, and after weighing our options, we decided to postpone its retirement until we had a better solution than simply shutting it down.

Right now, we’re focused on learning more from our community on what makes this tool so valuable, so if you have feedback regarding Rank Tracker, we’d love it if you would take our survey. The information we gather from this survey will help us create a better solution for you!

Updates from Moz Academy

New advanced SEO courses

In response to the growing interest in advanced and niche-specific training, Moz is now offering ongoing classes and seminars on topics such as e-commerce SEO and technical site audits. If there’s an advanced topic you’d like training on, let us know by visiting https://moz.com/training and navigating to the “Custom” tab to tell us exactly what type of training you’re looking for.

On-demand coursework

We love the fact that we have Moz customers from around the globe, so we’re always looking for new ways to accommodate those in different timezones and those with sporadic schedules. One new way we’re doing this is by offering on-demand coursework. Get training from Moz when it works best for you. With this added scheduling flexibility (and with added instructors to boot), we hope to be able to reach more people than ever before.

To view Moz’s on-demand coursework, visit moz.com/training and click on the “On-Demand” tab.

Certificate development

There’s been a growing demand for a meaningful certification program in SEO, and we’re proud to say that Moz is here to deliver. This coursework will include a certificate and a badge for your LinkedIn profile. We’re planning on launching the program later this year, so stay tuned by signing up for Moz Training Alerts!

Tell us what you think!

Have feedback for us on any of our 2018 improvements? Any ideas on new ways we can improve our tools and training resources? Let us know in the comments! We love hearing from marketers like you. Your input helps us develop the best tools possible for ensuring your content gets found online.

If you’re not a Moz Pro subscriber and haven’t gotten a chance to check out these new features yet, sign up for a free trial!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


How Much Data Is Missing from Analytics? And Other Analytics Black Holes

Posted by Tom.Capper

If you’ve ever compared two analytics implementations on the same site, or compared your analytics with what your business is reporting in sales, you’ve probably noticed that things don’t always match up. In this post, I’ll explain why data is missing from your web analytics platforms and how large the impact could be. Some of the issues I cover are actually quite easily addressed, and have a decent impact on traffic — there’s never been an easier way to hit your quarterly targets. ;)

I’m going to focus on GA (Google Analytics), as it’s the most commonly used provider, but most on-page analytics platforms have the same issues. Platforms that rely on server logs do avoid some issues but are fairly rare, so I won’t cover them in any depth.

Side note: Our test setup (multiple trackers & customized GA)

On Distilled.net, we have a standard Google Analytics property running from an HTML tag in GTM (Google Tag Manager). In addition, for the last two years, I’ve been running three extra concurrent Google Analytics implementations, designed to measure discrepancies between different configurations.

(If you’re just interested in my findings, you can skip this section, but if you want to hear more about the methodology, continue reading. Similarly, don’t worry if you don’t understand some of the detail here — the results are easier to follow.)

Two of these extra implementations — one in Google Tag Manager and one on page — run locally hosted, renamed copies of the Google Analytics JavaScript file (e.g. www.distilled.net/static/js/au3.js, instead of www.google-analytics.com/analytics.js) to make them harder to spot for ad blockers. I also used renamed JavaScript functions (“tcap” and “Buffoon,” rather than the standard “ga”) and renamed trackers (“FredTheUnblockable” and “AlbertTheImmutable”) to avoid having duplicate trackers (which can often cause issues).
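
To make that concrete, here’s a rough sketch of what the named-tracker commands look like once the customized loader has defined the renamed global (the property ID is a placeholder, and this is illustrative rather than the exact Distilled code):

```typescript
// The renamed global command queue ("tcap" instead of "ga") is created by the
// customized, locally hosted analytics.js loader described above; declare it
// here so this compiles as TypeScript.
declare function tcap(...args: unknown[]): void;

// Create a named tracker and send a pageview through it.
// "UA-XXXXXXX-1" is a placeholder property ID.
tcap("create", "UA-XXXXXXX-1", "auto", "FredTheUnblockable");
tcap("FredTheUnblockable.send", "pageview");
```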

This was originally inspired by 2016-era best practice on how to get your Google Analytics setup past ad blockers. I can’t find the original article now, but you can see a very similar one from 2017 here.

Lastly, we have “DianaTheIndefatigable,” which just has a renamed tracker but otherwise uses the standard code and is implemented on-page. This completes the set of all combinations of modified and unmodified GTM and on-page trackers.

Two of Distilled’s modified on-page trackers, as seen on https://www.distilled.net/

Overall, this table summarizes our setups:

Tracker | Renamed function? | GTM or on-page? | Locally hosted JavaScript file?
Default | No | GTM HTML tag | No
FredTheUnblockable | Yes – “tcap” | GTM HTML tag | Yes
AlbertTheImmutable | Yes – “buffoon” | On page | Yes
DianaTheIndefatigable | No | On page | No

I tested their functionality in various browser/ad-block environments by watching for the pageviews appearing in browser developer tools:

Reason 1: Ad Blockers

Ad blockers, primarily browser extensions, have been growing in popularity for some time now. Mostly this has been driven by users looking for better performance and UX on ad-laden sites, but in recent years an increased emphasis on privacy has also crept in, hence the possibility of analytics blocking.

Effect of ad blockers

Some ad blockers block web analytics platforms by default, others can be configured to do so. I tested Distilled’s site with Adblock Plus and uBlock Origin, two of the most popular ad-blocking desktop browser addons, but it’s worth noting that ad blockers are increasingly prevalent on smartphones, too.

Here’s how Distilled’s setups fared:

(All numbers shown are from April 2018)

Setup | Vs. Adblock | Vs. Adblock with “EasyPrivacy” enabled | Vs. uBlock Origin
GTM | Pass | Fail | Fail
On page | Pass | Fail | Fail
GTM + renamed script & function | Pass | Fail | Fail
On page + renamed script & function | Pass | Fail | Fail

Seems like those tweaked setups didn’t do much!

Lost data due to ad blockers: ~10%

Ad blocker usage can be in the 15–25% range depending on region, but many of these installs will be default setups of AdBlock Plus, which as we’ve seen above, does not block tracking. Estimates of AdBlock Plus’s market share among ad blockers vary from 50–70%, with more recent reports tending more towards the former. So, if we assume that at most 50% of installed ad blockers block analytics, that leaves your exposure at around 10%.
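
Expressed as a quick back-of-the-envelope calculation (the inputs are the assumed ranges quoted above, not measured values):

```typescript
// Rough estimate of traffic lost to analytics-blocking ad blockers.
const adBlockerUsage = 0.20;          // assume ~20% of users run an ad blocker (15–25% range)
const shareThatBlocksAnalytics = 0.5; // assume at most ~50% of those installs block analytics
const estimatedLoss = adBlockerUsage * shareThatBlocksAnalytics;
console.log(`${(estimatedLoss * 100).toFixed(0)}% of traffic potentially untracked`); // ~10%
```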

Reason 2: Browser “do not track”

This is another privacy-motivated feature, this time of browsers themselves. You can enable it in the settings of most current browsers. It’s not compulsory for sites or platforms to obey the “do not track” request, but Firefox offers a stronger feature under the same set of options, which I decided to test as well.

Effect of “do not track”

Most browsers now offer the option to send a “Do not track” message. I tested the latest releases of Firefox & Chrome for Windows 10.

Setup | Chrome “do not track” | Firefox “do not track” | Firefox “tracking protection”
GTM | Pass | Pass | Fail
On page | Pass | Pass | Fail
GTM + renamed script & function | Pass | Pass | Fail
On page + renamed script & function | Pass | Pass | Fail

Again, it doesn’t seem that the tweaked setups are doing much work for us here.

Lost data due to “do not track”: <1%

Only Firefox Quantum’s “Tracking Protection,” introduced in February, had any effect on our trackers. Firefox has a 5% market share, but Tracking Protection is not enabled by default. The launch of this feature had no effect on the trend for Firefox traffic on Distilled.net.

Reason 3: Filters

It’s a bit of an obvious one, but filters you’ve set up in your analytics might intentionally or unintentionally reduce your reported traffic levels.

For example, a filter excluding certain niche screen resolutions that you believe to be mostly bots, or internal traffic, will obviously cause your setup to underreport slightly.

Lost data due to filters: ???

Impact is hard to estimate, as setups will obviously vary on a site-by-site basis. I do recommend having a duplicate, unfiltered “master” view in case you realize too late that you’ve lost something you didn’t intend to.

Reason 4: GTM vs. on-page vs. misplaced on-page

Google Tag Manager has become an increasingly popular way of implementing analytics in recent years, due to its increased flexibility and the ease of making changes. However, I’ve long noticed that it can tend to underreport vs. on-page setups.

I was also curious about what would happen if you didn’t follow Google’s guidelines in setting up on-page code.

By combining my numbers with numbers from my colleague Dom Woodman’s site (you’re welcome for the link, Dom), which happens to use a Drupal analytics add-on as well as GTM, I was able to see the difference between Google Tag Manager and misplaced on-page code (right at the bottom of the <body> tag). I then weighted this against my own Google Tag Manager data to get an overall picture of all five setups.

Effect of GTM and misplaced on-page code

Traffic as a percentage of baseline (standard Google Tag Manager implementation):

Browser | Google Tag Manager | Modified & Google Tag Manager | On-Page Code in <head> | Modified & On-Page Code in <head> | On-Page Code Misplaced in <body>
Chrome | 100.00% | 98.75% | 100.77% | 99.80% | 94.75%
Safari | 100.00% | 99.42% | 100.55% | 102.08% | 82.69%
Firefox | 100.00% | 99.71% | 101.16% | 101.45% | 90.68%
Internet Explorer | 100.00% | 80.06% | 112.31% | 113.37% | 77.18%

There are a few main takeaways here:

  • On-page code generally reports more traffic than GTM
  • Modified code is generally within a margin of error, apart from modified GTM code on Internet Explorer (see note below)
  • Misplaced analytics code will cost you up to a third of your traffic vs. properly implemented on-page code, depending on browser (!)
  • The customized setups, which are designed to get more traffic by evading ad blockers, are doing nothing of the sort.

It’s worth noting also that the customized implementations actually got less traffic than the standard ones. For the on-page code, this is within the margin of error, but for Google Tag Manager, there’s another reason — because I used unfiltered profiles for the comparison, there’s a lot of bot spam in the main profile, which primarily masquerades as Internet Explorer. Our main profile is by far the most spammed, and also acting as the baseline here, so the difference between on-page code and Google Tag Manager is probably somewhat larger than what I’m reporting.

I also split the data by device type, out of curiosity:

Traffic as a percentage of baseline (standard Google Tag Manager implementation):

Device | Google Tag Manager | Modified & Google Tag Manager | On-Page Code in <head> | Modified & On-Page Code in <head> | On-Page Code Misplaced in <body>
Desktop | 100.00% | 98.31% | 100.97% | 100.89% | 93.47%
Mobile | 100.00% | 97.00% | 103.78% | 100.42% | 89.87%
Tablet | 100.00% | 97.68% | 104.20% | 102.43% | 88.13%

The further takeaway here seems to be that mobile browsers, like Internet Explorer, can struggle with Google Tag Manager.

Lost data due to GTM: 1–5%

Google Tag Manager seems to cost you a varying amount depending on what make-up of browsers and devices use your site. On Distilled.net, the difference is around 1.7%; however, we have an unusually desktop-heavy and tech-savvy audience (not much Internet Explorer!). Depending on vertical, this could easily swell to the 5% range.

Lost data due to misplaced on-page code: ~10%

On Teflsearch.com, the impact of misplaced on-page code was around 7.5%, vs Google Tag Manager. Keeping in mind that Google Tag Manager itself underreports, the total loss could easily be in the 10% range.

Bonus round: Missing data from channels

I’ve focused above on areas where you might be missing data altogether. However, there are also lots of ways in which data can be misrepresented, or detail can be missing. I’ll cover these more briefly, but the main issues are dark traffic and attribution.

Dark traffic

Dark traffic is direct traffic that didn’t really come via direct — which is generally becoming more and more common. Typical causes are:

  • Untagged campaigns in email (see the tagging sketch after this list)
  • Untagged campaigns in apps (especially Facebook, Twitter, etc.)
  • Misrepresented organic
  • Data sent from botched tracking implementations (which can also appear as self-referrals)
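
For reference, tagging simply means appending standard UTM parameters to campaign links so those visits stop reporting as direct. A minimal sketch, with a made-up URL and campaign names:

```typescript
// Append standard UTM campaign parameters to a link used in an email blast,
// so Google Analytics can attribute the visit instead of lumping it into "direct".
const link = new URL("https://www.example.com/newsletter-offer"); // hypothetical landing page
link.searchParams.set("utm_source", "newsletter");
link.searchParams.set("utm_medium", "email");
link.searchParams.set("utm_campaign", "june-digest");

console.log(link.toString());
// https://www.example.com/newsletter-offer?utm_source=newsletter&utm_medium=email&utm_campaign=june-digest
```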

It’s also worth noting the trend towards genuinely direct traffic that would historically have been organic. For example, due to increasingly sophisticated browser autocompletes, cross-device history, and so on, people end up “typing” a URL that they’d have searched for historically.

Attribution

I’ve written about this in more detail here, but in general, a session in Google Analytics (and any other platform) is a fairly arbitrary construct — you might think it’s obvious how a group of hits should be grouped into one or more sessions, but in fact, the process relies on a number of fairly questionable assumptions. In particular, it’s worth noting that Google Analytics generally attributes direct traffic (including dark traffic) to the previous non-direct source, if one exists.

Discussion

I was quite surprised by some of my own findings when researching this post, but I’m sure I didn’t get everything. Can you think of any other ways in which data can end up missing from analytics?


Learn from the SMX Advanced family

♫ Here’s the story of a lovely conference…♫

Time to Act: Review Responses Just Evolved from "Extra" to "Expected"

Posted by MiriamEllis

I’ve advocated the use of Google’s owner response review feature since it first rolled out in 2010. This vital vehicle defends brand reputation and revenue, offering companies a means of transforming dissatisfied consumers into satisfied ones, supporting retention so that less has to be spent on new customer acquisition. I consider review responses to be a core customer service responsibility. Yet, eight years into the existence of this feature, marketing forums are still filled with entry-level questions like:

  • Should I respond to reviews?
  • Should I respond to positive reviews?
  • How should I respond to negative reviews?

Over the years, I’ve seen different local SEO consultants reply in differing degrees to these common threads, but as of May 11, 2018, both agencies and brands woke to a new day: the day on which Google announced it would be emailing notifications like this to consumers when a business responds to their reviews, prompting them to view the reply.

Surveys indicate that well over 50% of consumers already expect responses within days of reviewing a business. With Google’s rollout, we can assume that this number is about to rise.

Why is this noteworthy news? I’ll explain exactly that in this post, plus demo how Moz Local can be a significant help to owners and marketers in succeeding in this new environment.

When “extra” becomes “expected”

In the past, owner responses may have felt like something extra a business could do to improve management of its reputation. Perhaps a company you’re marketing has been making the effort to respond to negative reviews, at the very least, but you’ve let replying to positive reviews slide. Or maybe you respond to reviews when you can get around to it, with days or weeks transpiring between consumer feedback and brand reaction.

Google’s announcement is important for two key reasons:

1) It signals that Google is turning reviews into a truly interactive feature, in keeping with so much else they’ve rolled out to the Knowledge Panel in recent times. Like booking buttons and Google Questions & Answers, notifications of owner responses are Google’s latest step towards making Knowledge Panels transactional platforms instead of static data entities. Every new feature brings us that much closer to Google positioning itself between providers and patrons for as many transactional moments as possible.

2) It signals a major turning point in consumer expectations. In the past, reviewers have left responses from motives of “having their say,” whether that’s to praise a business, warn fellow consumers, or simply document their experiences.

Now, imagine a patron who writes a negative review of two different restaurants he dined at for Sunday lunch and dinner. On Monday, he opens his email to find a Google notification that Restaurant A has left an owner response sincerely apologizing and reasonably explaining why service was unusually slow that weekend, but that Restaurant B is meeting his complaint about a rude waiter with dead air.

“So, Restaurant A cares about me, and Restaurant B couldn’t care less,” the consumer is left to conclude, creating an emotional memory that could inform whether he’s ever willing to give either business a second chance in the future.

Just one experience of receiving an owner response notification will set the rules of the game from here on out, making all future businesses that fail to respond seem inaccessible, neglectful, and even uncaring. It’s the difference between reviewers narrating their experiences from random motives, and leaving feedback with the expectation of being heard and answered.

I will go so far as to predict that Google’s announcement ups the game for all review platforms, because it will make owner responses to consumer sentiment an expected, rather than extra, effort.

The burden is on brands

Because no intelligent business would believe it can succeed in modern commerce while appearing unreachable or unconcerned, Google’s announcement calls for a priority shift. For brands large and small, it may not be an easy one, but it should look something like this:

  • Negative reviews are now direct cries for help to our business; we will respond with whatever help we can give within X number of hours or days upon receipt
  • Positive reviews are now thank-you notes directly to our company; we will respond with gratitude within X number of hours or days upon receipt

Defining X is going to have to depend on the resources of your organization, but in an environment in which consumers expect your reply, the task of responding must now be moved from the back burner to a hotter spot on the stovetop. Statistics differ in past assessments of consumer expectations of response times:

  • In 2016, GetFiveStars found that 15.6% of consumers expected a reply within 1–3 hours, and 68.3% expected a reply within 1–3 days of leaving a review.
  • In 2017, RevLocal found that 52% of consumers expected responses within 7 days.
  • Overall, 30% of survey respondents told BrightLocal in 2017 that owner responses were a factor they looked at in judging a business.

My own expectation is that all of these numbers will now rise as a result of Google’s new function, meaning that the safest bet will be the fastest possible response. If resources are limited, I recommend prioritizing negative sentiment, aiming for a reply within hours rather than days as the best hope of winning back a customer. “Thank yous” for positive sentiment can likely wait for a couple of days, if absolutely necessary.

It’s inspiring to know that a recent Location3 study found that brands which do a good job of responding to reviews saw an average conversion rate of 13.9%, versus lackluster responders whose conversion rate was 10.4%. Depending on what you sell, those 3.5 points could be financially significant! But it’s not always easy to be optimally responsive.

If your business is small, accelerating response times can feel like a burden because of lack of people resources. If your business is a large, multi-location enterprise, the burden may lie in organizing awareness of hundreds of incoming reviews in a day, as well as keeping track of which reviews have been responded to and which haven’t.

The good news is…

Moz Local can help

The screenshot, above, is taken from the Moz Local dashboard. If you’re a customer, just log into your Moz Local account and go to your review section. From the “sources” section, choose “Google” — you’ll see the option to filter your reviews by “replied” and “not replied.” You’ll instantly be able to see which reviews you haven’t yet responded to. From there, simply use the in-dashboard feature that enables you to respond to your (or your clients’) reviews, without having to head over to your GMB dashboard or log into a variety of different clients’ GMB dashboards. So easy!

I highly recommend that all our awesome customers do this today and be sure you’ve responded to all of your most recent reviews. And, in the future, if you’re working your way through a stack of new, incoming Google reviews, this function should make it a great deal easier to keep organized about which ones you’ve checked off and which ones are still awaiting your response. I sincerely hope this function makes your work more efficient!

Need some help setting the right review response tone?

Please check out Mastering the Owner Response to the Quintet of Google My Business Reviews, which I published in 2016 to advocate responsiveness. It will walk you through these typical types of reviews you’ll be receiving:

  • “I love you!”
  • “I haven’t made up my mind yet.”
  • “There was hair in my taco…”
  • “I’m actually your competitor!”
  • “I’m citing illegal stuff.”

The one update I’d make to the advice in the above piece, given Google’s rollout of the new notification function, would be to increase the number of positive reviews to which you’re responding. In 2016, I suggested that enterprises managing hundreds of locations should aim to express gratitude for at least 10% of favorable reviews. In 2018, I’d say reply with thanks to as many of these as you possibly can. Why? Because reviews are now becoming more transactional than ever, and if a customer says, “I like you,” it’s only polite to say, “Thanks!”. As more customers begin to expect responsiveness, failure to acknowledge praise could feel uncaring.

I would also suggest that responses to negative reviews more strongly feature a plea to the customer to contact the business so that things can be “made right.” GetFiveStars co-founder, Mike Blumenthal, is hoping that Google might one day create a private channel for brands and consumers to resolve complaints, but until that happens, definitely keep in mind that:

  1. The new email alerts will ensure that more customers realize you’ve responded to their negative sentiment.
  2. If, while “making things right” in the public response, you also urge the unhappy customer to let you make things “more right” in person, you will enhance your chances of retaining him.
  3. If you are able to publicly or privately resolve a complaint, the customer may feel inspired to amend his review and raise your star rating; over time, more customers doing this could significantly improve your conversions and, possibly, your local search rankings.
  4. All potential customers who see your active responses to complaints will understand that your policies are consumer-friendly, which should increase the likelihood of them choosing your business for transactions.

Looking ahead

One of the most interesting aspects I’m considering as of the rollout of response notifications is whether it may ultimately impact the tone of reviews themselves. In the past, some reviewers have given way to excesses in their sentiment, writing about companies in the ugliest possible language… language I’ve always wanted to hope they wouldn’t use face-to-face with other human beings at the place of business. I’m wondering now if knowing there’s a very good chance that companies are responding to feedback could lessen the instances of consumers taking wild, often anonymous potshots at brands and create a more real-world, conversational environment.

In other words, instead of: “You overcharged me $3 for a soda and I know it’s because you’re [expletive] scammers, liars, crooks!!! Everyone beware of this company!!!”

We might see: “Hey guys, I just noticed a $3 overcharge on my receipt. I’m not too happy about this.”

The former scenario is honestly embarrassing. Trying to make someone feel better when they’ve just called you a thief feels a bit ridiculous and depressing. But the latter scenario is, at least, situation-appropriate instead of blown out of all proportion, creating an opening for you and your company to respond well and foster loyalty.

I can’t guarantee that reviewers will tone it down a bit if they feel more certain of being heard, but I’m hoping it will go that way in more and more cases.

What do you think? How will Google’s new function impact the businesses you market and the reviewers you serve? Please share your take and your tips with our community!


Email Testing: 7 tips from your peers for email conversion optimization

We recently asked the MarketingSherpa audience for tips on running effective email tests. Here are a few of the most helpful responses to consider as you start to develop an email testing program.
MarketingSherpa Blog


Selling and Marketing to Senior Citizens When Your Team is Very Different From the Customer

Hear wisdom from Denis Mrkva, General Manager of Aetna’s HealthSpire subsidiary, sharing what he’s learned leading a team selling (and serving) his customers with Medicare Advantage plans.

Customer-First Marketing: What every entrepreneur and SMB marketer can learn from successful Etsy sellers

Etsy is a laboratory of capitalism that any marketers — especially small businesses, entrepreneurs, and startups — can learn from. Here are just a few tips from successful shop owners that can help other marketers who are trying to succeed in an already saturated marketplace.

How We Got a 32% Organic Traffic Boost from 4 On-Page SEO Changes [Case Study]

Posted by WallStreetOasis.com

My name is Patrick Curtis, and I’m the founder and CEO of Wall Street Oasis, an online community focused on careers in finance founded in 2006 with over 2 million visits per month.

User-generated content and long-tail organic traffic is what has built our business and community over the last 12+ years. But what happens if you wake up one day and realize that your growth has suddenly stopped? This is what happened to us back in November 2012.

In this case study, I’ll highlight two of our main SEO problems as a large forum with over 200,000 URLs, then describe two solutions that finally helped us regain our growth trajectory — almost five years later.

Two main problems

1. Algorithm change impacts

Ever since November 2012, Google’s algo changes have seemed to hurt many online forums like ours. Even though our traffic didn’t decline, our growth dropped to the single-digit percentages. No matter what we tried, we couldn’t break through our “plateau of pain” (I call it that because it took a painful ~5 years of trying).

Plateau of pain: no double-digit growth from late 2012 onward

2. Quality of user-generated content

Related to the first problem, 99% of our content is user-generated (UGC), which means the quality is mixed (to put it kindly). Like most forum-based sites, some of our members create incredible pieces of content, but a meaningful percentage of our content is also admittedly thin and/or low-quality.

How could we deal with over 200,000 pieces of content efficiently and try to optimize them without going bankrupt? How could we “clean the cruft” when there was just so much of it?

Fighting back: Two solutions (and one statistical analysis to show how it worked)

1. “Merge and Purge” project

Our goal was to consolidate weaker “children” URLs into stronger “master” URLs to utilize some of the valuable content Google was ignoring and to make the user experience better.

For example, instead of having ~20 discussions on a specific topic (each with an average of around two to three comments) across twelve years, we would consolidate many of those discussions into the strongest two or three URLs (each with around 20–30 comments), leading to a much better user experience with less need to search and jump around the site.

Changes included taking the original post and comments from a “child” URL and merging them into the “master” URL, unpublishing the child URL, removing the child from sitemap, and adding a 301 redirect to the master.
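
To make the redirect step concrete, here’s a minimal sketch (with hypothetical paths, and not the exact tooling we used) that turns a child-to-master mapping into 301 redirect rules:

```typescript
// Map each unpublished "child" discussion URL to the "master" it was merged into.
// These paths are made up for illustration.
const merges: Record<string, string> = {
  "/forums/why-investment-banking-2009": "/forums/why-investment-banking",
  "/forums/why-do-you-want-to-do-ib": "/forums/why-investment-banking",
};

// Emit Apache-style 301 rules; adapt the syntax to whatever server or CMS you use.
const rules = Object.entries(merges)
  .map(([child, master]) => `Redirect 301 ${child} ${master}`)
  .join("\n");

console.log(rules);
```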

Below is an example of how it looked when we merged a child into our popular Why Investment Banking discussion. We highlighted the original child post as a Related Topic with a blue border and included the original post date to help avoid confusion:

Highlighting a related topic child post

This was a massive project that involved some complex Excel sorting, but after 18 months and about $50,000 invested (27,418 children merged into 8,515 masters to date), the user experience, site architecture, and organization are much better.

Initial analysis suggests that the percentage gain from merging weak children URLs into stronger masters has given us a boost of ~10–15% in organic search traffic.

2. The Content Optimization Team

The goal of this initiative was to take the top landing pages that already existed on Wall Street Oasis and make sure that they were both higher quality and optimized for SEO. What does that mean, exactly, and how did we execute it?

We needed a dedicated team with some baseline industry knowledge. To that end, we formed a team of five interns from the community, since they were already familiar with the common topics.

We looked at the top ~200 URLs over the previous 90 days (by organic landing page traffic) and listed them out in a spreadsheet:

Spreadsheet of organic traffic to URLs

We held five main hypotheses of what we believed would boost organic traffic before we started this project:

  1. Longer content with subtitles: Increasing the length of the content and adding relevant H2 and H3 subtitles to give the reader more detailed and useful information in an organized fashion.
  2. Changing the H1 so that it matched more high-volume keywords using Moz’s Keyword Explorer.
  3. Changing the URL so that it also was a better match to high-volume and relevant keywords.
  4. Adding a relevant image or graphic to help break up large “walls of text” and enrich the content.
  5. Adding a relevant video similar to the graphic, but also to help increase time on page and enrich the content around the topic.

We tracked all five of these changes across all 200 URLs (see image above). After a statistical analysis, we learned that four of them helped our organic search traffic and one actually hurt.

Summary of results from our statistical analysis

  • Increasing the length of the articles and adding relevant subtitles (H2s, H3s, and H4s) to help organize the content gives an average boost to organic traffic of 14%
  • Improving the title or H1 of the URLs yields a 9% increase on average
  • Changing the URL decreased traffic on average by 38% (this was a smaller sample size — we stopped doing this early on for obvious reasons)
  • Including a relevant video increases the organic traffic by 4% on average, while putting an image up increases it by 5% on average.

Overall, the boost to organic traffic — should we continue to make these four changes (and avoid changing the URL) — is 32% on average.
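
As a quick sanity check, the 32% figure lines up with simply adding the four per-change averages reported above:

```typescript
// Average per-change boosts from the analysis, in percentage points.
const boosts = { subtitlesAndLength: 14, titleOrH1: 9, video: 4, image: 5 };
const combined = Object.values(boosts).reduce((sum, b) => sum + b, 0);
console.log(`${combined}%`); // 32%
```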

Key takeaway:

Over half of that gain (~18%) comes from changes that require a minimal investment of time. For teams trying to optimize on-page SEO across a large number of pages, we recommend focusing on the top landing pages and the easy wins first, before deciding whether further investment is warranted.

We hope this case study of our on-page SEO efforts was interesting, and I’m happy to answer any questions you have in the comments!
