Tag Archive | "Update"

Update Your Focus to Work Hard on the Right Things

Last week, I affirmed that “hard work is luck.” However, what if you work hard and aren’t seeing any progress…

The post Update Your Focus to Work Hard on the Right Things appeared first on Copyblogger.


Copyblogger

Posted in Latest News | Comments Off

Subdomain leasing and the giant hole in Google’s Medic update

In today’s Soapbox, a warning about the hack that lets low-authority medical advice, content mirrors, and “buying guides” slip through Google’s algorithm.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in Latest News | Comments Off

Did Google’s Site Diversity Update Live Up to its Promise?

Posted by Dr-Pete

On June 6th, on the heels of a core update, Google announced that a site diversity update was also rolling out. This update offered a unique opportunity, because site diversity is something we can directly measure. Did Google deliver on their promise, or was this announcement mostly a PR play?

There are a lot of ways to measure site diversity, and we’re going to dive pretty deep into the data. If you can’t wait, here’s the short answer — while Google technically improved site diversity, the update was narrowly targeted and we had to dig to find evidence of improvement.

How did average diversity improve?

Using the 10,000-keyword MozCast set, we looked at the average diversity across page-one SERPs. Put simply, we measured how many unique sub-domains were represented within each results page. Since page one of Google can have fewer than ten organic results, this was expressed as a percentage — specifically, the ratio of unique sub-domains to total organic results on the page. Here’s the 30-day graph (May 19 – June 17):

A site diversity of 90 percent on a 10-result SERP would mean that nine out of ten sub-domains were unique, with one repeat. It’s hard to see, but between June 6th and 7th, average diversity did improve marginally, from 90.23 percent to 90.72 percent (an increase of 0.49 percentage points). If we zoom in quite a bit (10x) on the Y-axis, we can see the trend over time:

Zooming in to just a 10 percent range (85–95 percent diversity), you can see that most of the change happened in a single day, and the improvement has remained in place for the week since the update. Even zooming in, though, the improvement hardly seems impressive.
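To make the metric concrete, here is a minimal sketch of the per-SERP calculation described above; the list-of-sub-domains input is an illustrative simplification rather than MozCast’s actual code.

    # Minimal sketch of the per-SERP diversity calculation described above.
    # The list-of-sub-domains input is an illustrative simplification,
    # not MozCast's actual data structure or code.

    def serp_diversity(subdomains):
        """Ratio of unique sub-domains to total organic results on one page."""
        return len(set(subdomains)) / len(subdomains) if subdomains else 0.0

    def average_diversity(serps):
        """Average diversity across a collection of page-one SERPs."""
        return sum(serp_diversity(s) for s in serps) / len(serps)

    # Nine unique sub-domains out of ten results -> 90 percent diversity
    example = ["a.com", "b.com", "c.com", "d.com", "e.com",
               "f.com", "g.com", "h.com", "i.com", "a.com"]
    print(f"{serp_diversity(example):.2%}")  # 90.00%

Averaging that ratio across all 10,000 tracked SERPs gives the daily figure plotted in the graph.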

Was the improvement more isolated?

Being as fair to Google as possible, we need to consider their follow-up statement that, after the change, searchers would usually see no more than two listings from the same site in the top results.

What if Google improved the worst-case scenarios, but the change wasn’t obvious when we averaged across all SERPs? We can focus on the better-behaved SERPs by looking specifically at those with a site diversity score of 80 percent or better (at least eight out of ten sub-domains unique); if Google cleaned up the worst cases, the share of SERPs in that group should rise. Here’s the 30-day graph for just those cases:

On June 6th, 84.58 percent of SERPs in our data set had a diversity of 80 percent or better. On June 7th, that increased to 86.68 percent — a gain of 2.1 percentage points. Let’s dig even deeper to see what’s happening for individual counts.

How did the impact break down?

A single data point doesn’t tell us much about what’s happening within each of the buckets. For this analysis, I’m going to use the exact duplicate count, since percentages can get a bit confusing once we have to put them in bins. Another complication is that, on occasion, two or more sites each have multiple organic results — this brings down the overall diversity of the SERP, but doesn’t necessarily mean that any one site is dominating.

So, what if we look just at the count of the dominant site (the site with the most duplicates) across the 10,000 SERPs? We’ll compare June 6th (blue) to June 7th (purple):

For slightly over half of SERPs in our data set, there were no duplicates (every site had one listing), and this number didn’t change much after the update. The number of SERPs where the dominant site had two listings (i.e. one duplicate) increased pretty noticeably after the update (up by 346 SERPs). This was offset almost entirely by a decrease in SERPs where the dominant site had three to five listings (down by 345 SERPs across the three bins).
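For reference, here is a minimal sketch of how the dominant-site count might be tallied from the same kind of per-SERP sub-domain list; the data structure is illustrative, not MozCast’s actual pipeline.

    from collections import Counter

    def dominant_site_count(subdomains):
        """Listings held by the most-repeated sub-domain on one SERP."""
        return max(Counter(subdomains).values()) if subdomains else 0

    def bucket_by_dominant_count(serps):
        """Distribution of SERPs keyed by their dominant site's listing count."""
        return Counter(dominant_site_count(s) for s in serps)

    # One site holds three of ten listings -> dominant count of 3
    example = ["a.com", "a.com", "a.com", "b.com", "c.com",
               "d.com", "e.com", "f.com", "g.com", "h.com"]
    print(dominant_site_count(example))  # 3

Bucketing SERPs by that count is what produces the June 6th versus June 7th comparison above.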

The numbers get too small to see at 5K scale after the four-count SERPs, so I’ll restrict the Y-axis:

SERPs with dominant sites owning six to ten organic listings accounted for only 117 of 10,000 SERPs (just over 1 percent) on June 6th. After the update, this actually went up a tiny bit, to 119 SERPs.

We still see SERPs where one site dominates, and that story didn’t change much after the update. That said, these six to ten-count SERPs are fairly rare. Looking at the keywords, we also see that many of them have brand or navigational intent. Here are a few keywords where we still see a ten-count:

  • “kohl’s hours”
  • “macy’s hotel collection”
  • “lowes outlet”
  • “dillard’s sales”
  • “edd unemployment”

Many dominant-intent searches show site-links in the #1 position (which allow up to six additional links from one site). It’s hard to say why Google isn’t using site-links in these extreme cases. These may be situations where the intent isn’t quite as clear, but we can only speculate based on looking at a handful of examples. Keep in mind, too, that Google determines intent algorithmically, so it can shift over time.

This isn’t an easy problem. Site diversity isn’t a lever you can pull in isolation, especially when it’s left to the algorithm. Reducing repetition too much could harm quality, in some cases (especially SERPs with brand intent). Similarly, many algorithm updates unrelated to diversity seem to have unintended consequences for site diversity.

So, what’s the final verdict?

When evaluating site diversity, we have to be careful not to rely too much on anecdotes. Anecdotally, there are definitely SERPs where a single domain seems to have too much power. For example, here’s the main results column on a search for “pure green coffee extract” (I’ve removed a local pack for the purposes of this post):

The shopping results at the top suggest commercial intent, but the organic results are a mix of informational and commercial results. Amazon has a block of five product results, and this is not a situation where the query suggests brand or navigational intent (I haven’t indicated any specific interest in Amazon). 

It’s easy to cherry-pick, and we can certainly say that Google hasn’t solved the problem, but what are the broader results telling us? It’s fair to say that there was some amount of improvement, and the improvement tracked with Google’s public statements. SERPs with three to five results (two to four duplicates) from the dominant site decreased a bit — in most cases, these SERPs still had two results from the dominant site (one duplicate).

Even isolating the change, though, it was fairly small, and there was no improvement for SERPs where six to ten results came from the dominant domain. This may be because many of those queries had strong brand or navigational intent, and the six to ten count SERPs were rare both before and after the update.

While the improvements were real and Google’s statements were technically accurate, the impact of the site diversity update doesn’t feel on par with a pre-announcement and the PR it received. Regarding the state of site diversity in SERPs, Google has made minor improvements but still has work to do.



Moz Blog

Posted in Latest News | Comments Off

Google search update aims to show more diverse results from different domain names

Another Google search update has rolled out this week; this one deals with domain diversity in the search results.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in Latest News | Comments Off

Early data around the Google June 2019 core update shows some winners, losers

This Google update, which began rolling out on Monday, seems like it was pretty big, and the scary part is that it isn’t done rolling out yet.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in Latest News | Comments Off

Google Florida 2.0 Algorithm Update: Early Observations

It has been a while since Google has had a major algorithm update.

They recently announced one which began on the 12th of March.

What changed?

It appears multiple things did.

When Google rolled out the original version of Penguin on April 24, 2012 (primarily focused on link spam) they also rolled out an update to an on-page spam classifier for misdirection.

And, over time, it was quite common for Panda & Penguin updates to be sandwiched together.

If you were Google & had the ability to look under the hood to see why things changed, you would probably want to obfuscate any major update by changing multiple things at once to make reverse engineering the change much harder.

Anyone who operates a single website (& lacks the ability to look under the hood) will have almost no clue about what changed or how to adjust to the algorithms.

In the most recent algorithm update some sites which were penalized in prior “quality” updates have recovered.

Though many of those recoveries are only partial.

Many SEO blogs will publish articles about how they cracked the code on the latest update by publishing charts like the first one without publishing that second chart showing the broader context.

The first penalty any website receives might be the first of a series of penalties.

If Google smokes your site & it does not cause a PR incident & nobody really cares that you are gone, then there is a very good chance things will go from bad to worse to worser to worsterest, technically speaking.

“In this age, in this country, public sentiment is everything. With it, nothing can fail; against it, nothing can succeed. Whoever molds public sentiment goes deeper than he who enacts statutes, or pronounces judicial decisions.” – Abraham Lincoln

Absent effort & investment to evolve FASTER than the broader web, sites which are hit with one penalty will often further accumulate other penalties. It is like compound interest working in reverse – a pile of algorithmic debt which must be dug out of before the bleeding stops.

Further, many recoveries may be nothing more than a fleeting invitation to false hope. To pour more resources into a site that is struggling in an apparent death loop.

The above site which had its first positive algorithmic response in a couple years achieved that in part by heavily de-monetizing. After the algorithm updates already demonetized the website over 90%, what harm was there in removing 90% of what remained to see how it would react? So now it will get more traffic (at least for a while) but then what exactly is the traffic worth to a site that has no revenue engine tied to it?

That is ultimately the hard part. Obtaining a stable stream of traffic while monetizing at a decent yield, without the monetizing efforts leading to the traffic disappearing.

A buddy who owns the above site was working on link cleanup & content improvement on & off for about a half year with no results. Each month was a little worse than the prior month. It was only after I told him to remove the aggressive ads a few months back that he likely had any chance of seeing any sort of traffic recovery. Now he at least has a pulse of traffic & can look into lighter touch means of monetization.

If a site is consistently penalized then the problem might not be an algorithmic false positive, but rather the business model of the site.

The more something looks like eHow, the more fickle Google’s algorithms will be in how they treat it.

Google does not like websites that sit at the end of the value chain & extract profits without having to bear far greater risk & expense earlier into the cycle.

Thin rewrites, largely speaking, don’t add value to the ecosystem. Doorway pages don’t either. And something that was propped up by a bunch of keyword-rich low-quality links is (in most cases) probably genuinely lacking in some other aspect.

Generally speaking, Google would like themselves to be the entity at the end of the value chain extracting excess profits from markets.

This is the purpose of the knowledge graph & featured snippets. To allow the results to answer the most basic queries without third party publishers getting anything. The knowledge graph serves as a floating vertical that eats an increasing share of the value chain & forces publishers to move higher up the funnel & publish more differentiated content.

As Google adds features to the search results (flight price trends, a hotel booking service on the day AirBNB announced they acquired HotelTonight, ecommerce product purchase on Google, shoppable image ads just ahead of the Pinterest IPO, etc.) it forces other players in the value chain to consolidate (Expedia owns Orbitz, Travelocity, Hotwire & a bunch of other sites) or add greater value to remain a differentiated & sought after destination (travel review site TripAdvisor was crushed by the shift to mobile & the inability to monetize mobile traffic, so they eventually had to shift away from being exclusively a reviews site to offer event & hotel booking features to remain relevant).

It is never easy changing a successful & profitable business model, but it is even harder to intentionally reduce revenues further or spend aggressively to improve quality AFTER income has fallen 50% or more.

Some people do the opposite & make up for a revenue shortfall by publishing more lower end content at an ever faster rate and/or increasing ad load. Either of which typically makes their user engagement metrics worse while making their site less differentiated & more likely to receive additional bonus penalties to drive traffic even lower.

In some ways I think the ability for a site to survive & remain through a penalty is itself a quality signal for Google.

Some sites which are overly reliant on search & have no external sources of traffic are ultimately sites which tried to behave too similarly to the monopoly that ultimately displaced them. And over time the tech monopolies are growing more powerful as the ecosystem around them burns down:

If you had to choose a date for when the internet died, it would be in the year 2014. Before then, traffic to websites came from many sources, and the web was a lively ecosystem. But beginning in 2014, more than half of all traffic began coming from just two sources: Facebook and Google. Today, over 70 percent of traffic is dominated by those two platforms.

Businesses which have sustainable profit margins & slack (in terms of management time & resources to deploy) can better cope with algorithmic changes & change with the market.

Over the past half decade or so there have been multiple changes that drastically shifted the online publishing landscape:

  • the shift to mobile, which both offers publishers lower ad yields while making the central ad networks more ad heavy in a way that reduces traffic to third party sites
  • the rise of the knowledge graph & featured snippets which often mean publishers remain uncompensated for their work
  • higher ad loads which also lower organic reach (on both search & social channels)
  • the rise of programmatic advertising, which further gutted display ad CPMs
  • the rise of ad blockers
  • increasing algorithmic uncertainty & a higher barrier to entry

Each one of the above could take a double digit percent out of a site’s revenues, particularly if a site was reliant on display ads. Add them together and a website which was not even algorithmically penalized could still see a 60%+ decline in revenues. Mix in a penalty and that decline can chop a zero or two off the total revenues.

Businesses with lower margins can try to offset declines with increased ad spending, but that only works if you are not in a market with 2 & 20 VC fueled competition:

Startups spend almost 40 cents of every VC dollar on Google, Facebook, and Amazon. We don’t necessarily know which channels they will choose or the particularities of how they will spend money on user acquisition, but we do know more or less what’s going to happen. Advertising spend in tech has become an arms race: fresh tactics go stale in months, and customer acquisition costs keep rising. In a world where only one company thinks this way, or where one business is executing at a level above everyone else – like Facebook in its time – this tactic is extremely effective. However, when everyone is acting this way, the industry collectively becomes an accelerating treadmill. Ad impressions and click-throughs get bid up to outrageous prices by startups flush with venture money, and prospective users demand more and more subsidized products to gain their initial attention. The dynamics we’ve entered is, in many ways, creating a dangerous, high stakes Ponzi scheme.

And sometimes the platform claws back a second or third bite of the apple. Amazon.com charges merchants for fulfillment, warehousing, transaction based fees, etc. And they’ve pushed hard into launching hundreds of private label brands which pollute the interface & force brands to buy ads even on their own branded keyword terms.

They’ve recently jumped the shark by adding a bonus feature where even when a brand paid Amazon to send traffic to their listing, Amazon would insert a spam popover offering a cheaper private label branded product:

Amazon.com tested a pop-up feature on its app that in some instances pitched its private-label goods on rivals’ product pages, an experiment that shows the e-commerce giant’s aggressiveness in hawking lower-priced products including its own house brands. The recent experiment, conducted in Amazon’s mobile app, went a step further than the display ads that commonly appear within search results and product pages. This test pushed pop-up windows that took over much of a product page, forcing customers to either click through to the lower-cost Amazon products or dismiss them before continuing to shop. … When a customer using Amazon’s mobile app searched for “AAA batteries,” for example, the first link was a sponsored listing from Energizer Holdings Inc. After clicking on the listing, a pop-up window appeared, offering less expensive AmazonBasics AAA batteries.”

Buying those Amazon ads was quite literally subsidizing a direct competitor pushing you into irrelevance.

And while Amazon is destroying brand equity, AWS is doing investor relations matchmaking for startups. Anything to keep the current bubble going ahead of the Uber IPO that will likely mark the top in the stock market.

As the market caps of big tech companies climb they need to be more predatory to grow into the valuations & retain employees with stock options at an ever-increasing strike price.

They’ve created bubbles in their own backyards where each raise requires another. Teachers either drive hours to work or live in houses subsidized by loans from the tech monopolies that get a piece of the upside (provided they can keep their own bubbles inflated).

“It is an uncommon arrangement — employer as landlord — that is starting to catch on elsewhere as school employees say they cannot afford to live comfortably in regions awash in tech dollars. … Holly Gonzalez, 34, a kindergarten teacher in East San Jose, and her husband, Daniel, a school district I.T. specialist, were able to buy a three-bedroom apartment for $610,000 this summer with help from their parents and from Landed. When they sell the home, they will owe Landed 25 percent of any gain in its value. The company is financed partly by the Chan Zuckerberg Initiative, Mark Zuckerberg’s charitable arm.”

The above sort of dynamics have some claiming peak California:

The cycle further benefits from the Alchian-Allen effect: agglomerating industries have higher productivity, which raises the cost of living and prices out other industries, raising concentration over time. … Since startups raise the variance within whatever industry they’re started in, the natural constituency for them is someone who doesn’t have capital deployed in the industry. If you’re an asset owner, you want low volatility. … Historically, startups have created a constant supply of volatility for tech companies; the next generation is always cannibalizing the previous one. So chip companies in the 1970s created the PC companies of the 80s, but PC companies sourced cheaper and cheaper chips, commoditizing the product until Intel managed to fight back. Meanwhile, the OS turned PCs into a commodity, then search engines and social media turned the OS into a commodity, and presumably this process will continue indefinitely. … As long as higher rents raise the cost of starting a pre-revenue company, fewer people will join them, so more people will join established companies, where they’ll earn market salaries and continue to push up rents. And one of the things they’ll do there is optimize ad loads, which places another tax on startups. More dangerously, this is an incremental tax on growth rather than a fixed tax on headcount, so it puts pressure on out-year valuations, not just upfront cash flow.

If you live hundreds of miles away the tech companies may have no impact on your rental or purchase price, but you can’t really control the algorithms or the ecosystem.

All you can really control is your mindset & ensuring you have optionality baked into your business model.

  • If you are debt-levered you have little to no optionality. Savings give you optionality. Savings allow you to run at a loss for a period of time while also investing in improving your site and perhaps having a few other sites in other markets.
  • If you operate a single website that is heavily reliant on a third party for distribution then you have little to no optionality. If you have multiple projects, that enables you to shift your attention toward working on whatever is going up and to the right, while letting anything that is failing sit for a while, without becoming overly reliant on something you can’t change. This is why it often makes sense for a brand merchant to operate their own ecommerce website even if 90% of their sales come from Amazon. It gives you optionality should the tech monopoly become abusive or otherwise harm you (even if the intent was benign rather than outright misanthropic).

As the update ensues, Google will collect more data on how users interact with the result set & determine how to weight different signals, along with re-scoring sites that recovered based on the new engagement data.

Recently a Bing engineer named Frédéric Dubut described how they score relevancy signals used in updates:

As early as 2005, we used neural networks to power our search engine and you can still find rare pictures of Satya Nadella, VP of Search and Advertising at the time, showcasing our web ranking advances. … The “training” process of a machine learning model is generally iterative (and all automated). At each step, the model is tweaking the weight of each feature in the direction where it expects to decrease the error the most. After each step, the algorithm remeasures the rating of all the SERPs (based on the known URL/query pair ratings) to evaluate how it’s doing. Rinse and repeat.
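In spirit, the loop Dubut describes is ordinary iterative optimization: score the results, measure the error against human ratings, nudge the feature weights, and repeat. Below is a toy sketch of that cycle with made-up features, ratings, and a simple linear scorer; it only illustrates the general idea and is nothing like a production ranking model.

    # Toy illustration of the "tweak weights, remeasure, rinse and repeat"
    # loop described above. Features, ratings, and the linear scorer are
    # all hypothetical; real ranking models are far more complex.

    def score(weights, features):
        return sum(w * f for w, f in zip(weights, features))

    def train(examples, weights, lr=0.01, steps=1000):
        """examples: list of (feature_vector, human_rating) pairs."""
        for _ in range(steps):
            for features, rating in examples:
                error = score(weights, features) - rating
                # Move each weight in the direction that most reduces the error
                weights = [w - lr * error * f for w, f in zip(weights, features)]
        return weights

    examples = [([1.0, 0.2], 0.9), ([0.3, 0.8], 0.4)]  # (features, rating)
    print(train(examples, weights=[0.0, 0.0]))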

That same process is ongoing with Google now & in the coming weeks there’ll be the next phase of the current update.

So far it looks like some quality-based re-scoring was done & some sites which were overly reliant on anchor text got clipped. On the back end of the update there’ll be another quality-based re-scoring, but the sites that were hit for excessive manipulation of anchor text via link building efforts will likely remain penalized for a good chunk of time.


SEO Book

Posted in Latest News | Comments Off

March 1st Google Update: The Mysterious Case of the 19-Result SERPs

Posted by Dr-Pete

Late last week (Feb 28 – Mar 1), we saw a sizable two-day spike in Google rankings flux, as measured by MozCast. Temperatures on Friday reached 108°F. The original temperature on Thursday was 105°F, but that was corrected down to 99°F (more on that later).

Digging in on Friday (March 1st), we saw a number of metrics shift, but the most notable was a spike in page-one Google SERPs with more than 10 organic results. Across the 10,000 keywords in MozCast, here’s what we observed at the high end:

Counting “organic” results in 2019 is challenging — some elements, like expanded site-links (in the #1 position), Top Stories, and image results can occupy an organic position. In-depth Articles are particularly challenging (more on that in a moment), and the resulting math usually leaves us with page-one SERPs with counts from 4 to 12. Friday’s numbers were completely beyond anything we’ve seen historically, though, with organic counts up to 19 results.

Dissecting the 19-result SERP

Across 10K keywords, we saw 9 SERPs with 19 results. Below is one of the most straightforward (in terms of counting). There was a Featured Snippet in the #0 position, followed by 19 results that appear organic. This is a direct screenshot from a result for “pumpkin pie recipe” on Google.com/US:

Pardon the long scroll, but I wanted you to get the full effect. There’s no clear marker here to suggest that part of this SERP is a non-organic feature or in some way different. You’ll notice, though, that we transition from more traditional recipe results (with thumbnails) to what appear to be a mix of magazine and newspaper articles. We’ve seen something like this before …

Diving into the depths of in-depth

You may not think much about In-depth Articles these days. That’s in large part because they’re almost completely hidden within regular, organic results. We know they still exist, though, because of deep source-code markers and a mismatch in page-one counts. Here, for example, are the last 6 results from today (March 4th) on a search for “sneakers”:

Nestled in the more traditional, e-commerce results at the end of page one (like Macy’s), you can see articles from FiveThirtyEight, Wired, and The Verge. It’s hard to tell from the layout, but this is a 3-pack of In-depth Articles, which takes the place of a single organic position. So, this SERP appears to have 12 page-one results. Digging into the results on March 1st, we saw a similar pattern, but those 3-packs had expanded to as many as 10 articles.
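To make the counting convention concrete, here is a hypothetical sketch: a parsed page-one SERP is a list of elements, and an In-depth pack, however many articles it holds, occupies a single organic position even though each article looks like a separate result. The element names and structure are invented for illustration and are not MozCast’s actual parser.

    # Hypothetical sketch of the counting convention described above: an
    # In-depth pack of any size takes the place of one organic position.
    # Element names are invented for illustration, not MozCast's real parser.

    def count_positions(elements):
        """Organic positions, with an In-depth pack counted as a single slot."""
        return len(elements)

    def count_apparent_results(elements):
        """Naive count that treats every article inside a pack as its own result."""
        return sum(len(items) if kind == "in_depth_pack" else 1
                   for kind, items in elements)

    # Nine traditional results plus a 3-pack: ten positions, twelve apparent results
    serp = [("organic", None)] * 9 + [("in_depth_pack", ["a", "b", "c"])]
    print(count_positions(serp), count_apparent_results(serp))  # 10 12

Under that convention, a pack that balloons to ten articles pushes the apparent count well past ten, which is consistent with the 19-result pages observed on March 1st.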

We retooled the parser to more flexibly detect In-depth Articles (allowing for packs with more than 3 results), and here’s what we saw for prevalence of In-depth Articles over the past two weeks:

Just under 23% of MozCast SERPs on the morning of March 1st had something similar to In-depth Articles, an almost 4X increase from the day before. This number returned to normal (even slightly lower) the next day. It’s possible that our new definition is too broad, and these aren’t really traditional “In-depth” packs, but then we would expect the number to stay elevated. We also saw a large spike in SERP “real-estate” shares for major publications, like the New York Times, which typically dominate In-depth Articles. Something definitely happened around March 1st.

By the new method (removing these results from organic consideration), the temperature for 2/28 dropped from 105°F to 99°F, as some of the unusual results were treated as In-depth Articles and removed from the weather report.

Note that the MozCast temperatures are back-dated, since they represent the change over a 24-hour period. So, the prevalence of In-depth articles on the morning of March 1st is called “3/1” in the graph, but the day-over-day temperature recorded that morning is labeled “2/28” in the graph at the beginning of this post.

Sorting out where to go from here

Is this a sign of things to come? It’s really tough to say. On March 1st, I reached out to Twitter to see if people could replicate the 19-result SERPs and many people were able to, both on desktop and mobile:

This did not appear to be a normal test (which we see roll out to something like 1% or less of searchers, typically). It’s possible this was a glitch on Google’s end, but Google doesn’t typically publicize temporary glitches, so it’s hard to tell.

It appears that the 108°F was, in part, a reversal of these strange results. On the other hand, it’s odd that the reversal was larger than the original rankings flux. At the same time, we saw some other signals in play, such as a drop in image results on page one (about 10.5% day-over-day, which did not recover the next day). It’s possible that an algorithm update rolled out, but there was a glitch in that update.

If you’re a traditional publisher or someone who generally benefits from In-depth Articles, I’d recommend keeping your eyes open. This could be a sign of future intent by Google, or it could simply be a mistake. For the rest of us, we’ll have to wait and see. Fortunately, these results appeared mostly at the end of page one, so top rankings were less impacted, but a 19-result page one would certainly shake up our assumptions about organic positioning and CTR.



Moz Blog


Posted in Latest News | Comments Off

What are you seeing? Help us analyze Google’s March 2019 core algorithm update

Tell us if you’re seeing any impact from Google’s latest algorithm update.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Posted in Latest News | Comments Off

SearchCap: EU search results preview, Google URL inspection tool update & Bing Ads page feeds

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in Latest News | Comments Off

SearchCap: Google News update coming, local features & Search Console stats

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in Latest News | Comments Off
