Tag Archive | "Changed"

How Wrenches Changed the Way I Think about Digital Tools

About a year and a half ago, I made up my mind to rebuild a motorcycle. I had no mechanical…

The post How Wrenches Changed the Way I Think about Digital Tools appeared first on Copyblogger.


Copyblogger


How 6 Months of a Daily Journaling Habit Changed My Writing

I’ve been a sporadic journaler all my life. About once every three months, when inspiration struck, I’d seize my journal and dash out several pages. But most of the time, it sat on my shelf, collecting dust. About six months ago, I decided it was time to start a daily journaling habit. We’ve all heard
Read More…

The post How 6 Months of a Daily Journaling Habit Changed My Writing appeared first on Copyblogger.


Copyblogger


The Day Remarketing Changed Forever

Posted by cmurf

Today, we’re talking about June 25th, 2015: the day Google AdWords changed forever to allow Analytics remarketing audiences to be available for advertisers on search.

In our video blog, we’ll begin by telling you why this is important and explain some of the problems it solves. We’ll give you some examples of how you can use RLSAs (Remarketing Lists for Search Ads), along with four benefits of using them. We’ve also included a simple example of how you can create an audience. And we’ll conclude with the sequel to The Day Google AdWords Changed Forever.

This video is about 25 minutes long — if this hits your TLDW (too long, didn’t watch) limit, you can check out a shortened, related video on elegant remarketing on search, or read through the summary below.



1. AdWords Problems

Those familiar with Google AdWords will know that it is often your website’s best-converting source of traffic (or second-best, behind organic). That’s not to say, however, that Google AdWords is without its problems.

We’ve named the three most common issues we see with AdWords:

  1. The Leaky Bucket
  2. The Generic Keyword Conundrum
  3. You Say Tomato

The Leaky Bucket

The Leaky Bucket refers to instances where your AdWords budget is insufficient to cover all searches for your keywords. This means that you could potentially miss out on conversions due to your ad not showing along a user’s path to purchase.

The Generic Keyword Conundrum

We coined the phrase “Generic Keyword Conundrum” for keywords that are relevant to your business and drive a lot of traffic to your website. However, much of this traffic is top-of-funnel and less likely to convert, so when you have a set budget, it may not always make sense to bid on generic keywords.

You Say Tomato

Then we come to the issue of the keyword itself. The best thing about AdWords is that it’s based on keywords. You choose your keywords, set your bid, and off you go driving traffic to your website. Yet, it can often be difficult to understand the intent behind a user’s search term. For example, if a user searches for “pizza,” it’s difficult to know if they’re searching for “pizza delivery,” a nearby restaurant, or a recipe.

Encapsulating these issues, we can say that the biggest limitation of AdWords has been that we can only target by words (excluding the obvious targeting options of location, device, etc.).

2. RLSAs & Google Analytics

We’re always striving to innovate for our clients here at Wolfgang Digital; to this end, we have weekly learning sessions where we discuss the latest changes in AdWords and how we can introduce these to our clients’ accounts. So we were extremely excited when it was announced on June 25th that Analytics audiences would now become available within Remarketing Lists for Search Ads. Up to that point, you could use RLSAs only with your AdWords remarketing tag. This would now mean that we could use over 200 Google Analytics dimensions and metrics to create audiences for RLSAs.

Building out these audiences meant that we could effectively target people at all stages across the purchase funnel.

3. RLSAs & Audience building

Audience of website visitors

We could create an audience of past website visitors at the awareness stage. This means that when we bid on keywords, we know the user has been on our website and is familiar with our products, services, and price range. This user is more qualified, so we can now tackle the Generic Keyword Conundrum and also ensure that we don’t suffer from the Leaky Bucket.

We can target Facebook traffic within the interest stage. Facebook drives lots of quality traffic, but it still suffers from lower conversion rates. Capturing this audience in Analytics and remarketing to them on search will allow advertisers to extract more value from their Facebook activity.

Case study 1: Brown Thomas

Our first case study comes from a department store in Dublin that wanted visibility on highly competitive cosmetics keywords. We knew from past AdWords performance and insights from their Google Analytics that we would have a tough time getting good coverage on these keywords while keeping the campaigns profitable — so we decided to overlay the campaign with a remarketing list of people who had previously visited the website. This enabled us to improve conversion rate by 1,500% and reduce cost-per-sale by 94%.

Case study 2: iClothing

Our second case study comes from an online clothes retailer called iClothing. We manage their social accounts as well as their AdWords account, and could see that while social conversion rates were low, it was often a touchpoint on a user’s path to purchase. We created a list of users who had come to the iClothing website from one of our Facebook ads. Using this list within AdWords, we were able to show these users specific, Facebook-related ads encouraging them to return to the website to complete their purchase. This strategy allowed us to boost conversion rate by 165%, while reducing cost-per-sale by 84%.

Past purchases & repeat customers

Another smart way to capture an audience is to look at past purchasers and repeat customers. You can define these audiences as users who have made exactly one purchase (“past customers”) and users who have made more than one purchase (“repeat customers”). Plenty of studies indicate that existing customers convert at a higher rate and spend more, so it makes sense to target customers differently than non-customers.
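As a rough illustration (the purchase counts below are made up, and this is plain Python rather than anything Analytics-specific), the segmentation rule reduces to a simple bucketing by transaction count:

    # Sketch: bucket users into "past customer" (exactly one purchase) vs.
    # "repeat customer" (more than one). Counts are illustrative only.
    purchase_counts = {"anna": 1, "ben": 3, "cara": 0, "dev": 2}

    def segment(purchases):
        if purchases == 0:
            return "non-customer"
        return "past customer" if purchases == 1 else "repeat customer"

    for user, count in purchase_counts.items():
        print(f"{user}: {segment(count)}")

In Analytics itself, the equivalent audience definitions would put a condition on the Transactions metric (= 1 vs. > 1).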

Case study 3: McElhinneys

Based on these studies, as well as our own insights from Google Analytics, we implemented a strategy for another one of our clients, McElhinneys, to break our audience down into two distinct categories: one for past customers, and one for users who were yet to purchase from us. This allowed us to target the two audiences in different ways and with different messages, as well as to split our budget as efficiently as possible. The implementation of this strategy led to a 324% boost in conversion rate, a 75% drop in cost-per-sale, and a 300% boost in Return On Ad Spend (ROAS).

Other audiences

There are a few other types of audiences to take note of that you might want to target. These could be cart abandoners, those who’ve visited a certain number of pages, or those who have spent a certain amount of time on your site — these people are more likely to be engaged with your site and your products.

4. How to create an audience

  1. Make sure your Google Analytics account is linked to your Google AdWords account.
  2. Make sure Remarketing is on in the Data Collection setting in your Analytics account (this is essential!).
  3. Go to the “Audiences” settings in the Remarketing section of the Property column on the Admin page of your Google Analytics account.
  4. Click on “+New Audience.”
  5. Select the Analytics view that you want to take this audience from and the AdWords account where you want to use the audience.
  6. You can then either create a new audience based on a number of characteristics, or import an existing segment as an audience.
  7. Once the audience is created, you’ll be able to see it in the Shared Library of your AdWords account.

Creating the audiences is a straightforward process. You need to visit the Remarketing section within Admin in Google Analytics. You then link with the appropriate AdWords account. Next, you can decide whether to use some available audiences, or import some segments already active within your Analytics account — or you can create a new audience.

Creating a new audience is easy. You can build an audience based on demographics, technology, behavior, date of first session, or traffic source. For example, we can create an audience for traffic that arrives with Source = Facebook & Medium = CPC, and this will create an audience of all paid Facebook traffic (assuming you use UTM parameters to tag your paid Facebook traffic with Medium = CPC).
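To make the rule concrete, here’s a minimal sketch of the same Source/Medium filter applied to exported session data. The field names and rows are hypothetical; the real audience is built in the Analytics UI as described above — this just shows the logic:

    # Sketch: preview which visitors a "Source = facebook, Medium = cpc"
    # audience definition would capture. Data and field names are made up.
    sessions = [
        {"client_id": "A1", "source": "facebook", "medium": "cpc"},
        {"client_id": "B2", "source": "google", "medium": "organic"},
        {"client_id": "C3", "source": "facebook", "medium": "cpc"},
        {"client_id": "D4", "source": "facebook", "medium": "referral"},
    ]

    def in_paid_facebook_audience(session):
        # Mirrors the audience rule: Source = facebook AND Medium = cpc.
        return session["source"] == "facebook" and session["medium"] == "cpc"

    audience = {s["client_id"] for s in sessions if in_paid_facebook_audience(s)}
    print(f"Paid Facebook audience: {sorted(audience)}")  # -> ['A1', 'C3']

Note that the untagged Facebook referral (D4) falls outside the audience, which is why the UTM tagging mentioned above matters.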

The next time you visit the Audience section of your AdWords Shared Library, you’ll see your new Analytics audience available.

5. Benefits of RLSAs

Now we’ll look at four benefits of using RLSAs.

Control budget

First up, it allows us to control budget. We can ensure that we place sufficient budget on our best-performing campaigns. We have mentioned using RLSAs for past purchasers — in this case, we can ensure that any campaign targeting past purchasers has full impression share, as this campaign will be most likely to convert.

Control ad message

Our second benefit is that we can control our ad message. For example, after capturing a Facebook audience who has viewed our Spring/Summer dresses, we can tailor our creative by referencing Facebook and a Summer Sale.

Control KPIs

The third benefit is that we can control KPIs. We can now differentiate our campaigns between customer retention and customer acquisition, and designate our KPIs accordingly.

Control targeting

Finally, the fourth benefit allows us to control targeting. We can decide exactly what audiences we are showing our search ads to. We can move beyond mere keyword targeting and now add the layer of an audience, making Google AdWords even more powerful.

6. The Day AdWords Changed Forever: Further developments with RLSAs

But, like all great stories, there is a sequel. The Day AdWords Changed Forever, Part II was when Google moved beyond audience targeting based on Analytics and introduced first-party data with Customer Match. Having witnessed the success of Facebook’s and Twitter’s use of email lists within their platforms, it became inevitable that AdWords would allow advertisers to use email lists on search.

Building audiences via Analytics has its limitations. It is cookie-based, so a user could delete their cookies and we would lose them. Cookies also don’t cross devices, so again we might miss our audience when they switch devices. This makes email lists much more powerful, as we can now target users across devices whenever they’re logged into Google.

There are requirements to enable Customer Match, but if you go through the video blog above, we include a way to hack these requirements.

Customer Match

  • Go to the Audiences section of the Shared Library of your Google AdWords account, click on “+Remarketing List” and click on “Customer Emails.”
  • You can then name your list and upload a .csv file with the email addresses in it (see the preparation sketch after this list).
  • You also need to provide a URL where users can unsubscribe from the list.
  • Define the duration of this audience, i.e. how long a user will remain part of this audience.
  • Click “Upload and save list.”
  • Wait until your list has been processed; if it contains enough valid email addresses, you can then start advertising to that list.
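If you’re preparing that .csv by hand, a small script can normalize the addresses first (trimming whitespace and lowercasing). A minimal sketch with hypothetical file names; Google also accepts SHA-256-hashed addresses if you’d rather not upload them in plain text, hence the optional helper:

    import csv
    import hashlib

    # Sketch: normalize an email list for a Customer Match upload.
    # "subscribers.csv" and "customer_match.csv" are hypothetical names.
    def normalize(email):
        return email.strip().lower()

    def sha256_email(email):
        # Optional: hash after normalizing if you prefer not to upload
        # plain-text addresses (Customer Match accepts SHA-256 hashes).
        return hashlib.sha256(normalize(email).encode("utf-8")).hexdigest()

    with open("subscribers.csv", newline="") as src, \
         open("customer_match.csv", "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(["Email"])  # single-column list of addresses
        for row in csv.reader(src):
            if row:
                writer.writerow([normalize(row[0])])  # or sha256_email(row[0])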

Thanks for watching and reading along today. We’d love to hear your thoughts on RLSAs and AdWords — sound off in the comments section below!



Moz Blog


How Much Has Link Building Changed in Recent Years?

Posted by Paddy_Moogan

I get asked this question a lot. It’s mainly asked by people who are considering buying my link building book and want to know whether it’s still up to date. This is understandable given that the first edition was published in February 2013 and our industry has a deserved reputation for always changing.

I find myself giving the same answer, even though I’ve been asked it probably dozens of times in the last two years—”not that much”. I don’t think this is solely due to the book itself standing the test of time, although I’ll happily take a bit of credit for that :) I think it’s more a sign of our industry as a whole not changing as much as we’d like to think.

I started to question whether I was right, and honestly, that’s one of the reasons it has taken me over two years to release the second edition of the book.

So I posed this question to a group of friends not so long ago, some via email and some via a Facebook group. I was expecting many of them to call me out, because my position was that, in reality, link building hasn’t actually changed that much. The thing is, many of them agreed, and the conversation turned into a pretty long thread full of insights. In this post, I’d like to share some of those insights, explain my position, and talk about what actually has changed.

My personal view

Link building hasn’t changed as much as we think it has.

The core principles of link building haven’t changed. The signals around link building have changed, but mainly around new machine learning developments that have indirectly affected what we do. One thing that has definitely changed is the mindset of SEOs (and now clients) towards link building.

I think the last big change to link building came in April 2012 when Penguin rolled out. This genuinely did change our industry and put to bed a few techniques that should never have worked so well in the first place.

Since then, we’ve seen some things change, but the core principles haven’t changed if you want to build a business that will be around for years to come and not run the risk of being hit by a link-related Google update. For me, these principles are quite simple:

  • You need to deserve links – either an asset you create or your product
  • You need to put this asset in front of a relevant audience who have the ability to share it
  • You need consistency – one new asset every year is unlikely to cut it
  • Anything that scales is at risk

For me, the move towards user data driving search results + machine learning has been the biggest change we’ve seen in recent years and it’s still going.

Let’s dive a bit deeper into all of this and I’ll talk about how this relates to link building.

The typical mindset for building links has changed

I think that most SEOs are coming round to the idea that you can’t get away with building low quality links any more, not if you want to build a sustainable, long-term business. Spammy link building still works in the short-term and I think it always will, but it’s much harder than it used to be to sustain websites that are built on spam. The approach is more “churn and burn” and spammers are happy to churn through lots of domains and just make a small profit on each one before moving onto another.

For everyone else, it’s all about the long-term and not putting client websites at risk.

This has led to many SEOs embracing different forms of link building and generally starting to use content as an asset when it comes to attracting links. A big part of me feels that it was actually Penguin in 2012 that drove the rise of content marketing amongst SEOs, but that’s a post for another day…! For today though, this goes some way towards explaining the trend we see below.

Slowly but surely, I’m seeing clients come to my company already knowing that low quality link building isn’t what they want. It’s taken a few years after Penguin for it to filter down to client / business owner level, but it’s definitely happening. This is a good thing but unfortunately, the main reason for this is that most of them have been burnt in the past by SEO companies who have built low quality links without giving thought to building good quality ones too.

I have no doubt that it’s this change in mindset which has led to trends like this:

The thing is, I don’t think this was by choice.

Let’s be honest. A lot of us used the kind of link building tactics that Google no longer like because they worked. I don’t think many SEOs were under the illusion that it was genuinely high quality stuff, but it worked and it was far less risky to do than it is today. Unless you were super-spammy, the low-quality links just worked.

Fast forward to a post-Penguin world and things are far riskier. For me, it’s because of this that we see trends like the above. As an industry, we had the easiest link building methods taken away from us and we’re left with fewer options. One of the main options is content marketing which, if you do it right, can lead to good quality links and, importantly, the types of links you won’t be removing in the future. Get it wrong and you’ll lose budget and lose the trust of your boss or client in the power of content when it comes to link building.

There are still plenty of other methods to build links and sometimes we can forget this. Just look at this epic list from Jon Cooper. Even with this many tactics still available to us, it’s hard work. Way harder than it used to be.

My summary here is that as an industry, our mindset has shifted but it certainly wasn’t a voluntary shift. If the tactics that Penguin targeted still worked today, we’d still be using them.

A few other opinions…

I definitely think too many people want the next easy win. As someone surfing the edge of what Google is bringing our way, here’s my general take—SEO, in broad strokes, is changing a lot, *but* any given change is more and more niche and impacts fewer people. What we’re seeing isn’t radical, sweeping changes that impact everyone, but a sort of modularization of SEO, where we each have to be aware of what impacts our given industries, verticals, etc.

- Dr. Pete

 

I don’t feel that techniques for acquiring links have changed that much. You can either earn them through content and outreach or you can just buy them. What has changed is the awareness of “link building” outside of the SEO community. This makes link building / content marketing much harder when pitching to journalists and even more difficult when pitching to bloggers.

Link building has to be more integrated with other channels and struggles to work in its own environment unless supported by brand, PR and social. Having other channels supporting your link development efforts also creates greater search signals and more opportunity to reach a bigger audience which will drive a greater ROI.

- Carl Hendy

 

SEO has grown up in terms of more mature staff and SEOs becoming more ingrained into businesses so there is a smarter (less pressure) approach. At the same time, SEO has become more integrated into marketing and has made marketing teams and decision makers more intelligent in strategies and not pushing for the quick win. I’m also seeing that companies who used to rely on SEO and building links have gone through IPOs and the need to build 1000s of links per quarter has rightly reduced.

- Danny Denhard

Signals that surround link building have changed

There is no question about this one in my mind. I actually wrote about this last year in my previous blog post where I talked about signals such as anchor text and deep links changing over time.

Many of the people I asked felt the same; here are some quotes from them, split out by type of signal.

Domain level link metrics

I think domain level links have become increasingly important compared with page level factors, i.e. you can get a whole site ranking well off the back of one insanely strong page, even with sub-optimal PageRank flow from that page to the rest of the site.

- Phil Nottingham

I’d agree with Phil here and this is what I was getting at in my previous post on how I feel “deep links” will matter less over time. It’s not just about domain level links here, it’s just as much about the additional signals available for Google to use (more on that later).

Anchor text

I’ve never liked anchor text as a link signal. I mean, who actually uses exact match commercial keywords as anchor text on the web?

SEOs. :)

Sure, there will be natural links like this, but honestly, I struggle with the idea that it took Google so long to start turning down the dial on commercial anchor text as a ranking signal. They are starting to turn it down though, slowly but surely. Don’t get me wrong, it still matters and it still works. But like pure link spam, the bar is a lot lower now in terms of what constitutes too much.

Rand feels that they matter more than we’d expect and I’d mostly agree with this statement:

Exact match anchor text links still have more power than you’d expect—I think Google still hasn’t perfectly sorted what is “brand” or “branded query” from generics (i.e. they want to start ranking a new startup like meldhome.com for “Meld” if the site/brand gets popular, but they can’t quite tell the difference between that and https://moz.com/learn/seo/redirection getting a few manipulative links that say “redirect”)

- Rand Fishkin

What I do struggle with though, is that Google still haven’t figured this out and that short-term, commercial anchor text spam is still so effective. Even for a short burst of time.

I don’t think link building as a concept has changed loads—but I think links as a signal have, mainly because of filters and penalties but I don’t see anywhere near the same level of impact from coverage anymore, even against 18 months ago.

- Paul Rogers

New signals have been introduced

It isn’t just about established signals changing though, there are new signals too and I personally feel that this is where we’ve seen the most change in Google algorithms in recent years—going all the way back to Panda in 2011.

With Panda, we saw a new level of machine learning where it almost felt like Google had found a way of incorporating human reaction / feelings into their algorithms. They could then run this against a website and answer questions like the ones included in this post. Things such as:

  • “Would you be comfortable giving your credit card information to this site?”
  • “Does this article contain insightful analysis or interesting information that is beyond obvious?”
  • “Are the pages produced with great care and attention to detail vs. less attention to detail?”

It is a touch scary that Google was able to run machine learning against answers to questions like this and write an algorithm to predict the answers for any given page on the web. They have, though, and that was four years ago now.

Since then, they’ve made various moves to utilize machine learning and AI to build out new products and improve their search results. For me, this was one of the biggest changes, and it went pretty unnoticed by our industry. Well, until Hummingbird came along. I feel pretty sure that we have Ray Kurzweil to thank for at least some of that.

There seems to be more weight on theme/topic related to sites, though it’s hard to tell if this is mostly link based or more user/usage data based. Google is doing a good job of ranking sites and pages that don’t earn the most links but do provide the most relevant/best answer. I have a feeling they use some combination of signals to say “people who perform searches like this seem to eventually wind up on this website—let’s rank it.” One of my favorite examples is the Audubon Society ranking for all sorts of birding-related searches with very poor keyword targeting, not great links, etc. I think user behavior patterns are stronger in the algo than they’ve ever been.

- Rand Fishkin

Leading on from what Rand has said, it’s becoming more and more common to see search results that just don’t make sense if you look at the link metrics—but are a good result.

For me, the move towards user data driving search results + machine learning has been the biggest change we’ve seen in recent years, and it’s still going.

Edit: since drafting this post, Tom Anthony released this excellent blog post on his views on the future of search and the shift to data-driven results. I’d recommend reading that as it approaches this whole area from a different perspective and I feel that an off-shoot of what Tom is talking about is the impact on link building.

You may be asking at this point, what does machine learning have to do with link building?

Everything. Because as strong as links are as a ranking signal, Google want more signals, and user signals are far, far harder to manipulate than established link signals. Yes, it can be done—I’ve seen it happen. There have even been a few public tests done. But it’s very hard to scale and I’d venture a guess that only the top 1% of spammers are capable of doing it, let alone maintaining it for a long period of time. When I think about the process for manipulation here, I actually think we go a step beyond spammers, towards hackers and more cut-and-dried illegal activity.

For link building, this means that traditional methods of manipulating signals are going to become less and less effective as these user signals become stronger. For us as link builders, it means we can’t keep searching for that silver bullet or the next method of scaling link building just for an easy win. The fact is that scalable link building is always going to be at risk of penalization by Google—I don’t really want to live a life where I’m always worried about my clients being hit by the next update. Even if Google doesn’t catch up with a certain method, machine learning and user data mean that these methods may naturally become less effective and less cost-efficient over time.

There are of course other things, such as social signals, that have come into play. I certainly don’t feel like these are a strong ranking factor yet, but with deals like this one between Google and Twitter being signed, I wouldn’t be surprised if that ever-growing dataset is used at some point in organic results. The one advantage that Twitter has over Google is its breaking-news freshness. Twitter is still way quicker at breaking news than Google is—140 characters in a tweet is far quicker than Google News! Google know this, which is why I feel they’ve pulled this partnership back into existence after a couple of years apart.

There is another important point to remember here and it’s nicely summarised by Dr. Pete:

At the same time, as new signals are introduced, these are layers, not replacements. People hear social signals or user signals or authorship and want it to be the link-killer, because they already fucked up link-building, but these are just layers on top of on-page and links and all of the other layers. As each layer is added, it can verify the layers that came before it, and what you need isn’t the magic signal but a combination of signals that generally matches what Google expects to see from real, strong entities. So, links still matter, but they matter in concert with other things, which basically means it’s getting more complicated and, frankly, a bit harder. Of course, no one wants to hear that.

- Dr. Pete

The core principles have not changed

This is the crux of everything for me. With all the changes listed above, the key is that the core principles around link building haven’t changed. I could even argue that Penguin didn’t change the core principles because the techniques that Penguin targeted should never have worked in the first place. I won’t argue this too much though because even Google advised website owners to build directory links at one time.

You need an asset

You need to give someone a reason to link to you. Many won’t do it out of the goodness of their heart! One of the most effective ways to do this is to develop a content asset and use this as your reason to make people care. Once you’ve made someone care, they’re more likely to share the content or link to it from somewhere.

You need to promote that asset to the right audience

I really dislike the stance that some marketers take when it comes to content promotion—build great content and links will come.

No. Sorry, but for the vast majority of us, that’s simply not true. The exceptions are people who skydive from space or have huge existing audiences to leverage.

You simply have to spend time promoting your content or your asset for it to get shares and links. It is hard work, and sometimes you can spend a long time on it and get little return, but it’s important to keep working at it until you reach a point where you have two things:

  • A big enough audience where you can almost guarantee at least some traffic to your new content along with some shares
  • Enough strong relationships with relevant websites that you can reach out to when new content is published, with a good chance of them linking to it

Getting to this point is hard—but that’s kind of the point. There are various hacks you can use along the way but it will take time to get right.

You need consistency

Leading on from the previous point: it takes time and hard work to get links to your content—the types of links that stand the test of time and that you won’t be removing in 12 months’ time anyway! This means that you need to keep pushing content out and getting better each and every time. This isn’t to say you should just churn content out for the sake of it, far from it. I am saying that with each piece of content you create, you will learn to do at least one thing better the next time. Try to give yourself the leverage to do this.

Anything scalable is at risk

Scalable link building is exactly what Google has been trying to crack down on for the last few years. Penguin was the biggest move and hit some of the most scalable tactics we had at our disposal. When you scale something, you often lose some level of quality, which is exactly what Google doesn’t want when it comes to links. If you’re still relying on tactics that could fall into the scalable category, I think you need to be very careful and just look at the trend in the types of links Google has been penalizing to understand why.

The part Google plays in this

To finish up, I want to briefly talk about the part that Google plays in all of this and shaping the future they want for the web.

I’ve always tried to steer clear of arguments involving the idea that Google is actively pushing FUD into the community. I’ve preferred to concentrate more on things I can actually influence and change with my clients rather than what Google is telling us all to do.

However, for the purposes of this post, I want to talk about it.

General paranoia has increased. My bet is there are some companies out there carrying out zero specific linkbuilding activity through worry.

- Dan Barker

Dan’s point is a very fair one and just a day or two after reading this in an email, I came across a page related to a client’s target audience that said:

“We are not publishing guest posts on SITE NAME any more. All previous guest posts are now deleted. For more information, see www.mattcutts.com/blog/guest-blogging/.”

I’ve reworded this so as not to reveal the name of the site, but you get the point.

This is silly. Honestly, so silly. They are a good site, they publish good content, and they had good editorial standards. Yet they have ignored all of their own policies, hard work, and objectives to follow a blog post from Matt. I’m 100% confident that it wasn’t sites like this one that Matt was talking about in that blog post.

This is, of course, from the publishers’ angle rather than the link builders’ angle, but it does go to show the effect that statements from Google can have. Google know this so it does make sense for them to push out messages that make their jobs easier and suit their own objectives—why wouldn’t they? In a similar way, what did they do when they were struggling to classify at scale which links are bad vs. good and they didn’t have a big enough web spam team? They got us to do it for them :)

I’m mostly joking here, but you see the point.

The recent, infamous Mobilegeddon update, discussed here by Dr. Pete, is another example of Google pushing out messages that ultimately scared a lot of people into action. To be fair though, despite the apparently small impact so far, I think the broad message from Google is a very serious one.

Because of this, I think we need to remember that Google does have their own agenda and many shareholders to keep happy. I’m not in the camp of believing everything that Google puts out is FUD, but I’m much more sensitive and questioning of the messages now than I’ve ever been.

What do you think? I’d love to hear your feedback and thoughts in the comments.



Moz Blog


10 Genius Ideas That Changed Marketing Forever


This is an excerpt from our new ebook, 100 Ideas That Changed Marketing. Download your free copy if you want to see the other 90 ideas that have changed our industry forever!

At the beginning of this year, we set out to create an infographic that gave a rundown of the history of marketing. And as we looked back, we found that one idea from all the way back in the 1400s — the invention of the printing press that made mass media possible — totally and completely changed the entire trajectory of our industry. Heck, you could argue it made our industry possible!

Read the full story


The Sales Game Has Changed: Here’s How to Adapt


This is a guest post written by Dave Kurlan, a top-rated speaker, best-selling author, and sales development thought leader. His top-rated business blog, Understanding the Sales Force, is read by thousands of sales and marketing leaders.

Read the full story


How Google’s Panda Update Changed SEO Best Practices Forever – Whiteboard Friday

Posted by Aaron Wheeler

It’s here! Google has released Panda update 2.2, just as Matt Cutts said they would at SMX Advanced here in Seattle a couple of weeks ago. This time around, Google has – among other things – improved their ability to detect scraper sites and banish them from the SERPs. Of course, the Panda updates are changes to Google’s algorithm and not merely manual reviews of sites in the index, so there is room for error (causing devastation for many legitimate webmasters and SEOs).

A lot of people ask what parts of their existing SEO practice they can modify and emphasize to recover from the blow, but alas, it’s not that simple. In this week’s Whiteboard Friday, Rand discusses how the Panda updates work and, more importantly, how Panda has fundamentally changed the best practices for SEO. Have you been Panda-abused? Do you have any tips for recuperating? Let us know in the comments!

 

Video Transcription

Howdy, SEOmoz fans. Welcome to another edition of Whiteboard Friday. This week, we’re talking about the very exciting, very interesting, very controversial Google Panda update.

Panda, also known as Farmer, was this update that Google came out with early this year, in 2011, that rejiggered a bunch of search results and pushed a lot of websites down in the rankings, pushed some websites up in the rankings, and people have been concerned about it ever since. It has actually had several updates and new versions of that implementation and algorithm come out. A lot of people have all these questions like, "Ah, what’s going on around Panda?" There have been some great blog posts on SEOmoz talking about some of the technical aspects. But I want to discuss in this Whiteboard Friday some of the philosophical and theoretical aspects and how Google Panda really changes the way a lot of us need to approach SEO.

So let’s start with a little bit of Panda history. Google employs an engineer named Navneet Panda. The guy has done some awesome work. In fact, he was named on a patent application, which Bill Slawski looked into, describing a great way to scale some machine learning algorithms. Now, machine learning algorithms, as you might be aware, are very computationally expensive and they take a long time to run, particularly if you have extremely large data sets, both of inputs and of outputs. If you want, you can research machine learning. It is an interesting, fun technique that computer scientists and programmers use to find solutions to problems. But basically before Panda, machine learning scalability at Google was at level X, and after it was at the much higher level Y. So that was quite nice. Thanks to Navneet, right now they can scale up this machine learning.

What Google can do based on that is take a bunch of sites that people like more and a bunch of sites that people like less, and when I say "like," what I mean is essentially that the quality raters, Google’s quality raters, tell them, "This site is very enjoyable. This is a good site. I’d like to see this high in the search results," versus things where the quality raters say, "I don’t like to see this." Google can say, "Hey, you know what? We can take the intelligence of this quality rating panel and scale it using this machine learning process."

Here’s how it works. Basically, the idea is that the quality raters tell Googlers what they like. They answer all these questions, and you can see Amit Singhal and Matt Cutts were interviewed by Wired Magazine. They talked about some of the things that were asked of these quality raters, like, "Would you trust this site with your credit card? Would you trust the medical information that this site gives you with your children? Do you think the design of this site is good?" All sorts of questions around the site’s trustworthiness, credibility, quality, how much they would like to see it in the search results. Then they compare the difference.

The sites that people like more, they put in one group. The sites that people like less, they put in another group. Then they look at tons of metrics. All these different metrics, numbers, signals, all sorts of search signals that many SEOs suspect come from user and usage data metrics, which Google has not historically used as heavily. But they think that they use those in a machine learning process to essentially separate the wheat from the chaff. Find the ones that people like more and the ones that people like less. Downgrade the ones they like less. Upgrade the ones they like more. Bingo, you have the Panda update.
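To make those mechanics concrete, here is a toy sketch of that kind of quality classifier, assuming scikit-learn and entirely made-up per-site features (ad density, time on page, pages per visit) with rater labels. Google’s real features and model are unknown; this only illustrates the "label a sample, learn the separation, score everything else" pattern Rand describes:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    def synth_sites(n, liked):
        # Made-up feature distributions: "liked" sites have fewer ads,
        # longer visits, and more pages per visit. Illustrative only.
        ads = rng.normal(0.2 if liked else 0.6, 0.1, n)
        time_on_page = rng.normal(90 if liked else 25, 15, n)
        pages = rng.normal(3.0 if liked else 1.1, 0.5, n)
        return np.column_stack([ads, time_on_page, pages])

    # "Quality rater" labels: 1 = liked, 0 = disliked.
    X = np.vstack([synth_sites(200, True), synth_sites(200, False)])
    y = np.array([1] * 200 + [0] * 200)

    model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

    # Score an unseen site: low ad density, long visits, deep browsing.
    print(model.predict_proba([[0.15, 100.0, 3.5]]))  # high "liked" probability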

So, Panda kind of means something new and different for SEO. As SEOs, for a long time you’ve been doing the same kind of classic things. You’ve been building good content, making it accessible to search engines, doing good keyword research, putting those keywords in there, and then trying to get some links to it. But as SEOs, we never really had to think as much or as broadly about, "What is the experience of this website? Is it creating a brand that people are going to love and share and reward and trust?" Now we kind of have to think about that.

It is almost like the job of SEO has been upgraded from SEO to web strategist. Virtually everything you do on the Internet with your website can impact SEO today. That is especially true following Panda. The things that they are measuring are not, oh, these sites have better links than these sites. Some of these sites, in fact, have much better links than these sites. Some of these sites have what you and I might regard, as SEOs, as better content, more unique, robust, quality content, and yet quality raters like them less, or the signals that predict that quality raters will like those sites less are present on those types of sites.

Let’s talk about a few of the specific things that we can be doing as SEOs to help with this new sort of SEO, this broader web content/web strategy portion of SEO.

First off, design and user experience. I know, good SEOs have been preaching design user experience for years because it tends to generate more links, people contribute more content to it, it gets more social signal shares and tweets and all this other sort of good second order effect. Now, it has a first order effect impact, a primary impact. If you can make your design absolutely beautiful, versus something like this where content is buffeted by advertising and you have to click next, next, next a lot. The content isn’t all in one page. You cannot view it in that single page format. Boy, the content blocks themselves aren’t that fun to read, even if it is not advertising that’s surrounding them, even if it is just internal messaging or the graphics don’t look very good. The site design feels like it was way back in the 1990s. All that stuff will impact the ability of this page, this site to perform. And don’t forget, Google has actually said publicly that even if you have a great site, if you have a bunch of pages that are low quality on that site, they can drag down the rankings of the rest of the site. So you should try and block those for us or take them down. Wow. Crazy, right? That’s what a machine learning algorithm, like Panda, will do. It will predictively say, "Hey, you know what? We’re seeing these features here, these elements, push this guy down."

Content quality matters a lot. So a lot of the time, in the SEO world, people will say, "Well, you have to have good, unique, useful content." Not enough. Sorry. It’s just not enough. There are too many people making too much amazing stuff on the Internet for "good and unique and grammatically correct and properly spelled and adequately descriptive" to be enough when it comes to content. If you say, "Oh, I have 50,000 pages about 50,000 different motorcycle parts and I am just going to go to Mechanical Turk or I am going to go outsource, and I want 100 words, two paragraphs, about each one of them, just describing what this part is." You think to yourself, "Hey, I have good unique content." No, you have content that is going to be penalized by Panda. That is exactly what Panda is designed to do. It is designed to say this is content that someone wrote for SEO purposes just to have good unique content on the page, not content that makes everyone who sees it want to share it and say wow. Right?

If I get to a page about a motorcycle part and I am like, "God, not only is this well written, it’s kind of funny. It’s humorous. It includes some anecdotes. It’s got some history of this part. It has great photos. Man, I don’t care at all about motorcycle parts, and yet, this is just a darn good page. What a great page. If I were interested, I’d be tweeting about this, I’d share it. I’d send it to my uncle who buys motorcycles. I would love this page." That’s what you have to optimize for. It is a totally different thing than optimizing for did I use the keyword at least three times? Did I put it in the title tag? Is it included in there? Is the rest of the content relevant to the keywords? Panda changes this. Changes it quite a bit.

Finally, you are going to be optimizing around user and usage metrics. Things like, when people come to your site, generally speaking compared to other sites in your niche or ranking for your keywords, do they spend a good amount of time on your site, or do they go away immediately? Do they spend a good amount of time? Are they bouncing or are they browsing? If you have a good browse rate (people browsing 2, 3, or 4 pages on average on a content site), that’s decent. That’s pretty good. If they’re browsing 1.5 pages on some sites, like maybe specific kinds of news sites, that might actually be pretty good. That might be better than average. But if they are browsing like 1.001 pages, like virtually no one clicks on a second page, that might be weird. That might hurt you. Then there’s your click-through rate from the search results. When people see your title and your snippet and your domain name, and they go, "Ew, I don’t know if I want to get myself involved in that. They’ve got like three hyphens in their domain name, and it looks totally spammy. I’m not going to get involved." Then that click-through rate is probably going to suffer and so are your rankings.
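As a rough aside, the usage metrics Rand lists here reduce to simple ratios; a toy sketch with made-up numbers:

    # Sketch: browse rate, bounce rate, and SERP click-through rate,
    # computed from illustrative (made-up) figures.
    pages_per_visit = [4, 1, 3, 1, 2, 5, 1]  # pages viewed in each visit
    serp_impressions, serp_clicks = 1000, 42

    browse_rate = sum(pages_per_visit) / len(pages_per_visit)
    bounce_rate = sum(1 for p in pages_per_visit if p == 1) / len(pages_per_visit)
    ctr = serp_clicks / serp_impressions

    print(f"avg pages per visit: {browse_rate:.2f}")  # ~2.43
    print(f"bounce rate:         {bounce_rate:.0%}")  # ~43%
    print(f"SERP CTR:            {ctr:.1%}")          # 4.2%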

They are going to be looking at things like the diversity and quantity of traffic that comes to your site. Do lots of people from all around the world or all around your local region, your country, visit your website directly? They can measure this through Chrome. They can measure it through Android. They can measure it through the Google toolbar. They have all these user and usage metrics. They know where people are going on the Internet, where they spend time, how much time they spend, and what they do on those pages. They know about what happens from the search results too. Do people click from a result and then go right back to the search results and perform another search? Clearly, they were unhappy with that. They can take all these metrics and put them into the machine learning algorithm and then have Panda essentially recalculate. This is why Google doesn’t issue updates every day or every week. It is about every 30 or 40 days that a new Panda update comes out, because they are rejiggering all this stuff.

People who get hit by Panda come up to me and say, "God, how are we ever going to get out of Panda? We’ve made all these changes. We haven’t gotten out yet." I’m like, "Well, first off, you’re not going to get out of it until they rejigger the results, and even then there is no way that you are going to get out of it unless you change the metrics around your site." So if you go into your Analytics and you see that people are not spending longer on your pages, they are not enjoying them more, they are not sharing them more, they are not naturally linking to them more, your branded search traffic is not up, your direct type-in traffic is not up; if none of these metrics are going up and yet you think you have somehow fixed the problems that Panda tries to solve for, you probably haven’t.

I know this is frustrating. I know it’s a tough issue. In fact, I think that there are sites that have been really unfairly hit. That sucks and they shouldn’t be and Google needs to work on this. But I also know that I don’t think Google is going to be making many changes. I think they are very happy with the way that Panda has gone from a search quality perspective and from a user happiness perspective. Their searchers are happier, and they are not seeing as much junk in the results. Google likes the way this is going. I think we are going to see more and more of this over time. It could even get more aggressive. I would urge you to work on this stuff, to optimize around these things, and to be ready for this new form of SEO.

Thanks everyone for watching. Look forward to some great comments, questions, feedback in the post. I will see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



SEOmoz Daily SEO Blog


