Archive | Latest News

4 underutilized schema markup opportunities that impact SEO

Contributor Tony Edwards recommends taking advantage of little-used brand, image, app and person schema that indirectly help position a website for better rankings.

The post 4 underutilized schema markup opportunities that impact SEO appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in Latest News | Comments Off

SearchCap: Google more results button, new filters for Search Console & Chrome search suggestions

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google more results button, new filters for Search Console & Chrome search suggestions appeared first on Search Engine Land.




The Bot Plan: Your Guide to Making Conversations Convert

Posted by purna_v

Let’s start off with a quick “True or False?” game:

“By 2020, the average person will have more conversations with their bot than with their spouse.”

True, or false? You may be surprised to learn that speaking more with bots than with our spouses is precisely what Gartner is predicting.

And when Facebook’s Mark Zuckerberg says “messaging is one of the few things that people do more than social networking,” it requires no leap of faith to see that chatbots are an integral part of marketing’s future.

But you don’t need to stock up on canned peaches and head for the hills because “the robots are coming.” The truth is, the robots aren’t coming because they’re already here, and they love us from the bottom of their little AI-powered hearts.

Bots aren’t a new thing in many parts of the world, such as China and India. As reported by Business Insider, 67% of consumers worldwide have used a chatbot for customer support in the last year.

Within the United States, an impressive 60% of millennials have used chatbots, with 70% of those reporting positive experiences, according to Forbes.

There’s no putting bots back in the box.

And it’s not just that brands have to jump on board to keep up with those pesky new generations, either. Bots are great for them, too.

Bots offer companies:

  1. A revolutionary way to reach consumers. For the first time in history, brands of any size can reach consumers on a personal level. Note my emphasis on “of any size.” You can be a company of one and your bot army can give your customers a highly personal experience. Bots are democratizing business!
  2. Snackable data. This “one-to-one” communication gives you personal insights and specificity, plus a whole feast of snackable data that is actionable.
  3. Non-robot-like interaction. An intelligent bot can keep up with back-and-forth customer messages in a natural, contextual, human way.
  4. Savings. According to Juniper Research, the average time saving per chatbot inquiry compared to traditional call centers is over four minutes, which has the potential to make a truly extraordinary impact on a company’s bottom line (not to mention the immeasurable impact it has on customers’ feelings about the company).
  5. Always on. It doesn’t matter what time zone your customer is in. Bots don’t need to sleep, or take breaks. Your company can always be accessible via your friendly bot.

Here in the West, we are still in the equivalent of the Jurassic Period for bots. What they can be used for is truly limited only by our imagination.

One of my most recent favorites is an innovation from the BBC News Labs and Visual Journalism teams, who have launched a bot-builder app designed to, per Nieman Lab, “make it as easy as possible for reporters to build chatbots and insert them in their stories.”

So, in a story about President Trump from earlier this year, you see this:

Source: BBC.com

It’s one of my favorites not just because it’s innovative and impressive, but because it neatly illustrates how bots can add to and improve our lives… not steal our jobs.

Don’t be a dinosaur

A staggering 80% of brands will use chatbots for customer interactions by 2020, according to research. That means that if you don’t want to get left behind, you need to join the bot arms race right now.

“But where do I start?” you wonder.

I’m happy you asked that. Building a bot may seem like an endeavor that requires lots of tech savvy, but it’s surprisingly low-risk to get started.

Many websites allow you to build bots for free, and then there’s QNAMaker.ai (created by Microsoft, my employer), which does a lot of the work for you.

You simply input your company’s FAQ section, and it builds the foundation for an easy chatbot that can be taken live via almost any platform, using natural language processing to parse your FAQ and develop a list of questions your customers are likely to ask.
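To make that idea concrete, here is a toy sketch of the FAQ-matching approach in Python. This is not the QnAMaker.ai API; the FAQ entries, the tokenizer, and the word-overlap scoring are all illustrative assumptions, but they show the basic mechanic of turning an FAQ into a question-answering bot.

```python
# Toy FAQ-matching bot: a highly simplified stand-in for what a
# service like QnA Maker automates. The FAQ entries and scoring
# below are illustrative assumptions, not the real QnA Maker API.

def tokenize(text):
    """Lowercase a sentence and split it into a set of words."""
    return set(text.lower().replace("?", "").split())

# A small FAQ knowledge base: question -> answer.
FAQ = {
    "What are your opening hours": "We're open 9am-6pm, Monday to Saturday.",
    "How do I book an appointment": "You can book online or message us here.",
    "Do you offer gift cards": "Yes! Gift cards are available in any amount.",
}

def answer(user_question, threshold=0.2):
    """Return the answer whose FAQ question best overlaps the user's words."""
    user_words = tokenize(user_question)
    best_score, best_answer = 0.0, None
    for question, reply in FAQ.items():
        words = tokenize(question)
        # Jaccard similarity: shared words / total distinct words.
        score = len(user_words & words) / len(user_words | words)
        if score > best_score:
            best_score, best_answer = score, reply
    if best_score < threshold:
        return "Sorry, I didn't catch that - a human will follow up shortly."
    return best_answer

print(answer("when are you open? what are your hours"))
# prints: We're open 9am-6pm, Monday to Saturday.
```

A production service replaces the naive word overlap with natural language processing, but the pipeline is the same: FAQ in, ranked candidate answers out, with a fallback to a human when confidence is low.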

This is just the beginning — the potential for bots is wow-tastic.

That’s what I’m going to show you today — how you can harness bot-power to build strong, lasting relationships with your customers.

Your 3-step plan to make conversations convert

Step 1: Find the right place to start

The first step isn’t to build a bot straightaway. After all, you can build the world’s most elaborate bot and it is worth exactly nothing to you or your customer if it does not address their needs.

That’s why the first step is figuring out the ways bots can be most helpful to your customers. You need to find their pain points.

You can do this by pretending you’re one of your customers and navigating through your purchase funnel. Or better yet, find data within your CRM system and analytics tools that can help you answer key questions about how your audience interacts with your business.

Here’s a handy checklist of questions you should get answers to during this research phase:

  • How do customers get information or seek help from your company? ☑
  • How do they make a purchase? ☑
  • Do pain points differ across channels and devices? ☑
  • How can we reduce the number of steps in each interaction? ☑

Next, you’ll want to build your hypothesis. And here’s a template to help you do just that:

I believe [type of person] needs to solve [problem] which happens while [situation], which will allow them to [get value].

For example, say you’re the manager of a small spa whose biggest time-suck is people calling to ask simple questions, which leaves other customers on hold for a long time. If those callers can ask a bot these simple questions instead, you get three important results:

  1. The hold time for customers overall will diminish
  2. The customer-facing staff in your spa will be able to pay more attention to clients who are physically in front of them
  3. Customers with lengthier questions will be helped sooner

Everybody wins.

Finally, now that you’ve identified and prioritized the situations where conversation can help, you’ll be ready to build a bot as well as a skill.

Wait a minute — what’s a skill in this context, and how does it relate to bots? Here’s a great explanation from Chris Messina:

  • A bot is an autonomous program on a network
  • A chatbot is a bot that uses human language to communicate
  • An AI assistant is a chatbot that performs tasks or services for an individual
  • A skill is a capability that an AI assistant can learn

Each of them can help look things up, place orders, solve problems, and make things happen more easily, better, and faster.

There are several handy resources online for building a bot, including the QnAMaker.ai tool mentioned above.

Step 2: Add conversation across the entire customer journey

There are three distinct areas of the customer decision journey where bots and skills can make a big difference.

Bot as introducer

Bots can help your company by being present at the very first event in a purchase path.

Adidas did this wonderfully when they designed a chatbot for their female-focused community Studio LDN, to help create an interactive booking process for the free fitness sessions offered. To drive engagement further, as soon as a booking was made the user would receive reminders and messages from influencer fitness instructors.

The chatbot was the only way for people to book these sessions and it worked spectacularly well.

In the first two weeks, 2,000 people signed up to participate, with repeat use at 80%. Retention after week one was 60%, which the brand says far outperforms a comparable app.

Adidas did something really clever: they promoted the bot across many of their other channels to boost its discoverability.

You can do the same.

There are countless examples where bots can put their best suit on and act as the first introduction to your company:

  • Email marketing: According to MailChimp research, average email open rates are between 15% and 26%, with click rates just a fraction of that, at roughly 2%–5%. That’s pretty low when you compare it to Messenger messages, which can have an open rate of well over 90%. Why not make the call-to-action within your email an incentive for people to engage with your chatbot? For example, something like “message us for 10% off” could be a compelling reason for people to engage with your chatbot.
  • Social media: How about instead of running Facebook ads which direct people to websites, you run an ad connecting people to bots instead? For example, in the ad, advise people to “chat to see the latest styles” or “chat now to get 20% off” and then have your bot start a conversation. Instant engagement! Plus, it’s a more gentle call-to-action as opposed to a hard sell such as “buy now.”
  • Video: How about creating instructional YouTube videos on how to use your bot? Especially helpful since one of the barriers to using this new technology is a lack of awareness about how to use it. A short, quick video that demonstrates what your skill can do could be very impactful. Check out this great example from FitBit and Cortana:

  • Search: As you’ve likely seen by now, Bing has been integrating chatbots within the SERPs themselves. You can search for bots across different platforms and add relevant bots directly to your preferred platform right from the search results:

Travel Bots

  • You can engage with local businesses such as restaurants via the Bing Business bot that shows up as part of the local listings:

Monsoon Seattle search with chatbot

The key lesson here is that when your bot is acting as an introducer, give your audience plenty of ways and reasons to chat. Use conversation to tell people about new stuff, and get them to kick off that conversation.

Bot as influencer

To see a bot acting as an effective influencer, let’s turn to Chinese giant Alibaba. They developed a customizable chatbot store concierge that they offer free to brands and markets.

Cutely named dian xiao mi, or “little shop bee,” the concierge is designed to be the most helpful store assistant you could wish for.

For example, if a customer interacting with a clothing brand uploads a photograph of a t-shirt, the bot buzzes in with suggestions of pants to match. Or, if a customer provides his height and weight, the bot can offer suggested sizing. Anyone who has ever shopped online for clothing knows exactly how much pain the latter offering could eliminate.

This helpful style is essentially changing the conversation from “BUY NOW!” to “What do you need right now?”

We should no longer ask: “How should we sell to customers?” The gazillion-dollar question instead is: How can we connect with them?

An interesting thing about this change is that, when you think about it for a second, it seems like common sense. How much more trust would you have for a brand that was only trying to help you? If you bought a red dress, how much more helpful would it be if the brand showed you a pic of complementary heels and asked if you want to “complete the look”?

For the chatbot to be truly helpful as an influencer, it needs to learn from each conversation. It needs to remember what you shared from the last conversation, and use it to shape future conversations.

So, say a chatbot from my favorite shoe store knew all about my shoe addiction (is there a cure? Would I even want to be cured of it?), then it could be more helpful via its remarketing efforts.

Imagine how much more effective it would be if we could have an interaction like this:

Shoestore Chatbot: Hi Purna! We’re launching a new collection of boots. Would you like a sneak peek?

Me: YES please!!!

Shoestore Chatbot: Great! I’ll email pics to you. You can also save 15% off your next order with code “MozBlog”. Hurry, code expires in 24 hours.

Me: *buys all the shoes, obvs*

This is Bot-topia. Your brand is being helpful, not pushy. Your bot is cultivating relationships with your customers, not throwing ads at them.

The key lesson here? For your bot to be a successful influencer, you must always consider how they can be helpful and how they can add value.

Bot as closer

Bot: “A, B, C. Always be closing.”

Imagine you want to buy flowers for Mother’s Day, but you have very little interest in flowers. As you scroll through the endless options on the website, and then face a long checkout form, you just feel overwhelmed.

1-800-Flowers found your pain point, and acted on it by creating a bot for Facebook Messenger.

It asks you whether you want to select a bunch from one of their curated collections, instantly eliminating the choice paralysis that could see consumers leave the website without purchasing anything.

And once you’ve chosen, you can easily complete the checkout process using your phone’s payment system (e.g. Apple Pay) to make checkout a cinch. So easy, and so friction-free.

The result? According to Digiday, within two months of launch, 70% of the orders through the bot came from brand-new customers. By building a bot, 1-800-Flowers slam-dunked its way into the hearts of a whole new, young demographic.

Can you think of a better, less expensive way to unlock a big demographic? I can’t.

To quote Mr. Zuckerberg again: “It’s pretty ironic. To order from 1-800-Flowers, you never have to call 1-800-Flowers again.”

Think back to that handy checklist of questions from Step 1, especially this one: “How can we reduce the number of steps in each interaction?”

Your goal is to make every step easy and empathetic.

Think of what people would want or need to know as they complete their tasks. For example, if a customer is looking to transfer money from their bank account, a banking chatbot could save them from overdraft fees by warning them, before they make the transfer, that the account could be overdrawn.

The key lesson here: Leverage your bots to remove any friction and make the experience super relevant and empathetic.

Step 3: Measure the conversation with the right metrics

One of my favorite quotes around how we view metrics versus how we should view metrics comes from Automat CEO Andy Mauro, who says:

“Rather than tracking users with pixels and cookies, why not actually engage them, learn about them, and provide value that actually meets their needs?”

Again, this is common sense once you’ve read it. Of course it makes sense to engage our users and provide value that meets their needs!

We can do this because the bots and skills give us information in our customers’ own words.

Here’s a short list of KPIs that you should look at (let’s call it “bot-alytics”):

  • Delivery and open rates: If the bot starts a conversation, did your customer open it?
  • Click rates: If your bot delivered a link in a chat, did your customer click on it?
  • Retention: How often do they come back and chat with you?
  • Top messages: What messages are resonating with your customers more than others?
  • Conversion rates: Do they buy?
  • Sentiment analysis: Do your customers express happiness and enthusiasm in their conversation with the bot, or frustration and anger?
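As a sketch of what “bot-alytics” might look like in practice, here is a minimal Python example that computes open and click rates from a hypothetical event log. The event schema is invented for illustration; real platforms expose similar data through their own analytics APIs.

```python
# Sketch of "bot-alytics": computing two of the KPIs above from a
# chat event log. The event structure is a made-up assumption.
from collections import Counter

events = [
    {"user": "ann", "type": "message_sent"},
    {"user": "ann", "type": "message_opened"},
    {"user": "ann", "type": "link_clicked"},
    {"user": "bob", "type": "message_sent"},
    {"user": "bob", "type": "message_opened"},
    {"user": "cat", "type": "message_sent"},
]

def rate(numerator_type, denominator_type):
    """Ratio of one event type to another across the whole log."""
    counts = Counter(e["type"] for e in events)
    return counts[numerator_type] / counts[denominator_type]

open_rate = rate("message_opened", "message_sent")   # 2 of 3 sends opened
click_rate = rate("link_clicked", "message_opened")  # 1 of 2 opens clicked
print(f"open rate: {open_rate:.0%}, click rate: {click_rate:.0%}")
# prints: open rate: 67%, click rate: 50%
```

Retention, top messages, and conversion rates are the same kind of aggregation over a richer log (timestamps, message text, order IDs); sentiment analysis would plug a text classifier into the same pipeline.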

Using bot-alytics, you can easily build up a clear picture of what is working for you, and more importantly, what is working for your customer.

And don’t forget to ask: What can you learn from bot-alytics that can help other channels?

The future’s bright, the future’s bots

What were once dumb machines are now smart enough that we can engage with them in a very human way. It presents the opportunity of a generation for businesses of all shapes and sizes.

Our customers are beginning to trust bots and digital personal assistants with their recommendations, needs, and more. These are the friendly neighborhood machines of the utopian vision of a robotic future, and they should be available to people anywhere: from any device, in any way.

And if that hasn’t made you pencil in a “we need to talk about bots” meeting with your company, here’s a startling prediction from Accenture. They believe that in five years, more than half of your customers will select your services based on your AI instead of your traditional brand.

In three steps, you can start your journey toward bot-topia and having your conversations convert. What are you waiting for?

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


Why Great Content Alone Isn’t Enough to Build an Audience

A couple of weeks ago, I wrote a blog post about creating content that earns your audience’s attention. Mark Schaefer swung by and left a comment, and he made a point that is dear to our hearts at Copyblogger: “Outstanding content is not the finish line, it’s the starting line.” I told
Read More…

The post Why Great Content Alone Isn’t Enough to Build an Audience appeared first on Copyblogger.


Copyblogger


Want to Speak at MozCon 2018? Here’s Your Chance – Pitch to Be a Community Speaker!

Posted by Danielle_Launders

MozCon 2018 is nearing and it’s almost time to brush off that microphone. If speaking at MozCon is your dream, then we have the opportunity of a lifetime for you! Pitch us your topic and you may be selected to join us as one of our six community speakers.

What is a community speaker, you ask? MozCon sessions are by invite only, meaning we reach out to select speakers for the majority of our talks. But every year we reserve six 15-minute community speaking slots, where we invite anyone in the SEO community to pitch to present at MozCon. These sessions are both an attendee favorite and a fabulous opportunity to break into the speaking circuit.

Katie Cunningham, one of last year’s community speakers, on stage at MozCon 2017

Interested in pitching your own idea? Read on for everything you need to know:

The details

  • Fill out the community speaker submission form
  • Only one submission per person — make sure to choose the one you’re most passionate about!
  • Pitches must be related to online marketing and for a topic that can be covered in 15 minutes
  • Submissions close on Sunday, April 22nd at 5pm PDT
  • All decisions are final
  • All speakers must adhere to the MozCon Code of Conduct
  • You’ll be required to present in Seattle at MozCon

Ready to pitch your idea?

If you submit a pitch, you’ll hear back from us regardless of your acceptance status.

What you’ll get as a community speaker:

  • 15 minutes on the MozCon stage for a keynote-style presentation, followed by 5 minutes of Q&A
  • A free ticket to MozCon (we can issue a refund or transfer if you have already purchased yours)
  • Four nights of lodging covered by Moz at our partner hotel
  • Reimbursement for your travel — up to $500 for domestic and $750 for international travel
  • An additional free MozCon ticket for you to give away, plus a code for $300 off of one ticket
  • An invitation for you and your significant other to join us for the pre-event speakers dinner

The selection process:

We have an internal committee of Mozzers that reviews every pitch. In the first phase, we review only the topics to ensure they’re a good fit for our audience. After that, we look at the entirety of the pitch to get a comprehensive idea of what to expect from your talk on the MozCon stage.

Want some advice for perfecting your pitch?

  • Keep your pitch focused on online marketing. The more actionable the pitch, the better.
  • Be detailed! We want to know the actual tactics our audience will be learning about. Remember, we receive a ton of pitches, so the more you can explain, the better!
  • Review the topics already being presented — we’re looking for something new to add to the stage.
  • Keep the pitch to under 1,200 characters. We’re strict with this limit; even the best pitches will be disqualified if they don’t abide by the rules.
  • No pitches will be evaluated in advance, so please don’t ask :)
  • Using social media to lobby your pitch won’t help. Instead, put your time and energy into the actual pitch itself!
  • Linking to a previous example of a slide deck or presentation isn’t required, but it does help the committee a ton.

You’ve got this!

This could be you.

If your pitch is selected, the MozCon team will help you along the way. Whether this is your first time on stage or your twentieth, we want this to be your best talk to date. We’re here to answer questions that may come up and to work with you to deliver something you’re truly proud of. Here are just a handful of ways that we’re here to help:

  • Topic refinement
  • Helping with your session title and description
  • Reviewing any session outlines and drafts
  • Providing plenty of tips around best practices — specifically with the MozCon stage in mind
  • Comprehensive show guide
  • Being available to listen to you practice your talk
  • Reviewing your final deck
  • A full stage tour on Sunday to meet our A/V crew, see your presentation on the big screens, and get a feel for the show
  • An amazing 15-person A/V team

Make your pitch to speak at MozCon!

We can’t wait to see what y’all come up with. Best of luck!



Moz Blog

How to Stop Wishing You Had More Time to Write

“If only I had all the time in the world, my blog would be perfect.” That thought has probably crossed your mind more than once. I know it’s crossed mine. I find myself lost in daydreams about how amazing my motorcycle blog could be — if only I had more time. When writerly productivity is
Read More…

The post How to Stop Wishing You Had More Time to Write appeared first on Copyblogger.


Copyblogger


How Mobile-First Indexing Disrupts the Link Graph

Posted by rjonesx.

It’s happened to all of us: you bring up a webpage on your mobile device, only to find that a feature you were accustomed to using on desktop simply isn’t available on mobile. While frustrating, it has always been a struggle for web developers and designers alike to simplify and condense a site for mobile screens without stripping features or content that would otherwise clutter the smaller viewport. The worst-case scenario for these trade-offs is that some features are reserved for desktop environments, or perhaps a user can opt out of the mobile view. Below is an example of how my personal blog displays the mobile version using HandHeld, a popular plugin by ElegantThemes. As you can see, the page is heavily stripped down and far easier to read… but at what cost? And at what cost to the link graph?

My personal blog drops 75 of the 87 links, and all of the external links, when the mobile version is accessed. So what happens when the mobile versions of sites become the primary way the web is accessed, at scale, by the bots which power major search engines?

Google’s announcement to proceed with a mobile-first index raises new questions about how the link structure of the web as a whole might be influenced once these truncated web experiences become the first (and sometimes only) version of the web Googlebot encounters.

So, what’s the big deal?

The concern, which no doubt Google engineers have studied internally, is that mobile websites often remove content and links in order to improve user experience on a smaller screen. This abbreviated content fundamentally alters the link structure which underlies one of the most important factors in Google’s rankings. Our goal is to try and understand the impact this might have.

Before we get started, one giant unknown variable which I want to be quick to point out is we don’t know what percentage of the web Google will crawl with both its desktop and mobile bots. Perhaps Google will choose to be “mobile-first” only on sites that have historically displayed an identical codebase to both the mobile and desktop versions of Googlebot. However, for the purposes of this study, I want to show the worst-case scenario, as if Google chose not only to go “mobile-first,” but in fact to go “mobile-only.”

Methodology: Comparing mobile to desktop at scale

For this brief research, I decided to grab 20,000 random websites from the Quantcast Top Million. I would then crawl two levels deep, spoofing both the Google mobile and Google desktop versions of Googlebot. With this data, we can begin to compare how different the link structure of the web might look.
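A minimal version of this methodology can be sketched with the Python standard library: fetch a page twice, once with each Googlebot user-agent string, and diff the anchor links found in each response. The user-agent strings below are Google’s published desktop and smartphone Googlebot strings at the time of writing; a real 20,000-site, two-level crawl would also need rate limiting, robots.txt handling, and a crawl queue.

```python
# Sketch: compare the links a page serves to desktop vs. mobile
# Googlebot by spoofing the User-Agent header. Standard library only.
from html.parser import HTMLParser
from urllib.request import Request, urlopen

DESKTOP_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
              "+http://www.google.com/bot.html)")
MOBILE_UA = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
             "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
             "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
             "+http://www.google.com/bot.html)")

class LinkCollector(HTMLParser):
    """Collect every href value from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(value)

def links_for(url, user_agent):
    """Fetch a URL with a spoofed User-Agent and return its links."""
    html = urlopen(Request(url, headers={"User-Agent": user_agent})).read()
    collector = LinkCollector()
    collector.feed(html.decode("utf-8", errors="replace"))
    return collector.links

def compare(url):
    """Diff the link sets served to the desktop and mobile bots."""
    desktop = links_for(url, DESKTOP_UA)
    mobile = links_for(url, MOBILE_UA)
    return {
        "shared": desktop & mobile,
        "desktop_only": desktop - mobile,
        "mobile_only": mobile - desktop,
    }

# compare("https://example.com/")  # requires network access
```

Any page where `desktop_only` is non-empty is contributing links to the desktop link graph that a mobile-only crawler would never see.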

Homepage metrics

Let’s start with some descriptive statistics of the home pages of these 20,000 randomly selected sites. Of the sites analyzed, 87.42% had the same number of links on their homepage regardless of whether the bot was mobile- or desktop-oriented. Of the remaining 12.58%, 9% had fewer links and 3.58% had more. This doesn’t seem too disparate at first glance.

Perhaps more importantly, only 79.87% had identical links on the homepage when visited by desktop and mobile bots. Just because the same number of links were found didn’t mean they were actually the same links. This is important to take into consideration because links are the pathways which bots use to find content on the web. Different paths mean a different index.

Among the homepage links, we found a 7.4% drop in external links. This could mean a radical shift in some of the most important links on the web, given that homepage links often carry a great deal of link equity. Interestingly, the biggest “losers” as a percentage tended to be social sites. In retrospect, it seems reasonable that one of the common types of links a website might remove from their mobile version would be social share buttons because they’re often incorporated into the “chrome” of a page rather than the content, and the “chrome” often changes to accommodate a mobile version.

The biggest losers as a percentage in order were:

  1. linkedin.com
  2. instagram.com
  3. twitter.com
  4. facebook.com

So what’s the big deal about 5–15% differences in links when crawling the web? Well, it turns out that these numbers tend to be biased towards sites with lots of links that don’t have a mobile version. Most of those links, however, are main navigation links: when you crawl deeper, you just find the same links. But the sites that do deviate end up having radically different second-level crawl links.

Second-level metrics

Now this is where the data gets interesting. As we continue to crawl out on the web using crawl sets that are influenced by the links discovered by a mobile bot versus a desktop bot, we’ll continue to get more and more divergent results. But how far will they diverge? Let’s start with size. While we crawled an identical number of home pages, the second-tier results diverged based on the number of links found on those original home pages. Thus, the mobile crawlset was 977,840 unique URLs, while the desktop crawlset was 1,053,785. Already we can see a different index taking shape — the desktop index would be much larger. Let’s dig deeper.

I want you to take a moment and really focus on this graph. Notice there are three categories:

  • Mobile Unique: Blue bars represent unique items found by the mobile bot
  • Desktop Unique: Orange bars represent unique items found by the desktop bot
  • Shared: Gray bars represent items found by both

Notice also that there are four tests:

  • Number of URLs discovered
  • Number of Domains discovered
  • Number of Links discovered
  • Number of Root Linking Domains discovered

Now here is the key point, and it’s really big. There are more URLs, Domains, Links, and Root Linking Domains unique to the desktop crawl result than there are shared between the desktop and mobile crawler. The orange bar is always taller than the gray. This means that by just the second level of the crawl, the majority of link relationships, pages, and domains are different in the indexes. This is huge. This is a fundamental shift in the link graph as we have come to know it.
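The three categories in that graph come down to simple set arithmetic over the two crawl results. This sketch uses invented URLs and plain hostnames rather than true root linking domains (which would require a public-suffix list), but the shared/unique breakdown is the same calculation.

```python
# Sketch: shared vs. unique URLs and domains between a mobile and a
# desktop crawl. The URLs are illustrative stand-ins for crawl output.
from urllib.parse import urlsplit

mobile_urls = {"https://a.com/", "https://a.com/p1", "https://b.co.uk/"}
desktop_urls = {"https://a.com/", "https://a.com/p1",
                "https://a.com/p2", "https://b.co.uk/", "https://c.com/"}

def hostnames(urls):
    """Hostname of each URL (a rough stand-in for root linking domains)."""
    return {urlsplit(u).hostname for u in urls}

def breakdown(mobile, desktop):
    """Counts for the three bar categories: shared and each bot's unique finds."""
    return {
        "shared": len(mobile & desktop),
        "mobile_unique": len(mobile - desktop),
        "desktop_unique": len(desktop - mobile),
    }

url_stats = breakdown(mobile_urls, desktop_urls)
domain_stats = breakdown(hostnames(mobile_urls), hostnames(desktop_urls))
print(url_stats)     # {'shared': 3, 'mobile_unique': 0, 'desktop_unique': 2}
print(domain_stats)  # {'shared': 2, 'mobile_unique': 0, 'desktop_unique': 1}
```

In this toy data, as in the study, the desktop-unique bar outgrows the mobile-unique one: the desktop crawl simply surfaces pages and domains the mobile crawl never reaches.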

And now for the big question, what we all care about the most — external links.

A whopping 63% of external links are unique to the desktop crawler. In a mobile-only crawling world, the total number of external links was halved.

What is happening at the micro level?

So, what’s really causing this huge disparity in the crawl? Well, we know it has something to do with a few common shortcuts to making a site “mobile-friendly,” which include:

  1. Subdomain versions of the content that have fewer links or features
  2. The removal of links and features by user-agent detecting plugins

Of course, these changes might make the experience better for your users, but it does create a different experience for bots. Let’s take a closer look at one site to see how this plays out.

This site has ~10,000 pages according to Google and has a Domain Authority of 72 and 22,670 referring domains according to the new Moz Link Explorer. However, the site uses a popular WordPress plugin that abbreviates the content down to just the articles and pages on the site, removing links from descriptions in the articles on the category pages and removing most if not all extraneous links from the sidebar and footer. This particular plugin is used on over 200,000 websites. So, what happens when we fire up a six-level-deep crawl with Screaming Frog? (It’s great for this kind of analysis because we can easily change the user-agent and restrict settings to just crawl HTML content.)

The difference is shocking. First, notice that in the mobile crawl on the left, there is clearly a low number of links per page and that number of links is very steady as you crawl deeper through the site. This is what produces such a steady, exponential growth curve. Second, notice that the crawl abruptly ended at level four. The site just didn’t have any more pages to offer the mobile crawler! Only ~3,000 of the ~10,000 pages Google reports were found.

Now, compare this to the desktop crawler. It explodes in pages at level two, collecting nearly double the total pages of the mobile crawl at this level alone. Now, recall the graph before showing that there were more unique desktop pages than there were shared pages when we crawled 20,000 sites. Here is confirmation of exactly how it happens. Ultimately, 6x the content was made available to the desktop crawler in the same level of crawl depth.

But what impact did this have on external links?

Wow. 75% of the external, outbound links were culled in the mobile version. 4,905 external links were found in the desktop version while only 1,162 were found in the mobile. Remember, this is a DA 72 site with over twenty thousand referring domains. Imagine losing that link because the mobile index no longer finds the backlink. What should we do? Is the sky falling?

Take a deep breath

Mobile-first isn’t mobile-only

The first important caveat to all this research is that Google isn’t giving up on the desktop — they’re simply prioritizing the mobile crawl. This makes sense, as the majority of search traffic is now mobile. If Google wants to make sure quality mobile content is served, they need to shift their crawl priorities. But they also have a competing desire to find content, and doing so requires using a desktop crawler so long as webmasters continue to abbreviate the mobile versions of their sites.

This reality isn’t lost on Google. In the Original Official Google Mobile First Announcement, they write…

If you are building a mobile version of your site, keep in mind that a functional desktop-oriented site can be better than a broken or incomplete mobile version of the site.

Google took the time to state that a desktop version can be better than an “incomplete mobile version.” I don’t intend to read too much into this statement other than to say that Google wants a full mobile version, not just a postcard.

Good link placements will prevail

One anecdotal outcome of my research was that the external links which tended to survive the cull of a mobile version were often placed directly in the content. External links in sidebars like blog-rolls were essentially annihilated from the index, but in-content links survived. This may be a signal Google picks up on. External links that are both in mobile and desktop tend to be the kinds of links people might click on.

So, while there may be fewer links powering the link graph (or at least there might be a subset that is specially identified), if your links are good, content-based links, then you have a chance to see improved performance.

I was able to confirm this by looking at a subset of known good links. Using Fresh Web Explorer, I looked up fresh links to toysrus.com which is currently gaining a great deal of attention due to stores closing. We can feel confident that most of these links will be in-content because the articles themselves are about the relevant, breaking news regarding Toys R Us. Sure enough, after testing 300+ mentions, we found the links to be identical in the mobile and desktop crawls. These were good, in-content links and, subsequently, they showed up in both versions of the crawl.
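You can run the same sanity check on your own site by extracting the link sets from the mobile and desktop variants of a page and diffing them. A rough sketch using only Python's standard library (the HTML snippets here are hypothetical stand-ins for fetched pages):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from all anchor tags in a document."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.add(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# Made-up page variants: the desktop version carries a sidebar
# blogroll link that the mobile plugin strips out.
desktop_html = """
<main><p>See <a href="https://example.com/study">the study</a>.</p></main>
<aside><a href="https://blogroll.example.net">Blogroll</a></aside>
"""
mobile_html = """
<main><p>See <a href="https://example.com/study">the study</a>.</p></main>
"""

desktop_links = extract_links(desktop_html)
mobile_links = extract_links(mobile_html)
print("culled on mobile:", desktop_links - mobile_links)
```

In this toy example, the in-content link survives in both versions while the sidebar link only exists for the desktop crawler, mirroring the pattern observed in the study.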

Selection bias and convergence

It is probably the case that popular sites are more likely to have a mobile version than non-popular sites. Now, they might be responsive — at which point they would yield no real differences in the crawl — but at least some percentage would likely be m.* domains or utilize plugins like those mentioned above which truncate the content. At the lower rungs of the web, older, less professional content is likely to have only one version which is shown to mobile and desktop devices alike. If this is the case, we can expect that over time the differences in the index might begin to converge rather than diverge, as my study looked only at sites that were in the top million and only crawled two levels deep.

Moreover (this one is a bit speculative), I think that over time there will be convergence between the mobile and desktop indexes. I don't think the link graphs will grow exponentially different, as the linked web is only so big. Rather, the paths by which certain pages are reached, and the frequency with which they are reached, will change quite a bit. So, while the link graph will differ, the set of URLs making up the link graph will largely be the same. Of course, some percentage of the mobile web will remain wholly disparate. The large number of sites that use dedicated mobile subdomains or plugins that remove substantial sections of content will remain like mobile islands in the linked web.

Impact on SERPs

It’s difficult at this point to say what the impact on search results will be. It will certainly not leave the SERPs unchanged. What would be the point of Google making and announcing a change to its indexing methods if it didn’t improve the SERPs?

That being said, this study wouldn’t be complete without some form of impact assessment. Hat tip to JR Oakes for giving me this critique, otherwise I would have forgotten to take a look.

First, there are a couple of things which could mitigate dramatic shifts in the SERPs already, regardless of the veracity of this study:

  • A slow rollout means that shifts in SERPs will be lost to the natural ranking fluctuations we already see.
  • Google can seed URLs found by mobile or by desktop into their respective crawlers, thereby limiting index divergence. (This is a big one!)
  • Google could choose to consider, for link purposes, the aggregate of both mobile and desktop crawls, not counting one to the exclusion of the other.

Second, the relationships between domains may be less affected than other index metrics. What is the likelihood that the relationship between Domain X and Domain Y (more or less links) is the same for both the mobile- and desktop-based indexes? If the relationships tend to remain the same, then the impact on SERPs will be limited. We will call this relationship being “directionally consistent.”

To accomplish this part of the study, I took a sample of domain pairs from the mobile index and compared their relationship (more or less links) to their performance in the desktop index. Did the first have more links than the second in both the mobile and desktop? Or did they perform differently?

It turns out that the indexes were fairly close in terms of directional consistency. That is to say that while the link graphs as a whole were quite different, when you compared one domain to another at random, they tended in both data sets to be directionally consistent. Approximately 88% of the domains compared maintained directional consistency via the indexes. This test was only run comparing the mobile index domains to the desktop index domains. Future research might explore the reverse relationship.
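The directional-consistency check itself is straightforward to express: for each sampled pair of domains, compare the sign of the link-count difference in one index against the sign in the other. A small sketch with made-up link counts (the actual study used sampled Moz index data):

```python
import itertools

def directional_consistency(index_a, index_b, pairs):
    """Fraction of domain pairs whose link-count ordering
    (more vs. fewer links) agrees between two indexes."""
    def sign(x):
        return (x > 0) - (x < 0)
    agree = sum(
        sign(index_a[x] - index_a[y]) == sign(index_b[x] - index_b[y])
        for x, y in pairs
    )
    return agree / len(pairs)

# Hypothetical external-link counts per domain in each index.
mobile = {"a.com": 120, "b.com": 300, "c.com": 50, "d.com": 80}
desktop = {"a.com": 400, "b.com": 900, "c.com": 60, "d.com": 55}

pairs = list(itertools.combinations(mobile, 2))
print(directional_consistency(mobile, desktop, pairs))
```

Here only the c.com/d.com pair flips direction between the two indexes (5 of 6 pairs agree), illustrating how the aggregate percentage is computed.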

So what’s next?: Moz and the mobile-first index

Our goal for the Moz link index has always been to be as much like Google as possible. It is with that in mind that our team is experimenting with a mobile-first index as well. Our new link index and Link Explorer in Beta seeks to be more than simply one of the largest link indexes on the web, but the most relevant and useful, and we believe part of that means shaping our index with methods similar to Google. We will keep you updated!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Related Website Articles


SearchCap: Bing Shopping Ads, Google Shopping carousels & Facebook data check

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Bing Shopping Ads, Google Shopping carousels & Facebook data check appeared first on Search Engine Land.




One Factor that Caused the Current Content Marketing Climate (and How to Fix It)

Pardon our dust. Content marketing is under reconstruction right now, and frankly, it has been for years. Even when content marketing was a newer tactic online, there were naysayers. Now the naysayers point to the loads of crappy content and say, “You think that works?” But there has always been junk. Currently, it’s just easier
Read More…

The post One Factor that Caused the Current Content Marketing Climate (and How to Fix It) appeared first on Copyblogger.


Copyblogger


Google Confirms Chrome Usage Data Used to Measure Site Speed

Posted by Tom-Anthony

During a discussion with Google’s John Mueller at SMX Munich in March, he told me an interesting bit of data about how Google evaluates site speed nowadays. It has gotten a bit of interest from people when I mentioned it at SearchLove San Diego the week after, so I followed up with John to clarify my understanding.

The short version is that Google is now using performance data aggregated from Chrome users who have opted in as a datapoint in the evaluation of site speed (and as a signal with regards to rankings). This is a positive move (IMHO) as it means we don’t need to treat optimizing site speed for Google as a separate task from optimizing for users.

Previously, it was not clear how Google evaluated site speed, and it was generally believed to be measured by Googlebot during its visits, a belief reinforced by the presence of speed charts in Search Console. However, the onset of JavaScript-enabled crawling made it less clear what Google was doing. They obviously want the most realistic data possible, but that is a hard problem to solve: Googlebot is not built to replicate how actual visitors experience a site, so as the task of crawling became more complex, it makes sense that Googlebot may not be the best mechanism for this (if it ever was the mechanism).

In this post, I want to recap the pertinent data around this news quickly and try to understand what this may mean for users.

Google Search Console

Firstly, we should clarify our understanding of what the “time spent downloading a page” metric in Google Search Console is telling us. Most of us will recognize graphs like this one:

Until recently, I was unclear about exactly what this graph was telling me. But handily, John Mueller comes to the rescue again with a detailed answer [login required] (hat tip to James Baddiley from Chillisauce.com for bringing this to my attention):

John clarified what this graph is showing:

It’s technically not “downloading the page” but rather “receiving data in response to requesting a URL” – it’s not based on rendering the page, it includes all requests made.

And that it is:

this is the average over all requests for that day

Because Google may be fetching a very different set of resources every day when it’s crawling your site, and because this graph does not account for anything to do with page rendering, it is not useful as a measure of the real performance of your site.

For that reason, John points out that:

Focusing blindly on that number doesn’t make sense.

With which I quite agree. The graph can be useful for identifying certain classes of backend issues, but there are also probably better ways for you to do that (e.g. WebPageTest.org, of which I’m a big fan).

Okay, so now we understand that graph and what it represents, let’s look at the next option: the Google WRS.

Googlebot & the Web Rendering Service

Google’s WRS is their headless browser mechanism based on Chrome 41, which is used for things like “Fetch as Googlebot” in Search Console, and is increasingly what Googlebot is using when it crawls pages.

However, we know that this isn’t how Google evaluates pages because of a Twitter conversation between Aymen Loukil and Google’s Gary Illyes. Aymen wrote up a blog post detailing it at the time, but the important takeaway was that Gary confirmed that WRS is not responsible for evaluating site speed:

Twitter conversation with Gary Illyes

At the time, Gary was unable to clarify what was being used to evaluate site performance (perhaps because the Chrome User Experience Report hadn’t been announced yet). It seems as though things have progressed since then, however. Google is now able to tell us a little more, which takes us on to the Chrome User Experience Report.

Chrome User Experience Report

Introduced in October last year, the Chrome User Experience Report “is a public dataset of key user experience metrics for top origins on the web,” whereby “performance data included in the report is from real-world conditions, aggregated from Chrome users who have opted-in to syncing their browsing history and have usage statistic reporting enabled.”

Essentially, certain Chrome users allow their browser to report back load time metrics to Google. The report currently has a public dataset for the top 1 million+ origins, though I imagine they have data for many more domains than are included in the public data set.
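The metrics in the public dataset are published as histograms: bins of metric-value ranges, each with the density (share) of page loads that fell into that range. Summing the densities below a threshold gives the share of "fast" loads for an origin. A small sketch using that histogram shape, with entirely made-up bin values for illustration:

```python
def share_below(bins, threshold_ms):
    """Share of page loads whose metric value falls entirely below
    threshold_ms, given CrUX-style histogram bins."""
    return sum(b["density"] for b in bins if b["end"] <= threshold_ms)

# Hypothetical first-contentful-paint histogram for one origin.
fcp_bins = [
    {"start": 0,    "end": 1000, "density": 0.45},
    {"start": 1000, "end": 2500, "density": 0.35},
    {"start": 2500, "end": 5000, "density": 0.20},
]

print(share_below(fcp_bins, 1000))  # share of loads painting within 1s
```

In the real dataset these histograms are further segmented by dimensions such as form factor and connection type, but the aggregation idea is the same.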

In March I was at SMX Munich (amazing conference!), where along with a small group of SEOs I had a chat with John Mueller. I asked John about how Google evaluates site speed, given that Gary had clarified it was not the WRS. John was kind enough to shed some light on the situation, but at that point, nothing was published anywhere.

However, since then, John has confirmed this information in a Google Webmaster Central Hangout [15m30s, in German], where he explains they’re using this data along with some other data sources (he doesn’t say which, though notes that it is in part because the data set does not cover all domains).

At SMX John also pointed out how Google’s PageSpeed Insights tool now includes data from the Chrome User Experience Report:

The public dataset of performance data for the top million domains is also available in a public BigQuery project, if you’re into that sort of thing!

We can’t be sure what all the other factors Google is using are, but we now know they are certainly using this data. As I mentioned above, I also imagine they are using data on more sites than are perhaps provided in the public dataset, but this is not confirmed.

Pay attention to users

Importantly, this means that there are changes you can make to your site that Googlebot is not capable of detecting, which are still detected by Google and used as a ranking signal. For example, we know that Googlebot does not support HTTP/2 crawling, but now we know that Google will be able to detect the speed improvements you would get from deploying HTTP/2 for your users.

The same is true if you were to use service workers for advanced caching behaviors — Googlebot wouldn’t be aware, but users would. There are certainly other such examples.

Essentially, this means that there’s no longer a reason to worry about pagespeed for Googlebot, and you should instead just focus on improving things for your users. You still need to pay attention to Googlebot for crawling purposes, which is a separate task.

If you are unsure where to look for site speed advice, tools like Google's PageSpeed Insights and WebPageTest.org (both mentioned above) are good starting points.

That’s all for now! If you have questions, please comment here and I’ll do my best! Thanks!



Moz Blog

