Tag Archive | "Future"

SEO & Progressive Web Apps: Looking to the Future

Posted by tombennet

Practitioners of SEO have always been mistrustful of JavaScript.

This is partly based on experience; the ability of search engines to discover, crawl, and accurately index content which is heavily reliant on JavaScript has historically been poor. But it’s also habitual, born of a general wariness towards JavaScript in all its forms that isn’t based on understanding or experience. This manifests itself as a dependence on traditional SEO techniques that have not been relevant for years, and a conviction that being good at technical SEO does not require an understanding of modern web development.

As Mike King wrote in his post The Technical SEO Renaissance, these attitudes are contributing to “an ever-growing technical knowledge gap within SEO as a marketing field, making it difficult for many SEOs to solve our new problems”. They also put SEO practitioners at risk of being left behind, since too many of us refuse to explore – let alone embrace – technologies such as Progressive Web Apps (PWAs), modern JavaScript frameworks, and other such advancements which are increasingly being seen as the future of the web.

In this article, I’ll be taking a fresh look at PWAs. As well as exploring implications for both SEO and usability, I’ll be showcasing some modern frameworks and build tools which you may not have heard of, and suggesting ways in which we need to adapt if we’re to put ourselves at the technological forefront of the web.

1. Recap: PWAs, SPAs, and service workers

Progressive Web Apps are essentially websites which provide a user experience akin to that of a native app. Features like push notifications enable easy re-engagement with your audience, while users can add their favorite sites to their home screen without the complication of app stores. PWAs can continue to function offline or on low-quality networks, and they allow a top-level, full-screen experience on mobile devices which is closer to that offered by native iOS and Android apps.

Best of all, PWAs do this while retaining – and even enhancing – the fundamentally open and accessible nature of the web. As suggested by the name they are progressive and responsive, designed to function for every user regardless of their choice of browser or device. They can also be kept up-to-date automatically and — as we shall see — are discoverable and linkable like traditional websites. Finally, it’s not all or nothing: existing websites can deploy a limited subset of these technologies (using a simple service worker) and start reaping the benefits immediately.

The spec is still fairly young, and naturally, there are areas which need work, but that doesn’t stop them from being one of the biggest advancements in the capabilities of the web in a decade. Adoption of PWAs is growing rapidly, and organizations are discovering the myriad of real-world business goals they can impact.

You can read more about the features and requirements of PWAs over on Google Developers, but two of the key technologies which make PWAs possible are:

  • App Shell Architecture: Commonly achieved using a JavaScript framework like React or Angular, this refers to a way of building single page apps (SPAs) which separates logic from the actual content. Think of the app shell as the minimal HTML, CSS, and JS your app needs to function; a skeleton of your UI which can be cached.
  • Service Workers: A special script that your browser runs in the background, separate from your page. It essentially acts as a proxy, intercepting and handling network requests from your page programmatically.

Note that these technologies are not mutually exclusive; the single page app model (brought to maturity with AngularJS in 2010) obviously predates service workers and PWAs by some time. As we shall see, it’s also entirely possible to create a PWA which isn’t built as a single page app. For the purposes of this article, however, we’re going to be focusing on the ‘typical’ approach to developing modern PWAs, exploring the SEO implications — and opportunities — faced by teams that choose to join the rapidly-growing number of organizations that make use of the two technologies described above.

We’ll start with the app shell architecture and the rendering implications of the single page app model.

2. The app shell architecture

URLs

In a nutshell, the app shell architecture involves aggressively caching static assets (the bare minimum of UI and functionality) and then loading the actual content dynamically, using JavaScript. Most modern JavaScript SPA frameworks encourage something resembling this approach, and the separation of logic and content in this way benefits both speed and usability. Interactions feel instantaneous, much like those on a native app, and data usage can be highly economical.

Credit to https://developers.google.com/web/fundamentals/architecture/app-shell

As I alluded to in the introduction, a heavy reliance on client-side JavaScript is a problem for SEO. Historically, many of these issues centered around the fact that while search crawlers require unique URLs to discover and index content, single page apps don’t need to change the URL for each state of the application or website (hence the phrase ‘single page’). The reliance on fragment identifiers — which aren’t sent as part of an HTTP request — to dynamically manipulate content without reloading the page was a major headache for SEO. Legacy solutions involved replacing the hash with a so-called hashbang (#!) and the _escaped_fragment_ parameter, a hack which has long-since been deprecated and which we won’t be exploring today.

Thanks to the HTML5 history API and pushState method, we now have a better solution. The browser’s URL bar can be changed using JavaScript without reloading the page, thereby keeping it in sync with the state of your application or site and allowing the user to make effective use of the browser’s ‘back’ button. While this solution isn’t a magic bullet — your server must be configured to respond to requests for these deep URLs by loading the app in its correct initial state — it does provide us with the tools to solve the problem of URLs in SPAs.

// Run this in your console to modify the URL in your
// browser - note that the page doesn't actually reload.
history.pushState(null, "Page 2", "/page2.html");
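
The flip side is supporting the browser’s ‘back’ button. A popstate event fires whenever the user moves through history entries created this way; here’s a minimal sketch, in which render() is a placeholder for whatever function updates your application’s view:

// Listen for back/forward navigation. popstate fires when the active
// history entry changes; re-render the view for the restored URL.
window.addEventListener('popstate', function() {
  render(window.location.pathname); // render() is a hypothetical view-update function
});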

The bigger problem facing SEO today is actually much easier to understand: rendering content, namely when and how it gets done.

Rendering content

Note that when I refer to rendering here, I’m referring to the process of constructing the HTML. We’re focusing on how the actual content gets to the browser, not the process of drawing pixels to the screen.

In the early days of the web, things were simpler on this front. The server would typically return all the HTML that was necessary to render a page. Nowadays, however, many sites which utilize a single page app framework deliver only minimal HTML from the server and delegate the heavy lifting to the client (be that a user or a bot). Given the scale of the web, this requires a lot of time and computational resources, and as Google made clear at its I/O conference in 2018, this poses a major problem for search engines:

“The rendering of JavaScript-powered websites in Google Search is deferred until Googlebot has resources available to process that content.”

On larger sites, this second wave of indexation can sometimes be delayed for several days. On top of this, you are likely to encounter a myriad of problems with crucial information like canonical tags and metadata being missed completely. I would highly recommend watching the video of Google’s excellent talk on this subject for a rundown of some of the challenges faced by modern search crawlers.

Google is one of the very few search engines that renders JavaScript at all. What’s more, it does so using a web rendering service that until very recently was based on Chrome 41 (released in 2015). Obviously, this has implications outside of just single page apps, and the wider subject of JavaScript SEO is a fascinating area right now. Rachel Costello’s recent white paper on JavaScript SEO is the best resource I’ve read on the subject, and it includes contributions from other experts like Bartosz Góralewicz, Alexis Sanders, Addy Osmani, and a great many more.

For the purposes of this article, the key takeaway here is that in 2019 you cannot rely on search engines to accurately crawl and render your JavaScript-dependent web app. If your content is rendered client-side, it will be resource-intensive for Google to crawl, and your site will underperform in search. No matter what you’ve heard to the contrary, if organic search is a valuable channel for your website, you need to make provisions for server-side rendering.

But server-side rendering is a concept which is frequently misunderstood…

“Implement server-side rendering”

This is a common SEO audit recommendation which I often hear thrown around as if it were a self-contained, easily-actioned solution. At best it’s an oversimplification of an enormous technical undertaking, and at worst it’s a misunderstanding of what’s possible/necessary/beneficial for the website in question. Server-side rendering is an outcome of many possible setups and can be achieved in many different ways; ultimately, though, we’re concerned with getting our server to return static HTML.

So, what are our options? Let’s break down the concept of server-side rendered content a little. These are the high-level approaches which Google outlined at the aforementioned I/O conference:

  • Dynamic Rendering — Here, normal browsers get the ‘standard’ web app which requires client-side rendering while bots (such as Googlebot and social media services) are served with static snapshots. This involves adding an additional step onto your server infrastructure, namely a service which fetches your web app, renders the content, then returns that static HTML to bots based on their user agent (i.e. UA sniffing). Historically this was done with a service like PhantomJS (now deprecated and no longer developed), while today Puppeteer (headless Chrome) can perform a similar function. The main advantage is that it can often be bolted into your existing infrastructure; a minimal sketch of this approach follows the list below.
  • Hybrid Rendering — This is Google’s long-term recommendation, and it’s absolutely the way to go for newer site builds. In short, everyone — bots and humans — get the initial view served as fully-rendered static HTML. Crawlers can continue to request URLs in this way and will get static content each time, while on normal browsers, JavaScript takes over after the initial page load. This is a great solution in theory, and comes with many other advantages for speed and usability too; more on that soon.
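
To make the dynamic rendering option more concrete, here’s a minimal sketch assuming an Express server and Puppeteer. The bot list, URLs, and structure are illustrative only; a production setup would cache snapshots rather than rendering on every request.

// server.js - a simplified dynamic rendering sketch (not production-ready)
const express = require('express');
const puppeteer = require('puppeteer');

const BOT_UA = /googlebot|bingbot|baiduspider|facebookexternalhit|twitterbot/i;
const app = express();

async function renderSnapshot(url) {
  // Launch headless Chrome, load the page, and wait for the JS app to settle
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  const html = await page.content(); // the fully rendered HTML
  await browser.close();
  return html;
}

app.get('*', async (req, res, next) => {
  if (BOT_UA.test(req.headers['user-agent'] || '')) {
    // Bots get a prerendered snapshot of the requested URL
    res.send(await renderSnapshot(`https://example.com${req.originalUrl}`));
  } else {
    next(); // normal browsers fall through to the client-rendered app
  }
});

app.listen(3000);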

Of the two, hybrid rendering is the cleaner approach and doesn’t involve UA sniffing. It’s also worth clarifying that ‘hybrid rendering’ is not a single solution — it’s an outcome of many possible approaches to making static prerendered content available server-side. Let’s break down a couple of ways such an outcome can be achieved.

Isomorphic/universal apps

This is one way in which you might achieve a ‘hybrid rendering’ setup. Isomorphic applications use JavaScript which runs on both the server and the client. This is made possible thanks to the advent of Node.js, which – among many other things – allows developers to write code which can run on the backend as well as in the browser.

Typically you’ll configure your framework (React, Angular Universal, whatever) to run on a Node server, prerendering some or all of the HTML before it’s sent to the client. Your server must, therefore, be configured to respond to deep URLs by rendering HTML for the appropriate page. In normal browsers, this is the point at which the client-side application will seamlessly take over. The server-rendered static HTML for the initial view is ‘rehydrated’ (brilliant term) by the browser, turning it back into a single page app and executing subsequent navigation events with JavaScript.
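
As a rough illustration of the server half of such a setup, assuming Express and React (App is a placeholder root component, and a real framework like Next.js or Angular Universal handles routing, data fetching, and bundling for you):

// server.js - a minimal sketch of server-side rendering with Express and React
const express = require('express');
const React = require('react');
const { renderToString } = require('react-dom/server');
const App = require('./App'); // placeholder root component

const app = express();
app.use('/static', express.static('build')); // the client-side bundle

app.get('*', (req, res) => {
  // Render the requested route to static HTML on the server
  const markup = renderToString(React.createElement(App, { url: req.url }));
  res.send(`<!DOCTYPE html>
    <html>
      <head><title>Isomorphic sketch</title></head>
      <body>
        <div id="root">${markup}</div>
        <script src="/static/client.js"></script>
      </body>
    </html>`);
});

app.listen(3000);

// In the browser, client.js 'rehydrates' the same markup:
// ReactDOM.hydrate(React.createElement(App, { url: location.pathname }),
//                  document.getElementById('root'));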

Done well, this setup can be fantastic since it offers the usability benefits of client-side rendering, the SEO advantages of server-side rendering, and a rapid first paint (even if Time to Interactive is often negatively impacted by the rehydration as JS kicks in). For fear of oversimplifying the task, I won’t go into too much more detail here, but the key point is that while isomorphic JavaScript / true server-side rendering can be a powerful solution, it is often enormously complex to set up.

So, what other options are there? If you can’t justify the time or expense of a full isomorphic setup, or if it’s simply overkill for what you’re trying to achieve, are there any other ways you can reap the benefits of the single page app model — and hybrid rendering setup — without sabotaging your SEO?

Prerendering/JAMstack

Having rendered content available server-side doesn’t necessarily mean that the rendering process itself needs to happen on the server. All we need is for rendered HTML to be there, ready to serve to the client; the rendering process itself can happen anywhere you like. With a JAMstack approach, rendering of your content into HTML happens as part of your build process.

I’ve written about the JAMstack approach before. By way of a quick primer, the term stands for JavaScript, APIs, and markup, and it describes a way of building complex websites without server-side software. The process of assembling a site from front-end component parts — a task a traditional site might achieve with WordPress and PHP — is executed as part of the build process, while interactivity is handled client-side using JavaScript and APIs.

Think of it this way: everything lives in your Git repository. Your content is stored as plain text markdown files (editable via a headless CMS or other API-based solution) and your page templates and assembly logic are written in Go, JavaScript, Ruby, or whatever language your preferred site generator happens to use. Your site can be built into static HTML on any computer with the appropriate set of command line tools before it’s hosted anywhere. The resulting set of easily-cached static files can often be securely hosted on a CDN for next to nothing.

I honestly think static site generators – or rather the principles and technologies which underpin them — are the future. There’s every chance I’m wrong about this, but the power and flexibility of the approach should be clear to anyone who’s used modern npm-based automation software like Gulp or Webpack to author their CSS or JavaScript. I’d challenge anyone to test the deep Git integration offered by specialist webhost Netlify in a real-world project and still think that the JAMstack approach is a fad.


The popularity of static site generators on GitHub, generated using https://stars.przemeknowak.com

The significance of a JAMstack setup to our discussion of single page apps and prerendering should be fairly obvious. If our static site generator can assemble HTML based on templates written in Liquid or Handlebars, why can’t it do the same with JavaScript?

There is a new breed of static site generator which does just this. Frequently powered by React or Vue.js, these programs allow developers to build websites using cutting-edge JavaScript frameworks and can easily be configured to output SEO-friendly, static HTML for each page (or ‘route’). Each of these HTML files is fully rendered content, ready for consumption by humans and bots, and serves as an entry point into a complete client-side application (i.e. a single page app). This is a perfect execution of what Google termed “hybrid rendering”, though the precise nature of the pre-rendering process sets it quite apart from an isomorphic setup.

A great example is GatsbyJS, which is built in React and GraphQL. I won’t go into too much detail, but I would encourage everyone who’s read this far to check out their homepage and excellent documentation. It’s a well-supported tool with a reasonable learning curve, an active community (a feature-packed v2.0 was released in September), an extensible plugin-based architecture, rich integrations with many CMSs, and it allows developers to utilize modern frameworks like React without sabotaging their SEO. There’s also Gridsome, based on VueJS, and React Static which — you guessed it — uses React.
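
To give a sense of how little ceremony is involved, a page in a Gatsby project is just a React component exported from a file in src/pages. At build time it’s rendered to static HTML; in the browser, React rehydrates it into the client-side app. This is an illustrative sketch rather than an excerpt from their docs:

// src/pages/index.js - an illustrative Gatsby page component
import React from 'react';

export default function IndexPage() {
  // Rendered to static HTML at build time, rehydrated by React in the browser
  return (
    <main>
      <h1>Hello from a statically generated React page</h1>
      <p>Crawlers see this content in the initial HTML response.</p>
    </main>
  );
}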


Nike’s recent Just Do It campaign, which utilized the React-powered static site generator GatsbyJS and is hosted on Netlify.

Enterprise-level adoption of these platforms looks set to grow; GatsbyJS was used by Nike for their Just Do It campaign, Airbnb for their engineering site airbnb.io, and Braun have even used it to power a major e-commerce site. Finally, our friends at SEOmonitor used it to power their new website.

But that’s enough about single page apps and JavaScript rendering for now. It’s time we explored the second of our two key technologies underpinning PWAs. Promise you’ll stay with me to the end (haha, nerd joke), because it’s time to explore Service Workers.

3. Service Workers

First of all, I should clarify that the two technologies we’re exploring — SPAs and service workers — are not mutually exclusive. Together they underpin what we commonly refer to as a Progressive Web App, yes, but it’s also possible to have a PWA which isn’t an SPA. You could also integrate a service worker into a traditional static website (i.e. one without any client-side rendered content), which is something I believe we’ll see happening a lot more in the near future. Finally, service workers operate in tandem with other technologies like the Web App Manifest, something that my colleague Maria recently explored in more detail in her excellent guide to PWAs and SEO.

Ultimately, though, it is service workers which make the most exciting features of PWAs possible. They’re one of the most significant changes to the web platform in its history, and everyone whose job involves building, maintaining, or auditing a website needs to be aware of this powerful new set of technologies. If, like me, you’ve been eagerly checking Jake Archibald’s Is Service Worker Ready page for the last couple of years and watching as adoption by browser vendors has grown, you’ll know that the time to start building with service workers is now.

We’re going to explore what they are, what they can do, how to implement them, and what the implications are for SEO.

What can service workers do?

A service worker is a special kind of JavaScript file which runs outside of the main browser thread. It sits in-between the browser and the network, and its powers include:

  • Intercepting network requests and deciding what to do with them programmatically. The worker might go to network as normal, or it might rely solely on the cache. It could even fabricate an entirely new response from a variety of sources. That includes constructing HTML.
  • Preloading files during service worker installation. For SPAs this commonly includes the ‘app shell’ we discussed earlier, while simple static websites might opt to preload all HTML, CSS, and JavaScript, ensuring basic functionality is maintained while offline (see the sketch after this list).
  • Handling push notifications, similar to a native app. This means websites can get permission from users to deliver notifications, then rely on the service worker to receive messages and execute them even when the browser is closed.
  • Executing background sync, deferring network operations until connectivity has improved. This might be an ‘outbox’ for a webmail service or a photo upload facility. No more “request failed, please try again later” – the service worker will handle it for you at an appropriate time.
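
To give a flavor of the second point, preloading an ‘app shell’ during installation might look something like this minimal sketch (the cache name and file list are illustrative):

// sw.js - precache a minimal 'app shell' when the service worker installs
const CACHE_NAME = 'app-shell-v1';
const APP_SHELL = ['/', '/styles/main.css', '/scripts/app.js', '/offline.html'];

self.addEventListener('install', function(event) {
  // waitUntil() holds the worker in the installing phase until caching completes
  event.waitUntil(
    caches.open(CACHE_NAME).then(function(cache) {
      return cache.addAll(APP_SHELL);
    })
  );
});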

The benefits of these kinds of features go beyond the obvious usability perks. As well as driving adoption of HTTPS across the web (all the major browsers will only register service workers on the secure protocol), service workers are transformative when it comes to speed and performance. They underpin new approaches and ideas like Google’s PRPL Pattern, since we can maximize caching efficiency and minimize reliance on the network. In this way, service workers will play a key role in making the web fast and accessible for the next billion web users.

So yeah, they’re an absolute powerhouse.

Implementing a service worker

Rather than doing a bad job of writing a basic tutorial here, I’m instead going to link to some key resources. After all, you are in the best position to know how deep your understanding of service workers needs to be.

The MDN Docs are a good place to learn more about service workers and their capabilities. If you’re already confident with the essentials of web development and enjoy a learn-by-doing approach, I’d highly recommend completing Google’s PWA training course. It includes a whole practical exercise on service workers, which is a great way to familiarize yourself with the basics. If ES6 and promises aren’t yet a part of your JavaScript repertoire, prepare for a baptism of fire.

The key thing to understand — and which you’ll realize very quickly once you start experimenting — is that service workers hand over an incredible level of control to developers. Unlike previous attempts to solve the connectivity conundrum (such as the ill-fated AppCache), service workers don’t enforce any specific patterns on your work; they’re a set of tools for you to write your own solutions to the problems you’re facing.

One consequence of this is that they can be very complex. Registering and installing a service worker is not a simple exercise, and any attempts to cobble one together by copy-pasting from StackExchange are doomed to failure (seriously, don’t do this). There’s no such thing as a ready-made service worker for your site — if you’re to author a suitable worker, you need to understand the infrastructure, architecture, and usage patterns of your website. Uncle Ben, ever the web development guru, said it best: with great power comes great responsibility.
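
For orientation only (this is the easy part, and emphatically not a copy-paste solution to the hard problems above), registering a worker from your page typically looks something like this:

// main.js - register the service worker, with feature detection
if ('serviceWorker' in navigator) {
  window.addEventListener('load', function() {
    navigator.serviceWorker.register('/sw.js')
      .then(function(registration) {
        console.log('Service worker registered with scope:', registration.scope);
      })
      .catch(function(error) {
        console.error('Service worker registration failed:', error);
      });
  });
}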

One last thing: you’ll probably be surprised how many sites you visit are already using a service worker. Head to chrome://serviceworker-internals/ in Chrome or about:debugging#workers in Firefox to see a list.

Service workers and SEO

In terms of SEO implications, the most relevant thing about service workers is probably their ability to hijack requests and modify or fabricate responses using the Fetch API. What you see in ‘View Source’ and even on the Network tab is not necessarily a representation of what was returned from the server. It might be a cached response or something constructed by the service worker from a variety of different sources.

Credit: https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API

Here’s a practical example:

  • Head to the GatsbyJS homepage
  • Hit the link to the ‘Docs’ page.
  • Right-click – View Source

No content, right? Just some inline scripts and styles and empty HTML elements — a classic client-side JavaScript app built in React. Even if you open the Network tab and refresh the page, the Preview and Response tabs will tell the same story. The actual content only appears in the Element inspector, because the DOM is being assembled with JavaScript.

Now run a curl request for the same URL (https://www.gatsbyjs.org/docs/), or fetch the page using Screaming Frog. All the content is there, along with proper title tags, canonicals, and everything else you might expect from a page rendered server-side. This is what a crawler like Googlebot will see too.

This is because the website uses hybrid rendering and a service worker — installed in your browser — is handling subsequent navigation events. There is no need for it to fetch the raw HTML for the Docs page from the server because the client-side application is already up-and-running – thus, View Source shows you what the service worker returned to the application, not what the network returned. Additionally, these pages can be reloaded while you’re offline thanks to the service worker’s effective use of the cache.

You can easily spot which responses came from the service worker using the Network tab — note the ‘from ServiceWorker’ line below.

On the Application tab, you can see the service worker which is running on the current page along with the various caches it has created. You can disable or bypass the worker and test any of the more advanced functionality it might be using. Learning how to use these tools is an extremely valuable exercise; I won’t go into details here, but I’d recommend studying Google’s Web Fundamentals tutorial on debugging service workers.

I’ve made a conscious effort to keep code snippets to a bare minimum in this article, but grant me this one. I’ve put together an example which illustrates how a simple service worker might use the Fetch API to handle requests and the degree of control which we’re afforded:
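
In outline, it looks something like the sketch below; the offline page path is illustrative and assumed to have been precached during installation.

// sw.js - a hugely simplified, non-production-ready fetch handler.
// Pattern: try the cache first, fall back to the network, and fall back
// to a custom offline page if both fail.
self.addEventListener('fetch', function(event) {
  event.respondWith(
    caches.match(event.request)
      .then(function(cached) {
        return cached || fetch(event.request); // cache first, then network
      })
      .catch(function() {
        return caches.match('/offline.html'); // last resort: the offline page
      })
  );
});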


I hope that this (hugely simplified and non-production ready) example illustrates a key point, namely that we have extremely granular control over how resource requests are handled. In the example above we’ve opted for a simple try-cache-first, fall-back-to-network, fall-back-to-custom-page pattern, but the possibilities are endless. Developers are free to dictate how requests should be handled based on hostnames, directories, file types, request methods, cache freshness, and loads more. Responses – including entire pages – can be fabricated by the service worker. Jake Archibald explores some common methods and approaches in his Offline Cookbook.

The time to learn about the capabilities of service workers is now. The skillset required for modern technical SEO has a fair degree of overlap with that of a web developer, and today, a deep understanding of the dev tools in all major browsers – including service worker debugging – should be regarded as a prerequisite.

4. Wrapping Up

SEOs need to adapt

Until recently, it’s been too easy to get away with not understanding the consequences and opportunities posed by PWAs and service workers.

These were cutting-edge features which sat on the periphery of what was relevant to search marketing, and the aforementioned wariness of many SEOs towards JavaScript did nothing to encourage experimentation. But PWAs are rapidly on their way to becoming a norm, and it will soon be impossible to do an effective job without understanding the mechanics of how they function. To stay relevant as a technical SEO (or SEO Engineer, to borrow another term from Mike King), you should put yourself at the forefront of these kinds of paradigm-shifting developments. The technical SEO who is illiterate in web development is already an anachronism, and I believe that further divergence between the technical and content-driven aspects of search marketing is no bad thing. Specialize!

Upon learning that a development team is adopting a new JavaScript framework for a new site build, it’s not uncommon for SEOs to react with a degree of cynicism. I’m certainly guilty of joking about developers being attracted to the latest shiny technology or framework, and at how rapidly the world of JavaScript development seems to evolve, layer upon layer of abstraction and automation being added to what — from the outside — can often seem to be a leaning tower of a development stack. But it’s worth taking the time to understand why frameworks are chosen, when technologies are likely to start being used in production, and how these decisions will impact SEO.

Instead of criticizing 404 handling or internal linking of a single page app framework, for example, it would be far better to be able to offer meaningful recommendations which are grounded in an understanding of how they actually work. As Jono Alderson observed in his talk on the Democratization of SEO, contributions to open source projects are more valuable in spreading appreciation and awareness of SEO than repeatedly fixing the same problems on an ad-hoc basis.

Beyond SEO

One last thing I’d like to mention: PWAs are such a transformative set of technologies that they obviously have consequences which reach far beyond just SEO. Other areas of digital marketing are directly impacted too, and from my standpoint, one of the most interesting is analytics.

If your website is partially or fully functional while offline, have you adapted your analytics setup to account for this? If push notification subscriptions are a KPI for your website, are you tracking this as a goal? Remember that service workers do not have access to the Window object, so tracking these events is not possible with ‘normal’ tracking code. Instead, it’s necessary to configure your service worker to build hits using the Measurement Protocol, queue them if necessary, and send them directly to the Google Analytics servers.
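
As a hedged sketch of the idea (the property ID and event values below are placeholders, and a real setup would generate a proper client ID and queue hits while offline):

// sw.js - send an event hit to Google Analytics via the Measurement Protocol.
// There is no window (and no analytics.js) inside a service worker, so hits
// are built by hand and sent with fetch().
function sendAnalyticsHit(category, action) {
  const params = new URLSearchParams({
    v: '1',            // Measurement Protocol version
    tid: 'UA-XXXXX-Y', // placeholder property ID
    cid: '555',        // placeholder client ID
    t: 'event',        // hit type
    ec: category,      // event category
    ea: action         // event action
  });
  return fetch('https://www.google-analytics.com/collect', {
    method: 'POST',
    body: params.toString()
  });
}

self.addEventListener('push', function(event) {
  // Example: record that a push notification was received
  event.waitUntil(sendAnalyticsHit('Push Notifications', 'received'));
});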

This is a fascinating area that I’ve been exploring a lot lately, and you can read the first post in my series of articles on PWA analytics over on the Builtvisible blog.

That’s all from me for now! Thanks for reading. If you have any questions or comments, please leave a message below or drop me a line on Twitter @tomcbennet.

Many thanks to Oliver Mason and Will Nye for their feedback on an early draft of this article.



Moz Blog


Building the Target of the Future

“We dropped back several years ago and started thinking about building the target of the future,” says Target CEO Brian Cornell. “It really started with an investment in understanding the consumer and really understanding what they were looking for and how to build the capabilities starting with data science to really guide us through that journey. Whether that’s technology or supply chain capabilities, product design, or our focus on execution at the store level, data and analytics have been important guideposts for us as we’ve gone through this journey.”

Brian Cornell, CEO of Target, discusses the details of how the company is building the Target of the future in an interview at the Stanford Graduate School of Business:

Reimagining Stores and Investment in Technology is Paying Off

Target’s (current success) is really a combination of a number of things that we’ve been working on for several years now. If I go back to February of 2017 we laid out a three-year vision for the company. We said we’re going to invest billions of dollars. At that point, I said $7 billion over a three-year period to invest in reimagining our stores, in building new smaller stores in urban centers and on college campuses, reinvest in our brands, invest in technology and fulfillment capabilities, and make a big investment in our people.

The success we’re seeing right now is really a combination of all those elements starting to mature. We’re executing at scale and they’re all starting to work together. That’s driving for us great top-line growth, market share gains, and importantly more traffic in our stores and visits to our site.

In Most Cases Shopping Starts With the Mobile Phone

I actually think blend (of digital and physical) is the right term. I think from a consumer standpoint they’ve really lost sight of whether they’re shopping in a physical environment or a digital environment. In most cases, their shopping starts with that mobile phone in their hands, that digital device. It’s how they decide where they’re going to shop and what they’re looking for. If you went to one of our Target stores this afternoon I guarantee you we’d find consumers with a phone in their hand, they’d be looking at their latest Pinterest, they’d be checking things on their favorite digital site, and they’d have their shopping list there.

That device really guides them through the shopping experience. I think more and more there’s a blurring and a blending that’s taking place and it’s a combination of both. The consumer today is enjoying the fact that shopping has become really easy. They get the best of both. They get a physical experience when they want it and if they don’t have time they can shop from their desk or from their classroom. They’re constantly in touch and we’ve made it really easy now for them to interface with our brand on their own terms.

Building the Target of the Future

We dropped back several years ago and started thinking about building the target of the future. It really started with an investment in understanding the consumer and really understanding what they were looking for and how to build the capabilities starting with data science to really guide us through that journey. I can talk a lot about strategy, but the other thing that we’ve recognized is how important it is to have the right capabilities in place. Whether that’s technology or supply chain capabilities, product design, or our focus on execution at the store level, data and analytics have been important guideposts for us as we’ve gone through this journey.

We’ve been fortunate in that we’ve recruited quite a few Stanford grads. I think what’s attracting them to our business is the richness of our data. The fact that on an average week we get 30 million consumers shopping our stores and a similar number going to Target.com. We have all this rich data and we understand where consumers are shopping, what they’re looking for, and I think they’ve been really intrigued by the ability to take that data and help us build a future.

The Consumer is Looking For a Unique Personalized Experience

I’ve certainly seen this trend towards personalization and localization. If I think about the changes in consumer packaged goods, in some cases those big brands that you and I grew up with, well they’ve been replaced by smaller local niche brands that we didn’t see when we grew up and they’re being regionalized across the country. I think the consumer today is looking for that unique personalized experience, whether they’re shopping a Target store or they’re walking through a local store right here on the Stanford campus.

I think I walked in recognizing the importance of a clear strategy for an organization. But I’ve come to realize just how important culture is, a clear purpose, and importantly ensuring that our strategy is supported by great capabilities and the importance of team. I think (as we look toward the future) we’ll still be true to the purpose we have today. It’s really focused on bringing a little bit of joy to all the families we serve each and every week and really enhancing their everyday life. I think that focus on families, that connection we have today with moms with kids with families across the country, will be as true in the future as it is today.

Building the Target of the Future – Target CEO

The post Building the Target of the Future appeared first on WebProNews.

WebProNews


Seth Godin on the Future of Marketing

Well, I’m back in the United States from my family trip around the world. Although not officially home in Boulder,…

The post Seth Godin on the Future of Marketing appeared first on Copyblogger.

Copyblogger


Local Search Ranking Factors 2018: Local Today, Key Takeaways, and the Future

Posted by Whitespark

In the past year, local SEO has run at a startling and near-constant pace of change. From an explosion of new Google My Business features to an ever-increasing emphasis on the importance of reviews, it’s almost too much to keep up with. In today’s Whiteboard Friday, we welcome our friend Darren Shaw to explain what local is like today, dive into the key takeaways from his 2018 Local Search Ranking Factors survey, and offer us a glimpse into the future according to the local SEO experts.


Video Transcription

Howdy, Moz fans. I’m Darren Shaw from Whitespark, and today I want to talk to you about the local search ranking factors. So this is a survey that David Mihm has run for the past like 10 years. Last year, I took it over, and it’s a survey of the top local search practitioners, about 40 of them. They all contribute their answers, and I aggregate the data and say what’s driving local search. So this is what the opinion of the local search practitioners is, and I’ll kind of break it down for you.

Local search today

So these are the results of this year’s survey. We had Google My Business factors at about 25%. That was the biggest piece of the pie. We have review factors at 15%, links at 16%, on-page factors at 14%, behavioral at 10%, citations at 11%, personalization and social at 6% and 3%. So that’s basically the makeup of the local search algorithm today, based on the opinions of the people that participated in the survey.

The big story this year is Google My Business. Google My Business factors are way up, compared to last year, a 32% increase in Google My Business signals. I’ll talk about that a little bit more over in the takeaways. Review signals are also up, so more emphasis on reviews this year from the practitioners. Citation signals are down again, and that makes sense. They continue to decline I think for a number of reasons. They used to be the go-to factor for local search. You just built out as many citations as you could. Now the local search algorithm is so much more complicated and there’s so much more to it that it’s being diluted by all of the other factors. Plus it used to be a real competitive difference-maker. Now it’s not, because everyone is pretty much getting citations. They’re considered table stakes now. By seeing a drop here, it doesn’t mean you should stop doing them. They’re just not the competitive difference-maker they used to be. You still need to get listed on all of the important sites.

Key takeaways

All right, so let’s talk about the key takeaways.

1. Google My Business

The real story this year was Google My Business, Google My Business, Google My Business. Everyone in the comments was talking about the benefits they’re seeing from investing in a lot of these new features that Google has been adding.

Google has been adding a ton of new features lately — services, descriptions, Google Posts, Google Q&A. There’s a ton of stuff going on in Google My Business now that allows you to populate Google My Business with a ton of extra data. So this was a big one.

✓ Take advantage of Google Posts

Everyone talked about Google Posts, how they’re seeing Google Posts driving rankings. There are a couple of things there. One is the semantic content that you’re providing Google in a Google post is definitely helping Google associate those keywords with your business. Engagement with Google Posts as well could be driving rankings up, and maybe just being an active business user continuing to post stuff and logging in to your account is also helping to lift your business entity and improve your rankings. So definitely, if you’re not on Google Posts, get on it now.

If you search for your category, you’ll see a ton of businesses are not doing it. So it’s also a great competitive difference-maker right now.

✓ Seed your Google Q&A

Google Q&A, a lot of businesses are not even aware this exists. There’s a Q&A section now. Your customers are often asking questions, and they’re being answered by not you. So it’s valuable for you to get in there and make sure you’re answering your questions and also seed the Q&A with your own questions. So add all of your own content. If you have a frequently asked questions section on your website, take that content and put it into Google Q&A. So now you’re giving lots more content to Google.

✓ Post photos and videos

Photos and videos, continually post photos and videos, maybe even encourage your customers to do that. All of that activity is helpful. A lot of people don’t know that you can now post videos to Google My Business. So get on that if you have any videos for your business.

✓ Fill out every field

There are so many new fields in Google My Business. If you haven’t edited your listing in a couple of years, there’s a lot more stuff in there that you can now populate and give Google more data about your business. All of that really leads to engagement. All of these extra engagement signals that you’re now feeding Google, from being a business owner that’s engaged with your listing and adding stuff and from users, you’re giving them more stuff to look at, click on, and dwell on your listing for a longer time, all that helps with your rankings.

2. Reviews

✓ Get more Google reviews

Reviews continue to increase in importance in local search, so, obviously, getting more Google reviews. It used to be a bit more of a competitive difference-maker. It’s becoming more and more table stakes, because everybody seems to be having lots of reviews. So you definitely want to make sure that you are competing with your competition on review count and lots of high-quality reviews.

✓ Keywords in reviews

Getting keywords in reviews, so rather than just asking for a review, it’s useful to ask your customers to mention what service they had provided or whatever so you can get those keywords in your reviews.

✓ Respond to reviews (users get notified now!)

Responding to reviews. Google recently started notifying users that if the owner has responded to you, you’ll get an email. So all of that is really great, and those responses, it’s another signal to Google that you’re an engaged business.

✓ Diversify beyond Google My Business for reviews

Diversify. Don’t just focus on Google My Business. Look at other sites in your industry that are prominent review sites. You can find them if you just look for your own business name plus reviews, if you search that in Google, you’re going to see the sites that Google is saying are important for your particular business.

You can also find out like what are the sites that your competitors are getting reviews on. Then if you just do a search like keyword plus city, like “lawyers + Denver,” you might find sites that are important for your industry as well that you should be listed on. So check out a couple of your keywords and make sure you’re getting reviews on more sites than just Google.

3. Links

Then links, of course, links continue to drive local search. A lot of people in the comments talked about how a handful of local links have been really valuable. This is a great competitive difference-maker, because a lot of businesses don’t have any links other than citations. So when you get a few of these, it can really have an impact.

✓ From local industry sites and sponsorships

They really talk about focusing on local-specific sites and industry-specific sites. So you can get a lot of those from sponsorships. They’re kind of the go-to tactic. If you do a search for intitle:sponsors plus your city name, you’re going to find a lot of sites that are listing their sponsors, and those are opportunities for you, in your city, to sponsor that event or organization as well and get a link.

The future!

All right. So I also asked in the survey: Where do you see Google going in the future? We got a lot of great responses, and I tried to summarize that into three main themes here for you.

1. Keeping users on Google

This is a really big one. Google does not want to send its users to your website to get the answer. Google wants to have the answer right on Google so that they don’t have to click. It’s this zero-click search result. So you see Rand Fishkin talking about this. This has been happening in local for a long time, and it’s really amplified with all of these new features Google has been adding. They want to have all of your data so that they don’t have to send users to find it somewhere else. Then that means in the future less traffic to your website.

So Mike Blumenthal and David Mihm also talk about Google as your new homepage, and this concept is like branded search.

  • What does your branded search look like?
  • So what sites are you getting reviews on?
  • What does your knowledge panel look like?

Make that all look really good, because Google doesn’t want to send people to your new website.

2. More emphasis on behavioral signals

David Mihm is a strong voice in this. He talks about how Google is trying to diversify how they rank businesses based on what’s happening in the real world. They’re looking for real-world signals that actual humans care about this business and they’re engaging with this business.

So there’s a number of things that they can do to track that — so branded search, how many people are searching for your brand name, how many people are clicking to call your business, driving directions. This stuff is all kind of hard to manipulate, whereas you can add more links, you can get more reviews. But this stuff, this is a great signal for Google to rely on.

Engagement with your listing, engagement with your website, and actual humans in your business. If you’ve seen on the knowledge panel sometimes for brick-and-mortar business, it will be like busy times. They know when people are actually at your business. They have counts of how many people are going into your business. So that’s a great signal for them to use to understand the prominence of your business. Is this a busy business compared to all the other ones in the city?

3. Google will monetize everything

Then, of course, a trend to monetize as much as they can. Google is a publicly traded company. They want to make as much money as possible. They’re on a constant growth path. So there are a few things that we see coming down the pipeline.

Local service ads are expanding across the country and globally and in different industries. So this is like a paid program. You have to apply to get into it, and then Google takes a cut of leads. So if you are a member of this, then Google will send leads to you. But you have to be verified to be in there, and you have to pay to be in there.

Then taking a cut from bookings, you can now book directly on Google for a lot of different businesses. If you think about Google Flights and Google Hotels, Google is looking for a way to monetize all of this local search opportunity. That’s why they’re investing heavily in local search so they can make money from it. So seeing more of these kinds of features rolling out in the future is definitely coming. Transactions from other things. So if I did book something, then Google will take a cut for it.

So that’s the future. That’s sort of the news of the local search ranking factors this year. I hope it’s been helpful. If you have any questions, just leave some comments and I’ll make sure to respond to them all. Thanks, everybody.

Video transcription by Speechpad.com


If you missed our recent webinar on the Local Search Ranking Factors survey with Darren Shaw and Dr. Pete, don’t worry! You can still catch the recording here:

Check out the webinar

You’ll be in for a jam-packed hour of deeper insights and takeaways from the survey, as well as some great audience-contributed Q&A.



Moz Blog


Using a New Correlation Model to Predict Future Rankings with Page Authority

Posted by rjonesx.

Correlation studies have been a staple of the search engine optimization community for many years. Each time a new study is released, a chorus of naysayers seem to come magically out of the woodwork to remind us of the one thing they remember from high school statistics — that “correlation doesn’t mean causation.” They are, of course, right in their protestations and, to their credit, an unfortunate number of times it seems that those conducting the correlation studies have forgotten this simple aphorism.

We collect a search result. We then order the results based on different metrics like the number of links. Finally, we compare the orders of the original search results with those produced by the different metrics. The closer they are, the higher the correlation between the two.

That being said, correlation studies are not altogether fruitless simply because they don’t necessarily uncover causal relationships (ie: actual ranking factors). What correlation studies discover or confirm are correlates.

Correlates are simply measurements that share some relationship with the independent variable (in this case, the order of search results on a page). For example, we know that backlink counts are correlates of rank order. We also know that social shares are correlates of rank order.

Correlation studies also provide us with direction of the relationship. For example, ice cream sales are positive correlates with temperature and winter jackets are negative correlates with temperature — that is to say, when the temperature goes up, ice cream sales go up but winter jacket sales go down.

Finally, correlation studies can help us rule out proposed ranking factors. This is often overlooked, but it is an incredibly important part of correlation studies. Research that provides a negative result is often just as valuable as research that yields a positive result. We’ve been able to rule out many types of potential factors — like keyword density and the meta keywords tag — using correlation studies.

Unfortunately, the value of correlation studies tends to end there. In particular, we still want to know whether a correlate causes the rankings or is spurious. Spurious is just a fancy sounding word for “false” or “fake.” A good example of a spurious relationship would be that ice cream sales cause an increase in drownings. In reality, the heat of the summer increases both ice cream sales and people who go for a swim. That swimming can cause drownings. So while ice cream sales is a correlate of drowning, it is *spurious.* It does not cause the drowning.

How might we go about teasing out the difference between causal and spurious relationships? One thing we know is that a cause happens before its effect, which means that a causal variable should predict a future change.

An alternative model for correlation studies

I propose an alternate methodology for conducting correlation studies. Rather than measure the correlation between a factor (like links or shares) and a SERP, we can measure the correlation between a factor and changes in the SERP over time.

The process works like this:

  1. Collect a SERP on day 1
  2. Collect the link counts for each of the URLs in that SERP
  3. Look for any URLs that are out of order with respect to links; for example, if position 2 has fewer links than position 3 (a simple sketch of this check follows the list)
  4. Record that anomaly
  5. Collect the same SERP in 14 days
  6. Record if the anomaly has been corrected (ie: position 3 now out-ranks position 2)
  7. Repeat across ten thousand keywords and test a variety of factors (backlinks, social shares, etc.)
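
By way of illustration, the anomaly check in steps 3 and 4 might look something like the sketch below; field names are illustrative, and this is the idea rather than the actual tooling used.

// Flag adjacent result pairs whose order contradicts the metric being tested
// (here, link counts). These pairs are re-checked against the SERP collected
// 14 days later to see whether Google has 'corrected' the order.
function findAnomalies(keyword, results) {
  // results: array of { url, links }, sorted by ranking position (best first)
  const anomalies = [];
  for (let i = 0; i < results.length - 1; i++) {
    if (results[i].links < results[i + 1].links) {
      anomalies.push({
        keyword: keyword,
        higherRanked: results[i].url,    // ranks better despite fewer links
        lowerRanked: results[i + 1].url  // ranks worse despite more links
      });
    }
  }
  return anomalies;
}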

So what are the benefits of this methodology? By looking at change over time, we can see whether the ranking factor (correlate) is a leading or lagging feature. A lagging feature can automatically be ruled out as causal. A leading factor has the potential to be a causal factor.

We collect a search result. We record where the search result differs from the expected predictions of a particular variable (like links or social shares). We then collect the same search result 2 weeks later to see if the search engine has corrected the out-of-order results.

Following this methodology, we tested 3 different common correlates produced by ranking factors studies: Facebook shares, number of root linking domains, and Page Authority. The first step involved collecting 10,000 SERPs from randomly selected keywords in our Keyword Explorer corpus. We then recorded Facebook Shares, Root Linking Domains, and Page Authority for every URL. We noted every example where 2 adjacent URLs (like positions 2 and 3 or 7 and 8) were flipped with respect to the expected order predicted by the correlating factor. For example, if the #2 position had 30 shares while the #3 position had 50 shares, we noted that pair. Finally, 2 weeks later, we captured the same SERPs and identified the percent of times that Google rearranged the pair of URLs to match the expected correlation. We also randomly selected pairs of URLs to get a baseline percent likelihood that any 2 adjacent URLs would switch positions. Here were the results…

The outcome

It’s important to note that it is incredibly rare to expect a leading factor to show up strongly in an analysis like this. While the experimental method is sound, it’s not as simple as a factor predicting the future — it assumes that in some cases we will know about a factor before Google does. The underlying assumption is that in some cases we have seen a ranking factor (like an increase in links or social shares) before Googlebot has, and that in the 2-week period Google will catch up and correct the incorrectly ordered results. As you can expect, this is a rare occasion. However, with a sufficient number of observations, we should be able to see a statistically significant difference between lagging and leading results. Bear in mind, too, that the methodology only detects cases where a factor is leading and where Moz Link Explorer discovered the relevant factor before Google did.

Factor                              | Percent Corrected | P-Value | 95% Min  | 95% Max
Control                             | 18.93%            | 0       |          |
Facebook Shares Controlled for PA   | 18.31%            | 0.00001 | -0.6849  | -0.5551
Root Linking Domains                | 20.58%            | 0.00001 | 0.016268 | 0.016732
Page Authority                      | 20.98%            | 0.00001 | 0.026202 | 0.026398

Control:

In order to create a control, we randomly selected adjacent URL pairs in the first SERP collection and determined the likelihood that the second will outrank the first in the final SERP collection. Approximately 18.93% of the time the worse ranking URL would overtake the better ranking URL. By setting this control, we can determine if any of the potential correlates are leading factors – that is to say that they are potential causes of improved rankings.

Facebook Shares:

Facebook Shares performed the worst of the three tested variables. Facebook Shares actually performed worse than random (18.31% vs 18.93%), meaning that randomly selected pairs would be more likely to switch than those where shares of the second were higher than the first. This is not altogether surprising, as it is the general industry consensus that social signals are lagging factors — that is to say, traffic from higher rankings drives higher social shares, rather than social shares driving higher rankings. Accordingly, we would expect to see the ranking change first, before we would see the increase in social shares.

RLDs

Raw root linking domain counts performed substantially better than shares at ~20.5%. As I indicated before, this type of analysis is incredibly subtle because it only detects cases where a factor is leading and where Moz Link Explorer discovered the relevant factor before Google did. Nevertheless, this result was statistically significant, with a P-value < 0.0001 and a 95% confidence interval indicating that RLDs predict future ranking changes around 1.5% more often than random.

Page Authority

By far, the highest performing factor was Page Authority. At 21.5%, PA correctly predicted changes in SERPs 2.6% better than random. This is a strong indication of a leading factor, greatly outperforming social shares and outperforming the best predictive raw metric, root linking domains. This is not surprising. Page Authority is built to predict rankings, so we should expect it to outperform raw metrics in identifying when a shift in rankings might occur. Now, this is not to say that Google uses Moz Page Authority to rank sites, but rather that Moz Page Authority is a relatively good approximation of whatever link metrics Google is using to determine rankings.

Concluding thoughts

There are so many different experimental designs we can use to help improve our research industry-wide, and this is just one of the methods that can help us tease out the differences between causal ranking factors and lagging correlates. Experimental design does not need to be elaborate and the statistics to determine reliability do not need to be cutting edge. While machine learning offers much promise for improving our predictive models, simple statistics can do the trick when we’re establishing the fundamentals.

Now, get out there and do some great research!

Moz Blog


3 Empowering Small Business Tips for Today, 2019, and a Better Future

Posted by MiriamEllis

“American business is overwhelmingly small business.” – SBE Council

Small businesses have created 61.8% of net new jobs in the US since the early 1990s. Local business is big business. Let’s celebrate this in honor of Small Business Saturday with 3 strategies that will support independent business owners this week, and in the better future that can be attained with the right efforts.

What’s Small Business Saturday?

It’s an annual shopping event sponsored by American Express on the Saturday following Thanksgiving with the primary goal of encouraging residents to patronize local merchants. The program was launched in 2010 in response to the Great Recession. By 2017, Small Business Saturday jumped to 7,200 Neighborhood Champions (individuals and groups that organize towns for the event), with 108 million reported participating consumers spending $12 billion across the country.

Those numbers are impressive, and more than that, they hold the acorn of strategy for the spreading oak of a nation in which independently grown communities set standards of living, set policy, and set us on course for a sustainable future.

Tips for small businesses today

If your community is already participating in Small Business Saturday, try these techniques to enhance your success on the big day:

1. Give an extra reason to shop with you

This can be as simple as giving customers a small discount or a small free gift with their purchase, or as far-reaching as donating part of the proceeds of the day’s sales to a worthy local cause. Give customers a reason to feel extra good that they shopped with you, especially if you can demonstrate how their purchase supports their own community. Check out our Local Business Holiday Checklist for further tips.

2. Give local media something to report

Creativity is your best asset in deciding how to make your place of business a top destination on Small Business Saturday, worthy of mentions in the local news. Live music? A treasure hunt? The best store window in town? Reach out to reporters if you’re doing something extra special to build up publicity.

3. Give a reason to come back year-round

Turn a shopping moment into a teaching moment. Print up some flyers from the American Independent Business Alliance and pass them out to customers to teach them how local purchasing increases local wealth, health, and security. Take a minute or two to talk with customers who express interest. Sometimes, all it takes is a little education and kindness to shift habits. First, take a few minutes to boost your own education by reading How to Win Some Customers Back from Amazon this Holiday Season.

AMIBA has a great list of tips for Small Business Saturday success and American Express has the best examples of how whole communities have created memorable events surrounding these campaigns. I’ve seen everything from community breakfast kickoffs in Michigan, to jazz bands in Louisiana, to Santa Claus coming to town on a riverboat in California. Working closely with participating neighboring businesses can transform your town or city into a holiday wonderland on this special day, and if your community isn’t involved yet, research this year can prepare you to rally support for an application to next year’s program.

Tips for small businesses for the new year

Unless your town is truly so small that all residents are already aware of every business located there, make 2019 the year you put the Internet to work for you and your community. Even small town businesses have news and promotions they’d like to share on the web, and don’t forget the arrival of new neighbors and travelers who need to be guided to find you. In larger cities, every resident and visitor needs help navigating the local commercial scene.

Try these tips for growth in the new year:

  1. Dig deeply into the Buy Local movement by reading The Local SEO’s Guide to the Buy Local Phenomenon. Even if you see yourself as a merchant today, you can re-envision your role as a community advocate, improving the quality of life for your entire town.
  2. Expand your vision of excellent customer service to include the reality that your neighbors are almost all on the Internet part of every day looking for solutions to their problems. A combination of on-and-offline customer service is your key to becoming the problem-solver that wins lucrative, loyal patrons. Read What the Local Customer Service Ecosystem Looks Like in 2019.
  3. Not sure where to begin learning about local search marketing on the web? First, check out Moz’s free Local SEO Learning Center with articles written for the beginner to familiarize yourself with the basic concepts. Then, start following the recognized leaders in this form of marketing to keep pace with new developments and opportunities as they arise. Make a new year’s resolution to devote just 15 minutes a day, 5 days a week, to learning more about marketing your small local business. By the end of a single year, you will have become a serious force for promotion of your company and the community it serves.

Tips for an independent business future: The time is right

I’ve been working in local business marketing for about 15 years, watching not just the development of technologies, but the ebb and flow of brand and consumer habits and attitudes. What I’m observing with most interest as we close out the present year is a rising tide of localistic leanings.

On the one hand, we have some of the largest brands (Google, Amazon, Facebook, etc.) losing the public’s trust amid serious scandals surrounding privacy, human rights violations, and even war. On the other hand, we have small business owners uniting to revitalize their communities, from wounded cities like Detroit to tiny towns like Bozeman, in the wake of the Great Recession, itself cited as a product of big brands.

Where your company does business may influence your customers’ take on economics, but overall, the engrossing trend I’m seeing is towards more trust in smaller, independently owned companies. In fact, communities across the US are starting to map out futures for themselves that are as self-sustaining as possible. Earlier, I referenced small business owners undergoing a mental shift from lone merchant to community advocate, and here, I’ve mapped out a basic model for towns and cities to shift toward independence.

What most communities can’t access locally are branded products: imported big label clothing, packaged foods, electronics, cars, branded cosmetics, books. Similarly, most communities don’t have direct local access to the manufacture or mining of plastics, metals, and gases. And, very often, towns and cities lack access to agroforestry for raw lumber, fuel, natural fibers and free food. So, let’s say for now that the typical community leaves these things up to big brands so that they can still buy computers, books and stainless steel cookware from major manufacturers.

But beyond this, with the right planning, the majority of the components for a high standard of living can be created and owned locally. For example:

There are certainly some things we may rely on big brands and federal resources for, but it isn’t Amazon or the IRS who give us a friendly wave as we take our morning hike through town, making us feel acknowledged as people and improving our sense of community. For that, we have to rely on our neighbor. And it’s becoming increasingly clear that it’s up to towns and cities to determine whether neighbors are experiencing a decent standard of living.

Reading the mood of the economy, I am seeing more and more Americans becoming open to the messages that the percentage of small businesses in a community correlates with residents’ health, that quality social interactions lessen the chances of premature death by 50%, that independent businesses recirculate almost 4x as much community wealth, and that Main Street-style city planning massively reduces pollution vs. big box stores on the outskirts of town.

Small Business Saturday doesn’t have to be a once-a-year phenomenon. Small business owners, by joining together as community advocates, have the power to make it a way of life where they live. And they have one significant advantage over most corporations, the value of which shouldn’t be underestimated: they can begin the most important conversations face-to-face with their neighbors, asking, “Who do we want to be? Where do we want to live? What’s our best vision for how life could be here?”

Don’t be afraid to talk beyond transactions with your favorite customers. Listening closely, I believe you’ll discover that there’s a longing for change and that the time is right.

Moz Blog


Yum CEO: Driverless Cars, Robots Making Pizzas, This is All In Our Future

Yum Brands, which owns Taco Bell, KFC, Pizza Hut, and other restaurant brands, is at the forefront of technological innovation. Yum also isn’t afraid to experiment with seemingly outlandish ideas, such as its announcement of the Toyota Tundra PIE Pro, which makes pizza on the go.

Yum Brands CEO Greg Creed recently discussed Yum’s use of technology:

We Love Our Relationship with Grubhub

“We love our relationship with Grubhub, it’s a great partnership,” says Yum Brands CEO Greg Creed. “By the end of the year in the U.S., we’ll have about 2,000 KFC’s and probably close to 4,000 Taco Bell’s delivering. In the stores that are already delivering we’re getting check increases and incremental sales that are coming from it, so we’re very excited about this partnership. We think it obviously bodes well for the future sales growth for both KFC and for Taco Bell in the U.S.”

Driverless Cars and Robots Making Pizzas is Our Future

Pizza Hut has partnered with Toyota to develop a zero-emission Tundra PIE Pro, a mobile pizza factory with the ability to deliver oven-hot pizza wherever it goes. The full-size pizza-making truck was introduced at Toyota’s 2018 Specialty Equipment Market Association (SEMA) Show presentation.

“I love our partnership with Toyota,” added Creed. “This is really about technology, this is about robotics, this is about what the future is envisioning. Driverless cars, robots making pizzas, this is all in our future. Is it in our future next week? No, but is it in our foreseeable future, absolutely. Everything that we can do to make the brands more relevant, make them easier to access and more distinctive, that’s what will lead to continued success, not just for Pizza Hut but also at KFC and at Taco Bell.”

Pizza Hut Partners with Toyota on the Tundra PIE Pro


“Nothing tastes better than a fresh Pizza Hut pizza straight out of the oven,” said Marianne Radley, Chief Brand Officer, Pizza Hut. “The Tundra PIE Pro brings to life our passion for innovation not just on our menu but in digital and delivery in order to provide the best possible customer experience.”

The post Yum CEO: Driverless Cars, Robots Making Pizzas, This is All In Our Future appeared first on WebProNews.

WebProNews


Why We’re Doubling Down on the Future of SEO – Moz + STAT

Posted by Dr-Pete

Search is changing. For a 200-person search marketing software company, this isn’t just a pithy intro – it’s a daily threat to our survival. Being an organic search marketer can be frustrating when even a search like “What is SEO?” returns something like this…

[Screenshots: example “What is SEO?” SERPs]

So, why don’t we just give up on search marketing altogether? If I had to pick just one answer, it’s this – because search still drives the lion’s share of targeted, relevant traffic to business websites (and Google drives the vast majority of that traffic, at least in the US, Canada, Australia, and Western Europe).

We have to do everything better

The answer isn’t to give up – it’s to recognize all of this new complexity, study it, and do our jobs better. Earlier this year, for example, we embarked on a study to understand how SERP features impact click-through rates (CTR). It turns out to be a difficult problem, but even the initial insights from the data were useful (and a bit startling). For example, here’s the average organic (SERPs with no features) curve from that study…

Various studies show the starting point at various places, but the shape itself is consistent and familiar. We know, though, that reducing everything to one average ignores a lot. Here’s a dramatic example. Let’s compare the organic curve to the curve for SERPs with expanded sitelinks (which are highly correlated with dominant and/or branded intent)…

Results with sitelinks in the #1 position have a massive 80% average CTR, with a steep drop to #2. These two curves represent two wildly different animals. Now, let’s look at SERPs with Knowledge Cards (AKA “answer boxes” – Knowledge Graph entities with no organic link)…

The CTR in the #1 organic position drops to almost 1/3 of the organic-only curve, with corresponding drops throughout all positions. Organic opportunity on these SERPs is severely limited.
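As a rough illustration of how CTR curves like these can be built, the sketch below computes an impression-weighted CTR by position for each SERP type. The column names and sample numbers are hypothetical stand-ins, not Moz’s actual click data:

```python
import pandas as pd

# Hypothetical click data: one row per (SERP type, organic position).
df = pd.DataFrame({
    "serp_type":   ["organic_only", "organic_only", "expanded_sitelinks", "knowledge_card"],
    "position":    [1, 2, 1, 1],
    "impressions": [120000, 110000, 90000, 80000],
    "clicks":      [38000, 17000, 72000, 9000],
})

# Impression-weighted average CTR by position, one curve per SERP type.
agg = df.groupby(["serp_type", "position"])[["clicks", "impressions"]].sum()
curves = (agg["clicks"] / agg["impressions"]).rename("ctr").unstack("serp_type")
print(curves)
```

Plotting each column of `curves` against position would reproduce the kind of organic, sitelinks, and Knowledge Card curves described above.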

Opportunity isn’t disappearing, but it is evolving. We have to do better. This is why Moz has teamed up with STAT, and why we’re doubling down on search. We recognize the complexity of SERP analytics in 2018, but we also truly believe that there’s real opportunity for those willing to do the hard work and build better tools.

Doubling down on RANKINGS

It hurts a bit to admit, but there has been more than one occasion in the past couple of years when a client outgrew Moz for rank tracking. When that happened, we had one thing to say to those clients: “We’ll miss you, and you should talk to STAT Search Analytics.” STAT has been a market leader in daily rank tracking, and they take that job very seriously, with true enterprise-scale capabilities and reporting.

For the past couple of years, STAT’s team has also been a generous source of knowledge, and even as competitors our engineering teams have shared intel on Google’s latest changes. As of now, all brakes are off, and we’re going to dive deep into each other’s brains (figuratively, of course – I only take mad science so far) to find out what each team does best. We’re going to work to combine the best of STAT’s daily tracking technology with Moz’s proprietary metrics (such as Keyword Difficulty) to chart the future of rank tracking.

We’ll also be working together to redefine what “ranking” means, in an organic sense. There are multiple SERP features, from Featured Snippets to Video Carousels to People Also Ask boxes that represent significant organic opportunity. STAT and Moz both have a long history of researching these opportunities and recognize the importance of reflecting them in our products.

Doubling down on RESEARCH

One area Moz has excelled at, showcased in the launch and evolution of Keyword Explorer, is keyword research. We’ll be working hard to put that knowledge to work for STAT customers even as we evolve Moz’s own toolsets. We’re already doing work to better understand keyword intent and how it impacts keyword research – beyond semantically related keywords, how do you find the best keywords with local intent or targeted at the appropriate part of the sales funnel? In an age of answer engines, how do you find the best questions to target? Together, we hope to answer these questions in our products.

In August, we literally doubled our keyword corpus in Keyword Explorer to supercharge your keyword research. You can now tap into suggestions from 160 million keywords across the US, Canada, UK, and Australia.

Beyond keywords, Moz and STAT have both been market leaders in original industry research, and we’ll be stronger together. We’re going to have access to more data and more in-house experts, and we’ll be putting that data to work for the search industry.

Doubling down on RESULTS

Finally, we recognize that SERP analytics are much more than just a number from 1–50. You need to understand how results drive clicks, traffic, and revenue. You need to understand your competitive landscape. You need to understand the entire ecosystem of keywords, links, and on-page SEO, and how those work together. By combining STAT’s enterprise-level analytics with Moz’s keyword research, link graph, and technical SEO tools (including both Site Crawl and On-demand Crawl), we’re going to bring you the tools you need to demonstrate and drive bottom-line results.

In the short-term, we’re going to be listening and learning from each other, and hopefully from you (both our community and our customers). What’s missing in your search marketing workflow today? What data do you love in Moz or STAT that’s missing from the other side? How can we help you do your job better? Let us know in the comments.

If you’d like to be notified of future developments, join our Moz+STAT Search Analytics mailing list (sign-up at bottom of page) to find out about news and offers as we roll them out.

Moz Blog


The Future of Connection on Facebook: How Stories May Change the Marketing Game

How Facebook Stories Will Change Social Media Marketing


“You have part of my attention – you have the minimum amount.”

This scathing remark, delivered by actor Jesse Eisenberg while portraying Mark Zuckerberg amidst a heated deposition in the 2010 film The Social Network, has a certain pertinence today with regards to the company Zuckerberg founded back in 2004.

As Facebook’s news feed algorithm becomes increasingly restrictive for brands and publishers, many of us are finding it difficult to capture even the minimum amount of our audience’s attention on the platform.

The search for elusive reach on the world’s largest social media channel has led some marketers to explore Facebook Groups as a way to stay visible with users. But it appears the more critical frontier may be Facebook Stories, a feature that is rapidly on the rise and — according to the company’s own top execs — represents the future of connection on Facebook.


A Primer on Facebook Stories

The Social Network, referenced earlier, is a biographical drama depicting the inception of Facebook and the power struggles that took place. The film was extremely well received, earning eight Oscar nominations and winning three: Best Adapted Screenplay, Best Original Score, and Best Film Editing.

Certain people portrayed in the movie have criticized its inaccuracies (it wasn’t exactly kind to Mr. Zuckerberg, as the opening quote in this post illustrates), and writer Aaron Sorkin doesn’t deny playing loose with the facts.

“I don’t want my fidelity to be to the truth,” he told New York Magazine. “I want it to be to storytelling.”

A reputed screenwriter, Sorkin understands the power of stories, which have an ability to hook and captivate audiences in a way few other styles of communication can hope to match. This dynamic is undoubtedly driving the growth of “Stories” — series of images and videos played in succession, perfectly suited for mobile screens — across all social media platforms.

This chart via Block Party’s report, Beyond the News Feed: Why Stories Are Becoming the New Face of Social Media, visualizes the unmistakable trend well:

[Chart: Facebook Stories usage trend]

Interestingly, Snapchat — which largely sparked the popularity of this format when its “My Story” feature launched in 2014 — has remained stagnant while other players have gained fast traction. You can definitely count Facebook among them.

Originally rolled out on mobile in 2017, Facebook Stories made their way to desktop earlier this year and the feature now boasts 150 million daily active users. Like the versions on Instagram and Snapchat, this content is ephemeral — Facebook Stories and all of their comments disappear after 24 hours. But the convention itself is here to stay.

“We expect Stories are on track to overtake posts in feeds as the most common way that people share across all social apps,” said Zuckerberg (the real one, not the Eisenberg character) during a fourth-quarter earnings conference call.

This sentiment is shared by Facebook’s Chief Product Officer, Chris Cox, who laid out a more specific and imminent timeline at the company’s annual conference in early May:

The increase in the Stories format is on a path to surpass feeds as the primary way people share things with their friends sometime next year.

Needless to say, this is a story marketers need to be tracking.

The Other Side of the Story

Okay, so we know that Stories are quickly becoming a mainstream method for sharing content on social media, and we know that Facebook is making a firm commitment to the format. What does all this mean to us as marketers?

[Screenshot: “Add to Your Facebook Story”]

This is definitely a tool that companies can use, if they are so inclined. You have the ability to post them from your brand page, and (at least for now) it may increase your content’s odds of getting noticed. Relatively speaking, this feature isn’t being used all that much, and Facebook’s clear emphasis on growing it means that Stories are carving prime real estate above the news feed.

Some view this as the next great social media marketing opportunity on the platform. Earlier this year, Bud Torcom wrote in a piece at Forbes that Facebook Stories are “like California’s mines and creeks before the 1849 gold rush.” He sees this format transforming campaigns through experimentation, experiential marketing, influencer integration, and visual pizzazz.

Michelle Cyca sees similar potential, as she wrote on the HootSuite blog, calling Stories “a way to reconnect with users who aren’t seeing your content in their Newsfeed the same way” and calling out examples of campaigns that drove lifts in awareness by incorporating the tactic.

The idea of added organic reach is enticing (if fleeting, knowing that the onset of ads will turn this — like all Facebook marketing initiatives — into a pay-to-play space), but what really intrigues me about Stories is the almost infinite grounds for creativity.

[Image: Facebook Stories examples from ModCloth and Mashable]

It’s a very cool method for visual storytelling. It’s a low-barrier entry point for social video (no one is expecting premium production quality on these). And it presents an accessible avenue for toying with emerging technologies — most notably, augmented reality, which is being strongly integrated into Facebook Stories in another step down the road Snapchat has paved.


Where Does the Story Go Next?

“You don’t even know what the thing is yet. How big it can get, how far it can go. This is no time to take your chips down.”

This advice — delivered to Eisenberg’s Zuckerberg by Justin Timberlake’s Sean Parker in The Social Network — referred to Zuck’s budding Facebook venture, but could just as easily apply to any social media marketer eyeing Stories as a way to connect with their audience.

The downside is minimal. What have you got to lose? A little time and effort, perhaps. The possible benefits, however, are extensive. These include:

  • Prioritized placement on user feeds
  • Engaging bite-sized video content
  • Powerful visual storytelling for brands
  • Ability to experiment with new content styles and emerging tech like AR
  • Gaining familiarity with a format that could well represent the future of social marketing

More than anything, though, Facebook Stories are intriguing because they offer a real chance to capture part of a user’s attention — maybe even more than the minimum amount.


And since brands generally aren’t yet tapping into this functionality, early adopters can jump ahead of the curve and beat their competition to the punch. If there’s one primary takeaway from Facebook’s story (as reflected in The Social Network), it’s the tremendous business value of being first. Just ask the Winklevoss twins.

At TopRank Marketing, we’re all about helping companies tell their stories through a wide variety of digital channels and tactics. Give us a shout if you’d like to hear more.

What are your thoughts on the future of Facebook Stories? Tell us in the comments section below.

The post The Future of Connection on Facebook: How Stories May Change the Marketing Game appeared first on Online Marketing Blog – TopRank®.

Online Marketing Blog – TopRank®


Web Personalization: The Future of Digital Marketing and Sales Is Now

In the beginning, the business website was a mere brochure. Low value, low shareability, low findability. Around 2005, a big shift happened thanks to content. Cutting-edge business websites became educational resources with valuable content that ranked well in search engines and benefited from the sharing functionality of emerging social media. Soon, “cutting edge” became the
Read More…

The post Web Personalization: The Future of Digital Marketing and Sales Is Now appeared first on Copyblogger.


Copyblogger

