
The Real Impact of Mobile-First Indexing & The Importance of Fraggles

Posted by Suzzicks

While SEOs have been doubling down on content and quality signals for their websites, Google was building the foundation of a new reality for crawling, indexing, and ranking. Though many believe deep in their hearts that “Content is King,” the reality is that Mobile-First Indexing enables a new kind of search result: one that focuses on surfacing and re-publishing content in ways that feed Google’s cross-device monetization opportunities better than simple websites ever could.

For two years, Google honed and changed their messaging about Mobile-First Indexing, mostly de-emphasizing the risk that good, well-optimized, Responsive-Design sites would face. Instead, the search engine giant focused on the use of the Smartphone bot for indexing, which led to an emphasis on the importance of matching SEO-relevant site assets between the desktop and mobile versions (or renderings) of a page. Things got a bit tricky when Google had to explain that the Mobile-First Indexing process would not necessarily be bad for desktop-oriented content, but all of Google’s shifting and positioning eventually validated my long-stated belief: that Mobile-First Indexing is not really about mobile phones, per se, but about mobile content.

I would like to propose an alternative to the predominant view: a speculative theory about what has been going on with Google over the past two years. It is the thesis of my 2019 MozCon talk, something we are calling Fraggles and Fraggle-based Indexing.

I’ll go through Fraggles and Fraggle-based indexing, and how this new method of indexing has made web content more ‘liftable’ for Google. I’ll also outline how Fraggles impact the Search Results Pages (SERPs), and why this fits with Google’s promotion of Progressive Web Apps. Next, I will explain how astute SEOs can adapt their understanding of SEO and leverage Fraggles and Fraggle-Based Indexing to meet the needs of their clients and companies. Finally, I’ll go over the implications that this new method of indexing will have on Google’s monetization and technology strategy as a whole.

Ready? Let’s dive in.

Fraggles & Fraggle-based indexing

The SERP has changed in many ways. These changes can be thought of and discussed separately, but I believe that they are all part of a larger shift at Google. This shift includes “Entity-First Indexing” of crawled information around the existing structure of Google’s Knowledge Graph, and the concept of “Portable-prioritized Organization of Information,” which favors information that is easy to lift and re-present in Google’s properties — Google describes these two things together as “Mobile-First Indexing.”

As SEOs, we need to remember that the web is getting bigger and bigger, which means that it’s getting harder to crawl. Users now expect Google to index and surface content instantly. But while webmasters and SEOs were building out more and more content in flat, crawlable HTML pages, the best parts of the web were moving towards more dynamic websites and web-apps. These new assets were driven by databases of information on a server, populating their information into websites with JavaScript, XML or C++, rather than flat, easily crawlable HTML. 

For many years, this was a major problem for Google, and thus, it was a problem for SEOs and webmasters. Ultimately though, it was the more complex code that forced Google to shift to this more advanced, entity-based system of indexing — something we at MobileMoxie call Fraggles and Fraggle-Based Indexing, and the credit goes to JavaScript’s “Fragments.”

Fraggles represent individual parts (fragments) of a page onto which Google overlays a “handle” or “jump-link” (aka named anchor, bookmark, etc.), so that a click on the result takes the user directly to the part of the page where the relevant fragment of text is located. These Fraggles are then organized around the relevant nodes on the Knowledge Graph, so that the mapping of the relationships between different topics can be vetted, built out, and maintained over time, but also so that the structure can be used and reused internationally, even if different content is ranking.

More than one Fraggle can rank for a page, and the format can vary: a text link with a “Jump to” label, an unlabeled text link, a site-link carousel (with or without pictures), or occasionally horizontal or vertical expansion boxes for the different items on a page.

The most notable thing about Fraggles is the automatic scrolling behavior from the SERP. While Fraggles are often linked to content that has an HTML or JavaScript jump-link, sometimes the jump-links appear to be added by Google without being present in the code at all. This behavior is also prominently featured in AMP Featured Snippets, for which Google has the same scrolling behavior but also includes colored highlighting — superimposed on the page — to show the part of the page that was displayed in the Featured Snippet, allowing the searcher to see it in context. I write about this more in the article: What the Heck are Fraggles.
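To make the jump-link mechanics concrete, here is a minimal sketch in Python (the page URL, HTML, and id values are all invented for illustration) of how the “handles” already present in a page’s code could be mapped to the # URLs that a Fraggle result points to:

```python
from html.parser import HTMLParser

class AnchorFinder(HTMLParser):
    """Collect id attributes that can serve as jump-link targets."""
    def __init__(self):
        super().__init__()
        self.anchor_ids = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "id" and value:
                self.anchor_ids.append(value)

def fraggle_urls(page_url, html):
    """Return one jump-link URL per identifiable fragment of the page."""
    finder = AnchorFinder()
    finder.feed(html)
    return [f"{page_url}#{anchor}" for anchor in finder.anchor_ids]

# Hypothetical page about fruits and vegetables, one section per topic.
html = """
<h2 id="lettuce">Lettuce</h2><p>Types of lettuce...</p>
<h2 id="radishes">Radishes</h2><p>Growing radishes...</p>
"""
print(fraggle_urls("https://example.com/vegetables", html))
# ['https://example.com/vegetables#lettuce', 'https://example.com/vegetables#radishes']
```

Each of those # URLs could then be ranked, and scrolled to, independently of the page as a whole.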

How Fraggles & Fraggle-based indexing works with JavaScript

Google’s desire to index Native Apps and Web Apps, including single-page apps, has necessitated Google’s switch to indexing based on Fragments and Fraggles, rather than pages. In JavaScript, as well as in Native Apps, a “Fragment” is a piece of content or information that is not necessarily a full page. 

The easiest way for an SEO to think about a Fragment is with the example of an AJAX expansion box: The piece of text or information that is fetched from the server to populate the AJAX expander when clicked could be described as a Fragment. Once that Fragment is indexed for Mobile-First Indexing, it is a Fraggle.

It is no coincidence that Google announced the launch of Deferred JavaScript Rendering at roughly the same time as the public roll-out of Mobile-First Indexing, without drawing out the connection. But here it is: When Google can index fragments of information from web pages, web apps, and native apps, all organized around the Knowledge Graph, the data itself becomes “portable” or “mobile-first.”

We have also recently discovered that Google has begun to index URLs with a # jump-link, after years of not doing so, and is reporting on them separately from the primary URL in Search Console. As you can see below from our data, they aren’t getting a lot of clicks, but they are getting impressions. This is likely because of the low average position. 

Before Fraggles and Fraggle-Based Indexing, indexing # URLs would have just resulted in a massive duplicate-content problem and extra indexing work for Google. Now that Fraggle-based Indexing is in place, it makes sense to index and report on # URLs in Search Console — especially for breaking up long, drawn-out JavaScript experiences like PWAs and Single-Page Apps that don’t have separate URLs or databases, and, in the long run, possibly even for indexing native apps without Deep Links.

Why index fragments & Fraggles?

If you’re used to thinking of rankings with the smallest increment being a URL, this idea can be hard to wrap your brain around. To help, consider this thought experiment: How useful would it be for Google to rank a page that gave detailed information about all different kinds of fruits and vegetables? It would be easy for a query like “fruits and vegetables,” that’s for sure. But if the query is changed to “lettuce” or “types of lettuce,” then the page would struggle to rank, even if it had the best, most authoritative information. 

This is because the “lettuce” keywords would be diluted by all the other fruit and vegetable content. It would be more useful for Google to rank the part of the page that is about lettuce for queries related to lettuce, and the part of the page about radishes for queries about radishes. But since users don’t want to scroll through an entire page of fruits and vegetables to find the information about the particular vegetable they searched for, Google prioritizes pages with keyword focus and density as they relate to the query, and will rarely rank long pages that cover multiple topics, even if they are more authoritative.
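The dilution argument can be illustrated with a toy calculation (this is not Google’s actual ranking math, just a sketch showing why a single fragment can be far more focused on a query term than the page that contains it):

```python
def term_share(text, term):
    """Fraction of words in the text that match the query term."""
    words = text.lower().split()
    return words.count(term) / len(words)

# Hypothetical single page covering several vegetable topics at once.
page_sections = {
    "lettuce": "lettuce comes in many types including romaine and iceberg lettuce",
    "radishes": "radishes are crisp peppery root vegetables easy to grow",
    "carrots": "carrots are sweet orange root vegetables rich in vitamins",
}

whole_page = " ".join(page_sections.values())

# The whole page's focus on "lettuce" is diluted by the other topics...
print(term_share(whole_page, "lettuce"))
# ...while the lettuce fragment on its own is strongly focused on it.
print(term_share(page_sections["lettuce"], "lettuce"))
```

Scoring the fragment instead of the whole page preserves that focus without forcing every topic onto its own URL.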

With featured snippets, AMP featured snippets, and Fraggles, it’s clear that Google can already find the important parts of a page that answer a specific question — they’ve actually been able to do this for a while. So, if Google can organize and index content like that, what would the benefit be in maintaining an index based only on per-page statistics and rankings? Why would Google want to rank entire pages when they could rank just the parts of pages that are most related to the query?

To address this, SEOs have historically worked to break individual topics out into separate pages, with one page focused on each topic or keyword cluster. So, with our vegetable example, this would ensure that the lettuce page could rank for lettuce queries and the radish page could rank for radish queries. With each website creating a new page for every possible topic that it would like to rank for, there’s a lot of redundant and repetitive work for webmasters, and it likely adds a lot of low-quality, unnecessary pages to the index. Realistically, how many individual pages on lettuce does the internet really need, and how would Google determine which one is the best? The fact is, Google wanted to shift to an algorithm that focused less on links and more on topical authority to surface only the best content, and the scrolling feature in Fraggles lets it circumvent the need for a separate page on every topic.

Even though the effort to switch to Fraggle-based indexing, and to organize the information around the Knowledge Graph, was massive, the long-term benefits of the switch far outpace the costs, because they make Google’s system more flexible, monetizable, and sustainable, especially as the amount of information and the number of connected devices expand exponentially. It also helps Google identify, serve, and monetize new cross-device search opportunities as they continue to expand, including search results on TVs, connected screens, and spoken results from connected speakers. A few relevant costs and benefits are outlined below for you to contemplate, keeping Google’s long-term perspective in mind:

Why Fraggles and Fraggle-based indexing are important for PWAs

What also makes the shift to Fraggle-based Indexing relevant to SEOs is how it fits in with Google’s championing of Progressive Web Apps and AMP Progressive Web Apps (aka PWAs and PWA-AMP websites/web apps). These types of sites have become the core focus of Google’s Chrome Developer Summits and other smaller Google conferences.

From the perspective of traditional crawling and indexing, Google’s focus on PWAs is confusing. PWAs often feature heavy JavaScript and are still frequently built as Single-Page Apps (SPAs), with only one or a few URLs. Both of these traits would make PWAs especially difficult and resource-intensive for Google to index in a traditional way — so, why would Google be so enthusiastic about PWAs?

The answer is that PWAs require ServiceWorkers, which let Fraggles and Fraggle-based indexing take the burden off the crawling and indexing of complex web content.

In case you need a quick refresher: A ServiceWorker is a JavaScript file that instructs a device (mobile or computer) to create a local cache of content to be used just for the operation of the PWA. It is meant to make loading much faster, because content is stored locally rather than fetched again from a server or CDN somewhere on the internet, and it does so by saving copies of the text and images associated with certain screens in the PWA. Once a user accesses content in a PWA, that content doesn’t need to be fetched again from the server. It’s a bit like browser caching, but faster: the ServiceWorker stores the information about when each piece of content expires locally, rather than on the web. This is what makes PWAs seem to work offline, but it is also why content that has not been visited yet is not stored in the ServiceWorker cache.
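This caching behavior can be modeled in a few lines. The sketch below is not the ServiceWorker API itself (ServiceWorkers are written in JavaScript), just a Python model of the core idea: every cached item carries its own expiry, so only stale items ever trigger a new fetch. The class name, URLs, and TTL are all invented for illustration.

```python
import time

class FraggleCache:
    """Toy model of ServiceWorker-style caching: each entry carries
    its own expiry, so only stale entries need to be re-fetched."""
    def __init__(self):
        self._store = {}  # url -> (content, expires_at)

    def put(self, url, content, max_age_seconds):
        self._store[url] = (content, time.time() + max_age_seconds)

    def needs_refetch(self, url):
        entry = self._store.get(url)
        return entry is None or time.time() >= entry[1]

    def get(self, url, fetch):
        if self.needs_refetch(url):
            content = fetch(url)  # hit the network only when stale
            self.put(url, content, max_age_seconds=3600)  # arbitrary 1-hour TTL
        return self._store[url][0]

cache = FraggleCache()
fetch_log = []

def fake_fetch(url):
    fetch_log.append(url)
    return f"content of {url}"

cache.get("https://example.com/app#home", fake_fetch)
cache.get("https://example.com/app#home", fake_fetch)  # served locally
print(len(fetch_log))  # the network was hit only once
```

A crawler that could read those expiry times would know exactly which items are worth re-fetching and when.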

ServiceWorkers and SEO

Most SEOs who understand PWAs understand that a ServiceWorker is for caching and load time, but they may not realize that it is likely also for indexing. If you think about it, ServiceWorkers mostly store the text and images of a site, which is exactly what the crawler wants. A crawler that uses Deferred JavaScript Rendering could go through a PWA, simulate clicking on all the links, and store the static content using the framework set forth in the ServiceWorker. And it could do this without always having to crawl all the JavaScript on the site, as long as it understood how the site was organized and that organization stayed consistent.

Google would also know exactly how often to re-crawl, and could therefore re-crawl certain items only when they are set to expire in the ServiceWorker cache. This saves Google a lot of time and effort, allowing them to get through, or possibly skip, complex code and JavaScript.

For a PWA to be indexed, Google requires webmasters to ‘register their app in Firebase,’ though they used to require webmasters to “register their ServiceWorker.” Firebase is the Google platform that allows webmasters to set up and manage indexing and deep linking for their native apps, chat-bots, and, now, PWAs.

Direct communication with a PWA specialist at Google a few years ago revealed that Google didn’t crawl the ServiceWorker itself, but crawled the API to the ServiceWorker. It’s likely that when webmasters register their ServiceWorker with Google, Google is actually creating an API to the ServiceWorker, so that the content can be quickly and easily indexed and cached on Google’s servers. Since Google has already launched an Indexing API and appears to now favor APIs over traditional crawling, we believe Google will begin pushing the use of ServiceWorkers to improve page speed, since they can be used on non-PWA sites, but this will actually help ease the burden on Google of crawling and indexing the content manually.

Flat HTML may still be the fastest way to get web information crawled and indexed with Google. For now, JavaScript still has to be deferred for rendering, but it is important to recognize that this could change and crawling and indexing is not the only way to get your information to Google. Google’s Indexing API, which was launched for indexing time-sensitive information like job postings and live-streaming video, will likely be expanded to include different types of content. 

It’s important to remember that AMP, Schema, and many other powerful SEO functionalities started with a limited launch like this; beyond that, some great SEOs have already tested submitting other types of content to the API and seen success. Submitting to APIs skips Google’s process of blindly crawling the web for new content and allows webmasters to feed the information to them directly.

It is possible that the new Indexing API follows a similar structure or process to PWA indexing. Submitted URLs can already get some kinds of content indexed or removed from Google’s index, usually in about an hour, and while it is currently only officially available for those two kinds of content, we expect it to be expanded broadly.
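As a sketch of what a submission looks like, the Indexing API accepts a POST to its urlNotifications:publish endpoint with a small JSON body. The code below only builds that body; authentication (via an OAuth service account) and the actual HTTP call are omitted, and the job-posting URL is hypothetical:

```python
import json

# Publicly documented Indexing API publish endpoint.
INDEXING_API_ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url, removed=False):
    """Build the JSON body for a single Indexing API notification."""
    return {
        "url": url,
        "type": "URL_DELETED" if removed else "URL_UPDATED",
    }

# Hypothetical time-sensitive page, e.g. a job posting.
body = build_notification("https://example.com/jobs/line-cook")
print(json.dumps(body))
```

A real integration would send this body, with OAuth credentials, to the endpoint above and check the returned notification metadata.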

How will this impact SEO strategy?

Of course, every SEO wants to know how to leverage this speculative theory — how can we make the changes in Google to our benefit? 

The first thing to do is take a good, long, honest look at a mobile search result. Position #1 in the organic rankings is just not what it used to be. There’s a ton of engaging content that is often pushing it down, but not counting as an organic ranking position in Search Console. This means that you may be maintaining all your organic rankings while also losing a massive amount of traffic to SERP features like Knowledge Graph results, Featured Snippets, Google My Business, maps, apps, Found on the Web, and other similar items that rank outside of the normal organic results. 

These results, as well as Pay-Per-Click (PPC) results, are more impactful on mobile because they are stacked above organic rankings. Rather than being off to the side, as they might be in a desktop view of the search, they push organic rankings further down the results page. There has been some great reporting recently about the statistical and large-scale impact of changes to the SERP, and how these changes have resulted in changes to user behavior in search, especially from Dr. Pete Meyers, Rand Fishkin, and Jumpshot.

Dr. Pete has focused on the increasing number of changes to the Google Algorithm recorded in his MozCast, which heated up at the end of 2016 when Google started working on Mobile-First Indexing, and again after it launched the Medic update in 2018. 

Rand, on the other hand, focused on how the new types of rankings are pushing traditional organic results down, resulting in less traffic to websites, especially on mobile. All this great data from these two really set the stage for a fundamental shift in SEO strategy as it relates to Mobile-First Indexing.

The research shows that Google re-organized its index to suit a different presentation of information — especially if they are able to index that information around an entity-concept in the Knowledge Graph. Fraggle-based Indexing makes all of the information that Google crawls even more portable because it is intelligently nested among related Knowledge Graph nodes, which can be surfaced in a variety of different ways. Since Fraggle-based Indexing focuses more on the meaningful organization of data than it does on pages and URLs, the results are a more “windowed” presentation of the information in the SERP. SEOs need to understand that search results are now based on entities and use-cases (think micro-moments), instead of pages and domains.

Google’s Knowledge Graph

To really grasp how this new method of indexing will impact your SEO strategy, you first have to understand how Google’s Knowledge Graph works. 

Since it is an actual “graph,” all Knowledge Graph entries (nodes) include both vertical and lateral relationships. For instance, an entry for “bread” can include lateral relationships to related topics like cheese, butter, and cake, but may also include vertical relationships like “standard ingredients in bread” or “types of bread.” 

Lateral relationships can be thought of as related nodes on the Knowledge Graph, and hint at “Related Topics,” whereas vertical relationships point to a broadening or narrowing of the topic, which hints at the most likely filters within it. In the case of bread, a vertical relationship up would be topics like “baking,” and down would include topics like “flour” and other ingredients used to make bread, or “sourdough” and other specific types of bread.
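A toy slice of such a graph makes the lateral/vertical distinction concrete. The nodes and relationships below are simplified illustrations, not real Knowledge Graph data:

```python
# Toy knowledge graph: each node lists lateral ("related") and
# vertical ("broader"/"narrower") relationships to other nodes.
graph = {
    "bread": {
        "related": ["cheese", "butter", "cake"],
        "broader": ["baking"],
        "narrower": ["sourdough", "flour"],
    },
    "baking": {"related": ["cooking"], "broader": ["food"], "narrower": ["bread"]},
}

def related_topics(topic):
    """Lateral relationships hint at 'Related Topics' carousels."""
    return graph.get(topic, {}).get("related", [])

def suggested_filters(topic):
    """Vertical relationships hint at the filters shown within a topic."""
    node = graph.get(topic, {})
    return node.get("broader", []) + node.get("narrower", [])

print(related_topics("bread"))     # ['cheese', 'butter', 'cake']
print(suggested_filters("bread"))  # ['baking', 'sourdough', 'flour']
```

Because the relationships are between entities rather than words, the same structure works regardless of the language of the query.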

SEOs should note that Knowledge Graph entries can now include an increasingly wide variety of filters and tabs that narrow the topic information to serve different types of searcher intent. This includes things like helping searchers find videos, books, images, quotes, and locations, but in the case of filters, it can be topic-specific and unpredictable (informed by active machine learning). This is the crux of Google’s goal with Fraggle-based Indexing: to be able to organize the information of the web based on Knowledge Graph entries or nodes, otherwise discussed in SEO circles as “entities.”

Since the relationships of one entity to another remain the same, regardless of the language a person is speaking or searching in, the Knowledge Graph information is language-agnostic, and thus easily used for aggregation and machine learning in all languages at the same time. Using the Knowledge Graph as a cornerstone for indexing is, therefore, a much more useful and efficient means for Google to access and serve information in multiple languages for consumption and ranking around the world. In the long-term, it’s far superior to the previous method of indexing.

Examples of Fraggle-based indexing in the SERPs 

Knowledge Graph

Google has dramatically increased the number of Knowledge Graph entries and the categories and relationships within them. The build-out is especially prominent for topics for which Google has a high amount of structured data and information already. This includes topics like:

  • TV and Movies — from Google Play
  • Food and Recipe — from Recipe Schema, recipe AMP pages, and external food and nutrition databases 
  • Science and medicine — from trusted sources (like WebMD) 
  • Businesses — from Google My Business. 

Google is adding more and more nodes and relationships to their graph, and existing entries are also being built out with more tabs and carousels that break a single topic into smaller, more granular topics or types of information.

As you can see below, the build-out of the Knowledge Graph has also added to the number of filters and drill-down options within many queries, even outside of the Knowledge Graph. This increase can be seen throughout all of the Google properties, including Google My Business and Shopping, both of which we believe are now sections of the Knowledge Graph:


Google Search for ‘Blazers’ with Visual Filters at the Top for Shopping Oriented Queries

Google My Business (Business Knowledge Graph) with Filters for Information about Googleplex

Other similar examples include the additional filters and “Related Topics” results in Google Images, which we also believe to represent nodes on the Knowledge Graph:



Google Images Increase in Filters & Inclusion of Related Topics Means that These Are Also Nodes on the Knowledge Graph

The Knowledge Graph is also being presented in a variety of different ways. Sometimes there’s a sticky navigation that persists at the top of the SERP, as seen in many media-oriented queries, and sometimes it’s broken up to show different information throughout the SERP, as you may have noticed in many of the local business-oriented search results, both shown below.


Media Knowledge Graph with Sticky Top Nav (Query for ‘Ferris Bueller’s Day Off’)

Local Business Knowledge Graph (GMB) With Information Split-up Throughout the SERP

Since the launch of Fraggle-based indexing is essentially a major Knowledge Graph build-out, Knowledge Graph results have also begun including more engaging content, which makes it even less likely that users will click through to a website. Assets like playable video and audio, live sports scores, and location-specific information such as transportation information and TV timetables can all be accessed directly in the search results. There’s more to the story, though.

Increasingly, Google is also building out their own proprietary content by re-mixing existing information that they have indexed to create unique, engaging content like animated ‘AMP Stories,’ which webmasters are also encouraged to build out on their own. They have also started building a zoo of AR animals that can show as part of a Knowledge Graph result, all while encouraging developers to use their AR kit to build their own AR assets that will, no doubt, eventually be selectively incorporated into the Knowledge Graph too.


Google AR Animals in Knowledge Graph

Google AMP Stories Now Called ‘Life in Images’

SEO Strategy for Knowledge Graphs

Companies that want to leverage the Knowledge Graph should take every opportunity to create their own assets, like AR models and AMP Stories, so that Google will have no reason to create them instead. Beyond that, companies should submit accurate information directly to Google whenever they can. The easiest way to do this is through Google My Business (GMB). Whatever types of information are requested in GMB should be added or uploaded. If Google Posts are available in your business category, you should be posting regularly and making sure the posts link back to your site with a call to action. If you have videos or photos that are relevant for your company, upload them to GMB. Start to think of GMB as a social network or newsletter: any assets that are shared on Facebook or Twitter can also be shared on Google Posts, or at least uploaded to the GMB account.

You should also investigate the current Knowledge Graph entries that are related to your industry, and work to become associated with recognized companies or entities in that industry. This could be from links or citations on the entity websites, but it can also include being linked by third-party lists that give industry-specific advice and recommendations, such as being listed among the top competitors in your industry (“Best Plumbers in Denver,” “Best Shoe Deals on the Web,” or “Top 15 Best Reality TV Shows”). Links from these posts also help but are not required — especially if you can get your company name on enough lists with the other top players. Verify that any links or citations from authoritative third-party sites like Wikipedia, Better Business Bureau, industry directories, and lists are all pointing to live, active, relevant pages on the site, and not going through a 301 redirect.

While this is just speculation and not a proven SEO strategy, you might also want to make sure that your domain is correctly classified in Google’s records by checking the industries that it is associated with. You can do so in Google’s MarketFinder tool. Make updates or recommend new categories as necessary. Then, look into the filters and relationships that are given as part of Knowledge Graph entries and make sure you are using the topic and filter words as keywords on your site.

Featured snippets 

Featured Snippets or “Answers” first surfaced in 2014 and have also expanded quite a bit, as shown in the graph below. It is useful to think of Featured Snippets as rogue facts, ideas or concepts that don’t have a full Knowledge Graph result, though they might actually be associated with certain existing nodes on the Knowledge Graph (or they could be in the vetting process for eventual Knowledge Graph build-out). 

Featured Snippets seem to surface when the information comes from a source that Google does not trust as highly as it trusts, say, Wikipedia. They often come from third-party sites that may or may not have a monetary interest in the topic, something that makes Google want to vet the information more thoroughly, and that may prevent Google from using it if a less biased option is available.

Like the Knowledge Graph, Featured Snippets results have grown very rapidly in the past year or so, and have also begun to include carousels — something that Rob Bucci writes about extensively here. We believe that these carousels represent potentially related topics that Google knows about from the Knowledge Graph. Featured Snippets now look even more like mini-Knowledge Graph entries: Carousels appear to include both lateral and vertically related topics, and their appearance and maintenance seem to be driven by click volume and subsequent searches. However, this may also be influenced by aggregated engagement data for People Also Ask and Related Search data.

The build-out of Featured Snippets has been so aggressive that sometimes the answers that Google lifts are obviously wrong, as you can see in the example image below. It is also important to understand that Featured Snippet results can change from location to location and are not language-agnostic, and thus, are not translated to match the Search Language or the Phone Language settings. Google also does not hold themselves to any standard of consistency, so one Featured Snippet for one query might present an answer one way, and a similar query for the same fact could present a Featured Snippet with slightly different information. For instance, a query for “how long to boil an egg” could result in an answer that says “5 minutes” and a different query for “how to make a hard-boiled egg” could result in an answer that says “boil for 1 minute, and leave the egg in the water until it is back to room temperature.”


Featured Snippet with Carousel

Featured Snippet that is Wrong

The data below was collected by Moz and represents an average over roughly 10,000 keywords that skews slightly towards ‘head’ terms.



SEO strategy for featured snippets

All of the standard recommendations for driving Featured Snippets apply here. This includes making sure that you keep the information that you are trying to get ranked in a Featured Snippet clear, direct, and within the recommended character count. It also includes using simple tables, ordered lists, and bullets to make the data easier to consume, as well as modeling your content after existing Featured Snippet results in your industry.

This is still speculative, but it seems likely that the inclusion of Speakable Schema markup for things like “How To,” “FAQ,” and “Q&A” may also drive Featured Snippets. These kinds of results are specially designated as content that works well in a voice-search. Since Google has been adamant that there is not more than one index, and Google is heavily focused on improving voice-results from Google Assistant devices, anything that could be a good result in the Google Assistant, and ranks well, might also have a stronger chance at ranking in a Featured Snippet.
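As a sketch, FAQ content can be marked up with schema.org’s FAQPage type in JSON-LD; the snippet below generates that markup from question/answer pairs. The egg example is invented, and whether this markup actually drives Featured Snippets is, as noted above, speculative:

```python
import json

def faq_jsonld(pairs):
    """Build schema.org FAQPage JSON-LD from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }

markup = faq_jsonld([
    ("How long should I boil an egg?",
     "About 5 minutes for a soft boil; longer for hard-boiled."),
])
# Embed the output in the page as:
# <script type="application/ld+json"> ... </script>
print(json.dumps(markup, indent=2))
```

Keeping the answer text short and direct, as recommended above, applies to the markup just as it does to the visible copy.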

People Also Ask & Related Searches

Finally, the increased occurrence of “Related Searches,” as well as the inclusion of People Also Ask (PAA) questions just below most Knowledge Graph and Featured Snippet results, is undeniable. The Earl Tea screenshot shows that PAAs, along with Interesting Finds, are both part of the Knowledge Graph too.

The graph below shows the steady increase in PAAs. PAA results appear to be an expansion of Featured Snippets, because once expanded, the answer to the question is displayed with the citation below it. Similarly, some Related Search results now also include a result that looks like a Featured Snippet, instead of simply linking over to a different search result. You can now find ‘Related Searches’ throughout the SERP: often as part of a Knowledge Graph result, sometimes in a carousel in the middle of the SERP, and always at the bottom of the SERP, sometimes with images and expansion buttons that surface Featured Snippets within the Related Search results directly in the existing SERP.

Boxes with Related Searches are now also included with Image Search results. It’s interesting to note that Related Search results in Google Images started surfacing at the same time that Google began translating image Title Tags and Alt Tags. This coincides well with the concept of Entity-First Indexing: that entities and the Knowledge Graph are language-agnostic, and that Related Searches are somehow related to the Knowledge Graph.


This data was collected by Moz and represents an average over roughly 10,000 keywords that skews slightly towards ‘head’ terms.


People Also Ask

Related Searches

SEO strategy for PAA and related searches

Since PAAs and some Related Searches now appear to simply include Featured Snippets, driving Featured Snippet results for your site is a strong strategy here too. PAA results often appear to include at least two versions of the same question, re-stated in different language, before including questions that are more related to lateral and vertical nodes on the Knowledge Graph. If you include information on your site that Google considers related to the topic, based on Related Searches and PAA questions, it could help make your site appear relevant and authoritative.

Finally, it is crucial to remember that you don’t need a website to rank in Google now, and SEOs should consider non-website rankings part of their job too.

If a business doesn’t have a website, or if you just want to cover all the bases, you can let Google host your content directly — in as many places as possible. We have seen that Google-hosted content generally seems to get preferential treatment in Google search results and Google Discover, especially when compared to the decreasing traffic from traditional organic results. Google is now heavily focused on surfacing multimedia content, so anything that you might have previously created a new page on your website for should now be considered for a video.

Google My Business (GMB) is great for companies that don’t have websites, or that want to host their websites directly with Google. YouTube is great for videos, TV, video podcasts, clips, animations, and tutorials. If you have an app, a book, an audiobook, a podcast, a movie, a TV show, a class, music, or a PWA, you can submit it directly to Google Play (much of the video content in Google Play is now cross-populated in YouTube and YouTube TV, but this is not necessarily true of the other assets). This strategy can also include books in Google Books, flights in Google Flights, hotels in Google Hotels listings, and attractions in Google Explore. It also includes having valid AMP code, since Google hosts AMP content, and Google News if your site is an approved news provider.

Changes to SEO tracking for Fraggle-based indexing

The biggest problem for SEOs is not just the missing organic traffic; it is also that current methods of tracking organic results generally don’t show whether results like the Knowledge Graph, Featured Snippets, PAAs, or ‘Found on the Web’ are appearing at the top of the SERP or somewhere above your organic result. Position one in organic results is not what it used to be, nor is anything below it, so you can’t expect those rankings to drive the same traffic. If Google is going to be lifting and re-presenting everyone’s content, the traffic will never arrive at the site, and SEOs won’t know whether their efforts still return the same monetary value. This problem is especially acute for publishers, who have only been able to sell advertising on their websites based on the expected traffic the website could drive.

The other thing to remember is that results differ, especially on mobile, where they vary from device to device (generally based on screen size) and can also vary based on the phone’s operating system. They can change significantly based on the location or language settings of the phone, and they definitely do not always match desktop results for the same query. Most SEOs don’t know much about the reality of their mobile search results because most SEO reporting tools still focus heavily on desktop results, even though Google has switched to Mobile-First.

In addition, SEO tools generally only report rankings from one location (the location of their servers) rather than testing from different locations.

The best thing SEOs can do to address this problem is to use tools like the MobileMoxie SERP Test to check what rankings look like for top keywords from all the locations where their users may be searching. While the free tool only provides results for one location at a time, subscribers can test search results in multiple locations, based on a service-area radius or an uploaded CSV of addresses. The tool has integrations with Google Sheets and a connector for Data Studio to help with SEO reporting, and APIs are available for deeper integrations in content-editing tools, dashboards, and other SEO tools.
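Once per-location SERP checks have been collected (from whatever tool), the reporting side is simple aggregation. Here is a minimal sketch, with invented data, of the kind of summary that makes multi-location variability visible:

```python
# Hedged sketch: summarizing per-location rank checks. The data below is
# invented; a real workflow would pull it from a SERP-testing tool's API.

from statistics import mean

# rank of our result per location; None = not in the visible SERP
checks = {
    "Boston, MA": {"rank": 3, "features_above": ["featured_snippet", "paa"]},
    "Austin, TX": {"rank": 1, "features_above": []},
    "Denver, CO": {"rank": None, "features_above": ["knowledge_panel"]},
}

def summarize(checks: dict) -> dict:
    """Roll per-location checks up into one reporting row."""
    ranked = [c["rank"] for c in checks.values() if c["rank"] is not None]
    return {
        "locations_checked": len(checks),
        "locations_ranking": len(ranked),
        "avg_rank": round(mean(ranked), 1) if ranked else None,
        "pct_with_serp_features_above": round(
            100 * sum(1 for c in checks.values() if c["features_above"]) / len(checks)
        ),
    }

print(summarize(checks))
```

Tracking the share of locations where SERP features sit above your listing captures exactly the traffic leakage that a single average-position number hides.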

Conclusion

At MozCon 2017, I expressed my belief that the impact of Mobile-First Indexing requires a re-interpretation of the words “Mobile,” “First,” and “Indexing.” Re-defined in the context of Mobile-First Indexing, the words should be understood to mean “portable,” “preferred,” and “organization of information.” The potential shift to Fraggle-based indexing and the recent changes to the SERPs, especially in the past year, certainly seem to bear this theory out. And though they have been in the works for more than two years, the changes to the SERP now seem to be rolling out faster, and they have made the SERP unrecognizable compared to what it was only three or four years ago.

In this post, we described Fraggles and Fraggle-based indexing as a theory that speculates about the true nature of the change to Mobile-First Indexing: how the index itself, and the units of indexing, may have changed to accommodate faster and more nuanced organization of information based on the Knowledge Graph, rather than simply links and URLs. We covered how Fraggles and Fraggle-based indexing work, how they relate to JavaScript and PWAs, what strategies SEOs can take to leverage them for additional exposure in the search results, and how SEOs can update their success tracking to account for all the variables that impact mobile search results.

SEOs need to consider the opportunities and change the way we view our overall indexing strategy, and our jobs as a whole. If Google is organizing the index around the Knowledge Graph, it becomes much easier for Google to constantly surface nearby nodes of the Knowledge Graph in “Related Searches” carousels, links from the Knowledge Graph, and topics in PAAs. It also makes it easier to believe that Featured Snippets are simply pieces of information being vetted (via Google’s click-crowdsourcing) for inclusion or reference in the Knowledge Graph.

Fraggles and Fraggle-based indexing re-frame the switch to Mobile-First Indexing, which means that SEOs and SEO tool companies need to start thinking mobile-first, i.e., about the portability of their information. While it is likely that pages and domains still carry strong ranking signals, the changes in the SERP all seem to focus less on entire pages and more on pieces of pages, similar to the ones surfaced in Featured Snippets, PAAs, and some Related Searches. If Google focuses more on windowing content and being an “answer engine” instead of a “search engine,” this fits well with their stated identity and their desire to build a more efficient, sustainable, international engine.

SEOs also need to find ways to serve their users better by focusing more on the reality of the mobile SERP and how much it can vary for real users. While Google may not call the smallest rankable units “Fraggles,” that is what we call them, and we think they are critical to the future of SEO.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in Latest News | Comments Off

Millions of fake Google Maps listings hurt real business and consumers

Google says it is working on it, but is still positioned to profit as local businesses claw back visibility with paid ads.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

More Articles

Posted in Latest News | Comments Off

Huge Volume of IoT Data Managed via AI Creates Real Value, Says Oracle VP

“What’s interesting is that IoT has been around for a long time but as companies start to enable it and start to leverage it more and more there’s just huge volumes of data that have to be managed and be able to analyze and be able to execute from,” says John Barcus, Vice President Manufacturing Industries at Oracle. “One of the technologies that is really exciting is this whole concept of AI. It really allows you to use that information and correlate it with a lot of different pieces of information.”

John Barcus, Vice President Manufacturing Industries at Oracle, discusses how technologies such as AI and blockchain are now helping companies manage huge volumes of IoT data in an interview with technology influencer Ronald van Loon:

Companies Are Moving Toward Selling Products as a Service

I think that (manufacturers connecting all the processes digitally) is the way that will differentiate them. It’s really the only way the companies will be able to survive into the future. There are all these business models and it has become significantly more competitive than it has been in the past. Companies have to work faster and they have to be more responsive to what their customer needs are. The only way really of doing that is to connect the various aspects of the business. They can’t work in silos anymore. That really will give you the whole value of the business.

One area that companies are moving away from is selling products. They’re going into selling more services, which we’ve actually seen for some time. But what they’re now getting into are these new models where they might be selling products as a service. If you think about how you sell a product as a service, and the ability to support that, it is a lot different than it was before. Connecting to that product and being able to anticipate activities, needs, and failures, and being able to monitor how it’s performing, how customers use it, and how to expand on that to provide a better outcome for the customer, are important components.

Huge Volume of IoT Data Managed via AI Creates Real Value

What’s interesting is that IoT has been around for a long time but as companies start to enable it and start to leverage it more and more there’s just huge volumes of data that have to be managed and be able to analyze and be able to execute from. One of the technologies that is really exciting is this whole concept of AI. It really allows you to use that information and correlate it with a lot of different pieces of information. You can correlate with the data that might be in your ERP and your MES and other sources of information and actually provide some real value and provide the real outcomes. It can now do some predictions where it would be actually physically impossible for people to do the same type of calculations that they’ve been doing in the past with this huge volume today.

The second area, where there seems to be a little hesitation at the moment, is blockchain. But the technology is there, and people have been trying to identify how best to use it. Some of the use cases coming out now are going to be quite impressive. I think the little bit of a lull was deserved; people who looked at it anticipated a little more than what was possible, and now they’re really starting to develop some good use cases. I think there are a lot of opportunities in that area.

Huge Volume of IoT Data Managed via AI Creates Real Value, Says Oracle VP John Barcus

The post Huge Volume of IoT Data Managed via AI Creates Real Value, Says Oracle VP appeared first on WebProNews.

WebProNews

Posted in Latest News | Comments Off

5 Real Examples of Advanced Content Promotion Strategies

Posted by bsmarketer

Content promotion isn’t tweeting or upvoting. Those tiny, one-off tactics are fine for beginners. They might make a dent, but they definitely won’t move the needle. Companies that want to grow big and grow fast need to grow differently.

Here’s how Kissmetrics, Sourcify, Sales Hacker, Kinsta, and BuildFire have used advanced content promotion tips like newsjacking and paid social to elevate their brands above the competition.

1. Use content to fuel social media distribution (and not the other way around)

Prior to selling the brand and blog to Neil Patel, Kissmetrics had no dedicated social media manager at the height of their success. The Kissmetrics blog received nearly 85% of its traffic from organic search. The second biggest traffic-driver was the newsletter.

Social media did drive traffic to their posts. However, former blog editor Zach Bulygo’s research showed that these traffic segments often had the lowest engagement (like time on site) and the fewest conversions (like trial or demo opt-ins), so they didn’t prioritize it. The bulk of Zach’s day was instead focused on editing posts, making changes himself, adding comments and suggestions for the author to fix, and checking for regurgitated content. Stellar, long-form content was priority number one. And two. And three.

So Zach wasn’t just looking for technically-correct content. He was optimizing for uniqueness: the exact same area where most cheap content falls short. That’s an issue because many times, a simple SERP analysis would reveal that one submission:

(Screenshot: a submitted “benefits of content marketing” post)

…Looked exactly like the number-one result from Content Marketing Institute:

(Screenshot: Content Marketing Institute’s “benefits of content marketing” post)

Today’s plagiarism tools can catch the obvious stuff, but these derivatives often slip through the cracks. Recurring paid writers contributed the bulk of the TOFU content, which would free Zach up to focus more on MOFU use cases and case studies to help visitors understand how to get the most out of their product set (from the in-house person who knows it best).

They produced marketing guides and weekly webinars to transform initial attention into new leads.

They also created free marketing tools to give prospects an interactive way to continue engaging with their brand.

In other words, they focused on doing the things that matter most: the 20% that would generate the biggest bang for their buck. They didn’t ignore social networks completely, though; they still had hundreds of thousands of followers across each network. Instead, their intern would take the front lines. That person would watch out for anything critical, like a customer question, which would then be passed off to the Customer Success Manager, who would get back to them within a few hours.


New blog posts would get the obligatory push to Twitter and LinkedIn. (Facebook is used primarily for their weekly webinar updates.) Zach used Pablo from Buffer to design and create featured images for the blog posts.


Then he’d use an Open Graph Protocol WordPress plugin to automatically add all the appropriate tags for each network. That way, all he had to do was add the file and basic post metadata; the plugin would then customize how the post shows up on each network. Instead of using Buffer to promote new posts, though, Zach preferred MeetEdgar.

Why? Doesn’t that seem like an extra step at first glance? Like Buffer, MeetEdgar lets you schedule content: you load up the queue, and the tool manages the rest. The difference is that Buffer constantly requires new content (you need to keep topping it off), whereas MeetEdgar automatically recycles the old content you’ve previously added. For a blog like Kissmetrics, with thousands of content pieces, this saved TONS of time.
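The difference between the two queue models can be sketched in a few lines. This is purely conceptual (not Buffer’s or MeetEdgar’s actual code): a one-shot queue runs dry, while a recycling queue rotates shared posts back to the end of the line.

```python
# Conceptual sketch of the two scheduling models described above.

from collections import deque

class OneShotQueue:
    """Buffer-style: each queued post is shared once, then gone."""
    def __init__(self, posts):
        self.posts = deque(posts)
    def next_share(self):
        return self.posts.popleft() if self.posts else None

class RecyclingQueue:
    """MeetEdgar-style: posts rotate back to the end after sharing."""
    def __init__(self, posts):
        self.posts = deque(posts)
    def next_share(self):
        if not self.posts:
            return None
        post = self.posts.popleft()
        self.posts.append(post)  # recycle for a future slot
        return post

posts = ["post-a", "post-b"]
one_shot, recycling = OneShotQueue(posts), RecyclingQueue(posts)
print([one_shot.next_share() for _ in range(3)])   # runs dry on the third slot
print([recycling.next_share() for _ in range(3)])  # wraps back to post-a
```

For an archive with thousands of evergreen posts, the recycling model means the social calendar never empties without anyone topping it off.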


He would then use Sleeknote to build forms tailored to each blog category, transforming blog readers into top-of-the-funnel leads.

But that’s about it. Zach didn’t do a ton of custom tweets. There weren’t a lot of personal replies. It’s not that they didn’t care. They just preferred to focus on what drives the most results for their particular business. They focused on building a brand that people recognize and trust. That means others would do the social sharing for them.

Respected industry vets like Avinash Kaushik, for example, would often share their blog posts. And Avinash was the perfect fit, because he already has a loyal, data-driven audience following him.


So that single tweet brings in a ton of highly-qualified traffic — traffic that turns into leads and customers, not just fans.

2. Combine original research and newsjacking to go viral

Sourcify has grown almost exclusively through content marketing. Founder Nathan Resnick speaks, attends, and hosts everything from webinars to live events and meetups. Most of their events are brand-building efforts to connect face-to-face with other entrepreneurs. But what’s put them on the map has been leveraging their own experience and platform to fuel viral stories.

Last summer, the record-breaking Mayweather vs. McGregor fight was gaining steam. McGregor was already infamous for his legendary trash-talking and shade-throwing abilities. He also liked to indulge in attention-grabbing sartorial splendor. But the suit he wore to the very first press conference somehow managed to combine the best of both personality quirks.

This was no off-the-shelf suit. He had it custom made. Nathan recalls seeing this press conference suit fondly: “Literally, the team came in after the press conference, thinking, ‘Man, this is an epic suit.’” So they did what any other rational human being did after seeing it on TV: they tried to buy it online.

“Except, the dude was charging like $10,000 to cover it and taking six weeks to produce.” That gave Nathan an idea. “I think we can produce this way faster.”

They “used their own platform, had samples done in less than a week, and had a site up the same day.”


“We took photos, sent them to different factories, and took guesstimates on letter sizing, colors, fonts, etc. You can often manufacture products based on images if it’s within certain product categories.” The goal all along was to use the suit as a case study. They partnered with a local marketing firm to help split the promotion, work, and costs.

“The next day we signed a contract with a few marketers based in San Francisco to split the profits 50–50 after we both covered our costs. They cover the ad spend and setup; we cover the inventory and logistics cost,” Nathan wrote in an article for The Hustle. When they were ready to go, the marketing company began running ad campaigns and pushing out stories. They went viral on BroBible quickly after launch and pulled in over $23,000 in sales within the first week.

The only problem is that they used some images of Conor in the process, and apparently his attorneys didn’t love the IP infringement. A cease-and-desist letter wasn’t far behind.

This result wasn’t completely unexpected. Both Nathan and the marketing partner knew they were skirting a thin line. But either way, Nathan got what he wanted out of it.

3. Drive targeted, bottom-of-the-funnel leads with Quora

Quora packs another punch that often elevates it over the other social channels: higher-quality traffic. Site visitors are asking detailed questions, expecting to comb through in-depth answers to each query. In other words, they’re invested. They’re smart. And if they’re expressing interest in managed WordPress hosting, it means they’ve got dough, too.

Both Sales Hacker and Kinsta take full advantage. Today, Gaetano DiNardi is the Director of Demand Generation at Nextiva, but before that, he led marketing at Sales Hacker before they were acquired. There, content was central to their stratospheric growth. With Quora, Gaetano would take his latest content pieces and use them to solve customer problems and address pain points in the general sales and marketing space.

By using Quora as a research tool, he would find new topics that he could create content around to drive new traffic and connect with their current audience.

He found questions that they already had content for and used them as a chance to engage users and provide value. He could drive tons of relevant traffic for free by linking back to the Sales Hacker blog.

Kinsta, a managed WordPress hosting company out of Europe, also uses relevant threads and Quora ads. CMO Brian Jackson jumps into conversations directly, lending his experience and expertise where appropriate. His technical background makes it easy to talk shop with others looking for a sophisticated conversation about performance (beyond the standard PR-speak most marketers offer up).

Brian targets different WordPress-related categories, questions, or interests. Technically, the units are “display ads, but they look like text.” The ad copy is short and to the point. Usually something like, “Premium hosting plans starting at $XX/month” to fit within their length requirements.

4. Rank faster with paid (not organic) social promotion

Kinsta co-founder Tom Zsomborgi wrote about their journey in a bootstrapping blog post that went live last November. It instantly hit the top of Hacker News, resulting in their website getting a consistent 400+ concurrent visitors all day.

Within hours their post was also ranking on the first page for the term “bootstrapping,” which receives around 256,000 monthly searches.

How did that happen?

“There’s a direct correlation between social proof and increased search traffic. It’s more than people think,” said Brian. Essentially, you’re paying Facebook to increase organic rankings. You take good content, add paid syndication, and watch keyword rankings go up.

Kinsta’s big goal with content promotion is to build traffic and get as many eyeballs as possible. Then they’ll use AdRoll for display retargeting messages, targeting the people who just visited with lead-gen offers to start a free trial. (“But I don’t use AdRoll for Facebook because it tacks on their middleman fee.”)

Brian uses the “Click Campaigns” objective on Facebook Ads for both lead gen and content promotion. “It’s the best for getting traffic.”

Facebook’s organic reach fell by 52% in 2016 alone. That means your ability to promote content to your own page fans is quickly approaching zero.


“It’s almost not even worth posting if you’re not paying,” confirms Brian. Kinsta will promote new posts to make sure they come across their fans’ News Feeds. Anecdotally, with a paid assist, that reach number might jump to around 30%.

If they don’t see it, Brian will “turn it into an ad and run it separately.” It’s “re-written a second time to target a broader audience.”

In addition to new post promotion, Brian has an evergreen campaign that’s constantly delivering the “best posts ever written” on their site. It’s “never-ending” because it gives Brian a steady stream of new site visitors, or new potential prospects to target with lead-gen ads further down the funnel. That’s why Brian asserts that today’s social managers need to understand PPC and lead gen: “A lot of people hire social media managers and just do organic promotion. But Facebook organic just sucks anyway. It’s becoming ‘pay to play.’”

“Organic reach is just going to get worse and worse and worse. It’s never going to get better.” Also, advertising gets you “more data for targeting,” which then enables you to create more in-depth A/B tests.

We confirmed this through a series of promoted content tests, where different ad types (custom images vs. videos) would perform better based on the campaign objectives and placements.


That’s why “best practices” are past practices — or BS practices. You don’t know what’s going to perform best until you actually do it for yourself. And advertising accelerates that feedback loop.

5. Constantly refresh your retargeting ad creative to keep engagement high

Almost every single stat shows that remarketing is one of the most efficient ways to close more customers. The more ad remarketing impressions someone sees, the higher the conversion rate. Remarketing ads are also incredibly cheap compared to your standard AdWords search ad when trying to reach new cold traffic.


There’s only one problem to watch out for: ad fatigue. The image creative plays a massive role in Facebook ad success. But over time (a few days to a few weeks), the performance of that ad will decline. The image becomes stale. The audience has seen it too many times. The trick is to continually cycle through similar, but different, ad examples.

Here’s how David Zheng does it for BuildFire:

His team will either (a) create the ad creative image directly inside Canva, or (b) have their designers create a background ‘template’ that they can use to manipulate quickly. That way, they can make fast adjustments on the fly, A/B testing small elements like background color to keep ads fresh and conversions as high as possible.
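The template-plus-swappable-elements approach can be sketched as a simple combinatorial generator. Everything here (field names, values, the helper itself) is invented for illustration, not BuildFire’s actual workflow:

```python
# Sketch: generate many similar-but-different ad creatives from one base
# spec by swapping small elements, so rotation can outpace ad fatigue.

from itertools import product

def creative_variants(base: dict, backgrounds: list, headlines: list) -> list:
    """One ad spec per background x headline combination."""
    return [
        {**base, "background": bg, "headline": hl, "variant_id": f"{bg}-{i}"}
        for bg, (i, hl) in product(backgrounds, enumerate(headlines))
    ]

base = {"cta": "Start Free Trial", "image_template": "retargeting-v2"}
variants = creative_variants(
    base,
    backgrounds=["blue", "orange", "white"],
    headlines=["Build your app today", "Your app, live in minutes"],
)
print(len(variants))  # 6 creatives to rotate before fatigue sets in
```

Each variant differs by only one or two elements, which is what makes the resulting performance data readable as an A/B test rather than noise.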


All retargeting or remarketing campaigns will be sent to a tightly controlled audience. For example, let’s say you have leads who’ve downloaded an eBook and ones who’ve participated in a consultation call. You can just lump those two types into the same campaign, right? I mean, they’re both technically ‘leads.’

But that’s a mistake. Sure, they’re both leads. However, they’re at different levels of interest. Your goal with the first group is to get them on a free consultation call, while your goal with the second is to get them to sign up for a free trial. That means two campaigns, which means two audiences.
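The two-campaign split is straightforward to automate. Here is a minimal sketch, with made-up lead records and stage names, of routing each lead into the right retargeting audience:

```python
# Sketch of the funnel-stage split described above: ebook downloaders get
# the consultation offer; consultation participants get the free-trial
# offer. Lead records and stage names are invented for illustration.

def split_audiences(leads: list) -> dict:
    """Route each lead into the audience for the NEXT step in the funnel."""
    audiences = {"offer_consultation": [], "offer_free_trial": []}
    for lead in leads:
        if lead["stage"] == "consultation_call":
            audiences["offer_free_trial"].append(lead["email"])
        elif lead["stage"] == "ebook_download":
            audiences["offer_consultation"].append(lead["email"])
    return audiences

leads = [
    {"email": "a@example.com", "stage": "ebook_download"},
    {"email": "b@example.com", "stage": "consultation_call"},
    {"email": "c@example.com", "stage": "ebook_download"},
]
print(split_audiences(leads))
```

Each resulting list could then be exported as its own CSV and uploaded as a separate custom audience, one per campaign.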

Facebook’s Custom Audiences feature makes this easy, as does LinkedIn’s newish Matched Audiences feature. As with Facebook, you can pick people who’ve visited certain pages on your site, belong to specific lists in your CRM, or whose email addresses are in a custom CSV file.

If both of these leads fall off after a few weeks and fail to follow up, you can go back to the beginning to re-engage them. You can use content-based ads all over again to hit back at the primary pain points behind the product or service that you sell.

This seems like a lot of detailed work, largely because it is. But it’s worth it because of scale. You can set these campaigns up once and then simply monitor or tweak performance as you go. That means technology is largely running each individual campaign; you don’t need as many people internally to manage each one hands-on.

And best of all, it forces you to create a logical system. You’re taking people through a step-by-step process, one tiny commitment at a time, until they seamlessly move from stranger to customer.

Conclusion

Sending out a few tweets won’t make an impact at the end of the day. There’s more competition (read: noise) than ever before, while organic reach has never been lower. The trick isn’t to follow some faux influencer who talks the loudest, but rather the practitioners who are doing it day-in, day-out, with the KPIs to prove it.



Moz Blog

Posted in Latest News | Comments Off

3 Reasons Why Good Ideas Are a Real Threat to Good Writing

Ahh, the elusive “good idea.” Writers spend a large amount of time thinking about them and looking for them. It’s an undeniable part of the creative process. So why would I consider them such a pervasive threat to good writing? The answer is simple. Good ideas are just part of what it takes to produce
Read More…

The post 3 Reasons Why Good Ideas Are a Real Threat to Good Writing appeared first on Copyblogger.


Copyblogger

Posted in Latest News | Comments Off

Getting Real with Retail: An Agency’s Guide to Inspiring In-Store Excellence

Posted by MiriamEllis

A screenshot of a negative 1-star review citing poor customer service

No marketing agency staffer feels good when they see a retail client getting reviews like this on the web.

But we can find out why they’re happening, and if we’re committed to being honest with clients and have an actionable strategy for their in-store improvements, we just might be able to help turn things around.

In this post, I’ll highlight some advice from an internal letter at Tesla that I feel is highly applicable to the retail sector. I’d also like to help your agency combat the retail blues headlining the news these days, with big brands downsizing, liquidating, and closing up shop: I’m going to share a printable infographic with statistics that are almost guaranteed to generate the client positivity so essential to making real change. And, for some further inspiration, I’d like to offer a couple of anecdotes involving an Igloo cooler, a monk, reindeer moss, and reviews.

The genuine pain of retail gone wrong: The elusive cooler, “Corporate,” and the man who could hardly stand

“Hi there,” I greeted the staffer at the customer service counter of the big department store. “Where would I find a small cooler?”

“We don’t have any,” he mumbled.

“You don’t have any coolers? Like, an Igloo cooler to take on a picnic to keep things cold?”

“Maybe over there,” he waved his hand in unconcern.

And I stood there for a minute, expecting him to actually figure this out for me, maybe even guide me to the appropriate aisle, or ask a manager to assist my transaction, if necessary. But in his silence, I walked away.

“Hi there,” I tried with more specificity at the locally owned general store the next day. “Where would I find something like a small Igloo cooler to keep things cold on a picnic?”

“I don’t know,” the staffer replied.

“Oh…” I said, uncomfortably.

“It could be upstairs somewhere,” he hazarded, and left me to quest for the second floor, which appeared to be a possibly-non-code-compliant catch-all attic for random merchandise, where I applied to a second dimly illuminated employee who told me I should probably go downstairs and escalate my question to someone else.

And apparently escalation was necessary, for on the third try, a very tall man was able to lift his gaze to some coolers on a top shelf… within clear view of the checkout counter where the whole thing began.

Why do we all have experiences like this?

“Corporate tells us what to carry” is the almost defensive-sounding refrain I have now received from three employees at two different Whole Foods Markets when asking if they could special order items for me since the Amazon buyout.

Because, you know, before they were Amazon-Whole Foods, staffers would gladly offer to procure anything they didn’t have in stock. Now, if they stop carrying that Scandinavian vitamin D-3 made from the moss eaten by reindeer and I’ve got to have it because I don’t want the kind made by irradiating sheep wool, I’d have to special order an entire case of it to get my hands on a bottle. Because, you know, “Corporate.”

Why does the distance between corporate and customer make me feel like the store I’m standing in, and all of its employees, are powerless? Why am I, the customer, left feeling powerless?

So maybe my search for a cooler, my worries about access to reindeer moss, and the laughable customer service I’ve experienced don’t signal “genuine pain.” But this does:

Screenshot of a one-star review: "The pharmacy shows absolutely no concern for the sick, aged and disabled from what I see and experienced. There's 2 lines for drops and pick up, which is fine, but keep in mind those using the pharmacy are sick either acute or chronic. No one wants to be there. The lines are often long with the phone ringing off the hook, so very understaffed. There are no chairs near the line to sit even if someone is in so much pain they can hardly stand, waiting area not nearby. If you have to drop and pick you have to wait in 2 separate lines. They won't inform the other window even though they are just feet away from each other. I saw one poor man wait 4 people deep, leg bandaged, leaning on a cart to be able to stand, but he was in the wrong line and was told to go to the other line. They could have easily taken the script, asked him to wait in the waiting area, walk the script 5 feet, and call him when it was his turn, but this fella who could barely stand had to wait in another line, 4 people deep. I was in the correct line, pick up. I am a disabled senior with cancer and chronic pain. However, I had a new Rx insurance card, beginning of the year. I was told that to wait in the other line, too! I was in the correct line, but the staff was so poorly trained she couldn't enter a few new numbers. This stuff happens repeatedly there. I've written and called the home office who sound so concerned but nothing changes. I tried to talk to manager, who naturally was "unavailable" but his underling made it clear their process was more important than the customers. All they have to do to fix the problem is provide nearby sitting or ask the customer to wait in the waiting area where there are chairs and take care of the problem behind the counter, but they would rather treat the sick, injured and old like garbage than make a small change that would make a big difference to their customers. 
Although they are very close I am looking for a pharmacy who actually cares to transfer my scripts, because I feel they are so uncaring and disinterested although it's their job to help the sick."

This is genuine pain. When customer service failures are so acute that badly treated patrons are further distressed by watching fellow shoppers meet the same fate, the cause is likely built into company structure. And your marketing agency is looking at a bona fide reputation crisis that could presage lawsuits, lasting reputation damage, and even closure for your valuable clients.

When you encounter customer service disasters, it raises questions like:

  1. Could no one in my situation access a list of current store inventory, or, barring that, seek out merchandise with me instead of risking the loss of a sale?
  2. Could no one offer to let “corporate” know that I’m dissatisfied with a “customer service policy” that would require me to spend $225 to buy a whole case of vitamins? Why am I being treated like a warehouse instead of a person?
  3. Could no one at the pharmacy see a man with a leg wound about to fall over, grab a folding chair for him, and keep him safe, instead of risking a lawsuit?

I think a “no” answer to all three questions proceeds from definite causes. And I think Tesla CEO Elon Musk had such causes in mind when he recently penned a letter to his own employees.

“It must be okay for people to talk directly and just make the right thing happen.”

“Communication should travel via the shortest path necessary to get the job done, not through the ‘chain of command.’ Any manager who attempts to enforce chain of command communication will soon find themselves working elsewhere.

“A major source of issues is poor communication between depts. The way to solve this is allow free flow of information between all levels. If, in order to get something done between depts, an individual contributor has to talk to their manager, who talks to a director, who talks to a VP, who talks to another VP, who talks to a director, who talks to a manager, who talks to someone doing the actual work, then super dumb things will happen. It must be ok for people to talk directly and just make the right thing happen.

“In general, always pick common sense as your guide. If following a ‘company rule’ is obviously ridiculous in a particular situation, such that it would make for a great Dilbert cartoon, then the rule should change.”
- Elon Musk, CEO, Tesla

Let’s parlay this uncommon advice into retail. If it’s everyone’s job to access a free flow of information, use common sense, make the right thing happen, and change rules that don’t make sense, then:

  1. Inventory is known by all store staff, and my cooler can be promptly located by any employee, rather than workers appearing helpless.
  2. Employees have the power to push back and insist that, because customers still expect to be able to special order merchandise, a specific store location will maintain this service rather than disappoint consumers.
  3. Pharmacists can recognize that patrons are often quite ill and can immediately place some chairs near the pharmacy counter, rather than close their eyes to suffering.

“But wait,” retailers may say. “How can I trust that an employee’s idea of ‘common sense’ is reliable?”

Let’s ask a monk for the answer.

“He took the time…”

I recently had the pleasure of listening to a talk given by a monk who was defining what it meant to be a good leader. He hearkened back to his young days, and to the man who was then the leader of his community.

“He was a busy man, but he took the time to get to know each of us one-on-one, and to be sure that we knew him. He set an example for me, and I watched him,” the monk explained.

Most monasteries function within a set of established rules, many of which are centuries old. You can think of these guidelines as a sort of policy. In certain communities, it’s perfectly acceptable that some of the members live apart as hermits most of the year, only breaking their meditative existence by checking in with the larger group on important holidays to share what they’ve been working on solo. In others, every hour has its appointed task, from prayer, to farming, to feeding people, to engaging in social activism.

The point is that everyone within a given community knows the basic guidelines, because at some point, they’ve been well-communicated. Beyond that, it is up to the individual to see whether they can happily live out their personal expression within the policy.

It’s a lot like retail can be, when done right. And it hinges on the question:

“Has culture been communicated well enough to every employee that he or she can act as the CEO of the company would in a wide variety of circumstances?”

Or to put it another way, would Amazon owner Jeff Bezos be powerless to get me my vitamins?

The most accessible modern benchmark of good customer service — the online review — is what tells the public whether the CEO has “set the example.” Reviews tell whether time has been taken to acquaint every staffer with the business that employs them, preparing them to fit their own personal expression within the company’s vision of serving the public.

An employee who is able to recognize that an injured patron needs a seat while awaiting his prescription should be empowered to act immediately, knowing that the larger company supports treating people well. If poor training, burdensome chains of command, or failure to share brand culture are obstacles to common-sense personal initiative, the problem must be traced back to the CEO and corrected, starting from there.

And, of course, should a random staffer’s personal expression genuinely include an insurmountable disregard for other people, they can always be told it’s time to leave the monastery…

For marketing agencies, opportunity knocks

So your agency is auditing a valuable incoming client, and their negative reviews citing dirty premises, broken fixtures, food poisoning, slowness, rudeness, cluelessness, and lack of apparent concern make you say to yourself,

“Well, I was hoping we could clean up the bad data on the local business listings for this enterprise, but unless they clean up their customer service at 150 of their worst-rated locations, how much ROI are we really going to be able to deliver? What’s going on at these places?”

Let’s make no bones about this: Your honesty at this critical juncture could mean the difference between survival and closure for the brand.

You need to bring it home to the most senior person you can reach in the organization that no amount of honest marketing can cover up poor customer service in the era of online reviews. If the brand has fallen to the level of the pharmacy I’ve cited, structural change is an absolute necessity. Ask the tough questions, and ask for an explanation of the bad reviews.

“But I’m just a digital marketer,” you may think. “I’m not in charge of whatever happens offline.”

Think again.

Headlines in retail land are horrid right now, full of predictions of mass store closures and retail doom.

If you were in a retail brand’s C-suite, swallowing these predictions of doom with your daily breakfast, wouldn’t you be looking for inspiration from anyone with genuine insight? And if a marketing agency made it their business to confront the truth while also bearing some better news, wouldn’t you be ready to listen?

What is the truth? That poor reviews are symptoms smart doctors can use for diagnosis of structural problems.
What is the better news? The retail scenario is not nearly as dire as it may seem.

Why let hierarchy and traditional roles hold your agency back? Tesla wouldn’t. Why not roll up your sleeves and step into the in-store world? Organize the narrative that negative reviews are telling about the brand’s structural problems and the dangerously bad customer service those problems have produced, then translate it for the client. And then, be prepared to counter corporate inertia born of fear with some eye-opening statistics.

Print and share some good retail tidings

Local SEO infographic

Print your own copy of this infographic to share with clients.

At Moz, we’re working with enterprises to get their basic location data into shape so that they are ready to win their share of the predicted $1.4 trillion in mobile-influenced local sales by 2021, and your agency can use these same numbers to combat indecision and apathy for your retail clients. Look at that second statistic again: 90% of purchases are still happening in physical stores. At Moz, we ask our customers if their data is ready for this. Your agency can ask its clients if their reputations are ready for this, if their employees have what they need to earn the brand’s piece of that 90% action. Great online data + great in-store service = table stakes for retail success.

While I won’t play down the unease that major retail brand closures are understandably causing, I hope I’ve given you the tools to fight the “retail disaster” narrative. 85% more mobile users are searching for things like “Where do I buy that reindeer moss vitamin D3?” than just three years ago. So long as retail staff are ready to deliver, I see no “apocalypse” here.

Investing time

So, your agency has put in the time to identify a reputation problem severe enough that it appears to be founded in structural deficiencies or policies. Perhaps you’ve used some ORM software to do review sentiment analysis to discover which of your client’s locations are hurting worst, or perhaps you’ve done an initial audit manually. You’ve communicated the bad news to the most senior-level person you can reach at the company, and you’ve also shared the statistics that make change seem very worthwhile, begging for a new commitment to in-store excellence. What happens next?
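The location-level triage described above can be sketched in a few lines of Python. This is a toy illustration, not real ORM tooling: the reviews, wordlists, and location names are all hypothetical, and a production audit would pull reviews from a vendor API and use a real sentiment model.

```python
import re
from collections import defaultdict

# Hypothetical review export; real data would come from an ORM tool or review API.
reviews = [
    {"location": "Store #12", "text": "Staff were rude and the line was slow."},
    {"location": "Store #12", "text": "Dirty floors, understaffed pharmacy."},
    {"location": "Store #7", "text": "Friendly, fast, helpful checkout."},
]

# Crude sentiment wordlists -- illustrative only.
NEGATIVE = {"rude", "dirty", "slow", "understaffed", "uncaring", "broken"}
POSITIVE = {"friendly", "clean", "fast", "helpful", "caring"}

scores = defaultdict(int)
for r in reviews:
    words = set(re.findall(r"[a-z']+", r["text"].lower()))
    scores[r["location"]] += len(words & POSITIVE) - len(words & NEGATIVE)

# Most negative locations surface first -- candidates for a structural audit.
for location, score in sorted(scores.items(), key=lambda kv: kv[1]):
    print(location, score)
```

Even a tally this crude can rank hundreds of locations and point the client conversation at the worst offenders first; the senior-level pitch then rests on data rather than anecdote.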

While there are going to be nuances specific to every brand, my bet is that the steps will look like this for most businesses:

  1. C-suites need to invest time in creating a policy which a) abundantly communicates company culture, b) expresses trust in employee initiative, and c) dispenses with needless “chain of command” steps, while d) ensuring that every public facing staffer receives full and ongoing training. A recent study says 62% of new retail hires receive less than 10 hours of training. I’d call even these worrisome numbers optimistic. I worked at 5 retail jobs in my early youth. I’d estimate that I received no more than 1 hour of training at any of them.
  2. Because a chain of command can’t realistically be completely dispensed with in a large organization, store managers must then be allowed the time to communicate the culture, encourage employees to use common sense, define what “common sense” does and doesn’t look like to the company, and, finally, offer essential training.
  3. Employees at every level must be given the time to observe how happy or unhappy customers appear to be at their location, and they must be taught that their observations are of inestimable value to the brand. If an employee suggests a solution to a common consumer complaint, this should be recognized and rewarded.
  4. Finally, customers must be given the time to air their grievances at the time of service, in-person, with accessible, responsive staff. The word “corporate” need never come into most of these conversations unless a major claim is involved. Given that it may cost as much as 7x more to replace an unhappy customer than to keep an existing one happy, employees should be empowered to do business graciously and resolve complaints, in most cases, without escalation.

Benjamin Franklin may or may not have said that “time is money.” While the adage rings true in business, reviews have taught me the flip side — that a lack of time equals less money. Every negative review that cites helpless employees and poor service sounds to my marketing ears like a pocketful of silver dollars rolling down a drain.

The monk says good leaders make the time to communicate culture one-on-one.

Tesla says rules should change if they’re ridiculous.

Chairs should be offered to sick people… where common sense is applied.

Reviews can read like this:

Screenshot of a positive 5-star review: "Had personal attention of three Tesla employees at the same time. They let me sit in both the model cars they had for quite time time and let me freely fiddle and figure out all the gizmos of the car. Super friendly and answered all my questions. The sales staff did not pressure me to buy or anything, but only casually mentioned the price and test drive opportunities, which is the perfect touch for a car company like Tesla. "

And digital marketers have never known a time quite like this to have the ear of retail, maybe stepping beyond traditional boundaries into the fray of the real world. Maybe making a fundamental difference.



Moz Blog

Posted in Latest News | Comments Off

Real Talk about Moving Forward with Your Big Idea

Great to see you again! This week on Copyblogger, we looked at how to make progress on projects and opportunities that might seem intimidating at first. Stefanie Flaxman showed us how to take that Big Idea (exciting, challenging, scary) and break it down until you discover your first (or next) move. She shared a process
Read More…

The post Real Talk about Moving Forward with Your Big Idea appeared first on Copyblogger.


Copyblogger



Virtual Real Estate

Wrastlin With The News

The current presidential cabinet includes a WWE co-founder & this passes for modern political discourse:

CNN promised vengeance.

Something To Believe In

The pretense of objectivity has been dropped:

These reporters aren’t ideologues. They’re just right-thinking people who lean left. Somewhere along the line, they stopped pretending to be objective about Trump. … People don’t just disagree with each other. They can’t imagine how a decent caring human being could disagree with their own view of race or the minimum wage or immigration or Trump. Being a member of the virtuous tribe means not only carrying the correct card in your wallet to reassure yourself. You have to also believe that the people carrying any other card are irrational, or worse, evil. They’re not people to engage in conversation with. They are barriers to be ignored or pushed aside on the virtuous path to paradise. This intolerance and inability to imagine the virtue of the other side is the road to tyranny and chaos. It dehumanizes a good chunk of humanity and that in turn justifies the worst atrocities human beings are capable of.

The WSJ, typically a right-leaning publication, is differentiating their coverage of the president from most other outlets by attempting to be somewhat neutral.

The news is fake. Even historically left-leaning people are disgusted with outlets like CNN.

  • “I think the president is probably right to say, like, look you are witch-hunting me. You have no smoking gun, you have no real proof.” – CNN supervising producer John Bonifield
  • “When you do shitty reporting like CNN, the Washington Post, the New York Times & Rachel Maddow especially. When you do that & it is revealed to be bullshit, what you’re doing is building up Trump. There’s no greater way to build up Trump than to falsely report on him. There’s no better way to build up Trump than to make him the victim.” – Jimmy Dore
  • “Rachel Maddow was given the facts, she ignored them, & she kept right on going. That’s MSNBC, that’s CNN, that’s the New York Times, the Washington Post – they’re all horrible. That’s why we had the Iraq war. That’s why we have the Syria war. That’s why we are still in Afghanistan. That’s why we had Libya. That’s why we have the biggest income disparity since the gilded age. Meanwhile we are spending more on the military than the rest of the world.” – Jimmy Dore

And, since people need something to believe in, there are new American Gods:

“A half hour of cable news delivers enough psychic trauma for a whole year. The newspapers are talking of nothing but treason, espionage, investigations, protests.” … “Stocks are rallying because of how little faith we have in the government. The Mega Blue Chip Corporation is the new Sovereign.”

Current Headwinds for Online Publishing

I struggle to keep up with the accelerating rate of change. A number of common themes in the current ecosystem are:

  • We are moving toward a world where more things are becoming fake (only accelerated by the demonetization of neutrality & the algorithmic boost associated with reliably delivering confirmation bias in an algorithmic or manual fashion)
  • risk keeps being radiated outward to the individual while monopoly platforms capture the spoils (forced-place health insurance purchases where the insurance company arbitrarily drops the sick member on the policy even though that is supposed to be illegal, more temp jobs where people don’t get enough hours to get health insurance through their employer, under-funded pensions, outsourcing of core business functions to sweatshops where part-time workers don’t get paid for dozens to hundreds of hours of required training & get to watch beheading videos all day)
  • the barrier to entry keeps increasing (increased marketing cost due to brand bias, heavy ad loads on dominant platforms, & central platforms making partners do “bet the farm” moves in how they adjust distribution to drive adoption of proprietary formats & temporarily over-promote select content formats)
  • the increasing chunk size of competition is making it much harder for individuals to build sustainable businesses. (Yes the tools of the trade are improving quickly, BUT the central platforms are demonetizing the adjacent fields faster than publishing tools & business options improve.)
  • In Europe publishers are aggressively leaning on regulators to try to rebalance power.

Some of this stuff is cyclical. About a decade ago the European Commission went after Microsoft for bundling Internet Explorer. Google complained about the monopolistic practices to ensure Microsoft was fined. And we’ve gone from that to a web where Google syndicates native ads that blend into page content while directly funding robot journalism. And then Google ranks the robot-generated crap too.

But to keep the ecosystem clean & spam free, Google is also planning to leverage their web browser to further dictate the terms of online publishing. Chrome will block autoplay audio & will automatically reroute .dev domains to https. Cutting edge developers suggest using a web browser other than Google Chrome to prevent proprietary lock in across the web.

While Google distributes their Chrome browser as unwanted bundleware, other web browsers must display uninstall links front & center when trying to gain awareness of their product using Google AdWords. Microsoft Edge is coming to Android, but without a BrowserChoice.eu screen it is unlikely most users will change their web browser as most are unaware of what a web browser even is, let alone the limitations of any of them or the tracking embedded in them.

If you go back several years, there was celebration of the fact that the cost of doing a startup was so low. You didn’t have to pay Oracle a million dollars for a server license any more. You didn’t even have to rack your own hardware. Now you can just dial it up on Amazon. But there are now these gatekeepers and toll-takers. Back in 2004, you had the wide-open internet. – Jeremy Stoppelman

The Mobile Revolution

If you are an anti-social, work-at-home webmaster with dual monitors, it is easy to dismiss cell phones as inefficient and write off most mobile usage as idle distraction.

The reality is cell phones are more powerful than they seem if you are strictly consuming rather than working.

And that is how the unstoppable quickly becomes the extinct!

Many people the world over are addicted to their cell phones to where viral game makers are self-regulating before regulators step in: “From Tuesday, users below 12 years of age will be limited to one hour of play time each day, while those aged between 12 years and 18 years will be limited to two hours a day, Tencent said.”

While China is using their various tools to clamp down on Honour of Kings, Tencent is bringing the game to the west, which makes blocking VPN services (with Apple’s help; Apple must play along or see its phones reduced to bricks) & requiring local data storage & technology transfer more important. Anything stored locally can be easily disappeared: “China’s already formidable internet censors have demonstrated a new strength-the ability to delete images in one-on-one chats as they are being transmitted, making them disappear before receivers see them.”

China has banned live streaming, threatened their largest domestic social networks, shut down chat bots, and required extensive multimedia review: “an industry association circulated new regulations that at least two ‘auditors’ will, with immediate effect, be required to check all audiovisual content posted online” AND they force users to install spyware on their devices.

In spite of all those restrictions, last year “Chinese consumers spent $5.5 trillion through mobile payment platforms, about 50 times more than their American counterparts.” In the last quarter Baidu had ¥20.87 billion in revenues, with 72% of their revenues driven by mobile.

People cannot miss that which they’ve never seen; thus platform socialism works. Those who doubt it will be tracked & scored accordingly.

History, as well, can be scrubbed. And insurance companies watch everything in real-time – careful what you post. The watchful eye of the Chinese pre-crime team is also looking over every move.

Last quarter Facebook had revenues of $9.164 billion, with 87% coming from mobile devices.

Simulacrum has ALMOST been perfected:

“We didn’t have a choice to know any life without iPads or iPhones. I think we like our phones more than we like actual people.” … “Rates of teen depression and suicide have skyrocketed since 2011. It’s not an exaggeration to describe iGen as being on the brink of the worst mental-health crisis in decades. Much of this deterioration can be traced to their phones.” … “Teens who spend more time than average on screen activities are more likely to be unhappy, and those who spend more time than average on nonscreen activities are more likely to be happy.”

The web is becoming easier to get addicted to due to personalization algorithms that reinforce our worldviews even as they make us feel more isolated and left out. And the barrier to entry for consumers into one of the few central gatekeeper ecosystems is dropping like a rock due to the falling cost of mobile devices, coupled with images & video displacing text, making literacy optional. As we become more “connected” we feel more isolated:

“Social isolation, loneliness or living alone was each a significant factor contributing to premature death. And each one of these factors was a more significant risk factor for dying than obesity. … No one knows precisely why loneliness is surging, threatening the lives of many millions of people, but it does seem that the burgeoning use of technology may have something to do with it. Personally, I would contend that technology may be the chief factor fueling it.”

The primary role of the big data mining companies is leveraging surveillance for social engineering.

App Annie expects the global app economy to be worth $6.3 trillion by 2021.

The reason those numbers can easily sound fake & mobile can seem overblown is how highly concentrated usage has become: “over 80 percent of consumer time on mobile devices is now spent on the apps, websites and properties” of just five companies: Facebook, Google, Apple, Yelp and Bing.

eMarketer stated Google will have more mobile ad revenue than desktop ad revenue in the US this year. They also predicted Google & Facebook will consume over 2/3 of US online ad spend within 2 years.

The central network operators not only maintain an outsized share of revenues, but also maintain an outsized share of profits. When the home team takes a 30% rake of any sale, it is hard for anyone else to compete. Even after buying and gutting Motorola, Google bought part of HTC for $1.1 billion. The game plan has never changed: commoditize the complement to ensure user data & most of the profits flow to Google. Put up arbitrary roadblocks for competing services while launching free parallel offerings to drive lock-in.

Central data aggregators can keep collecting more user data & offer more granular ad distribution features. They can tell you that this micro moment RIGHT NOW is make or break:

it’s intended to create a bizarre sense of panic among marketers – “OMG, we have to be present at every possible instant someone might be looking at their phone!” – which doesn’t help them think strategically or make the best use of their marketing or ad spend.

The reality is that if you don’t have a relationship with a person on their desktop computer they probably don’t want your mobile app either.

If you have the relationship then mobile only increases profits.

Many people attempting to build “the next mobile” will go bust, but wherever the attention flows the ads will follow.

Those with a broad & dominant tech platform can copy features from single-category devices and keep integrating them into their core products to increase user lock-in. And they can build accessories for those core devices while prohibiting the flow of data to third party devices to keep users locked into their ecosystem.

Smaller Screens, Shallower Attention

People often multi-task while using mobile devices.

When multi-tasking, it is easier to accidentally click an ad. This happens tens of billions of times a year:

This year, in-app mobile ad spend will reach $45.3 billion, up $11 billion from last year, according to eMarketer. And apps are where the money is for mobile advertising, comprising 80 percent of all U.S. media dollars spent on mobile.

But multi-tasking means doing almost everything else worse. The “always on” mode not only increases isolation, but also lowers our ability to focus:

“while our phones offer convenience and diversion, they also breed anxiety. Their extraordinary usefulness gives them an unprecedented hold on our attention and vast influence over our thinking and behavior. … Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens. … when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline. … As the phone’s proximity increased, brainpower decreased. … Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it. … people are all too quick to credit lies and half-truths spread through social media by Russian agents and other bad actors. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.”

Further, the shallow attention stream makes it easy to displace content with ads:

4 Ads
3 map carousel results
5 organic results
4 Ads

Then “see more results”

4 more Ads
5 organic results
4 more Ads

On desktop devices people don’t accidentally misclick on ads at anywhere near the rate they fat thumb ads on mobile devices.

Desktop ad clicks convert to purchases. Mobile ad clicks convert to ad budget burned: “marketers are still seeing few shoppers purchasing on mobile. The 52% of share in traffic only has 26% share of revenue.”
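The arithmetic behind that quoted stat is worth making explicit. Treating the 52% / 26% split as shares of total traffic and total revenue, a quick sketch (the figures come from the quote above; the per-visit index itself is my illustration):

```python
# Shares from the quote above: mobile gets 52% of traffic but only 26% of revenue.
mobile_traffic, mobile_revenue = 0.52, 0.26
desktop_traffic, desktop_revenue = 1 - mobile_traffic, 1 - mobile_revenue

# Revenue per unit of traffic for each device class.
mobile_rpv = mobile_revenue / mobile_traffic      # 0.50
desktop_rpv = desktop_revenue / desktop_traffic   # ~1.54

# A mobile visit is worth roughly a third of a desktop visit.
print(round(mobile_rpv / desktop_rpv, 2))  # 0.32
```

In other words, on these numbers each mobile click monetizes at about a third of a desktop click, which is why a rising mobile traffic share can coincide with flat or falling publisher revenue.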

For traditional publishers, mobile users drastically under-monetize relative to desktop users due to:

  • drastically lower conversion rates (true for almost everyone in ecommerce outside of Amazon perhaps)
  • limited cross-device tracking (how do you track people who don’t even hit your site but hit a cached page hosted via Google AMP or Facebook Instant Articles?)
  • lower ad load allowed on publisher sites due to limited screen size
  • aggressive filtering of fat thumb ad clicks on partner sites from central ad networks

For the central network operators, almost all of the above are precisely the opposite:

  • higher ad CTR by making entire interface ads (& perhaps even disappearing the concept of non-ads in the result set)
  • great cross-device user tracking
  • higher ad load allowed by the small screen size pushing content below the fold
  • more lenient filtering of fat thumb accidental ad clicks

If you look at raw stats without understanding the underlying impact, it is easy to believe the ecosystem is healthy.

However the huge number of “no click” results is demonetizing easy publisher revenues, which have traditionally helped fund more in-depth investigative reporting. Further, much of the direct navigation that happened in the past now passes through brand-related search result pages. You can argue that is an increase in search traffic, or you can argue it shifts the role of the address bar from navigation to search.

The first page is nothing but ads

On mobile so is the second, and most of the third

If a search query has lots of easy-to-structure crap around it, a user might need 6 or 7 scrolls to get to an organic result.

Then if third parties go “well Google does this, so I should too” they are considered a low quality user experience and get a penalty.

31% ad coverage on a mobile website is excessive / spam / low quality user experience for a publisher, while 301% coverage is reasonable for the central network operators.

Google not only displaces the result set, but also leverages their search suggestion features & algorithmic influence to alter how people search & what they search for.

Ads are getting integrated into mobile keyboards.

And when a user finally reaches the publisher’s website (provided they scroll past the ads, the AMP listings, and all the other scrape-n-displace trash), Google will overlay other recommended articles from other sites.

That feature will eventually end up including ads in it, where publishers will get 0.00% of the revenue generated.

Remember how Google suggested publishers should make their websites faster, remove ads, remove clutter, etc. What was the point of all that? To create free real estate for Google to insert a spam unit into your website?

This wouldn’t be bad if mobile were a small, fringe slice of traffic, but it is becoming the majority of traffic. And as mobile grows, desktop traffic is shrinking.

Even politically biased outlets that appear to be nearly perfectly optimized for a filter bubble that promotes identity politics struggle to make the numbers work: “As a result of continued decline in direct advertising, [Salon's] total revenue in the fiscal year 2017 decreased by 34% to $4.6 million. Following the market trend, 84% of our advertising revenue in fiscal year 2017 was generated by programmatic selling. … [Monthly unique visitors to our website saw] a decrease of 23%. We attribute the decline primarily to the changes in the algorithms used by Facebook.”
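A quick back-of-envelope check on the Salon figures quoted above: a 34% decline landing at $4.6 million implies prior-year revenue of roughly $7 million, i.e. around $2.4 million evaporated in a single year.

```python
# Figures from the Salon filing quoted above.
current = 4.6   # fiscal 2017 revenue, $ millions
decline = 0.34  # year-over-year decline

prior = current / (1 - decline)
print(round(prior, 1))            # ~7.0 ($ millions the year before)
print(round(prior - current, 1))  # ~2.4 ($ millions lost in one year)
```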

The above sorts of numbers are the logical outcome to this:

we’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content”above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.

Especially when combined with this:

As you scroll through it, you are then given travel ads for flight options through Google Flight search, hotels through Google Hotel search and restaurants through Google Local results. Then towards the bottom of the knowledge graph card, all the way at the end in a small grayish font, you have a link to “see web results.”

Bad news for TripAdvisor.

And amongst the good news for Expedia, there’s also a bit of bad news for Expedia. The hotels are fighting Airbnb & the OTAs. In travel, Google is twice as big as the biggest OTA players. They keep eating more SERP real estate and adding more content behind tabs. On mobile they’ll even disappear the concept of organic results.

Room previews in the search results mean not only that second-tier players are going for a song, but that even the new growth players propped up by aggressive ad buying eventually hit a wall and see their stock crash.

As the entire ecosystem gets squeezed by middlemen and the market gets obfuscated with an incomplete selection, it is ultimately consumers who lose: “Reservations made through Internet discount sites are almost always slated for our worst rooms.”

The New York Times pitched Yelp as a pesky player holding a grudge:

“For six years, his company has been locked in a campaign on three continents to get antitrust regulators to punish Google, Yelp’s larger, richer and more politically connected competitor. … Yelp concluded that there was no better way to get Google’s attention than to raise the specter of regulation. … something [Mark Mahaney] calls the Death of Free Google. As the internet has migrated to mobile phones, Google has compensated for the smaller screen space by filling it with so many ads that users can have a hard time finding a result that hasn’t been paid for.”

In spite of how quickly The New York Times dismissed Yelp, the monopoly platforms are stifling competition & creativity while bundling fake reviews & junk features into their core platforms.

People can literally switch their name to “Loop dee Loop” and leave you terrible, fake reviews.

Google’s lack of effort & investment to clean up trash in their local services department highlights that they don’t feel they need to compete on quality. Pay for core search distribution, throw an inferior service front & center, and win by default placement.

As AI advancements make fake reviews look more legit Google’s lack of investment in creating a quality ecosystem will increasingly harm both consumers and businesses. Many low margin businesses will go under simply because their Google reviews are full of inaccurate trash or a competitor decided to hijack their business listing or list their business as closed.

To this day Google is still egregiously stealing content from Yelp:

Yelp said it investigated and found that over one hour, Google pulled images from Yelp’s servers nearly 386,000 times for business listings in Google Maps, which Google exempted from its promise to not scrape content. Yelp then searched Google for 150 of the businesses from those map listings and found that for 110 of them, Google used a Yelp photo as the lead image in the businesses’ listings in search results.

Stealing content & wrapping it in fake reviews is NOT putting the user first.

Facebook is going through its own parallel shifts.

The aggregate quality of mobile ad clicks is quite low. So as mobile becomes a much higher percent of total ad clicks, those who don’t have scale and narrative control are reduced to taking whatever they can get. And mainstream media outlets are reduced to writing puff pieces so the brands they cover will pay to promote the stories on the main channels.

As programmatic advertising, ad blockers, unpatched Android-powered botnets & malware spread, each day gets a little uglier for everyone but the central market operators. It is so bad that some of the central market operators offer surveillance apps which claim to protect user privacy! Other app makers not connected to monopoly profit streams monetize any way they can.

The narrative of growth can be sold (we are launching a new food channel, we are investing in our internal video team, we have exclusive real estate listings, and, um, we acquired a food channel) but the competition is a zero sum game with Google & Facebook eating off the publisher’s plates.

That’s why Time is trying to shave $400 million off their expenses & wants to sell their magazine division. Newspaper companies are selling for $1. It is also why Business Insider is no longer chasing growth & the New York Times is setting up a charitable trust.

The rise of ad blocking only accelerates the underlying desperation.

As long as news websites make their own customer experience worse than what can be found as a cached copy on the monopoly platforms, there is no reason to visit the end publisher website. That is why the proprietary formats promoted by the monopoly platforms are so dangerous. They force lighter monetization & offset the lack of revenue by giving preferential placement:

click through rate from Google search went from 5.9% (Regular) to 10.3% (AMP), and average search position went from 5.9 (Regular) to 1.7 (AMP). Since then, we have deployed AMP across fifteen of our brands and we have been very pleased with the results. Today, AMP accounts for 79% of our mobile search traffic and 36% of our total mobile visits.
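For what it’s worth, the figures quoted above can be cross-checked with back-of-envelope arithmetic. The derived ratios below are my own, not from the source:

```python
# Figures quoted above; the derived ratios are back-of-envelope arithmetic.
ctr_regular, ctr_amp = 5.9, 10.3            # click-through rate, percent
relative_lift = ctr_amp / ctr_regular - 1   # roughly 75% more clicks from AMP

# AMP is 79% of mobile *search* traffic but only 36% of *total* mobile
# visits, which implies search drives roughly 36/79 of all mobile visits.
search_share_of_mobile = 0.36 / 0.79

print(round(relative_lift, 2))           # 0.75
print(round(search_share_of_mobile, 2))  # 0.46
```

In other words, by these numbers search alone accounts for nearly half of the publisher’s mobile visits, which is exactly the dependency that makes the preferential-placement bargain hard to refuse.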

As long as almost nobody is using the new proprietary, ghetto lock-in format the math may work out there, but once many people adopt it then it becomes another recurring sunk cost with no actual benefit:

the only voices promoting AMP’s performance benefits are coming from inside Google. … given how AMP pages are privileged in Google’s search results, the net effect of the team’s hard, earnest work comes across as a corporate-backed attempt to rewrite HTML in Google’s image.

Even if you get a slight uptick in traffic from AMP, it will lead to lower quality user engagement as users are browsing across websites rather than within websites. Getting a bit more traffic but 59% fewer leads is a fail.

No amount of collaborative publisher partnerships, begging for anti-trust exemptions, or whining about Google is going to fix the problem.

“The only way publishers can address this inexorable threat is by banding together. If they open a unified front to negotiate with Google and Facebook - pushing for stronger intellectual-property protections, better support for subscription models and a fair share of revenue and data - they could build a more sustainable future for the news business. But antitrust laws make such coordination perilous. These laws, intended to prevent monopolies, are having the unintended effect of preserving and protecting Google and Facebook’s dominant position.”

Wait a minute.

Wasn’t it the New York Times which claimed Yelp was holding an arbitrary grudge against Google?

The following sounds a lot more desperate:

newspapers that once delivered their journalism with their own trucks increasingly have to rely on these big online platforms to get their articles in front of people, fighting for attention alongside fake news, websites that lift their content, and cat videos.

Well, maybe that is just smaller publications & not the Gray Lady herself:

“the temperature is rising in terms of concern, and in some cases anger, about what seems like a very asymmetric, disadvantageous relationship between the publishers and the very big digital platforms.” – NYT CEO Mark Thompson

In unrelated news, there’s another round of layoffs coming at the New York Times.

And the New York Times is also setting up a nonprofit division to expand journalism while their core company focuses on something else.

Apparently Yelp does not qualify as a publisher in this instance.

Or does it?

The Times is backing the move for what is called an anticompetitive safe haven, in part, Mr. Thompson said, “because we care about the whole of journalism as well as about The New York Times.”

Ah, the whole of journalism, which, apparently, no longer includes local business coverage.

You know the slogan: “news isn’t news, unless it isn’t local.”

The struggles are visible all across the media landscape. The new Boston Globe CEO lasted half a year. The San Diego Union-Tribune resorted to using GoFundMe. The Chicago Sun-Times sold for $1. Moody’s issued a negative outlook for the US newspaper sector. As the industry declines, the biggest players view consolidation as the only solution.

These struggles existed even before the largest brand advertisers like P&G cut back on low & no value ad venues like YouTube:

In the fourth quarter, the reduction in marketing that occurred was almost all in the digital space. And what it reflected was a choice to cut spending from a digital standpoint where it was ineffective: where either we were serving bots as opposed to human beings, or where the placement of ads was not facilitating the equity of our brands.

Google & Facebook are extending their grip on the industry with Google launching topical feeds & Facebook wanting to control subscription management.

Best of luck to journalists on the employment front:

The initiative, dubbed Reporters and Data and Robots (RADAR), will see a team of five journalists work with Natural Language Generation software to produce over 30,000 pieces of content for local media each month.

Hopefully editors catch the subtle errors the bots make, because most of them will not be this obvious & stupid.

The cost of parasitic content recycling is coming down quickly:

In a show of strength last year, Microsoft used thousands of these chips at once to translate all of English Wikipedia into Spanish - 3 billion words across five million articles - in less than a tenth of a second. Next Microsoft will let its cloud customers use these chips to speed up their own AI tasks.

Voice search makes it even easier to extract the rewards without paying publishers. Throwing pennies at journalists does nothing to change this.

And that voice shift is happening fast: “By 2020 half of search will be via voice”

If Google is subsidizing robotic journalism they are thus legitimizing robotic journalism. As big publishers employ the tactic, Google ranks it.

It is almost impossible to compete economically with an entity that rewrites your work & has zero marginal cost of production.

YouTube has perhaps the worst comments on the web. Some mainstream news sites got rid of comments because they couldn’t justify the cost of policing them. That in turn shifts the audience & attention stream to sites like Facebook & Twitter. Some news sites which are still leaving comments enabled rely on a Google filter, a technology Google can use on YouTube as they see fit.

Any plugins publishers use to lower their costs can later disappear. It looked like FindTheBest was doing well financially, but when it was acquired, many news sites quickly found out the cost of free, as they now have thousands of broken articles in their archives: “Last month, Graphiq announced that features for news publishers would no longer be available after Friday.”

Driving costs toward zero by piling on external dependencies is no way to build a sustainable business. Especially when the central network operators are eating the playing field:

“Between fast-loading AMP articles from major news brands hosted in its domain, full pages of information scraped from outside sites that don’t require you to visit them, basic shopping functions built into ads, YouTube, and a host of other features, the Google-verse is more of a digital walled garden than ever. … If Google continues to choke these sites out, what incentive will there be for new ones to come along?”

Unprofitable partners which were buying growth with artificially cheap pricing eventually find out investors want profits more than growth & either reprice or go away. The longer you use something & the more locked in you are to it the more aggressively it can afford to reprice. Symbiotic relationships devolve into abusive ones:

  • “for every pound an advertiser spends programmatically on the Guardian only 30 pence actually goes to the publisher.” – Mediatel
  • “Google wants to cut out the middlemen, which it turns out, are URLs.” – MobileMoxie
  • “[AMP is] a way for Google to obfuscate your website, usurp your content & remove any personal credibility from the web” – TheRegister
  • “Though the stated initiative of ads.txt is to stop inventory resale, it achieves this by establishing ‘preferred’ channels, which naturally favors the industry’s most influential companies” – Ad Exchanger

That Apple does extra work to undo AMP says a lot.

Those who think the central network operators are naive to the power structure being promoted by the faux solutions are either chasing short-term goals or are incredibly masochistic.

Arbitraging brand is the core business model of the attention merchant monopoly.

we’ve found out that 98% of our business was coming from 22 words. So, wait, we’re buying 3,200 words and 98% of the business is coming from 22 words. What are the 22 words? And they said, well, it’s the word Restoration Hardware and the 21 ways to spell it wrong, okay?
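The concentration described in that quote (98% of the business from 22 of 3,200 terms) is the classic Pareto pattern in paid-search accounts, and it is easy to measure from any keyword report. A minimal sketch; the `terms_covering` helper and the sample data are made up for illustration:

```python
# Given (keyword, revenue) rows from an ad account export, find how few
# terms account for a given share of revenue. Sample data is invented.
def terms_covering(rows, share=0.98):
    rows = sorted(rows, key=lambda r: r[1], reverse=True)
    total = sum(revenue for _, revenue in rows)
    running, count = 0.0, 0
    for _, revenue in rows:
        running += revenue
        count += 1
        if running >= share * total:
            break
    return count

sample = [("restoration hardware", 9000), ("restoration hardwear", 400),
          ("restaration hardware", 300), ("leather sofa", 50),
          ("dining table", 30), ("bar stools", 20)]
print(terms_covering(sample))  # 3 of the 6 terms cover 98% of revenue here
```

Running this kind of analysis on your own account is the quickest way to see how much of your “paid search program” is really just brand arbitrage.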

Publishers buying the “speed” narrative are failing themselves. The Guardian has 11 people working on AMP integration. And what is Google doing about speed? Google shut down Google Instant search results, often displays a screen or two full of ads which mobile users have to scroll past to find the organic search results AND is testing auto-playing videos in the search results.

Facebook is also promoting fast loading & mobile-friendly pages.

To keep bleeding clicks out of the “organic” ecosystem they don’t even need to have explicit malicious intent. They can run a thousand different tests every month (new vertical sitelink formats, swipable sitelinks, showing 8 sitelinks on tiny mobile devices, MOAR sitelinks, message extensions, extensions on call-only ads, price discount labels, frame 3rd party content inline, dramatically ramp up featured snippets +QnA listings, more related searches, more features in ad units, larger ad units, ad units that replace websites & charge advertisers for sending clicks from Google to Google, launch a meta-search service where they over-promote select listings, test dropping URLs from listings, put ads in the local pack, change color of source links or other elements, pop ups of search results inside search results, etc.) & keep moving toward whatever layout drives more ad clicks, keeps users on Google longer & forces businesses to buy ads for exposure, claiming they are optimizing the user experience the whole time.

They can hard-code any data type or feature, price it at free to de-fund adjacent businesses, consolidate market power, then increase rents after they have a monopoly position in the adjacent market.

And they can fund research on how to remove watermarks from images.

Why not make hosting free, get people to publish into a proprietary format & try to shift news reading onto the Google app? With enough attention & market coverage they can further extort publishers into accepting perpetually worse deals. And free analytics & business plugins which are widely adopted can have key features pushed into the paid version. Just look at Google Analytics – it’s free or $150,000+/yr.

The above sorts of moves can be done in isolation, or in combination. Publishers oblivious to the ecosystem shifts may use microformats to structure their content. They’ll then find it integrated in Google’s new image search layout, where Google copies the content wholesale & shows it near other third-party images framed by Google.

How about some visually striking, yet irrelevant, listings for competing brands on branded searches to force the brand ad buy? And, of course, rounded card corners to eat a few more pixels, along with faint ad labeling on the ads coupled with vibrant colored dots on the organic results to confuse end users into thinking the organic results are the ads.

While Google turns their search results into an adspam farm, they invite you to test showing fewer ads on your site to improve user experience. Google knows best – let them automate your ad load & ad placement.

What is the real risk of AI? Bias.

“It’s important that we be transparent about the training data that we are using, and are looking for hidden biases in it, otherwise we are building biased systems,” Giannandrea added. “If someone is trying to sell you a black box system for medical decision support, and you don’t know how it works or what data was used to train it, then I wouldn’t trust it.”

And how does Google justify their AI investments? Through driving incremental ad clicks: “The DeepMind founders understand that their power within [Alphabet], and their ability to get their way with [Alphabet CEO] Larry Page, depends on how many eyeballs and clicks and ad dollars they can claim to be driving”

No bias at all there!

And if publishing was killed in 2015, things have only got worse since then:

Looking at 2015 vs 2017 data for all keywords ranking organically on the first page, we’ve seen a dramatic change in CTR. Below we’ve normalized our actual CTR on a 1–10 scale, representing a total drop of 25% of click share on desktop and 55% on mobile.
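To see why the mobile shift stings, you can blend the two quoted drops with an assumed device mix. The 60/40 mobile/desktop split below is an assumption for illustration, not a figure from the study:

```python
# Blended impact of the organic click-share declines quoted above.
# The 25% (desktop) and 55% (mobile) drops come from the quote; the
# device mix is an assumed value, chosen only to illustrate the math.
desktop_drop, mobile_drop = 0.25, 0.55
mobile_share = 0.60  # assumed share of queries happening on mobile

blended_drop = mobile_share * mobile_drop + (1 - mobile_share) * desktop_drop
print(round(blended_drop, 2))  # 0.43 under these assumptions
```

Under those assumptions a publisher’s overall organic click share falls by roughly 43%, and the figure only worsens as the query mix tilts further toward mobile.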

SEOs who were overly reliant on the search channel were the first to notice all of the above changes, as it is their job to be hyper-aware of ecosystem shifts. But publishers far removed from SEO, who never focused on it, are now writing about the trends SEOs were documenting nearly a decade ago. Josh Marshall recently covered Google’s awesome monopoly powers.

few publishers really want to talk about the depths or mechanics of Google’s role in news publishing. Some of this is secrecy about proprietary information; most of it is that Google could destroy or profoundly damage most publications if it wanted to. So why rock the boat? … Google’s monopoly control is almost comically great. It’s a monopoly at every conceivable turn and consistently uses that market power to deepen its hold and increase its profits. Just the interplay between DoubleClick and Adexchange is textbook anti-competitive practices. … Is your favorite website laying off staff or ‘pivoting to video’? In most cases, the root cause is not entirely but to a significant degree driven by the platform monopolies.

His article details how Google owns many points of the supply chain:

So let’s go down the list: 1) The system for running ads, 2) the top purchaser of ads, 3) the most pervasive audience data service, 4) all search, 5) our email. … But wait, there’s more! Google also owns Chrome, the most used browser for visiting TPM.

He also covers the price-dumping technique that is used to maintain control:

In many cases, alternatives don’t exist because no business can get a footing with a product Google lets people use for free.

And he shared an example of Google algorithms gone astray crippling his business, even though it was not related to search & unintentional:

Because we were forwarding to ourselves spam that other people sent to us, Google decided that the owner of the TPM url was a major spammer and blocked emails from TPM from being sent to anyone.

If the above comes across as depressing, don’t worry. The search results now contain a depression diagnostic testing tool.

SEO Book

Ranking Multiple Domains to Own More SERP Real Estate – Whiteboard Friday

Posted by randfish

Is it better to rank higher in a single position frequently, or to own more of the SERP real estate consistently? The answer may vary. In today’s Whiteboard Friday, Rand presents four questions you should ask to determine whether this strategy could work for you, shares some high-profile success cases, and explores the best ways to go about ranking more than one site at a time.

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about ranking multiple domains so you can own a bunch of the SERP real estate and whether you should do that, how you should do that, and some ways to do that.

I’ll show you an example, because I think that will help kick us off. So you are almost certainly familiar, if you’ve played around in the world of real estate SERPs, with Zillow and Trulia. Zillow started up here in Seattle. They bought Trulia a couple of years ago and have been doing pretty amazingly well. In fact, I was speaking at a real estate conference in New York recently, and my God, I did an example where I was searching for tons of cities plus homes for sale or plus real estate or houses, and Zillow and Trulia, along with a couple others, are in the top five for every single city I checked no matter how big or small. So very, very impressive SEO.

One of the things that a lot of SEOs have seen, not just with Zillow and Trulia, but with a few others like them, is that, man, they own multiple listings in the SERPs, and so they kind of dominate the real estate here and get even more clicks as a combined entity than they would if Zillow had, for example, redirected Trulia.com to Zillow when they bought Trulia. On Whiteboard Friday, at Moz, and across a lot of the SEO world, the common recommendation is that when you buy another domain or combine entities, you do actually 301 redirect, because it can help bring up the rankings here.

The reason Zillow did not do that, and I think wisely so, is that they already dominated these SERPs so well that they figured pushing Trulia’s rankings into their own and combining the two entities would, yes, probably move them from number two and three to number one in some places, but they already own number one in a ton of these. Trulia was almost always one or two or three. Why not own all of that? Why not own 66% of the top three consistently, rather than number one a little more frequently? I think that was probably the right move for them.

Questions to ask

As a result, many SEOs asked themselves, “Should I do something similar? Should I buy other domains, or should I start other domains? Should I run multiple sites and try and rank for many different keyword phrases or a few keywords that I care very, very deeply about?” The answer is, well, before you do that, before you make any call, ask yourself these four questions. The answers to them will help you determine whether you should follow in these footsteps.

1. Do I need to dominate multiple results for a keyword or set of keywords MORE than I need better global rankings or a larger set of keywords sending visits?

So first off, do you need to dominate multiple results for a keyword or a small set of keywords more than you need to improve global rankings? By global rankings, I mean broadly moving up across all the keywords that your site ranks for now or could potentially rank for, a larger set of keywords that send visits and traffic.

You kind of have to weigh these two things. It’s either: Do I want two out of the top three results to be mine for this one keyword, or do I want these 10 keywords that I’m ranking for to broadly move up in rankings generally?
A lot of the time, this will bias you to go, “Wait a minute, no, the opportunity is not in these few keywords where I could dominate multiple positions. It’s in moving up the global rankings and making my ability to rank for any set of keywords greater.”

Even at Moz today, Moz does very well in the rankings for a lot of terms around SEO. But if, for example, let’s say we were purchased by Search Engine Land or we bought Search Engine Land. If those two entities were combined, and granted, we do rank for many, many similar keywords, but we would probably not keep them separate. We would probably combine them, because the opportunity is still greater in combination than it is in dominating multiple results the way Zillow and Trulia are. This is a pretty rare circumstance.

2. Will I cannibalize link equity opportunities with multiple sites? Can I get enough link equity & authority signals to rank both?

Second, are you going to cannibalize link equity opportunities with multiple sites, and do you have the ability to get enough equity and authority signals to rank both domains or all three or all four or whatever it is?

A challenge that many SEOs encounter is that building links and building up the authority to rank is actually the toughest part of the SEO equation. The keyword targeting and ranking multiple domains, that’s nice to have, but first you’ve got to build up a site that’s got enough link equity. If it is challenging to earn links, maybe the answer is, hey, we should combine all our efforts on one site. Remember, because Zillow owns Trulia and the two are one entity, the links between them don’t help each other rank very much. It was already the case, before Zillow bought them, that Trulia and Zillow ranked independently. The two sites offer different experiences and some different listings and all that kind of stuff.

There are reasons why Google keeps them separate and why Zillow keeps the two sites separate. But that’s going to be really tough. If you’re a smaller business or a smaller website starting out, and you’re trying to decide where you should put your link equity efforts, it might lean a little more this way.

3. Should I use my own domain(s), should I buy an existing site that ranks, or should I do barnacle SEO?

Number three. Should you use your own domain if you decide that you need to have multiple domains ranking for a single keyword? A good example of this scenario is reputation management for your own brand name, or for maybe someone who works at your company, some particular product that you make, whatever it is, or you’re very, very focused and you know, “Hey, this one keyword matters more than everything else that we do.”

Okay. Now the question would be: Should you use your own domain or a new domain that you buy and register and start building up? Should you buy an existing domain, something that already ranks, or should you do barnacle SEO? So mysite2.com, that would be basically you’re registering a new domain, you’re building it up from scratch, you’re growing that brand, and you’re trying to build all the signals that you’ll need.

You could buy a competitor that’s already ranking in the search results, that already has equity and ranking ability. Or you could say, “Hey, we see that this Quora question is doing really well. Can we answer that question tremendously well?” Or, “We see that Medium can perform tremendously well here. You know what? We can write great posts on Medium.” “We see that LinkedIn does really well in this sector. Great. We can do some publishing on LinkedIn.” Or, “There’s a list of companies on this page. We can make sure that we’re the number-one listed company on that page.” Okay. That kind of barnacle SEO, we did a Whiteboard Friday about that a few months ago, and you can check that out too.

4. Will my multi-domain strategy cost time/money that would be better spent on boosting my primary site’s marketing? Will those efforts cause brand dilution or sacrifice potential brand equity?

And number four, last but not least, will your multi-site domain strategy cost you time and money that would be better spent on boosting your primary site’s marketing efforts? It is the case that you’re going to sacrifice something if you’re putting effort into a different website versus putting all your marketing efforts into one domain.

Now, one reason that people certainly do this is because they’re trying riskier tactics with the second site. Another reason is because they’ve already dominated the rankings as much as they want, or because they’re trying to build up multiple properties so that they can sell one off. They’re very, very good at link building in this space already and at growing equity and those sorts of things.

But the other question you have to ask is: Will this cause brand dilution? Or is it going to sacrifice potential brand equity? One of the things that we’ve observed in the SEO world is that rankings alone do not make for revenue. It is absolutely the case that people are choosing which domains to click on and which domains to buy from and convert on based on the brand and their brand familiarity. When you’re building up a second site, you’ve got to be building up a second brand. So that’s an additional cost and effort.

Now, I don’t want to rain on the entire parade here. Like we’ve said in a few of these, there are reasons why you might want to consider multiple domains and reasons why a multi-domain strategy can be effective for some folks. It’s just that I think it might be a little less often and should be undertaken with more care and attention to detail and to all these questions than what some folks might be doing when they buy a bunch of domains and hope that they can just dominate the top 10 right out of the gate.

All right, everyone, look forward to your thoughts on multi-domain strategies, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Moz Blog

How to Make Your Writing Real

simple ways to make your content irresistible

In this day and age, substance matters.

What you say must be meaningful to the people you’re trying to attract. Your content must solve real problems and satisfy real desires.

So why should it matter how you say it?

The reality is, how you say it has always mattered, and it matters even more today. For content marketing, it’s basically the difference between success and failure.

You’re in a battle for attention. A headline that doesn’t specifically convey a compelling promise results in content that is too often simply ignored.

Beyond that, your copy has to hold that precious attention, sentence by sentence, until the conclusion. Even the appearance of your content on the page matters when trying to make a substantive point.

Finally, the way you convey information, no matter how independently valuable, affects everything from clarity to engagement to retention at a psychological level. Your ideas and advice must stick in people’s heads for you to succeed.

In short, how you say it is what you say.

Here’s an example

If someone asks you what’s for dinner, you can stick with the substance:

Tonight we’re having pasta for dinner.

Or you can add a bit of craft and style to make it more tangible:

Tonight we’ll enjoy a dinner of tender linguini, topped with a homemade marinara sauce featuring vine-ripened tomatoes, fragrant basil, and fresh oregano straight from our garden, accented with just a hint of garlic and red wine flavoring.

Same basic information — we indeed will be having pasta for dinner.

Is one more enticing and memorable than the other?

Let’s look at another example.

Popcorn is bad for you

The book Made to Stick gives us the case of Art Silverman, a guy with a vendetta against popcorn. Silverman wanted to educate the public about the fact that a typical bag of movie popcorn has 37 grams of saturated fat, while the USDA recommends you have no more than 20 grams in an entire day.

Instead of simply citing that surprising — if dry — statistic, Silverman made the message meaningful by making it real. He said:

“A medium-sized ‘butter’ popcorn contains more artery-clogging fat than a bacon-and-eggs breakfast, a Big Mac and fries for lunch, and a steak dinner with all the trimmings — combined!”

Ummm … I’ll go ahead and skip the popcorn, thanks.

Make the benefits tangible

Yes, substance matters. Your content must be more than just relevant — it’s got to be meaningful to the people you’re trying to attract.

But never forget that it’s the relevant and tangible expression of that substance that creates meaning.

People have to get connected with your content in the first place before they comment, share, buy, or recommend your products or services.

Make your messages as real to people as possible, and you’ll find that content marketing has a payoff way bigger than the investment. Really.

Editor’s note: The original version of this post was published on September 21, 2011.

The post How to Make Your Writing Real appeared first on Copyblogger.


Copyblogger
