Tag Archive | "used"

Sites vulnerable to XSS can be used to phish Googlebot

How to protect your site from being used in XSS exploits.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Machine Learning Should Be Used to Deliver Great Brand Experiences, Says PagerDuty CEO

PagerDuty began trading on the New York Stock Exchange for the first time this morning and is now trading at more than 60% above its IPO price of $24. That gives the company a market capitalization of more than $2.7 billion. PagerDuty offers a SaaS platform that monitors IT performance. The company had sales of $118 million for its last fiscal year, up close to 50% over the previous year.

The company uses machine learning to inform companies in real time about technical issues. “Our belief is that machine learning and data should be used in the service of making people better, helping people do their jobs more effectively, and delivering those great brand experiences every time,” says PagerDuty CEO Jennifer Tejada. “PagerDuty is really about making sure that our users understand that this could be a good thing, being woken up in the middle of the night if it’s for the right problem. It’s a way that can help you deliver a much better experience for your customers.”

Jennifer Tejada, CEO of PagerDuty, discusses their IPO and how machine learning should be used to deliver great brand experiences in an interview on CNBC:

It’s Gotten Harder for Humans to Manage the Entire IT Ecosystem

If you think about the world today, it’s an always-on world. We as consumers expect every experience to be perfect. Every time you wake up in the morning, you order your coffee online, you check Slack to communicate with your team, and maybe you take a Lyft into work. Sitting behind all of that is a lot of complexity: many digital and infrastructure-based platforms that don’t always work together the way you’d expect them to. As that complexity has proliferated over the years, and because developers can deploy what they like and use the tools that they want, it’s gotten harder for human beings to manage the entire ecosystem even as your demands increase.

You want it perfect, you want it right now, and you want it the way you’d like it to be. PagerDuty is the platform that brings the right problem to the right person at the right time. We use machine learning, sitting on ten years of data, data on human behavior and data on all the signals happening through the system, and it really helps the developers who sit behind these great experiences to deliver the right experience all the time.

Machine Learning Should Be Used to Deliver Great Brand Experiences

Going public is the right time for us right now because there’s an opportunity for us to deliver the power of our platform to users all over the world. We are a small company and we weren’t as well known as we could be, and this is a great opportunity to extend our brand and help developers and employees across teams, IT security, and customer support to deliver better experiences for their end customers all the time.

At PagerDuty we take customer trust and user trust very seriously. We publish our data policy and we will not use data in a way other than what we describe online. We care deeply about the relationship between our users and our platform. Our belief is that machine learning and data should be used in the service of making people better, helping people do their jobs more effectively, and delivering those great brand experiences every time. PagerDuty is really about making sure that our users understand that this could be a good thing, being woken up in the middle of the night if it’s for the right problem. It’s a way that can help you deliver a much better experience for your customers.

The post Machine Learning Should Be Used to Deliver Great Brand Experiences, Says PagerDuty CEO appeared first on WebProNews.

WebProNews


Moz’s Link Data Used to Suck… But Not Anymore! The New Link Explorer is Here – Whiteboard Friday

Posted by randfish

Earlier this week we launched our brand-new link building tool, and we’re happy to say that Link Explorer addresses and improves upon a lot of the big problems that have plagued our legacy link tool, Open Site Explorer. In today’s Whiteboard Friday, Rand transparently lists out many of the biggest complaints we’ve heard about OSE over the years and explains the vast improvements Link Explorer provides, from DA scores updated daily to historic link data to a huge index of almost five trillion URLs.

Moz's Link Data Used to Suck... But Not Anymore! The New Link Explorer is Here - Whiteboard Friday

Click on the whiteboard image above to open a high-resolution version in a new tab!


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m very excited to say that Moz’s Open Site Explorer product, which had a lot of challenges with it, is finally being retired, and we have a new product, Link Explorer, that’s taking its place. So let me walk you through why and how Moz’s link data for the last few years has really kind of sucked. There’s no two ways about it.

If you heard me here on Whiteboard Friday, if you watched me at conferences, if you saw me blogging, you’d probably see me saying, “Hey, I personally use Ahrefs, or I use Majestic for my link research.” Moz has a lot of other good tools. The crawler is excellent. Moz Pro is good. But Open Site Explorer was really lagging, and today, that’s not the case. Let me walk you through this.

The big complaints about OSE/Mozscape

1. The index was just too small


Mozscape was probably about a fifth to a tenth the size of its competitors. While it got a lot of the good-quality links of the web, it just didn’t get enough. As SEOs, we need to know all of the links, the good ones and the bad ones.

2. The data was just too old


So, in Mozscape, say you built a link on November 1st: you got a link added to a website, and you’re very proud of yourself. That’s excellent. You should expect that a link tool should pick that up within maybe a couple weeks, maybe three weeks at the outside. Google is probably picking it up within just a few days, sometimes hours.

Yet, when Mozscape would crawl that, it would often be a month or more later, and by the time Mozscape processed its index, it could be another 40 days after that, meaning that you could see a 60- to 80-day delay, sometimes even longer, between when your link was built and when Mozscape actually found it. That sucks.

3. PA/DA scores took forever to update


PA/DA scores, likewise, took forever to update because of this link problem. So the index would say, oh, your DA is over here. You’re at 25, and now maybe you’re at 30. But in reality, you’re probably far ahead of that, because you’ve been building a lot of links that Mozscape just hasn’t picked up yet. So this is this lagging indicator. Sometimes there would be links that it just didn’t even know about. So PA and DA just wouldn’t be as accurate or precise as you’d want them to be.

4. Some scores were really confusing and out of date


MozRank and MozTrust relied on essentially the original Google PageRank paper from 1997, and there’s no way that’s what’s being used today. Google certainly uses some view of link equity that’s passed between links that is similar to PageRank, and I think they probably internally call it PageRank, but it looks nothing like what MozRank was.

Likewise, MozTrust was way out of date, from a paper in, I think, 2002 or 2003. Many more advancements in search have happened since then.

Spam score was also out of date. It used a system that was correlated with what spam looked like three or four years ago, so it was much more up to date than those two, but really not nearly as sophisticated as what Google is doing today. So we needed to toss those out and find their replacements as well.

5. There was no way to see links gained and lost over time


Mozscape had no way to see gained and lost links over time, and folks thought, “Gosh, these other tools in the SEO space give me this ability to show me links that their index has discovered or links they’ve seen that we’ve lost. I really want that.”

6. DA didn’t correlate as well as it should have


So over time, DA became a less and less indicative measure of how well you were performing in Google’s rankings. That needed to change as well. The new DA, by the way, is much, much better on this front.

7. Bulk metrics checking and link reporting was too hard and manual


So folks would say, “Hey, I have this giant spreadsheet with all my link data. I want to upload that. I want you guys to crawl it. I want to go fetch all your metrics. I want to get DA scores for these hundreds or thousands of websites that I’ve got. How do I do that?” We didn’t provide a good way for you to do that either, unless you were willing to write code and loop in our API.
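
For context, here is roughly what that do-it-yourself approach looked like: a minimal sketch of batch-checking Domain Authority through the old Mozscape URL Metrics API. The endpoint, basic-auth scheme, batch size, and column bit flags below are my recollection of the Mozscape v1 docs, so treat them all as assumptions and verify against Moz's current API reference.

```python
# Hedged sketch: bulk Domain Authority lookups via the Mozscape v1 API.
# Endpoint, auth, batch limit, and Cols flags are assumptions from the old docs.
import requests

ACCESS_ID = "member-xxxxxxxx"   # hypothetical credentials
SECRET_KEY = "your-secret-key"
COLS = 34359738368 + 68719476736  # Page Authority + Domain Authority bit flags

def bulk_metrics(urls, batch_size=10):
    """POST URLs in small batches and collect the metric rows."""
    results = []
    for i in range(0, len(urls), batch_size):
        resp = requests.post(
            "https://lsapi.seomoz.com/linkscape/url-metrics/",
            params={"Cols": COLS},
            json=urls[i:i + batch_size],
            auth=(ACCESS_ID, SECRET_KEY),  # Mozscape also offered signed auth
        )
        resp.raise_for_status()
        results.extend(resp.json())
    return results

for row in bulk_metrics(["moz.com", "example.com"]):
    print(row)  # each row holds fields like "pda" (Domain Authority)
```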

8. People wanted distribution of their links by DA


They wanted distributions of their links by domain authority. Show me where my links come from, yes, but also what sorts of buckets of DA do I have versus my competition? That was also missing.

So, let me show you what the new Link Explorer has.

Moz's new Link Explorer

Click on the whiteboard image above to open a high-resolution version in a new tab!

Wow, look at that magical board change, and it only took a fraction of a second. Amazing.

What Link Explorer has done, as compared to the old Open Site Explorer, is pretty exciting. I’m actually very proud of the team. If you know me, you know I am a picky SOB. I usually don’t even like most of the stuff that we put out here, but oh my god, this is quite an incredible product.

1. Link Explorer has a GIANT index


So I mentioned index size was a big problem. Link Explorer has got a giant index. Frankly, it’s about 20 times larger than what Open Site Explorer had and, as you can see, very, very competitive with the other services out there. Majestic Fresh says they have about a trillion URLs from, I think, their last 60 days. Ahrefs, about 3 trillion. Majestic’s historic index, which goes all time, has about 7 trillion, and Moz, just in the last 90 days, which I think is our index, maybe it’s a little shorter than that, 60 days, has 4.7 trillion, so almost 5 trillion URLs. Just really, really big. It covers a huge swath of the web, which is great.

2. All data updates every 24 hours


So, unlike the old index, it is very fresh. Every time it finds a new link, it updates PA and DA scores. And every morning, the interface can show you all the links that it found just yesterday.

3. DA and PA are tracked daily for every site


You don’t have to track them yourself. You don’t have to put them into your campaigns. Every time you go and visit a domain, you will see this graph showing you domain authority over time, which has been awesome.

For my new company, I’ve been tracking all the links that come in to SparkToro, and I can see my DA rising. It’s really exciting. I put out a good blog post, I get a bunch of links, and my DA goes up the next day. How cool is that?

4. Old scores are gone, and new scores are polished and high quality


So we got rid of MozRank and MozTrust, which were very old metrics; frankly, very few people were using them, and most folks who were using them didn’t really know how to use them. PA basically takes care of both of them. It includes the weight of links that come to you and their trustworthiness. So that makes more sense as a metric.

Spam score is now on a 0 to 100% risk model, instead of the old 0 to 17 flags where the number of flags correlated to some percentage. Spam score is basically a machine-learning model built against sites that Google penalized or banned.

So we took a huge number of domains and ran their names through Google. If a domain couldn’t rank for its own name, we said it was penalized. If we did a site:domain.com search and Google had de-indexed the domain, we said it was banned. Then we built this risk model. A 90% score means 90% of sites with those qualities were penalized or banned; 2% means only 2% were. If you have a 30% spam score, that’s not too bad. If you have a 75% spam score, it’s getting a little sketchy.
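
Here is a minimal sketch of that labeling logic, assuming a hypothetical fetch_serp(query) helper that returns the domains ranking for a query. It illustrates the idea Rand describes, not Moz's actual pipeline.

```python
def label_domain(domain, fetch_serp):
    """Label a domain per the rules described above.

    fetch_serp is a hypothetical helper returning a list of domains
    ranking for the given query.
    """
    # A site: query returning nothing means Google de-indexed the domain.
    if not fetch_serp(f"site:{domain}"):
        return "banned"

    # A domain that can't rank for its own name is treated as penalized.
    brand_name = domain.split(".")[0]
    if domain not in fetch_serp(brand_name):
        return "penalized"

    return "ok"

# These labels would then train a classifier; a site's spam score is the
# share of training domains with similar qualities that were labeled
# "banned" or "penalized".
```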

5. Discovered and lost links are available for every site, every day


So again, for this new startup that I’m doing, I’ve been watching as I get new links and I see where they come from, and then sometimes I’ll reach out on Twitter and say thank you to those folks who are linking to my blog posts and stuff. But it’s very, very cool to see links that I gain and links that I lose every single day. This is a feature that Ahrefs and Majestic have had for a long time, and frankly Moz was behind on this. So I’m very glad that we have it now.

6. DA is back as a high-quality leading indicator of ranking ability


So, a note that is important: everyone’s DA has changed. Your DA has changed. My DA has changed. Moz’s DA changed. Google’s DA changed. I think it went from a 98 to a 97. My advice is take a look at yourself versus all your competitors that you’re trying to rank against and use that to benchmark yourself. The old DA was an old model on old data on an old, tiny index. The new one is based on this 4.7 trillion size index. It is much bigger. It is much fresher. It is much more accurate. You can see that in the correlations.

7. Building link lists, tracking links that you want to acquire, and bulk metrics checking is now easy


Building link lists, tracking links that you want to acquire, and bulk metrics checking (which we never had before and, in fact, not a lot of the other tools have) are now available through possibly my favorite feature in the tool, called Link Tracking Lists. If you’ve used Keyword Explorer and you’ve set up your keywords to watch them over time and to build a keyword research set, it’s very, very similar. If you have links you want to acquire, you add them to this list. If you have links that you want to check on, you add them to this list. It will give you all the metrics, and it will tell you: does this page link to the website you’ve associated with the list, or does it not? Or does it link to some page on the domain, but maybe not exactly the page that you want? It will tell you that too. Pretty cool.

8. Link distribution by DA


Finally, we do now have link distribution by DA. You can find that right on the Overview page at the bottom.

Look, I’m not saying Link Explorer is the absolute perfect, best product out there, but it’s really, really damn good. I’m incredibly proud of the team. I’m very proud to have this product out there.

If you’d like, I’ll be writing some more about how we went about building this product, including the agency folks we spent time with to develop it, and I would like to thank all of them, of course. A huge thank you to the Moz team.

I hope you’ll do me a favor. Check out Link Explorer. I think, very frankly, this team has earned 30 seconds of your time to go check it out.

Try out Link Explorer!

All right. Thanks, everyone. We’ll see you again for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


Google Confirms Chrome Usage Data Used to Measure Site Speed

Posted by Tom-Anthony

During a discussion with Google’s John Mueller at SMX Munich in March, he told me an interesting bit of data about how Google evaluates site speed nowadays. It has gotten a bit of interest from people when I mentioned it at SearchLove San Diego the week after, so I followed up with John to clarify my understanding.

The short version is that Google is now using performance data aggregated from Chrome users who have opted in as a datapoint in the evaluation of site speed (and as a signal with regards to rankings). This is a positive move (IMHO) as it means we don’t need to treat optimizing site speed for Google as a separate task from optimizing for users.

Previously, it has not been clear how Google evaluates site speed, and it was generally believed to be measured by Googlebot during its visits — a belief enhanced by the presence of speed charts in Search Console. However, the onset of JavaScript-enabled crawling made it less clear what Google is doing — they obviously want the most realistic data possible, but it’s a hard problem to solve. Googlebot is not built to replicate how actual visitors experience a site, and so as the task of crawling became more complex, it makes sense that Googlebot may not be the best mechanism for this (if it ever was the mechanism).

In this post, I want to recap the pertinent data around this news quickly and try to understand what this may mean for users.

Google Search Console

Firstly, we should clarify our understanding of what the “time spent downloading a page” metric in Google Search Console is telling us. Most of us will recognize graphs like this one:

Until recently, I was unclear about exactly what this graph was telling me. But handily, John Mueller comes to the rescue again with a detailed answer [login required] (hat tip to James Baddiley from Chillisauce.com for bringing this to my attention):

John clarified what this graph is showing:

It’s technically not “downloading the page” but rather “receiving data in response to requesting a URL” – it’s not based on rendering the page, it includes all requests made.

And that it is:

this is the average over all requests for that day

Because Google may be fetching a very different set of resources every day when it’s crawling your site, and because this graph does not account for anything to do with page rendering, it is not useful as a measure of the real performance of your site.

For that reason, John points out that:

Focusing blindly on that number doesn’t make sense.

With which I quite agree. The graph can be useful for identifying certain classes of backend issues, but there are also probably better ways for you to do that (e.g. WebPageTest.org, of which I’m a big fan).

Okay, so now that we understand that graph and what it represents, let’s look at the next option: the Google WRS.

Googlebot & the Web Rendering Service

Google’s WRS is their headless browser mechanism based on Chrome 41, which is used for things like “Fetch as Googlebot” in Search Console, and is increasingly what Googlebot is using when it crawls pages.

However, we know that this isn’t how Google evaluates pages because of a Twitter conversation between Aymen Loukil and Google’s Gary Illyes. Aymen wrote up a blog post detailing it at the time, but the important takeaway was that Gary confirmed that WRS is not responsible for evaluating site speed:

Twitter conversation with Gary Illyes

At the time, Gary was unable to clarify what was being used to evaluate site performance (perhaps because the Chrome User Experience Report hadn’t been announced yet). It seems as though things have progressed since then, however. Google is now able to tell us a little more, which takes us on to the Chrome User Experience Report.

Chrome User Experience Report

Introduced in October last year, the Chrome User Experience Report “is a public dataset of key user experience metrics for top origins on the web,” whereby “performance data included in the report is from real-world conditions, aggregated from Chrome users who have opted-in to syncing their browsing history and have usage statistic reporting enabled.”

Essentially, certain Chrome users allow their browser to report back load time metrics to Google. The report currently has a public dataset for the top 1 million+ origins, though I imagine they have data for many more domains than are included in the public data set.

In March I was at SMX Munich (amazing conference!), where along with a small group of SEOs I had a chat with John Mueller. I asked John about how Google evaluates site speed, given that Gary had clarified it was not the WRS. John was kind enough to shed some light on the situation, but at that point, nothing was published anywhere.

However, since then, John has confirmed this information in a Google Webmaster Central Hangout [15m30s, in German], where he explains they’re using this data along with some other data sources (he doesn’t say which, though notes that it is in part because the data set does not cover all domains).

At SMX John also pointed out how Google’s PageSpeed Insights tool now includes data from the Chrome User Experience Report:

The public dataset of performance data for the top million domains is also available in a public BigQuery project, if you’re into that sort of thing!
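
As a minimal sketch of what that looks like, the query below pulls the first-contentful-paint histogram for a single origin. It assumes you have a Google Cloud project with BigQuery enabled and the google-cloud-bigquery Python package installed; the table month (201710) and the origin are example values.

```python
# Hedged sketch: first-contentful-paint histogram for one origin from the
# public Chrome User Experience Report dataset in BigQuery.
from google.cloud import bigquery

client = bigquery.Client()  # uses your default GCP project credentials

query = """
SELECT bin.start AS fcp_ms, SUM(bin.density) AS density
FROM `chrome-ux-report.all.201710`,
  UNNEST(first_contentful_paint.histogram.bin) AS bin
WHERE origin = 'https://www.example.com'
GROUP BY fcp_ms
ORDER BY fcp_ms
"""

for row in client.query(query).result():
    # density is the fraction of page loads falling into this FCP bucket
    print(f"{row.fcp_ms} ms bucket: {row.density:.4f}")
```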

We can’t be sure what all the other factors Google is using are, but we now know they are certainly using this data. As I mentioned above, I also imagine they are using data on more sites than are perhaps provided in the public dataset, but this is not confirmed.

Pay attention to users

Importantly, this means that there are changes you can make to your site that Googlebot is not capable of detecting, which are still detected by Google and used as a ranking signal. For example, we know that Googlebot does not support HTTP/2 crawling, but now we know that Google will be able to detect the speed improvements you would get from deploying HTTP/2 for your users.
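
As an aside, it is easy to check which protocol a site actually negotiates with capable clients. Here is a quick sketch using the third-party httpx Python library (my choice for illustration, not something mentioned above); it requires installing httpx with its http2 extra.

```python
# Hedged sketch: check whether an origin serves HTTP/2 to capable clients.
# Requires: pip install 'httpx[http2]'
import httpx

with httpx.Client(http2=True) as client:
    resp = client.get("https://www.example.com")  # hypothetical origin
    print(resp.http_version)  # "HTTP/2" if negotiated, else "HTTP/1.1"
```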

The same is true if you were to use service workers for advanced caching behaviors — Googlebot wouldn’t be aware, but users would. There are certainly other such examples.

Essentially, this means that there’s no longer a reason to worry about pagespeed for Googlebot, and you should instead just focus on improving things for your users. You still need to pay attention to Googlebot for crawling purposes, which is a separate task.

If you are unsure where to look for site speed advice, then you should look at:

That’s all for now! If you have questions, please comment here and I’ll do my best! Thanks!



Moz Blog


How Is Google’s New "Questions and Answers" Feature Being Used? [Case Study]

Posted by MiriamEllis

Ever since Google rolled out Questions and Answers in mid-2017, I’ve been trying to get a sense of its reception by consumers and brands. Initially restricted to Android Google Maps, this fascinating feature, which enables local business owners and the public to answer consumer questions, made it to desktop displays this past December, adding yet another data layer to knowledge panels and local finders.

As someone who has worked in Q&A forums for the majority of my digital marketing life, I took an immediate shine to the idea of Google Questions and Answers. Here’s a chance, I thought, for consumers and brands to take meaningful communication to a whole new level, exchanging requests, advice, and help so effortlessly. Here’s an opportunity for businesses to place answers to FAQs right upfront in the SERPs, while also capturing new data about consumer needs and desires. So cool!

But, so far, we seem to be getting off to a slow start. According to a recent, wide-scale GetFiveStars study, 25% of businesses now have questions waiting for them. I decided to home in on San Francisco and look at 20 busy industries in that city to find out not just how many questions were being asked, but also how many answers were being given, and who was doing the answering. I broke down responders into three groups: Local Guides (LGs), random users (RUs), and owners (Os). I looked at the top 10 businesses ranking in the local finder for each industry:

Industry               Questions   Answers   LGs   RUs   Os
Dentists                       1         0     0     0    0
Plumbers                       2         0     -     -    -
Chiropractors                  0         -     -     -    -
Mexican Restaurants           10        23    22     1    -
Italian Restaurants           15        20    19     1    -
Chinese Restaurants           16        53    49     4    -
Car Dealers                    4         5     3     2    -
Supermarkets                   7        27    24     3    -
Clothing Stores                4         1     1     -    -
Florists                       1         0     -     -    -
Hotels                        44       142   114    28    -
Real Estate Agencies           0         -     -     -    -
General Contractors            1         0     -     -    -
Cell Phone Stores             14         3     3     -    -
Yoga Studios                   1         0     -     -    -
Banks                          1         0     -     -    -
Carpet Cleaning                0         -     -     -    -
Hair Salons                    1         0     -     -    -
Locksmiths                     1         0     -     -    -
Jewelry Stores                 0         -     -     -    -


Takeaways from the case study

Here are some patterns and oddities I noticed from looking at 123 questions and 274 answers:

  1. There are more than twice as many answers as questions. While many questions received no answers, others received five, ten, or more.
  2. The Owners column is completely blank. The local businesses I looked at in San Francisco are investing zero effort in answering Google Questions and Answers.
  3. Local Guides are doing the majority of the answering. Of the 274 answers provided, 232 came from users who have been qualified as Local Guides by Google. Why so lopsided? I suspect the answer lies in the fact that Google sends alerts to this group of users when questions get asked, and that they can earn 3 points per answer they give. Acquiring enough points gets you perks like 3 free months of Google Play Music and a 75% discount off Google Play Movies.

    Unfortunately, what I’m seeing in Google Questions and Answers is that incentivizing replies is leading to a knowledge base of questionable quality. How helpful is it when a consumer asks a hotel if they have in-room hair dryers and 10 local guides jump on the bandwagon with “yep”? Worse yet, I saw quite a few local guides replying “I don’t know,” “maybe,” and even “you should call the business and ask.” Here and there, I saw genuinely helpful answers from the Local Guides, but my overall impression didn’t leave me feeling like I’d stumbled upon a new Google resource of matchless expertise.

  4. Some members of the public seem to be confused about the use of this feature. I noticed people using the answer portion to thank people who replied to their query, rather than simply using the thumbs up widget.

    Additionally, I saw people leaving reviews/statements, instead of questions:
    And with a touch of exasperated irony:
    And to rant:

  5. Some industries are clearly generating far more questions than others. Given how people love to talk about hotels and restaurants, I wasn’t surprised to see them topping the charts in sheer volume of questions and answers. What did surprise me was not seeing more questions being asked of businesses like yoga studios, florists, and hair salons; before I actually did the searches, I might have guessed that pleasant, “chatty” places like these would be receiving lots of queries.

Big brands everywhere are leaving Google Questions and Answers unanswered

I chose San Francisco for my case study because of its general reputation for being hip to new tech, but just in case my limited focus was presenting a false picture of how local businesses are managing this feature, I did some random searches for big brands around the state and around the country.

I found questions lacking owner answers for Whole Foods, Sephora, Taco Bell, Macy’s, Denny’s, Cracker Barrel, Target, and T-Mobile. As I looked around the nation, I noted that Walmart has cumulatively garnered thousands of questions with no brand responses.

But the hands-down winner for a single location lacking official answers is Google in Mountain View. 103 questions as of my lookup and nary an owner answer in sight. Alphabet might want to consider setting a more inspiring example with their own product… unless I’m misunderstanding their vision of how Google Questions and Answers is destined to be used.


Just what is the vision for Google Questions and Answers, I wonder?

As I said at the beginning of this post, it’s early days yet to predict ultimate outcomes. Yet, the current lay of the land for this feature has left me with more questions than answers:

  • Does Google actually intend questions to be answered by brands, or by the public? From what I’ve seen, owners are largely unaware of or choosing to ignore this feature many months post-launch. As of writing this, businesses are only alerted about incoming questions if they open the Google Maps app on an Android phone or tablet. There is no desktop GMB dashboard section for the feature. It’s not a recipe for wide adoption. Google has always been a fan of a crowdsourcing approach to their data, so they may not be concerned, but that doesn’t mean your business shouldn’t be.
  • What are the real-time expectations for this feature? I see many users asking questions that needed fast answers, like “are you open now?” while others might support lengthier response times, as in, “I’m planning a trip and want to know what I can walk to from your hotel.” For time-sensitive queries, how does Questions and Answers fit in with Google’s actual chat feature, Google Messaging, also rolled out last summer? Does Google envision different use cases for both features? I wonder if one of the two products will win out over time, while the other gets sunsetted.
  • What are the real, current risks to brands of non-management? I applauded Mike Blumenthal’s smart suggestion of companies proactively populating the feature with known FAQs and providing expert answers, and I can also see the obvious potential for reputation damage if rants or spam are ignored. That being said, my limited exploration of San Francisco has left me wondering just how many people (companies or consumers) are actually paying attention in most industries. Google Knowledge Panels and the Local Finder pop-ups are nearing an information bloat point. Do you want to book something, look at reviews, live chat, see menus, find deals, get driving directions, make a call? Websites are built with multiple pages to cover all of these possible actions. Sticking them all in a 1” box may not equal the best UX I’ve ever seen, if discovery of features is our goal.
  • What is the motivation for consumers to use the product? Personally, I’d be more inclined to just pick up the phone to ask any question to which I need a fast answer. I don’t have the confidence that if I queried Whole Foods in the AM as to whether they’ve gotten in organic avocados from California, there’d be a knowledge panel answer in time for my lunch. Further, some of the questions I’ve asked have received useless answers from the public, which seems like a waste of time for all parties. Maybe if the feature picks up momentum, this will change.
  • Will increasing rates of questions = increasing rates of business responses? According to the GetFiveStars study linked to above, total numbers of questions for the 1700 locations they investigated nearly doubled between November–December of 2017. From my microscopic view of San Francisco, it doesn’t appear to me that the doubling effect also happened for owner answers. Time will tell, but for now, what I’m looking for is question volume reaching such a boiling point that owners feel obligated to jump into management, as they have with reviews. We’re not there yet, but if this feature is a Google keeper, we could get there.

So what should you be doing about Google Questions and Answers?

I’m a fan of early adoption where it makes sense. Speculatively, having an active Questions and Answers presence could end up as a ranking signal. We’ve already seen it theorized that use of another Google asset, Google Posts, may impact local pack rankings. Unquestionably, leaving it up to the public to answer questions about your business with varying degrees of accuracy carries the risk of losing leads and muddying your online presence to the detriment of reputation. If a customer asks if your location has wheelchair access and an unmotivated third party says “I don’t know,” when, in fact, your business is fully ADA-compliant, your lack of an answer becomes negative customer service. Because of this, ignoring the feature isn’t really an option. And, while I wouldn’t prioritize management of Questions and Answers over traditional Google-based reviews at this point, I would suggest:

  1. Do a branded search today and look at your knowledge panel to see if you’ve received any questions. If so, answer them in your best style, as helpfully as possible.
  2. Spend half an hour this week translating your company’s 5 most common FAQs into Google Questions and Answers queries and then answering them. Be sure you’re logged into your company’s Google account when you reply, so that your message will be officially stamped with the word “owner.” Whether you proactively post your FAQs while logged into your business’ account is up to you. I think it’s more transparent to do so.
  3. If you’re finding this part of your Knowledge Panel isn’t getting any questions, checking it once a week is likely going to be enough for the present.
  4. If you happen to be marketing a business that is seeing some good Questions and Answers activity, and you have the bandwidth, I’d add checking this to the daily social media rounds you make for the purpose of reputation management. I would predict that if Google determines this feature is a keeper, they’ll eventually start sending email alerts when new queries come in, as they’re now doing with reviews, which should make things easier and minimize the risk of losing a customer with an immediate need. Need to go pro on management right now due to question volume? GetFiveStars just launched an incredibly useful Google Q&A monitoring feature, included in some of their ORM software packages. Looks like a winner!
  5. Do be on the lookout for spam inquiries and responses, and report them if they arise.

If you’re totally new to Google Questions and Answers, this simple infographic will get you going in a flash:

For further tips on using Google Questions and Answers like a pro, I recommend following GetFiveStars’ 3-part series on this topic.


My questions, your answers

My case study is small. Can you help expand our industry’s knowledge base by answering a few questions in the comments to add to the picture of the current rate of adoption/usefulness of Google’s Questions and Answers? Please, let me know:

  1. Have you asked a question using this feature?
  2. Did you receive an answer and was it helpful?
  3. Who answered? The business, a random user, a Local Guide?
  4. Have you come across any examples of business owners doing a good job answering questions?
  5. What are your thoughts on Google Questions and Answers? Is it a winner? Worth your time? Any tips?



Moz Blog


New gTLDs are Like Used Cars

There may be a couple exceptions which prove the rule, but new TLDs are generally an awful investment for everyone except the registry operator.

Here is the short version…

And the long version…

Diminishing Returns

About a half-decade ago I wrote about how Google devalued domain names from an SEO perspective, & there have been a number of leading “category killer” domains which have repeatedly been recycled from startup to acquisition to shut down to PPC park page to “buy now for this once-in-a-lifetime opportunity” in an endless water cycle.

The central web platforms are becoming ad heavy, which in turn decreases the reach of anything which is not an advertisement. For the most valuable concepts / markets / keywords, ads eat up the entire interface for the first screen full of results. Key markets like hotels might get a second round of vertical ads to further displace the concept of organic results.

Proprietary, Closed-Ecosystem Roach Motels

The tech monopolies can only make so much money by stuffing ads onto their own platform. To keep increasing their take they need to increase the types, varieties & formats of media they host and control & keep the attention on their platform.

Both Google & Facebook are promoting scams where they feed on desperate publishers & suck a copy of the publisher’s content into being hosted by the tech monopoly platform du jour & sprinkle a share of the revenues back to the content sources.

They may even pay a bit upfront for new content formats, but then after the market is primed the deal shifts to where (once again) almost nobody other than the tech monopoly platform wins.

The attempt to “own” the web & never let users go is so extreme both companies will make up bogus statistics to promote their proprietary / fake open / actually closed standards.

If you ignore how Google’s AMP double-, triple-, or quadruple-counts visitors in Google Analytics, the visit numbers look appealing.

But the flip side of those fake metrics is actual revenues do not flow.

Facebook has the same sort of issues, with frequently needing to restate various metrics while partners fly blind.

These companies are restructuring society & the race to the bottom to try to make the numbers work in an increasingly unstable & parasitic set of platform choices is destroying adjacent markets:

Have you tried Angry Birds lately? It’s a swamp of dark patterns. All extractive logic meant to trick you into another in-app payment. It’s the perfect example of what happens when product managers have to squeeze ever-more-growth out of ever-less-fertile lands to hit their targets year after year. … back to the incentives. It’s not just those infused by venture capital timelines and return requirements, but also the likes of tax incentives favoring capital gains over income. … that’s the truly insidious part of the tech lords solution to everything. This fantasy that they will be greeted as liberators. When the new boss is really a lot like the old boss, except the big stick is replaced with the big algorithm. Depersonalizing all punishment but doling it out just the same. … this new world order is being driven by a tiny cabal of monopolies. So commercial dissent is near impossible. … competition is for the little people. Pitting one individual contractor against another in a race to the bottom. Hoarding all the bargaining power at the top. Disparaging any attempts against those at the bottom to organize with unions or otherwise.

To be a success on the attention platforms you have to push toward the edges. But as you become successful you become a target.

And the dehumanized “algorithm” is not above politics & public relations.

Pewdiepie is the biggest success story on the YouTube platform. When he made a video showing some of the absurd aspects of Fiverr, it led to a WSJ investigation which “uncovered” a pattern of anti-Semitism. And yet one of the reporters who worked on that story had written far more offensive and anti-Semitic tweets. The hypocrisy of the hit job didn’t matter. They were still able to go after Pewdiepie’s ad relationships to cut him off from Disney’s Maker Studios & the premium tier of YouTube ads.

The fact that he is an individual with broad reach means he’ll still be fine economically, but many other publishers would quickly end up in a death spiral from the above sequence.

If it can happen to a leading player in a closed ecosystem then the risk to smaller players is even greater.

In some emerging markets Facebook effectively *is* the Internet.

The Decline of Exact Match Domains

Domains have been so devalued (from an SEO perspective) that some names like PaydayLoans.net sell for about $3,000 at auction.

$3,000 can sound like a lot to someone with no money, but names like that were going for 6 figures at their peak.

Professional domain sellers participate in the domain auctions on sites like NameJet & SnapNames. Big keywords like [payday loans] in core trusted extensions are not missed. So if the 98% decline in price were an anomaly, at least one of them would have bid more in that auction.

Why did exact match domains fall so hard? In part because Google shifted from scoring the web based on links to considering things like brand awareness in rankings. And it is very hard to run a large brand-oriented ad campaign promoting a generically descriptive domain name. Sure there are a few exceptions like Cars.com & Hotels.com, but if you watch much TV you’ll see a lot more ads associated with businesses that are not built on generically descriptive domain names.

Not all domains have fallen quite that hard in price, but the more into the tail you go the less the domain acts as a memorable differentiator. If the barrier to entry increases, then the justification for spending a lot on a domain name as part of a go to market strategy makes less sense.

Brandable Names Also Lost Value

Arguably EMDs have lost more value than brandable domain names, but even brandable names have sharply slid.

If you go back a decade or two tech startups would secure their name (say Snap.com or Monster.com or such) & then try to build a business on it.

But in the current marketplace with there being many paths to market, some startups don’t even have a domain name at launch, but begin as iPhone or Android apps.

Now people try to create success on a good enough, but cheap domain name & then as success comes they buy a better domain name.

Jelly was recently acquired by Pinterest. Rather than buying jelly.com they were still using AskJelly.com for their core site & Jelly.co for their blog.

As long as domain redirects work, there’s no reason to spend heavily on a domain name for a highly speculative new project.
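
The redirect mechanics behind that strategy are mundane. As a minimal sketch (Python standard library only, with a hypothetical target domain), this is the whole trick of permanently pointing a cheap launch domain at the better name bought later; in practice you would configure this at the web server or DNS/registrar level rather than in application code.

```python
# Hedged sketch: permanently redirect every path on an old launch domain
# to the same path on the upgraded domain.
from http.server import BaseHTTPRequestHandler, HTTPServer

NEW_ORIGIN = "https://example.com"  # hypothetical upgraded domain

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 tells browsers and search engines the move is permanent,
        # so link equity and bookmarks follow the new name.
        self.send_response(301)
        self.send_header("Location", NEW_ORIGIN + self.path)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectHandler).serve_forever()
```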

Rather than spending 6 figures on a domain name & then seeing if there is market fit, it is far more common to launch a site on something like getapp.com, joinapp.com, app.io, app.co, businessnameapp.com, etc.

This in turn means that rather than 10,000s of startups all chasing their core .com domain name off the start, people test whatever is good enough & priced close to $10. Then only after they are successful do they try to upgrade to better, more memorable & far more expensive domain names.

Money isn’t spent on the domain names until the project has already shown market fit.

One in a thousand startups spending $1 million adds up to far less than one in three startups spending $100,000 (an expected $1,000 per startup versus roughly $33,000).

New TLDs Undifferentiated, Risky & Overpriced

No Actual Marketing Being Done

Some of the companies which are registries for new TLDs talk up investing in marketing & differentiation for the new TLDs, but very few of them are doing much on the marketing front.

You may see their banner ads on domainer blogs & they may even pay for placement with some of the registries, but there isn’t much going on in terms of cultivating a stable ecosystem.

When Google or Facebook try to enter & dominate a new vertical, the end destination may be extractive rent seeking by a monopoly BUT off the start they are at least willing to shoulder some of the risk & cost upfront to try to build awareness.

Where are the domain registries who have built successful new businesses on some of their new TLDs? Where are the subsidies offered to key talent to help drive awareness & promote the new strings?

As far as I know, none of that stuff exists.

In fact, what is prevalent is the exact opposite.

Greed-Based Anti-Marketing

So many of them are short-sighted, greed-based plays that they do the exact opposite of building an ecosystem … they hold back any domain which might potentially not be complete garbage so they can juice it for a premium ask price in the 10s of thousands of dollars.

While searching on GoDaddy Auctions for a client project I have seen new TLDs like .link listed for sale for MORE THAN the asking price of similar .org names.

If those prices had any sort of legitimate foundation then the person asking $30,000 for a .link would have bulk bought all the equivalent .net and .org names which are listed for cheaper prices.

But the prices are based on fantasy & almost nobody is dumb enough to pay those sorts of prices.

Anyone dumb enough to pay that would be better off buying their own registry rather than a single name.

The holding back of names is the exact opposite of savvy marketing investment. It means there’s no reason to use the new TLD if you either have to pay through the nose or use a really crappy name nobody will remember.

I didn’t buy more than 15 of Uniregistry’s domains because all names were reserved in the first place and I didn’t feel like buying 2nd tier domains … Domainers were angry when the first 2 Uniregistry’s New gTLDs (.sexy and .tattoo) came out and all remotely good names were reserved despite Frank saying that Uniregistry would not reserve any domains.

Who defeats the race to the bottom aspects of the web by starting off from a “we only sell shit” standpoint?

Nobody.

And that’s why these new TLDs are a zero.

Defaults Have Value

Many online verticals are driven by winner take most monopoly economics. There’s a clear dominant leader in each of these core markets: social, search, short-form video, long-form video, retail, auctions, real estate, job search, classifieds, etc. Some other core markets have consolidated down to 3 or 4 core players who among them own about 50 different brands that attack different parts of the market.

Almost all the category leading businesses which dominate aggregate usage are on .com domains.

Contrast the lack of marketing for new TLDs with all the marketing one sees for the .com domain name.

Local country code domain names & .com are not going anywhere. And both .org and .net are widely used & unlikely to face extreme price increases.

Hosing The Masses…

A decade ago domainers were frustrated Verisign increased the price of .com domains in ~ 5% increments:

Every mom, every pop, every company that holds a domain name had no say in the matter. ICANN basically said to Verisign: “We agree to let you hose the masses if you stop suing us”.

I don’t necessarily mind paying more for domains so much as I mind the money going to a monopolistic regulator which has historically had little regard for the registrants/registrars it should be serving

Those 5% or 10% shifts were considered “hosing the masses.”

Imagine what sort of blowback PIR would get from influential charities if they tried to increase the price of .org domains 30-fold overnight. It would be such a public relations disaster it would never be considered.

Domain registries are not particularly expensive to run. A person who has a number of them can run each of them for less than the cost of a full-time employee, say $25,000 to $50,000 per year.

And yet, the very people who complained about Verisign’s benign price increases, monopolistic abuses & rent extraction are now pushing massive price hikes:

.Hosting and .juegos are going up from about $10 to $20 retail to about $300. Other domains will also see price increases.

Here’s the thing with new TLD pricing: registry operators can increase prices as much as they want with just six months’ notice.

in its applications, Uniregistry said it planned to enter into a contractual agreement to not increase its prices for five years.

Why would anyone want to build a commercial enterprise (or anything they care about) on such a shoddy foundation?

If a person promises…

  • no hold backs of premium domains, then reserves 10s of thousands of domains
  • no price hikes for 5 years, then hikes prices
  • the eventual price hikes being inline with inflation, then hikes prices 3,000%

That’s 3 strikes and the batter is out.

Doing the Math

The claim that the new TLDs need more revenues to exist is untrue. Running an extension costs maybe $50,000 per year. If a registry operator wanted to build a vibrant & stable ecosystem, the first step would be dumping the concept of premium domains to encourage wide usage & adoption.

There are hundreds of these new TLD extensions, and almost none of them can be trusted to be a wise investment when compared against similar names in established extensions like .com, .net, .org & ccTLDs like .co.uk or .fr.

There’s no renewal price protection & there’s no need, especially as prices on the core TLDs have sharply come down.

Domain Pricing Trends

Aggregate stats are somewhat hard to come by as many deals are not reported publicly & many sites which aggregate sales data also list minimum prices.

However, domains have lost value for many reasons:

  • declining SEO-related value due to the search results becoming over-run with ads (Google keeps increasing their ad clicks 20% to 30% year over year)
  • broad market consolidation in key markets like travel, ecommerce, search & social
    • Google & Facebook are eating OVER 100% of online advertising growth – the rest of the industry is shrinking in aggregate
    • are there any major news sites which haven’t struggled to monetize mobile?
    • there is a reason there are few great indy blogs compared to a decade ago
  • rising technical costs in implementing independent websites (responsive design, HTTPS, AMP, etc.) “Closed platforms increase the chunk size of competition & increase the cost of market entry, so people who have good ideas, it is a lot more expensive for their productivity to be monetized. They also don’t like standardization … it looks like rent seeking behaviors on top of friction” – Gabe Newell
  • harder to break into markets with brand-biased relevancy algorithms (increased chunk size of competition)
  • less value in trying to build a brand on a generic name, which struggles to rank in a landscape of brand-biased algorithms (inability to differentiate while being generically descriptive)
  • decline in PPC park page ad revenues
    • for many years Yahoo! hid the deterioration in their core business by relying heavily on partners for ad click volumes, but after they switched to leveraging Bing search, Microsoft was far more interested in click quality vs. click quantity
    • absent the competitive bid from Yahoo!, Google drastically reduced partner payouts
    • most web browsers have replaced web address bars with dual function search boxes, drastically reducing direct navigation traffic

All the above are the mechanics of “why” prices have been dropping, but it is also worth noting many of the leading portfolios have been sold.

If the domain aftermarket is as vibrant as some people claim, there’s no way the Marchex portfolio of 200,000+ domains would have sold for only $28.1 million a couple years ago.

RegistrarStats shows .com registrations have stopped growing & other extensions like .net, .org, .biz & .info are now shrinking.

Both aftermarket domain prices & the pool of registered domains on established gTLDs are dropping.

I know I’ve dropped hundreds & hundreds of domains over the past year. That might be due to my cynical views of the market, but I did hold many names for a decade or more.

As barrier to entry increases, many of the legacy domains which could have one day been worth developing have lost much of their value.

And the picked over new TLDs are an even worse investment due to the near infinite downside potential of price hikes, registries outright folding, etc.

In the face of this declining value there is a rush of oversupply WITH irrational above-market pricing. And then the registries, which spend next to nothing on marketing, can’t understand why their great new namespaces went nowhere.

As much as I cringe at .biz & .info, I’d prefer either of them over just about any new TLD.

Any baggage they may carry is less than the risk of going with an unproven new extension without any protections whatsoever.

Losing Faith in the Zimbabwe Dollar

Who really loses is anyone who read what these domain registry operators wrote & trusted them.

Uniregistry does not believe that registry fees should rise when the costs of other technology services have uniformly trended downward, simply because a registry operator believes it can extract higher profit from its base of registrants.

How does one justify a 3,000% price hike after stating, “Our prices are fixed and only indexed to inflation after 5 years”?

Are they pricing these names in Zimbabwe Dollars? Or did they just change their minds in a way that hurt anyone who trusted them & invested in their ecosystem?

Frank Schilling warned about the dangers of lifting price controls

The combination of “presumptive renewal” and the “lifting of price controls on registry services” is incredibly dangerous.
Imagine buying a home, taking on a large mortgage, remodeling, moving in, only to be informed 6 months later that your property taxes will go up 10,000% with no better services offered by local government. The government doesn’t care if you can’t pay your tax/mortgage because they don’t really want you to pay your tax… they want you to abandon your home so they can take your property and resell it to a higher payer for more money, pocketing the difference themselves, leaving you with nothing.

This agreement as written leaves the door open to exactly that type of scenario

He didn’t believe the practice to be poor.

Rather, he felt he would have been made poorer, unless he was the person doing it:

It would be the mother of all Internet tragedies and a crippling blow to ICANN’s relevance if millions of pioneering registrants were taxed out of their internet homes as a result of the greed of one registry and the benign neglect, apathy or tacit support of its master.

It is a highly nuanced position.


SEO Book



New take on Showcase Shopping ads? Categories of used items showing for retailer outlet queries

Similar to the Showcase ad format introduced this summer, the ads link to Google Shopping pages.

The post New take on Showcase Shopping ads? Categories of used items showing for retailer outlet queries appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing



Case Study: How Two Artists Used Online Content to Build their Face to Face Business

Content Marketing Case Studies | copyblogger.com

Colorado-based artists Lori Wostl and Lorri Flint noticed that when they attended huge art retreats, the experience was more stressful than relaxing.

So, they founded their business — Art Camp for Women — in order to provide a fun, supportive, relaxing camp adventure for their participants.

They started a blog in order to help them market their camps, and got a sweet surprise a few years later when a huge national magazine called to offer them some amazing exposure.

Let’s talk to Lorri and Lori to find out more about their business, and how content marketing helps them reach their professional goals.

What’s your business?

We are Lori Wostl and Lorri Flint of Art Camp for Women. We run mixed-media art retreats for women, and we write online about creativity, art techniques, and mixed-media artists.

Who are your customers and readers, and how do you serve them?

We have a small but growing niche of women who are interested in mixed-media art (and in attending an art retreat). 

Our “campers” are women generally over 40 who have time and money to spend on themselves.

We’ve got women in our community from all walks of life — we have career women, retired women, empty-nesters, women who have recently recovered from cancer, women who are widowed or divorced … and everything in between!

Was there a pressing problem you were trying to solve?

Our business, Art Camp for Women, started because we wanted to provide an intimate retreat environment and a relaxing experience for our campers. ACFW is all-inclusive, and we provide chef-prepared meals, comfortable and cozy lodging, leading-edge art instruction, art supplies, and daily wine and chocolate.

We provide art retreats on a scale that allows our campers to meet and relate to everyone at the camp — including well-known artist-teachers.

We personally have attended “Big Box” retreats in the past with hundreds of attendees. Although the art instruction we received was amazing, our overall experience wasn’t great.

Unless you went with a friend, you were completely on your own in the evenings — often in a hotel in a strange city, without a car. We also had to bring all our own art supplies, which meant lugging incredibly heavy bags through security and onto the airplane.

We also had to arrange and pay for our own meals, which meant eating unhealthy and expensive food in our hotel.

Overall, the experience was exhausting. We are fit and healthy women, and we had to come home and rest up after going on our “retreats.”

So we started ACFW because we wanted to provide retreats where women could expand their art, be inspired by their surroundings, meet women from all over North America, and be rejuvenated — not become masters of the logistics of travel, lodging, food, and art supplies.

What kinds of online content are most important to your business?

On the Art Camp for Women blog, we publish art journaling prompts, free tutorials, interviews, organizing tips, and mixed-media art projects.

We use Pinterest to pin artwork, organizing tips for artists, architecture, travel, art exhibits, and photos from our Art Camps.

We use our Facebook page to direct people to our blog posts, and share links and resources from other writers and artists.  

We also run an email newsletter that features our blog content, regular contests for our community, and special offers.

In the last few years, we have joined a lot of different art groups online, trying to get the name of Art Camp for Women out into various communities. We also read and comment on lots of different blogs in the art community, which has helped us build relationships and market our camps.

What resources or tools did you find most helpful when you were getting started?

We took some classes on blogging and WordPress right at the beginning, and we knew that content marketing would be an important part of our marketing strategy.

We also took some business-building classes with our local Chamber of Commerce.  

This past fall, we signed up for Danny Iny’s Guest Posting course. By putting a lot of work into guest posts, we have seen a spike in our web traffic and increased our mailing list by more than 200%.

Were you always a business owner, or did you have a more traditional career before you started this business?

We had both worked in the corporate world as executives and trainers, and we had each had our own (different) businesses before we started ACFW. The traditional careers were fine at the time, but there’s no going back for us at this point. We like to make our own decisions and change direction quickly if we need to — flexibility is a top priority for us.

What were some of your tipping points or “a-ha!” moments? How did they come about?

In the fall of 2011, we were working at our computers when we received a life-changing phone call. It was the photography editor from Oprah Magazine. They were doing a feature story about self-expression and wanted to include our Art Camps in the story. Oprah has 3.8 million readers a month, so we were thrilled.

Three months after that call, Art Camp for Women was the first item listed in the February 2012 cover story, “Express Yourself! You from A to Z.” 

Our web traffic went from fewer than 100 hits a month (and sometimes far fewer) to more than 500 hits a day. In the first two weeks after the magazine was released, we had 4,000 hits on our site, and the average length of a visit was over three minutes.

Oprah magazine’s editors called us because we had a great website, and because we were findable in the search engines. And when they called, we were ready.

Since then, we’ve seen a lot more diversity in the women interested in coming to Art Camp. We’ve also had better teachers and artists interested in working with us. The experience also raised our confidence quite a bit — we really felt like we were playing on a whole different level.

Our only regret is that it would have been great to have some professional photographs ready — we sent the O editors the photos we had, but they didn’t use them.

What does your business look like today, and what’s next for you?

We have always been a business that operates in the black, and we have no company debt.  

In the fall of 2012, we organized an online campaign for 2013, both to acquaint us with possible Art Camp teachers and to grow our mailing list.

Our biggest business goal is to keep increasing the number of fully attended Art Camps we run each year. We’re also expanding our camp locations, and we’ll be doing camps in the tropics and in Europe.

We’re focusing on building our blog audience and our email mailing list.

Personally, we want occupations that contribute to the demographic of our choice (women artists and art lovers), with a comfortable income and flexible working hours. We also have a huge commitment to having fun while always learning something new.

What advice would you give to bloggers and content creators who are trying to build an online audience?

Build a viable mailing list and use it.

Follow up — stay in regular and timely contact with your list.

Always say yes to an opportunity and then figure out how to do it.

Don’t be afraid to give away tutorial information and actual (physical) gifts. It is a low-cost way to build your mailing list and grow your following.

Be willing to drop something that doesn’t work — even if it’s your favorite part.

After every event, or at regularly scheduled times, evaluate what worked and what didn’t. Make sure to do your evaluation in terms of dollars — not just emotions.

Make your photographs as professional as possible. You never know when Oprah may come calling!

About the Author: Beth Hayden is a Senior Staff Writer for Copyblogger Media. Get more from Beth on Twitter and Pinterest.

Copyblogger

Posted in Latest NewsComments Off

Mozscape in the Wild: How The API is (and Could be) Used

Posted by Ryan_Watson

Did you know that over 90 billion URLs are packed into our Mozscape API? That’s a lot of links. So many links, in fact, that it can be daunting to dream up all of the many ways you could put them to good use. When we originally built Linkscape (the predecessor to Mozscape), we mainly had one thing in mind… SEO and backlinks.

But there’s a whole lot more than that.

Links are only the beginning; it’s what those links can tell us that’s so darn interesting. Which is why I wanted to call out all of the amazing ways that developers (and marketers) are using Mozscape data to better their work, as well as encourage new uses of Mozscape data that have yet to be explored. (Feel free to jump in and create your own API key any time.)
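
To make that concrete, here is a minimal sketch in Python of pulling URL metrics from the API. It assumes the signed-request scheme from the Mozscape documentation (an AccessID plus a base64 HMAC-SHA1 signature over the ID and an expiry timestamp); the credentials and the Cols bitmask below are placeholders, so check the docs for your own values.

import base64
import hashlib
import hmac
import time
import urllib.parse
import urllib.request

ACCESS_ID = "member-xxxxxxxxxx"  # placeholder credentials
SECRET_KEY = "your-secret-key"   # placeholder

def url_metrics(target_url, cols):
    # Sign the request: base64-encoded HMAC-SHA1 over "AccessID\nExpires"
    expires = int(time.time()) + 300
    to_sign = "{}\n{}".format(ACCESS_ID, expires).encode("utf-8")
    signature = base64.b64encode(
        hmac.new(SECRET_KEY.encode("utf-8"), to_sign, hashlib.sha1).digest()
    ).decode("utf-8")
    query = urllib.parse.urlencode({
        "Cols": cols,
        "AccessID": ACCESS_ID,
        "Expires": expires,
        "Signature": signature,
    })
    endpoint = ("http://lsapi.seomoz.com/linkscape/url-metrics/"
                + urllib.parse.quote(target_url, safe="") + "?" + query)
    with urllib.request.urlopen(endpoint) as resp:
        return resp.read().decode("utf-8")

# Cols is a bitmask of the metrics you want; the number below is
# illustrative only -- look up the real column flags in the docs.
print(url_metrics("moz.com", cols=68719476736))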

How Mozscape is Being Used Today

Mozscape's wealth of links can be used in a variety of ways, from SEO audits to domain valuations to Excel integration. Here at Moz, we have only begun to scratch the surface of how we can utilize the API. We currently use it to run some of our own tools, such as Open Site Explorer and the MozBar.

But I don't want to focus on the way we use it. Let's take a look at the way other developers have demonstrated some exciting uses for Mozscape. Hopefully these will get your mind going, thinking up other ways to use the data as well.

SEO Audits

We’ll start with the most obvious of use cases: SEO audits. There are quite a few examples of SEO audit tools that use Mozscape data, but a few of our favorites (that sit in front of a paywall) are the HubSpot Website Grader and The Found SEO Audit Tool, both of which bring the heat.

Mozscape data is what powers things like the total pages indexed by search, MozRank, and a list of the most authoritative pages along with their corresponding anchor texts. The beauty of this use case is that it can provide a great lead-gen funnel for all of the SEO agencies out there, proving value up front with an email address required prior to running the report. As a digital marketing agency, using Mozscape data to power a site audit is a great way to get users into your sales funnel. You know, that inbound marketing stuff — cold calls are old news.

Domain Valuation

How valuable is a website, purely from an online authority perspective? Traditionally, that was a very tough question. You could look at things like site traffic (which typically isn’t very accurate) or rankings for certain terms, but those are indirect answers to the question. Think about using the metrics behind Mozscape instead, like MozRank, Domain Authority, and MozTrust. Flippa, for example, uses Mozscape data as a datapoint for due diligence.

You could imagine this kind of domain valuation anywhere else domains are bought or sold, most of which have yet to use Mozscape data. The value, of course, is giving buyers of web properties as much confidence as possible, based on the site’s “web footprint.”
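
As a rough illustration, a valuation score could be as simple as a weighted blend of those metrics. This toy Python function is a sketch only; the weights are invented for the example, and a real model would be calibrated against actual sale prices.

def valuation_score(domain_authority, mozrank, moztrust):
    # Domain Authority is on a 0-100 scale; MozRank and MozTrust
    # are 0-10, so scale them up before blending. The weights are
    # illustrative assumptions, not a calibrated model.
    return round(
        0.5 * domain_authority
        + 0.25 * (mozrank * 10)
        + 0.25 * (moztrust * 10),
        1,
    )

print(valuation_score(domain_authority=72, mozrank=6.1, moztrust=6.5))  # 67.5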

Spreadsheet Kung-Fu

The spreadsheet kung-fu of this industry is unmatched anywhere else. With the integration of Mozscape data into Excel, some have been able to make Excel sing. The beauty of using Excel for analyzing Mozscape data is that you can slice and dice as you please, without setting up complex API calls. Perhaps our favorite Excel example comes from the illustrious Richard Baxter, with the Links API Extension from SEO Gadget.

However, if Google Docs are more up your alley, the amazing Aleyda Solis created just the thing for you (so did Chris Lee). Tools like these allow the average marketer to dig into the firehose of data available through the API in a simple and recognizable interface.

Client Reporting

Yes, that's right. iAcquire uses the data when creating client reports, as it not only helps them inform the client about how their pages are doing but also shows the importance of certain pages on their site. The data is both a research tool and an education tool.

"Below is a screenshot from a ranking research report showing data we gathered for the keyword 'inbound marketing tips.' Moz stats are represented throughout the stats columns. As we work with these reports we are able to see if any of our content distribution efforts resulted in links on page or domain as can be seen in the far left columns."

iAcquire ranking research report

How Mozscape Could Be Used

That’s how Mozscape is being used today, but it’s only the tip of the iceberg. A few folks have realized the potential outside of the traditional use cases that I’ve mentioned above. The power of the data comes when we take Mozscape data outside of its traditional context of pure link evaluation. Let me show you what I mean.

Link Building

It's relatively easy to imagine Mozscape's data being used for link building. With Mozscape's massive amount of link data, SEOs can prioritize their link building and focus on the highest-value opportunities.

CRM

You could imagine that some of the examples noted above have been used for link building, but what about a deeper integration into a contact manager? Something that would allow the user to prioritize outreach by the value of a domain.

Just as the Klout score (or Social Authority) can be used to gauge importance on Twitter, Domain Authority can be used to filter and prioritize customer relationship efforts.
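
A sketch of that idea in Python, using a hypothetical contact list whose Domain Authority values have already been fetched from Mozscape:

from operator import itemgetter

# Hypothetical outreach contacts; the DA values would come from the API.
contacts = [
    {"name": "Ana",  "site": "example-blog.com",    "domain_authority": 41},
    {"name": "Ben",  "site": "bigindustrysite.com", "domain_authority": 78},
    {"name": "Caro", "site": "nichezine.net",       "domain_authority": 55},
]

# Prioritize outreach: highest-authority domains first, and drop
# anything below a minimum threshold.
MIN_DA = 40
queue = sorted(
    (c for c in contacts if c["domain_authority"] >= MIN_DA),
    key=itemgetter("domain_authority"),
    reverse=True,
)
for c in queue:
    print(c["name"], c["site"], "DA", c["domain_authority"])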

Top Lists

We’ve seen hints of blogs using Mozscape data to determine a top startup list, like the GeekWire 200, but the same could be applied to any rankings list of web properties.

Traditionally, lists have used Alexa or Compete traffic data to determine web prominence, but those numbers are notoriously inaccurate. Other lists have used social-specific metrics like follower counts, but those too fall short. GeekWire’s list of the top 200 startups in Seattle uses a blend of both social and web data (external links, MozTrust) to determine just how influential a site is, providing the full picture.
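
For illustration, here is one way such a blended score might be computed in Python. The log scaling and weights are assumptions for the example, not GeekWire’s actual formula; the point is simply that no single metric tells the whole story.

import math

def prominence(external_links, moztrust, social_followers):
    # Dampen raw counts with a log so one huge number cannot
    # dominate; MozTrust is already on a 0-10 scale.
    web = math.log10(external_links + 1) * 10
    trust = moztrust * 10
    social = math.log10(social_followers + 1) * 10
    # Illustrative weights only.
    return round(0.4 * web + 0.3 * trust + 0.3 * social, 1)

sites = {
    "alpha.io": prominence(12000, 5.8, 30000),
    "beta.co": prominence(900, 6.4, 120000),
}
for site, score in sorted(sites.items(), key=lambda kv: -kv[1]):
    print(site, score)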

How Could You Use the API?

I’m sure we’ve missed a ton of ideas, so we’re calling on you to help us find those new opportunities for Mozscape: things like tightening the relationship between links and social networks, or categorizing link sources. How would you use this data, and how would you build it? Better yet, why not create your key and get going?

We want to make it easy for you.

We've been working quite hard to make our indexes faster, and we have recently updated our Mozscape API documentation. We want to make it as simple as possible for you to use the data and get your idea up and running.

Plus, if you create something, it's likely we'll add you to our app gallery. We have everyone from large corporations to individuals who have used the API, and we show off their work in the gallery.

We'd love to hear from you. Obviously we always encourage folks to jump in and check out the free API (as well as the paid), and use the data for something useful for you. We're also quite open to hearing about ways we can improve our own tools with the data or help educate people better. I look forward to reading through your feedback and seeing if there are ways we can help get people started using Mozscape.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


SEOmoz Daily SEO Blog

Posted in Latest NewsComments Off

[Video] How The Boston Globe used customer insight to create new strategy

In this MarketingSherpa blog post, watch a video excerpt from last year’s Optimization Summit. Peter Doucette, Executive Director of Circulation Sales & Marketing, The Boston Globe, presented how his company transitioned from a free to a paid product. This year, Peter will return to Optimization Summit 2013 with a part-two presentation on the transition.
MarketingSherpa Blog

Posted in Latest NewsComments Off
