Tag Archive | "used"

Case Study: How a Media Company Grew 400% and Used SEO to Get Acquired

Posted by Gaetano-DiNardi-NYC

Disclaimer: I’m currently the Director of Demand Generation at Nextiva, and writing this case study post-mortem as the former VP of Marketing at Sales Hacker (Jan. 2017 – Sept. 2018).



Every B2B company is investing in content marketing right now. Why? Because they all want the same thing: Search traffic that leads to website conversions, which leads to money.

But here’s the challenge: Companies are struggling to get traction because competition has reached an all-time high. Keyword difficulty (and CPC) has skyrocketed in most verticals. In my current space, Unified Communication as a Service (UCaaS), some of the CPCs have nearly doubled since 2017, with many keywords hovering close to $300 per click.

Not to mention, organic CTRs are declining, and zero-click queries are rising.

Bottom line: If you’re not creating 10x quality content based on strategic keyword research that satisfies searcher intent and aligns back to business goals, you’re completely wasting your time.

So, that’s exactly what we did. The outcome? We grew from 19k monthly organic sessions to over 100k monthly organic sessions in approximately 14 months, leading to an acquisition by Outreach.io.

We validated our hard work by measuring organic growth (traffic and keywords) against our email list growth and revenue, which correlated positively, as we expected. 

Organic Growth Highlights

January 2017–June 2018

As soon as I was hired at Sales Hacker as Director of Marketing, I began making SEO improvements from day one. While I didn’t waste any time, you’ll also notice that there was no silver bullet.

This was the result of daily blocking and tackling. Pure execution and no growth hacks or gimmicks. However, I firmly believe that the homepage redesign (in July 2017) was a tremendous enabler of growth.

Organic Growth to Present Day

I officially left Sales Hacker in August of 2018, when the company was acquired by Outreach.io. However, I thought it would be interesting to see the lasting impact of my work by sharing a present-day screenshot of the organic traffic trend, via Google Analytics. There appears to be a dip immediately following my departure; however, it looks like my successor, Colin Campbell, has picked up the slack and got the train back on the rails. Well done!

Unique considerations — Some context behind Sales Hacker’s growth

Before I dive into our findings, here’s a little context behind Sales Hacker’s growth:

  • Sales Hacker’s blog is 100 percent community-generated — This means we didn’t pay “content marketers” to write for us. Sales Hacker is a publishing hub led by B2B sales, marketing, and customer success contributors. This can be a blessing and a curse at the same time — on one hand, the site gets loads of amazing free content. On the other hand, the posts are not even close to being optimized upon receiving the first draft. That means the editorial process is intense and laborious.
  • Aggressive publishing cadence (4–5x per week) — Sales Hacker built an incredible reputation in the B2B Sales Tech niche — we became known as the go-to destination for unbiased thought leadership for practitioners in the space (think of Sales Hacker as the sales equivalent to Growth Hackers). Due to high demand and popularity, we had more content available than we could handle. While it’s a good problem to have, we realized we needed to keep shipping content in order to avoid a content pipeline blockage and a backlog of unhappy contributors.
  • We had to “reverse engineer” SEO — In short, we got free community-generated and sponsored content from top sales and marketing leaders at SaaS companies like Intercom, HubSpot, Pipedrive, LinkedIn, Adobe and many others, but none of it was strategically built for SEO out of the box. We also had contributors like John Barrows, Richard Harris, Lauren Bailey, Tito Bohrt, and Trish Bertuzzi giving us a treasure trove of amazing content to work with. However, we had to collaborate with each contributor from beginning to end and guide them through the entire process. Topical ideation (based on what they were qualified to write about), keyword research, content structure, content type, etc. So, the real secret sauce was in our editorial process. Shout out to my teammate Alina Benny for learning and inheriting my SEO process after we hired her to run content marketing. She crushed it for us!
  • Almost all content was evergreen and highly tactical — I made it a rule that we’d never agree to publish fluffy pieces, whether it was sponsored or not. Plain and simple. Because we didn’t allow “content marketers” to publish with us, our content had a positive reputation, since it was coming from highly respected practitioners. We focused on evergreen content strategies in order to fuel our organic growth. Salespeople don’t want fluff. They want actionable and tactical advice they can implement immediately. I firmly believe that achieving audience satisfaction with our content was a major factor in our SEO success.
  • Outranking the “big guys” — If you look at the highest-ranking sales content, it’s the usual suspects: HubSpot, Salesforce, Forbes, Inc, and many other sites that were far more powerful than Sales Hacker. But it didn’t matter as much as traditional SEO wisdom tells us, largely because we had authenticity and rawness to our content. We realized most sales practitioners would rather read insights from their peers in their community than the traditional “Ultimate Guides,” which tended to be a tad dry.
  • We did VERY little manual link building — Our link building was literally an email from me, or our CEO, to a site we had a great relationship with. “Yo, can we get a link?” It was that simple. We never did large-scale outreach to build links. We were a very lean, remote digital marketing team, and therefore lacked the bandwidth to allocate resources to link building. However, we knew that we would acquire links naturally due to the popularity of our brand and the highly tactical nature of our content.
  • Our social media and brand firepower helped us to naturally acquire links — It helps A LOT when you have a popular brand on social media and a well-known CEO who authored an essential book called “Hacking Sales.” Most of Sales Hacker’s articles would get widely circulated by 50+ SaaS partners, which would help drive natural links.
  • Updating stale content was the lowest-hanging fruit — The biggest chunk of our newfound organic traffic came from updating and refreshing old posts. We have specific examples of this coming up later in the post.
  • Email list growth was the “north star” metric — Because Sales Hacker is not a SaaS company, and the “product” is the audience, there was no need for aggressive website CTAs like “book a demo.” Instead, we built a very relationship-heavy, referral-based sales cadence that was supported by marketing automation, so list growth was the metric to pay attention to. This was also a key component of positioning Sales Hacker for acquisition. Here’s how the email growth progression was trending.

    So, now that I’ve set the stage, let’s dive into exactly how I built this SEO strategy.

    Bonus: You can also watch the interview I had with Dan Shure on the Evolving SEO Podcast, where I break down this strategy in great detail.

    1) Audience research

    Imagine you are the new head of marketing for a well-known startup brand. You are tasked with tackling growth and need to show fast results — where do you start?

    That’s the exact position I was in. There were a million things I could have done, but I decided to start by surveying and interviewing our audience and customers.

    Because Sales Hacker is a business built on content, I knew this was the right choice.

    I also knew that I would be able to stand out in an unglamorous industry by talking to customers about their content interests.

    Think about it: B2B tech sales is all about numbers and selling stuff. Very few brands are really taking the time to learn about the types of content their audiences would like to consume.

    When I was asking people if I could talk to them about their media and content interests, their response was: “So, wait, you’re actually not trying to sell me something? Sure! Let’s talk!”

    Here’s what I set out to learn:

    • Goal 1 — Find one major brand messaging insight.
    • Goal 2 — Find one major audience development insight.
    • Goal 3 — Find one major content strategy insight.
    • Goal 4 — Find one major UX / website navigation insight.
    • Goal 5 — Find one major email marketing insight.

    In short, I accomplished all of these learning goals and implemented changes based on what the audience told me.

    If you’re curious, you can check out my entire UX research process for yourself, but here are some of the key learnings:

    Based on these outcomes, I was able to determine the following:

    • Topical “buckets” to focus on — Based on the most common daily tasks, the data told us to build content on sales prospecting, building partnerships and referral programs, outbound sales, sales management, sales leadership, sales training, and sales ops.
    • Thought leadership — 62 percent of site visitors said they kept coming back purely due to thought leadership content, so we had to double down on that.
    • Content Types — Step-by-step guides, checklists, and templates were highly desired. This told me that fluffy BS content had to be ruthlessly eliminated at all costs.
    • Sales Hacker Podcast — 76 percent of respondents said they would listen to the Sales Hacker Podcast (if it existed), so we had to launch it!

    2) SEO site audit — Key findings

    I can’t fully break down how to do an SEO site audit step by step in this post (because it would be way too much information), but I will share the key findings and takeaways from our own Site Audit that led to some major improvements in our website performance.

    Lack of referring domain growth

    Sales Hacker was not able to acquire referring domains at the same rate as competitors. I knew this wasn’t because of a link building acquisition problem, but due to a content quality problem.

    Lack of organic keyword growth

    Sales Hacker had been publishing blog content for years (before I joined) and there wasn’t much to show for it from an organic traffic standpoint. However, I do feel the brand experienced a remarkable social media uplift by building content that was helpful and engaging. 

    Sales Hacker did happen to get lucky and rank for some non-branded keywords by accident, but the amount of content published versus the amount of traffic they were getting wasn’t making sense. 

    To me, this immediately screamed that there was an issue with on-page optimization and keyword targeting. It wasn’t anyone’s fault – this was largely due to a startup founder thinking about building a community first, and then bringing SEO into the picture later. 

    At the end of the day, Sales Hacker was only ranking for 6k keywords at an estimated organic traffic cost of $8.9k — which is nothing. By the time Sales Hacker got acquired, the site had an organic traffic cost of $122k.

    Non-optimized URLs

    This is common among startups that are just looking to get content out. This is just one example, but truth be told, there was a whole mess of non-descriptive URLs that had to get cleaned up.

    Poor internal linking structure

    The internal linking concentration was poorly distributed. Most of the equity was pointing to some of the lowest value pages on the site.

    Poor taxonomy, site structure, and navigation

    I created a mind-map of how I envisioned the new site structure and internal linking scheme. I wanted all the content pages to be organized into categories and subcategories.
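    For illustration, here is the shape of the structure I had in mind. The categories below are examples drawn from our topical buckets, not the exact final taxonomy:

        saleshacker.com/
        ├── sales-prospecting/
        │   ├── cold-calling/
        │   └── cold-emailing/
        ├── sales-management/
        ├── sales-training/
        └── sales-ops/

    Each post would be filed under exactly one category or subcategory, so internal links flow naturally up and down the hierarchy.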

    My goals for the new proposed taxonomy were to accomplish the following:

    • Increase engagement from natural site visitor exploration
    • Allow users to navigate to the most important content on the site
    • Improve landing page visibility from an increase in relevant internal links pointing to them.

    Topical directories and category pages eliminated with redirects

    Topical landing pages used to exist on SalesHacker.com, but they were eliminated with 301 redirects and disallowed in robots.txt. I didn’t agree with this configuration. Example: /social-selling/
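    To spell out why that combination is self-defeating: a URL that is disallowed in robots.txt never gets crawled, so search engines never see the 301, and the link equity pointing at the old topical pages never gets consolidated into the redirect targets. Here is a minimal sketch of the conflicting configuration (the redirect target is illustrative, not the actual one):

        # robots.txt: blocks crawlers from ever fetching the old category URL...
        User-agent: *
        Disallow: /social-selling/

        # Apache config: ...so this 301 is never discovered or followed
        Redirect 301 /social-selling/ https://www.saleshacker.com/blog/

    Either let the 301s be crawled so they pass equity, or disallow the path, but doing both strands the old pages.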

    Trailing slash vs. non-trailing slash duplicate content with canonical errors

    Both the trailing slash and non-trailing slash versions of the same URLs resolved separately: multiple pages for the same exact intent, with no canonical version specified.
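    Here is a minimal sketch of the standard fix, assuming the trailing slash version is chosen as canonical (the URLs are illustrative):

        <!-- On every duplicate version of a page, point at the single canonical URL -->
        <link rel="canonical" href="https://www.saleshacker.com/sales-operations/" />

        # And/or enforce one URL form at the server (Apache), 301-ing the non-trailing slash version
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^(.*[^/])$ /$1/ [R=301,L]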

    Branded search problems — “Sales Hacker Webinar”

    Some of the site’s most important content is not discoverable from search due to technical problems. For example, a search for “Sales Hacker Webinar” returns irrelevant results in Google because there isn’t an optimized indexable hub page for webinar content. It doesn’t get that much search volume (0–10 monthly volume according to Keyword Explorer), but still, that’s 10 potential customers you are pissing off every month by not fixing this.

    3) Homepage — Before and after

    Sooooo, this beauty right here (screenshot below) was the homepage I inherited in early 2017 when I took over the site.

    Fast forward six months later, and this was the new homepage we built after doing audience and customer research…

    New homepage goals

    • Tell people EXACTLY what Sales Hacker is and what we do.
    • Make it stupidly simple to sign up for the email list.
    • Allow visitors to easily and quickly find the content they want.
    • Add social proof.
    • Improve internal linking.

    I’m proud to say that it all went according to plan. I’m also proud to say that, as a result, organic traffic skyrocketed shortly after.

    Special Note: Major shout out to Joshua Giardino, the lead developer who worked with me on the homepage redesign. Josh is one of my closest friends and my marketing mentor. I would not be writing this case study today without him!

    There wasn’t one super measurable thing we isolated in order to prove this. We just knew intuitively that there was a positive correlation with organic traffic growth, and figured it was due to the internal linking improvements and increased average session duration from improving the UX.

    4) Updating and optimizing existing content

    Special note: We enforced “Ditch the Pitch”

    Before I get into the nitty-gritty SEO stuff, I’ll tell you right now that one of the most important things we did was blockade contributors and sponsors from linking to product pages and injecting screenshots of product features into blog articles, webinars, etc.

    Side note: One thing we also had to do was add a nofollow attribute to all outbound links within sponsored content that sent referral traffic back to partner websites (which is no longer applicable due to the acquisition).
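    For anyone unfamiliar with the mechanics, this is all it amounts to in the markup (the URL and anchor text here are made up for illustration):

        <!-- Sponsored/partner link with nofollow applied -->
        <a href="https://partner-tool.example.com/" rel="nofollow">Try the prospecting tool</a>

    The nofollow attribute tells search engines not to pass link equity through the link, while the link still works fine for referral traffic.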

    The #1 complaint we discovered in our audience research was that people were getting irritated with content that was “too salesy” or “too pitchy” — and rightfully so, because who wants to get pitched at all day?

    So we made it all about value. Pure education. School of hard knocks style insights. Actionable and tactical. No fluff. No nonsense. To the point.

    And that’s where things really started to take off.

    Before and after: “Best sales books”

    What you are about to see is classic SEO on-page optimization at its finest.

    This is what the post originally looked like (and it didn’t rank well for “best sales books”).

    And then after…

    And the result…

    Before and after: “Sales operations”

    What we noticed here was a crappy article attempting to explain the role of sales operations.

    Here are the steps we took to rank #1 for “Sales Operations:”

    • Built a super optimized mega guide on the topic.
    • Since the old crappy article had some decent links, we figured let’s 301 redirect it to the new mega guide (see the sketch just after this list).
    • Promoted it on social, email, and our usual channels.
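    For reference, that kind of consolidation redirect is a one-liner in Apache config (both paths here are hypothetical):

        # Send the old post's traffic and link equity to the new mega guide
        Redirect 301 /old-sales-operations-post/ https://www.saleshacker.com/sales-operations/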

    Here’s what the new guide on Sales Ops looks like…

    And the result…

    5) New content opportunities

    One thing I quickly realized Sales Hacker had to its advantage was topical authority. Exploiting this was going to be our secret weapon, and boy, did we do it well: 

    “Cold calling”

    We knew we could win this SERP by creating content that was super actionable and tactical with examples.

    Most of the competing articles in the SERP were definition style and theory-based, or low-value roundups from domains with high authority.

    In this case, DA doesn’t really matter. The better man wins.

    “Best sales tools”

    Because Sales Hacker is an aggregator website, we had the advantage of easily out-ranking vendor websites for best and top queries.

    Of course, it also helps when you build a super helpful mega list of tools. We included 150+ options to choose from in the list, whereas SERP competitors did not even come close.

    “Channel sales”

    Notice how Sales Hacker’s article from 2017 still beats HubSpot’s 2019 version. Why? Because we probably satisfied user intent better than they did.

    For this query, we figured out that users really want to know about Direct Sales vs Channel Sales, and how they intersect.

    HubSpot went for the generic, “factory style” Ultimate Guide tactic.

    Don’t get me wrong, it works very well for them (especially with their 91 DA), but here is another example where nailing the user intent wins.

    “Sales excel templates”

    This was pure lead gen gold for us. Everyone loves templates, especially sales excel templates.

    The SERP was easily winnable because the competition was so BORING in their copy. Not only did we build a better content experience, but we used numbers, lists, and power words that salespeople like to see, such as FAST and Pipeline Growth.

    Special note: We never used long intros

    The one trend you’ll notice is that all of our content gets RIGHT TO THE POINT. This is inherently obvious, but we also uncovered it during audience surveying. Salespeople don’t have time for fluff. They need to cut to the chase ASAP, get what they came for, and get back to selling. It’s really that straightforward.

    When you figure out something THAT important to your audience, (like keeping intros short and sweet), and then you continuously leverage it to your advantage, it’s really powerful.

    6) Featured Snippets

    Featured snippets became a huge part of our quest for SERP dominance. Even for SERPs where organic clicks have declined, we didn’t mind as much, because we knew we were getting the snippet and free brand exposure.

    Here are some of the best featured snippets we got!

    Featured snippet: “Channel sales”

    Featured snippet: “Sales pipeline management”

    Featured snippet: “BANT”

    Featured snippet: “Customer success manager”

    Featured snippet: “How to manage a sales team”

    Featured snippet: “How to get past the gatekeeper”

    Featured snippet: “Sales forecast modeling”

    Featured snippet: “How to build a sales pipeline”

    7) So, why did Sales Hacker get acquired?

    At first, it seems weird. Why would a SaaS company buy a blog? It really comes down to one thing — community (and the leverage you get with it).

    Two learnings from this acquisition are:

    1. It may be worth acquiring a niche media brand in your space

    2. It may be worth starting your own niche media brand in your space

    I feel like most B2B companies (not all, but most) come across as only trying to sell a product — because most of them are. You don’t see the majority of B2B brands doing a good job on social. They don’t know how to market to emotion. They completely ignore top-funnel in many cases and, as a result, get minimal engagement with their content.

    There are really so many areas of opportunity to exploit in B2B marketing if you know how to leverage that human emotion — it’s easy to stand out if you have a soul. Sales Hacker became that “soul” for Outreach — that voice and community.

    But one final reason why a SaaS company would buy a media brand is to get the edge over a rival competitor. Especially in a niche where two giants are battling over the top spot.

    In this case, it’s Outreach’s good old arch-nemesis, Salesloft. You see, both Outreach and Salesloft are fighting tooth and nail to win a new category called “Sales Engagement”.

    As part of the acquisition process, I prepared a deck that highlighted how beneficial it would be for Outreach to acquire Sales Hacker, purely based on the traffic advantage it would give them over Salesloft.

    Sales Hacker vs. Salesloft vs. Outreach — Total organic keywords

    This chart from 2018 (data exported via SEMrush) shows that Sales Hacker was ranking for more total organic keywords than Salesloft and Outreach combined.

    Sales Hacker vs. Salesloft vs. Outreach — Estimated traffic cost

    This chart from 2018 (data exported via SEMrush) compares estimated organic traffic cost by domain. Sales Hacker had the highest traffic cost because it ranked for more commercial terms.

    Sales Hacker vs. Salesloft vs. Outreach — Rank zone distributions

    This chart from 2018 (data exported via SEMrush) displays the rank zone distribution by domain. Sales Hacker ranked for more organic keywords across all search positions.

    Sales Hacker vs. Salesloft vs. Outreach — Support vs. demand keywords

    This chart from 2018 (data exported via SEMrush) displays support vs. demand keywords by domain. Because Sales Hacker did not have a support portal, all of its keywords were inherently demand-focused.

    Meanwhile, Outreach was mostly ranking for support keywords at the time. Compared to Salesloft, they were at a massive disadvantage.

    Conclusion

    I wouldn’t be writing this right now without the help, support, and trust that I got from so many people along the way.

    • Joshua Giardino — Lead developer at Sales Hacker, my marketing mentor and older brother I never had. Couldn’t have done this without you!
    • Max Altschuler — Founder of Sales Hacker, and the man who gave me a shot at the big leagues. You built an incredible platform and I am eternally grateful to have been a part of it.
    • Scott Barker — Head of Partnerships at Sales Hacker. Thanks for being in the trenches with me! It’s a pleasure to look back on this wild ride, and wonder how we pulled this off.
    • Alina Benny — My marketing protege. Super proud of your growth! You came into Sales Hacker with no fear and seized the opportunity.
    • Mike King — Founder of iPullRank, and the man who gave me my very first shot in SEO. Thanks for taking a chance on an unproven kid from the Bronx who was always late to work.
    • Yaniv Masjedi — Our phenomenal CMO at Nextiva. Thank you for always believing in me and encouraging me to flex my thought leadership muscle. Your support has enabled me to truly become a high-impact growth marketer.

    Thanks for reading — tell me what you think below in the comments!



    Moz Blog


    Sites vulnerable to XSS can be used to phish Googlebot

    How to protect your site from being used in XSS exploits.



    Please visit Search Engine Land for the full article.


    Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


    Machine Learning Should Be Used to Deliver Great Brand Experiences, Says PagerDuty CEO

    PagerDuty began trading on the New York Stock Exchange for the first time this morning and is now trading at more than 60% above its IPO price of $24. That gives the company a market capitalization of more than $2.7 billion. PagerDuty offers a SaaS platform that monitors IT performance. The company had sales of $118 million for its last fiscal year, up close to 50% over the previous year.

    The company uses machine learning to inform companies in real-time about technical issues. “Our belief is that machine learning and data should be used in the service of making people better, helping people do their jobs more effectively, and delivering those great brand experiences every time,” says PagerDuty CEO Jennifer Tejada. “PagerDuty is really about making sure that our users understand that this could be a good thing, being woken up in the middle of the night if it’s for the right problem. It’s a way that can help you deliver a much better experience for your customers.”

    Jennifer Tejada, CEO of PagerDuty, discusses their IPO and how machine learning should be used to deliver great brand experiences in an interview on CNBC:

    It’s Gotten Harder for Humans to Manage the Entire IT Ecosystem

    If you think about the world today, it’s an always-on world. We as consumers expect every experience to be perfect. Every time you wake up in the morning, you order your coffee online, you check Slack to communicate with your team, and maybe you take a Lyft into work. Sitting behind all of that is a lot of complexity: many digital and infrastructure-based platforms that don’t always work together the way you’d expect them to. As that complexity has proliferated over the years, and because developers can deploy what they like and use the tools that they want, it’s gotten harder for human beings to really manage the entire ecosystem even as your demands increase.

    You want it perfect, you want it right now, and you want it the way you’d like it to be. PagerDuty is the platform that brings the right problem to the right person at the right time. We use machine learning, sitting on ten years of data (data on human behavior and on all the signals happening through the system), and it really helps the developers that sit behind these great experiences to deliver the right experience all the time.

    Machine Learning Should Be Used to Deliver Great Brand Experiences

    Going public is the right time for us right now because there’s an opportunity for us to deliver the power of our platform to users all over the world. We are a small company, and we weren’t as well known as we could be, so this is a great opportunity to extend our brand and help developers and employees across teams (IT, security, and customer support) to deliver better experiences for their end customers all the time.

    At PagerDuty we take customer trust and user trust very seriously. We publish our data policy and we will not use data in a way other than what we describe online. We care deeply about the relationship between our users and our platform. Our belief is that machine learning and data should be used in the service of making people better, helping people do their jobs more effectively, and delivering those great brand experiences every time. PagerDuty is really about making sure that our users understand that this could be a good thing, being woken up in the middle of the night if it’s for the right problem. It’s a way that can help you deliver a much better experience for your customers.


    WebProNews


    Moz’s Link Data Used to Suck… But Not Anymore! The New Link Explorer is Here – Whiteboard Friday

    Posted by randfish

    Earlier this week we launched our brand-new link building tool, and we’re happy to say that Link Explorer addresses and improves upon a lot of the big problems that have plagued our legacy link tool, Open Site Explorer. In today’s Whiteboard Friday, Rand transparently lists out many of the biggest complaints we’ve heard about OSE over the years and explains the vast improvements Link Explorer provides, from DA scores updated daily to historic link data to a huge index of almost five trillion URLs.


    Video Transcription

    Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m very excited to say that Moz’s Open Site Explorer product, which had a lot of challenges with it, is finally being retired, and we have a new product, Link Explorer, that’s taking its place. So let me walk you through why and how Moz’s link data for the last few years has really kind of sucked. There’s no two ways about it.

    If you heard me here on Whiteboard Friday, if you watched me at conferences, if you saw me blogging, you’d probably see me saying, “Hey, I personally use Ahrefs, or I use Majestic for my link research.” Moz has a lot of other good tools. The crawler is excellent. Moz Pro is good. But Open Site Explorer was really lagging, and today, that’s not the case. Let me walk you through this.

    The big complaints about OSE/Mozscape

    1. The index was just too small

    Mozscape was probably about a fifth to a tenth the size of its competitors. While it got a lot of the good, quality links of the web, it just didn’t get enough. As SEOs, we need to know all of the links, the good ones and the bad ones.

    2. The data was just too old

    So, in Mozscape, a link that you built on November 1st, you got a link added to a website, you’re very proud of yourself. That’s excellent. You should expect that a link tool should pick that up within maybe a couple weeks, maybe three weeks at the outside. Google is probably picking it up within just a few days, sometimes hours.

    Yet, when Mozscape would crawl that, it would often be a month or more later, and by the time Mozscape processed its index, it could be another 40 days after that, meaning that you could see a 60- to 80-day delay, sometimes even longer, between when your link was built and when Mozscape actually found it. That sucks.

    3. PA/DA scores took forever to update

    PA/DA scores, likewise, took forever to update because of this link problem. So the index would say, oh, your DA is over here. You’re at 25, and now maybe you’re at 30. But in reality, you’re probably far ahead of that, because you’ve been building a lot of links that Mozscape just hasn’t picked up yet. So this is this lagging indicator. Sometimes there would be links that it just didn’t even know about. So PA and DA just wouldn’t be as accurate or precise as you’d want them to be.

    4. Some scores were really confusing and out of date

    MozRank and MozTrust relied on essentially the original Google PageRank paper from 1997, which there’s no way that’s what’s being used today. Google certainly uses some view of link equity that’s passed between links that is similar to PageRank, and I think they probably internally call that PageRank, but it looks nothing like what MozRank was.

    Likewise, MozTrust, way out of date, from a paper in I think 2002 or 2003. Much more advancements in search have happened since then.

    Spam score was also out of date. It used a system that was correlated with what spam looked like three, four years ago, so much more up to date than these two, but really not nearly as sophisticated as what Google is doing today. So we needed to toss those out and find their replacements as well.

    5. There was no way to see links gained and lost over time

    Mozscape had no way to see gained and lost links over time, and folks thought, “Gosh, these other tools in the SEO space give me this ability to show me links that their index has discovered or links they’ve seen that we’ve lost. I really want that.”

    6. DA didn’t correlate as well as it should have

    So over time, DA became a less and less indicative measure of how well you were performing in Google’s rankings. That needed to change as well. The new DA, by the way, much, much better on this front.

    7. Bulk metrics checking and link reporting was too hard and manual

    So folks would say, “Hey, I have this giant spreadsheet with all my link data. I want to upload that. I want you guys to crawl it. I want to go fetch all your metrics. I want to get DA scores for these hundreds or thousands of websites that I’ve got. How do I do that?” We didn’t provide a good way for you to do that either unless you were willing to write code and loop in our API.

    8. People wanted distribution of their links by DA

    They wanted distributions of their links by domain authority. Show me where my links come from, yes, but also what sorts of buckets of DA do I have versus my competition? That was also missing.

    So, let me show you what the new Link Explorer has.

    Moz's new Link Explorer

    Wow, look at that magical board change, and it only took a fraction of a second. Amazing.

    What Link Explorer has done, as compared to the old Open Site Explorer, is pretty exciting. I’m actually very proud of the team. If you know me, you know I am a picky SOB. I usually don’t even like most of the stuff that we put out here, but oh my god, this is quite an incredible product.

    1. Link Explorer has a GIANT index

    So I mentioned index size was a big problem. Link Explorer has got a giant index. Frankly, it’s about 20 times larger than what Open Site Explorer had and, as you can see, very, very competitive with the other services out there. Majestic Fresh says they have about a trillion URLs from their I think it’s the last 60 days. Ahrefs, about 3 trillion. Majestic’s historic, which goes all time, has about 7 trillion, and Moz, just in the last 90 days, which I think is our index — maybe it’s a little shorter than that, 60 days — 4.7 trillion, so almost 5 trillion URLs. Just really, really big. It covers a huge swath of the web, which is great.

    2. All data updates every 24 hours

    So, unlike the old index, it is very fresh. Every time it finds a new link, it updates PA scores and DA scores. Every morning, the interface can show you all the links that it found just yesterday.

    3. DA and PA are tracked daily for every site

    You don’t have to track them yourself. You don’t have to put them into your campaigns. Every time you go and visit a domain, you will see this graph showing you domain authority over time, which has been awesome.

    For my new company, I’ve been tracking all the links that come in to SparkToro, and I can see my DA rising. It’s really exciting. I put out a good blog post, I get a bunch of links, and my DA goes up the next day. How cool is that?

    4. Old scores are gone, and new scores are polished and high quality

    So we got rid of MozRank and MozTrust, which were very old metrics and, frankly, very few people were using them, and most folks who were using them didn’t really know how to use them. PA basically takes care of both of them. It includes the weight of links that come to you and the trustworthiness. So that makes more sense as a metric.

    Spam score is now on a 0 to 100% risk model, instead of the old 0 to 17 flags where the flags correlated to some percentage. So, 0 to 100 risk model. Spam score is basically a machine learning model built against sites that Google penalized or banned.

    So we took a huge amount of domains. We ran their names through Google. If they couldn’t rank for their own name, we said they were penalized. If we did a site:thedomain.com and Google had de-indexed them, we said they were banned. Then we built this risk model. So a 90% spam score means 90% of sites that had these qualities were penalized or banned. 2% means only 2% did. If you have a 30% spam score, that’s not too bad. If you have a 75% spam score, it’s getting a little sketchy.
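    To make that labeling logic concrete, here is a rough sketch in Python. This is not Moz's actual pipeline: get_top_results is a hypothetical stand-in for whatever SERP data collection was really used, and the function only illustrates the penalized/banned heuristic described above.

        # Hypothetical sketch of the penalized/banned labeling heuristic.
        # get_top_results(query) is a stand-in for SERP data collection and is
        # assumed to return the list of URLs ranking for that query.

        def label_domain(domain: str, get_top_results) -> str:
            brand_query = domain.split(".")[0]  # e.g. "moz" from "moz.com"

            # A site: query returning nothing means Google de-indexed it: banned.
            if not get_top_results(f"site:{domain}"):
                return "banned"

            # Not ranking anywhere for its own brand name: treat as penalized.
            if not any(domain in url for url in get_top_results(brand_query)):
                return "penalized"

            return "ok"

    Domains labeled this way become the training classes for the risk model.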

    5. Discovered and lost links are available for every site, every day

    So again, for this new startup that I’m doing, I’ve been watching as I get new links and I see where they come from, and then sometimes I’ll reach out on Twitter and say thank you to those folks who are linking to my blog posts and stuff. But it’s very, very cool to see links that I gain and links that I lose every single day. This is a feature that Ahrefs and Majestic have had for a long time, and frankly Moz was behind on this. So I’m very glad that we have it now.

    6. DA is back as a high-quality leading indicator of ranking ability

    So, a note that is important: everyone’s DA has changed. Your DA has changed. My DA has changed. Moz’s DA changed. Google’s DA changed. I think it went from a 98 to a 97. My advice is take a look at yourself versus all your competitors that you’re trying to rank against and use that to benchmark yourself. The old DA was an old model on old data on an old, tiny index. The new one is based on this 4.7 trillion size index. It is much bigger. It is much fresher. It is much more accurate. You can see that in the correlations.

    7. Building link lists, tracking links that you want to acquire, and bulk metrics checking is now easy

    Building link lists, tracking links that you want to acquire, and bulk metrics checking, which we never had before and, in fact, not a lot of the other tools have this link tracking ability, is now available through possibly my favorite feature in the tool called Link Tracking Lists. If you’ve used Keyword Explorer and you’ve set up your keywords to watch those over time and to build a keyword research set, very, very similar. If you have links you want to acquire, you add them to this list. If you have links that you want to check on, you add them to this list. It will give you all the metrics, and it will tell you: Does this link to your website that you can associate with a list, or does it not? Or does it link to some page on the domain, but maybe not exactly the page that you want? It will tell that too. Pretty cool.

    8. Link distribution by DA

    Finally, we do now have link distribution by DA. You can find that right on the Overview page at the bottom.

    Look, I’m not saying Link Explorer is the absolute perfect, best product out there, but it’s really, really damn good. I’m incredibly proud of the team. I’m very proud to have this product out there.

    If you’d like, I’ll be writing some more about how we went about building this product and a bunch of agency folks that we spent time with to develop this, and I would like to thank all of them of course. A huge thank you to the Moz team.

    I hope you’ll do me a favor. Check out Link Explorer. I think, very frankly, this team has earned 30 seconds of your time to go check it out.

    Try out Link Explorer!

    All right. Thanks, everyone. We’ll see you again for another edition of Whiteboard Friday. Take care.

    Video transcription by Speechpad.com



    Moz Blog


    Google Confirms Chrome Usage Data Used to Measure Site Speed

    Posted by Tom-Anthony

    During a discussion with Google’s John Mueller at SMX Munich in March, he told me an interesting bit of data about how Google evaluates site speed nowadays. It has gotten a bit of interest from people when I mentioned it at SearchLove San Diego the week after, so I followed up with John to clarify my understanding.

    The short version is that Google is now using performance data aggregated from Chrome users who have opted in as a datapoint in the evaluation of site speed (and as a signal with regards to rankings). This is a positive move (IMHO) as it means we don’t need to treat optimizing site speed for Google as a separate task from optimizing for users.

    Previously, it has not been clear how Google evaluates site speed, and it was generally believed to be measured by Googlebot during its visits — a belief enhanced by the presence of speed charts in Search Console. However, the onset of JavaScript-enabled crawling made it less clear what Google is doing — they obviously want the most realistic data possible, but it’s a hard problem to solve. Googlebot is not built to replicate how actual visitors experience a site, and so as the task of crawling became more complex, it makes sense that Googlebot may not be the best mechanism for this (if it ever was the mechanism).

    In this post, I want to recap the pertinent data around this news quickly and try to understand what this may mean for users.

    Google Search Console

    Firstly, we should clarify our understanding of what the “time spent downloading a page” metric in Google Search Console is telling us. Most of us will recognize graphs like this one:

    Until recently, I was unclear about exactly what this graph was telling me. But handily, John Mueller comes to the rescue again with a detailed answer [login required] (hat tip to James Baddiley from Chillisauce.com for bringing this to my attention):

    John clarified what this graph is showing:

    It’s technically not “downloading the page” but rather “receiving data in response to requesting a URL” – it’s not based on rendering the page, it includes all requests made.

    And that it is:

    this is the average over all requests for that day

    Because Google may be fetching a very different set of resources every day when it’s crawling your site, and because this graph does not account for anything to do with page rendering, it is not useful as a measure of the real performance of your site.

    For that reason, John points out that:

    Focusing blindly on that number doesn’t make sense.

    With which I quite agree. The graph can be useful for identifying certain classes of backend issues, but there are also probably better ways for you to do that (e.g. WebPageTest.org, of which I’m a big fan).

    Okay, so now we understand that graph and what it represents, let’s look at the next option: the Google WRS.

    Googlebot & the Web Rendering Service

    Google’s WRS is their headless browser mechanism based on Chrome 41, which is used for things like “Fetch as Googlebot” in Search Console, and is increasingly what Googlebot is using when it crawls pages.

    However, we know that this isn’t how Google evaluates pages because of a Twitter conversation between Aymen Loukil and Google’s Gary Illyes. Aymen wrote up a blog post detailing it at the time, but the important takeaway was that Gary confirmed that WRS is not responsible for evaluating site speed:

    Twitter conversation with Gary Illyes

    At the time, Gary was unable to clarify what was being used to evaluate site performance (perhaps because the Chrome User Experience Report hadn’t been announced yet). It seems as though things have progressed since then, however. Google is now able to tell us a little more, which takes us on to the Chrome User Experience Report.

    Chrome User Experience Report

    Introduced in October last year, the Chrome User Experience Report “is a public dataset of key user experience metrics for top origins on the web,” whereby “performance data included in the report is from real-world conditions, aggregated from Chrome users who have opted-in to syncing their browsing history and have usage statistic reporting enabled.”

    Essentially, certain Chrome users allow their browser to report back load time metrics to Google. The report currently has a public dataset for the top 1 million+ origins, though I imagine they have data for many more domains than are included in the public data set.

    In March I was at SMX Munich (amazing conference!), where along with a small group of SEOs I had a chat with John Mueller. I asked John about how Google evaluates site speed, given that Gary had clarified it was not the WRS. John was kind enough to shed some light on the situation, but at that point, nothing was published anywhere.

    However, since then, John has confirmed this information in a Google Webmaster Central Hangout [15m30s, in German], where he explains they’re using this data along with some other data sources (he doesn’t say which, though notes that it is in part because the data set does not cover all domains).

    At SMX John also pointed out how Google’s PageSpeed Insights tool now includes data from the Chrome User Experience Report:

    The public dataset of performance data for the top million domains is also available in a public BigQuery project, if you’re into that sort of thing!
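    If you want to poke at the data yourself, here is a minimal sketch using the Python BigQuery client. The table naming (chrome-ux-report.all.YYYYMM) and the nested histogram fields follow the public project's documented schema, but treat the specifics as something to verify; example.com stands in for a real origin.

        # Minimal sketch: query the Chrome User Experience Report public dataset.
        # Assumes google-cloud-bigquery is installed and credentials are configured.
        from google.cloud import bigquery

        client = bigquery.Client()

        # First contentful paint distribution for one origin (January 2018 table).
        query = """
            SELECT
                bin.start AS fcp_ms,
                SUM(bin.density) AS share_of_loads
            FROM
                `chrome-ux-report.all.201801`,
                UNNEST(first_contentful_paint.histogram.bin) AS bin
            WHERE
                origin = 'https://www.example.com'
            GROUP BY fcp_ms
            ORDER BY fcp_ms
        """

        for row in client.query(query).result():
            print(row.fcp_ms, row.share_of_loads)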

    We can’t be sure what all the other factors Google is using are, but we now know they are certainly using this data. As I mentioned above, I also imagine they are using data on more sites than are perhaps provided in the public dataset, but this is not confirmed.

    Pay attention to users

    Importantly, this means that there are changes you can make to your site that Googlebot is not capable of detecting, which are still detected by Google and used as a ranking signal. For example, we know that Googlebot does not support HTTP/2 crawling, but now we know that Google will be able to detect the speed improvements you would get from deploying HTTP/2 for your users.

    The same is true if you were to use service workers for advanced caching behaviors — Googlebot wouldn’t be aware, but users would. There are certainly other such examples.

    Essentially, this means that there’s no longer a reason to worry about pagespeed for Googlebot, and you should instead just focus on improving things for your users. You still need to pay attention to Googlebot for crawling purposes, which is a separate task.

    If you are unsure where to look for site speed advice, then you should look at:

    That’s all for now! If you have questions, please comment here and I’ll do my best! Thanks!



    Moz Blog


    How Is Google’s New "Questions and Answers" Feature Being Used? [Case Study]

    Posted by MiriamEllis

    Ever since Google rolled out Questions and Answers in mid-2017, I’ve been trying to get a sense of its reception by consumers and brands. Initially restricted to Android Google Maps, this fascinating feature, which enables local business owners and the public to answer consumer questions, made it to desktop displays this past December, adding yet another data layer to knowledge panels and local finders.

    As someone who has worked in Q&A forums for the majority of my digital marketing life, I took an immediate shine to the idea of Google Questions and Answers. Here’s a chance, I thought, for consumers and brands to take meaningful communication to a whole new level, exchanging requests, advice, and help so effortlessly. Here’s an opportunity for businesses to place answers to FAQs right upfront in the SERPs, while also capturing new data about consumer needs and desires. So cool!

    But, so far, we seem to be getting off to a slow start. According to a recent, wide-scale GetFiveStars study, 25% of businesses now have questions waiting for them. I decided to home in on San Francisco and look at 20 busy industries in that city to find out not just how many questions were being asked, but also how many answers were being given, and who was doing the answering. I broke down responders into three groups: Local Guides (LGs), random users (RUs), and owners (Os). I looked at the top 10 businesses ranking in the local finder for each industry:

    Industry | Number of Questions | Number of Answers | LGs | RUs | Os
    Dentists | 1 | 0 | 0 | 0 | 0
    Plumbers | 2 | 0 | - | - | -
    Chiropractors | 0 | - | - | - | -
    Mexican Restaurants | 10 | 23 | 22 | 1 | -
    Italian Restaurants | 15 | 20 | 19 | 1 | -
    Chinese Restaurants | 16 | 53 | 49 | 4 | -
    Car Dealers | 4 | 5 | 3 | 2 | -
    Supermarkets | 7 | 27 | 24 | 3 | -
    Clothing Stores | 4 | 1 | 1 | - | -
    Florists | 1 | 0 | - | - | -
    Hotels | 44 | 142 | 114 | 28 | -
    Real Estate Agencies | 0 | - | - | - | -
    General Contractors | 1 | 0 | - | - | -
    Cell Phone Stores | 14 | 3 | 3 | - | -
    Yoga Studios | 1 | 0 | - | - | -
    Banks | 1 | 0 | - | - | -
    Carpet Cleaning | 0 | - | - | - | -
    Hair Salons | 1 | 0 | - | - | -
    Locksmiths | 1 | 0 | - | - | -
    Jewelry Stores | 0 | - | - | - | -


    Takeaways from the case study

    Here are some patterns and oddities I noticed from looking at 123 questions and 274 answers:

    1. There are more than twice as many answers as questions. While many questions received no answers, others received five, ten, or more.
    2. The Owners column is completely blank. The local businesses I looked at in San Francisco are investing zero effort in answering Google Questions and Answers.
    3. Local Guides are doing the majority of the answering. Of the 274 answers provided, 232 came from users who have been qualified as Local Guides by Google. Why so lopsided? I suspect the answer lies in the fact that Google sends alerts to this group of users when questions get asked, and that they can earn 3 points per answer they give. Acquiring enough points gets you perks like 3 free months of Google Play Music and a 75% discount off Google Play Movies.

      Unfortunately, what I’m seeing in Google Questions and Answers is that incentivizing replies is leading to a knowledge base of questionable quality. How helpful is it when a consumer asks a hotel if they have in-room hair dryers and 10 local guides jump on the bandwagon with “yep”? Worse yet, I saw quite a few local guides replying “I don’t know,” “maybe,” and even “you should call the business and ask.” Here and there, I saw genuinely helpful answers from the Local Guides, but my overall impression didn’t leave me feeling like I’d stumbled upon a new Google resource of matchless expertise.

    4. Some members of the public seem to be confused about the use of this feature. I noticed people using the answer portion to thank people who replied to their query, rather than simply using the thumbs up widget.

      Additionally, I saw people leaving reviews/statements, instead of questions:
      And with a touch of exasperated irony:
      And to rant:

    5. Some industries are clearly generating far more questions than others. Given how people love to talk about hotels and restaurants, I wasn’t surprised to see them topping the charts in sheer volume of questions and answers. What did surprise me was not seeing more questions being asked of businesses like yoga studios, florists, and hair salons; before I actually did the searches, I might have guessed that pleasant, “chatty” places like these would be receiving lots of queries.

    Big brands everywhere are leaving Google Questions and Answers unanswered

    I chose San Francisco for my case study because of its general reputation for being hip to new tech, but just in case my limited focus was presenting a false picture of how local businesses are managing this feature, I did some random searches for big brands around the state and around the country.

    I found questions lacking owner answers for Whole Foods, Sephora, Taco Bell, Macy’s, Denny’s, Cracker Barrel, Target, and T-Mobile. As I looked around the nation, I noted that Walmart has cumulatively garnered thousands of questions with no brand responses.

    But the hands-down winner for a single location lacking official answers is Google in Mountain View. 103 questions as of my lookup and nary an owner answer in sight. Alphabet might want to consider setting a more inspiring example with their own product… unless I’m misunderstanding their vision of how Google Questions and Answers is destined to be used.


    Just what is the vision for Google Questions and Answers, I wonder?

    As I said at the beginning of this post, it’s early days yet to predict ultimate outcomes. Yet, the current lay of the land for this feature has left me with more questions than answers:

    • Does Google actually intend questions to be answered by brands, or by the public? From what I’ve seen, owners are largely unaware of or choosing to ignore this feature many months post-launch. As of writing this, businesses are only alerted about incoming questions if they open the Google Maps app on an Android phone or tablet. There is no desktop GMB dashboard section for the feature. It’s not a recipe for wide adoption. Google has always been a fan of a crowdsourcing approach to their data, so they may not be concerned, but that doesn’t mean your business shouldn’t be.
    • What are the real-time expectations for this feature? I see many users asking questions that needed fast answers, like “are you open now?” while others might support lengthier response times, as in, “I’m planning a trip and want to know what I can walk to from your hotel.” For time-sensitive queries, how does Questions and Answers fit in with Google’s actual chat feature, Google Messaging, also rolled out last summer? Does Google envision different use cases for both features? I wonder if one of the two products will win out over time, while the other gets sunsetted.
    • What are the real, current risks to brands of non-management? I applauded Mike Blumenthal’s smart suggestion of companies proactively populating the feature with known FAQs and providing expert answers, and I can also see the obvious potential for reputation damage if rants or spam are ignored. That being said, my limited exploration of San Francisco has left me wondering just how many people (companies or consumers) are actually paying attention in most industries. Google Knowledge Panels and the Local Finder pop-ups are nearing an information bloat point. Do you want to book something, look at reviews, live chat, see menus, find deals, get driving directions, make a call? Websites are built with multiple pages to cover all of these possible actions. Sticking them all in a 1” box may not equal the best UX I’ve ever seen, if discovery of features is our goal.
    • What is the motivation for consumers to use the product? Personally, I’d be more inclined to just pick up the phone to ask any question to which I need a fast answer. I don’t have the confidence that if I queried Whole Foods in the AM as to whether they’ve gotten in organic avocados from California, there’d be a knowledge panel answer in time for my lunch. Further, some of the questions I’ve asked have received useless answers from the public, which seems like a waste of time for all parties. Maybe if the feature picks up momentum, this will change.
    • Will increasing rates of questions = increasing rates of business responses? According to the GetFiveStars study linked to above, the total number of questions for the 1,700 locations they investigated nearly doubled between November and December of 2017. From my microscopic view of San Francisco, it doesn’t appear that the doubling effect also happened for owner answers. Time will tell, but for now, what I’m looking for is question volume reaching such a boiling point that owners feel obligated to jump into management, as they have with reviews. We’re not there yet, but if this feature is a Google keeper, we could get there.

    So what should you be doing about Google Questions and Answers?

    I’m a fan of early adoption where it makes sense. Speculatively, having an active Questions and Answers presence could end up as a ranking signal. We’ve already seen it theorized that use of another Google asset, Google Posts, may impact local pack rankings. Unquestionably, leaving it up to the public to answer questions about your business with varying degrees of accuracy carries the risk of losing leads and muddying your online presence to the detriment of reputation. If a customer asks if your location has wheelchair access and an unmotivated third party says “I don’t know,” when, in fact, your business is fully ADA-compliant, your lack of an answer becomes negative customer service. Because of this, ignoring the feature isn’t really an option. And, while I wouldn’t prioritize management of Questions and Answers over traditional Google-based reviews at this point, I would suggest:

    1. Do a branded search today and look at your knowledge panel to see if you’ve received any questions. If so, answer them in your best style, as helpfully as possible.
    2. Spend half an hour this week translating your company’s 5 most common FAQs into Google Questions and Answers queries and then answering them. Be sure you’re logged into your company’s Google account when you reply, so that your message will be officially stamped with the word “owner.” Whether you post the questions themselves while logged into your business’s account is up to you; I think it’s more transparent to do so.
    3. If you’re finding this part of your Knowledge Panel isn’t getting any questions, checking it once a week is likely going to be enough for the present.
    4. If you happen to be marketing a business that is seeing some good Questions and Answers activity, and you have the bandwidth, I’d add checking this to the daily social media rounds you make for the purpose of reputation management. I would predict that if Google determines this feature is a keeper, they’ll eventually start sending email alerts when new queries come in, as they’re now doing with reviews, which should make things easier and minimize the risk of losing a customer with an immediate need. Need to go pro on management right now due to question volume? GetFiveStars just launched an incredibly useful Google Q&A monitoring feature, included in some of their ORM software packages. Looks like a winner!
    5. Do be on the lookout for spam inquiries and responses, and report them if they arise.

    If you’re totally new to Google Questions and Answers, this simple infographic will get you going in a flash:

    For further tips on using Google Questions and Answers like a pro, I recommend following GetFiveStars’ 3-part series on this topic.


    My questions, your answers

    My case study is small. Can you help expand our industry’s knowledge base by answering a few questions in the comments to add to the picture of the current rate of adoption/usefulness of Google’s Questions and Answers? Please, let me know:

    1. Have you asked a question using this feature?
    2. Did you receive an answer and was it helpful?
    3. Who answered? The business, a random user, a Local Guide?
    4. Have you come across any examples of business owners doing a good job answering questions?
    5. What are your thoughts on Google Questions and Answers? Is it a winner? Worth your time? Any tips?



    Moz Blog


    New gTLDs are Like Used Cars

    There may be a couple of exceptions that prove the rule, but new TLDs are generally an awful investment for everyone except the registry operator.

    Here is the short version…

    And the long version…

    Diminishing Returns

    About half a decade ago I wrote about how Google devalued domain names from an SEO perspective. Since then, a number of leading “category killer” domains have been recycled in an endless water cycle: from startup, to acquisition, to shutdown, to PPC park page, to “buy now for this once-in-a-lifetime opportunity.”

    The central web platforms are becoming ad heavy, which in turn decreases the reach of anything which is not an advertisement. For the most valuable concepts / markets / keywords, ads eat up the entire first screenful of results. Key markets like hotels might get a second round of vertical ads to further displace the concept of organic results.

    Proprietary, Closed-Ecosystem Roach Motels

    The tech monopolies can only make so much money by stuffing ads onto their own platform. To keep increasing their take they need to increase the types, varieties & formats of media they host and control & keep the attention on their platform.

    Both Google & Facebook are promoting scams where they feed on desperate publishers & suck a copy of the publisher’s content into being hosted by the tech monopoly platform du jour & sprinkle a share of the revenues back to the content sources.

    They may even pay a bit upfront for new content formats, but then after the market is primed the deal shifts to where (once again) almost nobody other than the tech monopoly platform wins.

    The attempt to “own” the web & never let users go is so extreme both companies will make up bogus statistics to promote their proprietary / fake open / actually closed standards.

    If you ignore how Google’s AMP double, triple, or quadruple counts visitors in Google Analytics, the visit numbers look appealing.

    But the flip side of those fake metrics is that actual revenues do not flow.

    Facebook has the same sort of issues, frequently needing to restate various metrics while partners fly blind.

    These companies are restructuring society & the race to the bottom to try to make the numbers work in an increasingly unstable & parasitic set of platform choices is destroying adjacent markets:

    Have you tried Angry Birds lately? It’s a swamp of dark patterns. All extractive logic meant to trick you into another in-app payment. It’s the perfect example of what happens when product managers have to squeeze ever-more-growth out of ever-less-fertile lands to hit their targets year after year. … back to the incentives. It’s not just those infused by venture capital timelines and return requirements, but also the likes of tax incentives favoring capital gains over income. … that’s the truly insidious part of the tech lords solution to everything. This fantasy that they will be greeted as liberators. When the new boss is really a lot like the old boss, except the big stick is replaced with the big algorithm. Depersonalizing all punishment but doling it out just the same. … this new world order is being driven by a tiny cabal of monopolies. So commercial dissent is near impossible. … competition is for the little people. Pitting one individual contractor against another in a race to the bottom. Hoarding all the bargaining power at the top. Disparaging any attempts against those at the bottom to organize with unions or otherwise.

    To be a success on the attention platforms you have to push toward the edges. But as you become successful you become a target.

    And the dehumanized “algorithm” is not above politics & public relations.

    Pewdiepie is the biggest success story on the YouTube platform. When he made a video showing some of the absurd aspects of Fiverr, it led to a WSJ investigation which “uncovered” a pattern of anti-Semitism. And yet one of the reporters who worked on that story wrote far more offensive and anti-Semitic tweets. The hypocrisy of the hit job didn’t matter. They were still able to go after Pewdiepie’s ad relationships to cut him off from Disney’s Maker Studios & the premium tier of YouTube ads.

    The fact that he is an individual with broad reach means he’ll still be fine economically, but many other publishers would quickly end up in a death spiral from the above sequence.

    If it can happen to a leading player in a closed ecosystem then the risk to smaller players is even greater.

    In some emerging markets Facebook effectively *is* the Internet.

    The Decline of Exact Match Domains

    Domains have been so devalued (from an SEO perspective) that some names like PaydayLoans.net sell for about $3,000 at auction.

    $3,000 can sound like a lot to someone with no money, but names like that were going for 6 figures at their peak.

    Professional domain sellers participate in the domain auctions on sites like NameJet & SnapNames. Big keywords like [payday loans] in core trusted extensions are not missed. So if the 98% decline in price were an anomaly, at least one of them would have bid more in that auction.

    Why did exact match domains fall so hard? In part because Google shifted from scoring the web based on links to considering things like brand awareness in rankings. And it is very hard to run a large brand-oriented ad campaign promoting a generically descriptive domain name. Sure there are a few exceptions like Cars.com & Hotels.com, but if you watch much TV you’ll see a lot more ads associated with businesses that are not built on generically descriptive domain names.

    Not all domains have fallen quite that hard in price, but the more into the tail you go the less the domain acts as a memorable differentiator. If the barrier to entry increases, then spending a lot on a domain name as part of a go-to-market strategy makes less sense.

    Brandable Names Also Lost Value

    Arguably EMDs have lost more value than brandable domain names, but even brandable names have slid sharply.

    If you go back a decade or two, tech startups would secure their name (say Snap.com or Monster.com or such) & then try to build a business on it.

    But in the current marketplace, with many paths to market, some startups don’t even have a domain name at launch; they begin as iPhone or Android apps.

    Now people try to create success on a good-enough but cheap domain name & then, as success comes, they buy a better domain name.

    Jelly was recently acquired by Pinterest. Rather than buying jelly.com they were still using AskJelly.com for their core site & Jelly.co for their blog.

    As long as domain redirects work, there’s no reason to spend heavily on a domain name for a highly speculative new project.

    Rather than spending 6 figures on a domain name & then seeing if there is market fit, it is far more common to launch a site on something like getapp.com, joinapp.com, app.io, app.co, businessnameapp.com, etc.

    This in turn means that rather than 10,000s of startups all chasing their core .com domain name off the start, people test whatever is good enough & priced close to $10. Then only after they are successful do they try to upgrade to better, more memorable & far more expensive domain names.

    Money isn’t spent on the domain names until the project has already shown market fit.

    One in a thousand startups spending $1 million works out to an expected $1,000 per startup, far less than one in three startups spending $100,000, which averages about $33,333 each.

    New TLDs Undifferentiated, Risky & Overpriced

    No Actual Marketing Being Done

    Some of the new TLD registries talk up investing in marketing & differentiation for their strings, but very few of them are doing much on the marketing front.

    You may see their banner ads on domainer blogs & they may even pay for placement with some of the registrars, but there isn’t much going on in terms of cultivating a stable ecosystem.

    When Google or Facebook try to enter & dominate a new vertical, the end destination may be extractive rent seeking by a monopoly BUT off the start they are at least willing to shoulder some of the risk & cost upfront to try to build awareness.

    Where are the domain registries who have built successful new businesses on some of their new TLDs? Where are the subsidies offered to key talent to help drive awareness & promote the new strings?

    As far as I know, none of that stuff exists.

    In fact, what is prevalent is the exact opposite.

    Greed-Based Anti-Marketing

    So many of them are short-sighted, greed-based plays that they do the exact opposite of building an ecosystem … they hold back any domain which potentially might not be complete garbage so they can juice it for a premium ask price in the 10s of thousands of dollars.

    While searching on GoDaddy Auctions for a client project I have seen new TLDs like .link listed for sale for MORE THAN the asking price of similar .org names.

    If those prices had any sort of legitimate foundation then the person asking $30,000 for a .link would have bulk bought all the equivalent .net and .org names which are listed for cheaper prices.

    But the prices are based on fantasy & almost nobody is dumb enough to pay those sorts of prices.

    Anyone dumb enough to pay that would be better off buying their own registry rather than a single name.

    The holding back of names is the exact opposite of savvy marketing investment. It means there’s no reason to use the new TLD if you either have to pay through the nose or use a really crappy name nobody will remember.

    I didn’t buy more than 15 of Uniregistry’s domains because all names were reserved in the first place and I didn’t feel like buying 2nd tier domains … Domainers were angry when the first 2 Uniregistry’s New gTLDs (.sexy and .tattoo) came out and all remotely good names were reserved despite Frank saying that Uniregistry would not reserve any domains.

    Who defeats the race to the bottom aspects of the web by starting off from a “we only sell shit” standpoint?

    Nobody.

    And that’s why these new TLDs are a zero.

    Defaults Have Value

    Many online verticals are driven by winner take most monopoly economics. There’s a clear dominant leader in each of these core markets: social, search, short-form video, long-form video, retail, auctions, real estate, job search, classifieds, etc. Some other core markets have consolidated down to 3 or 4 core players who among them own about 50 different brands that attack different parts of the market.

    Almost all the category leading businesses which dominate aggregate usage are on .com domains.

    Contrast the lack of marketing for new TLDs with all the marketing one sees for the .com domain name.

    Local country code domain names & .com are not going anywhere. And both .org and .net are widely used & unlikely to face extreme price increases.

    Hosing The Masses…

    A decade ago domainers were frustrated Verisign increased the price of .com domains in ~5% increments:

    Every mom, every pop, every company that holds a domain name had no say in the matter. ICANN basically said to Verisign: “We agree to let you hose the masses if you stop suing us”.

    I don’t necessarily mind paying more for domains so much as I mind the money going to a monopolistic regulator which has historically had little regard for the registrants/registrars it should be serving

    Those 5% or 10% shifts were considered “hosing the masses.”

    Imagine what sort of blowback PIR would get from influential charities if they tried to increase the price of .org domains 30-fold overnight. It would be such a public relations disaster it would never be considered.

    Domain registries are not particularly expensive to run. A person who has a number of them can run each of them for less than the cost of a full-time employee – say $25,000 to $50,000 per year.

    And yet, the very people who complained that Verisign’s comparatively benign price increases were monopolistic abuses & rent extraction are now pushing massive price hikes:

    .Hosting and .juegos are going up from about $10-$20 retail to about $300. Other domains will also see price increases.

    Here’s the thing with new TLD pricing: registry operators can increase prices as much as they want with just six months’ notice.

    in its applications, Uniregistry said it planned to enter into a contractual agreement to not increase its prices for five years.

    Why would anyone want to build a commercial enterprise (or anything they care about) on such a shoddy foundation?

    If a person promises…

    • no holdbacks of premium domains, then reserves 10s of thousands of domains
    • no price hikes for 5 years, then hikes prices
    • the eventual price hikes being in line with inflation, then hikes prices 3,000%

    That’s 3 strikes and the batter is out.

    Doing the Math

    The claim that new TLDs need more revenues to exist is untrue. Running an extension costs maybe $50,000 per year; at a typical $10 retail price, roughly 5,000 registrations would cover that. If a registry operator wanted to build a vibrant & stable ecosystem the first step would be dumping the concept of premium domains to encourage wide usage & adoption.

    There are hundreds of these new TLD extensions and almost none of them can be trusted to be a wise investment when compared against similar names in established extensions like .com, .net, .org & ccTLDs like .co.uk or .fr.

    There’s no renewal price protection & no need to take on that risk, especially as prices on the core TLDs have come down sharply.

    Domain Pricing Trends

    Aggregate stats are somewhat hard to come by as many deals are not reported publicly & many sites which aggregate sales data also list minimum prices.

    However, domains have lost value for many reasons:

    • declining SEO-related value due to the search results becoming overrun with ads (Google keeps increasing their ad clicks 20% to 30% year over year)
    • broad market consolidation in key markets like travel, ecommerce, search & social
      • Google & Facebook are eating OVER 100% of online advertising growth – the rest of the industry is shrinking in aggregate
      • are there any major news sites which haven’t struggled to monetize mobile?
      • there is a reason there are few great indie blogs compared to a decade ago
    • rising technical costs in implementing independent websites (responsive design, HTTPS, AMP, etc.) “Closed platforms increase the chunk size of competition & increase the cost of market entry, so people who have good ideas, it is a lot more expensive for their productivity to be monetized. They also don’t like standardization … it looks like rent seeking behaviors on top of friction” – Gabe Newell
    • harder to break into markets with brand-biased relevancy algorithms (increased chunk size of competition)
    • less value in trying to build a brand on a generic name, which struggles to rank in a landscape of brand-biased algorithms (inability to differentiate while being generically descriptive)
    • decline in PPC park page ad revenues
      • for many years Yahoo! hid the deterioration in their core business by relying heavily on partners for ad click volumes, but after they switched to leveraging Bing search, Microsoft was far more interested in click quality than click quantity
      • absent the competitive bid from Yahoo!, Google drastically reduced partner payouts
      • most web browsers have replaced web address bars with dual-function search boxes, drastically reducing direct navigation traffic

    All the above are the mechanics of “why” prices have been dropping, but it is also worth noting many of the leading portfolios have been sold.

    If the domain aftermarket is as vibrant as some people claim, there’s no way the Marchex portfolio of 200,000+ domains would have sold for only $28.1 million a couple of years ago.

    RegistrarStats shows .com registrations have stopped growing & other extensions like .net, .org, .biz & .info are now shrinking.

    Both aftermarket domain prices & the pool of registered domains on established gTLDs are dropping.

    I know I’ve dropped hundreds & hundreds of domains over the past year. That might be due to my cynical views of the market, but I did hold many names for a decade or more.

    As the barrier to entry increases, many of the legacy domains which could have one day been worth developing have lost much of their value.

    And the picked-over new TLDs are an even worse investment due to the near-infinite downside potential of price hikes, registries outright folding, etc.

    In the face of declining value there is a rush of oversupply WITH irrational above-market pricing. And then the registries which spend next to nothing on marketing can’t understand why their great new namespaces went nowhere.

    As much as I cringe at .biz & .info, I’d prefer either of them over just about any new TLD.

    Any baggage they may carry is less than the risk of going with an unproven new extension without any protections whatsoever.

    Losing Faith in the Zimbabwe Dollar

    Who really loses? Anyone who read what these domain registry operators wrote & trusted them.

    Uniregistry does not believe that registry fees should rise when the costs of other technology services have uniformly trended downward, simply because a registry operator believes it can extract higher profit from its base of registrants.

    How does one justify a 3,000% price hike after stating “Our prices are fixed and only indexed to inflation after 5 years”?

    Are they pricing these names in Zimbabwe Dollars? Or did they just change their minds in a way that hurt anyone who trusted them & invested in their ecosystem?

    Frank Schilling warned about the dangers of lifting price controls:

    The combination of “presumptive renewal” and the “lifting of price controls on registry services” is incredibly dangerous.
    Imagine buying a home, taking on a large mortgage, remodeling, moving in, only to be informed 6 months later that your property taxes will go up 10,000% with no better services offered by local government. The government doesn’t care if you can’t pay your tax/mortgage because they don’t really want you to pay your tax… they want you to abandon your home so they can take your property and resell it to a higher payer for more money, pocketing the difference themselves, leaving you with nothing.

    This agreement as written leaves the door open to exactly that type of scenario

    He didn’t believe the practice to be poor.

    Rather, he felt he would have been made poorer, unless he was the person doing it:

    It would be the mother of all Internet tragedies and a crippling blow to ICANN’s relevance if millions of pioneering registrants were taxed out of their internet homes as a result of the greed of one registry and the benign neglect, apathy or tacit support of its master.

    It is a highly nuanced position.


    SEO Book



    New take on Showcase Shopping ads? Categories of used items showing for retailer outlet queries

    Similar to the Showcase ad format introduced this summer, the ads link to Google Shopping pages.

    The post New take on Showcase Shopping ads? Categories of used items showing for retailer outlet queries appeared first on Search Engine Land.



    Please visit Search Engine Land for the full article.


    Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing



    Case Study: How Two Artists Used Online Content to Build their Face to Face Business

    Content Marketing Case Studies | copyblogger.com

    Colorado-based artists Lori Wostl and Lorri Flint noticed that when they attended huge art retreats, the experience was more stressful than relaxing.

    So, they founded their business — Art Camp for Women — in order to provide a fun, supportive, relaxing camp adventure for their participants.

    They started a blog in order to help them market their camps, and got a sweet surprise a few years later when a huge national magazine called to offer them some amazing exposure.

    Let’s talk to Lorri and Lori to find out more about their business, and how content marketing helps them reach their professional goals.

    What’s your business?

    We are Lori Wostl and Lorri Flint of Art Camp for Women. We run mixed-media art retreats for women, and we write online about creativity, art techniques, and mixed-media artists.

    Who are your customers and readers, and how do you serve them?

    We have a small but growing niche of women who are interested in mixed-media art (and in attending an art retreat).

    Our “campers” are women generally over 40 who have time and money to spend on themselves.

    We’ve got women in our community from all walks of life — we have career women, retired women, empty-nesters, women who have recently recovered from cancer, women who are widowed or divorced … and everything in between!

    Was there a pressing problem you were trying to solve?

    Our business, Art Camp for Women, started because we wanted to provide an intimate retreat environment and a relaxing experience for our campers. ACFW is all-inclusive, and we provide chef-prepared meals, comfortable and cozy lodging, leading-edge art instruction, art supplies, and daily wine and chocolate.

    We provide art retreats on a scale that allows our campers to meet and relate to everyone at the camp — including well-known artist-teachers.

    We personally have attended “Big Box” retreats in the past with hundreds of attendees. Although the art instruction we received was amazing, our overall experience wasn’t great.

    Unless you went with a friend, you were completely on your own in the evenings — often in a hotel in a strange city, without a car. We also had to bring all our own art supplies, which meant lugging incredibly heavy bags through security and onto the airplane.

    We also had to arrange and pay for our own meals, which meant eating unhealthy and expensive food in our hotel.

    Overall, the experience was exhausting. We are fit and healthy women, and we had to come home and rest up after going on our “retreats.”

    So we started ACFW because we wanted to provide retreats where women could expand their art, be inspired by their surroundings, meet women from all over North America, and be rejuvenated — not become masters of the logistics of travel, lodging, food, and art supplies.

    What kinds of online content are most important to your business?

    On the Art Camp for Women blog, we publish art journaling prompts, free tutorials, interviews, organizing tips, and mixed-media art projects.

    We use Pinterest to pin artwork, organizing tips for artists, architecture, travel, art exhibits, and photos from our Art Camps.

    We use our Facebook page to direct people to our blog posts, and share links and resources from other writers and artists.  

    We also run an email newsletter that features our blog content, regular contests for our community, and special offers.

    In the last few years, we have joined a lot of different art groups online, trying to get the name of Art Camp for Women out into various communities. We also read and comment on lots of different blogs in the art community, which has helped us build relationships and market our camps.

    What resources or tools did you find most helpful when you were getting started?

    We took some classes on blogging and WordPress right at the beginning, and we knew that content marketing would be an important part of our marketing strategy.

    We also took some business-building classes with our local Chamber of Commerce.  

    This past fall, we signed up for Danny Iny’s Guest Posting course. By putting a lot of work into guest posts, we have seen a spike in our web traffic, and increased our mailing list by more than 200%.

    Were you always a business owner, or did you have a more traditional career before you started this business?

    We had both worked in the corporate world as executives and trainers, and we had each had our own (different) businesses before we started ACFW. The traditional careers were fine at the time, but there’s no going back for us at this point. We like to make our own decisions and change direction quickly if we need to — flexibility is a top priority for us.

    What were some of your tipping points or “a-ha!” moments? How did they come about?

    In the fall of 2011, we were working at our computers when we received a life-changing phone call. It was the photography editor from Oprah Magazine. They were doing a feature story about self-expression, and wanted to include our Art Camps in the story. Oprah has 3.8 million readers a month, so we were thrilled.

    Three months after that call, Art Camp for Women was the first item listed in the February 2012 cover story, “Express Yourself! You from A to Z.” 

    Our web traffic went from fewer than 100 hits a month (and sometimes far less) to more than 500 hits a day. In the first two weeks after the magazine was released, we had 4,000 hits on our site, and the average length of a visit on our site was over three minutes.

    Oprah magazine’s editors called us because we had a great website, and because we were findable in the search engines. And when they called, we were ready.

    Since then, we’ve seen a lot more diversity in the women interested in coming to Art Camp. We’ve also had better teachers and artists interested in working with us. The experience also raised our confidence quite a bit — we really felt like we were playing on a whole different level.

    Our only regret is that it would have been great to have some professional photographs ready — we sent the O editors the photos we had, but they didn’t use them.

    What does your business look like today, and what’s next for you?

    We have always been a business that operates in the black, and we have no company debt.  

    In the fall of 2012, we organized an online campaign for 2013 to both acquaint us with possible Art Camp teachers, and to grow our mailing list.

    Our biggest business goal is to keep increasing the number of fully-attended Art Camps we run each year. We’re also expanding our camp locations, and we’ll be doing camps in the tropics and in Europe.

    We’re focusing on building our blog audience and our email mailing list.

    Personally, we want occupations that contribute to the demographic of our choice (women artists and art lovers), with a comfortable income and flexible working hours. We also have a huge commitment to having fun while always learning something new.

    What advice would you give to bloggers and content creators who are trying to build an online audience?

    Build a viable mailing list and use it.

    Follow up — stay in regular and timely contact with your list.

    Always say yes to an opportunity and then figure out how to do it.

    Don’t be afraid to give away tutorial information and actual (physical) gifts. It is a low-cost way to build your mailing list and grow your following.

    Be willing to drop something that doesn’t work — even if it’s your favorite part.

    After every event, or at regularly scheduled times, evaluate what worked and what didn’t. Make sure to do your evaluation in terms of dollars — not just emotions.

    Make your photographs as professional as possible. You never know when Oprah may come calling!

    About the Author: Beth Hayden is a Senior Staff Writer for Copyblogger Media. Get more from Beth on Twitter and Pinterest.


    Copyblogger


    Mozscape in the Wild: How The API is (and Could be) Used

    Posted by Ryan_Watson

    Did you know that there are over 90 billion URLs packed into our Mozscape API? That’s a lot of links. So many links, in fact, that it can be daunting to dream up all of the many ways that you could put those links to good use. When we originally built Linkscape (the predecessor to Mozscape), we mainly had one thing in mind… SEO and backlinks.

    But there’s a whole lot more than that.

    Links are only the beginning; it’s what those links can tell us that’s so darn interesting. Which is why I wanted to call out all of the amazing ways that developers (and marketers) are using Mozscape data to better their work, as well as encourage new uses of Mozscape data that have yet to be explored. (Feel free to jump in and create your own API key any time.)

    How Mozscape is Being Used Today

    Mozscape's wealth of links can be used in a variety of ways: from SEO audits, to domain valuations, to Excel integration. Here at Moz, we have only begun to scratch the surface of how we can utilize the API. We currently use it to run some of our own tools such as Open Site Explorer and the MozBar.

    But I don't want to focus on the way we use it. Let's take a look at the way other developers have demonstrated some exciting uses for Mozscape. Hopefully these will get your mind going, thinking up other ways to use the data as well.

    SEO Audits

    We’ll start with the most obvious of use cases, SEO audits. There are quite a few examples of SEO audit tools that use Mozscape data, but a few of our favorites (that are in front of a paywall) are the HubSpot Website Grader and The Found SEO Audit Tool, both of which bring the heat.

    Mozscape data is what powers things like the total pages indexed by search, MozRank, and a list of the most authoritative pages along with their corresponding anchor texts. The beauty of this use case is that it can provide a great lead-gen funnel for all of the SEO agencies out there, proving value up front with an email address required prior to running the report. As a digital marketing agency, using Mozscape data to develop a site audit is a great way to get users into your sales funnel. You know, that inbound marketing stuff — cold calls are old news.
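
    To make this concrete, here is a minimal sketch of pulling a couple of URL metrics out of Mozscape with plain Python. The endpoint path, the HMAC-SHA1 signed-authentication scheme, the Cols bit-flag values, and the response keys are assumptions based on my reading of the Mozscape docs, so verify them against the current API documentation before building on this:

    import base64
    import hashlib
    import hmac
    import json
    import time
    import urllib.parse
    import urllib.request

    # Illustrative Cols bit flags -- check these against the Mozscape docs.
    COL_PAGE_AUTHORITY = 34359738368
    COL_DOMAIN_AUTHORITY = 68719476736

    def url_metrics(target_url, access_id, secret_key):
        """Fetch Page Authority & Domain Authority for target_url."""
        expires = int(time.time()) + 300  # signature stays valid for 5 minutes
        to_sign = "%s\n%d" % (access_id, expires)
        signature = base64.b64encode(
            hmac.new(secret_key.encode(), to_sign.encode(), hashlib.sha1).digest()
        ).decode()
        query = urllib.parse.urlencode({
            "Cols": COL_PAGE_AUTHORITY + COL_DOMAIN_AUTHORITY,
            "AccessID": access_id,
            "Expires": expires,
            "Signature": signature,
        })
        endpoint = "http://lsapi.seomoz.com/linkscape/url-metrics/%s?%s" % (
            urllib.parse.quote_plus(target_url), query)
        with urllib.request.urlopen(endpoint) as response:
            # Keys like "upa"/"pda" are my recollection of the response format.
            return json.loads(response.read())

    # Usage sketch (credentials are placeholders):
    # print(url_metrics("moz.com", "member-xxxxxxxx", "your-secret-key"))

    Wrap a signup form and a report template around a call like that, and you have the skeleton of the lead-gen audit tools described above.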

    Domain Valuation

    How valuable is a website, purely from an online authority perspective? Traditionally, that was a very tough question. You could look at things like site traffic (which typically isn’t very accurate) or rankings for certain terms, but that’s an indirect approach to the question. Think about using the metrics behind Mozscape, like MozRank, Domain Authority, and MozTrust instead. Flippa, for example, uses Mozscape data as a datapoint for due diligence.

    You could imagine this kind of domain valuation anywhere else domains are bought or sold, most of which have yet to use Mozscape data. The value, of course, is providing as much confidence to the buyers of web properties based on the “web footprint” of the site.

    Spreadsheet Kung-Fu

    The spreadsheet kung-fu of this industry is unmatched anywhere else. With the integration of Mozscape data into Excel, some have been able to make Excel sing. The beauty of using Excel for analyzing Mozscape data is that you can slice and dice as you please, without setting up complex API calls. Perhaps our favorite Excel example comes from the illustrious Richard Baxter, with the Links API Extension from SEO Gadget.

    However, if Google Docs are more up your alley, the amazing Aleyda Solis created just the thing for you (so did Chris Lee). Tools like these allow the average marketer to dig into the firehose of data available through the API in a simple and recognizable interface.

    Client Reporting

    Yes, that's right. iAcquire uses the data when creating client reports, as it not only helps them inform the client about how their pages are doing, but also shows the importance of certain pages on their site. The data is both a research tool and an education tool.

    "Below is a screenshot from a ranking research report showing data we gathered for the keyword 'inbound marketing tips.' Moz stats are represented throughout the stats columns. As we work with these reports we are able to see if any of our content distribution efforts resulted in links on page or domain as can be seen in the far left columns."

    iAcquire ranking research report

    How Mozscape Could Be Used

    That’s how Mozscape is being used today, but it’s only the tip of the iceberg. A few folks have realized the potential outside of the traditional use cases that I’ve mentioned above. The power of the data comes when we take Mozscape data outside of its traditional context of pure link evaluation. Let me show you what I mean.

    Link Building

    It's relatively easy to imagine Mozscape's data being used for link building. With Mozscape's massive amount of link data, SEOs are able to prioritize their link building efforts and focus on the highest-value opportunities.

    CRM

    You could imagine that some of the examples noted above have been used for link building, but what about a deeper integration into a contact manager? Something that would allow the user to prioritize outreach by the value of a domain.

    Just as one can do with the Klout score (or Social Authority) on Twitter, the same can be done for customer relationship efforts in filtering Domain Authority to determine importance.
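
    As a sketch of what that could look like, assuming the Domain Authority scores have already been fetched (say, with a url-metrics call like the one sketched earlier), and using entirely hypothetical contact records:

    # Hypothetical contact records; a real integration would pull these
    # from the contact manager's API or database.
    contacts = [
        {"name": "Alice", "domain": "example-blog.com", "domain_authority": 44},
        {"name": "Bob", "domain": "bigpublisher.com", "domain_authority": 71},
        {"name": "Carol", "domain": "tinysite.net", "domain_authority": 18},
    ]

    # Work the outreach queue from the most authoritative domain down.
    for contact in sorted(contacts, key=lambda c: c["domain_authority"], reverse=True):
        print(contact["name"], contact["domain"], contact["domain_authority"])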

    Top Lists

    We’ve seen hints of blogs using Mozscape data to determine a top startup list, like the GeekWire 200, but the same could be applied for any rankings list of web properties.

    Traditionally, lists have used Alexa or Compete traffic data to determine web prominence, but both are notoriously inaccurate. Other lists have used social-specific metrics like follower counts, but those too fall short. GeekWire’s list of the top 200 startups in Seattle uses a blend of both social and web data (external links, MozTrust) to determine just how influential a site is, providing the full picture.
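
    If you wanted to experiment with a blended ranking of your own, it might look something like the sketch below. The weights and the log scaling are illustrative guesses to tune against your own data, not GeekWire's actual formula:

    import math

    def influence_score(external_links, moztrust, twitter_followers):
        # Log-scale the raw counts so one huge site doesn't swamp the list,
        # then blend web & social signals into a single sortable number.
        web = 0.4 * math.log10(external_links + 1) + 0.3 * moztrust
        social = 0.3 * math.log10(twitter_followers + 1)
        return web + social

    # Example: 50,000 external links, MozTrust 6.2, 12,000 Twitter followers.
    print(round(influence_score(50000, 6.2, 12000), 2))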

    How Could You Use the API?

    I’m sure we've missed a ton of ideas, so we’re calling on you to help us find those new opportunities for Mozscape. Things like a tightening relationship between links and social networks, and categorizing link sources. How would you use this data, and how would you build it? Better yet, why not create your key and get going? 

    We want to make it easy for you.

    We've been working quite hard to make our indexes faster and have recently updated our Mozscape API documentation. We want to make it as simple as possible for you to use the data and get your idea up and running.

    Plus, if you create something, it's likely we'll get you added to our app gallery. We have everything from large corporations to individuals who have used the API and we show off their work in the gallery.

    We'd love to hear from you. Obviously we always encourage folks to jump in and check out the free API (as well as the paid), and use the data for something useful for you. We're also quite open to hearing about ways we can improve our own tools with the data or help educate people better. I look forward to reading through your feedback and seeing if there are ways we can help get people started using Mozscape.



    SEOmoz Daily SEO Blog

