Tag Archive | "Speed"

Efficient Link Reclamation: How to Speed Up & Scale Your Efforts

Posted by DarrenKingman

Link reclamation: Tools, tools everywhere

Every link builder, over time, starts to narrow down their favorite tactics and techniques. Link reclamation is pretty much my numero-uno. In my experience, it’s one of the best ROI activities we can use for gaining links particularly to the homepage, simply because the hard work — the “mention” (in whatever form that is) — is already there. That mention could be of your brand, an influencer who works there, or a tagline from a piece of content you’ve produced, whether it’s an image asset, video, etc. That’s the hard part. But with it done, and after a little hunting and vetting the right mentions, you’re just left with the outreach.

Aside from the effort-to-return ratio, there are various other benefits to link reclamation:

  1. It’s something you can start right away without assets
  2. It’s a low risk/low investment form of link building
  3. Nearly all brands have unlinked mentions, but big brands tend to have the most and therefore see the biggest routine returns
  4. If you’re doing this for clients, they get to see an instant return on their investment

Link reclamation isn’t a new tactic, but it is becoming more complex and tool providers are out there helping us to optimize our efforts. In this post, I’m going to talk a little about those tools and how to apply them to speed up and scale your link reclamation.

Finding mentions

Firstly, we want to find mentions. No point getting too fancy at this stage, so we just head over to trusty Google and search for the range of mentions we’re working on.

As I described earlier, these mentions can come in a variety of shapes and sizes, so I would generally treat each type of mention that I’m looking for as a separate project. For example, if Moz were the site I was working on, I would look for mentions of the brand and create that as one “project,” then look for mentions of Followerwonk and treat that as another, and so on. The reasons why will become clear later on!

So, we head to the almighty Google and start our searches.

To help speed things up, it’s best to expand your search results so you can gather as many URLs as possible in as few clicks as possible. Using Google’s Search Settings, you can quickly max out your SERPs to one hundred results, or you can install a plugin like GInfinity, which allows you to infinitely scroll through the results and grab as many as you can before your hand cramps up.

Now we want to start copying as many of these results as possible into an Excel sheet, or wherever it is you’ll be working from. Clicking each one and copying/pasting is hell, so another tool to quickly install for Chrome is Linkclump. With this one, you’ll be able to right click, drag, and copy as many URLs as you want.

Linkclump Pro Tip: To ensure you don’t copy the page titles and cache data from a SERP, head over to your Linkclump settings by right-clicking the extension icon and selecting “options.” Then, edit your actions to include “URLs only” and “copied to clipboard.” This will make the next part of the process much easier!

Filtering your URL list

Now that we’ve got a bunch of URLs, we want to do a little filtering, so we know a) the DA of each domain, as a proxy metric for qualifying mentions, and b) whether or not the page already links to us.

How you do this bit will depend on which platforms you have access to. I would recommend using BuzzStream as it combines a few of the future processes in one place, but URL Profiler can also be used before transferring your list over to some alternative tools.

Using BuzzStream

If you’re going down this road, BuzzStream can pretty much handle the filtering for you once you’ve uploaded your list of URLs. The system will crawl through the URLs and use the Moz API to display Domain Authority, as well as tell you whether or not the page already links to you.

The first thing you’ll want to do is create a “project” for each type of mention you’re sourcing. As I mentioned earlier, this could be “brand mentions,” “creative content,” “founder mentions,” etc.

When adding your “New Project,” be sure to include the domain URL for the site you’re building links to, as shown below. BuzzStream will then go through and crawl your list of URLs and flag any that are already linking to you, so you can filter them out.

Next, we need to get your list of URLs imported. In the Websites view, use Add Websites and select “Add from List of URLs”:

The next steps are really easy: Upload your list of URLs, then ensure you select “Websites and Links” because we want BuzzStream to retrieve the link data for us.

Once you’ve added them, BuzzStream will work through the list and start displaying all the relevant data for you to filter through in the Link Monitoring tab. You can then sort by: link status (after hitting “Check Backlinks” and having added your URL), DA, and relationship stage to see if you/a colleague have ever been in touch with the writer (especially useful if you/your team uses BuzzStream for outreach like we do at Builtvisible).

Using URL Profiler

If you’re using URL Profiler, first make sure it’s set up to work with your Moz API key. You don’t need a paid Moz account to do this, but having one will give you more than 500 checks per day on the URLs you and the team are pushing through.

Then, take the list of URLs you’ve copied from the SERPs using Linkclump (I’ve just copied the top 10 from the news vertical for “moz.com” as my search) and paste the URLs into the list. You’ll need to select “Moz” in the Domain Level Data section (see screenshot) and also fill out the “Domain to Check” field with your preferred URL string (I’ve put “Moz.com” to capture any links to secure, non-secure, alternative subdomains, and deeper-level URLs).

Once you’ve set URL Profiler running, you’ll get a pretty intimidating spreadsheet, which can simply be cut right down to the columns: URL, Target URL, and Mozscape Domain Authority. Filter out any rows that have returned a value in the Target URL column (essentially filtering out any pages where an HREF link to your domain was found), and any remaining rows with a DA lower than your benchmark for links (if you work with one).

And there’s my list of URLs that we now know:

1) don’t have any links to our target domain,

2) have a reference to the domain we’re working on, and

3) boast a DA above 40.

Qualify your list

Now that you’ve got a list of URLs that fit your criteria, we need to do a little manual qualification. But, we’re going to use some trusty tools to make it easy for us!

The key insight we’re looking for during our qualification is if the mention is in a natural linking element of the page. It’s important to avoid contacting sites where the mention is only in the title, as they’ll never place the link. We particularly want placements in the body copy as these are natural link locations and so increase the likelihood of your efforts leading somewhere.

So from my list of URLs, I’ll copy the list and head over to URLopener.com (now bought by 10bestseo.com presumably because it’s such an awesome tool) and paste in my list before asking it to open all the URLs for me:

Now, one by one, I can quickly scan the URLs and look for mentions in the right places (i.e. is the mention in the copy, is it in the headline, or is it used anywhere else where a link might not look natural?).

When we see something like this (below), we’re making sure to add this URL to our final outreach list:

However, when we see this (again, below), we’re probably stripping the URL out of our list as there’s very little chance the author/webmaster will add a link in such a prominent and unusual part of the page:

The idea is to finish up with a list of unlinked mentions in spots where a link would fit naturally for the publisher. We don’t want to get in touch about every mention, wherever it appears on the page, as it can harm your future relationships. Link building needs to make sense, and not just for Google. If you’re working in a niche that mentions your client, you likely want not only to get a link but also to build a relationship with the writer — it could lead to five links further down the line.

Getting email addresses

Now that you’ve got a list of URLs that all feature your brand/client, and you’ve qualified this list to ensure they are all unlinked and have mentions in places that make sense for a link, we need to do the most time-consuming part: finding email addresses.

To continue expanding our spreadsheet, we’re going to need to know the contact details of the writer or webmaster to request our link from. To continue our theme of efficiency, we just want to get the two most important details: email address and first name.

Getting the first name is usually pretty straightforward, and there’s not really a need to automate this. However, finding email addresses could be an entirely separate article in itself, so I’ll be brief and get to the point. Here’s a summary of places to look and the tools I use:

  • Author page
  • Author’s personal website
  • Author’s Twitter profile
  • Rapportive & Email Permutator
  • Allmytweets
  • Journalisted.com
  • Mail Tester

More recently, we’ve also been using Skrapp.io. It’s a LinkedIn extension (like Hunter.io) that adds a “Find Email” button to LinkedIn profiles, along with an accuracy percentage for each suggestion. This can often be combined with Mail Tester to check whether the suggested email address actually works.

It’s likely to be a combination of these tools that helps you navigate finding a contact’s email address. Once we have it, we need to get in touch — at scale!

Pro Tip: When using Allmytweets, if you’re finding that searches for “email” or “contact” aren’t working, try “dot.” Usually journalists don’t put their full email address on public profiles in a scrapeable format, so they use “me@gmail [dot] com” to get around it.

Making contact

So, because this is all about making the process efficient, I’m not going to repeat or try to build on the other already useful articles that provide templates for outreach (there is one below, but that’s just as an example!). However, I am going to show you how to scale your outreach and follow-ups.

Mail merges

If you and your team aren’t set in your ways with a particular paid tool, your best bet for optimizing scale is going to be a mail merge. There are a number of them out there and, honestly, they are all fairly similar: some offer a limited number of free emails per day before you have to pay, while others charge from the get-go. However, for the costs we’re talking about and the time it saves, building a business case to convince either yourself (freelancers) or your finance department (everyone else!) will be a walk in the park.

I’ve been a fan of Contact Monkey for some time, mainly for tracking open rates, but their mail merge product is also part of the $10-a-month package. It’s a great deal. However, if you’re after something a bit more specific, YAMM is free up to a point (for personal Gmail accounts) and can send up to 50 emails a day.

You’ll likely need to work through the process with whatever tool you pick but, using your spreadsheet, you’ll be able to specify which fields you want the mail merge to pull from, and it’ll insert each element into the email.

For link reclamation, this is really as personal as you need to get — no lengthy paragraphs about how much you love their work or how long you’ve been following them on Twitter, just a good old to-the-point email:

Hi [first name],

I recently found a mention of a company I work with in one of your articles.

Here’s the article:

Where you’ve mentioned our company, Moz, would you be able to provide a link back to the domain Moz.com, in case users would like to know more about us?

Many thanks,
Darren.

If using BuzzStream

Although BuzzStream’s mail merge options are pretty similar to the process above, the best “above and beyond” feature BuzzStream has is that you can schedule follow-up emails as well. So, if you didn’t hear back the first time, after a week or so the software will automatically send a little follow-up, which, in my experience, often leads to the best results.

When you’re ready to start sending emails, select the project you’ve set up. In the “Websites” section, select “Outreach.” Here, you can set up a sequence, which will send your initial email as well as customized follow-ups.

Using the same extremely brief template as above, I’ve inserted my dynamic fields to pull in from my data set and set up two follow-up emails, due to send if I don’t hear back within the next four days (BuzzStream hooks up with my email through Outlook and can monitor whether or not I receive an email from this person).

Each project can now use templates set up for the type of mention you’re following up on. By using pre-set templates, you can create one for brand mentions, one for influencers, and one for creative projects to further save you time. Good times.

I really hope this has been useful for beginners and seasoned link reclamation pros alike. If you have any other tools you use that people may find useful or have any questions, please do let us know below.

Thanks everyone!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


Google Confirms Chrome Usage Data Used to Measure Site Speed

Posted by Tom-Anthony

During a discussion with Google’s John Mueller at SMX Munich in March, he told me an interesting bit of data about how Google evaluates site speed nowadays. It has gotten a bit of interest from people when I mentioned it at SearchLove San Diego the week after, so I followed up with John to clarify my understanding.

The short version is that Google is now using performance data aggregated from Chrome users who have opted in as a datapoint in the evaluation of site speed (and as a signal with regards to rankings). This is a positive move (IMHO) as it means we don’t need to treat optimizing site speed for Google as a separate task from optimizing for users.

Previously, it has not been clear how Google evaluates site speed, and it was generally believed to be measured by Googlebot during its visits — a belief enhanced by the presence of speed charts in Search Console. However, the onset of JavaScript-enabled crawling made it less clear what Google is doing — they obviously want the most realistic data possible, but it’s a hard problem to solve. Googlebot is not built to replicate how actual visitors experience a site, and so as the task of crawling became more complex, it makes sense that Googlebot may not be the best mechanism for this (if it ever was the mechanism).

In this post, I want to recap the pertinent data around this news quickly and try to understand what this may mean for users.

Google Search Console

Firstly, we should clarify our understanding of what the “time spent downloading a page” metric in Google Search Console is telling us. Most of us will recognize graphs like this one:

Until recently, I was unclear about exactly what this graph was telling me. But handily, John Mueller comes to the rescue again with a detailed answer [login required] (hat tip to James Baddiley from Chillisauce.com for bringing this to my attention):

John clarified what this graph is showing:

It’s technically not “downloading the page” but rather “receiving data in response to requesting a URL” – it’s not based on rendering the page, it includes all requests made.

And that it is:

this is the average over all requests for that day

Because Google may be fetching a very different set of resources every day when it’s crawling your site, and because this graph does not account for anything to do with page rendering, it is not useful as a measure of the real performance of your site.

For that reason, John points out that:

Focusing blindly on that number doesn’t make sense.

With which I quite agree. The graph can be useful for identifying certain classes of backend issues, but there are also probably better ways for you to do that (e.g. WebPageTest.org, of which I’m a big fan).

Okay, so now that we understand that graph and what it represents, let’s look at the next option: the Google WRS.

Googlebot & the Web Rendering Service

Google’s WRS is their headless browser mechanism based on Chrome 41, which is used for things like “Fetch as Googlebot” in Search Console, and is increasingly what Googlebot is using when it crawls pages.

However, we know that this isn’t how Google evaluates pages because of a Twitter conversation between Aymen Loukil and Google’s Gary Illyes. Aymen wrote up a blog post detailing it at the time, but the important takeaway was that Gary confirmed that WRS is not responsible for evaluating site speed:

Twitter conversation with Gary Illyes

At the time, Gary was unable to clarify what was being used to evaluate site performance (perhaps because the Chrome User Experience Report hadn’t been announced yet). It seems as though things have progressed since then, however. Google is now able to tell us a little more, which takes us on to the Chrome User Experience Report.

Chrome User Experience Report

Introduced in October last year, the Chrome User Experience Report “is a public dataset of key user experience metrics for top origins on the web,” whereby “performance data included in the report is from real-world conditions, aggregated from Chrome users who have opted-in to syncing their browsing history and have usage statistic reporting enabled.”

Essentially, certain Chrome users allow their browser to report back load time metrics to Google. The report currently has a public dataset for the top 1 million+ origins, though I imagine they have data for many more domains than are included in the public data set.

In March I was at SMX Munich (amazing conference!), where along with a small group of SEOs I had a chat with John Mueller. I asked John about how Google evaluates site speed, given that Gary had clarified it was not the WRS. John was kind enough to shed some light on the situation, but at that point, nothing was published anywhere.

However, since then, John has confirmed this information in a Google Webmaster Central Hangout [15m30s, in German], where he explains they’re using this data along with some other data sources (he doesn’t say which, though notes that it is in part because the data set does not cover all domains).

At SMX John also pointed out how Google’s PageSpeed Insights tool now includes data from the Chrome User Experience Report:

The public dataset of performance data for the top million domains is also available in a public BigQuery project, if you’re into that sort of thing!

We can’t be sure what all the other factors Google is using are, but we now know they are certainly using this data. As I mentioned above, I also imagine they are using data on more sites than are perhaps provided in the public dataset, but this is not confirmed.

Pay attention to users

Importantly, this means that there are changes you can make to your site that Googlebot is not capable of detecting, which are still detected by Google and used as a ranking signal. For example, we know that Googlebot does not support HTTP/2 crawling, but now we know that Google will be able to detect the speed improvements you would get from deploying HTTP/2 for your users.

The same is true if you were to use service workers for advanced caching behaviors — Googlebot wouldn’t be aware, but users would. There are certainly other such examples.

Essentially, this means that there’s no longer a reason to worry about pagespeed for Googlebot, and you should instead just focus on improving things for your users. You still need to pay attention to Googlebot for crawling purposes, which is a separate task.

If you are unsure where to look for site speed advice, then you should look at:

That’s all for now! If you have questions, please comment here and I’ll do my best! Thanks!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


SearchCap: EU domains at risk, mobile page speed & search pictures

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: EU domains at risk, mobile page speed & search pictures appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Top Marketing News: Content Marketing Trends, Google Checks Your Speed & YouTube Live-Streaming Updates

Content Trends 2018: BuzzSumo Research Report
A new report from BuzzSumo shows that social sharing is down 50% year-over-year, LinkedIn is becoming a leader in social engagement, and more. BuzzSumo

YouTube Adds New Live-Streaming Tools, Including Monetization Options
YouTube recently added automatic English-language subtitles for live-streaming videos, along with the ability to replay live-streamed videos and chat simultaneously, and more. Social Media Today

Google Releases Mobile Scorecard & Impact Calculator Tools To Illustrate Importance Of Mobile Page Speed
Google continues to reinforce the importance of mobile page speed. The search engine recently released tools to help website admins and marketers alike better understand why speed matters on mobile.  Search Engine Land

We Regret to Inform You That Vero Is Bad [Updated]
Vero was supposed to be the better Instagram, but that doesn’t seem to be the case in light of functionality issues that have come to the surface. Gizmodo

Making your first AMP Story: Google’s answer to Snapchat and Instagram
Google recently announced AMP Stories, a new format similar to Snapchat and Instagram Stories, implemented via a new accelerated mobile pages (AMP) component. Search Engine Land

Marketing and IT Departments Need to Get In Sync to Best Capitalize on Mobile Technology
According to recent research from Adobe, technologies like augmented reality, virtual reality, and artificial intelligence are poised to help accelerate the mobile evolution. AdWeek

Google Confirms “Edge Cases” When Content Theft Can Cause Negative Effects
Search Engine Journal reports: “Google has updated the search results pages by adding breadcrumbs to the top of the page. The breadcrumbs are triggered by informational search queries and are accompanied by images.” Search Engine Journal

Media Buyers: Snapchat Is Focused On Enabling Commerce In Ads
Look out, Pinterest and Instagram — Snapchat is making a play for e-commerce advertisers. According to Digiday: “Snapchat is working on developing new commerce units to bolster its e-commerce offering.” Digiday

Twitter’s Rolling Out its New ‘Bookmarks’ Feature to All Users
Twitter announced that it’s rolling out its new ‘Bookmarks’ feature, which serves as an alternative to liking a Tweet you want to view later. Social Media Today

Facebook Rolls Out Job Posts To Become The Blue-Collar LinkedIn
Facebook is trying to compete with LinkedIn in the job market, and they’re starting with skilled workers. TechCrunch

GDPR Study Shows 65% Of Companies Unable To Comply
MediaPost reports: “Data Applications provider Solix Technologies released the results Tuesday of a survey outlining the General Data Protection Regulation (GDPR) readiness assessment, revealing that the majority of organizations are not prepared for May 2018 GDPR enforcement.”  MediaPost

 

On the Lighter Side:
Skippy Goes Smooth With Mark Ronson Soundtrack To Promote Its Creamy Peanut Butter – The Drum
Lacoste’s Iconic Crocodile Makes Room for 10 Endangered Species on Brand’s Polo Shirts – AdWeek

TopRank Marketing (And Clients) In the News:
TopRank Marketing Blog - The 50 Best Business & Marketing Blogs – Detailed
Steve Slater - Word of Mouth Marketing: How to Create a Strategy for Social Media Buzz & Skyrocket Referral Sales - BigCommerce
Rachel Miller and Lee Odden - Top Influencers to engage with ahead of the #SMMW18 conference – Onalytica
Lee Odden - Top 100 Digital Marketers 2018 – Brand24
Alex Rynne (LinkedIn) and Lee Odden – 7 Things Learned from Attending B2BMX – Cassie Ciopryna
Lee Odden – Humanizing Marketing —Takeaways from #B2BMX 2018 – Tabitha Adams
Cherwell Software – Cherwell Software wins the 2018 Killer Content Award! – Alison Munn

We’ll be back next week with more digital marketing news! If you need more in the meantime, follow @TopRank on Twitter or subscribe to our YouTube channel.

The post Top Marketing News: Content Marketing Trends, Google Checks Your Speed & YouTube Live-Streaming Updates appeared first on Online Marketing Blog – TopRank®.

Online Marketing Blog – TopRank®


SearchCap: Google Speed Update, schema reviews & PPC artificial intelligence

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google Speed Update, schema reviews & PPC artificial intelligence appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Images: Your easiest page speed win

Page speed is important for both rankings and user experience, yet columnist Kristine Schachinger notes that many companies are missing an easy opportunity to improve in this area: image optimization.

The post Images: Your easiest page speed win appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


6 SEO Friendly Tips to Improve Site Speed on WordPress Blogs

"If a page takes more than a couple of seconds to load, users will instantly hit the back button and move on." – Loren Baker

In the world of SEO, user experience on websites has always been a factor, as has the time it takes for a site to load.

However, with the use of mobile devices surpassing desktop use (in most consumer-facing industries) and the wide adoption of broadband, people expect sites to load instantly.

Long gone are the days of waiting 10 seconds for a site to load.

If a page takes more than a couple of seconds to load, users will instantly hit the back button and move on to the next result.

Accordingly, Google officially started paying attention to site speed and declared its importance as a factor in rankings.

In order to keep up with Google’s site-ranking measures, WordPress blog users need to know exactly what they can do to improve their own site speed.

Remember when Google rolled out AMP (accelerated mobile pages)?

They now serve up publisher content in a simplified, Google-hosted experience that renders superfast. I like AMP from a user perspective because I know that AMP content will load incredibly fast on my mobile device, but as a publisher:

I’d rather speed up my blog and attract traffic directly to my site than have users stay on Google.

If you use StudioPress Sites or the Rainmaker Platform, your site will already load quickly. However, adding ad scripts, featured images, tracking codes, 301 redirects, etc. will slow down the loading of a site and increase demand on your server/hosting company.

Here are six simple tips I recommend; we used them to dramatically speed up the Search Engine Journal (SEJ) load time — it’s now at 1.8 seconds!

1. Use a content delivery network

A content delivery network (CDN) is a group of servers that deliver web pages and other content based on the geographic location of the user, the origin of the webpage, and the content delivery server.

It can handle heavy traffic and speeds up the delivery of content to different users.

For WordPress blogs looking to improve site speed, Cloudflare is a great tool to consider. Cloudflare offers a free content delivery network that speeds up the performance of your site and optimizes it for efficiency on any device.

It also offers security services that help protect websites from crawlers, bots, and other attackers.

2. Compress your images

Another effective way to reduce page-load time and increase site speed is by compressing your images. A CDN will help with this, but it doesn’t take care of 100 percent of the job.

There are several different plugins available that compress all the images on your website — and even compress new images as you upload them as well.

ShortPixel is a WordPress plugin that allows you to compress both new and old images on your blog. We use it on SEJ and various other sites, and absolutely love it.

It allows you to quickly compress images in batches for greater convenience, reduces the time it takes to do backups, and ensures all your processed files are kept safe and secure. The best part about it is that your image quality stays the same, regardless of the size of the image.

Other image-compression plugins also maintain the quality of your pictures and improve site speed.

3. Prevent ad scripts and pop-ups from slowing down the user experience

Many web pages today contain some form of third-party script that either runs ads for revenue or uses pop-ups to drive conversions. You want to build your audience and get more customers, of course, but balance is key here.

Although it’s difficult to completely get rid of them to improve your site speed, you can tame their performance impact while keeping them on your website to provide their intended benefits.

The trick is to first identify the third-party scripts that run on your site, where they come from, and how they impact your blog.

You can use different real-time monitoring tools that track and identify which scripts delay your site-loading time and affect your site metrics.

One of my favorite tools to do this is Pingdom’s Website Speed Test, because it breaks down each file and script, and tells you which takes the most time to load.

The same rule applies for pop-up plugins that you add on to your site.

Knowing which ones work best to improve conversions and bring in email signups allows you to gauge which plugins to keep and which ones to uninstall.

One of the fastest pop-up plugins on the market is OptinMonster (a StudioPress partner). Its founder, Syed Balkhi, is a WordPress expert who stays on top of factors like site speed and overall user experience.

4. Install a caching plugin

Another effective way to reduce site-loading time is by installing caching plugins to your WordPress blog.

Caching plugins work by creating a static version of your WordPress blog and delivering it to your site users and visitors, which conveniently cuts your page-loading time in half.

Several cache plugins work best for WordPress, such as WP Super Cache and W3 Total Cache.

These plugins are easy to install and can be disabled anytime. They allow you to select certain pages on your blog (or all of them) to cache, and offer many other content compression settings that you can turn on or off.

WordPress supports many other plugins that allow you to optimize your blog to get rid of any latency in page-load time. It is important to test out these plugins to find the one that works best for you.

5. Disable plugins you don’t use

Tons of WordPress plugins can also make your site super slow, especially ones you don’t need.

It is important to review the plugins you have installed in the past and disable those that offer no significant value.

Many WordPress users install different plugins when they first create their blogs to enhance how they look, but realize over time that great-looking blogs don’t always attract traffic, especially if your page-loading time is slow.

Also, I would highly recommend making sure your plugins are updated. This may help improve page-load speed, but more importantly, it makes your site more secure.

6. Add one more layer of media optimization

One thing we realized at SEJ when speeding up the site was that even after optimizing images, ad scripts, and caching, there were still multiple forms of media that slowed down load time.

The internal fixes we implemented did not help with third-party media load times, such as embedded Twitter, YouTube, and Instagram content, or infographics from other sites.

One solution we found to assist with that is BJ Lazy Load. Essentially, this lazy-load plugin renders all written content first, then as the user scrolls down the page, images and other forms of media load. This way, the user doesn’t have to wait for tons of media to load before reading the main content.

What I really like about BJ Lazy Load is that in addition to images, it also lazy loads all embeds, iFrames, and YouTube videos. For a WordPress blog that uses a lot of embeds, it was ideal for us.

Bonus tip: ask your web host for help

If you run a WordPress blog or WordPress-powered site, then you should work with a hosting company that specializes in WordPress, such as WP Engine, Presslabs, or Rainmaker’s own Synthesis.

I’ve worked with all three, and one thing I can absolutely tell you is that if you contact them and ask how your site can be sped up, they will help you, because the faster your site is, the lighter the load on their servers.

As more and more people turn to mobile devices to access the internet, it is essential to optimize your blogs for mobile use and find ways to minimize page-loading time.

Remember, bounce rates increase when your page-load time is slow, which affects whether your content gets read or skipped in favor of other sites that load faster.

The post 6 SEO Friendly Tips to Improve Site Speed on WordPress Blogs appeared first on Copyblogger.


Copyblogger


SearchCap: Bing Speed Test, Google Maps Driver Mode & Google Confusion

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Bing Speed Test, Google Maps Driver Mode & Google Confusion appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


SearchCap: Google Search Console, Apple Maps & Panda & Penguin Speed

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google Search Console, Apple Maps & Panda & Penguin Speed appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


How Website Speed Actually Impacts Search Ranking

Posted by Zoompf

Google uses a multitude of factors to determine how to rank search engine results. Typically, these factors are either related to the content of a webpage itself (the text, its URL, the titles and headers, etc.) or are measurements of the authenticity of the website itself (age of the domain name, number and quality of inbound links, etc.). However, in 2010, Google did something very different. Google announced that website speed would begin having an impact on search ranking. Now, the speed at which someone could view the content from a search result would be a factor.

Unfortunately, the exact definition of “site speed” remained open to speculation. The mystery widened further in June, when Google’s Matt Cutts announced that slow-performing mobile sites would soon be penalized in search rankings as well.

Clearly Google is increasingly acting upon what is intuitively obvious: A poor performing website results in a poor user experience, and sites with poor user experiences deserve less promotion in search results. But what is Google measuring? And how does that play into search engine rankings? Matt Peters, data scientist at Moz, asked Zoompf to help find the answers.

Disclaimer

While Google has been intentionally unclear in which particular aspect of page speed impacts search ranking, they have been quite clear in stating that content relevancy remains king. So, in other words, while we can demonstrate a correlation (or lack thereof) between particular speed metrics and search ranking, we can never outright prove a causality relationship, since other unmeasurable factors are still at play. Still, in large enough scale, we make the assumption that any discovered correlations are a “probable influence” on search ranking and thus worthy of consideration.

Methodology

To begin our research, we worked with Matt to create a list of 2,000 random search queries from the 2013 Ranking Factors study. We selected a representative sample of queries, some with as few as one search term (“hdtv”), others as long as five (“oklahoma city outlet mall stores”), and everything in between. We then extracted the top 50 ranked search result URLs for each query, assembling a list of 100,000 total pages to evaluate.

Next, we launched 30 Amazon “small” EC2 instances running in the Northern Virginia cloud, each loaded with an identical private instance of the open source tool WebPageTest. This tool uses the same web browser versions used by consumers at large to collect over 40 different performance measurements about how a webpage loads. We selected Chrome for our test, and ran each tested page with an empty cache to guarantee consistent results.

While we’ll summarize the results below, if you want to check out the data for yourself you can download the entire result set here.

Results

While we captured over 40 different page metrics for each URL examined, most did not show any significant influence on search ranking. This was largely expected, as (for example) the number of connections a web browser uses to load a page should likely not impact search ranking position. For the purposes of brevity, in this section we will just highlight the particularly noteworthy results. Again, please consult the raw performance data if you wish to examine it for additional factors.

Page load time

When people say “page load time” for a website, they usually mean one of two measurements: “document complete” time or “fully rendered” time. Think of document complete time as the time it takes a page to load before you can start clicking or entering data. All the content might not be there yet, but you can interact with the page. Think of fully rendered time as the time it takes to download and display all images, advertisements, and analytic trackers. This is all the “background stuff” you see fill in as you’re scrolling through a page.

Since Google was not clear on what page load time means, we examined the effects of both document complete and fully rendered times on search rankings. However, our biggest surprise came from the lack of correlation for these two key metrics! We expected, if anything, that these two metrics would clearly have an impact on search ranking. However, our data shows no clear correlation between document complete or fully rendered times and search engine rank, as you can see in the graph below:

The horizontal axis measures the position of a page in the search results, while the vertical axis is the median time captured across all 2,000 different search terms used in the study. So in other words, if you plugged all 2,000 search terms into Google one by one and then clicked the first result for each, we’d measure the page load time of each of those pages, then calculate the median and plot at position 1. Then repeat for the second result, and third, and on and on until you hit 50.

We would expect this graph to have a clear “up and to the right” trend, as highly ranked pages should have a lower document complete or fully rendered time. Indeed, page rendering has a proven link to user satisfaction and sales conversions (we’ll get into that later), but surprisingly we could not find a clear correlation to ranking in this case.

Time to first byte

With no correlation between search ranking and what is traditionally thought of as “page load time,” we expanded our search to the Time to First Byte (TTFB). This metric captures how long it takes your browser to receive the first byte of a response from a web server when you request a particular URL. In other words, this metric encompasses the network latency of sending your request to the web server, the amount of time the web server spent processing and generating a response, and the amount of time it took to send the first byte of that response back from the server to your browser. The graph of median TTFB for each search rank position is shown below:

The TTFB result was surprising: a clear correlation was identified between decreasing search rank and increasing time to first byte. Sites that have a lower TTFB respond faster and have higher search result rankings than slower sites with a higher TTFB. Of all the data we captured, the TTFB metric had the strongest correlation effect, implying a high likelihood of some level of influence on search ranking.

Page size

The surprising result here was the median size of each web page, in bytes, relative to the search ranking position. By “page size,” we mean all of the bytes that were downloaded to fully render the page, including all the images, ads, third-party widgets, and fonts. When we graphed the median page size for each search rank position, we found a counterintuitive correlation of decreasing page size to decreasing page rank, with an anomalous dip in the top three ranks.

This result confounded us at first, as we didn’t anticipate any real relationship here. Upon further speculation, though, we had a theory: lower ranking sites often belong to smaller companies with fewer resources, and consequently may have less content and complexity in their sites. As rankings increase, so does the complexity, with the exception of the “big boys” at the top who have extra budget to highly optimize their offerings. Think Amazon.com vs. an SMB electronics retailer vs. a mom-and-pop shop. We really have no proof of this theory, but it fits both the data and our own intuition.

Total image content

Since our analysis of the total page size surprised us, we decided to examine the median size, in bytes, of all images loaded for each page, relative to the search rank position. Other than a sharp spike in the first two rankings, the results are flat and uninteresting across all remaining rankings.

While we didn’t expect a strong level of correlation here, we did expect some correlation, as sites with more images do load more slowly. Since this metric is tied closely to the fully rendered time mentioned above, the fact that this is equally flat supports the finding that page load time is likely not currently impacting search ranking.

What does this mean?

Our data shows there is no correlation between “page load time” (either document complete or fully rendered) and ranking on Google’s search results page. This is true not only for generic searches (one or two keywords) but also for “long tail” searches (4 or 5 keywords) as well. We did not see websites with faster page load times ranking higher than websites with slower page load times in any consistent fashion. If Page Load Time is a factor in search engine rankings, it is being lost in the noise of other factors. We had hoped to see some correlation especially for generic one- or two-word queries. Our belief was that the high competition for generic searches would make smaller factors like page speed stand out more. This was not the case.

However, our data shows there is a correlation between lower time-to-first-byte (TTFB) metrics and higher search engine rankings. Websites with servers and back-end infrastructure that could quickly deliver web content had a higher search ranking than those that were slower. This means that, despite conventional wisdom, it is back-end website performance and not front-end website performance that directly impacts a website’s search engine ranking. The question is, why?

TTFB is likely the quickest and easiest metric for Google to capture. Google’s various crawlers will all be able to take this measurement. Collecting document complete or fully rendered times requires a full browser. Additionally, document complete and fully rendered times depend almost as much on the capabilities of the browser loading the page as they do on the design, structure, and content of the website. Google’s use of TTFB to gauge “performance” or “speed” can perhaps be explained by the extra time and effort it would take its crawlers to capture rendering data. We suspect, though, that over time page rendering time will also factor into rankings, given how strongly it indicates the quality of the user experience.

Not only is TTFB easy to calculate, but it is also a reasonable metric to gauge the performance of an entire site. TTFB is affected by 3 factors:

  1. The network latency between a visitor and the server.
  2. How heavily loaded the web server is.
  3. How quickly the website’s back end can generate the content.

Websites can lower network latency by utilizing Content Distribution Networks (CDNs). CDNs can quickly deliver content to all visitors, often regardless of geographic location, in a greatly accelerated manner. Of course, the very reason these websites are ranked so highly could be the reason they need to have high capacity servers, or utilize CDNs, or optimize their application or database layers.

Tail wagging the dog?

Do these websites rank highly because they have better back-end infrastructure than other sites? Or do they need better back-end infrastructure to handle the load of ALREADY being ranked higher? While both are possible, our conclusion is that sites with faster back ends receive a higher rank, and not the other way around.

We based this conclusion on the fact that highly specific queries with four or five search terms are not returning results for highly trafficked websites. This long tail of searches typically surfaces smaller sites, run by much smaller companies, about very specific topics that don’t receive the large volumes of traffic that necessitate complex environments of dozens of servers. However, even among these smaller sites, fast websites with lower TTFB are consistently ranked higher than slower websites with higher TTFB.

Takeaways

The back-end performance of a website directly impacts search engine ranking. The back end includes the web servers, their network connections, the use of CDNs, and the back-end application and database servers. Website owners should explore ways to improve their TTFB. This includes using CDNs, optimizing your application code, optimizing database queries, and ensuring you have fast and responsive web servers. Start by measuring your TTFB with a tool like WebPageTest, as well as the TTFB of your competitors, to see how you need to improve.

While we have found that front-end web performance factors (“document complete” and “fully rendered” times) do not directly factor into search engine rankings, it would be a mistake to assume they are not important or that they don’t affect search engine rankings in another way. At its core, front-end performance is focused on creating a fast, responsive, enjoyable user experience. There is literally a decade of research from usability experts and analysts on how web performance affects user experience. Fast websites have more visitors, who visit more pages, for longer periods of time, who come back more often, and who are more likely to purchase products or click ads. In short, faster websites make users happy, and happy users promote your website through linking and sharing. All of these things contribute to improving search engine rankings. If you’d like to see what specific front-end web performance problems you have, Zoompf’s free web performance report is a great place to start.

As we have seen, back-end performance and TTFB directly correlate to search engine ranking. Front-end performance and metrics like “document loaded” and “fully rendered” show no correlation with search engine rank. It is possible that the effects are too small to detect relative to all the other ranking factors. However, as we have explained, front-end performance directly impacts the user experience, and a good user experience facilitates the type of linking and sharing behavior which does improve search engine rankings. If you care about your search engine rankings, and the experience of your users, you should be improving both the front-end and back-end performance of your website. In our next blog post, we will discuss simple ways to optimize the performance of both the front and back ends of a website.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


