Tag Archive | "Myths"

3 Content Marketing Myths and Their Reality-Based Solutions

We all know that creating content can be hard work. One of our goals at Copyblogger is to help you make sure you’re putting your work into the right things, so you get results and not just a fistful of disappointment. This week, we looked at three myths and mistakes that can hold writers back.

The 7 Citation Building Myths Plaguing Local SEO

Posted by JoyHawkins

Previously, I wrote an article unveiling some of the most common myths I see in the Local SEO space. I thought I’d do a follow-up that specifically talked about the myths pertaining to citations that I commonly hear from both small business owners and SEOs alike.

Myth #1: If your citations don’t include your suite number, you should stop everything you’re doing and fix this ASAP.

Truth: Google doesn’t even recognize suite numbers for the vast majority of Google business listings. Even if you enter a suite number in Google My Business, it doesn’t translate into the “Suite #” field in Google MapMaker; it simply gets dropped. Google also pays more attention to the business’s location (pin) marker when determining the actual location, and less to the words people enter as the address, since there can be multiple ways to write a street address. Google’s Possum update recently introduced a location-based filter for search queries. We’ve seen that this has to do with the address itself and how close other businesses in the same industry are to your location. Whether or not you have a suite number in Google My Business has nothing to do with it.

Darren Shaw from Whitespark, an expert on everything related to citations, says:

“You often can’t control the suite number on your citations. Some sites force the suite number to appear before the address, some after the address, some with a # symbol, some with “Ste,” and others with “Suite.” If minor discrepancies like these in your citations affected your citation consistency or negatively impacted your rankings, then everyone would have a problem.”

In summary, if your citations look great but are missing the suite number, move along. There are most likely more important things you could be spending time on that would actually impact your ranking.

Myth #2: Minor differences in your business name in citations are a big deal.

Truth: Say your business name is “State Farm: Bob Smith,” yet one citation lists you as “Bob Smith Insurance” and another as “Bob Smith State Farm.” As Mike Blumenthal states: “Put a little trust in the algorithm.” If Google were incapable of realizing that those three names refer to the same business (especially when the address and phone number are identical), we’d have a big problem on our hands. There would be so many duplicate listings on Google that we couldn’t begin to keep track. Currently, I generally only see a lot of duplicates when there are major discrepancies in the address and phone number.

Darren Shaw also agrees on this:

“I see this all the time with law firms. Every time a new partner joins the firm or leaves the firm, they change their name. A firm can change from “Fletcher, McDonald, & Jones” to “Fletcher, Jones, & Smith” to “Fletcher Family Law” over the course of 3 years, and as long as the phone number and address stay the same, it will have no negative impact on their rankings. Google triangulates the data it finds on the web by three data points: name, address, and phone number. If two of these are a match, and then the name is a partial match, Google will have no problem associating those citations with the correct listing in GMB.”

Myth #3: NAP cleanup should involve fixing your listings on hundreds of sites.

Truth: SEO companies use this as a scare tactic, and it works very well: they get small businesses to pay them for citation cleanup. They’ll run a scan of your incorrect data and send you a list of hundreds of directories that have your information wrong. This makes you gasp, panic, and instantly conclude that you must hire them to spend hours cleaning it all up, since it must be causing the ranking of your listing on Google to tank.

Let’s dive into an example that I’ve seen. Local.com is a site that feeds to hundreds of smaller directories on newspaper sites. If you have a listing wrong on Local.com, it might appear that your listing is incorrect on hundreds of directories. For example, these three listings are on different domains, but if you look at the pages they’re identical and they all say “Local.com” at the top:

http://directory.hawaiitribune-herald.com/profile?listingid=108895814

http://directory.lufkindailynews.com/profile?listingid=108895814

http://flbiz.oscnewsgazette.com/profile?listingid=108895814

Should this cause you to panic? No. Fixing the listing on Local.com itself should fix it in the hundreds of other places. Even if it didn’t, Google hasn’t indexed any of these URLs. (Note: Google might index my examples now that I’ve linked to them in this Moz article, which is why I took screenshots while writing this.)

If Google hasn’t even indexed the content, it’s a good sign that the content doesn’t mean much and is nothing you should stress about. Google has no incentive to index all these different URLs because the content on them is literally the same. Additionally, no one links to them (aside from me in this article, of course).

As Darren Shaw puts it,

“This one really irks me. There are WAY more important things for you to spend your time/money on than trying to fix a listing on a site like scranton.myyellowpageclassifieds.biz. Chances are, any attempt to update this listing would be futile anyway, because small sites like these are basically unmanaged. They’re collecting their $200/m in AdSense revenue and don’t have any interest in dealing with or responding to any listing update requests. In our Citation Audit and Cleanup service we offer two packages. One covers the top 30 sites + 5 industry/city-specific sites, and the other covers the top 50 sites + 5 industry/city-specific sites. These are sites that are actually important and valuable to local search. Audit and cleanup on sites beyond these is generally a waste of time and money.”

Myth #4: There’s no risk in cancelling an automated citation service.

People often wonder what might happen to their NAP issues if they cancel their subscription with a company like Yext or Moz Local. Although these companies don’t do anything to intentionally cause old data to come back, there have been some recent interesting findings around what actually happens when you cancel.

Truth: Phil Rozek recently did a little case study on a business that had to cancel Moz Local. The good news: although staying with such a service is generally a good decision, this business didn’t seem to have any major issues after cancelling.

Yext claims on their site that they don’t do anything to push the old data back that was previously wrong. They explain that when you cancel, “the lock that was put in place to protect the business listing is no longer present. Once this occurs, the business listing is subject to the normal compilation process at the search engine, online directory, mobile app, or social network. In fact, because Yext no longer has this lock in place, Yext has no control over the listing directly at all, and the business listing data will now act as it normally would occur without Yext.”

Nyagoslav Zhekov recently published a study on cancelling Yext, concluding that most listings either disappear or revert to their previous incorrect state after cancellation. Yext seems to act as a sort of cover on top of the listing; once Yext is cancelled, that cover is removed. So there does seem to be some risk in cancelling Yext.

In summary, there is definitely a risk when you decide to cancel an ongoing automated service that was previously in place to correct your citations. It’s important for people to realize that if they decide to do this, they might want to budget for some manual citation building/cleanup in case any issues arise.

Myth #5: Citation building is the only type of link building strategy you need to succeed at Local SEO.

Many Local SEO companies have the impression that citation building is the only backlinking strategy small businesses need to rank well in the 3-pack. According to a survey by BrightLocal, 72% of local SEOs use citation building as a way of building links.

Truth: Local SEO Guide found in their Local Search Ranking Factors study that although citations are important, if they’re the only backlinking strategy you’re using, you’re most likely not going to rank well in competitive markets. They also found that links are the key competitive differentiator, even when it comes to Google My Business rankings. So if you’re in a competitive industry or market and want to dominate the 3-pack, you need to look into additional backlinking strategies over and above citations.

Darren adds more clarity to the survey’s results by stating,

“They’re saying that citations are still very important, but they are a foundational tactic. You absolutely need a core base of citations to gain trust at Google, and if you don’t have them you don’t have a chance in hell at ranking, but they are no longer a competitive difference maker. Once you have the core 50 or so citations squared away, building more and more citations probably isn’t what your local SEO campaign needs to move the needle further.”

Myth #6: Citations for unrelated industries should be ignored if they share the same phone number.

This is a question that has come up a number of times with our team. Say a restaurant closes its doors, and a new law firm opens up down the street and gets assigned the restaurant’s old phone number. Should the lawyer worry about all the listings that still exist for the restaurant (since they’re in different industries)?

Truth: I reached out to Nyagoslav Zhekov, the Director of Local Search at Whitespark, to get the truth on this one. His response was:

“As Google tries to mimic real-life experiences, sooner or later this negative experience will result in some sort of algorithmic downgrading of the information by Google. If Google manages to figure out that a lot of customers look for and call a phone number that they think belongs to another business, it is logical that it will result in negative user experience. Thus, Google will assign a lower trust score to a Google Maps business record that offers information that does not clearly and unquestionably belong to the business for which the record is. Keeping in mind that the phone number is, by design and by default, the most unique and the most standardized information for a business (everything else is less standardize-able than the phone number), this is, as far as I am concerned, the most important information bit and the most significant identifier Google uses when determining how trustworthy particular information for a business is.”

He also pointed out that users finding the restaurant’s phone number and calling it would be a negative experience for both the customer and the law firm (which would have to keep confirming it’s not a restaurant), so there would be added benefit in getting the restaurant’s listings marked closed or removed.

Since Darren Shaw gave me so much input for this article, he also wanted to add a seventh myth that he comes across regularly:

Myth #7: Google My Business is a citation.

“This one is maybe more of a mis-labelling problem than a myth, but your listing at Google isn’t really a citation. At Whitespark we refer to Google, Bing, and Apple Maps as ‘Core Search Engines’ (yes, Yahoo has been demoted to just a citation). The word ‘citation’ comes from the concept of ‘citing’ your sources in an academic paper. Using this conceptual framework, you can think of your Google listing as the academic paper, and all of your listings out on the web as the sources that cite the business. Your Google listing is like the queen bee and all the citations out there are the workers contributing to keep the queen bee alive and healthy.”

Hopefully that lays some of the fears and myths around citations to rest. If you have questions or ideas about other myths on this topic, we’d love to hear about them in the comments!


Weird, Crazy Myths About Link Building in SEO You Should Probably Ignore – Whiteboard Friday

Posted by randfish

The rules of link building aren’t always black and white, and getting it wrong can sometimes result in frustrating consequences. But where’s the benefit in following rules that don’t actually exist? In today’s Whiteboard Friday, Rand addresses eight of the big link building myths making their rounds across the web.


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about some of the weird and crazy myths that have popped up around link building. We’ve actually been seeing them in the comments of some of our blog posts and Whiteboard Fridays and Q&A. So I figured, hey, let’s try and set the record straight here.

1. Never get links from sites with a lower domain authority than your own

What? No, that is a terrible idea. Domain authority, just to be totally clear, is a machine learning system that we built here at Moz. It looks at all the link metrics and builds the best correlation it can against Google’s rankings across a broad set of keywords, similar to the MozCast 10K. Then it tries to represent, all other things being equal and just based on raw link authority, how well this site would perform against other sites in Google’s rankings for a random keyword. That does not in any way suggest whether it is a quality website that gives good editorial links that Google is likely to count, links that will give you great ranking ability or send good traffic to you. None of those things are taken into account with domain authority.

So when you’re doing link building, I think DA can be a decent sorting function, just like Spam Score can. But those two metrics don’t mean that something is necessarily a terrible place or a great place to get a link from. Yes, links from 80- or 90-plus DA sites tend to be very good, because those sites tend to give a lot of authority. And links from sub-10 or sub-20 DA sites tend not to add that much value and may carry a high Spam Score, so you might want to look more closely at them before deciding whether you should get a link.

But new websites that have just popped up or sites that have very few links or local links, that is just fine. If they are high-quality sites that give out links editorially and they link to other good places, you shouldn’t fret or worry that just because their DA is low, they’re going to provide no value or low value or hurt you. None of those things are the case.

2. Never get links from any directories

I know where this one comes from. We have talked a bunch about how low-quality directories, SEO-focused directories, paid link directories tend to be very bad places to get links from. Google has penalized not just a lot of those directories, but many of the sites whose link profiles come heavily from those types of domains.

However, lots and lots of resource lists, link lists, and directories are also of great quality. For example, I searched for a list of Portland bars — Portland, Oregon, of course known for their amazing watering holes. I found PDX Monthly’s list of Portland’s best bars and taverns. What do you know? It’s a directory. It’s a total directory of bars and taverns in Portland. Would you not want to be on there if you were a bar in Portland? Of course, you would want to be on there. You definitely want those. There’s no question. Give me that link, man. That is a great freaking link. I totally want it.

This is really about using your good judgment and about saying there’s a difference between SEO and paid link directories and a directory that lists good, authentic sites because it’s a resource. You should definitely get links from the latter, not so much from the former.

3. Don’t get links too fast or you’ll get penalized

Let’s try and think about this. It’s as if Google has some sort of penalty line where they say, “Oh, well, look at that. We see that in August, Rand got 17 links. He was at 15 in July, but then he got 17 in August. That is too fast. We’re going to penalize him.”

No, this is definitely not the case. I think what is the case (and Google has filed some spam-related patent applications around this in the past) is that a pattern of low-quality or spammy-looking links arriving at a certain pace may trigger Google to take a closer look at a site’s link profile or link practices, and could trigger a penalty.

Yes. If you are doing sketchy, grey hat/black hat link building with your private networks, your link buys, and your swapping schemes, and all these kinds of things, yeah, it’s probably the case that if you get them too fast, you’ll trip over some sort of filter that Google has got. But if you’re doing the kind of link building that we generally recommend here on Whiteboard Friday and at Moz more broadly, you don’t have risk here. I would not stress about this at all. So long as your links are coming from good places, don’t worry about the pace of them. There’s no such thing as too fast.

4. Don’t link out to other sites, or you’ll leak link equity, or link juice, or PageRank

…or whatever it is. I really like this illustration of the guys who are like, “My link juice. No!” This is just crap.

All right, again, it’s a myth rooted in some fact. Historically, a long time ago, PageRank used to flow in a certain way, and it was the case that if a page had lots of links pointing out from it, that if I had four links, that a quarter each of the PageRank that this page could pass would go to each of them. So if I added one more, oh, now that’s one-fifth, then that becomes one-fifth, and that becomes one-fifth. This is old, old, old-school SEO. This is not the way things are anymore.
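To make the old model concrete, here’s the classic PageRank formula this myth descends from (a simplified sketch; d is the damping factor, N the number of pages, B_u the set of pages linking to page u, and L(v) the count of outbound links on page v):

```latex
PR(u) = \frac{1 - d}{N} + d \sum_{v \in B_u} \frac{PR(v)}{L(v)}
```

Under that model, each outbound link on page v passes PR(v)/L(v), so going from four links to five really did dilute each share from one-quarter to one-fifth. Modern Google blends many more signals, which is why this dilution arithmetic no longer decides whether you should link out.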

PageRank is not the only piece of ranking algorithmic goodness that Google is using in their systems. You should not be afraid of linking out, even without a “nofollow” attribute. In fact, you should link out. Linking out is not only correlated with higher rankings. There have also been a bunch of studies and research suggesting that there’s something causal going on, because when followed links were added to pages, those pages actually outranked their non-link-carrying brethren in a bunch of tests. I’ll try and link to that test in this Whiteboard Friday. Regardless, don’t stress about this.

5. Variations in anchor text should be kept to precise proportions

So this idea that there’s some magic formula for how many of your anchor phrases should be branded, partially branded, keyword-match (links carrying anchor text specifically for the keywords you’re trying to rank for), or random and assorted, and that you need to hit numbers like these, is also a crazy idea.

Again, rooted in some fact, the fact being if you are doing sketchy forms of link building of any kind, it’s probably the case that Google will take a look at the anchor text. If they see that lots of things are kind of keyword-matchy and very few things contain your brand, that might be a trigger for them to look more closely. Or it might be a trigger for them to say, “Hey, there’s some kind of problem. We need to do a manual review on this site.”

So yes, if you are in the grey/black hat world of link acquisition, sure, maybe you should pay some attention to how the anchor text looks. But again, if you’re following the advice that you get here on Whiteboard Friday and at Moz, this is not a concern.

6. Never ask for a link directly or you risk penalties

This one I understand, because there have been a bunch of cases where folks or organizations have sent out emails, for example, to their customers saying, “Hey, if you link to us from your website, we’ll give you a discount,” or, “Hey, we’d like you to link to this resource, and in exchange this thing will happen,” something or other. I get that those penalties and that press around those types of activities has made certain people sketched out. I also get that a lot of folks use it as kind of blackmail against someone. That sucks.

Google may take action against people who engage in manipulative link practices. But for example, let’s say the press writes about you, but they don’t link to you. Is asking for a link from that piece a bad practice? Absolutely not. Let’s say there’s a directory like the PDX Monthly, and they have a list of bars and you’ve just opened a new one. Is asking them for a link directly against the rules? No, certainly not. So there are a lot of good ways that you can directly ask for links and it is just fine. When it’s appropriate and when you think there’s a match, and when there’s no sort of bribery or paid involvement, you’re good. You’re fine. Don’t stress about it.

7. More than one link from the same website is useless

This one is rooted in the idea that, essentially, diversity of linking domains is an important metric. It tends to be the case that sites that have more unique domains linking to them tend to outrank their peers who have only a few sites linking to them, even if lots of pages on those individual sites are providing those links.

But again, I’m delighted with my animation here of the guys like, “No, don’t link to me a second time. Oh, my god, Smashing Magazine.” If Smashing Magazine is going to link to you from 10 pages or 50 pages or 100 pages, you should be thrilled about that. Moz has several links from Smashing Magazine, because folks have written nice articles there and pointed to our tools and resources. That is great. I love it, and I also want more of those.

You should definitely not be saying “no.” You shouldn’t be stopping your link efforts around a site, especially if it’s providing great traffic and high-quality visits from those links pointing to you. It’s not just the case that links are there for SEO. They’re also there for the direct traffic that they pass, and so you should definitely be investing in those.

8. Links from non-relevant sites, pages, or content outside your niche won’t help you rank better

This one, I think, is rooted in that idea that Google is essentially looking and saying like, “Hey, we want to see that there’s relevance and a real reason for Site A to link to Site B.” But if a link is editorial, if it’s coming from a high-quality place, if there’s a reason for it to exist beyond just, “Hey, this looks like some sort of sketchy SEO ploy to boost rankings,” Googlebot is probably going to count that link and count it well.

I would not be worried about the fact that if I’m coffeekin.com and I’m selling coffee online or have a bunch of coffee resources and corvettecollectors.com wants to link to me or they happen to link to me, I’m not going to be scared about that. In fact, I would say that, the vast majority of the time, off-topic links from places that have nothing to do with your website are actually very, very helpful. They tend to be hard for your competitors to get. They’re almost always editorially given, especially when they’re earned links rather than sort of cajoled or bought links or manipulative links. So I like them a lot, and I would not urge you to avoid those.

So with that in mind, if you have other link ideas, link myths, or link facts that you think you’ve heard and you want to verify them, please, I invite you to leave them in the comments below. I’ll jump in there, a bunch of our associates will jump in there, folks from the community will jump in, and we’ll try and sort out what’s myth versus reality in the link building world.

Take care. We’ll see you again next week for another edition of Whiteboard Friday.

Video transcription by Speechpad.com


The 15 Most Popular Myths About International SEO, Debunked

Posted by Kaitlin

There are lots of myths and misconceptions surrounding the subject of international SEO. I recently gave a Mozinar on this; I’d like to share the basis of that talk in written form here. Let’s first explore why international SEO is so confusing, then dive into some of the most common myths. By the end of this article, you should have a much clearer understanding of how international SEO works and how to apply the proper strategies and tactics to your website.

One common trend is the lack of clarity around the subject. Let’s dig into that:

Why is international SEO so confusing?

There are several reasons:

  • Not everyone reads Google Webmaster Guidelines and has a clear understanding of how they index and rank international content.
  • Guidelines vary among search engines, such as Bing, Yandex, Baidu, and Google.
  • Guidelines change over time, so it’s difficult to keep up with changes and adapt your strategy accordingly.
  • It’s difficult to implement best practices on your own site. There are many technical and strategic considerations that can conflict with business needs and competing priorities. This makes it hard to test and find out what works best for your site(s).

A little history

Let’s explore the reasons behind the lack of clarity on international SEO a bit further. Looking at its development over the years will help you better understand the reasons why it’s confusing, laying some groundwork for the myth-busting that is about to come. (Also, I was a history major in college, so I can’t help but think in terms of timelines.)

Please note: This timeline is constructed almost entirely on Google Webmaster blog announcements. There are a few notes in here about Bing and Yandex, but it’s mostly focused on Google and isn’t meant to be a comprehensive timeline. Mostly this is for illustrative purposes.


2006–2008

Our story begins in 2006. In 2006 and 2007, things are pretty quiet. Google makes a few announcements, the biggest being that webmasters could use geo-targeting settings within Webmaster Tools. They also clarify some of the signals they use for detecting the relevance of a page for a particular market: ccTLDs, and the IP of a server.

2009

In 2009, Bing reveals its secret sauce, which includes ccTLDs, reverse IP lookup, language of body content, server location, and location of backlinks.


2010

In 2010, things start to get really exciting. Google reveals some of the other hints that they use to detect geo-targeting, presents the pros and cons of the main URL structures that you can use to set up your international sites, and gives loads of advice about what you should or shouldn’t do on your site. Note that just about the same time that Google says they ignore the meta language tag, Bing says that they do use that tag.

Then, in fall of 2010, hreflang tags are introduced to the world. Until then, there was no standard page-level tag to tell a search engine what country or language you were specifically targeting.

2011

Originally, hreflang tags were only meant to help Google sort out multi-regional pages (that is, pages in the same language that target different countries). Then, in 2011, Google expands hreflang tag support to work across languages as well. Also during this time, Google removes the requirement to use canonical tags in conjunction with hreflang tags, citing a desire to simplify the process.

2012

Then in 2012, hreflang tags are supported in XML sitemaps (not just page tags). Also, the Google International Help Center is created, with a bunch of useful information for webmasters.


2013

In 2013, the concept of the “x-default” hreflang tag is introduced, and we learn that Yandex is also supporting hreflang tags. This same year, Bing adds geo-targeting functionality to Bing Webmaster Tools, a full 5 years after Google did.

2014

Note that it isn’t until 2014 that Google begins including hreflang tag reporting within Google Webmaster Tools. Up until that point, webmasters would have had to read about hreflang tags somewhere else to know that they exist and should be used for geo-targeting and language-targeting purposes. Hreflang tags become much more prominent after this change.

2015

In 2015, we see improvements to locale-adaptive crawling, and some clarity on the importance of server location.

To sum up, this timeline shows several trends:

  • Hreflang tags were super confusing at first
  • There were several iterations to improve hreflang tag recommendations between 2011 and 2013
  • Hreflang tag reporting was only added to Google Search Console in 2014
  • Even today, only Google and Yandex support hreflang. Bing and the other major search engines still do not.

There are good reasons why webmasters and SEO professionals have misconceptions and questions about how best to approach international SEO.

At least 25% of hreflang tags are incorrect

Let’s look at the adoption of hreflang tags specifically. According to NerdyData, 1.7 million sites have at least one hreflang tag.

I did a quick search to find out:

438,417 sites have hreflang=“uk”

7,829 sites have hreflang=“en-uk”

Both of these tags are incorrect. The correct ISO code for the United Kingdom is actually gb, not uk. Plus, you can’t target by country alone — you have to target by language-country pairs or by language. Thus, just writing “uk” is incorrect as well.
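To make the fix concrete, here’s a sketch of the broken tags next to a corrected version (the URLs are placeholders):

```html
<!-- Incorrect: "uk" is not a valid value. The ISO 3166-1 code for the
     United Kingdom is "gb", and a country can only be targeted as part
     of a language-country pair, never on its own. -->
<link rel="alternate" hreflang="uk" href="https://www.example.com/uk/" />
<link rel="alternate" hreflang="en-uk" href="https://www.example.com/uk/" />

<!-- Correct: a language alone ("en") or a language-country pair ("en-gb") -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
```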

That means at least 25% of hreflang tags are incorrect, and I only did a brief search to find a couple of the most commonly mistaken ones. You can imagine just how many sites out there are getting these hreflang values wrong.

All of this is to prove a point: the field is ripe for optimization when it comes to global SEO. Now, let’s debunk some myths!

Myth #1: I need to have multiple websites in order to rank around the world.

There’s a lot of talk about needing ccTLDs or separate websites for your international content. (A ccTLD is a country-coded top-level domain, such as example.ca, which is country-coded for Canada).

However, it is possible for your website to rank in multiple locations around the world. You don’t necessarily need multiple websites or sub-domains to rank internationally; in many cases, you can work within the confines of your current domain.

In fact, if you take a look at your analytics on your website, even if it has no geo-targeting whatsoever, chances are you already have traffic coming in from various languages and countries.

Many global brands have only one site, using subfolders for their multilingual or multi-regional content. Don’t feel that international SEO is beyond your reach because you believe it requires multiple websites. You may only need one!

The most important thing to remember when deciding whether you need separate websites is that new websites start with zero authority. You will have to fight an uphill battle to build enough authority to establish and rank those new ccTLDs, and for some companies, organic traffic growth may take many years after launching them. This is not to say that ccTLDs are a bad option, but you need to keep in mind that they are not the only option.

Myth #2: “The best site structure for international rankings is _________.”

There’s a lot of debate about what the best site structure is for international rankings. Is it subfolders? Subdomains? ccTLDs?

Some people swear by ccTLDs, saying that in some markets users prefer to buy from local sites, resulting in higher click-through rates. Others champion subdomains or sub-directories.

There is no one answer to the best international site structure. You can dominate using any of these options. I’ve seen websites of all site structures dominate in their verticals. However, there are certain advantages and disadvantages to each, so it’s best to research your options and decide which is best for you.


Google has published their pros and cons breakdown of the URL structures you can use for international targeting. There are 4 options listed here:

  • Country-specific, aka ccTLDs
  • Subdomains
  • Subdirectories with gTLDs (generic top-level domains, like .com or .org)
  • URL parameters. These are not recommended.

Subdirectories with gTLDs have the added benefit of consolidating domain authority, while subdomains and ccTLDs have the disadvantage of making it harder to build up domain authority. In my opinion, subdomains are the least advantageous of the three viable options: they lack the distinct geo-targeting advantage of ccTLDs, and they lack the consolidated backlink profile of subdirectories.

The most important thing to think about is what’s best for your business. Consider whether you want to target at a language level or a country level. Then decide how much effort you want (or can) put behind building up domain authority to your new domains.

Or, to put the trade-offs in more memorable terms:


  • ccTLDs are a good option if you’re Godzilla. If branding isn’t a problem for you, if you have your own PR, if building up domain authority and handling multiple domains is no big deal, then ccTLDs are a good way to go.
  • Subdirectories are a good option if you’re MacGyver. You’re able to get the job done using only what you’ve got.
  • Subdomains are a good option if you’re Wallace and Gromit. Somehow, everything ends well despite the many bumps in the road.


I researched the accuracy of each type of site structure. First, I looked at Google Analytics data and SEMrush data to find out what percentage of the time the correct landing page URL was ranking in the correct version of Google. I did this for 8 brands and 30 sites in total, so my sample size was small, and there are many other factors that could skew the accuracy of this data. But it’s interesting all the same. ccTLDs were the most accurate, followed by subdirectories, and then subdomains. ccTLDs can be very effective because they give very clear, unambiguous geo-targeting signals to search engines.

However, there’s no one-size-fits-all approach. You need to take a cold, hard look at your business and consider things like:

  • Marketing budget you have available for each locale
  • Crawl bandwidth and crawl budget available for your site
  • Market research: which locales should you target?
  • Costs associated with localization and site maintenance
  • Site performance concerns
  • Overall business objectives

As SEOs, we’re responsible for forecasting how realistically our websites will be able to grow and improve in terms of domain authority. If you believe your website can gain fantastic link authority and your team can manage the work involved in handling multiple websites, then you can consider ccTLDs (but whichever site structure you choose will be effective). But if your team will struggle under the added burden of developing and maintaining multiple (localized!) content efforts to drive traffic to your varied sites, then you need to slow down and perhaps start with subdirectories.

Myth #3: I can duplicate my website on separate ccTLDs or geo-targeted sub-folders & each will rank in their respective Googles.

This myth refers to taking a site, duplicating it exactly, and then putting it on another domain, subdomain, or subfolder for the purposes of geo-targeting.

And when I say “in their respective Googles,” I mean the country-specific versions of Google (such as google.co.uk, where searchers in the United Kingdom will typically begin a search).

You can duplicate your site, but it’s kind of pointless. Duplication does not give you an added boost; it gives you added cruft. It reduces your crawl budget if you have all that content on one domain. It can be expensive and often ineffective to host your site duplicated across multiple domains. There will be cannibalization.

Often I’ll see a duplicate ccTLD get outranked by its .com sister in its local version of Google. For example, say a site like example.co.uk is a mirror of example.com, and the example.com outranks the example.co.uk site in google.co.uk. This is because geo-targeting is outweighed by the domain authority of the .com. We saw in an earlier chart that ccTLDs can be the most accurate for showing the right content in the right version of Google, but that’s because those sites had a good spread of link authority among each of their ccTLDs, as well as localized content.


There’s a big difference between the accuracy of ccTLDs when they’re localized and when they’re duplicates. I did some research using the SEMrush API, looking at 3 brands using ccTLDs in 26 country versions of Google, where the .com outranked the ccTLD 42 times. You shouldn’t host your site mirrored across multiple ccTLDs just for the heck of it; it’s only effective if you can localize each one.

To sum it up: Avoid simply duplicating your site if you can. The more you can do to localize and differentiate your sites, the better.

Myth #4: Geo-targeting in Search Console will be enough for search engines to understand and rank my content correctly.

Geo-targeting your content is not enough. Like we covered in the last example, if you have two pages that are exactly the same and you geo-target them in Search Console, that doesn’t necessarily mean that those two pages will show up in the correct version of Google. Note that this doesn’t mean you should neglect geo-targeting in Google Search Console (or Bing or Yandex Webmaster Tools) — you should definitely use those options. However, search engines use a number of different clues to help them handle international content, and geo-targeting settings do not trump those other signals.

Search engines have revealed some of the international ranking factors they use. Here are some that have been confirmed:

  • Translated content of the page
  • Translated URLs
  • Local links from ccTLDs
  • NAP info — this could also include local currencies and links to Google My Business profiles
  • Server location*

*Note that I included server location in this list, but with a caveat — we’ll talk more about that in a bit.

You need to take into account all of these factors, and not just some of them.

Myth #5: Why reinvent the wheel? There are multinational companies who have invested millions in R&D — just copy what they do.

The problem here is that large multinational companies don’t always prioritize SEO. They make SEO mistakes all the time. It’s a myth that you should look to Fortune 500 websites or top e-commerce websites to see how they structure their sites; they don’t always get it right. Imitation may be the best form of flattery, but it shouldn’t replace careful thought.

Besides, what the multinational companies do in terms of site structure and SEO differs widely. So if you were to copy a large brand’s site structure, which should you copy? Apple, Amazon, TripAdvisor, Ikea…?

Myth #6: Using URL parameters to indicate language is OK.

Google recommends against this, and from my experience, it’s definitely best to avoid URL parameters to indicate language or region.

What this looks like in the wild is:

http://www.example.com?language=english

or

http://www.example.com/example/?lang=en

…where the target language or region of the page changes depending on the parameter. The problem is that parameters aren’t dependable. Sometimes they’ll be indexed, sometimes not. Search engines prefer unique URLs.

Myth #7: I can just proxy localized content into my existing URLs.

In this situation, a website will use the IP address or the Accept-Language header of a user to detect their location or browser language preference, then change the content of the page based on that information. So the URL stays the same, but the content changes.

Google and Bing have clearly said they don’t like parameters and recommend keeping one language on one URL. Proxied content, content served by a cookie, and side-by-side translations all make it very problematic for search engines to index a page in one language. Search engines will appear to crawl from all over the world, so they’ll get conflicting messages about the content of a page.

Basically, you always want to have 1 URL = 1 version of a page.

Google has improved and will continue to improve its locale-aware crawling. In early 2015, they announced that Googlebot will crawl from a number of IP addresses around the world, not just the US, and will use the Accept-Language header to see whether your website is locale-adaptive, changing the content of the page depending on the user. But in the same breath, they made it very clear that this technology is not perfect, that it does not replace the recommendation to use hreflang, and that they still recommend you NOT use locale-adaptive content.

Myth #8: Adding hreflang tags will help my multinational content rank better.

Hreflang tags are one of the most powerful tools in the international SEO toolbox. They’re foundational to a successful international SEO strategy. However, they’re not meant to be a ranking factor. Instead, they’re intended to ensure the correct localized page is shown in the correct localized version of Google.

In order to get hreflang tags right, you have to follow the documentation exactly. With hreflang, there is no margin for error. Make sure to use the correct language (in ISO 639-1 format) and country codes (in ISO 3166-1 Alpha 2 format) when selecting the values for your hreflang tags.

Hreflang requirements:

  • Exact ISO codes for language, and for language-country if you target by country
  • Return tags
  • Self-referential tags
  • Point to correct URLs
  • Include all URLs in an hreflang group
  • Use page tags or XML sitemaps, preferably not both
  • Use HTTP headers for PDFs, etc.

Be sure to check your Google Search Console data regularly to make sure no return tag errors or other errors have been found. A return tag error is when Page A has an hreflang tag that points to Page B, but Page B doesn’t have an hreflang tag pointing back to Page A. That means the entire hreflang association for that group of pages won’t work, and you’ll see return tag errors for those pages in Google Search Console.
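As an illustration, here’s a minimal sketch of a valid pair of pages (placeholder URLs). Each page carries a self-referential tag plus a tag pointing to its alternate, satisfying the return-tag requirement:

```html
<!-- In the <head> of https://www.example.com/en-us/ (Page A) -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/" />

<!-- In the <head> of https://www.example.com/en-gb/ (Page B) -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/" />
```

If Page B dropped the en-us line pointing back to Page A, Google Search Console would flag a return tag error and the association for the pair would stop working.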

Either the page tagging method or the XML hreflang sitemap method works well. For some sites, an XML sitemap can be advantageous because it eliminates the code bloat of page tags. Whichever implementation allows you to add hreflang tags programmatically is good. There are tools on the market to assist with page tagging if you use one of the popular CMS platforms.
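For comparison, the XML sitemap method expresses the same group with xhtml:link entries. Here’s a minimal sketch with placeholder URLs, following the format Google documents:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/en-us/</loc>
    <!-- Every URL in the group lists every alternate, itself included -->
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/" />
  </url>
  <url>
    <loc>https://www.example.com/en-gb/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/" />
  </url>
</urlset>
```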


Myth #9: I can’t use a canonical tag on a page with hreflang tags.

When it comes to hreflang tags AND canonical tags, many eyes glaze over. This is where things get really confusing. I like to keep it super simple.

The simplest thing is to keep all your canonical tags self-referential. This is a standard SEO best practice anyway. Regardless of whether you have hreflang tags on a page, you should be implementing self-referential canonical tags.
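Put together, the head of a localized page might look like this minimal sketch (placeholder URLs). The canonical points at the page itself and leaves the hreflang group untouched:

```html
<!-- In the <head> of https://www.example.com/en-gb/ -->
<link rel="canonical" href="https://www.example.com/en-gb/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/" />
```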

Myth #10: I can use flag icons on my site to indicate the site’s language.

Flags are not languages — there’s even a whole website dedicated to talking about this common myth: http://www.flagsarenotlanguages.com. It has many examples of sites that mistakenly use flag icons to indicate languages.

For example, the UK’s Union Jack doesn’t represent all speakers of English in the world. Thanks to the course of history, there are at least 101 countries where English is a common tongue. Using one country’s flag to represent speakers of a language is off-putting for users who speak the language but aren’t from that country.

A common mistake is using flag icons to indicate language; a better (and more creative) approach is to replace the flag icons with localized greetings.


If you have a multilingual site, you should not use flags to represent languages. Instead, use the name of each language, written in that language: English should be “English,” Spanish should be “Español,” German should be “Deutsch,” and so on. You’d be surprised how many websites forget to use localized language or country spellings.
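For instance, a language selector along these lines (an illustrative sketch; the URLs and class name are made up):

```html
<!-- Each language is labeled in its own language, with no flag icons -->
<nav class="language-selector" aria-label="Language">
  <ul>
    <li><a href="https://www.example.com/en/" hreflang="en">English</a></li>
    <li><a href="https://www.example.com/es/" hreflang="es">Español</a></li>
    <li><a href="https://www.example.com/de/" hreflang="de">Deutsch</a></li>
  </ul>
</nav>
```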

Myth #11: I can get away with automated translations.

The technology for automated translations or machine translations has been improving in recent years, but it’s still better to avoid automated translations, especially machine translation that involves no human editing.

Automatic translations can be inaccurate and off-putting. They can hurt a website trying to rank in a competitive landscape. A great way to get an edge on your competitors is to use professional, high-quality native translators to localize your content into your target languages. High-quality localization is one of the key factors in improving your rankings when it comes to international SEO.

If you have a very large amount of content that you cannot afford to translate, choose some of the most important content for human translation, such as your main category and product pages.

Myth #12: Whichever site layout and user experience works best in our core markets should be rolled out across all our markets.

This is something I’ve seen happen on many, many sites, and it was part of the reason why eBay failed in China.

Porter Erisman tells the story in his book Alibaba’s World, which I highly recommend. He spoke of how, when eBay and Alibaba were duking it out in China, eBay made the decision to apply its Western UX principles to its Chinese site.

In Alibaba’s World, Erisman writes about how eBay “eliminated localized features and functions that Chinese Internet users enjoyed and forced them to use the same platform that had been popular in the US and Germany. Most likely, eBay executives figured that because the platform had thrived in more industrialized markets, its technology and functionality must be superior to a platform from a developing country.

“Chinese users preferred Alibaba’s Taobao platform over eBay, because it had an interface that Chinese users were used to – cute icons, flashing animations, and had a chat feature that connected customers with sellers. In the West, bidding starts low and ends high, but Chinese users preferred to haggle with sellers, who would start their bids high and end low.”

From this story, you can tell how localization, in terms of site design, UX, and holistic business strategy, can be of paramount importance.

Take Lush’s Japanese site: it has bright colors, a lot going on, almost complete localization into Japanese, and a prominent chat box in the bottom right.


Compare that to the Lush USA site: there’s a lot more white space, fewer tiles, and the chat box is only a small button on the right sidebar.

They’ve made the effort to adjust the layout according to how they want to express their brand in each market, rather than just swapping localized tiles into the same CMS layout. Yet in both markets many elements are similar, too. They’re a good example of keeping a unified global brand while leaving plenty of room for local expression.

The key to success internationally is localizing your online presence while at the same time having a unified global brand. From an SEO perspective, you should make sure there’s a logical organization to your global URLs so that localized content can be geo-targeted by subdirectory, subdomain, or domain. You should focus on getting hreflang tags right, etc. But you should also work with a content strategy team to make sure that there will be room for trans-creation of content, as well as with a UX design team to make sure that localized content can be showcased appropriately.

Design, UX, site architecture — all of these things play increasingly important roles in SEO. By localizing your design, you’re reducing duplicate content and you’re potentially improving your site engagement metrics (and by corollary, your clickstream data).

Things that an SEO definitely wants to localize are:

  • URLs
  • Meta titles & descriptions
  • Navigation labels
  • Headings
  • Image file names, internal anchor text, & alt text
  • Body content

Make sure to focus on keyword variations between countries, even within the same language. For example, there are differences in names and spellings for many things in the UK versus the US. A travel agency might describe their tours to a British audience as “tailor-made, bespoke holidays,” while they would tell their American audience they sell “customized vacation packages.”
If you used the same keywords to target all countries that share a common tongue, you’d be losing out on the ability to choose the best keywords for each country. Take this into account when considering your keyword optimization.

Myth #13: We can just use IP sniffing and auto-redirect users to the right place. We don’t need hreflang tags or any type of geo-targeting.

A lot of sites use some form of automatic redirection, detecting the user’s IP address and redirecting them to another website or to a different page on their site that’s localized for their region. Another common practice is to use the Accept-Language header to detect the user’s browser language preference, redirecting users to localized content that way.

However, Google recommends against automatic redirection. It can be inaccurate, can prevent users and search engines from indexing your whole site, and can be frustrating for users when they’re redirected to a page they don’t want. In fact, hreflang annotations, when correctly added to all your localized content and correctly cross-referenced, should eliminate or greatly reduce the need for any auto-redirection. You should avoid automatic redirection as much as possible.

Here are all the reasons (that I can think of) why you shouldn’t do automatic redirection:

  • User agents like Googlebot may have a hard time reading all versions of your page if you keep redirecting them.
  • IP detection can be inaccurate.
  • Multiple countries can have multiple official languages.
  • Multiple languages can be official in multiple countries.
  • Server load time can be negatively affected by having to add in all these redirects.
  • Shared computers between spouses, children, etc., could have different language preferences.
  • Expats and travelers may try to access a website that assumes they’re locals, making it frustrating for the users to switch languages.
  • Internet cafes, hotel computer centers, and school computer labs may have diverse users.
  • A user may prefer to browse in one language but transact in another. For example, many people are fluent in English and will search in English if they think they’ll get better results that way, but when it comes to the checkout process, especially when reading legalese, they prefer to switch to their native language.
  • A person may send a link to a friend who lives in a different place and therefore can’t see the same thing the sender sees.

Instead, a much better user experience is to show a small, unobtrusive banner when you detect that a user may find another portion of your site more relevant. TripAdvisor and Amazon do a great job of this, and the Google Webmaster Blog has shared examples of this pattern done well.
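In practice, this can be as simple as a dismissible strip at the top of the page. A hypothetical sketch, with made-up URL, class names, and wording:

```html
<!-- Shown only when geo-detection suggests a better-matching locale;
     the page content itself is never swapped or auto-redirected -->
<div class="locale-banner" role="region" aria-label="Region suggestion">
  <p>
    It looks like you’re in the United Kingdom.
    <a href="https://www.example.com/en-gb/">Visit our UK site</a> or
    <button type="button" class="locale-banner-dismiss">stay here</button>.
  </p>
</div>
```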


One exception to the never-use-auto-redirection rule: when a user selects a country and/or language preference on your site, you should store that preference in a cookie and redirect the user to their preferred locale whenever they visit in the future. Make sure they can set a new preference at any time, which will reset the cookie.

On that note, always make sure to have a country and/or language selector on every page of your website, placed so it’s easy for users to see and for search engine bots to crawl.

Myth #14: I need local servers to host my global content.

Many website owners believe they need local servers in order to rank well abroad. This is because Google and Bing clearly stated that local servers were an important international ranking factor in the past.

However, Google confirmed last year that local server signals are not as important as they once were. With the rise in popularity of CDNs, local servers are generally not necessary. You definitely need a local server for hosting sites in China, and it may be useful in some other markets like Japan. It’s always good to experiment. But as a general rule, what you need is a good CDN that will serve up content to your target markets quickly.

Myth #15: I can’t have multi-country targeted content that’s all in the same language, because then I’d incur a duplicate content penalty.

This myth is born from an underlying fear of duplicate content. Something like 30% of the web contains some dupe content (according to a recent RavenTools study). Duplicate content is a fact of life on the web. You have to do something spammy with that duplicate content, such as create doorway pages or scrape content, in order to incur a penalty.

Geo-targeted, localized content is not spammy or manipulative. There are valid business reasons for wanting to have very similar content geared for different users around the world. Matt Cutts confirmed that you will not incur a penalty for having similar content across multiple ccTLDs.

The reality is, you CAN have multi-country targeted content in the same language. It’s just that you need to combine hreflang tags + localization in order to get it right. Here are some ways to avoid duplicate content problems:

  • Use hreflang tags
  • Localized keyword optimization
  • Adding in local info such as telephone numbers, currencies, addresses in schema markup, and Google My Business profiles (see the sketch after this list)
  • Localized HTML sitemaps
  • Localized navigation and home page features that cater to specific audiences.
  • Localized images that resonate with the audience. American football, for example, is not very popular outside the US. Also, be mindful of holidays around the world and of current events.
  • Transcreated content (where you take an idea and tailor it for a specific locale), rather than translation (which is more word-for-word than concept-for-concept)
  • Obtain links from local ccTLDs pointing to your localized content
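As a sketch of the schema markup point above (all values are placeholders), local details can be embedded in the page as a schema.org LocalBusiness block:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Ltd",
  "telephone": "+44 20 7946 0958",
  "currenciesAccepted": "GBP",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 High Street",
    "addressLocality": "London",
    "postalCode": "SW1A 1AA",
    "addressCountry": "GB"
  }
}
</script>
```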

As you can see, there are many common myths surrounding international SEO, but hopefully you’ve gained some clarity and feel better equipped to build a great global site strategy. I believe international SEO will continue to be of growing interest, as globalization is a continuing trend. Cross-border e-commerce is booming — Facebook and Google are looking at emerging markets in Africa, India, and Southeast Asia, where more and more people are going online and getting comfortable buying online.

International SEO is ripe for optimization — so you, as SEO experts, are in a very good position if you understand how to set your website up for international SEO success.


The 9 Most Common Local SEO Myths, Dispelled

Posted by JoyHawkins

[Estimated read time: 7 minutes]

I regularly hear things in the Local SEO world that many people believe, but which are completely false. I wanted to put some of these myths to rest by listing out the top 9 Local SEO myths that I run into most frequently.

1. Deleting your listing in Google My Business actually removes the listing from Google.

Business owners will often question how they can get rid of duplicate listings on Google. One of the more common things people try is claiming the duplicate and then deleting it from the Google My Business dashboard. When you go to delete a listing, you receive a scary warning asking if you're sure you want to proceed.
The truth is, removing a listing from Google My Business (GMB) just makes the listing unverified. It still exists on Google Maps and will often still rank, provided you didn’t clear out all the categories/details before you deleted it. The only time you’d really want to delete a listing via GMB is if you no longer want to manage the listing.

Google confirms this in their help center article:

When you delete a local page, the corresponding listing will be unverified and you will no longer be able to manage it. Google may still retain business information from the page and may continue to show information about the business on Maps, Search, and other Google properties, including marking the business as permanently closed, moved, or open, depending on the information that’s known about the business.

2. Failure to claim your page means your business won’t rank anywhere.

I’m sure most of you have received those annoying phone calls that say: “Your business is not currently verified and will vanish on Google unless you claim it now!”

First of all, consider the authority of the people who are calling you. I can say with certainty they are not experts in this industry, or they wouldn’t resort to robo-calling to make sales.

The Moz Local Search Ranking Factors survey does list verifying your listing as the #13 factor for impacting ranking in the 3-pack, but that's often because business owners add more data to the listing when they verify it. If they left the listing exactly as it was before verifying, the verification status alone would likely not impact the ranking much at all. We often see unverified pages outranking verified ones in really competitive markets.

3. “Professional/Practitioner” listings on Google are considered duplicates and can be removed.

Google often creates listings for the actual public-facing professionals in an office (lawyers, doctors, dentists, realtors, etc.), and the owner of the practice usually wants them to disappear. Google will get rid of the listing for the professional in two different cases:

a) The professional is not public-facing. Support staff, like hygienists or paralegals for example, don’t qualify for a listing and Google will remove them if they exist.

b) The business only has one public-facing individual. For example, if you have a law firm with only one lawyer, Google considers this to be a “Solo Practitioner” and will merge the listing for the professional with the listing for the office. Their guidelines state to “create a single page, named using the following format: [brand/company]: [practitioner name].”

In the case that the professional has left your office, you can have the listing marked as moved if the professional has retired or is no longer working in the industry. This will cause it to vanish from the search results, but it will still exist in Google’s back-end. If the professional has moved to a different company, you should have them claim the listing and update the address/phone number to list their new contact information.

4. Posting on G+ helps improve your ranking.

Phil Rozek explains this best: “It’s nearly impossible for people to see your Google+ posts unless they search for your business by name. Google doesn’t include a link to your ‘Plus’ page in the local pack. Google doesn’t even call it a ‘Plus’ page anymore. Do you still believe being active on Google+ is a local ranking factor?”

No, posting on G+ will not cause your ranking to skyrocket, despite what the Google My Business phone support team told you.

5. “Maps SEO” is something that can be effectively worked on separately from “Organic SEO.”

I often get small business owners calling me saying something along the lines of this: “Hey, Joy. I have an SEO company and they’re doing an awesome job with my site organically, but I don’t show up anywhere in the local pack. Can I hire you to do Google Maps optimization and have them do Organic SEO?”

My answer is, generally, no. “Maps Optimization” is not a thing that can be separated from organic. At Local U in Williamsburg, Mike Ramsey shared that 75% of ranking local listings also rank organically on the first page. The two are directly connected — a change that you make to your site can have a huge influence on where you rank locally.

If you're a local business, it's in your best interest to have an SEO company that understands Google Maps and how the 3-pack works. At the company I work for, we've always made it a goal to get the business ranked both organically and locally, since it's almost impossible to get in the 3-pack without a strong organic ranking and a website with strong local signals.

6. Google employees are the highest authority on which ranking signals you should pay attention to.

Google employees are great; I love reading what they come out with and the insight they provide. However, as David Mihm pointed out at Local U, those employees have absolutely no incentive to divulge any top-secret tips for getting your website to rank well. Here are some recent examples of advice given from Google employees that should be ignored:

  1. Duplicate listings will fix themselves over time.
  2. Posting on Google+ will help your ranking (advice given from phone support reps).
  3. If you want to rank well in the 3-pack, just alter your business description.

Instead of trusting this advice, I always suggest that people make sure what they’re doing matches up with what the pros are saying in big surveys and case studies.

7. Setting a huge service area means you’ll rank in all kinds of additional towns.

Google allows service-area businesses to set a radius around their business address to demonstrate how far they’re willing to travel to the customer. People often set this radius really large because they believe it will help them rank in more towns. It doesn’t. You will still most likely only rank in the town you’re using for your business address.

8. When your business relocates, you want to mark the listing for the old location as closed.

The Google My Business and Google MapMaker rules don't agree on this one. Anyone on the Google MapMaker side would tell a business to mark a listing as "closed" when it moves. But doing so causes the listing to display a big, ugly, red "permanently closed" label when anyone searches for your business name.

If your listing is verified through Google My Business, all you need to do is edit the address inside your dashboard when you move. If there’s an unverified duplicate listing that exists at your old address, you want to make sure you get it marked as “Moved.”

9. Google displays whatever is listed in your GMB dashboard.

Google gives business owners the ability to edit information on their listing by verifying it via Google My Business. However, whatever data the owner inputs is just one of many sources that Google draws information from. Google updates verified listings all the time by scraping data from the business website, taking edits made on Google Maps/MapMaker, and pulling from third-party data sources. In one recent case I've seen, Google repeatedly updated an owner-verified listing with incorrect business hours because it couldn't properly read the hours listed on the business's website.
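One way to reduce that kind of misreading is to publish your hours in structured data rather than free-form text. Here's a minimal sketch using schema.org LocalBusiness markup; the business name, phone number, and hours below are made-up placeholders:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Dental Care",
      "telephone": "+1-555-0123",
      "openingHoursSpecification": [{
        "@type": "OpeningHoursSpecification",
        "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
        "opens": "09:00",
        "closes": "17:00"
      }]
    }
    </script>

Markup like this is a hint, not a guarantee; Google can still override owner-provided details, but unambiguous structured data gives its scrapers far less room to guess wrong.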

Were you surprised by any of those Local SEO myths? Are there others that you come across regularly? I’d love to hear about it, so please leave a comment!



Moz Blog


Seven Myths About Solar Panels Debunked





In the early days, it was easy to dismiss solar energy as science fiction. Not only was the technology unreliable and inconsistent, it was also very expensive. But now that affordable residential models are available, people are finally taking this form of energy seriously. Not surprisingly, critics have grown more desperate in recent years. Whether they are trying to protect an established power source or simply don't believe in solar panels, they have been spreading lies and half-truths at an unprecedented pace. Take a moment to debunk some of our favorites.

Myth: Only the rich can afford them.

Fact: According to the Solar Energy Industries Association (SEIA), the price of installation has dropped more than 73 percent since 2006. Generous state and federal subsidies have also made solar panels affordable for Americans across all economic strata.

Myth: They need much maintenance.

Fact: Because they have no moving parts, residential systems require minimal maintenance. The only thing installers advise is to check for dirt and debris every few months. Apart from this, these units are virtually maintenance-free.

Myth: They will not work in certain areas.

Fact: It is true that solar panels work most efficiently in areas that receive a lot of sunlight, but they can still produce energy from ambient light on rainy or cloudy days. As such, they will work well in most parts of the country.

Myth: They damage roofs.

Fact: In fact, the opposite is true. Many studies have found that panels effectively protect roof surfaces from the weather. We should also add that the equipment, when installed correctly, does not damage your roof in any way. Panels can be put in or taken out without damaging the shingles or the wood beneath them.

Myth: It hurts resale value.

Fact: Again, this is not even close to true! Several studies have shown that a residential system can add thousands of dollars to the value of the average home. In fact, a recent report found that owning a system could add up to $15,000 to the resale value of a home.

Myth: They are not made to last.

Fact: As with any new technology, there are bound to be some people who have had a bad experience. Critics have seized on scattered reports from disgruntled owners to question the structural integrity of residential systems. Are they really tough and durable enough to withstand the elements? According to the companies that sell them, they are! That's why most vendors offer warranties of 20 to 25 years on their units. Not surprisingly, most units have proven to last that long or longer.

Myth: They are hard to use.

Fact: Again, critics seeking to scare people away from new technologies are telling only half the story. Yes, there was a time when you had to be a techie to harness the power of the sun. But thanks to recent improvements, today's solar panels are as easy to use as a microwave, DVD player, or any other appliance in your home.

Now that you know the truth about solar energy, you can make your own decision about the fastest-growing form of alternative energy.

Latest solar news



The Myths & Realities Of How The EU's New "Right To Be Forgotten" In Google Works

Depending on what you read, a “Right To Be Forgotten” court ruling in the European Union this week means that now anyone can ask for anything to be removed from Google, which will soon collapse under an overwhelming number of requests. In reality, it’s far more limited than it…



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


8 Myths of the Zombie Content Apocalypse


You’ve seen the conversation and heard the wild conjecture about the content apocalypse.

Let’s be honest. Marketing zombies caused this problem.

That’s right, you heard me. Marketing zombies.

Their undead shuffling has spammed the world with a ceaseless stream of bad posts, bad emails, bad white papers, and bad videos. Perfectly good marketers and writers have been bitten, turning into undead content machines, oozing black goop all over the interwebs.

You can see their moans all over Twitter, Facebook, and Google+.

But the content apocalypse is just a cautionary horror story shared by marketers who are sick and tired of seeing their friends turn into zombies.

Here are eight of the most common myths shared by the zombie-fearing, spam-coated digerati:

1. People are sick of content

Myth: The undead shuffle will turn people away from the horror show, forever swearing off all content.

Research shows that increasing smartphone and tablet use is translating into demand for more media viewing choices. People want new, fresh content developed specifically for the evolving media devices they are using. (Hat tip to Shel Holtz.)

2. Google will punish all content creators

Myth: The mighty anti-venom created by Google in the form of Penguin and Panda updates will become so strong that all blogs will die!

Far from it.

The more natural and unique your content is, the more likely normal people will enjoy it. And as a result, so will Google.

Many of Google’s updates seek to reward well-referred, natural content while eliminating link schemes and duplicate content.

3. We can’t get away from content zombies

Myth: The plague of marketing zombies will forever clog our Internet pipes. There’s no way we can send magic clubs into the ethernet and hit them all in the head!

The Copyblogger team (and extended family of writers) never subscribes to violence. We do, however, encourage unsubscribing from content marketing zombies. Try it, you might find your social network experiences are more enjoyable.

4. The zombie moans are too loud

Myth: Our content can’t ever beat out the noise created by the marketing zombies!

Some of this is caused by people who have had success, but are no longer being read. Others perpetuate this myth after a short attempt at creating content.

The content zombies have attacked and are moaning out the same content! It’s time to fight back and differentiate!

Creating content that stands out requires a few distinct skill sets:

  • Creative approaches to media production
  • Strong headline writing
  • New takes on topics
  • Experience-based usefulness

These all work to make content special.

5. Only a superhero can beat the zombies

Myth: Without a well-known social personality producing content, we can’t rise above the moans!

Ok, this one’s definitely not true.

Influencer myths would have you believe that without a well-known blogger or established personality, you cannot create new content that will be read. Or, if you have one and they depart, you will sink into the zombie ooze.

In reality, readerships tend to be blood loyal to useful and entertaining content presented with undying consistency.

Readers share content and socially validate it with their friends and with search engines. If you have an established readership and you continue to produce great quality content without X personality, your content will still rise to the top.

In one case, I have a client who lost a well-known personality last year. We continued with blue chip writers who delivered pragmatic, useful intelligence week in and week out. Over the course of one calendar year, traffic literally quintupled.

Don’t believe the hype … you can beat the content zombies!

6. You have to pay to exterminate content zombies

Myth: Native advertising is the only way to rise above the content noise.

Native advertising seems to be the route most corporate types take to be heard above the zombie din. In truth, consumers are often confused by sponsored content.

When it comes down to determining effectiveness, one study shows that sponsored content only works when it has context for the reader. Paying doesn’t get you more results unless the content is specifically engineered to serve the reader.

By the way, content that is not native advertising also falls by the wayside if it is not entertaining or useful. Notice the pattern that is emerging?

Bad content fails, sponsored or not.

7. The black ooze prevents content from producing results

Myth: Customers won’t use our content to buy because they don’t trust it … thanks to all of the free black zombie ooze on the interwebs!

The more strategic your marketing program is in its design, from usefulness to value, the more impact it will have. Create “Rainmaker content” and your business will see outcomes. Create “me too” content, like spammy blog posts, and watch the black ooze pour out of your social media accounts while the red ink seeps from the bottom line.

8. Once a zombie, always a zombie

Myth: Once you become a content zombie, you are forever infected and can never come back to humanity.

Whether it was Mark Schaefer’s originating content shock post, Shel Holtz’s passionate defense of content marketing, or our own Sonia Simone’s measured response, everyone focused on encouraging content creators to develop stronger, more differentiated products.

It’s never too late to improve content quality and stand out.

Are you running from the ooze, too? Let us know in the comments …

Image credit: Zombie Walk by Ciao Schiavo.

About the Author: Geoff Livingston is an author, public speaker and strategist who helps companies and nonprofits develop outstanding marketing programs. He brings people together, virtually and physically for business and change.

The post 8 Myths of the Zombie Content Apocalypse appeared first on Copyblogger.


Copyblogger



UX Myths That Hurt SEO – Whiteboard Friday

Posted by randfish

User experience and SEO: friends or enemies? They've had a rocky past, but it's time we all realized that they live better in harmony. Dispelling the negative myths about how UX and SEO interact is the first step in improving both the look and search results of your website. 

In today's Whiteboard Friday, Rand talks about some persistent UX myths that we should probably ignore.

Have anything to add that we didn't cover? Leave your thoughts in the comments below!


Video Transcription

"Howdy, SEOmoz fans, and welcome to another edition of Whiteboard Friday. This week I wanted to talk a little about user experience, UX, and the impact that it has on SEO.

Now, the problem historically has been that these two worlds have had a lot of conflict, especially in the late '90s and early 2000s, and that conflict has stayed around a little longer than I think it should have. I believe the two are much more combined today. But there are a few things that many people, including those who invest in user experience, believe to be true about how people use the web, and about how certain types of functionality and design choices supposedly hurt SEO. So I want to dispel some of those myths and give you things that you can focus on and fix in your own websites and in your projects so that you can help not only your SEO, but also your UX.

So let's start with number one here. Which of these is better or worse? Let's say you've got a bunch of form fields that you need a user to fill out to complete some sort of a registration step. Maybe they need to register for a website. Maybe they're checking out of an e-commerce cart. Maybe they're signing up for an event. Maybe they're downloading something.

Whatever it is, is this better, putting all of the requests on one page so that they don't have to click through many steps? Or is it better to break them up into multiple steps? What research has generally shown and user experience testing has often shown is that a lot of the time, not all of the time certainly, but a lot of the time this multi-step process, perhaps unintuitively, is the better choice.

You can see this in a lot of e-commerce carts that do things very well. Having a single, simple, direct, one step thing that, oh yes, of course I can fill out my email address and give you a password. Then, oh yeah, sure I can enter my three preferences. Then, yes, I'll put in my credit card number. Those three things actually are more likely to carry users through a process because they're so simple and easy to do, rather than putting it all together on one page.

I think the psychology behind this is that this just feels very overwhelming, very daunting. It makes us sort of frustrated, like, "Oh, do I really have to go through that?"

I'm not saying you should immediately switch to one of these, but I would fight against this whole, "Oh, we're not capturing as many registrations. Our conversion rate is lower. Our SEO leads aren't coming in as well, because we have a multi-step process, and it should be single step." The real key is to usability test to get data and metrics on what works better and to choose the right path. Probably if you have something small, splitting it up into a bunch of steps doesn't matter as much. If you have something longer, this might actually get more users through your funnel.

Number two. Is it true that if we give people lots of choice, then they'll choose the best path for them, versus if we only give people a couple options that they might not go and take the action that they would have, had we given them those greater choices? One of my favorite examples from this, from the inbound marketing world, the SEO world, the sharing world, the social world is with social sharing buttons themselves. You'll see tons of websites, blogs, content sites, where they offer just an overwhelming quantity of tweet this, share this on Facebook, like this on Facebook, like us on Facebook, like our company page on Facebook, plus one this on Google+, follow us on Google+, embed this on your own webpage, link to this page, Pinterest this.

Okay. Yes, those are all social networks. Some of them may be indeed popular with many of your users. The question is:  Are you overwhelming them and creating what psychologists have often called the "paradox of choice," which is that we as human beings, when we look at a long list of items and have to make a decision, we're often worse at making that decision than we would be if we looked at a smaller list of options? This is true whether it's a restaurant menu or shopping for shoes or crafting something on the Internet. Etsy has this problem constantly with an overwhelming mass of choice and people spending lots of time on the site, but then not choosing to buy something because of that paradox of choice.

What I would urge you to do is not necessarily to completely get rid of this, but maybe to alter your philosophy slightly to the three or four or if you want to be a little religious about it, even the one social network or item that you think is going to have the very most impact. You can test this and bear it out across the data of your users and say, "Hey, you know what? 80% of our users are on Facebook. That's the network where most of the people take the action even when we offer them this choice. Let's see if by slimming it down to just one option, Twitter or Facebook or just the two, we can get a lot more engagement and actions going." This is often the case. I've seen it many, many times.

Number three. Is it true that it's absolutely terrible to have a page like this that is kind of text only? It's just text and spacing, maybe some bullet points, and there are no images, no graphics, no visual elements. Or should we bias to, hey let's have a crappy stock photo of some guy holding up a box or of a team smiling with each other?

In my experience, and a lot of the tests that I've seen around UX and visual design stuff, this is actually a worse idea than just going with a basic text layout. If for some reason you can't break up your blog post, your piece of content, and you just don't have the right visuals for it, I'd urge you to break it up by having different sections, by having good typography and good visual design around your text, and I'd urge you to use headlines and sub-headlines. I wouldn't necessarily urge you to go out and find crappy stock photos, or if you're no good at creating graphics, to go and make a no good graphic. This bias has created a lot of content on the web that in my opinion is less credible, and I think some other folks have experienced that through testing. We've seen it a little bit with SEOmoz itself too.

Number four. Is it true that people never scroll, that all the content that you want anyone to see must be above the fold on a standard web page, no matter what device you think someone might be looking at it on? Is that absolutely critical?

The research reveals this is actually a complete myth. Research tells us that people do scroll, that over the past decade, people have been trained to scroll and scroll very frequently. So content that is below the fold can be equally accessible. For you SEO folks and you folks who are working on conversion rate optimization and lead tracking, all that kind of stuff, lead optimization, funnel optimization, this can be a huge relief because you can put fewer items with more space up at the top, create a better visual layout, and draw the eye down. You don't have to go ahead and throw all of the content and all of the elements that you need and sacrifice some of the items that you wanted to put on the page. You can just allow for that scroll. Visual design here is obviously still critically important, but don't get boxed into this myth that the only thing people see is the above the fold stuff.

Last one. This myth is one of the ones that hurts SEOs the most, and I see it a lot, especially when consultants, agencies, designers, or developers are fighting with people on an SEO team or a marketing team, saying, "Hey, we are aiming for great UX, not great SEO." I strongly disagree with this premise. This is a false dichotomy. These two, in fact, are so tied and interrelated that you cannot separate them. The findability, the discoverability, the ability for a page to perform well in search engines, which remains the primary way that we find new information on the Internet, is absolutely as critical as having a great user experience on the website itself and through the website's pages.

If you're not tying these two together, or if you're like this guy and you think this is a fight or a competition, you are almost certainly doing one of these two wrong. Oftentimes it's SEO, right? People believe, hey we have to put this keyword in here this many times, and the page title has to be this big on the page. Or, oh we can't have this graphic here. It has to be this type of graphic, and it has to have these words on it.

Usually that stuff is not nearly as important as it was, say, a decade ago. You can have fantastic UX and fantastic SEO working together. In fact, they're almost always married.

If you're coming up with problems like these, please leave them in the comments. Reach out to me, tweet to me and let me know. I guarantee you almost all of them have a creative solution where the two can be brought together.

All right, gang, love to hear from you, and we will see you again next week for another edition of Whiteboard Friday. Take care."

Video transcription by Speechpad.com



SEOmoz Daily SEO Blog


B2B Social Media: Video of Jay Baer destroying social media myths

B2B marketers often hold misguided beliefs about the impacts of social media on their industry. Watch the excerpt video to learn three takeaways on social media myths from Jay Baer’s keynote discussion at the B2B Summit 2011.
MarketingSherpa Blog

