Friday, November 4, 2011

Is White Hat SEO Even Risky?

Google's advice to website owners is that if you create a good user experience and don't violate its guidelines, all will be good. Yes, you'll still have to compete against other sites and you'd certainly benefit from a marketing plan, but essentially the risk of something bad happening that isn't of your own doing is nil.

And yet I've seen contrary evidence after having worked on two sites this year that did nothing out of the ordinary except change their domain names (one voluntarily, one because of a trademark issue). Despite following Matt Cutts' advice and the advice of numerous SEOs, me included, both sites suffered significant traffic declines, which, in turn, had a real impact on revenues.

Simple Domain Change

The first case is the simplest form of a domain change in that all the URL paths and filenames stayed the same. In addition, content wasn't changed significantly, navigation stayed the same, and functionality was identical. The sorts of things that did change were the logo and the addition of some extra white space in the page layout. The migration itself included one-to-one 301 redirects, an updated XML sitemap, and notifying Google of the domain change via its Webmaster Tools.
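A migration like this can be sanity-checked with a tiny script before flipping the switch. The sketch below (hypothetical domains) builds the expected one-to-one 301 target for any old URL by swapping only the host and keeping the path and query untouched:

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "www.olddomain.com"   # hypothetical old domain
NEW_HOST = "www.newdomain.com"   # hypothetical new domain

def redirect_target(old_url: str) -> str:
    """Build the one-to-one 301 target: same scheme, path, query,
    and fragment -- only the host changes."""
    parts = urlsplit(old_url)
    return urlunsplit((parts.scheme, NEW_HOST, parts.path,
                       parts.query, parts.fragment))

print(redirect_target("http://www.olddomain.com/widgets/red.html?sort=price"))
```

Running each URL from the old XML sitemap through a check like this, and comparing against the live redirect responses, catches mapping mistakes before Google does.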

At first everything looked good. Traffic remained steady on the new domain compared to the old domain (using the same Google Analytics code made this trending analysis easy). Of course, during this time what we were seeing was the effect of the redirects from rankings of the old content and not Google's ranking of the new content.

A couple of weeks into the effort traffic dropped precipitously (down 52 percent). For those wondering, this domain change didn't occur around the time of any Panda updates - the previous update (2.4) was on August 12 and the next one (2.5) was a couple of weeks away on September 28. Also, the old site had not been affected by any of the Panda rollouts since they started.


Complex Domain Change

The second case is a little more complicated than the first. This time around not only did the domain change, but the URL structure was altered too. This meant that there was a greater chance of the 301 redirects being mapped incorrectly, which in turn could affect rankings for the new site. Fortunately, the developers of this site remained vigilant and any incorrectly mapped or missed redirects were fixed within moments of being discovered. In the end, only about 2 percent to 3 percent of the mappings required tweaking. The design changed significantly, but the content didn't and cross links were preserved.
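Auditing a restructured migration like this one mostly means comparing a crawl of the old site against the redirect map and flagging anything that falls through. A minimal sketch, with all paths hypothetical:

```python
# Hypothetical excerpt of an old-path -> new-path redirect map.
REDIRECT_MAP = {
    "/products.php?id=42": "/products/42-red-widget",
    "/about.html":         "/company/about",
}

def find_unmapped(old_paths):
    """Return crawled old paths that have no 301 target -- the
    mappings that would need fixing as soon as they're discovered."""
    return [p for p in old_paths if p not in REDIRECT_MAP]

crawled = ["/products.php?id=42", "/contact.html", "/about.html"]
print(find_unmapped(crawled))
```

In a migration where 2 to 3 percent of mappings need tweaking, a loop like this over the full old-site crawl is what keeps "moments of being discovered" realistic.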

As with the first case, there was no evidence of a problem at the beginning with traffic ramping up to previous levels (there was new Google Analytics code this time so traffic started at zero). However, once the new site was indexed, rankings for the old domain started to decline without a corresponding ranking increase for the new domain. Or for those who don't like to talk about rankings, traffic dropped 23 percent. This domain change kicked off right after the Panda 2.4 update in August. Since the domain was new, it presumably wasn't evaluated and, regardless, the traffic decline didn't happen until a week later. Panda 2.5, along with its subsequent tweaks, wasn't due for weeks.


Traffic declines of this magnitude are typically reserved for those who have been overly aggressive with their SEO. So what are we to make of such a situation when all the rules were followed? And what recourse do we have to remedy such issues when even the team receiving hundreds of thousands of dollars in ad spend won't talk to us regarding the organic search side of the business? I wish I had the answers.

What I find curious is that domain changes I've worked on prior to 2010 went well, with either no dip in traffic or a short-lived one. This suggests we don't just have to worry about the risk that Google won't behave as we expect, or even as Google tells us it will; we also have to worry about what we believe to be true not remaining so from year to year.

Google AdSense Reports For Ad Network Performance

Google added a new report that breaks down your earnings on Google's certified ad network through DoubleClick Ad Exchange.

In March 2010, Google launched the Google certified ad network, and it concerned many publishers. In fact, many publishers blocked it, leading Google to warn them not to block the ad network.

In the end, publishers just wanted more transparency and almost 20 months later, we have it.

Even more so, Google said it is "confident" that its "AdWords inventory can provide the highest paying ad." But advertisers are confident in Google, not in third-party payouts. The question Google still has to prove is whether it shows third-party ads only when it is best for the advertiser.

That being said, you can look for yourself by going to the Performance reports tab and clicking on 'Ad networks' in the side navigation.

Here is a screen shot of the report:



What have you learned from these reports?

Forum discussion at WebmasterWorld.

Google To Bill Developers For Maps API

Not totally unexpectedly, Google announced it will soon be charging Maps API developers for usage of the Google Maps API.

Google said that now that it is October, six months after it updated its terms of service, it will start enforcing the new pricing options for the API. It noted, "no site exceeding these limits will stop working immediately."

Instead, they will soon ask you for your credit card and in "early 2012" begin charging you for usage.
Your options include:
  • Reduce your usage to below the limits
  • Opt-in to paying for your excess usage at the rates given in the FAQ
  • Purchase a Maps API Premier license
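Deciding between those options starts with knowing how far over the limits you actually are. A back-of-the-envelope sketch; the 25,000 map loads/day figure is the free limit cited in Google's FAQ at the time, so treat it as an assumption and check the current terms:

```python
FREE_DAILY_LIMIT = 25_000   # free map loads/day per Google's 2011 FAQ (assumption)

def excess_loads(daily_loads, limit=FREE_DAILY_LIMIT):
    """For each day's map-load count, return how many loads
    would fall into the paid tier (zero if under the limit)."""
    return [max(0, loads - limit) for loads in daily_loads]

week = [18_000, 26_500, 31_000, 24_999, 40_000, 9_000, 25_000]
print(excess_loads(week))
```

If the excess is a thin sliver on a couple of days, reducing usage may be cheaper than a Premier license; if it's large every day, the math points the other way.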
Many in the developer community are upset, but not all.
A WebmasterWorld thread has one saying:
Considering the amount of money Google has spent building the best maps on the planet IMO, it's about time they started charging. Paying to use Google maps, just in order to maintain the street view alone, is worth it.
Some say that Google gives away an addictive drug and then once you're hooked, they start charging you for it.

To be honest, developers can always switch to Bing.

Forum discussion at WebmasterWorld.

SEO Stars Vs. SEO Celebrities

Rob Kerry has a wonderful post named Celebrities Killed The SEO Star. In short, Rob said, "I believe that celebrity SEOs, brands and blogs are feeding a generation of untested and poorly trained search marketers, who pass themselves off as SEO experts."

Ah, link bait at its best - at least within the SEO space.

I have known Rob for a long time, in fact, he blogged here in his earlier days.

Is Rob talking about me as an SEO celebrity? I am not going to suggest that. Why? Well, because he defined SEO celebs as "high profile SEO bloggers [who] recently ceased client work and personal projects, in order to appear impartial and trustworthy to their community." I was never an SEO who provided client work; I started off as an SEO blogger. But I made that decision to remain impartial and trustworthy, and it has worked, I think.

In any event, let me take this argument and flip it around a bit. Note, I write this while my house has been without electricity or heat since Saturday, and I am not expecting it back until this coming Saturday (my office, however, has power and heat).

Most SEO "Stars" I know of became an SEO star through mentions and vouches from SEO Celebrities. There are very few SEO Stars that are well known without being mentioned by blogs - such as SEOBook's blog. :)

Now, I am all for SEOs testing and using hard data. In fact, SEO Scientist and many others have been pushing for this for years. It has even gotten some well-respected SEOs in trouble for using data incorrectly.
There is room for both the SEO Celebrity and the SEO Star, but ultimately, results and SEO technique come from day-to-day practice, not blogging. I would think that is common sense.

Forum discussion at Sphinn.

Google Vince 2.0 = Google Panda 2.5.x?

One of the senior members at WebmasterWorld started a thread suggesting that the last group of Google Panda updates were more "Vince" update like than Panda update like.

Vince was an update that seemed to push big brands to the front of the results. Panda was more about demotion, whereas Vince was more about promotion, or at least that is how SEOs and webmasters feel.
The senior member, potentialgeek, said:

Vince and Panda work together. Vince promotes authority/trusted sites; Panda demotes thin/shallow sites. Vince 2.0 I believe gave more authority to sites that Vince 1.0 liked. The dial got turned up. After 2.0, Vince in my sector let the top two authority sites for widgets get many top rankings for related keyword search phrases; e.g., red widgets; blue widgets; white widgets, etc., etc. Before Vince they had no rankings for those phrases. After Vince 1.0 they got top 5; after Vince 2.0, top 1-2.
Technically, these updates are still likely Panda related but they can have characteristics of Vince.

Google's Matt Cutts Definitive Cloaking Video

Google's Matt Cutts recently did an almost nine-minute video on the topic of cloaking; he called it the "definitive cloaking video."

Here are the main points:
  • Cloaking is showing different content to users than to GoogleBot.
  • Cloaking is a violation of Google's quality guidelines.
  • Cloaking is high risk.
  • It is often used for deceptive reasons.
  • There is no such thing as white hat cloaking.
Matt then goes through various things to look for when a site might be cloaking.

What about geolocation or mobile devices? The general rule is you "do not need to worry about it," because it is not cloaking. Why? As long as you serve GoogleBot what U.S. users see, it is not cloaking. Same thing with mobile devices, because you should treat GoogleBot as a desktop user.

He gets into more detail in the almost nine-minute video below.


Forum discussion at Google+.

Google Indexing AJAX Content Via POST Requests: Facebook & Disqus

Recently, people started noticing that Google is able to index Facebook comments without problem. The tricky part is that those comments are rendered via AJAX and only retrievable via POST requests. GoogleBot, Google's spider, was previously unable to index that type of content on the web.

Now, Google announced it has "started experiments to rewrite POST requests to GET." Google still recommends using GET requests, but with AJAX and JavaScript-heavy sites becoming really popular, Google knows the importance of crawling this content.
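Google's recommendation to prefer GET amounts to putting the request state in the URL instead of in a POST body, so the resource has a stable address a crawler can fetch. A minimal sketch (the endpoint name is hypothetical):

```python
from urllib.parse import urlencode

def comments_url(thread_id: int, page: int = 1) -> str:
    """Expose the comment feed at a GET URL with query parameters,
    rather than behind a POST body, so crawlers can fetch it.
    The /comments.json endpoint is a made-up example."""
    query = urlencode({"thread": thread_id, "page": page})
    return f"/comments.json?{query}"

print(comments_url(12345))
```

A POST-only endpoint has no such URL; Google's experiment is about coping with that case, not a reason to rely on it.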

It is widely used for blog comments, such as Facebook comments and Disqus comments.

Matt Cutts confirmed this on Twitter a few hours before the Google blog post:

Cutts AJAX Content

For more details on the technical stuff around this, see the Google blog post.

Google: Experts Exchange A Top Blocked Site In Google

I am not a fan of Experts Exchange; I've called it one of the most annoying sites found in Google. Well, with the Panda update, it is no longer found that often in Google.

In fact, Google's Matt Cutts insinuated that Experts Exchange was one of the most blocked sites in the Google Blocked Sites tool and web search feature.

In a Hacker News post on how much traffic the site lost due to Panda, Google's Matt Cutts replied:
One of the signals that we've said that we use in the Panda algorithm that launched in April is how many users blocked a particular site.
To me, a response like that, in reply to Experts Exchange losing its traffic due to Panda, means that the site was one of the most blocked sites by Google searchers. Of course, I could be wrong and may be reading too deeply into this.

Forum discussion at Hacker News.

Two New Google Webmaster Hangouts

Google is hosting two new Google Webmaster Hangouts in the next two days. 

The first is on Wednesday, 2 November 2011, at 4pm eastern/1pm pacific. It is hosted by Googler John Mueller and will be on the topic of "anything webmaster related like crawling, indexing, duplicate content, Sitemaps, Webmaster Tools, pagination, duplicate content, multi-lingual/multi-regional sites, etc."

The second is on Thursday, 3 November 2011 at 10:30am UK time. It is hosted by Googler Pierre Far and will be on the topic of "anything webmaster related like crawling, indexing, sitemaps, duplicate content, Webmaster Tools, pagination, etc etc."

If you have power, internet and a webcam - you won't want to miss them.

Forum discussion at Google Webmaster Help.