Google announced support for specifying a canonical representation of content back in February 2009. The rel="canonical" link element, placed in the head section of duplicate pages, helps webmasters avoid duplicate content within their sites. In December 2009, Google announced support for implementing the canonical tag cross-domain. This is particularly helpful for webmasters who have multiple domain names pointing at the same content and cannot implement 301 redirects on their servers.
If you have a dynamic site, editing the head of each document individually is not practical. In this situation you can use Apache's REQUEST_URI server variable in PHP to build the canonical link element dynamically. Example code is below:
<link rel="canonical" href="http://www.domain.com<?php echo htmlspecialchars($_SERVER['REQUEST_URI']); ?>" />
In this example, you would put the code snippet in the head section of your dynamic page(s) on your other domains, e.g. domain.net and domain.org. This tells the search engines that no matter what the path or page is on those domains, the canonical version of that page lives on domain.com.
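As a fuller sketch of the same idea, you could wrap the logic in a small helper that also strips the query string, so URLs like /page?sessionid=42 and /page canonicalize to the same address. The function name and the domain.com host are illustrative assumptions, not part of Google's specification:

```php
<?php
// Sketch only: build a cross-domain canonical <link> tag from a raw
// REQUEST_URI value. "canonical_link" and the default host are assumptions.
function canonical_link(string $requestUri, string $host = 'http://www.domain.com'): string
{
    // Drop the query string so session IDs and tracking parameters
    // don't produce multiple canonical URLs for the same page.
    $path = strtok($requestUri, '?');

    // Escape the path for safe use inside an HTML attribute.
    $href = $host . htmlspecialchars($path, ENT_QUOTES, 'UTF-8');

    return '<link rel="canonical" href="' . $href . '" />';
}

// In a page template you would call it with the live server variable:
// echo canonical_link($_SERVER['REQUEST_URI']);
echo canonical_link('/widgets/blue.php?sessionid=42'), "\n";
```

Running the example prints `<link rel="canonical" href="http://www.domain.com/widgets/blue.php" />`. Whether to keep or drop the query string depends on your site; keep it if query parameters select genuinely different content.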
Well, I sent a t-shirt off to Jeremy, a.k.a. Shoemoney, and he was kind enough to do a blog post about it. In case you didn't know, Jeremy is a successful web publisher, a Technorati Top 100 blogger, and the creator of AuctionAds.com.
eWeek published an article yesterday about the negative effects of Google's "Big Daddy" rollout. In the article, Larry Page is quoted as saying the issue caught Google by surprise and that a team is investigating what happened.
I can speak from my own experience. I have a few sites that have dropped from thousands of pages indexed down to only a few. One site dropped to having only its index page listed. What's interesting to me is that Google hasn't come out before now to tell us their index is "broken". Most of us in the SEO community have known something was wrong with the Big Daddy rollout for a while now.
Lack of storage? Google's CEO Eric Schmidt recently stated in an interview that Google faces a massive machine crisis and is running out of storage. I'm not sure how a company with Google's resources can have a machine shortage. Perhaps they were trying to purge spam and duplicate content from their index to free up much-needed storage, and perhaps the pruning went too far, deleting quality pages and sites.