Friday, October 7, 2011

Deepak Rajput - Google Panda 2.5

Yet another Panda has been released from Google's cage. The big G even confirmed the change, describing it as one of the roughly 500 updates made to the algorithm each year, and Search Engine Land reports that this version of Panda is to be referred to as 2.5.

The fact that Google has not released any specifications of the algorithm should come as no surprise. Even the types of content, websites and pages being targeted have been kept under lock and key. The only thing Google has said about the Panda is the boilerplate statement that accompanies every update.

WebProNews states that the Panda was let out of its cage on the 28th of last month, but a few blogs suggest it was in fact released the day before. If we side with WebProNews on this occasion, we can see that Google's 2.5th Panda arrives after a seven-week gap, the biggest since last April's update.

Panda 1.0  -  Feb 24th

7 Weeks

Panda 2.0  -  April 11th

4 Weeks

Panda 2.1  -  May 10th

5 Weeks

Panda 2.2  -  June 16th

5 Weeks

Panda 2.3  -  July 23rd

3 Weeks

Panda 2.4  -  August 12th

7 Weeks

Panda 2.5  -  September 28th
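Assuming the 2011 dates in the timeline above, the gaps between releases can be double-checked with a few lines of Python:

```python
from datetime import date

# Panda release dates from the timeline above (all 2011)
releases = [
    ("Panda 1.0", date(2011, 2, 24)),
    ("Panda 2.0", date(2011, 4, 11)),
    ("Panda 2.1", date(2011, 5, 10)),
    ("Panda 2.2", date(2011, 6, 16)),
    ("Panda 2.3", date(2011, 7, 23)),
    ("Panda 2.4", date(2011, 8, 12)),
    ("Panda 2.5", date(2011, 9, 28)),
]

for (name_a, a), (name_b, b) in zip(releases, releases[1:]):
    days = (b - a).days
    print(f"{name_a} -> {name_b}: {days} days (~{days / 7:.1f} weeks)")
```

The August-to-September gap comes out at 47 days, a whisker longer than the 46 days between Panda 1.0 and 2.0, so both round to the "7 weeks" shown above.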

A lot of SEOs are only just hearing of the new Panda because, as we should all be aware, it takes a few days for some sites to be affected. As with all previous Panda updates, many sites have seen massive changes in their SERPs. Now is the time to try and “fix” whatever was wrong with our sites if we have dropped off the face of the SERPs, or to simply sit back and enjoy the free ride towards a flood of traffic.

A Step By Step Guide To Solving One Of The Most Important But Commonly Overlooked SEO Issues – Canonicalization

One of the most common issues I come across when providing SEO to clients is a lack of canonicalization. For those of you who aren’t aware of what canonicalization is, here’s an explanation:

A lot of websites are actually displaying multiple versions of the same site to the search engines without even realising it. This is because the content they provide can be called up on multiple URLs – one of the most common ways this happens is by having both the www version and the non-www version point to the same place. For example, both http://www.example.com and http://example.com would display the same content.

Notice that the second URL doesn’t contain the www. and shows only the root domain, whereas the first one is calling on the subdomain. This is quite a common issue and many people don’t realise that the www. is actually a subdomain in the same way that blog. or tools. can be a subdomain.

While it may not appear to be causing much of a problem, there are certain issues that can arise from not having this correctly implemented.

Duplicate content is one of the major issues this can cause, as you are displaying the exact same content on different URLs across your site. This can cause indexation issues: you can end up with the search engines indexing half of your site in the www. format and the other half without. It also opens up opportunities for your site to get hit with duplicate content penalties.

As well as the potential content pitfalls, there are also potential issues with links to your site. Showing two different versions of your site can result in other sites linking to each of them. This effectively splits your link juice between the pages and causes the duplicate versions to compete with each other for rankings. This is most commonly an issue with the homepage, with sites often having links to both the www. and non-www. version.

So now you know the potential problems, how do you go about correcting them?

The most effective way to solve this issue is to implement a canonical 301 redirect via a .htaccess file. This redirects any page that doesn’t have the www. to the www. version, or vice versa.

For this redirect to work you need to be hosted on Linux (Apache); if not, you’re going to need to do a bit more work and look at redirecting through IIS instead.

Follow these steps to get up and running:

  1. Open up Notepad or whichever plain text editor you use, create a new document and save it as “.htaccess” (make sure it doesn’t come out as .htaccess.txt or something like that)

  2. Place this file into the root folder of the site

  3. Decide whether to use the www. or the non-www. version. I would generally recommend the www. version, as users are more familiar with it, unless the non-www. version is more popular, has more links or is how you have been marketing the site.

  4. Place the appropriate code into the .htaccess file:

To redirect "example.com" to "www.example.com" (replace example.com with your own domain), place this code in the .htaccess file:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

To redirect "www.example.com" to "example.com", place this code in the .htaccess file:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]

  5. Upload this file and test to make sure everything works. This is the most important point – you must test everything to make sure it is working OK; 301 redirects can be messy if they go wrong.
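The rewrite rules above can also be modelled in plain Python (a sketch, with example.com standing in for your own domain) so you can confirm which URLs should end up where before uploading the file:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, prefer_www=True):
    """Return the canonical form of a URL, mirroring the 301 rules above."""
    parts = urlsplit(url)
    host = parts.netloc
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    elif not prefer_www and host.startswith("www."):
        host = host[len("www."):]
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

# Both variants should collapse to a single version:
print(canonicalize("http://example.com/page.html"))      # http://www.example.com/page.html
print(canonicalize("http://www.example.com/page.html"))  # http://www.example.com/page.html
```

If the live redirect sends any of these URLs somewhere you didn’t expect, recheck the .htaccess before pointing links at the site.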

301 redirects pass on a large percentage of the weight of the page being redirected, so sitewide this may help to clear up a lot of issues and give your pages a nice boost in the SERPs.

Deepak Rajput - Search Engine Optimization (SEO) Best Practices

Search engine optimization best practice guide:

Here I list the most important tips for better search engine optimization; these can really push your website towards the top of the search engines. Follow my steps only if you are serious about your website.


Keyword research:

  • Try to find the best-performing keywords to target, based on relevance and KEI, for every webpage of the website (or as many as doable).
  • Restrict the keywords targeted by each page to 1 to 3 phrases.
  • To keep track of which keywords each page targets, list them in the meta keywords tag; it is a handy place to record this.
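The KEI mentioned above is often calculated as search popularity squared divided by the number of competing pages; here is a small sketch using that common formulation (the keywords and figures are made up for illustration):

```python
def kei(monthly_searches, competing_pages):
    """Keyword Effectiveness Index: popularity squared over competition.
    (One common formulation; other tools weight it differently.)"""
    if competing_pages == 0:
        return float("inf")
    return monthly_searches ** 2 / competing_pages

# Hypothetical keyword candidates for one page: (monthly searches, competing pages)
candidates = {
    "seo tips": (12000, 4_000_000),
    "simple seo tips": (1900, 90_000),
    "seo audit checklist": (800, 25_000),
}

ranked = sorted(candidates, key=lambda k: kei(*candidates[k]), reverse=True)
print(ranked)
```

Note how the mid-volume phrase with little competition can beat the high-volume head term; that is exactly the trade-off KEI is meant to surface.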

On-Page Optimization:

  • Choosing a domain name: Try to have the most important keyword in the domain name, but don’t make it feel spammy; if a web user feels it is spammy, Google most likely thinks the same.
  • Writing the title: Try to include the most plausible variations of the keyword. Do not use the same word more than twice, and never consecutively. Keep the company name in the last portion of the title if branding is unavoidable.
  • Better internal navigation: Every web page HAS to be accessible within two clicks (if the site is not built on a database). If not through the navigation structure, then through an HTML sitemap. Identify whether any web pages are more important than others and, if so, try to have more links to them. For example, if the index page is most important, link to it from every page – both for human visitors and for the search engines. Never build the navigation in Flash or complex JavaScript; if you must use it, provide a basic HTML navigation in the footer of the page.
  • Use heading tags H1 and H2 with keywords in them, and style them in the style sheet. Use the heading tags appropriately to avoid possible filters.
  • For the meta description: keyword at the beginning, call to action for the user at the end.
  • Have the keyword-rich text begin early in the page source: use CSS positioning for this, and keep the CSS and JavaScript in external files.
  • Pages should have plenty of text: try to keep each web page to a minimum of 250 words.
  • Try to use the title attribute in anchor tags when linking.
  • Try to use ALT attributes on images.
  • Try to use the strong or bold tag where possible to give more weight to keywords.
  • Use the keywords in the body text (3% to 4% density is recommended). Avoid a very high density; if it reads a bit spammy, reduce the frequency. You write articles for humans, not for the search engines.
  • If you are using any tables, try to include the summary attribute.
  • Use XHTML and put all design in an external CSS file to minimize file sizes and make it easier for the search engine spiders to see the actual content.
  • Avoid frames and sites built entirely in Flash.
  • Remove dangerous words from the title, headings, file names and other important areas.
  • Try to have static URLs. If you use a database, rewrite them with mod_rewrite.
  • Try to use dashes in the URLs to separate the keywords. Verified by my own SEO evaluation as well as written on the Google blog by Matt Cutts and GG.
  • Keep your file sizes under 100Kb (spiders may not read beyond that).
  • Don’t use ANY type of black hat SEO, or anything that could unknowingly be seen as black hat. This includes doorways, non-301 redirects, hidden text or links, spamming, mirror domains, cloaking etc.
  • Validate the page source with the W3C validator to make it easier for the search engine spiders.
  • Never link to any unknown site that could be seen as a bad site; if it is unavoidable, use a rel="nofollow" attribute on the link.
  • Use a 301 redirect to consolidate PageRank on one version of the domain name (confirmed by GG on WMW and by Matt Cutts on his blog).
  • If you link to an unimportant page, for example copyright or terms & conditions, on every page, add the rel="nofollow" attribute so that more PR flows to your other, more important pages.
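The 3% to 4% density guideline above is easy to measure; here is a rough sketch that counts single-word occurrences only (phrases and stemming are left out for simplicity):

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that equal `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

sample = "SEO tips help. Good SEO takes time, and SEO rewards patience."
print(f"{keyword_density(sample, 'seo'):.1f}%")
```

Run it over your real body copy; if the figure comes out far above 4%, thin the keyword out before a spider does it for you.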

Off-page optimization:

  • Obtain backlinks from high-PR pages to your site. Try to get them from the same theme and the same niche. Avoid massive two-way linking; it could get your website penalized. Three-way linking is a better technique.
  • Avoid temporary links to your site; try to get permanent backlinks. The older they are, the better.
  • Have your keywords (the ones you want to rank for) in the anchor text of the links to your site.
  • Stay away from massive cross-linking.
  • Whenever you think “now I have enough backlinks”, set a new target to multiply them.
  • The best methods to attract links: write and post decent-quality articles to article directories, write anything unique and valuable, and get listed in niche web directories (general ones are fine once in a while) and social bookmarking sites. But stay away from the FFA kind where there is no approval process.
  • Make an XML Google sitemap and submit it through Google’s webmaster tools.
  • Get a reliable web hosting company with more than 99% uptime that is fast and is not hosting poor sites (you don’t want to be in a bad neighborhood). It is always better to have a unique IP for your websites.
  • Submit your site to search engines, social bookmarking sites and web directories ONLY manually.
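The XML sitemap step above can be automated. Here is a minimal sketch that writes a sitemaps.org-style document from a list of page URLs (the URLs are placeholders for your own):

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/about.html",
])
print(xml)
```

Save the output as sitemap.xml in your site root, then submit it in the webmaster tools; optional fields like lastmod and priority can be added the same way with extra SubElement calls.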

Factors in the long run:

  • Duplicate content is not king; UNIQUE content is king. Write something useful that has never been written before and you will understand what I mean.
  • Add lots of unique and valuable content on a regular basis. Try to produce at least one article or text-rich web page every week.
  • Make your web site bigger. One big good site is always much better than 100 small not-so-good ones.
  • Use good statistics and analytics tools to monitor where your visitors come from and their paths through your website, and optimize accordingly.
  • Observation: check and observe constantly. Check the SERPs, check the statistics, check everything and learn from it. If something is working, push it; if something is not, find a solution for it.

Personal recommendations:

  • If you have a site about something, try to concentrate it on a certain niche.
  • Create a search-engine-friendly web directory in your niche of specialty. You will get a huge advantage from this: you can ask submitters to sign up for your newsletter or to link back when they submit their site. If your newsletter carries useful information, you won’t get many unsubscribes.
  • Do detailed research before starting a new site. Look at which sites currently exist, what is required and wanted, and then build one that is the best in a certain field. For example, if you find there are 20 sites like the one you want to build, work out how you can make yours the best of them.
  • If you provide services, try to always give more than is expected and you will build up a good reputation.
  • Find discussion groups in your field and offer valuable information, helping members by replying to their questions. Put your site link in your forum signature and you will see your site become better known.
  • In the body content, give importance to the visitor.
  • The big search engines have millions of parameters in their algorithms; don’t try to cheat them.
  • Learn the SEO industry by reading blogs and forums to catch algorithm changes, and follow up by optimizing your site.

Best personal advice:

  • Please do not build a web site only to make money; build a web site for the benefit and support of your visitors, and you will find that the money follows.

Wednesday, October 5, 2011

Deepak Rajput - Simple SEO Tips

Some may say there is nothing simple about search engine optimisation. However, whether SEO is easy or not really depends on the particular website. In most cases SEO is not difficult, as long as sufficient time and effort is invested in the project.

1. Create webpage titles related to targeted keywords.
2. Make your meta description unique for each webpage.
3. Use most important keywords in webpage name or URL.
4. Use cascading style sheets (CSS) rather than tables.
5. Add H1 tags preferably towards the top of each webpage.
6. Implement breadcrumbs functionality on your website.
7. Analyze your targeted keywords using Google suggest.
8. Implement the sitemap as text links.
9. Put CSS and JavaScript into external file.
10. Describe your images with the use of the alt tags.
11. Make sure your site logo alt tag contains your keywords.
12. Avoid automatic directory submission.
13. Find quality links to your website. Not easy.
14. Don't use same anchor text in all your inbound links.
15. Use deep linking to many webpages on your website.
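Several of the tips above (unique titles and meta descriptions, H1 tags, alt text on images) can be spot-checked with Python’s built-in html.parser; a minimal sketch, with a made-up page as the input:

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collects the on-page elements the tips above care about."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1_count = 0
        self.images_missing_alt = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

page = """<html><head><title>Simple SEO Tips</title>
<meta name="description" content="Fifteen quick SEO tips."></head>
<body><h1>Simple SEO Tips</h1><img src="logo.png"></body></html>"""

audit = OnPageAudit()
audit.feed(page)
print(audit.title, audit.h1_count, audit.images_missing_alt)
```

Feed it each page of your site and flag any page with an empty title or description, no H1, or images missing alt text.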


Tuesday, October 4, 2011

Only 7% Of SEOs Don't Provide Ranking Reports

We polled our readers, mostly SEOs, asking: Do You Provide SEO Ranking Reports?
The answer is an overwhelming yes. We had just under 250 responses and here is the breakdown:
  • 66% always provide ranking reports to clients
  • 25% provide only upon request
  • 7% never provide ranking reports


Do these results surprise you at all?

Real Time Google Analytics

Thursday, while I was offline again, Google announced a cool new feature to Google Analytics to show real time statistics.

Google launched "Google Analytics Real-Time: a set of new reports that show what's happening on your site as it happens."

Some of you may see this feature now, for those that don't - you can sign up to get access to it early over here. Until then, be on the lookout for it.

How real time are these real time reports? People are reporting a delay of about 3-4 hours. So that does not seem all that real time to me, does it?

They also launched Google Analytics Premium for more services and support at a cost.

Google Panda 2.5 Update

Earlier last week, Google pushed out an update to the Google Panda algorithm.

I didn't see it before going offline for the holiday but Google confirmed the update based on a WebProNews story featuring DaniWeb getting hit again by the update.

In fact, Dani from DaniWeb posted in the Google Webmaster Help forums complaining:

However, we were hit again on Wednesday, September 28th, once again losing more than half of our traffic. I think this might even be a bigger hit than last time. I am still investigating whether or not this was another iteration of Panda that just went out or something different?

Wow, lost half their traffic again, after all that work to get back in!

SearchMetrics released their stats on who was hit the hardest by this update and Danny has a good summary of that at Search Engine Land.

Google said this update hit sometime on Tuesday and people started to take notice on Wednesday. Google confirmed it on Thursday.

Here is the update history:

    * Panda 2.5 on September 28th
    * Panda 2.4 in August
    * Panda 2.3 on around July 22nd.
    * Panda 2.2 on June 18th or so.
    * Panda 2.1 on May 9th or so.
    * Panda 2.0 on April 11th or so.
    * Panda 1.0 on February 24th

Deepak Rajput

Monday, October 3, 2011

Deepak Rajput - 10 Amazing Photos of the Milky Way Galaxy

It's Official – Panda Update 2.5 Has Arrived!

What webmasters had been speculating about for the past few weeks has finally happened: Panda 2.5 has struck. Google itself has confirmed that the latest iteration of its Panda algorithm update is live, in this statement from the search engine giant:

“We’re continuing to iterate on our Panda algorithm as part of our commitment to returning high-quality sites to Google users. This most recent update is one of the roughly 500 changes we make to our ranking algorithms each year.”

There is no mistaking it now. It is confirmed, it is official, and it has been reported all over the internet. So, if your site's traffic from Google has dropped in the last 2 days, well, you have been hit!

How The Story Broke

Dani Horowitz, the founder and CEO of DaniWeb, an IT discussion community, sent out an email saying that her site was hit on the 28th of September. This is the second time DaniWeb has been hit; it was hit earlier this year too. Back then she made a lot of changes and gradually made a complete recovery. This time the Panda has hit the site again, and they are unable to understand why, as the site had been modified according to the guidelines laid down by Google.

The email that Dani sent out says, “After being hit in February, and fully documenting every change we’ve made, we eventually made a more than complete recovery. However, we were hit again on Wednesday, September 28th, once again losing more than half of our traffic, I think this might even be a bigger hit than last time. It is a definite confirmation that we lost all our Google traffic overnight.”

So far, Google has only confirmed the update and is not sharing any further details about what types of sites, pages or content it targeted. But here is a video by Google explaining how they tweak the search algorithms.

The dreaded date is thought to be either the 28th of September or, as some webmasters are saying, the 27th. Ever since the Panda rolled out on February 24th this year, speculation has been rife as to which sites would be hit next and why. Google was on the warpath against content farms, and sites like Ezine Articles, HubPages and eHow were hit.

Then they rolled out the international version of the Panda update on the 11th of April, which affected all English-language searches on Google, both in English-speaking countries and for English searches in non-English countries.

Since then updates have come almost every month, and Google is cleaning out the scraper sites. Which other sites have been affected will be revealed soon, and what you can do to fight the Panda will be discussed soon too, so keep watching this space.

Google+ Gets 50 Million Users – Introduces Circles Sharing!

Just days after going public, Google+ has touched the 50 million user mark, and that in just 88 days of being launched. This astonishing figure was given by Paul Allen on his blog, where he estimated that Google+ hit the 50 million mark over the weekend.

Google+ Keeps Growing:

Paul Allen estimated that Google+ has been growing by at least 4% per day, meaning that around 2 million new users have been signing up each day. This means that Google+ is turning out to be quite “social”! Also, as Google is integrating Google+ functionality, the +1 button and Circles (targeted sharing) into its products, like the Chrome browser, Android phones, Gmail, Google Reader, Blogger and Google Photos, it is making the social network easily accessible. Google is giving users numerous ways of signing in to Google+. The reach is well planned and user response is growing.

The graph shows the huge difference in the speed with which Google+ reached 50 million users, in just 88 days, compared with its ace competitor Facebook, which took more than 1,000 days. The growth is almost viral.

Paul's methodology for calculating the number of Google+ users works on the basis of occurrences of rare last names. He estimated that Google+ had 43.4 million users on September 22nd. Then it went public, and the user base rose to over 50 million by September 25th.
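Those two estimates can be sanity-checked against the "at least 4% per day" claim: growing from 43.4 million on September 22nd to 50 million on September 25th implies a daily compound growth rate of about 4.8%.

```python
# Daily compound growth rate implied by the two estimates above
start, end, days = 43.4e6, 50e6, 3  # Sep 22 -> Sep 25
rate = (end / start) ** (1 / days) - 1
print(f"{rate:.1%} per day")  # prints 4.8% per day
```

So the figures are internally consistent, and the "at least 4%" estimate is, if anything, slightly conservative for that final weekend.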

Every Analysis Has The Same Result - Google+ Is Expanding Fast

Hitwise's research shows the same result. According to their analysis, Google+ opening its doors resulted in a huge rise in its market share of visits, a 1269% increase. In the USA alone, Google+ was visited by 15 million people.

They also revealed that within the span of a week, Google+ climbed the ranking charts from the 54th most visited social networking site to 8th in the category.

The Google+ 'Circle' Moves On:

Google+ is continuing to roll out newer and better features in parallel with its increasing user base. The much-talked-about “Circles” are better now: they allow for more sharing. In fact, Circles themselves can now be shared. You can not only segregate your contacts into different Circles, but also share a Circle with other contacts.

You can share a Circle with:
  • Your other Circles
  • Other contacts
  • Everyone (going public)
So the contacts in a particular Circle can now be shared, but privacy still remains intact, as:
  • The name of the Circle is not visible
  • Future updates to the Circle are not shared
Here is how you share a Circle: click on the Circle you want to share, then click the “share this circle” option. Then simply choose who you want to share the Circle with.


This feature is aimed at those who are interested in sharing with like-minded people and connecting with them.

SEO Tips – Check Out The 20 Minutes Quick Audit Checklist!

Audits are really necessary when it comes to measuring and analyzing SEO activities. You need to know what is working for your site and what's not, so that you can focus on the “what's not” part. Audits are also necessary for SEOs because they need something substantial to prove that they have been doing good work, and in the right direction.

Now comes the next challenge: how to decide what will be audited. Making that perfect audit checklist is not an easy task. Why? Mainly because of these essential factors:
  • Each site is unique
  • Every business/company runs its site with different goals
  • Every competitive landscape is different and unique
  • SEO factors have different impacts on rankings
Add to all this the ever-changing nature of search engine algorithms, and you have the huge task of preparing a checklist. For those struggling to find time to do SEO audits, InMotion Hosting has posted a quick 20-minute SEO checklist.

This checklist will work for you if you are running short of funds to get a formal audit done. And if the client is pressing for results and you are unable to improve the rankings, maybe you need to stop and do a quick check.

However, doing the check right is important: you might otherwise identify a “problem” that is not actually a problem, and the resulting fix could lower your results and put you two steps behind. On the other hand, presenting clients with their site checked against such a checklist will show them that you are putting in 100%, and they will give you more leverage.