Saturday, November 30, 2013

Google Query Processing by Identifying Entities: Hummingbird, Semantics and Knowledge Graph

With the launch of the Google Hummingbird update and the increased use of semantics and the Knowledge Graph, entities and the relationships between them are set to play a greater role in the ranking of search results. The majority of search queries contain an entity of some sort. For example, the query "Neil Armstrong Biography" contains the entity "Neil Armstrong". With the help of the Knowledge Graph database, Google can easily identify that Neil Armstrong was an astronaut and serve the user's query in a well-defined manner. This is the power of Hummingbird backed by the cumulative effectiveness of semantics and the Knowledge Graph.



How Does Google Identify Terms Related to an Entity?


Apart from identifying entities themselves, certain terms related to an entity provide substantial value to Google in processing user queries in a more defined manner. For the example above, the terms associated with the entity might include:

Ohio (where he was born)
United States (his native country)
Gemini 8 and Apollo 11 (space missions)
Carol Held Knight and Janet Shearon (his wives)
Eric Armstrong, Mark Armstrong and Karen Armstrong (his children)
NASA, aerospace engineer, first person to walk on the Moon, astronaut (terms used to associate Armstrong with)

These terms are great signals for Google to identify the closest matching entity.

Now Google Can Think Both Ways

Search terms associated with an entity, and the relationships between them, have empowered Google to think both ways. If a search query contains an entity, Google can easily identify information related to that entity; and if a search query contains no explicit entity, Google can still identify likely entities by matching the terms in the query against the terms associated with known entities. A recent patent granted to Google provides the vocabulary Google needs to sharpen this capability through its Hummingbird algorithm, paired with semantics.
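The two-way lookup described above can be sketched with a toy entity-term index. This is purely illustrative: the entities, terms and matching logic are made up for the example and are not Google's actual data or implementation.

```python
# Toy entity-term index; illustrative only, not Google's actual data or method.
ENTITY_TERMS = {
    "Neil Armstrong": {"astronaut", "apollo 11", "nasa", "moon", "ohio"},
    "Lance Armstrong": {"cyclist", "tour de france", "livestrong"},
}

def entity_to_terms(entity):
    """Forward lookup: given an entity, return its associated terms."""
    return ENTITY_TERMS.get(entity, set())

def terms_to_entity(query):
    """Reverse lookup: given a query with no explicit entity, score each
    known entity by how many of its terms the query mentions."""
    words = set(query.lower().split())
    scores = {e: len(terms & words) for e, terms in ENTITY_TERMS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

print(terms_to_entity("first astronaut from ohio to walk on the moon"))
```

Running the sketch, the query about an "astronaut from ohio" matches three of Neil Armstrong's associated terms and none of Lance Armstrong's, so the reverse lookup picks the right entity even though the name never appears in the query.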

The Patent Empowering Semantics, Hummingbird and Knowledge Graph


The patent granted to Google on November 19th, 2013 discusses "Assigning terms of interest to an entity". The patent lets Google identify candidate terms and weight each of them according to its relative frequency.

Use of Data Mining

Under data mining, patterns in data are identified and processed in order to enhance intelligence and determine the true value of concepts.

Steps to Process Queries

1- Identification of candidate terms - As a first step, candidate terms are identified using filtering, modifying and scoring techniques.
2- Identifying "known for" terms - An evaluation of the candidate terms then reveals the terms that the entity is known or recognized for. These may describe features, unique attributes associated with the entity, the entity's location, and so on.
3- Continuing the process by identifying the semantics of the "known for" terms associated with the entities.
4- Identifying "known for" terms in different languages.
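The scoring step can be sketched as weighting candidate terms by how much more often they appear alongside the entity than in text at large. This is a simplified illustration with made-up counts; the patent's actual scoring is more involved.

```python
# Hypothetical counts; illustrative only, not data from the patent.
term_count_with_entity = {"astronaut": 120, "moon": 90, "biography": 15, "the": 5000}
term_count_overall = {"astronaut": 400, "moon": 800, "biography": 600, "the": 1_000_000}

def relative_frequency_scores(with_entity, overall):
    """Score each candidate term by its frequency alongside the entity
    relative to its frequency in the corpus overall. Ubiquitous words
    like 'the' score near zero; distinctive 'known for' terms score high."""
    return {t: with_entity[t] / overall[t] for t in with_entity}

scores = relative_frequency_scores(term_count_with_entity, term_count_overall)
best = max(scores, key=scores.get)
```

With these numbers, "astronaut" scores highest (0.3) while "the" scores near zero, which is the intuition behind filtering candidate terms down to the ones an entity is actually known for.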

Full Patent Information Can be Viewed Here: 

Assigning Terms of Interest to an Entity

Inventors: Jason Lee (Forest Hills, NY), Tamara I. Stern (New York, NY), Gregory J. Donaker (Brooklyn, NY), Sasha J. Blair-Goldensohn (New York, NY)
Assignee: Google Inc. (Mountain View, CA)
Appl. No.: 13/430,624
Filed: March 26, 2012

Also See:

How Google Identifies Substitute Terms of a Query?
Google Patent to Identify Erroneous Business Listings
How Google Identifies Spam in Information Collected From a Source?
Google Patent Named Ranking Documents to Penalize Spammers
Taxonomic Classification While Finding Context of Search Query
Google Granted Patent for Detecting Hidden Texts and Hidden Links
Rich Snippets in Google
How to Add Ratings and Review Stars on Google Search Results
Query Highlighting on Google Search Results
List of Google Search Operators

Friday, November 29, 2013

Experience Hands-Free Search on Your Laptop with the Google Hotword Extension

Finally, you will be entering a whole new world of search, moving closer to the Star Trek era where a personal assistant understands your voice commands and answers your queries. Google has launched the much-awaited Hotword Extension for Chrome, which allows hands-free Google search on your laptop.

google hotword extension

How Will It Work?


It is really simple. First, download the Chrome extension from here, then start your queries with the voice command "OK Google". Chrome will process every query that starts with "OK Google". Some examples are given below:-

OK Google how many ounces are in a cup?
OK Google set a timer for 30 minutes
OK Google what is the temperature today?
OK Google how many calories are in an orange?

So, what are you waiting for? Just download the extension and enjoy the Star Trek experience.

Download Google Voice Search Hotword

Wednesday, November 27, 2013

Is There a Limit on the Number of Links You Can Have Per Page?

There has been a long debate over the number of links one can have on a page. The old recommendation of around 100 links per page is now only a guideline, not a hard rule. Matt Cutts recently uploaded a video in which he explains that the number of links on a web page depends entirely on its theme and needs. If the content of the page genuinely requires lots of links, those links are absolutely fine; there is no hard limit as such. But if a page adds links simply to spam the web or to artificially pass PageRank to other pages, that may be counted as spam by Google.

Some recommended best practices still suggest that keeping fewer than 100 links on a web page is the safest way to go.

Now, have a look at the video:


Also See:- 

How to Clean Up Your Link Profile?
Google Will Provide Unnatural Link Examples
Google Disavow Links Tool
Top Ways To Get Natural Backlinks To Your Site
Are You Making Your Website Vulnerable To Future Google Updates
Battle of Ranking Continues
Link Building Factors 2013
How to Create Backlinks that Google Loves
Seo Secrets
How to rank in Google

Monday, November 25, 2013

When Should You Use Fetch as Google?

The Fetch as Google tool lets you see any web page the way Google sees it. Use it whenever you are unsure whether your web page is being crawled effectively.



Reasons for Using Fetch as Googlebot

1- You are using rich media files or Flash and suspect that Google may not crawl your web pages effectively.

2- Hacked pages can be identified easily with the help of this tool.

3- If you want to check the crawlability of your site, this is the best tool to use.

Fetch as Googlebot reports four parameters:-

URL
Date
Googlebot Type
Download Time

An example is given below:-


Remember, you can only make 500 fetch requests per week per Webmaster Tools account.

Also See:- 

Site Analysis Tools
Free Seo Tools
Google Moonshot Changes
Seo Factors That Have the Biggest Impact on Rankings
Utilizing Tf-Idf Score to Increase Site Rankings

Sunday, November 24, 2013

How to Boost Up the CTR of Your PPC Campaigns?

Are you experiencing a low click-through rate on your PPC campaigns? This post focuses on boosting the CTR of your pay-per-click campaigns.
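As a quick refresher before the tips, CTR is simply clicks divided by impressions, expressed as a percentage. A tiny helper makes the arithmetic concrete (the numbers below are made up for illustration):

```python
def ctr(clicks, impressions):
    """Click-through rate as a percentage: clicks / impressions * 100."""
    if impressions == 0:
        return 0.0  # avoid division by zero for ads with no impressions yet
    return clicks * 100 / impressions

# An ad shown 2,000 times and clicked 50 times has a CTR of 2.5%.
print(ctr(50, 2000))
```

Every tactic in this post is ultimately aimed at moving that one ratio upward.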

Optimize Your Ad Copy

Optimizing your ad copy is the first step in boosting the CTR of your PPC campaigns. There are several ways to make your ad stand apart in the search results. Here are some recommended best practices:-

Increase CTR (image credit: cybertegic.com)


Choose a catchy headline

Make your ad headline as impressive as possible. If your brand is popular, use your brand name in the headline. Words like “Free”, “Offer”, “Best Quality”, “Guarantee” and “Sale” tend to perform well in the search results.

Use an appropriate description

Include prices, phone numbers and limited-offer text in this area. Phrases like “money back guarantee”, “Apply Now”, “Enroll Now”, “Buy Now”, “Experts” and “Free Shipping” tend to perform well in the search results.

Add Sitelinks

Site link ads
You can display links to your web pages beneath the text of your ads. To activate sitelinks for your ad, edit the campaign settings and specify your sitelinks. Additional links are a great way to show your users your most important and highest-converting landing pages, and displaying more than one link to your site in the search results increases the chances of a higher CTR.


Create a Mobile Click to Call Campaign

Displaying click to call numbers in your ad is a great way to persuade the users to click on them. Users searching for services using their mobile phones are more likely to click on phone numbers displayed in the ads.

Include Seller Reviews

Adwords seller reviews


Ratings and reviews have drawn a tremendous response from search engine users. Studies reveal that people are more inclined to click on ads that other users have rated. Hence, displaying user ratings and seller reviews with your ad is a great way to enhance its performance.

Display Social Annotations

Adwords Social Annotations


Social annotations allow you to display the number of Google+ followers your brand has. Strong social proof encourages users to click on the ad because it gives them a feeling of trust.

Highlight Third Party Reviews

People tend to believe third-party reviews because they perceive them to be unbiased. Displaying third-party reviews from reputable sources increases the click-through rate.

Add Location Extensions

One of the best ways to display your business address and phone number in your ads is through location extensions. These ads provide two benefits: they help bring customers physically into your store or office, and they help increase CTR.

Promote Your Ad via Offer Extensions

Offers always excite customers, and promoting your ad via relevant offer extensions is a great way to persuade users to click on it. Make sure your company offers exactly the same service as displayed in the ad offer, or your ad may be disapproved by Google.

Promote Individual Products Directly in Your Ads

You can display individual product ads on Google with the help of Product Listing Ads. You need a Google Merchant Center account in order to create a Product Listing Ads campaign.

Also See:- 

Adword Quality Score
How to Fix Disapproved Ads in PPC?
How to Calculate CTR
Basics of Google Adwords
Quoted Search Results
List of free seo tools

Friday, November 22, 2013

How Recipe Sites Can Gain Traffic Using Rich Snippets and Recipe Views?

One of the most popular niches on the web is food and recipes. People love to search for recipes using Google, and this is why the big G has included a special "recipe view" tab to specifically cater to users searching only for recipes.


Recipe Rich Snippets


To enhance the visibility of your recipe page in Google search results, it is recommended to use rich snippets. Rich snippets for recipes are a great way to mark up your web pages with additional recipe-specific information. The types of information you can include with rich snippets are:-

Name - The name of the recipe

Type - The type of the recipe

Photo - A picture of the recipe

Preparation Time - An estimated preparation time.

Calories - Calorie information.

Reviews and Ratings - Ratings and reviews as given by users.

Instructions - Full instructions for preparing the recipe.

Yield - The average yield of the recipe.

Author - Author information for the recipe.

A web page carrying so much information specifically related to the searched query is likely to receive a boost in rankings, as it automatically becomes more relevant and user friendly.
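As an illustration, the mark-up for such a page might look like the following sketch, which uses the schema.org Recipe microdata vocabulary. The property names come from schema.org; the recipe, values and file names are made up for the example.

```html
<!-- Illustrative only: schema.org Recipe microdata with made-up values -->
<div itemscope itemtype="http://schema.org/Recipe">
  <h1 itemprop="name">Grandma's Apple Pie</h1>
  <img itemprop="image" src="apple-pie.jpg" alt="Apple pie" />
  By <span itemprop="author">Jane Doe</span>
  Prep time: <time itemprop="prepTime" datetime="PT30M">30 minutes</time>
  <span itemprop="nutrition" itemscope
        itemtype="http://schema.org/NutritionInformation">
    <span itemprop="calories">320 calories</span> per serving
  </span>
  Yield: <span itemprop="recipeYield">8 servings</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    by <span itemprop="ratingCount">120</span> users
  </div>
  <div itemprop="recipeInstructions">
    Peel the apples, roll out the crust, and bake for 45 minutes.
  </div>
</div>
```

Each itemprop attribute maps one visible piece of the page to one of the recipe properties listed above, which is what lets Google assemble the rich snippet.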


Here is a table displaying the full properties of recipe snippet mark up along with their descriptions:- 


Recipe snippet mark-up


Here is a quick video explaining the use of recipe rich snippets, along with detailed information about the recipe mark-up you may use.




Recipe Views


Recipe view lets you see recipe-only search results. Select "Recipes" in the left-hand panel of the search results page to switch to recipe view.

Here is how Google looks with recipe views:- 





Thursday, November 21, 2013

Is Google Still Testing a New AdWords Layout?

Google appears to be testing a new AdWords layout. In the new layout, ads are clearly highlighted with a yellow "Ad" label. This is a good move by Google, as users will now be able to clearly distinguish ads from organic results. Earlier, a light background colour was used to distinguish ads from organic results.

Here is a screenshot of what I saw:



Google is yet to make an announcement in this regard, but since Google rigorously tests updates before rolling them out to its entire audience, you might not see this change yet. Google made this update live for some time this morning, and I captured a screenshot.

Also See:-

Google Adwords Bidding
Steps to Optimize Your PPC Campaigns
Why Seo is Still Favored by Organizations
Adwords Quality Score
Basics of Google Adwords
Google Drops Tilde Operator
How to Make Use of Authorship Profile
Google Patent Ranking Documents
Seo Vs PPC Comparison
How to Fix Disapproved Ads in PPC

Wednesday, November 20, 2013

4 Tools You Can Use for Link Earning

Link earning is the process of attracting natural links to your website, which Google approves of, as opposed to black-hat link building, which builds links to a site artificially. Here are four top tools that can assist you in your white-hat link-earning campaign.

Social Mention


Social Mention is an excellent tool for determining the strength, passion, sentiment and reach of your brand. It lets you know what people are saying about your brand online. You can see brand-related results discussed and shared on blogs, microblogs, bookmarks, comments, events, images, news, audio, video, Q&As and more. It is a great idea to follow up with the people who are already discussing your company in order to earn a backlink: people who know your company and share content related to your brand are far more likely to link to you than people who seldom share anything about your company.



Social Mention is a great tool that lets you track the success of your brand in the social world and also helps you find existing link-earning opportunities.

Whitespark Local Citation Finder


Whitespark Local Citation Finder is a great tool for finding all the locally relevant resources related to your brand. The sites identified by Whitespark let you list your business and earn strong, locally relevant backlinks. The best part of the tool is that you can identify as many citation sites as you need. Keep a list of keywords related to your brand handy and you can find excellent link-building opportunities.



Mention


Mention is an awesome service that alerts you whenever new mentions of a brand happen on the web. Want to know how many new sources your competitor is earning backlinks from? This service helps you identify exactly that. It is similar to Google Alerts, but the interface offers easier customization by letting you apply filters to the gathered data.



Bit.ly


Bit.ly is best known as a URL-shortening service, but in reality it is much more than that. It lets you identify the topics of highest interest to your audience. You can discover content, benchmark the domains your audience visits most, get branded URLs, monitor social media easily, and customize data using its APIs.



Saturday, November 16, 2013

5 Ways to Fix the Duplicate Content Issue Effectively

The duplicate content issue is one of the most common problems on the World Wide Web. It occurs when the same piece of content is available to search engine bots and users on more than one URL. As a result, search engines must work hard to remove duplicate results from their index. In response to the growing spam created by widespread duplicate content, Google launched the Panda update, which penalized sites containing duplicate or near-duplicate content. It has now become necessary for webmasters to keep their sites safe from any kind of duplicate content penalty applied by Google. There are several ways webmasters can do this.



1- Add Rel=canonical Tag

The rel=canonical link element was introduced in 2009 to solve the problem of similar web documents. It lets search engines identify a preferred version of a URL to be considered the original. For example, suppose a site has three URLs:-

Example.com/camcorder (the original URL)

Example.com/electronics/camcorder (Duplicate URL 1)

Example.com/electronics?item="camcorder" (Duplicate URL 2)

All three URLs give access to the same information about the camcorder, which can cause a serious duplicate content issue for the site. We can add a rel=canonical tag to the two duplicate URLs as shown below:-

<head>
<link rel="canonical" href="http://www.example.com/camcorder" />
</head>

Adding the above rel=canonical tag to the duplicate URLs tells search engine crawlers to attribute the content of the page to the original URL, thus saving the site from being penalized for duplicate content.

2- Assign a 301 Redirect

301 redirects tell search engines that a page has permanently moved to another location, passing all the link equity and value to the target URL. This is the right solution when the duplicate page has backlinks and traffic coming to it.
The 301 redirect should be set in the .htaccess file. An example (using the hypothetical camcorder URLs above) redirects a duplicate path to the original page:-
Redirect 301 /electronics/camcorder http://www.example.com/camcorder

3- Remove the Link

In many cases the simplest and best solution is to remove the duplicate pages from your site. This makes your task, and the search engine crawlers' task, much easier. You can remove the pages and return 404s for them.

4- Use robots.txt or Meta robots

Another preferred way of fixing the duplicate content issue is by either using robots.txt or the Meta robots tag.

Through robots.txt

Add the following code to block search engine crawlers from accessing the duplicate content. The duplicate content can still be seen by users but remains blocked for search engines.

User-agent: *
Disallow: /duplicate
Disallow: /duplicate.html
Disallow: /original/duplicate.html

Change these lines to match the file names and locations of your duplicate URLs.

Through Meta robots tag

The meta robots tag is a page-level directive, placed in the <head> section, that tells search engines how to index the contents of the web page.

A simple directive like noindex tells search engines not to index the contents of the web page. An example is given below:-

<head>
<meta name="robots" content="noindex, nofollow" />
</head>

5- Use Parameter Blocking

For large ecommerce sites, parameter blocking can be used as an effective solution for blocking the duplicate content. To set parameter blocking, follow the steps given below:-

a- Log in to the Google Webmaster Tools
b- Move to “URL Parameters” located under “Crawl” tab.
c- Click on Edit and select "No" from the drop-down list. A "No" indicates the presence of duplicate content in the selected URL parameter.

A word of caution: be absolutely sure when using URL parameters to block similar content, because it can cause non-duplicate pages to be blocked from search engines.

For me, the preferred options are the rel=canonical tag and the meta robots tag. Both are less risky and solve the duplicate content issue effectively.


Thursday, November 14, 2013

Using Twitter to Repair a Damaged Online Reputation

In today's world, your online reputation is your actual reputation. A Google search that turns up old Internet skeletons can scare off potential employers, clients, partners, or investors. Getting a handle on your online mess can be difficult, but it is absolutely necessary. After all, it is necessary to deal with bad reviews, and blogs such as reviewreputation.com are an excellent way to stay educated.

Before you pay to clear up questionable content that's been haunting you online, consider the founder of the microblogging movement - Twitter. Used correctly, Twitter can help you mop up the mess that you - or someone else - made of your online reputation.




Twitter Has Clout

As the top micro-blogging site in the world, Twitter ranks high in search engine listings. If someone doesn't own a business or operate a website, their Twitter page is among the very first items that appear in a search of their name. Use this to your advantage. By flooding your Twitter feed with truthful, positive content about you, your business, or your blog, you will ensure that at least some of your better qualities will be among the first things people see when they run your name through Google.

The Truth Is in the Search

When people think of online search, Google is the first thing that comes to mind. But the truth is, Twitter has a powerful - and underutilized - search function that can uncover a gold mine of information regarding what is being said about you or your business. By searching your Twitter handle, you can immediately see who is talking about you on the world's most popular place to anonymously gripe.

Engage in Dialogue

Once you find conversations that involve you, it's simple enough to start a dialogue with the people having them. Are they disgruntled former employees? Angry customers? Investors who feel like they haven't seen a big return quickly enough? By engaging them in conversation on Twitter, you not only might be able to convince them to rescind some of the negative comments they made in a cloud of anger, but more importantly, you'll have put it on the record for everyone to see that you attempted to engage and correct a perceived injustice. You'll come out looking like a rational individual who did everything they could to soothe an angry but unreasonable detractor.

Don't Mess Up in the First Place

Twitter is a great place to vent. It's always accessible, someone is always listening, and you only have a few words to get your point across. The thing is, even if you delete your tweet the next day when cooler heads prevail, it may already have been retweeted or saved, or worse, someone may have taken a screenshot. Resist the urge to post anything political, religious, racial, or anything you know will offend or alienate someone who stumbles upon it.

Twitter is a strong but underutilized tool for managing a damaged reputation. The best move you can make is to avoid hurting your digital rep by abusing Twitter in the first place. Be smart, be consistent, and always remember that your online reputation can be the only reputation that matters.


Monday, November 11, 2013

Make Your Pages Load Faster with Google Speed Suggestion

Google has introduced a new feature in Google Analytics that displays the speed of your web pages and suggests ways to optimize them. You can use these suggestions to make your web pages load faster and improve the user experience of your site.

Where to Find Site Speed Suggestions?


You can find the speed suggestions under the Site Speed section of the Behavior tab in Google Analytics.


The Speed Suggestions report displays five columns: Page, Pageviews, Average Load Time, Page Speed Suggestions and Page Speed Score.


Page Speed Insights


The Page Speed Suggestions column links to the PageSpeed Insights report and its scope for improvement. It also displays a score from 1 to 100; the higher the score, the better the speed.

Why Is Optimizing Speed Important?


Page speed is an important factor in search experience optimization. Users are more likely to spend time on a site that loads quickly than on one that takes time to load. Hence, this extremely useful feature will help webmasters act on the slow pages of their site in order to improve the user's search experience.

Also See:- 

Page Speed Important for Ranking
Google Search Quality Updates
mod Page Speed for Apache
Search Experience Optimization
Google Shares Views on Guest Blogging and Links
EMD Update
Page Layout Algorithm
Google Plus and Seo

Friday, November 8, 2013

How to Fix a Disapproved Ad in Adwords?

It can be extremely frustrating for a business generating revenue from AdWords to see its ads disapproved. With AdWords widely regarded as the top PPC network in the US, as noted by PPC management company Webrageous, businesses find it extremely important to keep their ads running. There are several reasons an ad can be disapproved; this post sheds some light on how to fix disapproved ads in PPC.


Monitor Your Adwords Account


It is important to constantly monitor your AdWords account to find out which ads are disapproved. You will find disapproved ads under the Status column on the Ads tab of your campaign page. Check the reason for disapproval and fix it as directed.

Reasons for Disapproval


Violation of Google’s ad policies is the only reason for disapproval. Google groups its policies into seven important types governing the successful running of an ad. These are:-

User Experience – Ads should serve users well. Variables like accurate phone numbers, character limits (25 characters for the headline and 70 for the description), image and video file limitations, accurate company details, avoiding excessive capitalization, no instant download links, etc. are all important.

Safety and Security – Google’s policy is that users should experience safe and secure advertising. Sites should keep themselves free of all forms of viruses, phishing, Trojans, scams, cloaking, etc. It is the webmaster’s responsibility to ensure the site provides a safe browsing experience.

Accuracy - Claims made by an advertiser should be accurate. All offers and discounts, including pricing information, should be up to date.

Violation of User’s Privacy or Trust – Ads should not violate the user’s trust or privacy. Google does not allow ads whose primary motive is to collect personal information. Also, sites should be SSL-secured if they handle customers’ credit card information.

Legal and Safe Products or Services - Google does not allow ads related to abortion services, sexual services, illegal drugs, illegal hacking, the promotion of tobacco products, etc. It allows restricted promotion of gambling and casino-related services.

Copyright Laws – All ads must be legal in each respective country. Any ad violating country laws, state laws or business industry regulations is disapproved. Copyrighted material may not be promoted without approval.

Compatibility with Google’s Brand Decisions


Ads must be compatible with Google’s own brand decisions; for example, ads that are violent or that target a particular group of people or organization are not allowed. Ads that use inappropriate language, such as the word “click” as a call-to-action phrase, are eligible for disapproval by Google.

Some Tips


Always enable the “Policy details” column when running your ads, as it lets you see the approval status of each ad. To enable it, go to the Ads tab, click Columns, then Customize columns, and click Add next to the policy columns.

You may also apply filters to find out all the disapproved ads at once.

To apply filters follow the below steps:-

1- Go to the Campaigns tab.
2- Click on the Ads tab, then click Filter.
3- Select “Create filter” and choose “Approval status” from the drop-down menu.
4- Remove all the check boxes except “Disapproved”.
5- Click Apply and you are done.

Editing the Old Ad and Creating a New One


Before editing the old ad, find out the reason for its disapproval using the methods above. Then create a new ad that complies with Google’s guidelines. In most cases, changing the ad text solves the problem; otherwise, you may also need to change the URL to get your ad approved.

Google’s advertising Policies

To view Google’s full advertising policies that affect whether an ad is approved for display in search results, please visit:- https://support.google.com/adwordspolicy/topic/1626336?hl=en&ref_topic=2996750

Avoiding Ad Disapprovals

Here is a video sharing great tips on how to avoid ad disapprovals in AdWords:-




Also See:- 

Adwords Quality Score
Basics of Google Adwords
Seo Vs PPC Comparison
Determine a Keywords Overall Worth
How to Calculate CTR

Thursday, November 7, 2013

Get a Responsive Design Without Losing Any SEO Value

Many webmasters are afraid to implement responsive design on their sites for fear of losing SEO value. In fact, responsive design is great from both a user point of view and an SEO point of view.

If you run two separate versions of a site, such as site.com for desktop users and m.site.com for mobile users, it is necessary to use a rel=canonical tag on m.site.com pointing to the desktop version. This gives full credit to the desktop version of the site and helps it rank better, so PageRank is not divided between the desktop and mobile versions.
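For reference, Google's guidance for separate mobile URLs pairs this canonical tag with a rel=alternate link on the desktop page. A minimal sketch, with example.com standing in for your own domain:

```html
<!-- On the desktop page (http://www.example.com/page): -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page" />

<!-- On the mobile page (http://m.example.com/page): -->
<link rel="canonical" href="http://www.example.com/page" />
```

The two tags together tell Google that the pages are equivalent versions of one document, so ranking signals are consolidated on the desktop URL.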

But having a responsive design lets a single version of the site serve both desktop and mobile users.

Here is a video from Matt Cutts clearing up the confusion between SEO value and responsive design.



Also See:-

How to Design SEO Friendly Web Page?
Navigation in Seo
Orphan Page
Breadcrumbs
Site Wide Links
Doorway Pages
RSS feeds
Rel Canonical Element
Image Optimization
Benefits of SEO

Sunday, November 3, 2013

A Breakdown of Google’s Webspam Algorithm Updates

Google has remained extremely busy wiping webspam out of its search results, yet the biggest problem it still faces is identifying and removing spam. A dedicated team headed by Matt Cutts tackles the spammy tactics webmasters use to influence search engine results. This is why Google has released several webspam algorithm updates specifically targeting overly promotional SEO practices. A breakdown of the major webspam updates is given below:-

Major Webspam Google Updates


Florida, November 2003

The Florida update, which happened on November 16th, 2003, was the first webspam-oriented algorithm update; it targeted sites applying aggressive SEO techniques. For the first time, a filter was introduced into the algorithm to catch sites applying black-hat SEO tactics like keyword stuffing.

Austin, January 2004

Heavy on-page SEO tactics like meta tag stuffing and invisible text were targeted by this update. Some webmasters called it “another Florida update”.

Jagger, October 2005

The Jagger update was the first to take on spammy off-page SEO tactics like low-quality links. Tactics that had seen huge success before 2005, like reciprocal links and link farms, were targeted by this update. Google also penalized sites engaged in buying and selling links.
May Day, May 2010

This update impacted long tail queries, affecting sites that served thin content targeted at long tail searches purely to attract more organic traffic. For the first time, site authority was considered as a ranking factor.

Attribution, January 2011

The Attribution update affected sites with "low levels of original content". Google took on sites built on scraped content with little original material of their own.

Panda, February 2011

This full-fledged, content-focused update launched on February 23rd, 2011. It lowered the rankings of low quality sites and sites with thin content. The Panda update was also known as the "Farmer update" because of its ability to target content farm sites: sites with lots of pages focused on specific long tail queries that provided a bad user experience, were full of ads, and lacked quality content. Panda lowered the rankings of sites with low quality, duplicate or thin content.

Human quality raters were used for the first time to judge sites on quality, trust, design and speed. This gave Google the power to identify the sites with content "most loved or most preferred by the users".

Panda 2.0, April 2011

As Panda was a huge algorithm change, Google rolled out its refreshes in a slow and regular fashion. These refreshes continue even today, but they have now been integrated into the main ranking algorithm and are more frequent and less noticeable. Panda's second refresh covered all English queries and used Chrome data to lower the rankings of the sites users blocked the most.

Panda 2.1, May 2011

Google continued to improve its Panda algorithm in order to return quality sites in search results.

Panda 2.2, June 2011

This update happened on June 21st and continued the core goal of stopping low quality sites from ranking highly in the search results.

Panda 2.3, July 2011

Some new signals were introduced with the Panda 2.3 update, which were not openly disclosed by Google.

Panda 2.4, July 2011

The Panda update was rolled out internationally with the 2.4 update.

Panda 2.5, September 2011

As part of Google’s commitment to return high quality sites, Google continued to roll on more Panda updates.

Panda Flux, October 2011

A series of minor refreshes, dubbed Panda Flux, began around this time.

Panda 3.1, November 2011

Panda updates now became a common phenomenon, with refreshes happening every month. As such, these updates were less noticeable.

Panda 3.2, January 2012

The main algorithm remained the same and Google continued to improve its algorithm.

Panda 3.3, February 2012

This update was really minor and less noticeable.

Panda 3.4, March 2012

This update affected around 1.6% of the search queries.

Panda 3.5, April 2012

Another Panda refresh happened on April 19th, 2012.

Penguin, April 2012

Another major webspam-targeted algorithmic change launched on April 24th, 2012 under the name Penguin. With this update, Google decreased the rankings of sites with low quality link profiles. It impacted 3.1% of English queries. Google's advice was to create amazing, compelling sites that provide high value to the user.

Panda 3.6, April 2012

Meanwhile, Google continued to refresh the regular Panda update.

Penguin 1.1, May 2012

The first refresh of Penguin happened on May 25th, 2012.

Panda 3.7 and 3.8, June 2012

Two more updates happened in June. Panda 3.7 had a somewhat bigger effect than 3.8.

Panda 3.9, July 2012

This update affected around 1% of queries, small enough to go largely unnoticed.

Panda 3.9.1, August 2012

With Panda updates now arriving so frequently, each one became less noticeable.

Panda 20, September 2012

This update was large and affected around 2.4% of search queries.

EMD, September 2012

In order to decrease the rankings of keyword-stuffed domains, Google introduced the Exact Match Domain (EMD) update on September 27th, 2012. This update lowered the rankings of less authoritative sites with an exact match keyword in their domain name.

Penguin 3, October 2012

This update was minor, affecting around 0.3% of search queries.

Panda 21 and 22, November 2012

More Panda updates rolled out, affecting a smaller portion of queries.

Panda 23, December 2012

This update affected around 1.3% of the search queries.

Panda 24, January 2013

Google continued updating the Panda update in 2013 with the first release of 2013 coming on January 22nd.

Panda 25, March 2013

After this update, Panda was incorporated into the main algorithm.

Penguin 2.0, May 2013

This update had less impact on low quality sites than expected, and much black hat SEO remained unaffected by it.

Penguin 2.1, October 2013

As Penguin 2.0 had a smaller effect on the rankings of low quality sites, Google rolled out a major update in the form of Penguin 2.1. This update happened on October 4th, 2013 and had a noticeable impact on the search results.

Also See:- 

Why Paid Links Are a Violation of Google's Guidelines
Google Disavow Links Tool
Google Algorithm Change History
Google Reminds of Taking Action Against Sites That Sell Page Rank
How to Design Seo Friendly Web Page
Links from Article Marketing Sites
No Follow Vs Do Follow