Thursday, January 29, 2015

Penguin Might Integrate Into Google's Overall Algorithm Updates

Google Penguin updates are set to happen faster, and the percentage of queries affected will most probably be smaller. When updates ship faster, each one carries only minor changes, which in turn affects a smaller percentage of queries. When updates arrive after a long gap, the algorithm receives a large number of new functions and definitions, which tends to affect a larger percentage of search queries.

Google is planning to integrate the Penguin algorithm into its main algorithm, which updates frequently, in a similar manner as it did for the Panda update. This means updates will roll out faster, and Google will no longer give notice before or after a Penguin update happens.

So folks, if Google does not notify you about a future Penguin update, don't assume there hasn't been one. Penguin is getting access to Google's overall continuous algorithm updates.


Wednesday, January 28, 2015

Google Introduces Trash Can To Restore Your Lost Analytics Data

When was the last time you accidentally deleted a view, property, or account from your Google Analytics account? Until today, there was no feature available to pull back the lost data, but Google Analytics has now launched a new feature known as the Trash Can that allows users to restore data they delete accidentally.

Where to Locate the Trash Can?

The Trash Can can be found under the Administration tab. Just go to the Admin tab, select an account, and click the Trash Can option in the left-hand panel, where it is displayed under the accounts section. Check the data that you wish to reclaim, click "Restore", and done! Your view, property, or account is now just as it was before you deleted it.

When Will It Get Activated?

This feature has been activated on all GA accounts starting today, i.e., January 28, 2015.

How Long Will The Lost Data Remain In The Trash Can?

The lost data will remain safe for 35 days from the day you originally trashed it. After 35 days, say a final goodbye to your data.

This is a great feature. Do you love it? Please share your comments below.


Monday, January 26, 2015

Bing Reveals Secrets of Content Quality in Its Ranking Algorithm

I was browsing some old documents on content quality when I found that the Bing Search Quality Insights blog has an interesting post focused entirely on what Bing considers, in terms of content quality, before ranking web documents. This is a great effort from Bing to educate webmasters about the quality of content they should have on their sites, and I really appreciate it. Google had earlier provided some information on content and site quality in its E-A-T guidance. Information like this is useful for audiences and webmasters alike. I will break down the major components here and compare what Google and Bing suggest. I personally feel that creating content based on the factors discussed below will help your site gain rankings in both Google and Bing.

The 'AUP' Factor in Content Quality 

Content quality is judged mainly on the basis of three factors, which I will call the 'AUP' factor: Authority, Utility, and Presentation.


If someone searches for the query "healthy eating during pregnancy", what results do you expect as an end user? You would love to see results from authoritative websites, right? Something you can trust. Search engines have special trust algorithms that determine the trust factor associated with the content, its author, or the website. In addition, a variety of factors are used to establish and determine the authority of a page. These include signals from social networks, cited sources, name recognition, and the author's identity.

If we compare the search results returned by Google and Bing, both are more or less the same. Sites like Mayo Clinic, NHS, BabyCenter, and WebMD are trusted by both Google and Bing. So, what are the trust factors that are working for these websites?

See Also - How Google Might Determine Site Quality Based on Phrase Model

Bing and Google both generate a content quality score based on a number of factors. The content quality value is determined by comparing a web page to patterns found in known low-quality web pages such as parked web pages, content farm web pages, and/or link farm web pages. The presence or absence of each pattern can have a corresponding effect on the content quality value.

Negative Signals That Reduce the Content Quality Score

  • Duplicated content
  • Inaccurate or nonsensical content
  • Spelling and grammatical mistakes
  • Having a lot of ads on the site that distract the audience
  • Making the user read through a lot of filler before they find the actual content they are looking for
  • Use of overly large pictures that do not add value
  • Rewording existing content from other sites
  • Padding with commonly known facts, for example, "A lion is an animal. Lions live in the jungle. The lion is the king of the jungle."
  • Linking to low-quality sites
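To make the idea of a pattern-based content quality value concrete, here is a toy sketch of how such a score might be computed from negative signals like the ones listed above. The signal names and weights are my own illustrative assumptions, not values Bing or Google have published.

```python
# Hypothetical penalty weights for a few of the negative signals
# listed above; real engines use many more signals, with learned
# rather than hand-picked weights.
NEGATIVE_SIGNALS = {
    "duplicate_content": 0.30,
    "spelling_errors": 0.15,
    "high_ad_density": 0.25,
    "thin_reworded_text": 0.20,
    "links_to_low_quality": 0.10,
}

def content_quality_score(signals_present):
    """Start from a perfect score of 1.0 and subtract a penalty
    for every negative pattern detected on the page."""
    score = 1.0
    for signal in signals_present:
        score -= NEGATIVE_SIGNALS.get(signal, 0.0)
    return round(max(score, 0.0), 2)

# A page flagged for duplicate content and heavy ads:
print(content_quality_score({"duplicate_content", "high_ad_density"}))  # 0.45
```

The key point is not the exact numbers but the shape of the model: each low-quality pattern detected pulls the quality value down, and a clean page keeps the maximum score.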

Is the content sufficiently useful for the topic it is trying to address? The utility factor addresses whether a web page supplies ample supporting information. The level of depth and the presence of supporting multimedia content (instructional videos, images, graphs, etc.) are all covered under this factor.

Unique content is always preferred over recycled, low-quality, or duplicate content. Bing even discloses one of its ranking considerations publicly: "A great example of this are real estate listing sites. These sites generally syndicate information available elsewhere (via MLS or government sources). However, even these kinds of sites can move up in the ranking results if they set themselves apart with unique value that others in that category may not have, such as school information or nearby transportation options."

See Also: 7 Types of Content You Must Avoid in Your Site
See Also: How to Create User Intent Based Content


An easy-to-read page, a well-presented menu, proper internal links, an accessible design, and primary content that is easy to find are some of the basics of effective content presentation that search engines consider before ranking any web page.

Bing said it "will promote and support websites and webmasters that provide ads relevant to the content of their website and place ads so that they do not interfere with the user experience."

Examples of Low Quality Content Pages

Here are some examples of low-quality pages that will find it hard to rank in the search engines:

The page below displays a lot of ads, and the author information is missing. So, from the authority, utility, and presentation points of view, this page just sucks!

Formula for Judging the Content Quality

In Bing, the relevance of a result is a function of:

  • Topical relevance to the query (“Does it address the query?”)
  • Content Quality (as measured by the AUP factor), and
  • Context ("Is the query about a recent topic?", "What's the user's physical location?", etc.)

Ranking = f(topical relevance, context, content quality)
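As a toy illustration of the formula above, the three inputs might be combined linearly. The weights below are purely hypothetical guesses, not anything Bing has disclosed; the point is that strong content quality can outrank slightly better topical relevance.

```python
def ranking_score(topical_relevance, context_match, content_quality):
    """Toy version of Ranking = f(topical relevance, context,
    content quality). Each input is a score in [0, 1]; the weights
    are illustrative assumptions only."""
    return (0.5 * topical_relevance +
            0.2 * context_match +
            0.3 * content_quality)

# Two candidate pages for the same query: the slightly less relevant
# page wins because its content quality (the AUP factor) is much higher.
page_a = ranking_score(topical_relevance=0.9, context_match=0.5, content_quality=0.3)
page_b = ranking_score(topical_relevance=0.8, context_match=0.5, content_quality=0.9)
print(page_b > page_a)  # True
```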

Improving the content quality of your site is one of the easiest ways to help the search engines find more useful content for users. For all the SEO folks looking to increase their website's rankings: try the cocktail of the E-A-T + AUP factors, and I am sure you will get to see some good results.

Also See:

Understanding Google's Page Layout Algorithm
Google Indepth Articles
Quoted Search Results
Importance of Content for SEO
How to Make Your Website More User Friendly
Step by Step Seo Copywriting Guide
Seo Tutorial
5 Tools to Make Your Site Content Hummingbird Friendly
Seo Guide for Schema Vocabulary
Rich Snippets in Google
5 Ways to Boost SEO by Leveraging Google Brandvantage
Ways to Increase TrustRank
How Google Identifies Entities Using Attributes
Trust Button and Persona Pages
How Google Uses Contextual Search Terms
Taxonomic Classification to Find Real Context of Words

Friday, January 23, 2015

A New Structured Data Testing Tool For Better Results

Google recently launched a new version of the Structured Data Testing tool to better reflect Google’s interpretation of your content. The new tool can be accessed at this URL.

The new tool contains examples for all the available structured data types. It also has expanded support for the JSON-LD markup syntax.

Features of This New Tool

  • Structured data validation for all Google features
  • Support for JSON-LD syntax, including in dynamic HTML pages
  • Clean display of the structured data items on your page
  • Syntax highlighting of markup problems right in your HTML source code
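Supporting JSON-LD means the tool has to find the script blocks embedded in your HTML and parse them before it can validate anything. Here is a minimal sketch of that extraction step, assuming a simple regex scan; Google's actual parser is certainly more robust than this.

```python
import json
import re

def extract_json_ld(html):
    """Pull JSON-LD blocks out of a page the way a structured-data
    checker might, then parse them so syntax errors surface early.
    (A real validator also checks the vocabulary and Google feature
    requirements; this sketch only checks that the JSON parses.)"""
    pattern = re.compile(
        r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
        re.DOTALL | re.IGNORECASE)
    items = []
    for block in pattern.findall(html):
        items.append(json.loads(block))  # raises ValueError on bad syntax
    return items

html = '''<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Organization", "name": "Example"}
</script></head></html>'''
print(extract_json_ld(html)[0]["@type"])  # Organization
```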


Monday, January 19, 2015

Should You Stop Creating New Content and Instead Focus on Updating Old Ones?

When we talk about content marketing, the first thing that comes to mind is creating new content. Several webmasters and business owners have literally wasted tons of money creating content that generated little to no impressions. In spite of that, we follow the old tradition and keep on investing in new content simply for the sake of traffic and engagement. Alas, most of the time we fail.

Isn't it a better strategy to revamp old content instead of spending your time and money on creating new pieces? This may not hold true in every situation: for the latest news, you will need to create new material often. But what about evergreen content, which generates the maximum number of backlinks and social shares? Such content never gets old and presents an opportunity to make that piece stand apart from your competitors'. So, in 2015, where do you wish to spend your time and money: creating new content or revamping the old?

I would suggest going with a 60:40 ratio: use 60% of your resources to create new content and 40% to revamp the older content already on your site. I always like to give the example of Wikipedia. How did Wikipedia become so popular and trustworthy? Because it keeps updating its older content on a regular basis.

This is the reason none of its pages grow old, and they always rank at the top of Google. The pages are comprehensive, but they aren't created in a single day; they are created through tons of revisions that continue every day. We can take some mileage from this Wikipedia example and update the older content on some of our top landing pages. This would be extremely effective not only for the business but for the end user as well. Search engines highly value content that benefits the end user, and this strategy of updating older content might work really well.

Saturday, January 17, 2015

How to Include Your Business Social Profiles in Search Results Under Knowledge Graph?

Google has started displaying the social profiles of businesses alongside the Knowledge Graph. This is a good move by Google, as a brand's entire social presence can now become visible to the user with a simple brand-name search query.

For example, a search for the Samsung brand returns the social profiles maintained by Samsung under the Knowledge Graph panel displayed on the right-hand side.

Now, the question is: how does Google identify which is the official social account of a business? Or, how can you enable Google to identify and display your brand's social profiles in Google search results?

The answer is that you need to use structured data markup to specify your preferred social profiles. The social profiles currently supported are:


In order for Google to recognize structured data as social profiles, make sure you fulfill these requirements:

  • Publish the markup on a page of your official website.
  • Pages with markup must not be blocked from Googlebot by robots.txt.
  • Include a Person or Organization record in your markup with:
      "url" = the URL of your official website
      "sameAs" = the URLs of your official social media profile pages

You can use the vocabulary with the JSON-LD markup format for your markup.

Example Snippet for My Blog using JSON-LD markup format:

<script type="application/ld+json">
{ "@context" : "",
  "@type" : "Organization",
  "name" : "Seosandwitch",
  "url" : "",
  "sameAs" : [ "" ]
}
</script>

OR you can also use the microdata format:

<span itemscope itemtype="">
  <link itemprop="url" href="">
  <a itemprop="sameAs" href="">FB</a>
  <a itemprop="sameAs" href="">Twitter</a>
</span>

The SCRIPT block can be inserted anywhere on the page, in either the head or the body.


Wednesday, January 14, 2015

Google Introduces Analytics Spreadsheet Add-On - Schedule and Run Reports Automatically

Philip Walton, a developer on the Google Analytics team, recently introduced the Google Analytics spreadsheet add-on, which automates your work and saves you from manually editing analytics data in spreadsheets. It works like the Magic Script but with an easier interface; in fact, it is an enhancement of the older Magic Script. It brings you the power of the Google Analytics API combined with the power of data manipulation in Google Spreadsheets. This tool is useful for:

  • Querying data from multiple views.
  • Creating custom calculations involving several dimensions and metrics based on custom dates.
  • Creating visualizations with the built-in visualization tools and embedding them on third-party websites.
  • Scheduling your reports to run automatically at a future date.
  • Enhancing privacy settings to control who can view and edit your spreadsheet.

How to Install the Google Analytics Add-On

  • Create a new Google Spreadsheet (or open an existing one).
  • Choose: Add-ons > Get Add-ons… from the menu bar. (You can also get this add-on by clicking here)
  • Find the Google Analytics Add-on from the add-ons gallery and select it.
  • Click Accept when asked for permissions. A "Google Analytics" submenu should now appear in the Add-ons menu.

Have a look at the video below for an easy understanding:

How to Create and Run Reports?

  1. Open a spreadsheet and click Add-ons -> Google Analytics -> Create a New Report.
  2. A right-hand sidebar will appear. Provide a name for the report you wish to prepare.
  3. Select your account information and choose custom metrics and dimensions.
  4. When done, click the Create Report button.

This will create your custom report.

Now go to "Add-ons" > "Google Analytics" > "Run Reports" to run your first report. You can also go to "Add-ons" > "Google Analytics" > "Schedule Reports" from the menu bar; this opens a scheduling dialog where you can choose when your reports will run.
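Under the hood, the settings you fill in the sidebar correspond to a Google Analytics Core Reporting API query. The sketch below shows roughly what such a query looks like; the view ID is a placeholder, and the exact set of fields is an illustrative assumption rather than the add-on's internal format.

```python
from urllib.parse import urlencode

# Roughly the query behind a report: a view (profile) ID, a date
# range, and the metrics/dimensions you picked in the sidebar.
# "ga:12345678" is a placeholder; relative dates like "30daysAgo"
# are supported by the Core Reporting API.
report_config = {
    "ids": "ga:12345678",
    "start-date": "30daysAgo",
    "end-date": "yesterday",
    "metrics": "ga:sessions,ga:pageviews",
    "dimensions": "ga:medium",
    "max-results": 100,
}

query_string = urlencode(report_config)
print("?" + query_string)
```

Seeing the query spelled out this way makes it easier to debug a report that returns no rows: check the view ID, the date range, and the metric names first.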

Saturday, January 10, 2015

How to Do Content Marketing in 2015? Will Videos and Podcasts Replace Text Content?

What should your ideal content marketing strategy be in 2015? If your business is relying on text as its only content marketing channel, then you need to think again. I guess other modes of content marketing, like videos and podcasts, might win the race.

As we move ahead, people will invest less time reading content and will instead prefer to listen to it. Life is getting easier, and we have technologies to work for us! Videos and podcasts are the channels people will love to spend time on. As marketers, if we are not leveraging the worth of video marketing and podcast marketing, then we are completely going off the road in terms of content marketing for 2015.

Here are some points that you need to really figure out before you plan your content marketing strategy:

1- Who is your ideal audience?
2- On which digital channel are you receiving the maximum engagement? Is it Facebook, Twitter, YouTube, or Pinterest? Figure out which it is.
3- What are the demographics of your ideal audience?
4- Which piece of content has given you the maximum social shares? Ever wondered? Check it out with BuzzSumo.
5- Are you creating enough fat content, such as infographics, videos, ebooks, and white papers?
6- Is your strategy based on your goals?
7- Are you planning content based on devices? If your audience is on mobile, page load speed can become a critical factor. Are you considering it?
8- Is your content adding value, or just keywords?
9- Are you creating specialized content or general content?
10- How does the content you produce differ from that of your competitors? Or, how does it add more value than theirs?
11- Is your content marketing strategy documented?
12- What is your budget, and how do you plan for people to engage?
13- Are you also planning to measure content marketing ROI?
14- Having a blog generates over 60% of leads. Wow!! Are you seriously considering this?
15- Are you considering adding active (aka interactive) content, such as polls and quizzes, rather than passive content? Active content leads to 70% more conversions compared to passive content.
16- Are you looking at the customer's life cycle when planning your strategy?
17- What will your brand ultimately achieve through content marketing?
18- Are you developing personas for your unique audiences?
19- Is user-generated content (only experts, please) a better way of generating content that receives maximum engagement?
20- Have you ever thought of including in-person events in your content marketing strategy?

You need to listen to Haroon's Hangout, the internationally featured show on marketing innovation, discussing what an ideal content marketing strategy for 2015 should look like.

Also See:

7 Types of Content Every Content Writer Must Avoid Writing

Wednesday, January 7, 2015

How Google Might Connect Topics to Wikipedia Articles Using Probabilistic Entity Linking

The Hummingbird update and the Knowledge Graph form a vital part of Google's semantic search technology. The way Google predicts answers to searchers' questions is indeed remarkable, but Wikipedia and Freebase form the base of this technology. Google uses a probabilistic entity-linking technique to connect topics to existing Wikipedia articles. This is what we see for all queries that return results based on the Knowledge Graph.

Google probably uses a novel, efficient Gibbs sampling scheme that can also incorporate side information, such as the Wikipedia link graph. This conceptually simple probabilistic approach achieves state-of-the-art entity-linking performance on the AIDA-CoNLL dataset.

The ‘entity-linking’ task involves annotating phrases, also known as mentions, with unambiguous identifiers, referring to topics, concepts or entities. Mapping text to unambiguous references provides a first scalable handle on long-standing problems such as language polysemy and synonymy, and more generally on the task of semantic grounding for language understanding.

The constructed LDA model has each topic associated with a Wikipedia article. Using this 'Wikipedia-interpretable' LDA model, the topic-word assignments discovered during inference qualify directly for entity linking. The topics are constructed using Wikipedia, and the corresponding parameters remain fixed. The model has one topic per Wikipedia article, resulting in over 4 million topics, and the vocabulary size, including mention unigrams and phrases, is also in the order of millions. To ensure efficient inference, the authors propose a novel Gibbs sampling scheme that exploits sparsity in the Wikipedia-LDA model.

Have a look at the example below:

Here, the word "croft" could refer to two entities, Lara Croft and Robert Croft, but by linking England, bat, and inning together, Google can easily recognize that the mention refers to Robert Croft, the cricketer who played for England.
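A crude sketch of this kind of context-based disambiguation: score each candidate entity for the mention by how many of its related terms co-occur with it. The candidate entities and their related-term lists below are illustrative assumptions, not Google's actual data or algorithm.

```python
# Candidate entities for the mention "croft", each with a small set
# of related context words (illustrative lists only).
CANDIDATES = {
    "Lara Croft": {"tomb", "raider", "game", "video"},
    "Robert Croft": {"england", "cricket", "bat", "inning", "bowler"},
}

def disambiguate(mention_context):
    """Pick the entity whose related terms overlap most with the
    words surrounding the mention."""
    context = {word.lower() for word in mention_context}
    return max(CANDIDATES, key=lambda entity: len(CANDIDATES[entity] & context))

print(disambiguate(["England", "bat", "inning", "croft"]))  # Robert Croft
```

The real Wikipedia-LDA approach does this probabilistically over millions of topics, but the intuition is the same: surrounding words vote for the entity they co-occur with.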


Thursday, January 1, 2015

Types of Natural Language Search Results Google Displays For "Intent" or "Non Factual" Search Queries

The launch of the Hummingbird update added brains to the Google search algorithm. Google can now easily return answers to non-factual search queries instead of returning 10 blue links and asking users to find the answer themselves.

From a user's point of view, Google always wants the user to receive instant answers to their questions. NLSR (Natural Language Search Results) is Google's effort to decrease the time a user spends searching for an accurate answer.

With the help of its sophisticated algorithms, Google can find answers from authoritative sources and display them to the user, thereby reducing the time spent searching for an answer.

A natural language query is a query using terms a person would use to ask a question, such as "how do I make ice cream?". Some natural language queries are non-factual: a non-factual query is one that requests specific information about a topic. Google has the ability to identify clear-intent queries, match them to stored answers, and provide an enhanced search result with complete answers from one or more authoritative sources.
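The clear-intent detection described above can be sketched very crudely: natural-language question queries usually open with an interrogative word or "how to". The pattern below is an illustrative guess at a first-pass filter, not Google's actual classifier, which certainly uses far richer signals.

```python
import re

# Question-style queries typically start with an interrogative word.
INTENT_PATTERN = re.compile(
    r"^(how|what|who|when|where|why|which)\b", re.IGNORECASE)

def is_natural_language_query(query):
    """Return True when the query reads like a question a person
    would ask, i.e. the kind Google tries to answer directly."""
    return bool(INTENT_PATTERN.search(query.strip()))

print(is_natural_language_query("how do I make ice cream?"))  # True
print(is_natural_language_query("ice cream recipe pdf"))      # False
```

A keyword-style query like "ice cream recipe pdf" fails the pattern, which matches the intuition that such queries get the classic ten blue links rather than a direct answer.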

The return of direct answers is one of the elementary stages in which Google has started to behave like a QA engine instead of a search engine.

Example Search Snippets for NLP Results

Query 1- How to prepare green tea

Paragraph based search snippet

Query 2- How to add signature in outlook

List based search snippet

Query 3 - Who is the finance minister of India

Bolded answer based search snippet, with accurate detection of the answer from a paragraph in an authoritative source. A query like "What is the distance between earth and mars" returns a similar snippet.

Query 4 - 24*7

Direct mathematical answers as search snippet

Query 5 - Synonyms of beautiful

Text plus audio based search snippet

Query 6- Area code of las vegas

Knowledge graph based search snippets

Query 7- How to gain weight fast for men

NLP + PPC Ads Based Search Results

Queries based on how, who, and when often return an answer snippet, and in the years to come, this ratio of answers to reference links will keep increasing. More answers will be delivered by Google, making users search less and do more. The organic search visibility of websites will certainly be hampered, but semantic modification of content will give businesses an upper hand.

Example - the site ranking for "how to make chocolate cake" is able to divert a lot of traffic to its website because Google's semantic algorithms pick it to answer the user's query. If your site is listed as the answer to a query, you can divert an immense amount of traffic to your site, but doing that is not easy. Have a look at the screenshot below:

Happy Semantic Search in 2015!

Also See:

5 Tools to Make Your Site Hummingbird Friendly
Google Query Processing by Identifying Entities
What is LSI?
How Google Identifies Substitute Terms of a Query?
Google Patent to Identify Erroneous Business Listings
How Google Might Predict Mobile Search Queries
Co-Occurence Frequencies
Seo Tutorial
Taxonomic Classification While Finding Context of Search Query