Thursday, July 31, 2014

Google Adds Bot and Spider Filtering in Analytics

Earlier it was hard to distinguish bots from real traffic, as both were counted and displayed together in Google Analytics. But now, thanks to the latest improvement, you can exclude bot and spider traffic from your reports.

How to Exclude Bot Traffic?

1- Click on Admin and go to View Settings.
2- Check the box that says "Exclude all hits from known bots and spiders".
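The same idea, excluding hits from known crawlers, can be sketched on raw server logs. This is a minimal illustration: the patterns and log entries below are made up, not the actual IAB Spiders & Bots List that Google Analytics uses.

```python
import re

# Small illustrative sample of bot User-Agent patterns (not the real list).
BOT_PATTERNS = [r"googlebot", r"bingbot", r"crawler", r"spider", r"slurp"]
BOT_RE = re.compile("|".join(BOT_PATTERNS), re.IGNORECASE)

def is_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string matches a known bot pattern."""
    return bool(BOT_RE.search(user_agent))

def filter_human_hits(hits):
    """Keep only hits whose User-Agent does not look like a bot."""
    return [h for h in hits if not is_bot(h["user_agent"])]

hits = [
    {"page": "/", "user_agent": "Mozilla/5.0 (Windows NT 6.1) Chrome/35.0"},
    {"page": "/", "user_agent": "Mozilla/5.0 (compatible; Googlebot/2.1)"},
    {"page": "/blog", "user_agent": "Mozilla/5.0 (compatible; bingbot/2.0)"},
]
human = filter_human_hits(hits)  # only the Chrome hit survives
```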

Also See:

How to Create Goals in GA?
Google Analytics Interview Questions and Answers
New Google Quick View Button on Mobile Search Results
Query Highlighting on Google Search Results
How GA Collects Data From Mobile Apps

Tuesday, July 29, 2014

How Does Google Identify Entities Using Attributes?

Google has come a long way in accurately identifying the entities associated with search queries. A new Google patent describes how Google might identify entities more accurately using the attributes listed in the search queries entered by users. A user searching for a movie name might enter a search query such as:

Actor/Actress Name + movie dialog (including other words and stop words)

Actor/Actress Name + song from the movie (including other words and stop words)

For example:

Movie where Kate Winslet says "Teach me to ride like a man!"

Here, "Kate Winslet" is an entity (which can also serve as an attribute), and "Teach me to ride like a man!" is an attribute.

Entering this query gives us the following results:

The first result accurately predicts the answer as "Titanic". Google was able to do so by correctly matching the attributes in the query with the entity, which in this case was "Titanic".

Similar to the above example, there can be many more queries where the user might be looking for book names, place names, a particular medical condition etc. Whatever the entity, several features are associated with it. By accurately identifying those features and mapping the relationship between the entity and its attributes, Google is more likely to provide an accurate answer to a search query.

Here is an image depicting how Google might process a query using attribute identification:

The search query entered by the user goes through the entity identification system, which predicts the main entity using the matching attributes associated with it. The entity data store stores the individual entities, while the attribute data store stores the attributes or features those entities might possess.
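A hypothetical sketch of such an attribute-based system: an attribute store mapping entities to known attributes, and a scorer that picks the entity whose attributes best match the query. The entities and attributes below are illustrative, not Google's actual data stores.

```python
# Toy attribute store: entity -> known attributes (all lowercase).
ATTRIBUTE_STORE = {
    "Titanic": ["kate winslet", "leonardo dicaprio",
                "teach me to ride like a man"],
    "The Reader": ["kate winslet", "ralph fiennes"],
}

def identify_entity(query: str) -> str:
    """Return the entity whose stored attributes appear most often in the query."""
    q = query.lower()
    scores = {entity: sum(attr in q for attr in attrs)
              for entity, attrs in ATTRIBUTE_STORE.items()}
    return max(scores, key=scores.get)

query = "Movie where Kate Winslet says Teach me to ride like a man!"
# "Titanic" matches two attributes, "The Reader" only one.
```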


Monday, July 28, 2014

Dynamic Sitelinks in Adwords - FREE Ad Extension from Google

Yes, you heard that right: dynamic sitelinks are free when a user clicks on them, but don't get too excited, you still pay per click when someone clicks on the headline of the ad. Dynamic sitelinks are automatically generated sitelinks that appear below your ad text, making your ad more relevant to potential customers. The decision to display dynamic sitelinks rests with Google, and it may show them only in instances where it feels they might increase the relevance of the ads and the CTR.

It is suggested that you continue adding and optimizing your own sitelinks, because the overall impression share of dynamic sitelinks will be low.


Google Pigeon Algorithm to Power Local Search Results

Google has launched a new local ranking algorithm, and the SEO community has named this update "Pigeon". The algorithm powers local search results both on Google Maps and in the main web search results. The update was noticed after many webmasters witnessed changes in their local search rankings. Pigeon is not a spam-filtering algorithm; rather, it aims to deliver better quality local results.

The Pigeon Update - Set to Rule Local Search Results? 

This update has quickly started rolling out for both English and non-English search queries. Still, Google has provided no official information in this regard. We are waiting to see when Google will finally say something about this major local update and, interestingly, what name it gives it.


Alexa's New Keyword Research Tool - Look Up By Keyword and Discover Organic and Paid Traffic to Any Site

Alexa, the web information company owned by Amazon, has launched a new keyword research tool for internet marketers. It lets you look up a keyword and discover both the organic and paid keywords sending traffic to any website. Previously, Alexa let us find the organic and paid keywords sending traffic to a given site; now we can also start from a keyword and discover the sites receiving the most traffic from it.

Alexa Keyword Research Tool - What it Displays?

Suppose we wish to see which sites are getting the maximum traffic for the keyword "weather". Alexa will return the information displayed in the screenshot below:

It will display:

1- The top sites receiving traffic from the searched keyword.
2- The keyword data containing the list of other keywords sending traffic to each site.
3- The percentage of traffic coming from the searched keyword.

We can view this data for both 'organic' and 'paid' traffic.

By clicking on the 'view site keywords' link, we can reveal every popular keyword sending traffic to a site:

But hold on, this tool is available only to paid subscribers on a plan of $149 or above.


Saturday, July 26, 2014

How Google Might Predict Mobile Search Queries? Text Prediction Techniques and Research Findings

Due to the growing use of mobile search and the flood of mobile search queries, Google has had to apply new techniques to keep the overall search experience pleasant for users. Query prediction remains one of the main ways to reduce the interval between the manual entry of a query and the display of search results. For desktop users, Google has been successful in providing instant search and impressive query suggestions, which reduce the time a user spends manually entering a query. But in the case of mobile devices it is more difficult: the search patterns for mobile queries are different, and user behavior varies with the complexities of the device. This article is based on a research study published here which helps to uncover how Google might predict mobile search queries and which text prediction techniques it could use to automatically predict a search query.

Normally, most users type on a 9-key cell phone, and the average search query runs up to 15 letters long. It takes approximately 30 key presses and 40 seconds to enter such a query. That is too long, and text prediction systems can help close this gap. Three of the most common text prediction systems are:

eZiType - Completes individual words before they are fully typed.

T9 - Originally developed by Tegic Communications, T9 refers to the 9 buttons used on a cell phone. It is a predictive texting technology that lets the user enter text with just one key press per letter instead of, say, pressing a key 4 times to type the letter S.

iTap - This technology was developed by Motorola. It guesses the complete word when the user has typed only the first few letters.

As per this research, 2 sets of users were shown 2 types of screens: a query display screen and a query input screen. Users entered the query through the input screen, and it appeared on the display screen. In total, 6 interfaces were used, with query suggestions ranging from 0 to 6; different users were shown different interfaces. As per the results of this experiment, the workload time for queries with suggestions improved considerably, and the number of key presses required was reduced due to the presence of the query prediction systems.
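The T9 idea described above, one key press per letter with dictionary disambiguation, can be sketched in a few lines. The dictionary here is a small illustrative word list, not any real phone's lexicon.

```python
# 9-key keypad layout: each digit covers a group of letters.
KEYPAD = {"2": "abc", "3": "def", "4": "ghi", "5": "jkl",
          "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz"}
LETTER_TO_DIGIT = {ch: d for d, letters in KEYPAD.items() for ch in letters}

def encode(word: str) -> str:
    """Encode a word as its T9 digit sequence, e.g. 'home' -> '4663'."""
    return "".join(LETTER_TO_DIGIT[ch] for ch in word.lower())

def t9_candidates(digits: str, dictionary) -> list:
    """Return all dictionary words whose T9 encoding matches the digits."""
    return [w for w in dictionary if encode(w) == digits]

words = ["home", "gone", "hone", "good", "hood", "india"]
# '4663' is ambiguous (several candidates); '46342' resolves to one word.
```

One press per letter produces an ambiguous digit string; the dictionary lookup then narrows it down, which is exactly why prediction cuts the number of key presses compared to multi-tap entry.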

Also See:

Google Patent to Identify Erroneous Business Listings
New Google Patent to Identify Spam in Information Collected From a Source
Google Patent Named Ranking Documents to Penalize Spammers
Taxonomic Classification to Find Real Context of Words
Google Tag Manager
Query Highlighting on Google
How Does Google Apply Semantic Search?

Wednesday, July 23, 2014

New Display Targeting Reports in GA for GDN Advertisers

Starting today, you will be able to see a new display targeting report in Google Analytics. This new report will allow the advertisers to see how well the Google Display Network ads are performing. The new display targeting reports will show the following columns:

Display Keywords 
Ad Group
Bounce Rate

News of this new report was announced on Google Plus. The report is displayed in the Adwords tab under the Acquisition menu.

Also See:

How to Create Goals in GA?
Google Analytics Interview Questions and Answers
New Google Quick View Button on Mobile Search Results
Query Highlighting on Google Search Results
How GA Collects Data From Mobile Apps

Tuesday, July 22, 2014

Facebook Launches New Bookmarking Feature "Save" For Viewing Posts At a Later Time

Facebook today unveiled a new feature for saving posts to view at a later time. This new bookmarking feature is known by the name "Save". This feature is available for all Facebook posts and allows you to save links, places, music, books, movies and TV shows. 

Many times we are unable to dig deeper into the posts shared by our friends because of time constraints. Starting today, you can just click the save link displayed at the bottom right of a post and save your favorite posts to view later. This is an excellent addition to Facebook: because of the constant stream of updates on our timelines, users often get very little time to review the posts their friends share. Saving posts for later gives the important ones more visibility.

How To Use the "Save" Option on Facebook?

You can save links, places, music, books, movies and TV shows. From your News Feed, click Save in the bottom right of a post, or click the arrow beside any post and then select Save.

How to View the Saved Items?

Click Saved, displayed in the left column of your homepage.

Monday, July 21, 2014

5 Ways to Generate More Leads From Your Landing Pages

The page where a user lands is known as a landing page. Normally, when you promote your services through paid advertising channels, you want customers to land on a specially designed page that contains only the important, conversion-friendly elements. The landing page is the single most important element of the whole website: an elegantly designed landing page has a far greater chance of generating leads and conversions than a poorly designed one. Here are 5 useful ways to generate more leads from your landing pages:

1- Let The User Enter The Details ASAP

Don't make the user wait. The lead form should contain the minimum number of fields; in fact, it should contain only the ones required to initiate a conversion.

2- Use Arrows to Point to Your CTA Buttons

Many marketers who have used arrows to point to their conversion forms have reported dramatic increases in conversions.

3- Use a Smiling Woman Model Image to Attract and Convince Customers

Yes, this works! If you don't believe me, start testing today. A smiling female model placed intelligently next to your conversion form will persuade more people to fill in their details and give them more confidence in the services you provide.

4- Use the Word FREE

The magic word used by almost all the top brands in the world is "FREE". Think of how you can use this word without affecting your costs and watch the magic work. When people read the word "FREE", they instantly feel it won't cost them anything, so they readily fill up the form and generate a lead. They also fear that if they don't act today, the FREE offer may be gone tomorrow, so they don't want to miss the opportunity and fill up the form instantly. This is a psychological factor, and marketers exploit it to the maximum.

5- Design a Clutter Free Page - Don't Say Too Much 

The landing page design should be clutter free. Only the most important conversion-friendly and customer-attracting elements should be present; unnecessary text and images should be avoided. Say very little and persuade the customer to act. Remember, a landing page should be designed to make the user act, so refrain from saying too much.

Also See:

Landing Page Experience Ratings
Conversion Tracking in Adwords
How to Create a New Dashboard in Google Analytics
Google Makes Improvements in Real Time Reports
Difference Between Clicks and Visits

New Robots.txt Tester to Identify Errors

Google has recently launched a robots.txt tester to test your robots.txt file for errors. This new testing tool can be found under the Crawl section in Webmaster Tools. It is incredibly helpful for testing new URLs to identify whether they are disallowed from crawling.

You can now test the directives of your robots.txt file and check whether it is working properly. Once you are done with the testing, you can upload the file on the server to bring the changes into effect.

With this tool, you can also review older versions of your robots.txt file and see the issues that restrict Googlebot from crawling the website.
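The tool's core check, "is this URL disallowed for this crawler?", can be approximated locally with Python's standard-library robots.txt parser. The rules below are a made-up example, not any real site's file.

```python
from urllib.robotparser import RobotFileParser

# A tiny example robots.txt (illustrative only).
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check individual URLs against the directives, as the GWT tester does.
allowed = rp.can_fetch("Googlebot", "http://example.com/index.html")
blocked = rp.can_fetch("Googlebot", "http://example.com/private/data.html")
```

Note that Python's parser follows its own matching rules (first match wins), which may differ from Google's longest-match behavior on files that mix Allow and Disallow directives.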

This is a great new addition to GWT.

Also See:

Google's Take on Sitewide Links
List of Meta Tags Supported by Google
Google Human Quality Raters Do Not Influence A Website Ranking Directly
Google Now Cards
List of Meta Tags
Google Expands Knowledge Graph
Google Disavow Links Tool
Query Highlighting on Google

Thursday, July 17, 2014

Topic Modeling - How Google Extracts "Topics" and Establishes Relationship Between Documents

Topic modeling is a highly technical subject associated with the on-page optimization part of the search engine optimization process. A topic model is a kind of statistical model that helps to extract "topics" from web pages in order to determine the relevancy of a page with respect to search queries. It helps to discover hidden topic-based patterns and assigns scores based on the relevancy of documents. In documents related to the keyword SHIP, we can expect common words like "sea", "water", "fleet", "river", "mast", "vessel", "cargo", "fish" etc., whereas documents related to SPACESHIP would use words like "earth", "planets", "aircraft", "space mission", "satellite", "lunar", "orbit" etc. However, both kinds of documents might use common words like "the", "has", "is", "for" etc. A study of the words clustered around a particular topic helps to unveil the main topic of a document.

Similarly, topic modeling also helps to find the webpages most relevant to the main topic of the search query. For example, the query "Charles Dickens" will return webpages with a high score, based on the percentage of words associated with the query's topic, ahead of webpages with a low score. A document in which 70% of the words relate to the main topic will rank higher than one in which only 40% do. (Please note that actual Google ranking is based on more than 200 factors, and topic modeling is just one of those signals.)

The Types of Topic Models

LSA/LSI - Latent Semantic Analysis or Latent Semantic Indexing (For determining relationships)

The technique of LSA was suggested by Scott Deerwester, Susan Dumais, George Furnas, Richard Harshman, Thomas Landauer, Karen Lochbaum and Lynn Streeter in 1988. Commonly used in natural language processing, LSA identifies concepts present in two or more similar documents and establishes a relationship between them. It assumes that words close in meaning appear in similar sets of documents. For example, a common word like "pet" can appear in documents that discuss "dogs" and in documents that discuss "cats", while words like "litter" or "bark" will appear in documents about cats and dogs respectively. LSA uses a technique called Singular Value Decomposition to identify patterns and then establishes relationships between terms and concepts. LSI is helpful in overcoming the difficulties of processing similar meanings and multiple meanings.
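The core step, Singular Value Decomposition of a term-document matrix, can be illustrated with a toy example. The terms and counts below are made up; real LSA operates on weighted counts over thousands of documents.

```python
import numpy as np

# Rows: terms, columns: documents (doc0 about dogs, doc1 about cats).
terms = ["pet", "dog", "bark", "cat", "litter"]
X = np.array([[2, 2],   # "pet" appears in both documents
              [3, 0],   # "dog" only in the dog document
              [2, 0],   # "bark" only in the dog document
              [0, 3],   # "cat" only in the cat document
              [0, 2]],  # "litter" only in the cat document
             dtype=float)

# SVD factors X into term-concept, concept-strength, and concept-document parts.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Keeping only the top k singular values gives the low-rank "concept" space
# that LSA uses to relate terms and documents.
k = 1
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
```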

LDA - Latent Dirichlet Allocation (For identifying topic probability)

This model is helpful in determining a topic for a document. Let us take an example:

Consider these sentences:

  • Online shopping requires the use of credit card.
  • Major online shopping sites accept payments through credit cards.
  • Chocolate cake is one of my favorite recipes.
  • You need eggs and cocoa powder to prepare chocolate cakes.

Through LDA we can determine that the 1st and 2nd sentences relate to the topic "online shopping" and the 3rd and 4th to the topic "chocolate cakes". LDA is pretty accurate at identifying a mixture of topics and sorting them according to fixed percentages.

For example, in the 1st sentence, the topic split works out as follows:

Total words = 8
Words related to "online shopping" = 2
Share of "online shopping" = 2/8 = 25%

Similarly, for all the 4 sentences mentioned above, we will get an LDA ratio of 1/2 online shopping and 1/2 chocolate cake.
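The percentages above can be reproduced with a simple word count, assuming hand-labelled topic vocabularies (a real LDA model learns these distributions instead of being given them):

```python
# Illustrative topic vocabularies; a trained LDA model would infer these.
TOPIC_WORDS = {
    "online shopping": {"online", "shopping", "sites", "payments"},
    "chocolate cakes": {"chocolate", "cake", "cakes", "recipe",
                        "eggs", "cocoa"},
}

def topic_mixture(sentence: str) -> dict:
    """Fraction of the sentence's words belonging to each topic."""
    words = sentence.lower().rstrip(".!").split()
    return {topic: sum(w in vocab for w in words) / len(words)
            for topic, vocab in TOPIC_WORDS.items()}

# First example sentence: 8 words, 2 of them topic-related -> 2/8 = 25%.
mix = topic_mixture("Online shopping requires the use of credit card")
```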

ESA - Explicit Semantic Analysis

ESA was designed by Evgeniy Gabrilovich and Shaul Markovitch to properly define and assign labels to different concepts. It is similar to LSI but differs in that it uses a knowledge base such as Wikipedia, the ODP (Open Directory Project) or the Reuters Corpus to label concepts or entities. The Knowledge Graph used by Google is often cited as an example of this kind of concept labelling in action.

Topic Models and Search Engine Rankings

Topic modeling has been used extensively by search engines like Google and Bing. The vector analysis used in their algorithms helps the search engines store and retrieve data based on topics and their relationships, and Google has been successful in implementing scalable search technology based on topic modeling. An internet marketer with a sound knowledge of techniques like LSI, PLSI, ESA and LDA can draft a document in a manner that works with the algorithmic values already in use: creating documents focused around topics, and using related words throughout, can make a document highly relevant in the eyes of search engines. Techniques such as LSI, TF-IDF scoring, co-occurrences and co-citations, and the Knowledge Graph are all applications of topic modeling. However, as said earlier, Google uses more than 200 signals to determine rankings, and topic modeling is just one of them.


Wednesday, July 16, 2014

Why Remarketing in Adwords is Great For Targeted Branding?

Branding is vital for a business's success, and remarketing in Google Adwords is just the tool every business owner will fall for to get their brand noticed across several web properties at the same time. Marketing, in the simplest terms, means promoting services or products to a group of targeted audiences through various channels in order to generate profits; remarketing means targeting the same users again and again until they convert.

Remarketing in Adwords

In layman's terms, remarketing in Adwords involves displaying banner and text ads to users who have previously visited your website. These advertisements follow them no matter which websites they visit (almost all popular websites run AdSense, so chances are your ad will be displayed on almost every site the user opens). This means they will keep seeing your ad until they convert or clear their cookies.

Now, this is what any business needs for branding! Any business owner would love to have his brand name displayed everywhere his targeted audience goes. In simpler terms, brands want to follow their users, and remarketing helps businesses do exactly that. Remarketing in Adwords is one of the best tools available for branding.

Dynamic Remarketing

There is nothing better than dynamic remarketing, which takes remarketing one step further. It helps to display customized ads to each and every user. Suppose your brand needs to target students in the age group 16-20: with the help of dynamic remarketing you can show tutorials customized for each student, as some of them might be interested in Maths while others in English. Ads can change on the go and will display only those tutorials the students are interested in.

Recently, Rand Fishkin was quoted as saying:

"That's why SEO is neuropsychology. SEO is conversion rate optimization. SEO is social media. SEO is user experience and design. SEO is branding. SEO is analytics. SEO is product. SEO is advertising. SEO is public relations. The fill-in-the-blank is SEO if that blank is anything that affects any input directly or indirectly."

For the branding part, a single channel, remarketing in Adwords, can prove really beneficial for your overall digital marketing campaign. Some SEOs feel they must rely only on link building and content marketing, but the truth goes far beyond that. These days, SEO is more like branding, and promoting brands involves one or more paid channels of internet marketing, which indirectly boosts the progress of the other channels. Remarketing can be done on a tight budget, so without spending much, brands can still target audiences and convert them while improving their brand value at the same time.

Also See:

New Adwords Extension - Test it Out

Tuesday, July 15, 2014

Troubleshooting Incorrect hreflang Annotations With International Targeting in GWT

Sites with more than one regional variation use the hreflang tag to serve the correct language and URL to users in particular regions and languages. This setup enhances the overall user experience. But sometimes the setup goes wrong, and incorrect hreflang annotations degrade the user experience, affecting the site's presence on Google.

Starting today, Google has made troubleshooting hreflang annotations easier. It will now help you identify 2 of the most common issues with hreflang annotations. You can find these errors in the International Targeting section under Search Traffic. The issues are given below:

Missing Return Links

If pages on your site are missing return hreflang links, Google will find those links and display them in GWT.

Incorrect hreflang Values

The value of hreflang should be a language code in ISO 639-1 format such as "es", or a combination of language and country code such as "es-AR". If the values are not in the proper format, Google will provide example URLs to help you fix them.
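A quick format check for the values described above, an ISO 639-1 language code optionally followed by an ISO 3166-1 Alpha-2 country code, can be written as a small regex. Real-world hreflang values may also use script subtags or "x-default", which this minimal sketch deliberately ignores.

```python
import re

# Matches "es" (language only) or "es-AR" (language-COUNTRY).
HREFLANG_RE = re.compile(r"[a-z]{2}(-[A-Z]{2})?")

def is_valid_hreflang(value: str) -> bool:
    """Return True if the value looks like 'es' or 'es-AR'."""
    return HREFLANG_RE.fullmatch(value) is not None
```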

Also, Google has moved the geographic targeting setting under the International Targeting section, for easier management of multilingual sites.

Also See:

Solving Duplicate Content Issue of Multi Language Sites
Search Queries in Webmasters Tool
How to Find Out the Total Number of Pages Blocked by Robots.txt
Google Tag Manager
Google Adwords Bidding
Multi Channel Funnel in Google Analytics
Duplicate Content and SEO
URL Canonicalization

Monday, July 14, 2014

Why Offline Promotion is Important to Boost SEO?

Planning a link acquisition strategy is really hard, and concentrating only on promoting a brand online can make it dull, boring and sometimes unnatural. The better approach is to follow a dual pattern and promote the website both online and offline. The obvious objection is that Google only counts online activity and does not consider offline promotion, so how is that going to benefit a website in terms of SEO? The answer is fairly simple: promoting your brand offline will fetch natural backlinks and brand mentions online, and those certainly aid SEO.

Tactics of offline Brand Promotion That Aids SEO

Offline promotion basically consists of traditional media: brochures, word-of-mouth, banners, billboards, advertisements in newspapers and magazines, radio and television ads etc. All these traditional channels of marketing can fill the void most of us fail to see while promoting our sites online. Here are some useful strategies that will help brands raise their online reputation and brand factor simply by investing some money in offline promotion:


Attending Conferences

Participate in conferences and see your brand become a common name in your niche.

Sponsoring Or Organizing Contests

Sponsor contests and events and help your brand reach the masses who are currently ignoring brand promotion online.

Brochures and Business Cards

Distribute brochures and business cards. They must look stunning and unique; the audience should instantly recognize your brand, such should be the power of the design.

Free Gifts

Occasional distribution of gift coupons is also a great strategy to reach even more people.

Blogger Outreach

Participating in blogger events and meeting bloggers who could take your brand to the next level is a great way to get started. Yes, it takes time and effort to build relationships, but the end results are satisfying.

The New SEO Strategy - Online + Offline Promotion

We need to remember that online promotion should be coupled with proper offline promotion focused on increasing brand value. The overall brand value generated online and offline will surely boost the current search engine optimization strategy. The biggest benefit would be natural backlinks: some of the audiences you reach offline will have an online presence, and if you impress them, they will surely link back to your site in one way or another. This is what you need today in order to succeed in SEO. Don't invest time and money in creating unnatural links all by yourself, which Google has already started ignoring; instead, go for offline promotion and earn some relevant natural links. Moreover, almost all of us have a social presence, and a small tweet or share mentioning your brand can go a long way in convincing Google of the popularity and reputability of your brand. Yes, this is far better than tweeting about your brand from the same profile again and again.

I hope you got the crux here: the new SEO strategy should span both online and offline. The reason is obvious: natural backlinks and brand mentions on social media.


Friday, July 11, 2014

Focus on "EAT" and See Your Rankings Climb Up!

Google has recently updated its manual quality rating guidelines to separate high quality sites from low quality ones. The new guidelines focus on the concept of EAT, which stands for "Expertise", "Authoritativeness" and "Trustworthiness". As we have already covered earlier on Seosandwitch, Google has shifted its focus to the brand value of a website, and reputed brands do enjoy higher rankings on Google; the concept of EAT underlines this. The idea is simple: if users trust the information presented on your site, or if the information is written by an expert author, Google will rate the quality of your site higher. Higher ratings mean higher rankings. Now, let's look in a little more detail at how one can earn high ratings on expertise, trust and authority.

Understanding EAT - Expertise, Authoritativeness and Trustworthiness


Expertise

The author of the content matters. For example, if we search with a query like "How to do SEO", we see Google returning results from authors like Matt McGee and Rand Fishkin, who hold true expertise in the SEO niche. With the help of Google Authorship, Google already knows the person behind the content, and if that person is an expert in a particular niche, the webpages authored by the expert might rank higher on the search engines due to a high quality rating score. It is important to mention that many factors are responsible for ranking a web page, so actual search results are query specific. However, the expertise of an author is one of the factors when it comes to trusting the contents of a webpage.

Here are some more examples:

Danny Sullivan is an expert on topics related to search engines and associated topics.
David Amerland is an expert on topics related to semantic search.

I am sure you got what Google is trying to portray!


Trustworthiness

Sites accepting user-contributed content often lack trust and expertise; forum and Q&A sites are examples. The level of engagement the content receives and the authors associated with it determine how much trust should pass on to the document. The questions to ask yourself are: "Would you trust the advice given on the webpage?" and "Is the advice shared by a person with enough experience or reputation in the field?" If the answers are positive, the page will receive a high trust rating.


Authoritativeness

The authority of a site depends on several factors: the number and quality of inbound links, the Domain Authority of the domain, the Page Authority of the webpage, its PageRank, its social dominance etc. Among these, links are the primary criterion determining the authoritativeness of a webpage. The higher the authority of a domain, the better the quality rating it receives.

Key Takeaways:

1- Use supplementary content on your site to increase the relevancy score and let the user engage with your site for a longer period of time. Displaying similar content or products is an example of supplementary content. Greater emphasis has been laid on supplementary content when rating a page as high quality.

2- An overabundance of advertising is a strict NO under these new ratings. Excessive above-the-fold ads, pop-up ads, inline advertising, ads made to look like navigation links, and ads that irritate users might hamper the ratings of the site.

3- Presence of contact information on the site will make the site look genuine and e-commerce merchants should have return and exchange policies clearly mentioned on the site for better results.

4- QA sites having webpages without an answer are to be considered low quality.

5- Thin affiliate sites will have low ratings. 

6- Online reputation of a site or brand will matter. Sites having negative reputation will be given a low rating.

7- Pages that lack purpose or contain gibberish content will be considered low quality.

Hope this provides some useful insight into how Google rates the quality of a webpage and how you can work towards improving the reputation and rating of your site.

Also See:

Domain Authority and Its Impact
Authority and Relevance in SEO
How to Clean Up Your Link Profile?
All Forms of Link building are Wiped Out by Google
Google Will Provide Unnatural Link Examples
Google Disavow Links Tool