Google Seo Sandwitch Blog

Thursday, October 2, 2014

Robots.txt - The Complete Guide With Examples

Robots.txt is a text file used to stop web crawlers from accessing certain files and folders on a server that the webmaster does not want to appear in search engines. Without specific instructions, search engine crawlers access every folder and file present on the server, so private information may become public if the proper commands are not specified in the robots.txt file. Also, this file must be placed in the root of the server.

WWW robots (also called wanderers, crawlers or spiders) are programs that continuously visit pages on the World Wide Web by recursively retrieving linked pages. Robots.txt is a plain text file containing commands that direct such web robots to crawl, or not crawl, certain files or folders. The robots.txt is the first file that crawlers read when they access a web server. It is important to note that certain web crawler programs may not abide by the instructions provided in the robots.txt file and might crawl the private files and folders as well, but in general robots follow the commands provided in the file.

The Robots Exclusion Standard - A Short History

The Robots Exclusion Standard, or Robots Exclusion Protocol, is a set of rules advising web crawlers or robots to ignore certain parts of a website that are restricted from public viewing. Credit for proposing the Robots Exclusion Protocol is attributed to Martijn Koster, who suggested it while working for Nexor around 1994. The robots.txt file was popularized shortly afterwards, when it was used to restrict the crawlers of Aliweb (one of the earliest search engines).


Martijn Koster - The man behind the creation of robots.txt

The Robots Exclusion Standard is not an official standard backed by a standards body, nor is it owned by any commercial organisation. The protocol is not governed by any organization and as such is not enforced by anybody, and there is no guarantee that all current and future robots will use it.

The robots.txt file consists of five major parameters (fields or directives):

A- User-agent - This field holds the name of the robot the rules apply to. For example, if the instructions are for Google's search engine bot, this field will hold the value:

User-agent: googlebot

B- Disallow - This field specifies the names/paths of files and folders which the crawlers must ignore. For example, if the folder "passwords" needs to be disallowed then the following command will be written:

Disallow: /passwords

C- Hash (#) - This marks a comment. If you wish to add lines of text to the file that should not be executed as directives, prefix them with a hash. The line below adds a comment to the disallow command above:

Disallow: /passwords # this line will disallow the folder named passwords

D- Allow - Just the opposite of Disallow. This field permits crawling of the specified files and folders, and is typically used to open up a path inside an otherwise disallowed section. The line below will allow crawling of the folder named "passwords":

Allow: /passwords

E- Crawl-delay - This parameter will set the number of seconds to wait between successive requests to the same server. For example, if you want the crawlers to wait for 5 seconds, the following command needs to be written:

Crawl-delay: 5

## Note that the crawl-delay parameter is not supported by Googlebot. A combined example using all five parameters is shown below.
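
Putting the five directives together, a simple robots.txt might look like the following (a sketch only; the folder names and the 5-second delay are illustrative):

User-agent: *
Disallow: /passwords # keep this folder out of search engines
Allow: /passwords/public/ # but allow this one subfolder
Crawl-delay: 5 # ask supporting crawlers to wait 5 seconds between requests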

Where Should You Place the Robots.txt File?

It should be placed in the root folder of your server, so that it can be reached at a URL like this:

www.examplesite.com/robots.txt

Please note that robots.txt is a publicly available file, so anyone on the web can visit the URL directly and see the contents of your file.

When Should You Use Robots.txt?

1- Prevent indexing of an unannounced site.
2- Prevent crawling of an infinite URL space.
3- Prevent indexing of the site search folder.
4- Block indexing of customer account information.
5- Block all checkout and payment related pages.
6- Prevent indexing of duplicate files and folders that do not serve any user purpose.
7- Block crawling of individual user reviews on the site.
8- Disallow crawling of widget and CMS related folders.
9- Disallow access to the customer cart folder.
10- Prevent indexing of online chats happening on the site (a sample file covering several of these cases is shown after this list).
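
For an online shop, several of these cases might be covered with just a few lines like the following (a sketch only; the folder names are illustrative and must match your own site structure):

User-agent: *
Disallow: /search # internal site search results
Disallow: /cart/ # customer cart folder
Disallow: /checkout/ # checkout and payment pages
Disallow: /account/ # customer account information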

List of Popular Robots User Agents

Baiduspider
Bingbot
Googlebot
Googlebot-Image
GurujiBot
iaskspider
ia_archiver
magpie-crawler
Mediapartners-Google
msnbot
NetResearchServer
NewsGator
OOZBOT
Orbiter
Seekbot
sogou spider
Sosospider
Speedy Spider
TweetedTimes Bot
TwengaBot
Yahoo! Slurp
Yahoo! Slurp China
YahooSeeker
YahooSeeker-Testing
YandexBot
YandexImages
YandexMetrika
Yasaklibot
Yeti

Examples

1- To restrict crawling of URLs starting with /banana/cookie/ or /tmp/, and of the file named apple.html:

# robots.txt for http://www.example.com/

User-agent: *
Disallow: /banana/cookie/ # This is an infinite virtual URL space
Disallow: /tmp/ # these will soon disappear
Disallow: /apple.html

2- To exclude all crawlers from the entire server

User-agent: *
Disallow: /

3- To allow crawling of all files and folders

User-agent: *
Disallow:

## You don't need a robots.txt file to allow crawling of your entire server, because by default every crawler will access the contents of your server.

4- To exclude all dynamically generated pages that carry the parameter value "reply":

User-agent: *
Disallow: /www/cgi-bin/post.cgi?action=reply
   
#In this case, only the URLs beginning with this path and carrying the "reply" parameter will be excluded from search. Such URLs will be:

http://example.com/www/cgi-bin/post.cgi?action=reply&id=1
http://example.com/www/cgi-bin/post.cgi?action=reply&id=5 etc.

The following URLs will be crawled:

http://example.com/www/cgi-bin/post.cgi?action=edit
http://example.com/www/cgi-bin/post.cgi?action=comment

5- To disallow the folder named "web" but allow its subfolders named "webone" and "webtwo":

User-agent: *
Disallow: /web/
Allow: /web/webone/
Allow: /web/webtwo/   

6- To enable crawling of the site for Googlebot but disallow crawling for Bingbot:

User-agent: googlebot
Allow: /

User-agent: bingbot
Disallow: /

7- To block all URLs with the word "froogle" followed by an underscore:

Disallow: /froogle_

8- To block the search folder from crawling:

User-agent: *
Disallow: /search

Should You Block Duplicate Pages Using Robots.txt?

Listen to what Matt Cutts has to say (embedded video).

Use of Wildcards in Robots.txt (More Examples)

A wildcard, denoted by the asterisk sign (*), is a character that can stand in for any sequence of matching characters. You can use wildcards to allow or exclude indexing of a large set of specific URLs with a single rule.

Google, Bing, Yahoo, and Ask support a limited form of "wildcards" for path values. These are:

* designates 0 or more instances of any valid character
$ designates the end of the URL

1- To block all URLs containing a question mark:

User-agent: *
Disallow: /*?

2- To block all URLs that start with /ebooks? and contain the parameter q= after it:

User-agent: *
Disallow: /ebooks?*q=*

## This will block the following URLs:

/ebooks?q=string
/ebooks?q=parameter etc.

but will allow the following ones:

/electronics?q=string
/pamper?q=parameter etc.

3- To exclude all URLs that end with .jpeg:

User-agent: Googlebot
Disallow: /*.jpeg$

4- To exclude all URLs that end with .gif:

User-agent: Googlebot
Disallow: /*.gif$

How to Create Your File?

The simplest option is to open a plain text editor such as Notepad and type the instructions directly into it. Then save the file with the name "robots"; Notepad will add the .txt extension automatically, so you don't need to type the full name robots.txt while saving. After the file is created, upload it to the root of your server so that it can be fetched at the address below:

www.yoursite.com/robots.txt

Another way is to use an online robots.txt generator. Some such tools are listed below:

http://tools.seobook.com/robots-txt/generator/
http://www.yellowpipe.com/yis/tools/robots.txt/
http://www.mcanerin.com/EN/search-engine/robots-txt.asp
http://www.robotsgenerator.com/

How to Test Your File?

The best way is to log in to your Google Webmaster Tools account and test your robots.txt file there. You can preview changes and see whether the file is working properly or not.

Test your robots.txt using Google Webmaster Tools:


  1. From the Webmaster Tools home page, choose the site whose robots.txt file you want to test.
  2. Select the robots.txt Tester tool from the Crawl heading.
  3. Make test changes to your robots.txt in the built-in text editor (edits made here are not saved to your live file).
  4. Correct any syntax warnings and logic errors that are shown.
  5. Type the path or URL extension you want to test in the text box at the bottom of the page.
  6. Select the user-agent you want to simulate.
  7. Click the TEST button next to the user-agent dropdown list to run the simulation.
  8. Check whether the TEST button now reads ACCEPTED or BLOCKED to find out if the URL you entered is blocked from Google's web crawlers.
Alternatively, there are several online tools that will help you check your robots.txt file. Some recommended tools are listed below, and a quick programmatic check follows the list:

http://www.feedthebot.com/tools/
http://www.frobee.com/robots-txt-check
http://tools.seobook.com/robots-txt/analyzer/
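
If you prefer a quick programmatic check, Python's built-in urllib.robotparser module can also test whether a given user agent is allowed to fetch a URL. A minimal sketch (the example.com addresses are placeholders for your own site and paths):

import urllib.robotparser

# Point the parser at the live robots.txt file and parse it
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# Ask whether a specific crawler may fetch a specific path
print(rp.can_fetch("Googlebot", "http://www.example.com/search?q=test"))  # False if /search is disallowed
print(rp.can_fetch("*", "http://www.example.com/index.html"))             # True if the page is not blocked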

Can You Use Robots.txt to Optimize Googlebot's Crawl?

Matt Cutts' advice (embedded video).

Limitations of Robots.txt

  • Even with clear instructions, the protocol is advisory, meaning that it may or may not be followed by search engine crawlers. It is like a rule book: well-behaved robot programs will read and follow the instructions, while badly behaved ones will ignore them.
  • Robots.txt is a public file, so every instruction specified in it is made public. This means the names of any "secret" folders you list are visible to anyone, including hackers.
  • There is no official standards body that governs the usage of the robots.txt protocol.

Noindex Meta Tag vs Robots.txt

Webmasters have the choice of using either the robots.txt file or the noindex meta tag to keep URLs out of search results. Unlike robots.txt, the noindex meta tag lets robots crawl a page but instructs them not to index it, and it must be assigned individually to each page.

An example is provided below:

<meta name="robots" content="noindex" />

## This tag will go in the head section of every page that needs to be blocked from indexing.

Things to Keep in mind:

1- The noindex meta tag tells crawlers not to index the contents of the page. The crawlers will read the contents of the page and pass any link juice it has, but will not index the page.

2- With the robots.txt file, there is a chance that a blocked URL still appears in the search results, but without any snippet, because the page content remains blocked from the search engines.

3- The best way to remove URLs from Google's index is through the use of noindex meta tags or with the help of the Google URL removal tool.

4- Only one "Disallow:" line is allowed for each URL in the robots.txt file.

5- If your site has several subdomains, each subdomain will require its own robots.txt file.

Also See:

How to Find Out the Total Number of Pages Blocked by Robots
51 Secrets You Didn't Knew About Google
50+ Seo Tips
Rich Snippets in Google
How to Add Ratings and Review Stars on Google Search Results
Query Highlighting on Google Search Results
List of Google Search Operators
Google Tag Manager
5 Ways to Fix Duplicate Content Issue
How to Set Up a Custom 404 Page

Sunday, September 28, 2014

Planning a Winning SEO Strategy - 7 Step Guide with Examples

How do you plan an SEO strategy that always works? As inbound marketers, we are always looking for the best SEO strategies to put in place in order to generate the maximum ROI for our clients. Some strategies fail while others give us excellent results. What do all those winning strategies have in common that can be applied to each and every site, giving us a formula for SEO success?

Well, in this post, I will present my thoughts on planning a successful SEO strategy by carefully implementing 7 steps.

1- Understand Business Objectives and Derive KPIs

Every business thrives on certain objectives. As internet marketers and search engine optimizers, our purpose is not only to understand how search engines work, but also to understand what the business owner expects from our work. The measurement of success can be different for different business owners. Hence, it becomes necessary to identify the business objectives and derive proper KPIs (Key Performance Indicators) before any SEO strategy is planned.

Here are a few example objectives and their corresponding KPIs.

Objective: A new restaurant in New York wants to make its brand popular.
KPIs:
·         Increase in the total number of unique visitors
·         Increase in the number of brand mentions
·         Brand coverage on popular news properties or industry-based authoritative properties
·         Increase in the number and quality of tweets, comments and likes on the brand's social profiles

Objective: An online gifts shop wants to increase its profits and ROI.
KPI: Increase in the number of conversions and their monetary value. This can be divided into 2 parts:
Primary KPI - Total value of conversions and goal conversion rate
Secondary KPI - Increase in the number of user registrations

Objective: An online ad revenue generating site wants to increase traffic.
KPIs:
·         Increase in the number of total visits
·         Increase in rankings on high traffic keywords

Objective: A book publisher wants to target students in the age group 18-24.
KPI: Percentage increase in total traffic from audiences in the age group 18-24 (you may track this with the help of Demographics and Interest Reports in GA).

Objective: A new shampoo selling brand wants to target a female audience.
KPI: Percentage increase in female visitors (you may track this with the help of Demographics and Interest Reports in GA).

As the table above suggests, it is important to determine the KPIs before you actually start planning your main strategies. Although the above KPIs are simple, they can be changed as per the exact needs of the business. I would recommend reading this excellent KPI tutorial from Avinash Kaushik for a deep dive into the world of KPIs.

2- Perform a Complete Site Audit (Link + Design + Content)

Once you know your business objectives and have the KPI’s in place, the next step is to perform a complete site audit. A comprehensive site audit should be completed in 3 steps:

Link Audit

Before you start working on any site, make sure to perform a complete analysis of the links it already has. Many times business owners fall into the trap of buying cheap, low quality link building packages without knowing that this sort of unnatural link can be disastrous for the online visibility of their site. A thorough link audit will disclose any unnatural links pointing to the site that might hurt it in the long run. Also, make sure to check GWT (Google Webmaster Tools) for any manual link penalty which Google might have applied to the site.

A link audit should be done using any of the available tools like OSE, Ahrefs, GWT or Majestic SEO. Every link audit report must specify at least 3 metrics:

A-     Whether the link is natural or unnatural (this is tricky and requires manual review, even though you can take the help of tools like Link Detox to make it easier). Have a look at the chart below for determining the intent of the link.


Image Credit: BruceClay 

B-     The DA/PR of the site providing the link.

C-      The anchor text, in order to determine the percentage of exact match anchor texts.

A chart must be prepared that clearly shows the ratio of natural vs. unnatural links. This can help in making faster and more accurate decisions.

Design Audit

The design audit should be done to optimize the site layout in order to make it both search and user friendly. Some issues that should be taken care of are mentioned below:

A-     The site should be redirected properly, and there should not be any canonicalization issues. For more help, see Matt Cutts' advice here.

B-      The design should be responsive and mobile friendly. (The world is going mobile; you need to be prepared for it.) See the recommendations for building smartphone-optimized sites.

C-     404 errors should be located and fixed. See Google’s advice here

D-      Proper sitemaps should be provided. Both sitemap.xml (for the search engines) and sitemap.html (for the user) should be present on the site. Have a look at Google’s suggestion on sitemaps

E-      Site load speed should be optimum. (You may check the site speed here.)

F-     Proper internal links with appropriate natural anchor text should be used. I recommend reading John Doherty's post on smarter internal linking.

G-      A correct robots.txt should be specified. Check out Moz’s cheat sheet on robots.txt

H-      The images used should be properly optimized. Have a look at this excellent guide from Neil Patel to learn the secrets of image optimization. 

I-     The main KPI-improving design elements should be placed where the chances of conversion are highest. For this, you may take the help of In-Page Analytics in GA.

Content Audit

Thanks to the Panda update, site content needs to be more user friendly and less keyword stuffed. Content is the main fulcrum around which the user revolves; it needs to be comprehensive and problem solving. Remember, every user turns to search engines to resolve queries, which can be transactional, navigational or informational. As an SEO, you must think from the user's point of view and ask yourself, "Does the content specifically answer what the user demands?" and "Will the user engage with the content?" If the answers are NO, content enhancement and SEO copywriting are needed.

A content audit should be done keeping in mind the following factors: 

A-     The content should be unique and user friendly. (See- how to create user intent based content?)

B-      Comment space should be made available. (See- how positive comments can increase Google rankings)

C-      It should be comprehensive with proper images, videos and interactive elements.

D-     It should be written keeping in mind the targeted audience. 

E-      The site should not have empty content pages or low quality content pages.

F-      Every page should serve a purpose and the content should be focused around that purpose.
G-     Use of H1, bold and italics should be taken care of.

H-     Technical factors like TF-IDF score and semantic relevancy should be analyzed properly (a quick illustration of TF-IDF is given after this list). For more help, I would recommend reading the Moz tutorial on how usability, user experience and content affect search engine rankings.
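
For reference, TF-IDF simply weights a term by how often it appears on a page (term frequency) and how rare it is across the rest of the pages being compared (inverse document frequency). A toy calculation in Python with made-up counts, just to show the idea:

import math

term_count_on_page = 5       # the term appears 5 times on this page (hypothetical)
total_terms_on_page = 500    # the page contains 500 words in total (hypothetical)
pages_in_corpus = 1000       # number of pages being compared (hypothetical)
pages_containing_term = 50   # pages on which the term appears at least once (hypothetical)

tf = term_count_on_page / total_terms_on_page             # 0.01
idf = math.log(pages_in_corpus / pages_containing_term)   # log(20), roughly 3.0
print(round(tf * idf, 4))                                 # TF-IDF score, roughly 0.03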

3- Make the Brand a Magnet that Attracts Users (& Links too!)

In 2014 and beyond, the biggest factor that will impact rankings is "user engagement". You need to prepare strategies to transform a website from a lone web property into a brand. Brands have the power to drive user engagement in the form of comments, mentions, links, social shares etc. All these factors are important and are counted by Google in its ranking algorithm. (See- Why reputation and branding matter in SEO).

In order to transform your brand into a magnet that has the power to attract users, the following essential steps and strategies are recommended:

A-     Build up your site in a manner that your target customers would love to see.

B-      Add each and every element that will hold visitors on the site. Innovative thinking is the key here.

C-      Matt Cutts pointed out earlier in an interview published in Wired: "And we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. And you can really see mathematical reasons." This hints towards brand popularity and authenticity. Ask yourself: "Is your brand popular enough to be listed on Wikipedia?" or "Is it newsworthy enough to be covered by the NYT?" The strategies you plan for your brand promotion will directly impact the future scope of your brand being listed among these highly trusted seeds.

D-     Get your brand listed on notable sites that rank high on Google for brand related queries. Some of the essential ones are Wikipedia (you will need references and citations to get listed here), Facebook, Twitter, LinkedIn, YouTube, Flickr, Google Plus, CrunchBase, Pinterest etc. Once you get your brand listed here, work towards building a healthy relationship with your customers. This will help to build up the reputation of your brand.

E-      Invest some time and money in creating stuff that is link worthy. In SEO terminology, we call these "link baits". The more link baits you have on your site, the higher the chances of receiving natural backlinks. But hold on: as link baits have become more common, people have started creating bad link baits that do not pay off in the long run. Creating sustainable and effective link baits takes creative thinking and time. Hats off to Cyrus Shepherd for creating this excellent post mentioning the ineffectiveness of bad link baits and the worth of good linkable assets.

F-      Have a PR department, or outsource one, that will do the job of identifying channels that can generate the maximum publicity for the brand. The PR personnel will help in maintaining relationships with other PR people who cover your industry news on a regular basis and can therefore provide ample opportunities for your brand promotion. These things don't happen overnight, so don't expect immediate results. Build and solidify your networks and everything will fall into place gradually. To get a head start, read this useful post from Samuel Scott regarding how to frame a proper PR strategy.

G-     Social media is the brand's report card. How well a particular brand is performing online is measured by the amount of engagement it is able to drive on its social channels. Yes, there are certain exceptions, but I have seen this hold true most of the time. Hence, it becomes necessary for the brand to build up a rapport with users on each and every social channel that might help the brand in some way or the other. As said before, user engagement is a vital factor which Google is counting upon, and in the future more algorithmic changes are expected to judge and refine the level of user engagement any brand is able to generate.

H-     Ensure that your brand is receiving citations online through the activities it participates in, whether online or offline. Citations are another measure for tracking the popularity of a brand.

I-        Make use of tools like HubSpot, SocialMention or Google Alerts that allow you to track brand mentions whenever they happen. Identify the people who are already covering your brand and thank them for it. Get the engagement going, and you never know, this can generate word-of-mouth promotion for your brand.

J-       Email marketing, print advertising, TV commercials, billboards and direct mail all play an important role in building up a brand, so do not ignore the other advertising channels simply because you want to grow online. This is the biggest mistake that most small and medium sized organizations make while planning their online marketing strategy. A successful online marketing strategy will have a place for offline marketing as well.

4- Innovate Towards Advancement

Businesses need to revolutionize themselves as per user behavior and future market potential. Ask yourself: "Is your brand taking the right steps towards advancement in an innovative manner?" Think about and research the products and services that your competitors are building, and identify what difference in value, price or uniqueness you offer to your customers. If the same product is sold in a different package, chances are people will soon ignore it. Innovation is the healthiest part of competition, and I simply love it as a consumer.

Think of Google and the level of advancement it offers its users. It updates its products and services regularly, and this is the reason competitors like Yahoo and Bing find it hard to compete with Google. In the current scenario, here are the essential steps that every business must take today in order to boost its SEO efforts:

A-     Have a mobile site or a responsive site. Smartphones are increasing at a rate you can hardly imagine. A few years from now, every SEO strategy will be built around mobile phones rather than desktops. Are you prepared for it? Take a look at this tutorial mentioning 14 changes you need to make to your site today to make it mobile optimized.

B-      Design a personalized app for your brand and promote it. Yes, there are specialists working on the promotion of apps, known as "App Store Optimizers". You will need them sooner or later, but before that, have an app for your brand. As savvy internet marketers, it is our responsibility to educate our clients regarding the steps they should take in order to succeed in this highly competitive internet space. (See: App store optimization guide with tools)

C-      Rich snippets are the current and future snippets. We can also expect dynamic rich snippets in the future (did Google just steal my idea?). If the site you are promoting does not offer interesting stuff that can be highlighted on a SERP, why would the user click on it? Rich snippets come in various types, and it is highly recommended to use them on every site. This is what I mean by advancement.

5- Create Channels and Spread the Word

Segmentation is a vital part of any internet marketing strategy. Selecting appropriate channels and planning a customized strategy for each one of them is required. The table below lists some of the best internet marketing channels that any marketer should focus on:

Channel Type: Email Marketing
Mode: Newsletters
Purpose: Send offers/latest news and initiate return visits
Cost: Low
ROI: High

Channel Type: Social Media Marketing
Mode: Status Updates
Purpose: Educate the consumer / bring in new customers / send offers and latest news / build relationships / increase brand value
Cost: High
ROI: Mid

Channel Type: Search Engine Optimization
Mode: Organic Visits
Purpose: Generate leads / increase brand value / improve reputation / increase traffic
Cost: High
ROI: Mid

Channel Type: PPC Advertising
Mode: Inorganic Visits
Purpose: Generate leads / increase brand value / improve reputation / increase traffic
Cost: High
ROI: High

# Cost and ROI may vary depending on the business type.

6- Keep an Eye on Competitors

Every business has its own competitors. It is important to check the current SEO status of the chosen competitor and compare it with the brand to be promoted. The comparison should be done on various levels. A thorough and practical comparison can give us some vital stats and standards that we need to set prior to starting any campaign. Also, the analysis should be done on a continuous basis instead of only right before the start of a campaign. Performing it at regular intervals can give you insights regarding your progress and your competitor's progress. This sort of healthy competition analysis gives more reasons and scope for improvement.

Now, let us look at the table below, which focuses on two levels of competitor analysis that must be carried out throughout any SEO campaign.

Level 1 - Pre-Campaign Competition Analysis

Rank Audit - Check the rankings of your competitor on different search engines for the chosen keywords.
Technical SEO Audit - Check technical SEO issues like meta tags, canonical tags, domain name, redirection, robots.txt, sitemaps, navigation structure, internal linking, DA, PR etc.
Content Audit - Analyze the content quality at a high level using metrics like TF-IDF score and semantic analysis.
Traffic Audit - Compare the recent and past traffic trends of the site using tools like Alexa and SEMrush.
Link Audit - Perform a complete link analysis and check the quality of links pointing to the site. It is important to identify the natural and high authority links.
Social Presence Audit - Analyze the depth of the competitor's social media presence on sites like Twitter, Facebook, LinkedIn, Pinterest, Google Plus etc.
Put the traps in place - With tools such as Google Alerts and SocialMention, identify the brand mentions, links, likes, tweets etc. that the brand is able to generate on a regular basis, and plan your efforts accordingly.

Level 2 - Post-Campaign Competition Analysis (perhaps 6 months after the start of the campaign)

Repeat the steps you did at Level 1 and compare your progress so far. Determine whether the growth is positive or negative. There should not be a large difference between your competitor's graph and your site's graph. If the trend is clearly negative, find out the reasons why and try to revamp your current strategy.

7- Monitor, Test and Reinvent

A strategy that you plan today might not be as effective years or even months from now. Hence, it is extremely important to monitor, test and reinvent your current SEO strategy in order to remove the tasks that have become outdated and replace them with new tasks as per the algorithmic changes or recent marketing trends.

Monitoring

Monitoring involves the use of tools like Google Webmaster Tools, Google Analytics, HubSpot etc. Tools like Google Analytics let you place a short piece of JavaScript code on your site and track the user behavior that is constantly happening. I prefer creating a custom dashboard in Google Analytics and measuring user behavior against the metrics that are closely related to the determined KPIs. During the monitoring phase, it is essential to keep track of the following factors:

Google Webmaster Tools

A-     Keep an eye on the crawl errors under GWT. This is important because if at any point Google is unable to crawl your webpages, you will see a notification here. Check the number of not found errors for desktop, smartphone and feature phone. Every other strategy will fall into place only after Google successfully crawls your site.

B-      Check the blocked URLs column under the Crawl section in GWT to verify that no important URLs have been blocked accidentally, because this may affect the performance of the entire site.

C-      Test that the sitemaps are working and are regularly updated.

D-     Monitor the search queries that are bringing traffic to your site, and make sure to download the search query data every month, because Google by default displays only 3 months of data. If you wish to check search queries older than 3 months, there is no way to do that within the tool. Hence, it is important that you download the data and keep it for future reference.

E-      Check the links to your site section under GWT and keep track of the number of links pointing to the site. Keep an eye on the anchor texts and the top domains that link to your site.

F-      Check the manual action penalty tab on a regular basis and before the start of any new campaign in order to identify whether any manual action penalty has been applied by Google.

G-     Allow Google to email any major site issues to your email address. You can enable email notifications under the Webmaster Tools Preferences located in the admin section.

Google Analytics

A-     Identify the top channels that are sending traffic to your site by moving to the “channels” tab under “Acquisition” section.

B-      Track the number of new visitors that are visiting the site using mobile devices. You can find the stats in the "Devices" tab in the Mobile section.

C-       Check the speed suggestions under the "Site Speed" tab in the "Behavior" section. Page speed is an important factor and should not be ignored.

D-     Check the In-Page Analytics section and identify the areas of the site where users click the most. Place the highly converting elements, as per the pre-determined KPIs, in those high-click areas, keeping in mind the overall user behavior. This can really help in increasing the conversion rate.

E-      Move to the “site search” column to identify the products or keywords that the users are most interested in when they come to your site. In order to provide a seamless user experience, your site must serve all the relevant demands of the user.

Testing

Testing is an important phase of the overall SEO campaign because it reveals the actual user behavior and distinguishes between what is thought to be a success and what is actually a success.

A/B Testing - In almost every phase of your SEO process, you must not forget one thing, and that is the KPIs. After all, this is what you are running the campaign for. A/B testing in Google Analytics allows you to test two different versions of a conversion page in order to find out which page performs better and leads to more conversions. This helps you put the better page on the site and achieve greater ROI.
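
As a trivial illustration of what an A/B comparison measures, the conversion rate of each version is just conversions divided by visits. A quick Python sketch with hypothetical numbers (Google Analytics content experiments handle the traffic split and reporting for you; this only shows the underlying arithmetic):

visits_a, conversions_a = 2000, 60   # version A of the conversion page (hypothetical data)
visits_b, conversions_b = 2000, 85   # version B of the conversion page (hypothetical data)

rate_a = conversions_a / visits_a    # 0.03, i.e. a 3.00% conversion rate
rate_b = conversions_b / visits_b    # 0.0425, i.e. a 4.25% conversion rate

print("A: {:.2%}  B: {:.2%}".format(rate_a, rate_b))  # version B converts better in this sample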

Multivariate Testing - This type of testing goes a level deeper than A/B testing and allows you to test the performance of the various elements in a particular web page as opposed to two separate web pages as in A/B testing. Multivariate testing is beneficial in finding which design or content element is playing a major role in conversions.

You can create content experiments in Google Analytics to test the effectiveness of your landing pages. Remember, traffic that does not lead to conversions is equal to NULL traffic. Experimenting is a great way to ensure the traffic the site is getting is not leading to drop outs but instead contributing towards fruitful conversions.

Reinvent

No strategy can remain stable for a long period of time; a constant strategy becomes stale sooner or later. Hence, in order to keep your SEO strategy returning results, it is important to reinvent it for the better. Remember the second level of competitor analysis? You might need to reframe your strategy based on what your competitors are doing and what updates Google has rolled out recently. A successful SEO strategy should be customized and innovative. Also, a word of caution here: do not rely on competitors completely, because if they spam and you follow that strategy, a Google penalty is waiting for you.

Hope you enjoyed reading this article. Let me know your thoughts and views in the comments below.

Also See:

How Google Identifies Entities Using Attributes
Trust Button and Persona Pages
Types of Graphs Google Uses to Rank Webpages
Universal Analytics
Google Disavow Links Tool
Multi Channel Funnel in Google Analytics
Regular Expressions on Google Analytics
List of Google Search Operators