
Tuesday 26 December 2017

How to create a robots.txt file for SEO?

Robots.txt is a text file webmasters create to instruct web robots (most often search engine crawlers) how to crawl pages on their website. The robots.txt file is part of the robots exclusion protocol (REP). It tells web robots which pages to crawl and which not to crawl. The REP also includes directives like meta robots, as well as page-, subdirectory-, or site-wide instructions for how search engines should treat links (such as "follow" or "nofollow").

Let's say a search engine is about to visit a site. Before it visits the target page, it will check the robots.txt for instructions.

The basic format of a robots.txt file looks like this:


User-agent: [user-agent name]
Disallow: [URL string not to be crawled]

Together, these two lines are considered a complete robots.txt file.


User-agent: *
Disallow: /

The above code is the actual skeleton of a robots.txt file. The asterisk after "User-agent" means that the robots.txt file applies to all web robots that visit the site. The slash after "Disallow" tells the robot not to visit any pages on the site.
You might be wondering why anyone would want to stop web robots from visiting a site. This is where the secret to this SEO hack comes in. You probably have a lot of pages on your site, right? If a search engine crawls your site, it crawls every page of your website, and it will take the search engine bot a while to get through them all, which can have negative effects on your ranking. That's because Googlebot (Google's search engine bot) has a crawl budget.

This is how Google explains it:

1. Crawl rate limit

The crawl rate limit caps the maximum fetching rate for a given site. The crawl rate can go up and down based on a couple of factors:

a) Crawl health: if the site responds quickly for a while, the limit goes up, meaning more connections can be used to crawl. If the site slows down or responds with server errors, the limit goes down and Googlebot crawls less.

b) Limit set in Search Console: website owners can reduce Googlebot's crawling of their site.

2. Crawl Demand

Even if the crawl rate limit isn't reached, if there's no demand from indexing, there will be low activity from Googlebot. The two factors that play a significant role in determining crawl demand are:

a) Popularity: URLs that are more popular on the Internet tend to be crawled more often to keep them fresher in our index.

b) Staleness: our systems attempt to prevent URLs from becoming stale in the index.

Crawl Budget: The number of URLs Googlebot can and wants to crawl.

Finding your robots.txt file:

If you just want a quick look at a robots.txt file, whether your own site's or any other site's, all you have to do is type the site's root URL into your browser's address bar (e.g. abc.com, example.com) and add /robots.txt to the end.

One of three things will happen:

1. You'll find a robots.txt file.
2. You'll find an empty file.
3. You'll get a 404 error.



Let's look at a few examples of robots.txt for the site www.abcxyz.com.

Robots.txt file URL: www.abcxyz.com/robots.txt


User-agent: *
Disallow: /

The above syntax tells all web crawlers not to crawl any pages on www.abcxyz.com, including the homepage.



User-agent: *
Disallow:

The above syntax tells all web crawlers to crawl all pages on www.abcxyz.com, including the homepage.


User-agent: Googlebot
Disallow: /abcxyz-subfolder/

The above syntax tells only Google's crawler not to crawl any pages that contain the URL string www.abcxyz.com/abcxyz-subfolder/.


User-agent: Bingbot
Disallow: /abcxyz-subfolder/blocked-page.html

The above syntax tells only Bing's crawler to avoid crawling the specific page at www.abcxyz.com/abcxyz-subfolder/blocked-page.html.

Technical Phrases:

1. User-agent: The specific web crawler (search engine) to which you're giving crawl instructions.
2. Disallow: The command used to tell a user-agent not to crawl a particular URL. Only one "Disallow" line is allowed for each URL.
3. Allow: Only applicable for Googlebot. The command to tell Googlebot it can access a page or subfolder even though its parent page or subfolder may be disallowed (see the example after this list).
4. Crawl-delay: How many seconds a crawler should wait before loading and crawling page content.
5. Sitemap: Used to call out the location of any XML sitemap(s) associated with this URL. Only supported by Google, Ask, Bing and Yahoo.
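For instance, a hypothetical robots.txt that combines these directives might look like the sketch below (the folder and page names are made up for illustration):

User-agent: Googlebot
Disallow: /private/
Allow: /private/public-page.html

User-agent: *
Crawl-delay: 10

Here Googlebot is blocked from the /private/ folder but may still fetch the single page /private/public-page.html, while all other crawlers that honour Crawl-delay are asked to wait 10 seconds between requests.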

Some points to be noted:

  • A robots.txt file must be placed in a website's top-level directory (web-crawling robots only look for the file in one specific place: the main directory of the root domain). If a user agent visits www.abcxyz.com/robots.txt and does not find a robots file there, it will assume the site does not have one and proceed to crawl everything on the site.
  • The file must be named "robots.txt"; the name is case sensitive (not Robots.txt, robots.TXT, or any other variation).
  • Each subdomain on a root domain uses its own robots.txt file. This means that both blog.abcxyz.com and abcxyz.com should have their own robots.txt files.
  • The /robots.txt file is publicly available: just add /robots.txt to the end of any root domain to see that website's directives. This means that anyone can see which pages you do and do not want crawled.   
  • It's generally a best practice to indicate the location of any sitemaps associated with the domain at the bottom of the robots.txt file. Example:


User-agent: *
Allow: /*.html$
Disallow: /*/data/*
Sitemap: https://www.abcxyz.com/en-gb/sitemap.xml

Some common cases that justify why we need robots.txt:

  • Preventing duplicate content from appearing in SERPs
  • Keeping entire sections of a website private
  • Keeping internal search engine results pages from showing up on a public SERP
  • Preventing search engines from indexing certain files on your website
  • Specifying the location of sitemap(s)
  • Specifying a crawl delay in order to prevent your servers from being overloaded when crawlers load multiple pieces of content at once
If there are no areas on your site where you want to control user-agent access, you may not need a robots.txt file at all.

How does robots.txt work?

Search engines have two main jobs:

1. Crawling the web to discover content.
2. Indexing that content so that it can be served up to searchers who are looking for information.

To crawl sites, search engines follow links to get from one site to another, crawling across many links and websites. This crawling is also known as "spidering".

After arriving at a website but before spidering it, the search crawler will look for a robots.txt file. If it finds one, the crawler reads it first and then continues through the site according to its directives. If there is no robots.txt file, it will proceed to crawl the entire website.
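To make that flow concrete, here is a minimal PHP sketch of how a polite crawler might consult robots.txt before fetching a page. This is a simplified illustration, not Google's actual implementation: it ignores Allow lines, wildcards and multi-agent groups, and the site URL, user-agent and path are hypothetical.

<?php
// Simplified robots.txt check: returns true if $path may be crawled by $userAgent.
function isAllowed($robotsTxt, $userAgent, $path)
{
    $applies = false; // are we inside a User-agent group that matches this crawler?
    foreach (preg_split('/\r\n|\r|\n/', $robotsTxt) as $line) {
        $line = trim(preg_replace('/#.*$/', '', $line)); // strip comments
        if ($line === '') continue;
        if (stripos($line, 'User-agent:') === 0) {
            $agent = trim(substr($line, 11));
            $applies = ($agent === '*' || stripos($userAgent, $agent) !== false);
        } elseif ($applies && stripos($line, 'Disallow:') === 0) {
            $rule = trim(substr($line, 9));
            if ($rule !== '' && strpos($path, $rule) === 0) {
                return false; // the path falls under a disallowed prefix
            }
        }
    }
    return true; // no matching Disallow rule, or no robots.txt at all
}

// Hypothetical usage: decide whether to crawl /private/page.html on www.abcxyz.com.
$robotsTxt = @file_get_contents('http://www.abcxyz.com/robots.txt');
if ($robotsTxt === false) {
    $robotsTxt = ''; // no robots.txt found, so everything may be crawled
}
echo isAllowed($robotsTxt, 'Googlebot', '/private/page.html') ? "Crawl it\n" : "Skip it\n";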

That's all from my end...

If you have any queries, feel free to write in comments down below..
Stay tuned for more digital advertising!!

Thank You...

Sunday 17 December 2017

SEO Techniques - White Hat & Black Hat SEO

Search Engine Optimization (SEO) techniques are classified into two broad categories:
  • Techniques that search engines recommend as part of good design, referred to as White Hat SEO, and 
  • Techniques that search engines do not approve of and attempt to minimize the effect of, referred to as Black Hat SEO or spamdexing.
Both types of tactics are described in detail below.

White Hat SEO

An SEO tactic, technique or method is considered White Hat if it meets the following criteria:
  • It conforms to the search engine's guidelines.
  • It does not involve any deception.
  • It ensures that the content a search engine indexes and subsequently ranks is the same content a user will see.
  • It ensures that web page content is created for users and not just for search engines.
  • It ensures good quality of the web pages.
  • It ensures that useful content is available on the web pages.

Always follow White Hat SEO tactics and don't try to fool your site visitors. Be honest and you will definitely gain more. From the next chapter onward we will shed light on White Hat SEO techniques; they are quite simple and can be applied without much cost.


Black Hat or Spamdexing

An SEO tactic, technique or method is considered Black Hat or spamdexing if it involves any of the following:
  • Trying to improve rankings in ways that are disapproved of by the search engines and/or involve deception.
  • Redirecting users from a page that is built for search engines to one that is more human friendly.
  • Redirecting users to a page that is different from the page the search engine ranked.
  • Serving one version of a page to search engine spiders/bots and another version to human visitors. This is called the Cloaking SEO tactic.
  • Using hidden or invisible text, text matching the page background color, a tiny font size, or text hidden within the HTML code, such as "noframes" sections.
  • Repeating keywords in the meta tags, and using keywords that are unrelated to the site's content. This is called Meta tag stuffing.
  • Calculated placement of keywords within a page to raise the keyword count, variety, and density of the page. This is called Keyword stuffing.
  • Creating low-quality web pages that contain very little content but are instead stuffed with very similar keywords and phrases. These pages are called Doorway or Gateway Pages.
  • Mirroring websites by hosting multiple web sites, all with conceptually similar content but using different URLs.
  • Creating a rogue copy of a popular website which shows content similar to the original to a web crawler, but redirects web surfers to unrelated or malicious web sites. This is called Page hijacking.
Always stay away from adopting any of the above Black Hat tactics to improve the rank of your website. Search engines are smart enough to identify all of these practices, and ultimately you will not gain anything.

What is SEO Copywriting?

SEO Copywriting is the technique of writing the viewable text on a web page in such a way that it reads well for the surfer and also targets specific search terms. Its purpose is to rank highly in the search engines for the targeted search terms.

As well as the viewable text, SEO Copywriting usually optimizes other on-page elements for the targeted search terms. These include the Title, Description and Keywords tags, headings and alt text.

The idea behind SEO Copywriting is that search engines want genuine content pages and not additional pages (often called "doorway pages" ) that are created for the sole purpose of achieving high rankings.

What is Search Engine Rank?

When you search for any keyword using a search engine, it displays thousands of results found in its database. Page rank is measured by the position of a web page in the search engine results. If a search engine puts your web page in the first position, your web page rank is number 1 and it is considered to have a high rank.

SEO is the process of designing and developing a web site to attain a high rank in search engine results.

What is On-page and Off-page SEO?

Conceptually, there are two ways of doing SEO:
  • On-Page SEO: This includes providing good content, good keyword selection, putting keywords in the correct places, giving an appropriate title to every page, etc.
  • Off-Page SEO: This includes link building, increasing link popularity by submitting to open directories and search engines, link exchange, etc.   
SEO Website Domain

When you start thinking of doing business through the internet, the first thing you think about is your website domain name. Before you choose a domain name you should consider the following:
  • Who would be your target audience?
  • What do you intend to sell to them? Is it a tangible item or just text content?
  • What will make your business idea unique or different from everything else that is already on the market?
Many people think it is important to have keywords in a domain. Keywords in the domain name are usually important, but this can usually be done while keeping the domain name short, memorable, and free of hyphens.

Using keywords in your domain name gives you a strong competitive advantage over your competitors. Having your keywords in your domain can increase click-through rates on search engine listings and paid ads, as well as make it easier to get keyword-rich, descriptive inbound links.

Avoid buying long, confusing domain names. Many people separate the words in their domain names using dashes or hyphens. In the past the domain name itself was a significant ranking factor, but search engines have advanced and it is no longer a very significant factor.

Keep your domain name to two or three words so it is more memorable. Some of the most memorable websites do a great job of branding by creating their own word: eBay, Yahoo!, Expedia, Slashdot, Fark, Wikipedia, Google...

You should be able to say it over the telephone once, and the other person should know how to spell it and be able to guess what you sell.

Few Points:

Finally, you should be able to answer the following questions for yourself:
  • Why do you want to build your website? Why should people buy from your site and not from another site? What makes you different from others?
  • Who is your target audience and what do you intend to sell to them?
  • List 5-10 websites that you think are amazing. Now think about why they are amazing.
  • Create 5 different domain names. Make at least 1 of them funny. Tell them to half a dozen people and see which ones are the most memorable. You will get more honest feedback from people who do not know you well.
  • Buy the domain name that is most catchy, memorable and relevant to your business.


That's all from my end...

If you have any queries, feel free to write in comments down below..
Stay tuned for more digital advertising!!

Thank You...

Saturday 2 December 2017

How does Search Engine Optimization (SEO) work?

SEO, short for Search Engine Optimization, is the activity of optimizing web pages or whole sites in order to make them more search engine friendly, thus getting higher positions in search results.

SEO is sometimes also called SEO copywriting, because most of the techniques that are used to promote sites in search engines deal with text.


  • SEO stands for Search Engine Optimization.
  • SEO is all about optimizing a website for Search Engines.
  • SEO is the process of designing and developing a website to rank well in search engine results.
  • SEO is to improve the volume and quality of traffic to a website from search engines.
  • SEO is a subset of search engine marketing.
  • SEO is the art of ranking in the search engines.
  • SEO is marketing by understanding how search algorithms work and what human visitors might search. 
A comprehensive search engine optimization project is divided into four interrelated phases:

1. Pre-site activities - The research and planning activities undertaken before an  existing or new site or page is actually touched or built.

  • Understanding your organization's online business strategy.
  • Researching your market category, customers and competitors.
  • Keyword research and selection.
2. On-Site activities - The activities involved in the content and design of web pages.
  • Writing the title, description and keyword meta tags.
  • Writing content - Body copy, titles, images tags, outbound links that reflect and enhance keywords.
  • Building Internal Links - Helping the search engines navigate the site.
  • Site design and construction - Ensuring the web pages use design and code that can be properly crawled and indexed by the search engines.
3. Off-site activities - Building a portfolio of quality inbound links to your website.

4. Post-site activities - Analyzing and responding to site traffic and user feedback once a website has been optimized. Effective SEO is a continuous feedback loop.

If you plan to do some basic SEO, it is essential that you understand how search engines work and which items are most important in SEO.


How Search Engine Works?

Search engines perform several activities in order to deliver search results:
  • Crawling - the process of fetching all the web pages linked to a website. This task is performed by software called a crawler or a spider (Googlebot, in Google's case). 
  • Indexing - the process of creating an index of all the fetched web pages and keeping them in a giant database from where they can later be retrieved. Essentially, indexing is identifying the words and expressions that best describe the page and assigning the page to particular keywords.
  • Processing - when a search request comes in, the search engine processes it, i.e. it compares the search string in the search request with the indexed pages in the database.
  • Calculating Relevancy - since it is likely that more than one page contains the search string, the search engine calculates the relevancy of each of the pages in its index to the search string.
  • Retrieving Results - the last step in a search engine's activities is retrieving the best-matched results. Basically, it is nothing more than displaying them in the browser. 
Search engines such as Google and Yahoo! often update their relevancy algorithm dozens of times per month. When you see changes in your rankings it is due to an algorithmic shift or something else outside of your control.

Although the basic principle of operation of all search engines is the same, the minor differences between their relevancy algorithms lead  to major changes in results relevancy.

That's all from my end...

If you have any queries, feel free to write in comments down below..
Stay tuned for more digital advertising!!

Thank You...

Tuesday 28 November 2017

How to Upload HTML5 Display Ads in Google AdWords?

Hey guys, Flash-based ads are finally out (since January 2, 2017) and HTML5 ads are in. Google said this will "enhance the browsing experience for more people on more devices". To learn what the Google Display Network is, visit our blog on the Google Display Network.

HTML5 image ads can include animated images and other features. Using HTML5 makes display ads look more interactive and interesting. HTML5 ads can be created with Google tools, including Google Web Designer.

Follow the steps below to upload HTML5 ads:

1. Your files should be in a compressed .zip folder. If you need ads without any coding experience, the solution is Google Web Designer, as the ads it produces come with all the necessary files.

2. Upload the ads directly in the Google Display Network campaign.
Once the ad files are ready, either upload them by creating a new campaign or add them to the desired campaign and ad group: go to the Ads tab, select "+Ad", then choose "Image ad" and "Upload an ad".


Drag and drop your zipped ad. You can even upload multiple images at once.

Once the ad is approved without any errors, you can add the desired Display URL and Final URL.

Uploading HTML5 ads via Google AdWords Editor:

1. In the account, select the campaign and ad group where the ad appears.
2. In the type list, select Ads and Extensions and then Image Ads.
3. In the edit panel, select one or more ads.
4. In the edit panel, click the Choose Image link.
5. Choose the HTML5 files you want to upload (the files should be in a .zip file).
6. Click Open.

Note: AdWords Editor doesn't accept Flash files for image ads.


That's all from my end...

If you have any queries, feel free to write in comments down below..
Stay tuned for more digital advertising!!

Thank You...

Monday 27 November 2017

How to Use AdWords Auction Insights in Your Marketing Strategy?

The Google AdWords Auction Insights report is the key tool for keeping track of your competition. This blog will take you through all the components of the report, how to interpret and extract the data, and how to use it to optimize your campaigns.

The Auction Insights report lets you compare your performance with other advertisers who are participating in the same auctions as you. It is available for both Search and Shopping campaigns. It helps you make strategic decisions about bidding and budgeting by showing where you are leading and where you are missing opportunities.

Auction Insights Reports:

1) The Auction Insights report for Search campaigns: It provides 6 different parameters: impression share, average position, overlap rate, top of page rate, position above rate and outranking share. You can generate the report for one or more keywords, ad groups, or campaigns and segment the results by device and time (as long as they meet the minimum activity threshold for the time period selected).

2) The Auction Insights report for Shopping campaigns: It provides 3 different parameters: overlap rate, impression share and outranking share. You can generate the report for one or more ad groups or campaigns and segment the results by device and time (as long as they meet the minimum activity threshold for the time period selected). Data for Shopping campaigns is available from October 2014 to the present.

Detailed description of all the Auction Insights parameters: 

A) Average Position (only for Search campaigns)
Average position helps you to know how high your ads rank compared with those of your competitors competing in the same auctions. It is the average rank of your ad in the auctions, which determines the order of the ads on the SERP (Search Engine Results Page).

For example: if one of the participants in the Auction Insights report shows a position of "5" in the Average Position column, it means that participant's ad on average showed in 5th place on the SERPs where your ad also showed.

B) Impression Share
Impression share is the number of impressions you received divided by the estimated number of impressions you were eligible to receive (eligibility is based on your ads' approval statuses, bids, targeting settings, and Quality Scores). The report also shows the impression share of the other competitors/advertisers as a proportion of the auctions in which you were also competing. Note that Shopping campaigns don't use Quality Score.
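For example (hypothetical numbers): if your ads were eligible to appear in 1,000 auctions and actually received 700 impressions, your impression share is 700 / 1,000 = 70%.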

C) Top of Page Rate (only for Search campaigns)
It tells you how often your ad (or the ad of another advertiser, depending on which row you're viewing) was shown at the top of the SERP, above the unpaid (organic) search results.
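For example (hypothetical numbers): a top of page rate of 30% means that in 30 out of every 100 impressions, that ad appeared above the organic search results.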

D) Outranking Share
It is the number of times your ad ranked higher in the auction than another advertiser's ad, plus the number of times your ad showed when theirs did not, divided by the total number of ad auctions you participated in, expressed as a percentage.
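For example (hypothetical numbers): if you participated in 200 auctions against a given competitor, ranked above them in 90 of those auctions, and showed while their ad did not show in another 30, your outranking share is (90 + 30) / 200 = 60%.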

E) Position Above Rate (only for Search campaigns)
Position above rate is how often another advertiser's ad was shown in a higher position than yours when both of your ads were shown at the same time.

For example: if one of the other advertisers in your Auction Insights report shows "5%" in this column, it means that advertiser's ad showed in a position above yours in 5 out of every 100 times your ads showed at the same time.

How to find the Auction Insights Report in Google AdWords?

1. Sign in to your AdWords account.
2. Click the Campaigns, Ad groups, or Keywords tab based on what information you want to see in the report.
3. If you want to select specific campaigns, ad groups, or keywords, click the box next to each item. Then click the Details tab at the top of the statistics table.
4. In the drop-down menu of the Details tab, under "Auction insights", click All or Selected.

Create an Auction Insights filter

The Auction Insights filter shows which of your keywords, ad groups, or campaigns have Auction Insights reports available to view. Creating a filter helps you locate the keywords, ad groups, and campaigns that have such reports available.

1. Sign in to your AdWords account.
2. Click the Campaigns, Ad groups, or Keywords tab based on what information you want to see in the report.
3. Click the Filter button above your statistics table.
4. Click Create Filter.




5. Click the drop-down menu in the panel that appears to see a list of conditions for new filters. Select Auction insights and click Available from the "matches any" drop-down.


6. Save this new filter by checking the "Save Filter" box and clicking Apply. The table then automatically updates to show your keywords or ad groups with available Auction insights reports.

7. The report will look as shown below. You can segment it by 'Device' or 'Time' from the Segment drop-down menu to get more insight.


8. You can also export the report or schedule it to be shared via email.



9. This is how the report looks when you segment it by week, month, quarter, year or day of the week. You can increase your bids on days when a competitor is highly competitive and lower them when the competitor is lying low.

Note:
The Auction Insights report provides information on other advertisers who participated in the same auctions as you, but this does not mean or indicate that those advertisers have the same advertising settings as you. The other metrics shown are based only on instances when your ads were also estimated to be eligible to appear. This report won't reveal the actual keywords, bids, quality scores, or settings from your campaigns, and it won't give you insight into the same information for other advertisers.

That's all from my end...

If you have any queries, feel free to write in comments down below..
Stay tuned for more digital advertising!!

Thank You...

Tuesday 31 October 2017

How to create a dynamic treeview using PHP (CodeIgniter), Bootstrap Treeview and MySQL

Creating a dynamic tree view from the database via a JSON response:

In this post we use the CodeIgniter framework to create a dynamic tree view. Following are the steps we will use in this tutorial.




1) Create a user table in the database. In the user table we will create four columns:

  • id (auto-increment and primary key)
  • username
  • password
  • parent_key
2) Create a Members controller in the CodeIgniter application/controllers folder. 

3) Create a members view in the CodeIgniter application/views folder.

1) Create the user table in the database:


CREATE TABLE `user` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `username` varchar(255) NOT NULL,
  `password` varchar(255) NOT NULL,
  `parent_key` varchar(255) NOT NULL,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

In the above table, parent_key holds the key of each row's parent (the parent's id). This parent-child relation is what builds the tree view.
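For illustration, here is some hypothetical sample data (the usernames and passwords are made up). Note that top-level members use the literal string 'parent_key' as their parent key, which is exactly what the controller below queries for:

INSERT INTO `user` (`username`, `password`, `parent_key`) VALUES
('Alice', 'secret1', 'parent_key'),  -- top-level node
('Bob',   'secret2', '1'),           -- child of Alice (id = 1)
('Carol', 'secret3', '1');           -- child of Alice (id = 1)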

2) Members.php in application/controllers

<?php
defined('BASEPATH') OR exit('No direct script access allowed');

class Members extends CI_Controller
{
    public function __construct()
    {
        parent::__construct();
        $this->load->database(); // make sure the database library is loaded
    }

    public function index()
    {
        $data = [];
        $parent_key = 'parent_key'; // key of the top-level (root) nodes
        // (In production, prefer query bindings to avoid SQL injection.)
        $row = $this->db->query('SELECT id, username FROM user WHERE parent_key="'.$parent_key.'"');

        if ($row->num_rows() > 0)
        {
            $data = $this->members_tree($parent_key);
        }
        else
        {
            // A single placeholder node in the format bootstrap-treeview expects
            $data = [["id" => "0", "name" => "No members present in list", "text" => "No members present in list", "nodes" => []]];
        }
        echo json_encode(array_values($data));
    }

    public function members_tree($parent_key)
    {
        $row1 = [];
        $row = $this->db->query('SELECT id, username FROM user WHERE parent_key="'.$parent_key.'"')->result_array();
        foreach ($row as $key => $value)
        {
            $row1[$key]['id']    = $value['id'];
            $row1[$key]['name']  = $value['username'];
            $row1[$key]['text']  = $value['username'];
            // Recurse to attach this row's children as nested nodes
            $row1[$key]['nodes'] = array_values($this->members_tree($value['id']));
        }
        return $row1;
    }
}


Explanation of the above code:

$data = [];
// Create the blank array.

$parent_key = 'parent_key';
// The key whose children we want to fetch (here, the key used by the top-level rows).

$row = $this->db->query('SELECT id, username FROM user WHERE parent_key="'.$parent_key.'"');
// Fetch all usernames that come under the declared parent key.

if ($row->num_rows() > 0)
{
    $data = $this->members_tree($parent_key);
}
else
{
    $data = [["id" => "0", "name" => "No members present in list", "text" => "No members present in list", "nodes" => []]];
}
// If the query returns one or more rows, call the members_tree() method of the Members class;
// otherwise return a single placeholder node so the treeview still receives valid data.

echo json_encode(array_values($data));
// Echo $data in JSON format.

A) members_tree method:

$row = $this->db->query('SELECT id, username FROM user WHERE parent_key="'.$parent_key.'"')->result_array();
// Fetch all rows that come under the given parent_key.

foreach ($row as $key => $value)
{
    $row1[$key]['id'] = $value['id'];
    $row1[$key]['name'] = $value['username'];
    $row1[$key]['text'] = $value['username'];
    $row1[$key]['nodes'] = array_values($this->members_tree($value['id']));
}
// Build the tree from the fetched rows, recursing for each row's children,
// in the nested format bootstrap-treeview expects.

return $row1;
// Return the nodes to the index() method.

That's it in Members Controller.
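With the hypothetical sample data shown earlier (Alice as the root member with id 1, and Bob and Carol as her children), the JSON response from the Members controller would look roughly like this:

[
  {
    "id": "1",
    "name": "Alice",
    "text": "Alice",
    "nodes": [
      { "id": "2", "name": "Bob", "text": "Bob", "nodes": [] },
      { "id": "3", "name": "Carol", "text": "Carol", "nodes": [] }
    ]
  }
]

This is the structure bootstrap-treeview consumes through its "text" and "nodes" properties.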

3) The members view in the application/views folder.

A) Include all CSS/JS files in the view's <head>:

<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.0/css/bootstrap.min.css">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-treeview/1.2.0/bootstrap-treeview.min.css" />
<script type="text/javascript" charset="utf8" src="https://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.9.1.min.js"></script>
<script type="text/javascript" charset="utf8" src="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-treeview/1.2.0/bootstrap-treeview.min.js"></script>

B) Create a div with an id:

<div class="col-md-8" id="treeview_json">
     
</div>

C) Call the Members controller via Ajax in the view:

<script type="text/javascript">
$(document).ready(function(){
 var treeData;
 $.ajax({
   type: "GET",  
   url: "<?php echo base_url('Members');?>",
   dataType: "json",       
   success: function(response)  
   {
  initTree(response)
   }   
 });
 function initTree(treeData) {
  $('#treeview_json').treeview({data: treeData});
 }
});
</script>

Now you can see the dynamically created tree in the browser.

If you have any queries, please write them in the comment box below.

Thank You.

Monday 30 October 2017

A/B Testing in Google AdWords

Most advertisers realize that connecting with your buyer is both an art and a science. You may have a good idea of what works creatively, but you still have to test it out and prove your marketing results with numbers.

A/B testing is a way to figure out what gets your ads the best results. You try out, for instance, calls to action like "Buy Now" versus "Buy This", and then work out which one your customers respond to most positively. The more positive the response, the higher your ROI (return on investment).

In A/B testing, you run a changed version of your ad against the current one, and then analyse which one gives the better results. You test and measure the outcome you're looking for (sales, CTR (click-through rate), leads, etc.), and finally keep the group or ad that gives you the best results.

The figure will make it a bit simpler:

Following are the steps to consider while doing A/B testing:

1. What is the goal you want to achieve or improve? 

a) Generate more leads
b) Increase CTR (click-through rate)
c) Increase brand awareness
d) Increase sales

2. What part of the ad are you testing?

There are many ways you can test the ads:

a) Call to action 
b) Display URL
c) Punctuation and capitalization
d) Destination URL (landing page)
e) Device targeting
f) The headline text 
g) Any offer displayed, etc. You can test many variations.

3) What tool are you going to use to measure the results?

a) Google AdWords statistics
b) A third-party tool like Omniture, etc.
c) Google Analytics

4) How long are you going to run the test?

Have a finite time to conclude, analyse your results, and then move on.

5) Variation ads

Take two ads: use AD 1 as the control ad and AD 2 as the variation. For example: 

AD 1

AD 2



6. Decide which factor you want to improve, and by what percentage or how much.

7. Decide your next step after reaching your desired results. If one ad is doing better than the other, pause the underperformer and make the winner your main ad. Otherwise, if you find there is not much difference, decide whether to continue running the test ad, keep both ads, or try another A/B testing variation.

Follow the steps below for running an A/B test in Google AdWords:

a) Make 2 ads in Google AdWords.
b) Choose the variable test factor, for example the headline, display URL, etc.
c) Set up your goal, be it to increase sales, leads, etc.
d) Then write 2 different variations of your ads.
e) Run the ads for the specific time period you chose.
f) Finally, track the results of both ads (see the worked example below).
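As a hypothetical illustration of step f): if AD 1 receives 10,000 impressions and 200 clicks (a 2% CTR) while AD 2 receives 10,000 impressions and 260 clicks (a 2.6% CTR), AD 2 is the winner for a CTR goal and would become your main ad.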


A/B testing is powerful. It works. When you are doing A/B testing, make sure you keep your true objectives in mind. It's all about increasing your company's ROI (return on investment) and making your organization more profitable without excess spending. Start by experimenting with maybe one or two variables.


AdWords Experiments

AdWords includes campaign drafts and experiments, also known as AdWords Experiments. Drafts let you make changes to a mirror of your campaign without impacting the original campaign, and a draft can then be activated as an experiment. If you run a draft as an experiment, it tests the changes made to your campaign and compares their performance with the original campaign over time.

Once you have created your draft, apply those changes directly to your campaign or test them by running the draft as an experiment. This is followed by defining the experiment split: the percentage of traffic that you wish to split between the experiment and the original campaign. Generally a 50 percent split is fair for judging and comparing the effect of the changes made in the experiment against the original campaign.

Analyzing the experiment's performance

Once the experiment starts running, you will be able to see the results. You can then further analyse the performance of the experiment with respect to the original campaign to see whether it delivered results or not.

Therefore, to optimize and achieve your business ROI (return on investment), A/B testing is essential in Google AdWords. 

I will discuss how to create campaign drafts and experiments in detail in my next blog.

That's all from my end...

If you have any queries, feel free to write in comments down below..

Stay tuned for more digital advertising!!

Thank You...