Web forms are interactive HTML forms published on web pages which users can fill in and submit. Like any other web form, the ones created in Zoho CRM can be embedded on your web pages for various purposes, such as gathering feedback, placing product orders, or making service requests. The information submitted through the form is captured automatically in your Zoho CRM account.
Let's look at an example. You have a lead generation form on your landing page that is not giving you as many leads as expected. You may want to look into the reason for this as the form is the gateway between your website and visitors. There are many best practices to increase the performance of web forms. Something as simple as reducing the number of fields and changing the color and text of your CTA button can boost the rate of form submissions. If you want to check how these changes will impact submission rates, you can use A/B testing.
A/B testing is essentially running an experiment where two or more variations of a web form are shown to different segments of the website visitors. You can then compare the performance of the variants to identify which version gets the best results. Instead of making changes to your lead generation form based on assumptions, the A/B Testing feature for web forms in Zoho CRM helps you make an informed decision and use the form that gives the best results.
The process of A/B testing can be categorized into three parts:
- Start A/B testing: Select the web form you want to run the A/B test on and create the variants. You can then set up the test by specifying the target audience and how long to run the experiment.
- View A/B testing results: Once A/B testing is completed, you can view the status and the statistics on the performance of the original form and the variants.
- Launch the winner: After running the test, you can determine which form has the highest conversion rate, which is the percentage of visitors who submitted the form. You can launch the winning form to replace the original form on the web page.
Terminology
- Original Form: The web form you want to carry out A/B testing on.
- Variants: The versions of the original web form you want to test.
- Visitors: The number of people who have visited the web page that hosts the web form.
- Unique Submissions: The first form submission made by each visitor. Multiple submissions can be made but only one submission per visitor will be considered as a unique submission.
- Conversion Rate: The percentage of visitors who submitted the web form.
- Abandon Rate: The percentage of visitors who started filling out the form but failed to submit it.
- Complete Submission: The number of visitors who filled in all the fields in the web form and submitted it.
- Partial Submission: The number of visitors who submitted the web form with some fields left empty.
- Improvement Rate: The increase in the percentage of submissions received from the variant compared to the original form.
- Leading: The web form variant with the highest conversion rate during the A/B test.
- Winner: The web form variant with the highest conversion rate at the end of the A/B test.
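To make these definitions concrete, here is a small illustrative sketch of how the key metrics relate to raw counts. The function names and figures are hypothetical, and improvement is shown as a simple percentage-point difference; Zoho CRM computes these metrics internally.

```python
# Illustrative sketch only: names and numbers are hypothetical; Zoho CRM
# computes these metrics internally.

def conversion_rate(unique_submissions: int, visitors: int) -> float:
    """Percentage of visitors who submitted the web form."""
    return 100.0 * unique_submissions / visitors if visitors else 0.0

def abandon_rate(started: int, submitted: int, visitors: int) -> float:
    """Percentage of visitors who started the form but did not submit it."""
    return 100.0 * (started - submitted) / visitors if visitors else 0.0

original_cr = conversion_rate(unique_submissions=40, visitors=1000)  # 4.0
variant_cr = conversion_rate(unique_submissions=55, visitors=1000)   # 5.5
improvement = variant_cr - original_cr                               # 1.5 points
```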
Start A/B Testing
You can create different versions of the original web form by including more fields, modifying the field properties, changing the Call To Action buttons, changing fonts, revising the background color of the form and so on. All the changes that you make in the variant can be viewed from the Changes List. You can also revert the changes, if necessary.
A/B testing configuration
Setting up A/B testing involves defining the percentage of visitors to include in the test, as well as the percentage of visitors who should see the original web form and each variant. The duration of the A/B test is determined by an end date and time or by the number of visitors after which the test should end. You can also save the A/B testing configuration as a draft.
Status of A/B testing
Once the A/B test has started, the status of the test will be one of the following:
- Running: An ongoing A/B testing process.
- Scheduled: The A/B test is scheduled to take place in the future.
- Completed: The A/B testing is complete.
- Paused: An ongoing A/B test has been temporarily put on hold.
- Launched: The Winner has been launched to replace the original web form.
- Drafted: A preliminary version of the A/B test that you have configured but not yet executed.
Note: Even if one variant has a slightly higher conversion rate than another, you may still choose to use the version with a lower rate for other reasons. You can choose to launch whichever variant you want, regardless of the winner of the test.
Note
- There can only be one A/B test in the Running status at any one time.
- For CRM forms used in Zoho Sites, A/B testing is supported only for the manual embed option. It is not supported for forms that have been embedded as elements.
Setting up A/B testing
To set up A/B testing
- Go to Setup > Developer Space > Webforms > A/B Testing.
- Click Create A/B Testing in the A/B Testing page.

You can also create an A/B test by clicking the Create A/B Testing button from the Webforms page or by moving your mouse pointer over a webform on the Webforms tab and clicking More and then Create A/B Testing.

In the Create A/B Testing page, enter the following details:
- A/B testing name: Specify a name for this experiment.
- Form name: Choose the web form you want to test from the dropdown.
- Click Create.
You will be redirected to the WYSIWYG editor where you will create a variant of the original form.

Customize the variant as required. You can customize it in the following ways:
- Click More in the Variant tab and then Clone or Delete to clone or delete a variant.

- Click the Add icon (+) to add a new variant.
- Click View changes to view the list of changes made to the variant compared to the original form.
- Hover over a field in the Changes List page and click Revert to undo the change.

- Click Preview to view what the variant looks like.
- Click Next.
In the A/B Testing Configuration pop-up:
- Enter the percentage of visitors to include in the A/B testing experiment in the Test Sampling field.
- Split the visitors between the different versions of the form by dragging the handle to the left or right.
- Specify when the test will end.
- Select the check box to receive an email notification about which form wins the A/B test.
- Click Start A/B Testing.
- Click Save as Draft to save the A/B testing configuration as a draft.

Note
- A/B testing works only for web forms that are hosted in Embed or iFrame format.
- You can schedule the A/B testing to start at a future date.
- Once A/B testing has been started, the configurations or the variants cannot be edited.
- Once the A/B test is complete, only the original form will be available for visitors until another variant is launched.
View A/B testing results
You can view the A/B testing result by hovering over the A/B testing name and clicking the View Results option. There are two tabs: Preview and Analytics.

Note
- It will take ten minutes after the A/B test is complete for the winner to be declared.
- The analytics will not be dynamically updated. You will need to refresh the page to view the updated results.
Preview tab
The Preview tab displays the original form and the variants that you have created for testing. You can preview the form variants in full screen by clicking the Expanded View icon (). You can also see the changes made to the variants compared to the original form.

Analytics tab
The Analytics tab displays the following details:
A/B Testing Details
- The duration of the A/B test.
- The number of days, weeks, months, or visitors for the A/B test.
- The user who started the A/B test.
- The status of A/B test.

A/B testing stats
- Experiments: the original form and the variants.
- Unique submissions by visitors.
- The conversion rate and the improvement in conversion rate compared to the original form.
- The percentage of visitors assigned to the original form and the variants.

- The leading form: The version of the form with the highest conversion rate during the A/B test.
- The winner: The version of the form with the highest conversion rate after the A/B test is complete.
The graphs display comparisons between the original form and the variants. You can view comparisons for:
- The rate of conversion
This graph displays a comparison of the conversion rate and the improvement rate between the original form and each of the variants.


- The fields filled in the web form
This graph displays the rate at which the fields in the original web form and the variants were filled in.

Pause A/B testing
If you pause the A/B test, the original web form will be displayed to all visitors while it is paused. When the test is resumed, the variants will be shown again.
To pause A/B testing
- Go to Setup > Developer Space > Webforms > A/B Testing.
- Click Pause in the list view in the Status column.

Alternatively, you can open the A/B test and click the Pause button next to the Status in the A/B Testing Details section.
You can also filter the list based on Status using the filter button ().

- Click Resume to continue testing.

Launch the winner
Once the A/B test is complete and the winner is declared, you can launch the winning web form on your website to replace the original form. The winner is declared based on the conversion rate, that is, the total number of unique submissions relative to the total number of visits. You can launch the winner manually or automatically after the A/B testing period is completed.
To launch a web form manually
- Go to Setup > Developer Space > Web Forms > A/B Testing.
- Click Launch in the list view under the Status column.
The winner will not be launched until you click this button.
Alternatively, you can open the A/B test, move your mouse pointer over the experiments in the A/B Testing Stats section and click Launch.

To launch a web form automatically
- In the A/B Testing Configuration page, select Automatically launch the winner once the test is completed.
- Check Based on Condition and specify the conversion rate if you want to launch the winner only when a certain conversion rate is achieved.
Zia suggestions for A/B testing
With advancements in AI and the ever-evolving landscape of digital marketing, Zoho CRM has incorporated Zia, our artificial intelligence, to further refine and enhance your A/B testing experience. These AI-driven insights provide deeper layers of analysis and recommendations, taking the guesswork out of your optimization efforts. Here's what Zia brings to the table.
Zia is designed to help optimize your webforms by identifying those with low conversion rates or field filled rates, and suggesting A/B testing for improvement.
- A webform is considered to have a low conversion rate if its rate falls below the 20th percentile of a module's conversion rate, or below 2% when a module only contains one webform. The conversion rate is a critical metric as it indicates the percentage of visitors who complete the desired action, such as filling out a form or making a purchase.
- If the rate of filling out fields in a webform is below the 10th percentile, it is considered to have a low field filled rate. This rate indicates how frequently visitors complete each field in the webform. A low field filled rate could imply that a field is perplexing, unnecessary, or not user-friendly.
If a webform's conversion rate is below these specified thresholds, or if its field filled rate is less than the 10th percentile, Zia will suggest performing an A/B test, comparing two versions of the webform against each other to determine which one performs better. This suggestion is based on a comprehensive evaluation of the webform's conversion rate and the availability of any recommended variants that may have different layouts, content, or field types.
Zia offers data-driven suggestions to improve the conversion rate of your form during the A/B test setup. These recommendations involve rearranging or eliminating specific fields in order to optimize your webforms for better user engagement and completion rates.
Zia will propose rearranging fields with a field filled rate below the 10th percentile and removing fields with a 0% field filled rate.
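As a rough illustration of the thresholding described above, the sketch below flags fields for removal (0% field filled rate) and for rearranging (below the 10th percentile). The field names, rates, the interpolated percentile method, and the choice to compute the percentile over the non-zero fields are all assumptions; Zoho does not publish its exact formula.

```python
# Hypothetical sketch: flag fields for removal (0% fill rate) and for
# rearranging (below the 10th percentile of the remaining fields' fill rates).
# Field names, rates, and the percentile method are illustrative assumptions.

def percentile(values, p):
    """Linearly interpolated percentile of a list of values (0 <= p <= 100)."""
    s = sorted(values)
    if len(s) == 1:
        return s[0]
    pos = p / 100 * (len(s) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (pos - lo) * (s[hi] - s[lo])

field_filled_rates = {            # percent of visitors who filled each field
    "Name": 98.0, "Email": 95.0, "Company": 90.0, "Phone": 80.0,
    "Country": 70.0, "Industry": 60.0, "Website": 50.0, "Address": 45.0,
    "Department": 2.0, "Fax": 0.0,
}
remove = [f for f, r in field_filled_rates.items() if r == 0.0]
nonzero = [r for r in field_filled_rates.values() if r > 0.0]
threshold = percentile(nonzero, 10)
rearrange = [f for f, r in field_filled_rates.items() if 0.0 < r < threshold]
```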
Traffic split suggestions
When a particular variant produces a meagre conversion rate, Zia will suggest a traffic split in which the poor-performing form receives the least traffic, or you can divert the traffic from the low-performing form to the better-performing forms. These traffic changes are shown in the % of visitors column of the A/B testing stats. To avoid false positives from a small sample, the split suggestion is shown only when there is enough data to confirm that a specific variant is underperforming.
Calculation of traffic split suggestions
The calculation of traffic split suggestions involves determining the appropriate traffic split between the variants and the original form based on the following factors:
- Sample size: The size of the total population being tested.
- Statistical significance: The level of confidence you want to achieve in the results, typically set at 95%.
- Minimum detectable effect: The minimum difference in performance between the variants and the original form that you want to be able to detect.
Based on these factors, the appropriate traffic split is determined.
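For illustration, a textbook two-proportion sample-size formula shows how these three factors interact. This is standard statistics, not Zoho's published formula, and the baseline rate and effect size below are hypothetical.

```python
from math import sqrt, ceil

def sample_size_per_variant(p_baseline, mde, z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per variant to detect `mde` (an absolute
    lift over baseline conversion rate `p_baseline`) at 95% significance
    (z_alpha = 1.96) with 80% power (z_beta = 0.84).
    Textbook two-proportion formula; illustrative only."""
    p2 = p_baseline + mde
    p_bar = (p_baseline + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p_baseline * (1 - p_baseline)
                                 + p2 * (1 - p2))) ** 2
    return ceil(numerator / mde ** 2)

# Detecting a 1-percentage-point lift over a 4% baseline takes several
# thousand visitors per variant; a larger detectable effect needs fewer.
n_small_effect = sample_size_per_variant(0.04, 0.01)
n_large_effect = sample_size_per_variant(0.04, 0.02)
```

The smaller the effect you want to detect, the more visitors each variant needs, which is why the split suggestion is withheld until enough data has accumulated.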
Suggestions for extending the test duration
Zia may recommend extending the duration of A/B testing under specific conditions:
- No significant winner identified: If there is no significant winner with a 95% or higher confidence score.
- Sufficient days/visits remaining: The suggestion appears only if the extension would stay within the maximum limit of time or visits. For example, if the test has been running for 60 days and the suggested extension is 40 days, the suggestion won't be shown because it would exceed the 90-day maximum.
Factors considered for suggesting the test extension
The suggestion to extend the duration of the A/B testing is based on factors such as the statistical significance score, confidence intervals, and the minimum sample size throughout the designated period of A/B testing.
Additionally, the suggestion to extend the A/B testing is not limited to the duration of the test. If the A/B testing condition is based on the number of visitors, the suggestion may also recommend extending the test to accommodate more visitors.
For example, if the initial test was designed for 1,000 visitors but the results are not conclusive, the suggestion may recommend extending the test to 1,500 visitors to obtain more reliable results.
Criteria for the launch suggestion
The launch suggestion can be shown when both of the following conditions are met:
- The winner of the A/B test is a variant form and not the original form.
- The conversion rate of the winning variant in the A/B test is less than the conversion rate outside the A/B test (before the A/B test started).

Dynamic traffic allocation
Traffic allocation based on form performance
Instead of allocating traffic to the variants statically, the system allocates it dynamically based on feedback from the forms. The system automatically handles the exploration-exploitation trade-off by dynamically allocating most of the traffic to the best-performing form at that point in time while randomly exploring the other forms. Traffic allocation is based on the performance of the forms: when a form is submitted, it counts as a positive outcome, and when a form is dropped, it counts as a negative outcome.
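The exploration-exploitation behaviour described above is commonly implemented with a multi-armed bandit strategy such as Thompson sampling. Zoho does not document its exact algorithm, so the following is an illustrative sketch: the class, form names, and conversion rates are invented, and the rates are exaggerated so the demo converges quickly.

```python
import random

class BanditAllocator:
    """Thompson-sampling sketch: each form keeps a Beta(submits + 1, drops + 1)
    posterior over its conversion rate, and each visitor is routed to the form
    whose sampled rate is highest. Illustrative only, not Zoho's algorithm."""

    def __init__(self, forms):
        self.stats = {f: {"submits": 0, "drops": 0} for f in forms}

    def choose(self):
        # Sample a plausible conversion rate for each form from its posterior.
        samples = {f: random.betavariate(s["submits"] + 1, s["drops"] + 1)
                   for f, s in self.stats.items()}
        return max(samples, key=samples.get)

    def record(self, form, submitted):
        self.stats[form]["submits" if submitted else "drops"] += 1

random.seed(7)
allocator = BanditAllocator(["Original", "Variant A", "Variant B"])
true_rates = {"Original": 0.20, "Variant A": 0.30, "Variant B": 0.10}
for _ in range(5000):
    form = allocator.choose()
    allocator.record(form, random.random() < true_rates[form])
traffic = {f: s["submits"] + s["drops"] for f, s in allocator.stats.items()}
# Over time, most traffic flows to the best-performing form ("Variant A" here).
```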
Statistical significance and confidence scoring
Winner declaration
Zia provides a confidence score for the declared winner of an A/B test. The score is determined from the statistical significance of the test, giving you a reliable, robust understanding of the outcome so you can make decisions with certainty.
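One common way to compute such a confidence score is a one-sided two-proportion z-test; the sketch below illustrates the idea. Zoho's exact method is not published, and the submission counts used here are hypothetical.

```python
from math import erf, sqrt

def confidence_score(subs_a, visitors_a, subs_b, visitors_b):
    """Confidence (in %) that form B truly converts better than form A,
    via a one-sided two-proportion z-test with a pooled standard error.
    Illustrative sketch, not Zoho's published method."""
    p_a = subs_a / visitors_a
    p_b = subs_b / visitors_b
    p_pool = (subs_a + subs_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    return 100 * 0.5 * (1 + erf(z / sqrt(2)))  # normal CDF of z, as a percent

# 4% vs 7% conversion over 1,000 visitors each yields a confidence above 99%,
# comfortably past the 95% threshold mentioned in these notes.
score = confidence_score(40, 1000, 70, 1000)
```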

Notes:
- Extending the A/B test duration will not be recommended if a clear winner has been determined with a high degree of certainty (a confidence score of 95% or above) or if extending the testing period exceeds the maximum limit of 90 days.
- The adaptive traffic reallocation and split suggestions will be offered only when the system has enough data to make reliable determinations.