Activate Experiment

The Activate Experiment step evaluates whether a user qualifies for a Full Stack A/B test and assigns a variation accordingly. This is the point where your application decides which experience a user should see.

The experiment activation logic runs entirely within the PageSense SDK: eligibility checks and variation assignment are evaluated locally at the point of the call.

What this allows you to do  

By activating an experiment, you can:

  • Check whether a user is eligible for a Full Stack experiment

  • Assign a consistent variation to qualified users

  • Control application behavior based on the assigned variation

  • Ensure the same user continues to see the same variation across sessions

When to activate an experiment  

You should activate an experiment:

  • After the SDK is successfully initialized

  • When a user session begins or when user identity is available

  • Before rendering or executing variation-specific logic

NOTE: Activation should not be called before the SDK initialization callback completes.

Activating an experiment with user attributes  

User attributes help determine whether a user qualifies for an experiment based on audience targeting rules defined in PageSense.

KOTLIN

  import com.zoho.pagesense.android.abtesting.PageSenseClient

  // Create a map to hold user attributes
  val userAttributes = mutableMapOf(
      "DeviceType" to "Phone",
      "OS" to "Android",
      "OSVersion" to "14",
      "DeviceModel" to "Pixel 8 Pro"
  )

  // Activate the Full Stack A/B Test experiment
  val variationName = pageSenseClient.activateExperiment(
      experimentName,
      userId,
      userAttributes
  )

  // Handle variation-specific logic
  when (variationName) {
      "Original" -> { /* Handle Original variation */ }
      "Variation 1" -> { /* Handle Variation 1 */ }
      "Variation 2" -> { /* Handle Variation 2 */ }
      "Variation 3" -> { /* Handle Variation 3 */ }
      else -> { /* User is not part of the experiment */ }
  }

JAVA

  import com.zoho.pagesense.android.abtesting.PageSenseClient;
  import java.util.HashMap;

  // Create a map to hold user attributes
  HashMap<String, String> userAttributes = new HashMap<>();
  userAttributes.put("DeviceType", "Phone");
  userAttributes.put("OS", "Android");
  userAttributes.put("OSVersion", "14");
  userAttributes.put("DeviceModel", "Pixel 8 Pro");

  // Activate the Full Stack A/B Test experiment
  String variationName = pageSenseClient.activateExperiment(
      experimentName,
      userId,
      userAttributes
  );

  // Handle variation-specific logic
  if ("Original".equals(variationName)) {
      // Handle Original variation
  } else if ("Variation 1".equals(variationName)) {
      // Handle Variation 1
  } else if ("Variation 2".equals(variationName)) {
      // Handle Variation 2
  } else if ("Variation 3".equals(variationName)) {
      // Handle Variation 3
  } else {
      // User is not part of the experiment
  }

Parameters  

  Parameter        Type                 Required  Description
  experimentName   String               Yes       Name of the Full Stack experiment configured in PageSense.
  userId           String               Yes       Unique and stable identifier for the user; must remain consistent across sessions.
  userAttributes   Map<String, String>  No        Optional user attributes used for audience targeting and segmentation.

Understanding the response  

  • If the user qualifies for the experiment, the API returns the assigned variation name as a String.

  • If the user does not qualify due to audience targeting or traffic allocation, the API returns null.

You should always handle the null case to ensure safe fallback behavior.
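The fallback pattern can be sketched generically. This is illustrative only: `resolveExperience` is a hypothetical helper, and the `BiFunction` stands in for a call to `activateExperiment`, which returns null when the user does not qualify.

```java
import java.util.function.BiFunction;

public class NullFallbackSketch {
    // Resolve the experience to show: use the assigned variation when one
    // exists, otherwise fall back to the default ("Original") experience.
    static String resolveExperience(BiFunction<String, String, String> activate,
                                    String experimentName, String userId) {
        String variation = activate.apply(experimentName, userId);
        return variation != null ? variation : "Original"; // safe fallback on null
    }

    public static void main(String[] args) {
        // Simulate a user who does not qualify (activation returns null)
        System.out.println(resolveExperience((e, u) -> null, "exp", "user-1"));
        // Simulate a qualified user
        System.out.println(resolveExperience((e, u) -> "Variation 1", "exp", "user-1"));
    }
}
```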

Edge Cases  

  Scenario                         Result
  Experiment not found             Returns null
  User outside traffic allocation  Returns null
  Audience targeting fails         Returns null
  SDK not initialized              Behavior undefined (must not be called before initialization)

How the API Works    

When the activateExperiment() method is invoked, it follows a series of steps to determine whether a variation should be assigned to the user:

1. Audience Targeting    

The API first checks whether the user meets the experiment’s audience targeting rules defined in PageSense. These rules can include user attributes like browser, device type, OS, or any custom properties passed in the user attributes.

  • If the user’s attributes match the audience targeting conditions, the evaluation proceeds.

  • If they don’t match, the API immediately returns null, indicating the user is not eligible for the experiment.
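As an illustration of this step (not the SDK's internal code), audience matching can be thought of as checking every targeting rule against the supplied attributes; `matchesAudience` and the rule map are hypothetical names:

```java
import java.util.Map;

public class AudienceTargetingSketch {
    // Returns true only when every targeting rule key/value is matched
    // exactly by the user's attributes; one failed rule disqualifies the user.
    static boolean matchesAudience(Map<String, String> rules,
                                   Map<String, String> userAttributes) {
        for (Map.Entry<String, String> rule : rules.entrySet()) {
            if (!rule.getValue().equals(userAttributes.get(rule.getKey()))) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        Map<String, String> rules = Map.of("OS", "Android");
        Map<String, String> user = Map.of("OS", "Android", "DeviceType", "Phone");
        System.out.println(matchesAudience(rules, user)); // prints "true"
    }
}
```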

2. User Storage Service  
  

The User Storage Service stores the variation allocated to each user for a given experiment in a user-provided storage layer, such as a database, Redis cache, or file system. It ensures that a user is always assigned the same variation for a given A/B test across different sessions and devices.

  • If a stored variation already exists for the given user ID and experiment, it is retrieved from the storage and returned.

  • If not, the SDK proceeds to assign a new variation via the hashing algorithm.
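The lookup-then-assign behavior can be sketched as follows. The `VariationStore` interface and `resolveVariation` helper are hypothetical names standing in for whatever storage layer you provide:

```java
import java.util.HashMap;
import java.util.Map;

public class UserStorageSketch {
    // Hypothetical storage abstraction: the real SDK delegates to the
    // store you supply (database, Redis cache, file system, ...).
    interface VariationStore {
        String get(String userId, String experimentName);
        void put(String userId, String experimentName, String variation);
    }

    // A minimal in-memory stand-in, keyed by user ID and experiment name.
    static class InMemoryStore implements VariationStore {
        private final Map<String, String> data = new HashMap<>();
        public String get(String userId, String experimentName) {
            return data.get(userId + "::" + experimentName);
        }
        public void put(String userId, String experimentName, String variation) {
            data.put(userId + "::" + experimentName, variation);
        }
    }

    // Reuse a stored variation when one exists; otherwise persist and
    // return the freshly computed one, so later calls stay consistent.
    static String resolveVariation(VariationStore store, String userId,
                                   String experimentName, String freshlyAssigned) {
        String stored = store.get(userId, experimentName);
        if (stored != null) return stored;
        store.put(userId, experimentName, freshlyAssigned);
        return freshlyAssigned;
    }

    public static void main(String[] args) {
        InMemoryStore store = new InMemoryStore();
        System.out.println(resolveVariation(store, "user-42", "checkout-test", "Variation 1")); // Variation 1
        System.out.println(resolveVariation(store, "user-42", "checkout-test", "Variation 2")); // Variation 1 (sticky)
    }
}
```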

3. Hashing with MurmurHash    

The user ID and the experiment key are combined to form a unique key, and the API applies the MurmurHash algorithm to this key to produce a deterministic numeric value between 0 and 9999.

This hash value determines the user’s position in the experiment’s traffic allocation range and is used to assign a variation.

  • MurmurHash always generates the same hash value for a given user ID and experiment key combination.

  • This ensures that users always receive a consistent variation assignment across different sessions and devices.
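The determinism argument can be sketched with any stable hash. This is illustrative only: the SDK uses MurmurHash, but MD5 (built into the JDK) stands in here so the sketch needs no third-party dependency; `bucket` is a hypothetical helper.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class BucketingSketch {
    // Deterministic bucket in [0, 9999] for a userId + experimentKey pair:
    // hash the combined key, take the first four bytes as an unsigned
    // integer, and reduce it into the 0-9999 traffic range.
    static int bucket(String userId, String experimentKey) {
        try {
            byte[] digest = MessageDigest.getInstance("MD5")
                    .digest((userId + experimentKey).getBytes(StandardCharsets.UTF_8));
            long value = 0;
            for (int i = 0; i < 4; i++) {
                value = (value << 8) | (digest[i] & 0xFF);
            }
            return (int) (value % 10000);
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // MD5 is always available on the JVM
        }
    }

    public static void main(String[] args) {
        // Same inputs always yield the same bucket, run after run
        System.out.println(bucket("user-42", "checkout-test"));
        System.out.println(bucket("user-42", "checkout-test"));
    }
}
```

Because the bucket depends only on the user ID and experiment key, re-running activation for the same user reproduces the same assignment even without the storage layer.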

4. Variation Mapping    

Each variation within an experiment is assigned a value range based on its allocated traffic percentage. For example, in an A/B test experiment with 80% traffic allocation and four variations, each receiving an equal 25% split of that traffic, the value ranges are assigned as shown below:

  Variation    Value Range
  Original     0 – 2000
  Variation 1  2001 – 4000
  Variation 2  4001 – 6000
  Variation 3  6001 – 8000

These ranges are non-overlapping and collectively cover the experiment’s total traffic allocation.

  • If the user’s hash value falls within a particular variation’s range, that variation will be allocated to the user.

  • If the user’s hash value falls outside all assigned ranges, no variation is assigned; the user does not qualify for the experiment, and null is returned.
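The range lookup can be sketched directly from the example table above; `mapToVariation` is a hypothetical helper, and the boundaries mirror that 80%-allocation example:

```java
public class VariationMappingSketch {
    // Map a bucket in [0, 9999] to a variation using the example ranges:
    // 80% total allocation split evenly across four variations.
    static String mapToVariation(int bucket) {
        if (bucket <= 2000) return "Original";
        if (bucket <= 4000) return "Variation 1";
        if (bucket <= 6000) return "Variation 2";
        if (bucket <= 8000) return "Variation 3";
        return null; // outside the 80% allocation: user does not qualify
    }

    public static void main(String[] args) {
        System.out.println(mapToVariation(1500)); // prints "Original"
        System.out.println(mapToVariation(9000)); // prints "null"
    }
}
```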

5. Tracking and Analytics  


Once a variation is assigned:

The API sends tracking events for the user’s visit to PageSense, recording the experiment name, user ID, variation name, and any user attributes that were passed.

6. Return Value  


  Outcome                                           Return Value
  User qualifies and a variation is allocated       The variation name
  User does not match the audience targeting rules  null
  User falls outside the traffic allocation         null


Activating an experiment without user attributes  

In some cases, you may want to activate a Full Stack experiment without passing any user attributes. This is useful when:

  • User attributes are not yet available

  • The experiment targets All Visitors

  • You want to rely only on the user identifier for variation assignment

When user attributes are not provided, the SDK evaluates the user only against experiments that do not require additional audience conditions.

KOTLIN

  // Activate the Full Stack A/B Test experiment without user attributes
  val variationName = pageSenseClient.activateExperiment(
      experimentName,
      userId
  )

JAVA

  // Activate the Full Stack A/B Test experiment without user attributes
  String variationName = pageSenseClient.activateExperiment(
      experimentName,
      userId
  );

Important notes  

  • Only experiments targeting All Visitors will qualify when no user attributes are provided

  • Variation assignment remains deterministic for the same user ID

  • Always handle the null case to ensure safe fallback behavior

Best practices  

  • Use a stable user identifier (such as user ID or account ID).

  • Pass user attributes only when they are available.

  • Avoid calling activateExperiment multiple times for the same user session.

  • Always handle users who are not part of the experiment.