A/B Testing

A/B testing lets you compare two variants to find what drives more revenue – either two promotions against each other, or a promotion against no promotion at all. Adsgun splits your traffic automatically and tracks the results per variant in real time.

1 How A/B testing works

When an A/B test is running, Adsgun intercepts each new visitor and assigns them to one of the two variants. Depending on the test type, a variant is either a promotion or no promotion at all (the control). The assignment is random based on the traffic split you configure, and each visitor stays in the same variant for the duration of their session.
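
Conceptually, the assignment works like a weighted coin flip that is remembered for the rest of the session. The sketch below illustrates the idea only – the function name and session handling are hypothetical, not Adsgun's actual implementation:

```python
import random

def assign_variant(session, split_a=0.5):
    """Assign the visitor to variant A or B based on the traffic split.

    The result is stored in the session so the visitor stays in the
    same variant for the duration of their visit (sticky assignment).
    """
    if "ab_variant" not in session:
        session["ab_variant"] = "A" if random.random() < split_a else "B"
    return session["ab_variant"]

# A visitor's first request assigns them; later requests reuse the result.
session = {}
first = assign_variant(session, split_a=0.5)
assert assign_variant(session) == first  # sticky within the session
```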

Adsgun then tracks how each variant performs across a set of metrics including orders, revenue, conversion rate, and revenue per visitor, giving you a clear side-by-side comparison of which variant is driving better results.

What makes a good A/B test
For reliable results, change only one variable between the two promotions – for example, test 20% off vs 30% off, but keep everything else (visibility, URL targeting, storefront blocks) identical. If too many things differ between variants, it becomes impossible to know what caused the performance difference.

2 Preparing your promotions

Before creating an A/B test, you need to prepare the promotions you want to test. Promotions used in A/B tests must be flagged in advance – they cannot be pulled from your regular active promotions.

  1. Create or open the promotions you want to test
    Go to Promotions in Adsgun and create the two promotions you want to compare. If you already have them, open each one for editing.
  2. Flag each promotion for A/B Testing
    In the promotion editor, scroll to the A/B Testing section and check the Flag this promotion for A/B Testing checkbox. Do this for both promotions.
  3. Save both promotions
    Once flagged, each promotion’s status will be locked to Draft and it will not run as a standalone promotion. It is now available to select when creating an A/B test.

Visibility must match between variants
Both promotions in a test must have the same visibility setting. You cannot mix a Public promotion with a Private one. If you are testing Public promotions, both variants must be Public. The same rule applies to Private and Customer Account visibility.

3 Creating a test

Once your promotions are flagged, head to the A/B Testing section of Adsgun to set up the test.

  1. Navigate to A/B Testing
    In the Adsgun sidebar, click A/B Testing. Then click Create A/B Test.
  2. Enter a test name
    Give the test a descriptive name so you can identify it later in your list of tests, for example “30% vs 20% OFF – March”.
  3. Select the test type
    Choose between Promotion vs Promotion or Promotion vs Control. See Section 4 for the difference.
  4. Set the minimum duration
    Enter the minimum number of days the test must run before a winner can be declared. The default is 7 days. See Section 5 for guidance on choosing this value.
  5. Configure variants
    Name each variant (e.g. “PROMOTION 1”, “PROMOTION 2”) and select the corresponding flagged promotion from the dropdown for each.
  6. Set the traffic split
    Use the slider to define what percentage of visitors each variant receives. The default is 50% / 50%.
  7. Save the test
    Click Save. The test is created in Draft status. You can review it before starting.
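
The steps above amount to a configuration like the following. This is only a hypothetical representation of the settings – the field names are illustrative, not Adsgun's internal format:

```python
ab_test = {
    "name": "30% vs 20% OFF – March",     # step 2: descriptive name
    "type": "promotion_vs_promotion",     # step 3: or "promotion_vs_control"
    "min_duration_days": 7,               # step 4: default minimum duration
    "variants": [
        {"name": "PROMOTION 1", "promotion": "30% OFF"},  # step 5
        {"name": "PROMOTION 2", "promotion": "20% OFF"},
    ],
    "traffic_split": [50, 50],            # step 6: percent of visitors per variant
    "status": "Draft",                    # step 7: created in Draft until started
}

# The split percentages must cover all traffic.
assert sum(ab_test["traffic_split"]) == 100
```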

4 Test types

Adsgun supports two types of A/B tests, depending on what you want to measure.

Promotion vs Promotion
Compares two flagged promotions directly against each other, so every visitor sees one offer or the other. Use this to find which of two offers drives better results.

Promotion vs Control
Compares one promotion against no promotion at all. The control group sees your store at full price. Use this to measure the actual revenue impact of running a promotion versus not running one.

When to use Promotion vs Control
This test type is useful when you want to answer the question “does this promotion actually increase revenue, or does it just eat into my margins?” The control group gives you a baseline to compare against.

5 Traffic split & duration

Traffic split

The traffic split determines what percentage of visitors are assigned to each variant. A 50/50 split is the default and is recommended for most tests – it gives both variants equal exposure and reaches statistical significance fastest.

You can adjust the split if needed, for example if you want to limit exposure to a more aggressive discount (e.g. 80% to the safer offer, 20% to the higher discount). Keep in mind that an uneven split means the smaller variant will take much longer to collect enough data to be meaningful.

Minimum duration

The minimum duration is the number of days the test must run before Adsgun will allow you to declare a winner. The default is 7 days, which is the recommended minimum for most stores.

  • Why 7 days minimum?

    Shopping behavior varies significantly by day of the week. A test that runs for only 2–3 days might capture a weekend spike or a weekday slump that does not reflect typical performance. Running for at least 7 days ensures each variant is exposed to a full weekly cycle, making the results far more reliable.

100 visitors per variant required
In addition to the minimum duration, Adsgun requires at least 100 visitors per variant before results are considered statistically meaningful. If your store has lower traffic, you may need to run the test longer than 7 days to reach this threshold.
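
As a back-of-the-envelope estimate (not an Adsgun feature), you can work out how long the 100-visitor threshold takes to reach by dividing it by each variant's share of your daily traffic:

```python
import math

def days_to_threshold(daily_visitors, split_percent, threshold=100):
    """Days until a variant receiving split_percent of traffic
    collects at least `threshold` visitors."""
    per_day = daily_visitors * split_percent / 100
    return math.ceil(threshold / per_day)

# With 100 visitors/day, an 80/20 split leaves the smaller variant
# collecting only 20 visitors/day, so it needs at least 5 days of data.
print(days_to_threshold(100, 20))  # → 5
```

This is why an uneven split lengthens the test: the smaller variant always sets the pace.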

6 Starting the test

After saving, the test opens in Draft status. You can review the full configuration – both variants, their promotions, the traffic split, and all settings – before going live.

When you are ready, click Start Test in the bottom right corner. The test status changes to Running and Adsgun begins assigning visitors to variants immediately.

The Test URL

Once the test is running, Adsgun provides a Test URL at the top of the test detail page. This is a special URL you can share to run the A/B test – each visitor who lands via this URL is automatically assigned to a variant. Copy it using the Copy URL button and use it in your ads, emails, or any traffic source you want to include in the test.

The Test URL contains a unique adsgun_test parameter that Adsgun uses to identify the test and assign the visitor to the correct variant. Do not modify this URL.
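
Conceptually, the parameter works like any other query-string value. The sketch below reads it with Python's standard urllib.parse – only the adsgun_test parameter name comes from this article; the token value and function name are made up:

```python
from urllib.parse import urlparse, parse_qs

def extract_test_id(url):
    """Return the value of the adsgun_test query parameter, or None."""
    params = parse_qs(urlparse(url).query)
    values = params.get("adsgun_test")
    return values[0] if values else None

url = "https://example-store.com/?adsgun_test=abc123"  # hypothetical token
print(extract_test_id(url))  # → abc123
```

Stripping or rewriting that parameter (for example, in a link shortener that drops query strings) would prevent visitors from being assigned to a variant.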

7 Reading results

While the test is running, the detail page shows a live metrics panel for each variant side by side. Here is what each metric means:

Metric – What it measures
Visitors – Total unique visitors assigned to this variant. Sourced from Google Analytics 4 – may be delayed 24–48 hours.
Page Views – Total number of pages viewed by visitors in this variant. From GA4, subject to the same delay.
Product Views – Number of product detail page views recorded for visitors in this variant. From GA4.
Add to Carts – Number of add-to-cart events by visitors in this variant. From GA4.
Orders (live) – Number of completed orders placed by visitors in this variant. This is live data – no delay.
Conv Rate – Conversion rate: orders divided by visitors. Requires both GA4 visitor data and live order data to calculate.
Revenue (live) – Total revenue generated by orders in this variant. Live data – updates immediately as orders come in.
AOV – Average Order Value: total revenue divided by number of orders. Tells you how much customers spend per order on average in each variant.
RPV – Revenue Per Visitor: total revenue divided by number of visitors. The single most important metric for comparing variants, as it accounts for both conversion rate and order value together.
GA4 data is delayed 24–48 hours
Visitor, page view, product view, and add-to-cart metrics come from Google Analytics 4 and may take up to 48 hours to appear. Orders and revenue are live. Statistical significance is most accurate once GA4 data is fully synced. All analytics are tracked per device.

Focus on RPV for your decision
A variant might have a higher conversion rate but a lower AOV, or vice versa. RPV combines both into a single number that tells you which variant is actually generating more revenue per visitor – which is what matters most for your bottom line.
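
The three derived metrics reduce to simple ratios. Here is a sketch with made-up numbers showing why RPV can flip the picture:

```python
def variant_metrics(visitors, orders, revenue):
    """Conversion rate, AOV, and RPV as defined in the metrics list above."""
    return {
        "conv_rate": orders / visitors,   # orders per visitor
        "aov": revenue / orders,          # revenue per order
        "rpv": revenue / visitors,        # revenue per visitor
    }

# Variant A converts better; Variant B has bigger orders.
a = variant_metrics(visitors=1000, orders=50, revenue=2500)  # conv 5%, AOV 50, RPV 2.50
b = variant_metrics(visitors=1000, orders=40, revenue=2800)  # conv 4%, AOV 70, RPV 2.80

# RPV combines both effects: B wins despite the lower conversion rate.
assert b["rpv"] > a["rpv"]
```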

8 Declaring a winner

Adsgun enforces two requirements before you can declare a winner:

  • The test has been running for at least the minimum duration you set (default 7 days).
  • Each variant has received at least 100 visitors.

Until both conditions are met, Adsgun displays a warning banner on the test detail page explaining what is still needed. Once both are satisfied, the option to declare a winner becomes available.
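
The two requirements combine into a single gate. The sketch below is a hypothetical illustration of that logic, not Adsgun's actual code:

```python
def can_declare_winner(days_running, visitors_per_variant, min_days=7):
    """A winner can be declared only when BOTH hold:
    the minimum duration has elapsed AND every variant
    has received at least 100 visitors."""
    return days_running >= min_days and all(v >= 100 for v in visitors_per_variant)

assert not can_declare_winner(5, [150, 140])   # too early
assert not can_declare_winner(10, [150, 80])   # one variant under 100 visitors
assert can_declare_winner(10, [150, 140])      # both conditions met
```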

No clear winner?
If the results are very close after sufficient data has been collected, the difference between the two offers may simply not be significant enough to matter to your customers. In that case, run a new test with a larger difference between the variants, or keep the more profitable option (higher margin) and call it a draw.