A/B testing
Last updated: May 6, 2026
Overview
A/B Testing is an experimentation feature built into the Appbrew studio and apps that allows brands to test different versions of their app's design — at the block or page level — to understand what drives better performance. Rather than guessing what works, brands can run controlled experiments and let data decide.
Problem Statement
Brands building mobile apps on Appbrew currently have no way to validate whether a design change actually improves performance. Changes are made based on intuition, and there's no mechanism to measure their impact on real user behaviour and revenue outcomes.
What It Does
A/B Testing randomly splits app users into groups ("variants") and serves each group a different version of a block or page. It then measures how each variant performs across key business metrics, helping brands identify which version works better.
This is different from segmentation. Segmentation shows content based on who a user is (e.g. logged-in vs. guest). A/B testing is about understanding what works better: users are randomly assigned to a variant, and the outcomes for each variant are compared.
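The random split described above is commonly implemented as deterministic bucketing, so a returning user always sees the same variant for the life of a test. The sketch below is illustrative only, not Appbrew's actual implementation; the function name, hash scheme, and split parameter are all assumptions:

```python
import hashlib

def assign_variant(user_id: str, test_id: str, split: float = 0.5) -> str:
    """Deterministically bucket a user into variant A or B.

    Hashing user_id together with test_id yields a stable, roughly
    uniform value in [0, 1], so the same user always lands in the same
    variant for a given test, while different tests split independently.
    """
    digest = hashlib.sha256(f"{test_id}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # map first 32 bits to [0, 1]
    return "A" if bucket < split else "B"

# The same user is always assigned the same variant for a given test:
assert assign_variant("user-42", "hero-banner-test") == assign_variant("user-42", "hero-banner-test")
```

Keying the hash on both the user and the test means a user who saw variant A in one experiment is not biased toward variant A in the next one.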
Scope
A/B tests can be created at two levels:
Block Level — Test a specific block within a page. Use this when you want to isolate the performance of a single component (e.g. a banner).
Page Level — Test an entire page against an alternate version. Use this when evaluating broader experience changes.
⚠ Note: You cannot run a page-level test on a page that already has an active block-level test.
Success Metrics For Analysis
Brands can track test performance across five metrics:
| Metric | What It Measures |
|---|---|
| Revenue per Session | How much revenue each visit generates on each variant |
| Conversion Rate | How often visitors on each variant convert into buyers |
| Average Order Value | How much users spend per order on each variant |
| Add to Cart Rate | How often users add products to their cart on each variant |
| Product View Rate | How often users click through to product pages on each variant |
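To make the definitions above concrete, here is a minimal sketch of how these five metrics could be computed from raw session records. The data shape and field names are assumptions for illustration, not Appbrew's actual data model:

```python
from dataclasses import dataclass

@dataclass
class Session:
    variant: str             # "A" or "B"
    revenue: float = 0.0     # total revenue generated in this session
    orders: int = 0          # orders placed in this session
    added_to_cart: bool = False
    viewed_product: bool = False

def metrics(sessions: list[Session]) -> dict:
    """Compute the five success metrics for one variant's sessions."""
    n = len(sessions)
    total_revenue = sum(s.revenue for s in sessions)
    total_orders = sum(s.orders for s in sessions)
    converted = sum(1 for s in sessions if s.orders > 0)
    return {
        "revenue_per_session": total_revenue / n,
        "conversion_rate": converted / n,
        "average_order_value": total_revenue / total_orders if total_orders else 0.0,
        "add_to_cart_rate": sum(s.added_to_cart for s in sessions) / n,
        "product_view_rate": sum(s.viewed_product for s in sessions) / n,
    }
```

Running `metrics` separately on each variant's sessions and comparing the chosen success metric is, in essence, what the analytics table does for you.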
How It Works
Setting Up a Block-Level Test
Go to the design page in Appbrew studio and select the block you want to test
Click Create A/B Test in the right edit panel
Give the test a name and description
Create a new variant — either from scratch or by copying the original and editing it
Choose your success metric and set audience distribution
Schedule the test or start it immediately
Setting Up a Page-Level Test
Go to the design page in Appbrew studio and select the page you want to test
Click Create A/B Test in the left navigation panel
Give the test a name and description
Create a new variant — either from scratch or by copying the original and editing it
Choose your success metric and set audience distribution
Schedule the test or start it immediately
Tracking Performance
For all tests
Click More in the top navigation
Select A/B Tests
View performance for each test in the table against your chosen metric
Click the analytics icon on the right for detailed analysis and additional controls
Once a test concludes or sufficient data is available, apply the winning variant directly from the table using Apply Block or Apply Page.
Alternatively, you can view test data directly within the design page:
For blocks — Once a test is created, select the block to view its data in the right edit panel.
For pages — Once a test is created, select the page to view its data in the left navigation panel.
Value To Brands
A/B Testing removes guesswork from app optimisation. Brands can make confident, data-backed decisions about their design — knowing that changes they ship have been validated against real user behaviour and business outcomes.
Common Use Cases
Use Case 1: Testing a Banner on the Home Page (Block Level)
Scenario: A fashion brand wants to know whether a lifestyle image or a product-focused image on their home page hero banner drives more conversions.
How they'd use it: They create a block-level A/B test on the hero banner block — Variant A keeps the existing lifestyle image, Variant B swaps it for a product flat-lay with a direct CTA. They set Conversion Rate as the success metric and run the test for 2 weeks.
Potential Outcome: Variant B drives a 12% higher conversion rate, so they apply it as the permanent block.
Use Case 2: Comparing Two Homepage Layouts (Page Level)
Scenario: A beauty brand has redesigned their homepage and wants to validate whether the new layout drives more revenue before fully committing to it.
How they'd use it: They create a page-level A/B test — Variant A is the current homepage, Variant B is the redesigned version. They set Revenue per Session as the success metric with a 50/50 audience split.
Potential Outcome: The new layout generates 18% more revenue per session, giving the brand the confidence to fully roll it out.