Creating Experiments
Step-by-step guide to setting up A/B tests in Snoopr.
Overview
Creating an experiment involves building variant Stories and configuring traffic allocation. This guide walks through the complete setup process.
Prerequisites
Before starting an experiment:
- Have a live evergreen Story - experiments require a baseline
- Plan what you're testing - define a clear hypothesis
- Ensure sufficient traffic - experiments need enough data to reach a conclusive result (see the sizing sketch after this list)
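How much traffic is "sufficient" depends on your baseline conversion rate and the smallest lift you care about detecting. As a rough sizing aid, here is a minimal TypeScript sketch using Lehr's approximation (roughly 80% power at a 5% significance level); the function name and numbers are illustrative, not part of Snoopr:

```typescript
// Rough per-variant sample size via Lehr's approximation:
//   n ≈ 16 * pBar * (1 - pBar) / delta^2
// (~80% power, 5% significance; a sizing heuristic, not an exact test)
function sampleSizePerVariant(baselineRate: number, absoluteLift: number): number {
  const p1 = baselineRate;
  const p2 = baselineRate + absoluteLift;
  const pBar = (p1 + p2) / 2;  // pooled rate
  const delta = p2 - p1;       // minimum detectable effect
  return Math.round((16 * pBar * (1 - pBar)) / (delta * delta));
}

// Example: 10% baseline completion, detect a 2-point absolute lift
console.log(sampleSizePerVariant(0.10, 0.02)); // ≈ 3916 users per variant
```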
Step 1: Create Your Variant Story
An experiment variant is just a Story with different content.
- Go to Stories in your dashboard
- Click New Story (or duplicate an existing one)
- Build your variant content
- Name it descriptively (e.g., "Onboarding - Short Version")
Tip: Duplicate your evergreen Story to start with a working baseline, then modify what you want to test. See Story Editor Basics for editing help.
Step 2: Mark as Experiment
By default, Stories are evergreen candidates. To make a Story an experiment variant:
- Open the Story settings
- Set Story Type to "Experiment"
- Save
Experiment Stories don't compete to be the evergreen - they only receive traffic when explicitly allocated.
Step 3: Configure Traffic Allocation
With your variant ready, allocate traffic:
- Go to Publish Settings for your app
- You'll see your evergreen Story at 100%
- Add your experiment Story
- Set the percentage (minimum 5%)
- The evergreen percentage adjusts automatically
Example:
- Set Experiment A to 20%
- Evergreen automatically becomes 80%
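Splits like this are commonly implemented as deterministic bucketing: hash a stable user ID into 100 buckets and give the first 20 to the experiment, so each user sees the same Story on every visit. The sketch below shows the general technique - it is an assumption about how such a split can work, not Snoopr's actual implementation:

```typescript
// Deterministic 80/20 bucketing (illustrative; not Snoopr's implementation).
// Hashing a stable user ID keeps each user in the same variant across sessions.
function bucketOf(userId: string, buckets = 100): number {
  let hash = 0;
  for (const ch of userId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple 32-bit rolling hash
  }
  return hash % buckets;
}

function assignStory(userId: string): string {
  // Buckets 0-19 (20%) get the experiment; the remaining 80% get the evergreen.
  return bucketOf(userId) < 20 ? "Experiment A" : "Evergreen";
}

console.log(assignStory("user-42")); // same user always gets the same Story
```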
Step 4: Publish and Monitor
- Click Publish Changes
- Traffic begins splitting immediately
- Monitor in Analytics with the experiment filter
Allow sufficient time for data collection before drawing conclusions.
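If you also want an independent record of exposures, you can mirror each assignment into your own analytics. The callback name and payload below are hypothetical - invented purely for illustration; the real SDK surface is covered in A/B Testing Analytics:

```typescript
// Hypothetical exposure logging - onStoryAssigned and its payload are
// invented for illustration; consult A/B Testing Analytics for the real API.
interface StoryAssignment {
  storyId: string;
  storyName: string;
  isExperiment: boolean;
}

function onStoryAssigned(assignment: StoryAssignment): void {
  // Forward the exposure to your own analytics pipeline so you can
  // cross-check Snoopr's reported split against an independent count.
  console.log(`exposure: ${assignment.storyName} (experiment=${assignment.isExperiment})`);
}
```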
Creating Multi-Variant Experiments
Test more than two options by adding multiple experiment Stories:
- Create Story variants B, C, D...
- Allocate traffic across all variants
- Compare all variants against each other
Example setup:
| Variant | Traffic | What's Different |
|---|---|---|
| Evergreen (Control) | 40% | Current design |
| Variant B | 20% | Shorter copy |
| Variant C | 20% | Different CTA |
| Variant D | 20% | New illustrations |
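The bucketing idea from Step 3 generalizes to any number of variants: walk the allocation table in order and assign the user's bucket to the first variant whose cumulative share covers it. A minimal sketch mirroring the table above (illustrative, as before):

```typescript
// Generalized bucketing over the allocation table above (illustrative).
// Percentages must sum to 100.
const allocation: [string, number][] = [
  ["Evergreen (Control)", 40],
  ["Variant B", 20],
  ["Variant C", 20],
  ["Variant D", 20],
];

function assignVariant(bucket: number): string {
  let cumulative = 0;
  for (const [name, percent] of allocation) {
    cumulative += percent;
    if (bucket < cumulative) return name; // first variant whose range covers the bucket
  }
  return allocation[0][0]; // unreachable when percentages sum to 100
}

console.log(assignVariant(57)); // buckets 40-59 -> "Variant B"
```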
Testing Specific Changes
Copy Changes
Test different messaging:
- Duplicate evergreen Story
- Update text elements only
- Run experiment
Keep everything else identical to isolate the variable.
Design Changes
Test visual approaches:
- Create variant with new design
- Match all copy exactly
- Compare engagement metrics
Flow Changes
Test different screen orders or counts:
- Create variant with modified flow
- Track drop-off at each screen
- Compare completion rates
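Per-screen drop-off falls out of simple view counts: divide each screen's views by the previous screen's, and completion is the last screen's views over the first's. A minimal sketch with invented counts:

```typescript
// Step-by-step retention and overall completion from screen view counts.
// The counts here are invented for illustration.
const screenViews = [1000, 720, 610, 540]; // views of screens 1..4, in order

screenViews.forEach((views, i) => {
  if (i === 0) return;
  const retained = views / screenViews[i - 1];
  console.log(`screen ${i} -> ${i + 1}: ${(retained * 100).toFixed(1)}% retained`);
});

const completion = screenViews[screenViews.length - 1] / screenViews[0];
console.log(`completion: ${(completion * 100).toFixed(1)}%`); // 54.0%
```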
Naming Conventions
Use clear, descriptive names for experiment Stories:
| Pattern | Example |
|---|---|
| Feature + Variant | "Onboarding - 3 Screens" |
| Date + Test | "2026-01 CTA Test" |
| Hypothesis | "Shorter = Better" |
Avoid generic names like "Test 1" or "Variant A" - you'll forget what they mean.
Experiment Checklist
Before launching:
- Variant Story is complete and error-free
- Variant content is finalized (cannot edit once live)
- Story type is set to "Experiment"
- Traffic allocation is configured
- Analytics are enabled
- Hypothesis is documented (even informally)
After launching:
- Verify traffic is splitting correctly (see the sanity-check sketch after this list)
- Check that each variant renders properly
- Monitor error rates for anomalies
- Set a reminder to check results
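To verify the split, compare observed exposure counts against the configured percentages; a persistent mismatch (often called a sample ratio mismatch) usually points to an assignment bug. A minimal sanity check with invented counts - a chi-square test is the rigorous version of this:

```typescript
// Flag variants whose observed share drifts from the configured share.
// Counts are invented; replace with your exposure numbers from Analytics.
const expectedShare: Record<string, number> = { Evergreen: 0.8, "Experiment A": 0.2 };
const observedCounts: Record<string, number> = { Evergreen: 8120, "Experiment A": 1880 };

const total = Object.values(observedCounts).reduce((sum, n) => sum + n, 0);
for (const [variant, expected] of Object.entries(expectedShare)) {
  const observed = observedCounts[variant] / total;
  const drift = Math.abs(observed - expected);
  const flag = drift > 0.02 ? " <- investigate" : ""; // 2-point tolerance, arbitrary
  console.log(`${variant}: expected ${expected}, observed ${observed.toFixed(3)}${flag}`);
}
```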
Related
- Traffic Allocation - Allocation rules and strategies
- Analyzing Results - Understand your experiment data
For Developers: See A/B Testing Analytics for SDK integration details.