Creating Experiments

Step-by-step guide to setting up A/B tests in Snoopr.

Overview

Creating an experiment involves building variant Stories and configuring traffic allocation. This guide walks through the complete setup process.

Prerequisites

Before starting an experiment:

  • Have a live evergreen Story - experiments require a baseline
  • Plan what you're testing - define a clear hypothesis
  • Ensure sufficient traffic - experiments need data to be conclusive
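What counts as "sufficient traffic" depends on the effect size you hope to detect. As a rough illustration (standard statistics, not a Snoopr feature), a two-proportion sample-size approximation shows why small lifts need large audiences:

```python
import math

def sample_size_per_variant(base_rate, lift):
    """Approximate visitors needed per variant to detect `lift`
    (an absolute change in conversion rate) at 95% significance
    and 80% power, using a normal-approximation formula."""
    z_alpha = 1.96  # two-sided, alpha = 0.05
    z_power = 0.84  # power = 0.80
    p1, p2 = base_rate, base_rate + lift
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / lift ** 2)

# Detecting a 2-point lift over a 10% baseline takes thousands of
# users in each variant:
print(sample_size_per_variant(0.10, 0.02))
```

If your app sees only a few hundred users a week, plan for experiments to run longer or test bolder changes.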

Step 1: Create Your Variant Story

An experiment variant is just a Story with different content.

  1. Go to Stories in your dashboard
  2. Click New Story (or duplicate an existing one)
  3. Build your variant content
  4. Name it descriptively (e.g., "Onboarding - Short Version")

Tip: Duplicate your evergreen Story to start with a working baseline, then modify what you want to test. See Story Editor Basics for editing help.

Step 2: Mark as Experiment

By default, Stories are evergreen candidates. To make a Story an experiment variant:

  1. Open the Story settings
  2. Set Story Type to "Experiment"
  3. Save

Experiment Stories don't compete to be the evergreen - they only receive traffic when explicitly allocated.

Step 3: Configure Traffic Allocation

With your variant ready, allocate traffic:

  1. Go to Publish Settings for your app
  2. You'll see your evergreen Story at 100%
  3. Add your experiment Story
  4. Set the percentage (minimum 5%)
  5. The evergreen percentage adjusts automatically

Example:

  • Set Experiment A to 20%
  • Evergreen automatically becomes 80%
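The auto-adjustment in step 5 is simple arithmetic: the evergreen receives whatever traffic the experiments don't claim. A minimal sketch of that rule (illustrative only, not Snoopr's internals):

```python
MIN_ALLOCATION = 5  # minimum experiment percentage, per step 4

def allocate(experiments):
    """Given {story_name: percent} for experiment Stories, return the
    full allocation with the evergreen absorbing the remainder."""
    for name, pct in experiments.items():
        if pct < MIN_ALLOCATION:
            raise ValueError(f"{name}: experiments need at least {MIN_ALLOCATION}%")
    total = sum(experiments.values())
    if total >= 100:
        raise ValueError("experiments cannot claim all traffic")
    return {"Evergreen": 100 - total, **experiments}

print(allocate({"Experiment A": 20}))
# → {'Evergreen': 80, 'Experiment A': 20}
```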

Step 4: Publish and Monitor

  1. Click Publish Changes
  2. Traffic begins splitting immediately
  3. Monitor in Analytics with the experiment filter

Allow sufficient time for data collection before drawing conclusions.
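Snoopr handles variant assignment for you, but it helps to know the usual mechanics: traffic splits are typically done by hashing a stable user ID into a bucket, so each user sees the same variant on every visit. A hypothetical sketch of that pattern (not Snoopr's actual implementation):

```python
import hashlib

def assign_variant(user_id, allocation):
    """Deterministically map a user to a variant. `allocation` is an
    ordered mapping of {variant: percent} summing to 100."""
    # Hash the user ID into a stable bucket in [0, 100)
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100
    cumulative = 0
    for variant, pct in allocation.items():
        cumulative += pct
        if bucket < cumulative:
            return variant
    raise ValueError("allocation must sum to 100")

allocation = {"Evergreen": 80, "Experiment A": 20}
# Same user always lands in the same variant:
assert assign_variant("user-42", allocation) == assign_variant("user-42", allocation)
```

Because assignment is deterministic, a user who returns mid-experiment doesn't flip between variants, which would contaminate your results.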

Creating Multi-Variant Experiments

Test more than two options by adding multiple experiment Stories:

  1. Create Story variants B, C, D...
  2. Allocate traffic across all variants
  3. Compare all variants against each other

Example setup:

  Variant               Traffic   What's Different
  Evergreen (Control)   40%       Current design
  Variant B             20%       Shorter copy
  Variant C             20%       Different CTA
  Variant D             20%       New illustrations
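When comparing variants, a common approach is a pairwise significance test of each variant against the control. As a hedged illustration (standard statistics, computed outside Snoopr from your analytics numbers), a two-proportion z-test:

```python
import math

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from control A's? Returns the z statistic;
    |z| > 1.96 is significant at the 5% level."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: control converts 400/4000, Variant B 260/2000
print(round(z_test(400, 4000, 260, 2000), 2))  # → 3.5
```

Note that testing several variants against the same control inflates the chance of a false positive; with many variants, be more conservative about declaring winners.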

Testing Specific Changes

Copy Changes

Test different messaging:

  1. Duplicate evergreen Story
  2. Update text elements only
  3. Run experiment

Keep everything else identical to isolate the variable.

Design Changes

Test visual approaches:

  1. Create variant with new design
  2. Match all copy exactly
  3. Compare engagement metrics

Flow Changes

Test different screen orders or counts:

  1. Create variant with modified flow
  2. Track drop-off at each screen
  3. Compare completion rates
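Given per-screen view counts from your analytics, drop-off and completion rates are straightforward to compute. A small sketch (illustrative numbers, not Snoopr's export format):

```python
def drop_off_rates(screen_views):
    """Given ordered view counts per screen, return the fraction of
    users lost at each transition and the overall completion rate."""
    drops = [round(1 - curr / prev, 3)
             for prev, curr in zip(screen_views, screen_views[1:])]
    completion = round(screen_views[-1] / screen_views[0], 3)
    return drops, completion

# 1000 users start; 900 reach screen 2; 600 finish
drops, completion = drop_off_rates([1000, 900, 600])
print(drops, completion)  # → [0.1, 0.333] 0.6
```

Comparing these per-transition drops between variants tells you not just which flow wins, but which screen is responsible.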

Naming Conventions

Use clear, descriptive names for experiment Stories:

  Pattern             Example
  Feature + Variant   "Onboarding - 3 Screens"
  Date + Test         "2026-01 CTA Test"
  Hypothesis          "Shorter = Better"

Avoid generic names like "Test 1" or "Variant A" - you'll forget what they mean.

Experiment Checklist

Before launching:

  • Variant Story is complete and error-free
  • Variant content is finalized (Stories cannot be edited once the experiment is live)
  • Story type is set to "Experiment"
  • Traffic allocation is configured
  • Analytics are enabled
  • Hypothesis is documented (even informally)

After launching:

  • Verify traffic is splitting correctly
  • Check both variants render properly
  • Monitor error rates for anomalies
  • Set a reminder to check results

For Developers: See A/B Testing Analytics for SDK integration details.