Experimentation

Run A/B tests to optimize your Stories with data-driven decisions.

Overview

Experimentation lets you test different versions of your Stories to find what works best. Instead of guessing, you can show variant A to some users and variant B to others, then compare the results.

Snoopr handles the complexity of running experiments - random assignment, consistent user experiences, and statistical analysis - so you can focus on creating great content.
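Deterministic assignment is what makes "consistent user experiences" possible: experiment platforms commonly hash a stable user ID so the same user always lands in the same variant. A minimal sketch of that idea in Python (the function name, bucket scheme, and IDs are illustrative, not Snoopr's actual API):

```python
import hashlib

def assign_variant(user_id: str, experiment_id: str, allocations: dict[str, float]) -> str:
    """Deterministically map a user to a variant.

    allocations maps variant name -> fraction of traffic (fractions sum to 1.0).
    Hashing (experiment_id + user_id) gives each experiment an independent split
    while keeping every user's assignment stable across sessions.
    """
    digest = hashlib.sha256(f"{experiment_id}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 10_000 / 10_000  # uniform value in [0, 1)
    cumulative = 0.0
    for variant, share in allocations.items():
        cumulative += share
        if bucket < cumulative:
            return variant
    return list(allocations)[-1]  # guard against float rounding

# The same user always sees the same variant:
first = assign_variant("user-42", "onboarding-test", {"evergreen": 0.5, "variant-b": 0.5})
again = assign_variant("user-42", "onboarding-test", {"evergreen": 0.5, "variant-b": 0.5})
assert first == again
```

Because the bucket is derived from the hash rather than stored, no per-user state is needed to keep experiences consistent.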

Why Run Experiments?

Validate hypotheses: Think a shorter onboarding will improve completion? Test it. Believe adding images will boost engagement? Prove it.

Reduce risk: Roll out changes gradually. If a new design performs poorly, only a fraction of users are affected.

Optimize continuously: Small improvements compound. A 5% lift in completion rate, then another 3%, adds up to significant gains over time.
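The compounding point is simple arithmetic: successive relative lifts multiply rather than add. Using an illustrative 40% baseline completion rate:

```python
baseline = 0.40                     # 40% completion rate (illustrative)
after_first = baseline * 1.05       # +5% relative lift
after_second = after_first * 1.03   # +3% relative lift on top of that
total_lift = after_second / baseline - 1
print(f"{total_lift:.2%}")          # 8.15% combined lift, slightly more than 5% + 3%
```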

What Can You Test?

Experiments work at the Story level. Each variant is a complete Story with its own screens, elements, and flow.

Common things to test:

| Test | Example |
|------|---------|
| Copy | "Get Started" vs "Let's Go" button text |
| Length | 5-screen vs 3-screen onboarding |
| Order | Permissions request first vs last |
| Design | Minimal vs rich visuals |
| Interaction | Single-choice vs multi-select |

Key Concepts

Evergreen Story

Your default Story - what users see when they're not part of an experiment. One evergreen Story must be live before you can run experiments.

Variants

Different versions of your Story being tested. Each variant receives a percentage of traffic.

Traffic Allocation

How you divide users between your evergreen Story and your experiment variants. Total traffic always equals 100%.

Statistical Significance

Whether your results are conclusive. Small differences might be random chance; large consistent differences indicate a real winner.
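A standard way to check whether a difference in completion rates is conclusive is a two-proportion z-test. This sketch uses only the Python standard library; it illustrates the general method, not necessarily the test Snoopr runs internally:

```python
from math import erf, sqrt

def two_proportion_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for a difference between two completion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # convert |z| to a two-sided p-value via the normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Illustrative data: 52% vs 58% completion over 1,000 users each.
p = two_proportion_p_value(520, 1000, 580, 1000)
print(f"p = {p:.4f}")  # a small p-value (commonly < 0.05) suggests a real winner
```

With small samples or tiny differences, p stays large and the result should be treated as inconclusive, which is exactly the "random chance" case described above.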

Experiment Lifecycle

Create variants -> Allocate traffic -> Run experiment -> Analyze results -> Pick winner
  1. Create variants: Build the Stories you want to test
  2. Allocate traffic: Decide what percentage sees each variant
  3. Run experiment: Let data accumulate
  4. Analyze results: Compare completion rates, engagement, drop-offs
  5. Pick winner: Promote the best performer to evergreen
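Steps 4 and 5 reduce to comparing each variant's metric and promoting the best performer. A toy sketch with made-up numbers (the result shape is an assumption, not Snoopr's reporting format):

```python
# Hypothetical results after the experiment has run:
results = {
    "evergreen": {"completions": 410, "users": 1000},
    "variant-a": {"completions": 455, "users": 1000},
}

# Rank variants by completion rate and promote the top one.
winner = max(results, key=lambda v: results[v]["completions"] / results[v]["users"])
print(winner)  # variant-a (45.5% beats 41.0%)
```

In practice you would confirm the gap is statistically significant before promoting the winner to evergreen.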

Ready to start? See Creating Experiments for step-by-step setup.

Track performance: See Analytics to understand your metrics.

For Developers: Learn about A/B Testing Analytics for SDK integration details.