A/B Testing Analytics

Track and analyze experiment performance with variant-level metrics.

Overview

When you run A/B tests in Snoopr, the SDK automatically tracks which variant each user sees and includes this context in all analytics events. The dashboard provides variant-level breakdowns for completion rates, drop-offs, and engagement.

How Variant Assignment Works

  1. User requests Story: The SDK calls the CDN with the device ID in the x-snoopr-visitor-id header
  2. CDN assigns variant: CloudFront@Edge assigns the user to a variant (sticky assignment, so the same user always receives the same variant)
  3. SDK receives assignment: The response headers include x-snoopr-experiment-id and x-snoopr-variant-id
  4. Events include context: All subsequent events for that Story include the experiment context

// When a Story is in an experiment, its events automatically include the assignment:
{
  event: "carousel_viewed",
  experimentId: "exp-abc123",
  variantId: "variant-b",
  // ... other fields
}
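The header-reading step above can be sketched as follows. The header names come from this page; the `ExperimentContext` shape and the `parseAssignment` helper are assumptions for illustration, not the SDK's internal API.

```typescript
interface ExperimentContext {
  experimentId: string
  variantId: string
}

// Hypothetical helper: extract the assignment the CDN returns.
// A plain Map stands in for the response headers to keep the sketch self-contained.
function parseAssignment(headers: Map<string, string>): ExperimentContext | null {
  const experimentId = headers.get('x-snoopr-experiment-id')
  const variantId = headers.get('x-snoopr-variant-id')
  if (!experimentId || !variantId) return null // user is not in any experiment
  return { experimentId, variantId }
}

const headers = new Map([
  ['x-snoopr-experiment-id', 'exp-abc123'],
  ['x-snoopr-variant-id', 'variant-b'],
])
const assignment = parseAssignment(headers)
```

When either header is absent, the user is evergreen traffic and no experiment context is attached to events.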

Accessing Experiment Data

In the Dashboard

Navigate to Analytics and select an active experiment from the filter. You'll see:

  • Per-variant metrics: Views, completions, dismissals for each variant
  • Completion rate comparison: Side-by-side completion rates
  • Statistical significance: Whether results are conclusive
  • Interaction rates: Engagement with interactive elements per variant
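The dashboard's exact significance test is not documented here; a standard two-proportion z-test on completion rates is one common way such a check is done, sketched below purely for illustration.

```typescript
// Two-proportion z-test: are the completion rates of two variants
// different beyond what chance would explain? (assumption: the dashboard
// may use a different method)
function twoProportionZ(c1: number, n1: number, c2: number, n2: number): number {
  const p1 = c1 / n1
  const p2 = c2 / n2
  const pooled = (c1 + c2) / (n1 + n2)
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
  return (p1 - p2) / se
}

// Hypothetical counts: |z| > 1.96 roughly corresponds to 95% confidence
const z = twoProportionZ(875, 1250, 780, 1250)
const significant = Math.abs(z) > 1.96
```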

Via API

Query experiment metrics programmatically:

const metrics = await analyticsService.getExperimentMetrics({
  appId: 'app-xxx',
  experimentId: 'exp-abc123',
  startDate: '2026-01-01',
  endDate: '2026-01-28'
})

// Returns a per-variant breakdown; each entry looks like:
// {
//   variantId: 'variant-a',
//   views: 1250,
//   completions: 875,
//   dismissals: 200,
//   interactions: 3400,
//   completionRate: 0.70,
//   dismissalRate: 0.16,
//   interactionRate: 2.72
// }
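A typical use of this breakdown is computing the relative lift of one variant over another. The `VariantMetrics` shape below mirrors the documented response fields; the `completionLift` helper and the sample numbers are assumptions for illustration.

```typescript
interface VariantMetrics {
  variantId: string
  views: number
  completions: number
  completionRate: number
}

// Relative completion-rate lift of `treatment` over `control`
function completionLift(control: VariantMetrics, treatment: VariantMetrics): number {
  return (treatment.completionRate - control.completionRate) / control.completionRate
}

const a: VariantMetrics = { variantId: 'variant-a', views: 1250, completions: 875, completionRate: 0.70 }
const b: VariantMetrics = { variantId: 'variant-b', views: 1200, completions: 900, completionRate: 0.75 }

const lift = completionLift(a, b) // ~7.1% relative lift of B over A
```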

Evergreen Traffic

Users not in any experiment are tracked separately as "evergreen" traffic. This includes:

  • Users who saw the Story before the experiment started
  • Traffic after an experiment concludes
  • Control groups from ended experiments

Query evergreen metrics separately:

const evergreen = await analyticsService.getEvergreenMetrics({
  appId: 'app-xxx',
  startDate: '2026-01-01',
  endDate: '2026-01-28'
})

User ID vs Device ID

Variant assignment uses device ID by default (anonymous users). When you call identify():

  • The user ID is sent in subsequent requests via x-snoopr-user-id
  • Assignment can be based on user ID for cross-device consistency
  • Events include both device ID and user ID

// After login, the user sees a consistent variant across devices
Snoopr.identify('user-123')

Accessing Assignment in Code

Check the current experiment assignment:

// Via SnooprProvider context (when a Story is active)
const { experiment } = useSnoopr()

if (experiment) {
  console.log('Experiment:', experiment.experimentId)
  console.log('Variant:', experiment.variantId)
}

The experiment object is available after showStory() returns and persists until the Story is dismissed.
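With the assignment in hand, application code can branch on the variant. The `Assignment` shape matches the fields shown above; the `ctaLabel` function and its variant-to-copy mapping are hypothetical examples, not part of the SDK.

```typescript
interface Assignment {
  experimentId: string
  variantId: string
}

// Hypothetical: choose call-to-action copy based on the assigned variant,
// falling back to the default for evergreen traffic or unknown variants.
function ctaLabel(experiment: Assignment | null): string {
  if (!experiment) return 'Get started' // evergreen / no experiment
  switch (experiment.variantId) {
    case 'variant-a': return 'Get started'
    case 'variant-b': return 'Try it free'
    default: return 'Get started'
  }
}

const label = ctaLabel({ experimentId: 'exp-abc123', variantId: 'variant-b' })
```

Keeping the fallback identical to the control variant ensures evergreen users and unrecognized variants see the baseline experience.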

Best Practices

Run experiments for sufficient duration: wait for statistical significance before drawing conclusions. The dashboard indicates when results are conclusive.

Don't change variants mid-experiment: modifying variant content invalidates results. Create a new experiment instead.

Segment by user properties: use identify() with properties like plan or cohort to analyze experiment impact across segments.

Monitor error rates per variant: check that carousel_error events aren't concentrated in one variant, which could indicate a bug.
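The error-rate check can be sketched as a simple aggregation over exported events. The event shape follows the example earlier on this page; the `errorCountsByVariant` helper and the sample events are assumptions for illustration.

```typescript
interface AnalyticsEvent {
  event: string
  variantId?: string
}

// Count carousel_error events per variant; a count far above the other
// variants' suggests a variant-specific bug worth investigating.
function errorCountsByVariant(events: AnalyticsEvent[]): Record<string, number> {
  const counts: Record<string, number> = {}
  for (const e of events) {
    if (e.event !== 'carousel_error' || !e.variantId) continue
    counts[e.variantId] = (counts[e.variantId] ?? 0) + 1
  }
  return counts
}

const counts = errorCountsByVariant([
  { event: 'carousel_viewed', variantId: 'variant-a' },
  { event: 'carousel_error', variantId: 'variant-b' },
  { event: 'carousel_error', variantId: 'variant-b' },
])
```

Normalize these counts by each variant's view count before comparing, since variants may receive unequal traffic shares.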


Setting up experiments? See Experimentation Guide for concepts and Publishing & Going Live for traffic allocation.