
Quicken: Why Timing is Everything in A/B Test Implementation


Quicken conducted extensive user research on its high-traffic, high-value Product Comparison page. The research found that the comparison page played an integral role in the decision-making process for both new and existing users. Looking to improve the matrix and move more visitors into the purchase funnel, Quicken wanted to create a test variation of the Product Comparison page.


With the holiday season fast approaching, Quicken wanted to launch, conclude, and implement the winning variation prior to Black Friday, when returning users receive a coupon for product upgrades. The urgency leading up to the holidays increased the temptation to implement the new variation prematurely in hopes of expediting its impact.


The test variation was created based on further rounds of qualitative studies. Quicken then worked with Blast to design and implement a testing protocol. To prevent prematurely declaring a winner, the protocol guaranteed that the test would run for two business cycles (two weeks), allowing each variation to gather enough conversion volume. This time frame also accounted for varying visitor behavior across each day of the week.

Result updates were provided at the end of each business cycle, rather than on early test performance, when conversion volume is low and external factors cannot be accounted for.

[Image: Quicken Product Comparison page — Test Control vs. Test Variation]

The main KPIs for this test were:

  • Transaction Rate (units sold / visitors to page)
  • Add to Cart Rate (add to carts / visitors to page)
  • Cart to Transaction Rate (units sold / add to carts)
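As a concrete illustration, each KPI reduces to a simple ratio of event counts. The numbers below are hypothetical, chosen only to show the arithmetic (they are not Quicken's data):

```python
# Hypothetical session counts for illustration only -- not Quicken's data.
visitors = 10_000      # visitors to the Product Comparison page
add_to_carts = 1_200   # add-to-cart events from those visitors
units_sold = 300       # completed transactions

transaction_rate = units_sold / visitors              # units sold / visitors to page
add_to_cart_rate = add_to_carts / visitors            # add to carts / visitors to page
cart_to_transaction_rate = units_sold / add_to_carts  # units sold / add to carts

print(f"Transaction Rate:         {transaction_rate:.1%}")
print(f"Add to Cart Rate:         {add_to_cart_rate:.1%}")
print(f"Cart to Transaction Rate: {cart_to_transaction_rate:.1%}")
```

Note that the first two KPIs share the same denominator, so the third (cart-to-transaction) is the piece that isolates checkout behavior from page-level interest.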


Performance After Week One

  KPI                        Statistical Significance   Test Variation
  Transaction Rate           91%                        -5.1%
  Add to Cart Rate           99%                        -9.8%
  Cart to Transaction Rate   99%                        +5.2%

With a significant amount of conversion volume during the first week, it would have been easy to rush to judgment. Instead, Quicken adhered to the testing protocol and let the test run for two business cycles.
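The case study does not say how the significance figures were computed; a common choice for comparing two conversion rates is a two-proportion z-test. A minimal sketch, with made-up counts (the function name and all numbers are hypothetical):

```python
import math

def two_proportion_significance(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test.

    Returns (relative lift of B over A, confidence level = 1 - p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis of equal rates
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via erf; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    lift = (p_b - p_a) / p_a
    return lift, 1 - p_value

# Hypothetical counts for illustration only
lift, confidence = two_proportion_significance(conv_a=500, n_a=10_000,
                                               conv_b=460, n_b=10_000)
print(f"lift: {lift:+.1%}, confidence: {confidence:.0%}")
```

The key point the protocol encodes is that a high confidence figure in week one is not by itself a stopping rule; the test still runs its full two business cycles.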

Performance After Week Two

  KPI                        Statistical Significance   Test Variation
  Transaction Rate           99%                        +5.5%
  Add to Cart Rate           6%                         -0.1%
  Cart to Transaction Rate   99%                        +5.4%


Looking at the week-over-week data, there was clearly an unusual change in performance. Before declaring a winner, it was important to identify what caused the Control to perform so well in the first week and the Variation to ultimately come out ahead after two business cycles.

Blast discovered a key insight in Quicken’s channel traffic. The difference in the amount of traffic coming from email during the first week versus the second week was significant.

[Image: email traffic, week one vs. week two]

An email targeting existing customers was sent during week one, driving an unusually high volume of existing users to the Product Comparison page. These existing users were already familiar with the original site, so the Test Variation likely caused friction. This hypothesis was supported by a week-over-week comparison of the Add to Cart Rate KPI:

Week Over Week Add to Cart Rate (Test Variation)

  Week 1 Only   Week 2 Only
  -9.8%         +12.27%
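This kind of reversal surfaces as soon as the overall result is segmented by week. A minimal sketch of that readout, using hypothetical per-week counts chosen to roughly mirror the reported lifts (not Quicken's actual data):

```python
# Hypothetical add-to-cart counts per (week, arm): (conversions, visitors).
results = {
    "week1": {"control": (1200, 10_000), "variation": (1082, 10_000)},
    "week2": {"control": (1100, 10_000), "variation": (1235, 10_000)},
}

for week, arms in results.items():
    rates = {arm: conv / n for arm, (conv, n) in arms.items()}
    # Relative lift of the variation over the control within this segment
    lift = (rates["variation"] - rates["control"]) / rates["control"]
    print(f"{week}: control {rates['control']:.1%}, "
          f"variation {rates['variation']:.1%}, lift {lift:+.1%}")
```

Segmenting by traffic source (email vs. other channels) would follow the same pattern, with the channel replacing the week as the grouping key.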

Historical precedent showed most visitors on Black Friday are existing customers, and their behavior would likely be similar to test performance in week one. Thus, Blast recommended Quicken retain the Control for the busy holiday season.

Patience is a Virtue

It would have been a disservice to Quicken’s business to implement the variation based solely on overall test results. By diving deeper into the data, Blast provided key insights to make an informed business decision. Quicken agreed with Blast’s recommendation and decided to implement the “winning” variation in the future, when existing users have more time to acclimate to the new design.

The Blast team has been a joy to work with on this experiment. In particular, Roopa Carpenter, who we work with on our test strategy & execution has been patient, clear and customer-focused. Over and over again, she has been very responsive to all our needs. Most important, even during the heat of peak season, she remained professional and focused. We truly value this partnership!

Janin Kompor, Web Strategy Leader





Project Overview


  • Improve performance of the product comparison matrix, and overall page, to convert more visitors


  • Created a testing variation based on qualitative studies conducted by Quicken
  • Created a testing protocol guaranteeing the test would run for two business cycles (two weeks)
  • Identified main KPIs for the test and reported on them at the end of each business cycle


  • Looking at overall results, Blast identified an unusual pattern in the data and dug deeper
  • Key insight was the difference in the amount of traffic coming from email during each week of testing
  • A high percentage of visitors in week one were returning users, which caused friction with the new design
  • Blast recommended retaining the Control, even though the variation “won”


Usability & UX
Testing & Personalization
