Top 11 A/B Testing Tips

  1. Always test. You should constantly be testing something on your direct mail pieces, emails, web content, and telemarketing/in-person scripts. Always be looking to improve. Your existing materials may end up being the best, but you won't know that until you test them.
  2. Test one item at a time. In order to learn what makes the difference, you should change just one element at a time. It might be the message, colour, envelope, greeting, or another small piece of the campaign. If you change both the colour and the reply device, you will not know which made the difference if the test performs better than your existing package. Some multivariate testing software lets you test several website elements at once; reserve this for relatively minor elements rather than the overall layout.
  3. Test for statistical significance. Even if the response rate or average gift is higher on your test package, you need to check whether the difference is statistically significant. Average gifts can be pulled up by one large gift, making the test package look like the winner when, aside from that one gift, it actually performed worse.
  4. Ensure your test group has enough recipients. This can be a tricky number to determine, as it depends on how confident you want to be that any difference you see isn't just chance. As a rule of thumb, make your test group at least one third the size of your control group. For exact numbers for a statistically significant test, contact us and get a copy of our campaign analysis Excel template.
  5. When testing website elements, be patient. Depending on your site traffic, let the test run for anywhere from a couple of days to a couple of weeks.
  6. Prioritize tests by implementation difficulty and potential impact to decide which to run first. For webpage testing, start with your most frequent landing page (which may not be your home page), since it is the page most visitors see first. Top exit pages, labeled % Exit in Google Analytics, may also indicate a problem and an opportunity for improvement.
  7. When testing web or email, concentrate your results analysis on the donation rate rather than just the click-through rate; a variant that wins clicks can still lose donations.
  8. Ensure your control and test group are selected randomly. Random selection ensures that other factors don’t influence the results. For instance, if your test group contained only donors in one post code and your control group contained only donors in another post code, the difference in demographics may be the reason for a difference in donations rather than the tested element.
  9. Run your test and control groups at the same time. Because time of day, week, or month may influence response rates and donation amounts, the test and control need to be done at the same time.
  10. Do not worry about the possible loss of revenue. If you are changing only one element in a package, script, email, or webpage, you are unlikely to experience a large loss of revenue. To improve, you must take risks.
  11. Don’t permanently discard elements that previously lost a test. Retest them every so often, as effective packages sometimes experience fatigue and may just need to be rotated out for a time.
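
The significance check in tip 3 can be sketched in code. This is a minimal illustration, not the author's template: it applies a standard two-proportion z-test (normal approximation) to compare response rates between a test and control package, using only the Python standard library. The function name and the example counts are hypothetical.

```python
import math

def two_proportion_z_test(responses_a, n_a, responses_b, n_b):
    """Two-sided z-test for a difference in response rates
    between two mailings (normal approximation)."""
    p_a, p_b = responses_a / n_a, responses_b / n_b
    # Pooled response rate under the null hypothesis of no difference
    p_pool = (responses_a + responses_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical campaign: test package got 120 gifts from 5,000 pieces,
# control got 90 gifts from 5,000 pieces.
z, p = two_proportion_z_test(120, 5000, 90, 5000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

Note that this test covers response *rates*; for average gift, a single large donation can dominate the mean (as tip 3 warns), so a median comparison or an outlier check is a sensible companion.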
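
The sizing question in tip 4 can also be approximated without the Excel template. This sketch uses the textbook sample-size formula for comparing two proportions (two-sided test at 5% significance, 80% power); the function name, the z constants, and the example baseline rate are assumptions for illustration, not the author's figures.

```python
import math

def sample_size_per_group(p_base, lift, z_alpha=1.96, z_beta=0.84):
    """Recipients needed in each group to detect an absolute lift
    in response rate (two-sided test; defaults approximate
    5% significance and 80% power)."""
    p_test = p_base + lift
    p_bar = (p_base + p_test) / 2
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * math.sqrt(p_base * (1 - p_base)
                               + p_test * (1 - p_test))) ** 2
         / lift ** 2)
    return math.ceil(n)

# Hypothetical: baseline 2% response rate, hoping to detect a lift to 3%
n = sample_size_per_group(0.02, 0.01)
print(f"About {n} recipients per group")
```

The takeaway matches the tip: the smaller the lift you want to detect, the larger the groups must be, which is why very small test segments rarely produce conclusive results.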
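
The random assignment in tip 8 is straightforward to implement. A minimal sketch, assuming the mailing list is a simple Python list of donor identifiers (the function name and sample data are illustrative):

```python
import random

def random_split(recipients, test_fraction=0.25, seed=42):
    """Randomly assign recipients to a test group and a control group,
    so factors like post code or giving history are balanced by chance.
    A fixed seed makes the split reproducible for record-keeping."""
    shuffled = list(recipients)
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * test_fraction)
    return shuffled[:cut], shuffled[cut:]  # (test group, control group)

# Hypothetical list of 1,000 donor IDs
donors = [f"donor_{i}" for i in range(1000)]
test_group, control_group = random_split(donors)
print(len(test_group), len(control_group))  # 250 750
```

Sorting the list by post code or gift size and then cutting it in half would reintroduce exactly the bias the tip warns about; shuffling first is the whole point.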